Sep 9 05:36:50.809850 kernel: Linux version 6.12.45-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Tue Sep 9 03:39:34 -00 2025 Sep 9 05:36:50.809870 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=107bc9be805328e5e30844239fa87d36579f371e3de2c34fec43f6ff6d17b104 Sep 9 05:36:50.809881 kernel: BIOS-provided physical RAM map: Sep 9 05:36:50.809888 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable Sep 9 05:36:50.809894 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000007fffff] usable Sep 9 05:36:50.809901 kernel: BIOS-e820: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS Sep 9 05:36:50.809908 kernel: BIOS-e820: [mem 0x0000000000808000-0x000000000080afff] usable Sep 9 05:36:50.809915 kernel: BIOS-e820: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS Sep 9 05:36:50.809922 kernel: BIOS-e820: [mem 0x000000000080c000-0x0000000000810fff] usable Sep 9 05:36:50.809930 kernel: BIOS-e820: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS Sep 9 05:36:50.809936 kernel: BIOS-e820: [mem 0x0000000000900000-0x000000009bd3efff] usable Sep 9 05:36:50.809943 kernel: BIOS-e820: [mem 0x000000009bd3f000-0x000000009bdfffff] reserved Sep 9 05:36:50.809949 kernel: BIOS-e820: [mem 0x000000009be00000-0x000000009c8ecfff] usable Sep 9 05:36:50.809956 kernel: BIOS-e820: [mem 0x000000009c8ed000-0x000000009cb6cfff] reserved Sep 9 05:36:50.809964 kernel: BIOS-e820: [mem 0x000000009cb6d000-0x000000009cb7efff] ACPI data Sep 9 05:36:50.809973 kernel: BIOS-e820: [mem 0x000000009cb7f000-0x000000009cbfefff] ACPI NVS Sep 9 05:36:50.809981 kernel: BIOS-e820: [mem 0x000000009cbff000-0x000000009ce90fff] usable Sep 9 05:36:50.809988 kernel: BIOS-e820: [mem 0x000000009ce91000-0x000000009ce94fff] reserved Sep 9 05:36:50.809995 kernel: BIOS-e820: [mem 0x000000009ce95000-0x000000009ce96fff] ACPI NVS Sep 9 05:36:50.810002 kernel: BIOS-e820: [mem 0x000000009ce97000-0x000000009cedbfff] usable Sep 9 05:36:50.810009 kernel: BIOS-e820: [mem 0x000000009cedc000-0x000000009cf5ffff] reserved Sep 9 05:36:50.810016 kernel: BIOS-e820: [mem 0x000000009cf60000-0x000000009cffffff] ACPI NVS Sep 9 05:36:50.810023 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved Sep 9 05:36:50.810030 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved Sep 9 05:36:50.810036 kernel: BIOS-e820: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved Sep 9 05:36:50.810045 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved Sep 9 05:36:50.810052 kernel: NX (Execute Disable) protection: active Sep 9 05:36:50.810059 kernel: APIC: Static calls initialized Sep 9 05:36:50.810066 kernel: e820: update [mem 0x9b320018-0x9b329c57] usable ==> usable Sep 9 05:36:50.810073 kernel: e820: update [mem 0x9b2e3018-0x9b31fe57] usable ==> usable Sep 9 05:36:50.810080 kernel: extended physical RAM map: Sep 9 05:36:50.810087 kernel: reserve setup_data: [mem 0x0000000000000000-0x000000000009ffff] usable Sep 9 05:36:50.810095 kernel: reserve setup_data: [mem 0x0000000000100000-0x00000000007fffff] usable Sep 9 05:36:50.810102 kernel: reserve setup_data: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS Sep 9 05:36:50.810109 kernel: reserve setup_data: [mem 
0x0000000000808000-0x000000000080afff] usable Sep 9 05:36:50.810116 kernel: reserve setup_data: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS Sep 9 05:36:50.810125 kernel: reserve setup_data: [mem 0x000000000080c000-0x0000000000810fff] usable Sep 9 05:36:50.810132 kernel: reserve setup_data: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS Sep 9 05:36:50.810139 kernel: reserve setup_data: [mem 0x0000000000900000-0x000000009b2e3017] usable Sep 9 05:36:50.810146 kernel: reserve setup_data: [mem 0x000000009b2e3018-0x000000009b31fe57] usable Sep 9 05:36:50.810156 kernel: reserve setup_data: [mem 0x000000009b31fe58-0x000000009b320017] usable Sep 9 05:36:50.810163 kernel: reserve setup_data: [mem 0x000000009b320018-0x000000009b329c57] usable Sep 9 05:36:50.810172 kernel: reserve setup_data: [mem 0x000000009b329c58-0x000000009bd3efff] usable Sep 9 05:36:50.810180 kernel: reserve setup_data: [mem 0x000000009bd3f000-0x000000009bdfffff] reserved Sep 9 05:36:50.810187 kernel: reserve setup_data: [mem 0x000000009be00000-0x000000009c8ecfff] usable Sep 9 05:36:50.810194 kernel: reserve setup_data: [mem 0x000000009c8ed000-0x000000009cb6cfff] reserved Sep 9 05:36:50.810202 kernel: reserve setup_data: [mem 0x000000009cb6d000-0x000000009cb7efff] ACPI data Sep 9 05:36:50.810209 kernel: reserve setup_data: [mem 0x000000009cb7f000-0x000000009cbfefff] ACPI NVS Sep 9 05:36:50.810216 kernel: reserve setup_data: [mem 0x000000009cbff000-0x000000009ce90fff] usable Sep 9 05:36:50.810224 kernel: reserve setup_data: [mem 0x000000009ce91000-0x000000009ce94fff] reserved Sep 9 05:36:50.810231 kernel: reserve setup_data: [mem 0x000000009ce95000-0x000000009ce96fff] ACPI NVS Sep 9 05:36:50.810238 kernel: reserve setup_data: [mem 0x000000009ce97000-0x000000009cedbfff] usable Sep 9 05:36:50.810247 kernel: reserve setup_data: [mem 0x000000009cedc000-0x000000009cf5ffff] reserved Sep 9 05:36:50.810254 kernel: reserve setup_data: [mem 0x000000009cf60000-0x000000009cffffff] ACPI NVS Sep 9 05:36:50.810262 kernel: reserve setup_data: [mem 0x00000000e0000000-0x00000000efffffff] reserved Sep 9 05:36:50.810269 kernel: reserve setup_data: [mem 0x00000000feffc000-0x00000000feffffff] reserved Sep 9 05:36:50.810276 kernel: reserve setup_data: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved Sep 9 05:36:50.810283 kernel: reserve setup_data: [mem 0x000000fd00000000-0x000000ffffffffff] reserved Sep 9 05:36:50.810291 kernel: efi: EFI v2.7 by EDK II Sep 9 05:36:50.810298 kernel: efi: SMBIOS=0x9c988000 ACPI=0x9cb7e000 ACPI 2.0=0x9cb7e014 MEMATTR=0x9b9e4198 RNG=0x9cb73018 Sep 9 05:36:50.810305 kernel: random: crng init done Sep 9 05:36:50.810313 kernel: efi: Remove mem151: MMIO range=[0xffc00000-0xffffffff] (4MB) from e820 map Sep 9 05:36:50.810320 kernel: e820: remove [mem 0xffc00000-0xffffffff] reserved Sep 9 05:36:50.810329 kernel: secureboot: Secure boot disabled Sep 9 05:36:50.810336 kernel: SMBIOS 2.8 present. 
Sep 9 05:36:50.810343 kernel: DMI: QEMU Standard PC (Q35 + ICH9, 2009), BIOS unknown 02/02/2022 Sep 9 05:36:50.810351 kernel: DMI: Memory slots populated: 1/1 Sep 9 05:36:50.810358 kernel: Hypervisor detected: KVM Sep 9 05:36:50.810365 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00 Sep 9 05:36:50.810372 kernel: kvm-clock: using sched offset of 3570932906 cycles Sep 9 05:36:50.810380 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns Sep 9 05:36:50.810388 kernel: tsc: Detected 2794.750 MHz processor Sep 9 05:36:50.810395 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Sep 9 05:36:50.810403 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Sep 9 05:36:50.810412 kernel: last_pfn = 0x9cedc max_arch_pfn = 0x400000000 Sep 9 05:36:50.810420 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs Sep 9 05:36:50.810427 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Sep 9 05:36:50.810434 kernel: Using GB pages for direct mapping Sep 9 05:36:50.810442 kernel: ACPI: Early table checksum verification disabled Sep 9 05:36:50.810449 kernel: ACPI: RSDP 0x000000009CB7E014 000024 (v02 BOCHS ) Sep 9 05:36:50.810457 kernel: ACPI: XSDT 0x000000009CB7D0E8 000054 (v01 BOCHS BXPC 00000001 01000013) Sep 9 05:36:50.810464 kernel: ACPI: FACP 0x000000009CB79000 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001) Sep 9 05:36:50.810472 kernel: ACPI: DSDT 0x000000009CB7A000 0021BA (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 9 05:36:50.810481 kernel: ACPI: FACS 0x000000009CBDD000 000040 Sep 9 05:36:50.810489 kernel: ACPI: APIC 0x000000009CB78000 000090 (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 9 05:36:50.810496 kernel: ACPI: HPET 0x000000009CB77000 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 9 05:36:50.810504 kernel: ACPI: MCFG 0x000000009CB76000 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 9 05:36:50.810511 kernel: ACPI: WAET 0x000000009CB75000 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 9 05:36:50.810519 kernel: ACPI: BGRT 0x000000009CB74000 000038 (v01 INTEL EDK2 00000002 01000013) Sep 9 05:36:50.810526 kernel: ACPI: Reserving FACP table memory at [mem 0x9cb79000-0x9cb790f3] Sep 9 05:36:50.810534 kernel: ACPI: Reserving DSDT table memory at [mem 0x9cb7a000-0x9cb7c1b9] Sep 9 05:36:50.810541 kernel: ACPI: Reserving FACS table memory at [mem 0x9cbdd000-0x9cbdd03f] Sep 9 05:36:50.810551 kernel: ACPI: Reserving APIC table memory at [mem 0x9cb78000-0x9cb7808f] Sep 9 05:36:50.810558 kernel: ACPI: Reserving HPET table memory at [mem 0x9cb77000-0x9cb77037] Sep 9 05:36:50.810565 kernel: ACPI: Reserving MCFG table memory at [mem 0x9cb76000-0x9cb7603b] Sep 9 05:36:50.810573 kernel: ACPI: Reserving WAET table memory at [mem 0x9cb75000-0x9cb75027] Sep 9 05:36:50.810580 kernel: ACPI: Reserving BGRT table memory at [mem 0x9cb74000-0x9cb74037] Sep 9 05:36:50.810587 kernel: No NUMA configuration found Sep 9 05:36:50.810595 kernel: Faking a node at [mem 0x0000000000000000-0x000000009cedbfff] Sep 9 05:36:50.810602 kernel: NODE_DATA(0) allocated [mem 0x9ce36dc0-0x9ce3dfff] Sep 9 05:36:50.810610 kernel: Zone ranges: Sep 9 05:36:50.810619 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Sep 9 05:36:50.810627 kernel: DMA32 [mem 0x0000000001000000-0x000000009cedbfff] Sep 9 05:36:50.810634 kernel: Normal empty Sep 9 05:36:50.810641 kernel: Device empty Sep 9 05:36:50.810649 kernel: Movable zone start for each node Sep 9 05:36:50.810656 kernel: Early memory node ranges Sep 9 
05:36:50.810663 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff] Sep 9 05:36:50.810671 kernel: node 0: [mem 0x0000000000100000-0x00000000007fffff] Sep 9 05:36:50.810678 kernel: node 0: [mem 0x0000000000808000-0x000000000080afff] Sep 9 05:36:50.810718 kernel: node 0: [mem 0x000000000080c000-0x0000000000810fff] Sep 9 05:36:50.810729 kernel: node 0: [mem 0x0000000000900000-0x000000009bd3efff] Sep 9 05:36:50.810736 kernel: node 0: [mem 0x000000009be00000-0x000000009c8ecfff] Sep 9 05:36:50.810744 kernel: node 0: [mem 0x000000009cbff000-0x000000009ce90fff] Sep 9 05:36:50.810751 kernel: node 0: [mem 0x000000009ce97000-0x000000009cedbfff] Sep 9 05:36:50.810767 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000009cedbfff] Sep 9 05:36:50.810774 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Sep 9 05:36:50.810782 kernel: On node 0, zone DMA: 96 pages in unavailable ranges Sep 9 05:36:50.810799 kernel: On node 0, zone DMA: 8 pages in unavailable ranges Sep 9 05:36:50.810806 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Sep 9 05:36:50.810814 kernel: On node 0, zone DMA: 239 pages in unavailable ranges Sep 9 05:36:50.810822 kernel: On node 0, zone DMA32: 193 pages in unavailable ranges Sep 9 05:36:50.810829 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges Sep 9 05:36:50.810839 kernel: On node 0, zone DMA32: 6 pages in unavailable ranges Sep 9 05:36:50.810847 kernel: On node 0, zone DMA32: 12580 pages in unavailable ranges Sep 9 05:36:50.810854 kernel: ACPI: PM-Timer IO Port: 0x608 Sep 9 05:36:50.810862 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) Sep 9 05:36:50.810870 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 Sep 9 05:36:50.810879 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) Sep 9 05:36:50.810887 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) Sep 9 05:36:50.810895 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Sep 9 05:36:50.810903 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) Sep 9 05:36:50.810910 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) Sep 9 05:36:50.810918 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Sep 9 05:36:50.810926 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000 Sep 9 05:36:50.810934 kernel: TSC deadline timer available Sep 9 05:36:50.810941 kernel: CPU topo: Max. logical packages: 1 Sep 9 05:36:50.810951 kernel: CPU topo: Max. logical dies: 1 Sep 9 05:36:50.810959 kernel: CPU topo: Max. dies per package: 1 Sep 9 05:36:50.810966 kernel: CPU topo: Max. threads per core: 1 Sep 9 05:36:50.810974 kernel: CPU topo: Num. cores per package: 4 Sep 9 05:36:50.810981 kernel: CPU topo: Num. 
threads per package: 4 Sep 9 05:36:50.810989 kernel: CPU topo: Allowing 4 present CPUs plus 0 hotplug CPUs Sep 9 05:36:50.810997 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write() Sep 9 05:36:50.811004 kernel: kvm-guest: KVM setup pv remote TLB flush Sep 9 05:36:50.811012 kernel: kvm-guest: setup PV sched yield Sep 9 05:36:50.811022 kernel: [mem 0x9d000000-0xdfffffff] available for PCI devices Sep 9 05:36:50.811030 kernel: Booting paravirtualized kernel on KVM Sep 9 05:36:50.811037 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Sep 9 05:36:50.811045 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:4 nr_cpu_ids:4 nr_node_ids:1 Sep 9 05:36:50.811053 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u524288 Sep 9 05:36:50.811061 kernel: pcpu-alloc: s207832 r8192 d29736 u524288 alloc=1*2097152 Sep 9 05:36:50.811069 kernel: pcpu-alloc: [0] 0 1 2 3 Sep 9 05:36:50.811076 kernel: kvm-guest: PV spinlocks enabled Sep 9 05:36:50.811084 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear) Sep 9 05:36:50.811095 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=107bc9be805328e5e30844239fa87d36579f371e3de2c34fec43f6ff6d17b104 Sep 9 05:36:50.811103 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Sep 9 05:36:50.811111 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Sep 9 05:36:50.811119 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Sep 9 05:36:50.811126 kernel: Fallback order for Node 0: 0 Sep 9 05:36:50.811134 kernel: Built 1 zonelists, mobility grouping on. Total pages: 641450 Sep 9 05:36:50.811142 kernel: Policy zone: DMA32 Sep 9 05:36:50.811150 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Sep 9 05:36:50.811159 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1 Sep 9 05:36:50.811167 kernel: ftrace: allocating 40102 entries in 157 pages Sep 9 05:36:50.811175 kernel: ftrace: allocated 157 pages with 5 groups Sep 9 05:36:50.811182 kernel: Dynamic Preempt: voluntary Sep 9 05:36:50.811190 kernel: rcu: Preemptible hierarchical RCU implementation. Sep 9 05:36:50.811198 kernel: rcu: RCU event tracing is enabled. Sep 9 05:36:50.811206 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4. Sep 9 05:36:50.811214 kernel: Trampoline variant of Tasks RCU enabled. Sep 9 05:36:50.811222 kernel: Rude variant of Tasks RCU enabled. Sep 9 05:36:50.811230 kernel: Tracing variant of Tasks RCU enabled. Sep 9 05:36:50.811239 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Sep 9 05:36:50.811247 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4 Sep 9 05:36:50.811255 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. Sep 9 05:36:50.811263 kernel: RCU Tasks Rude: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. Sep 9 05:36:50.811271 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. 
Sep 9 05:36:50.811279 kernel: NR_IRQS: 33024, nr_irqs: 456, preallocated irqs: 16 Sep 9 05:36:50.811287 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Sep 9 05:36:50.811295 kernel: Console: colour dummy device 80x25 Sep 9 05:36:50.811302 kernel: printk: legacy console [ttyS0] enabled Sep 9 05:36:50.811312 kernel: ACPI: Core revision 20240827 Sep 9 05:36:50.811320 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns Sep 9 05:36:50.811328 kernel: APIC: Switch to symmetric I/O mode setup Sep 9 05:36:50.811335 kernel: x2apic enabled Sep 9 05:36:50.811343 kernel: APIC: Switched APIC routing to: physical x2apic Sep 9 05:36:50.811351 kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask() Sep 9 05:36:50.811359 kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself() Sep 9 05:36:50.811366 kernel: kvm-guest: setup PV IPIs Sep 9 05:36:50.811374 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 Sep 9 05:36:50.811384 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x2848e100549, max_idle_ns: 440795215505 ns Sep 9 05:36:50.811396 kernel: Calibrating delay loop (skipped) preset value.. 5589.50 BogoMIPS (lpj=2794750) Sep 9 05:36:50.811404 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated Sep 9 05:36:50.811411 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127 Sep 9 05:36:50.811419 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0 Sep 9 05:36:50.811427 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Sep 9 05:36:50.811435 kernel: Spectre V2 : Mitigation: Retpolines Sep 9 05:36:50.811442 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT Sep 9 05:36:50.811452 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls Sep 9 05:36:50.811460 kernel: active return thunk: retbleed_return_thunk Sep 9 05:36:50.811468 kernel: RETBleed: Mitigation: untrained return thunk Sep 9 05:36:50.811476 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Sep 9 05:36:50.811483 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Sep 9 05:36:50.811491 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied! Sep 9 05:36:50.811500 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options. Sep 9 05:36:50.811508 kernel: active return thunk: srso_return_thunk Sep 9 05:36:50.811516 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode Sep 9 05:36:50.811527 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Sep 9 05:36:50.811536 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Sep 9 05:36:50.811544 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Sep 9 05:36:50.811554 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Sep 9 05:36:50.811561 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format. Sep 9 05:36:50.811569 kernel: Freeing SMP alternatives memory: 32K Sep 9 05:36:50.811577 kernel: pid_max: default: 32768 minimum: 301 Sep 9 05:36:50.811585 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Sep 9 05:36:50.811592 kernel: landlock: Up and running. Sep 9 05:36:50.811602 kernel: SELinux: Initializing. 
Sep 9 05:36:50.811610 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Sep 9 05:36:50.811617 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Sep 9 05:36:50.811625 kernel: smpboot: CPU0: AMD EPYC 7402P 24-Core Processor (family: 0x17, model: 0x31, stepping: 0x0) Sep 9 05:36:50.811633 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver. Sep 9 05:36:50.811641 kernel: ... version: 0 Sep 9 05:36:50.811648 kernel: ... bit width: 48 Sep 9 05:36:50.811656 kernel: ... generic registers: 6 Sep 9 05:36:50.811664 kernel: ... value mask: 0000ffffffffffff Sep 9 05:36:50.811673 kernel: ... max period: 00007fffffffffff Sep 9 05:36:50.811745 kernel: ... fixed-purpose events: 0 Sep 9 05:36:50.811754 kernel: ... event mask: 000000000000003f Sep 9 05:36:50.811772 kernel: signal: max sigframe size: 1776 Sep 9 05:36:50.811779 kernel: rcu: Hierarchical SRCU implementation. Sep 9 05:36:50.811787 kernel: rcu: Max phase no-delay instances is 400. Sep 9 05:36:50.811795 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level Sep 9 05:36:50.811803 kernel: smp: Bringing up secondary CPUs ... Sep 9 05:36:50.811810 kernel: smpboot: x86: Booting SMP configuration: Sep 9 05:36:50.811821 kernel: .... node #0, CPUs: #1 #2 #3 Sep 9 05:36:50.811829 kernel: smp: Brought up 1 node, 4 CPUs Sep 9 05:36:50.811836 kernel: smpboot: Total of 4 processors activated (22358.00 BogoMIPS) Sep 9 05:36:50.811845 kernel: Memory: 2422676K/2565800K available (14336K kernel code, 2428K rwdata, 9988K rodata, 54076K init, 2892K bss, 137196K reserved, 0K cma-reserved) Sep 9 05:36:50.811852 kernel: devtmpfs: initialized Sep 9 05:36:50.811860 kernel: x86/mm: Memory block size: 128MB Sep 9 05:36:50.811868 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00800000-0x00807fff] (32768 bytes) Sep 9 05:36:50.811875 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x0080b000-0x0080bfff] (4096 bytes) Sep 9 05:36:50.811883 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00811000-0x008fffff] (978944 bytes) Sep 9 05:36:50.811893 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x9cb7f000-0x9cbfefff] (524288 bytes) Sep 9 05:36:50.811901 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x9ce95000-0x9ce96fff] (8192 bytes) Sep 9 05:36:50.811909 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x9cf60000-0x9cffffff] (655360 bytes) Sep 9 05:36:50.811916 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Sep 9 05:36:50.811924 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear) Sep 9 05:36:50.811932 kernel: pinctrl core: initialized pinctrl subsystem Sep 9 05:36:50.811940 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Sep 9 05:36:50.811947 kernel: audit: initializing netlink subsys (disabled) Sep 9 05:36:50.811955 kernel: audit: type=2000 audit(1757396208.958:1): state=initialized audit_enabled=0 res=1 Sep 9 05:36:50.811965 kernel: thermal_sys: Registered thermal governor 'step_wise' Sep 9 05:36:50.811972 kernel: thermal_sys: Registered thermal governor 'user_space' Sep 9 05:36:50.811980 kernel: cpuidle: using governor menu Sep 9 05:36:50.811988 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Sep 9 05:36:50.811995 kernel: dca service started, version 1.12.1 Sep 9 05:36:50.812003 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff] Sep 9 05:36:50.812011 kernel: PCI: Using configuration type 1 for base access Sep 
9 05:36:50.812019 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. Sep 9 05:36:50.812028 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Sep 9 05:36:50.812036 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Sep 9 05:36:50.812044 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Sep 9 05:36:50.812052 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Sep 9 05:36:50.812059 kernel: ACPI: Added _OSI(Module Device) Sep 9 05:36:50.812067 kernel: ACPI: Added _OSI(Processor Device) Sep 9 05:36:50.812075 kernel: ACPI: Added _OSI(Processor Aggregator Device) Sep 9 05:36:50.812082 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Sep 9 05:36:50.812096 kernel: ACPI: Interpreter enabled Sep 9 05:36:50.812103 kernel: ACPI: PM: (supports S0 S3 S5) Sep 9 05:36:50.812113 kernel: ACPI: Using IOAPIC for interrupt routing Sep 9 05:36:50.812121 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Sep 9 05:36:50.812129 kernel: PCI: Using E820 reservations for host bridge windows Sep 9 05:36:50.812136 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F Sep 9 05:36:50.812144 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Sep 9 05:36:50.812317 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Sep 9 05:36:50.812437 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR] Sep 9 05:36:50.812554 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability] Sep 9 05:36:50.812565 kernel: PCI host bridge to bus 0000:00 Sep 9 05:36:50.812697 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Sep 9 05:36:50.812835 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] Sep 9 05:36:50.812940 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Sep 9 05:36:50.813043 kernel: pci_bus 0000:00: root bus resource [mem 0x9d000000-0xdfffffff window] Sep 9 05:36:50.813146 kernel: pci_bus 0000:00: root bus resource [mem 0xf0000000-0xfebfffff window] Sep 9 05:36:50.813253 kernel: pci_bus 0000:00: root bus resource [mem 0x380000000000-0x3807ffffffff window] Sep 9 05:36:50.813357 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Sep 9 05:36:50.813490 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint Sep 9 05:36:50.813614 kernel: pci 0000:00:01.0: [1234:1111] type 00 class 0x030000 conventional PCI endpoint Sep 9 05:36:50.813745 kernel: pci 0000:00:01.0: BAR 0 [mem 0xc0000000-0xc0ffffff pref] Sep 9 05:36:50.813870 kernel: pci 0000:00:01.0: BAR 2 [mem 0xc1044000-0xc1044fff] Sep 9 05:36:50.813988 kernel: pci 0000:00:01.0: ROM [mem 0xffff0000-0xffffffff pref] Sep 9 05:36:50.814102 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Sep 9 05:36:50.814228 kernel: pci 0000:00:02.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint Sep 9 05:36:50.814344 kernel: pci 0000:00:02.0: BAR 0 [io 0x6100-0x611f] Sep 9 05:36:50.814460 kernel: pci 0000:00:02.0: BAR 1 [mem 0xc1043000-0xc1043fff] Sep 9 05:36:50.814578 kernel: pci 0000:00:02.0: BAR 4 [mem 0x380000000000-0x380000003fff 64bit pref] Sep 9 05:36:50.814720 kernel: pci 0000:00:03.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint Sep 9 05:36:50.814852 kernel: pci 0000:00:03.0: BAR 0 [io 0x6000-0x607f] Sep 9 05:36:50.814967 kernel: pci 0000:00:03.0: BAR 1 [mem 0xc1042000-0xc1042fff] 
Sep 9 05:36:50.815080 kernel: pci 0000:00:03.0: BAR 4 [mem 0x380000004000-0x380000007fff 64bit pref] Sep 9 05:36:50.815202 kernel: pci 0000:00:04.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint Sep 9 05:36:50.815319 kernel: pci 0000:00:04.0: BAR 0 [io 0x60e0-0x60ff] Sep 9 05:36:50.815432 kernel: pci 0000:00:04.0: BAR 1 [mem 0xc1041000-0xc1041fff] Sep 9 05:36:50.815545 kernel: pci 0000:00:04.0: BAR 4 [mem 0x380000008000-0x38000000bfff 64bit pref] Sep 9 05:36:50.815662 kernel: pci 0000:00:04.0: ROM [mem 0xfffc0000-0xffffffff pref] Sep 9 05:36:50.815840 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint Sep 9 05:36:50.815956 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO Sep 9 05:36:50.816084 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint Sep 9 05:36:50.816197 kernel: pci 0000:00:1f.2: BAR 4 [io 0x60c0-0x60df] Sep 9 05:36:50.816311 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xc1040000-0xc1040fff] Sep 9 05:36:50.816434 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint Sep 9 05:36:50.816554 kernel: pci 0000:00:1f.3: BAR 4 [io 0x6080-0x60bf] Sep 9 05:36:50.816565 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 Sep 9 05:36:50.816573 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 Sep 9 05:36:50.816581 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Sep 9 05:36:50.816589 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 Sep 9 05:36:50.816597 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10 Sep 9 05:36:50.816605 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10 Sep 9 05:36:50.816612 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11 Sep 9 05:36:50.816623 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11 Sep 9 05:36:50.816631 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16 Sep 9 05:36:50.816638 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17 Sep 9 05:36:50.816646 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18 Sep 9 05:36:50.816654 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19 Sep 9 05:36:50.816662 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20 Sep 9 05:36:50.816670 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21 Sep 9 05:36:50.816678 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22 Sep 9 05:36:50.816698 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23 Sep 9 05:36:50.816708 kernel: iommu: Default domain type: Translated Sep 9 05:36:50.816716 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Sep 9 05:36:50.816724 kernel: efivars: Registered efivars operations Sep 9 05:36:50.816732 kernel: PCI: Using ACPI for IRQ routing Sep 9 05:36:50.816740 kernel: PCI: pci_cache_line_size set to 64 bytes Sep 9 05:36:50.816747 kernel: e820: reserve RAM buffer [mem 0x0080b000-0x008fffff] Sep 9 05:36:50.816755 kernel: e820: reserve RAM buffer [mem 0x00811000-0x008fffff] Sep 9 05:36:50.816770 kernel: e820: reserve RAM buffer [mem 0x9b2e3018-0x9bffffff] Sep 9 05:36:50.816778 kernel: e820: reserve RAM buffer [mem 0x9b320018-0x9bffffff] Sep 9 05:36:50.816788 kernel: e820: reserve RAM buffer [mem 0x9bd3f000-0x9bffffff] Sep 9 05:36:50.816795 kernel: e820: reserve RAM buffer [mem 0x9c8ed000-0x9fffffff] Sep 9 05:36:50.816803 kernel: e820: reserve RAM buffer [mem 0x9ce91000-0x9fffffff] Sep 9 05:36:50.816811 kernel: e820: reserve RAM buffer [mem 
0x9cedc000-0x9fffffff] Sep 9 05:36:50.816928 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device Sep 9 05:36:50.817041 kernel: pci 0000:00:01.0: vgaarb: bridge control possible Sep 9 05:36:50.817154 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Sep 9 05:36:50.817164 kernel: vgaarb: loaded Sep 9 05:36:50.817175 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0 Sep 9 05:36:50.817183 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter Sep 9 05:36:50.817191 kernel: clocksource: Switched to clocksource kvm-clock Sep 9 05:36:50.817198 kernel: VFS: Disk quotas dquot_6.6.0 Sep 9 05:36:50.817206 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Sep 9 05:36:50.817214 kernel: pnp: PnP ACPI init Sep 9 05:36:50.817350 kernel: system 00:05: [mem 0xe0000000-0xefffffff window] has been reserved Sep 9 05:36:50.817365 kernel: pnp: PnP ACPI: found 6 devices Sep 9 05:36:50.817375 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Sep 9 05:36:50.817383 kernel: NET: Registered PF_INET protocol family Sep 9 05:36:50.817391 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Sep 9 05:36:50.817400 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Sep 9 05:36:50.817408 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Sep 9 05:36:50.817416 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Sep 9 05:36:50.817424 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Sep 9 05:36:50.817432 kernel: TCP: Hash tables configured (established 32768 bind 32768) Sep 9 05:36:50.817440 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Sep 9 05:36:50.817450 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Sep 9 05:36:50.817458 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Sep 9 05:36:50.817466 kernel: NET: Registered PF_XDP protocol family Sep 9 05:36:50.817587 kernel: pci 0000:00:04.0: ROM [mem 0xfffc0000-0xffffffff pref]: can't claim; no compatible bridge window Sep 9 05:36:50.817717 kernel: pci 0000:00:04.0: ROM [mem 0x9d000000-0x9d03ffff pref]: assigned Sep 9 05:36:50.817835 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Sep 9 05:36:50.817942 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Sep 9 05:36:50.818047 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Sep 9 05:36:50.818157 kernel: pci_bus 0000:00: resource 7 [mem 0x9d000000-0xdfffffff window] Sep 9 05:36:50.818265 kernel: pci_bus 0000:00: resource 8 [mem 0xf0000000-0xfebfffff window] Sep 9 05:36:50.818370 kernel: pci_bus 0000:00: resource 9 [mem 0x380000000000-0x3807ffffffff window] Sep 9 05:36:50.818380 kernel: PCI: CLS 0 bytes, default 64 Sep 9 05:36:50.818389 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x2848e100549, max_idle_ns: 440795215505 ns Sep 9 05:36:50.818397 kernel: Initialise system trusted keyrings Sep 9 05:36:50.818408 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Sep 9 05:36:50.818416 kernel: Key type asymmetric registered Sep 9 05:36:50.818424 kernel: Asymmetric key parser 'x509' registered Sep 9 05:36:50.818432 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Sep 9 05:36:50.818441 kernel: io scheduler mq-deadline registered Sep 9 05:36:50.818449 kernel: io scheduler kyber registered Sep 9 05:36:50.818457 
kernel: io scheduler bfq registered Sep 9 05:36:50.818465 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Sep 9 05:36:50.818476 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Sep 9 05:36:50.818484 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Sep 9 05:36:50.818492 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20 Sep 9 05:36:50.818500 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Sep 9 05:36:50.818508 kernel: 00:03: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Sep 9 05:36:50.818517 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Sep 9 05:36:50.818525 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Sep 9 05:36:50.818533 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Sep 9 05:36:50.818544 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Sep 9 05:36:50.818668 kernel: rtc_cmos 00:04: RTC can wake from S4 Sep 9 05:36:50.818803 kernel: rtc_cmos 00:04: registered as rtc0 Sep 9 05:36:50.818912 kernel: rtc_cmos 00:04: setting system clock to 2025-09-09T05:36:50 UTC (1757396210) Sep 9 05:36:50.819018 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram Sep 9 05:36:50.819029 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled Sep 9 05:36:50.819037 kernel: efifb: probing for efifb Sep 9 05:36:50.819045 kernel: efifb: framebuffer at 0xc0000000, using 4000k, total 4000k Sep 9 05:36:50.819054 kernel: efifb: mode is 1280x800x32, linelength=5120, pages=1 Sep 9 05:36:50.819065 kernel: efifb: scrolling: redraw Sep 9 05:36:50.819073 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 Sep 9 05:36:50.819081 kernel: Console: switching to colour frame buffer device 160x50 Sep 9 05:36:50.819089 kernel: fb0: EFI VGA frame buffer device Sep 9 05:36:50.819097 kernel: pstore: Using crash dump compression: deflate Sep 9 05:36:50.819106 kernel: pstore: Registered efi_pstore as persistent store backend Sep 9 05:36:50.819114 kernel: NET: Registered PF_INET6 protocol family Sep 9 05:36:50.819122 kernel: Segment Routing with IPv6 Sep 9 05:36:50.819132 kernel: In-situ OAM (IOAM) with IPv6 Sep 9 05:36:50.819142 kernel: NET: Registered PF_PACKET protocol family Sep 9 05:36:50.819150 kernel: Key type dns_resolver registered Sep 9 05:36:50.819158 kernel: IPI shorthand broadcast: enabled Sep 9 05:36:50.819166 kernel: sched_clock: Marking stable (2717005159, 151116699)->(2882379851, -14257993) Sep 9 05:36:50.819175 kernel: registered taskstats version 1 Sep 9 05:36:50.819183 kernel: Loading compiled-in X.509 certificates Sep 9 05:36:50.819191 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.45-flatcar: 884b9ad6a330f59ae6e6488b20a5491e41ff24a3' Sep 9 05:36:50.819199 kernel: Demotion targets for Node 0: null Sep 9 05:36:50.819207 kernel: Key type .fscrypt registered Sep 9 05:36:50.819217 kernel: Key type fscrypt-provisioning registered Sep 9 05:36:50.819225 kernel: ima: No TPM chip found, activating TPM-bypass! Sep 9 05:36:50.819233 kernel: ima: Allocated hash algorithm: sha1 Sep 9 05:36:50.819241 kernel: ima: No architecture policies found Sep 9 05:36:50.819249 kernel: clk: Disabling unused clocks Sep 9 05:36:50.819257 kernel: Warning: unable to open an initial console. 
Sep 9 05:36:50.819265 kernel: Freeing unused kernel image (initmem) memory: 54076K Sep 9 05:36:50.819274 kernel: Write protecting the kernel read-only data: 24576k Sep 9 05:36:50.819282 kernel: Freeing unused kernel image (rodata/data gap) memory: 252K Sep 9 05:36:50.819292 kernel: Run /init as init process Sep 9 05:36:50.819300 kernel: with arguments: Sep 9 05:36:50.819308 kernel: /init Sep 9 05:36:50.819316 kernel: with environment: Sep 9 05:36:50.819324 kernel: HOME=/ Sep 9 05:36:50.819332 kernel: TERM=linux Sep 9 05:36:50.819340 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Sep 9 05:36:50.819349 systemd[1]: Successfully made /usr/ read-only. Sep 9 05:36:50.819363 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 9 05:36:50.819372 systemd[1]: Detected virtualization kvm. Sep 9 05:36:50.819380 systemd[1]: Detected architecture x86-64. Sep 9 05:36:50.819388 systemd[1]: Running in initrd. Sep 9 05:36:50.819397 systemd[1]: No hostname configured, using default hostname. Sep 9 05:36:50.819406 systemd[1]: Hostname set to . Sep 9 05:36:50.819414 systemd[1]: Initializing machine ID from VM UUID. Sep 9 05:36:50.819422 systemd[1]: Queued start job for default target initrd.target. Sep 9 05:36:50.819433 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 9 05:36:50.819442 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 9 05:36:50.819451 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Sep 9 05:36:50.819460 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 9 05:36:50.819469 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Sep 9 05:36:50.819478 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Sep 9 05:36:50.819490 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Sep 9 05:36:50.819499 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Sep 9 05:36:50.819507 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 9 05:36:50.819516 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 9 05:36:50.819525 systemd[1]: Reached target paths.target - Path Units. Sep 9 05:36:50.819533 systemd[1]: Reached target slices.target - Slice Units. Sep 9 05:36:50.819543 systemd[1]: Reached target swap.target - Swaps. Sep 9 05:36:50.819553 systemd[1]: Reached target timers.target - Timer Units. Sep 9 05:36:50.819562 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Sep 9 05:36:50.819574 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 9 05:36:50.819583 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Sep 9 05:36:50.819591 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Sep 9 05:36:50.819600 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. 
Sep 9 05:36:50.819609 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 9 05:36:50.819617 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 9 05:36:50.819626 systemd[1]: Reached target sockets.target - Socket Units. Sep 9 05:36:50.819634 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Sep 9 05:36:50.819643 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 9 05:36:50.819654 systemd[1]: Finished network-cleanup.service - Network Cleanup. Sep 9 05:36:50.819663 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Sep 9 05:36:50.819671 systemd[1]: Starting systemd-fsck-usr.service... Sep 9 05:36:50.819680 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 9 05:36:50.819704 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 9 05:36:50.819713 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 05:36:50.819721 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 9 05:36:50.819733 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Sep 9 05:36:50.819742 systemd[1]: Finished systemd-fsck-usr.service. Sep 9 05:36:50.819751 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 9 05:36:50.819789 systemd-journald[218]: Collecting audit messages is disabled. Sep 9 05:36:50.819812 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 05:36:50.819821 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 9 05:36:50.819830 systemd-journald[218]: Journal started Sep 9 05:36:50.819851 systemd-journald[218]: Runtime Journal (/run/log/journal/6c487885abc542cf8778de5c4c49faf1) is 6M, max 48.4M, 42.4M free. Sep 9 05:36:50.807371 systemd-modules-load[221]: Inserted module 'overlay' Sep 9 05:36:50.823708 systemd[1]: Started systemd-journald.service - Journal Service. Sep 9 05:36:50.833707 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Sep 9 05:36:50.835720 kernel: Bridge firewalling registered Sep 9 05:36:50.835706 systemd-modules-load[221]: Inserted module 'br_netfilter' Sep 9 05:36:50.836851 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 9 05:36:50.837156 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 9 05:36:50.837605 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 9 05:36:50.844501 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 9 05:36:50.847315 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 9 05:36:50.849646 systemd-tmpfiles[241]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Sep 9 05:36:50.856743 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 9 05:36:50.857246 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 9 05:36:50.860714 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. 
Sep 9 05:36:50.863039 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 9 05:36:50.866375 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Sep 9 05:36:50.867727 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 9 05:36:50.895423 dracut-cmdline[262]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=107bc9be805328e5e30844239fa87d36579f371e3de2c34fec43f6ff6d17b104 Sep 9 05:36:50.913988 systemd-resolved[263]: Positive Trust Anchors: Sep 9 05:36:50.914004 systemd-resolved[263]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 9 05:36:50.914034 systemd-resolved[263]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 9 05:36:50.916388 systemd-resolved[263]: Defaulting to hostname 'linux'. Sep 9 05:36:50.917438 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 9 05:36:50.924363 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 9 05:36:51.011727 kernel: SCSI subsystem initialized Sep 9 05:36:51.023712 kernel: Loading iSCSI transport class v2.0-870. Sep 9 05:36:51.034716 kernel: iscsi: registered transport (tcp) Sep 9 05:36:51.055097 kernel: iscsi: registered transport (qla4xxx) Sep 9 05:36:51.055125 kernel: QLogic iSCSI HBA Driver Sep 9 05:36:51.075904 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 9 05:36:51.110822 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 9 05:36:51.112395 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 9 05:36:51.171330 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Sep 9 05:36:51.173890 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Sep 9 05:36:51.243712 kernel: raid6: avx2x4 gen() 23525 MB/s Sep 9 05:36:51.260708 kernel: raid6: avx2x2 gen() 30839 MB/s Sep 9 05:36:51.277761 kernel: raid6: avx2x1 gen() 25993 MB/s Sep 9 05:36:51.277787 kernel: raid6: using algorithm avx2x2 gen() 30839 MB/s Sep 9 05:36:51.295769 kernel: raid6: .... xor() 20000 MB/s, rmw enabled Sep 9 05:36:51.295787 kernel: raid6: using avx2x2 recovery algorithm Sep 9 05:36:51.315718 kernel: xor: automatically using best checksumming function avx Sep 9 05:36:51.475721 kernel: Btrfs loaded, zoned=no, fsverity=no Sep 9 05:36:51.484677 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Sep 9 05:36:51.487350 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 9 05:36:51.514494 systemd-udevd[473]: Using default interface naming scheme 'v255'. 
Sep 9 05:36:51.519954 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 9 05:36:51.522184 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Sep 9 05:36:51.549010 dracut-pre-trigger[475]: rd.md=0: removing MD RAID activation Sep 9 05:36:51.582002 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Sep 9 05:36:51.584511 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 9 05:36:51.657145 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 9 05:36:51.661048 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Sep 9 05:36:51.695721 kernel: cryptd: max_cpu_qlen set to 1000 Sep 9 05:36:51.705710 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2 Sep 9 05:36:51.712718 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues Sep 9 05:36:51.718132 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB) Sep 9 05:36:51.720714 kernel: AES CTR mode by8 optimization enabled Sep 9 05:36:51.722141 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 9 05:36:51.722354 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 05:36:51.725649 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 05:36:51.731839 kernel: libata version 3.00 loaded. Sep 9 05:36:51.735234 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 05:36:51.742071 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Sep 9 05:36:51.742103 kernel: GPT:9289727 != 19775487 Sep 9 05:36:51.742115 kernel: GPT:Alternate GPT header not at the end of the disk. Sep 9 05:36:51.742769 kernel: GPT:9289727 != 19775487 Sep 9 05:36:51.742810 kernel: GPT: Use GNU Parted to correct GPT errors. Sep 9 05:36:51.742821 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 9 05:36:51.743701 kernel: ahci 0000:00:1f.2: version 3.0 Sep 9 05:36:51.743892 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Sep 9 05:36:51.748270 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Sep 9 05:36:51.748447 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Sep 9 05:36:51.748586 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Sep 9 05:36:51.751715 kernel: scsi host0: ahci Sep 9 05:36:51.753807 kernel: scsi host1: ahci Sep 9 05:36:51.756191 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 9 05:36:51.761903 kernel: scsi host2: ahci Sep 9 05:36:51.756325 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 05:36:51.760840 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Sep 9 05:36:51.771702 kernel: scsi host3: ahci Sep 9 05:36:51.771885 kernel: scsi host4: ahci Sep 9 05:36:51.772035 kernel: scsi host5: ahci Sep 9 05:36:51.772175 kernel: ata1: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040100 irq 34 lpm-pol 1 Sep 9 05:36:51.772187 kernel: ata2: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040180 irq 34 lpm-pol 1 Sep 9 05:36:51.773242 kernel: ata3: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040200 irq 34 lpm-pol 1 Sep 9 05:36:51.773262 kernel: ata4: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040280 irq 34 lpm-pol 1 Sep 9 05:36:51.774992 kernel: ata5: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040300 irq 34 lpm-pol 1 Sep 9 05:36:51.775010 kernel: ata6: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040380 irq 34 lpm-pol 1 Sep 9 05:36:51.791258 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Sep 9 05:36:51.793078 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 05:36:51.802948 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Sep 9 05:36:51.811770 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Sep 9 05:36:51.819572 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Sep 9 05:36:51.819642 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Sep 9 05:36:51.825079 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 9 05:36:51.854775 disk-uuid[635]: Primary Header is updated. Sep 9 05:36:51.854775 disk-uuid[635]: Secondary Entries is updated. Sep 9 05:36:51.854775 disk-uuid[635]: Secondary Header is updated. Sep 9 05:36:51.858716 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 9 05:36:51.863713 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 9 05:36:52.087717 kernel: ata6: SATA link down (SStatus 0 SControl 300) Sep 9 05:36:52.087800 kernel: ata5: SATA link down (SStatus 0 SControl 300) Sep 9 05:36:52.087811 kernel: ata4: SATA link down (SStatus 0 SControl 300) Sep 9 05:36:52.088714 kernel: ata2: SATA link down (SStatus 0 SControl 300) Sep 9 05:36:52.089724 kernel: ata1: SATA link down (SStatus 0 SControl 300) Sep 9 05:36:52.090723 kernel: ata3: SATA link up 1.5 Gbps (SStatus 113 SControl 300) Sep 9 05:36:52.090755 kernel: ata3.00: LPM support broken, forcing max_power Sep 9 05:36:52.091944 kernel: ata3.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 Sep 9 05:36:52.091959 kernel: ata3.00: applying bridge limits Sep 9 05:36:52.093067 kernel: ata3.00: LPM support broken, forcing max_power Sep 9 05:36:52.093083 kernel: ata3.00: configured for UDMA/100 Sep 9 05:36:52.095714 kernel: scsi 2:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 Sep 9 05:36:52.151713 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray Sep 9 05:36:52.151938 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Sep 9 05:36:52.170710 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Sep 9 05:36:52.549613 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 9 05:36:52.550510 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 9 05:36:52.552891 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 9 05:36:52.555208 systemd[1]: Reached target remote-fs.target - Remote File Systems. 
Sep 9 05:36:52.556243 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 9 05:36:52.589730 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 9 05:36:52.864738 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 9 05:36:52.864944 disk-uuid[636]: The operation has completed successfully. Sep 9 05:36:52.890851 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 9 05:36:52.890986 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 9 05:36:52.928112 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 9 05:36:52.948884 sh[664]: Success Sep 9 05:36:52.965930 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Sep 9 05:36:52.965965 kernel: device-mapper: uevent: version 1.0.3 Sep 9 05:36:52.966967 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 9 05:36:52.975718 kernel: device-mapper: verity: sha256 using shash "sha256-ni" Sep 9 05:36:53.002230 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 9 05:36:53.006083 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 9 05:36:53.021240 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Sep 9 05:36:53.027713 kernel: BTRFS: device fsid 9ca60a92-6b53-4529-adc0-1f4392d2ad56 devid 1 transid 37 /dev/mapper/usr (253:0) scanned by mount (676) Sep 9 05:36:53.029716 kernel: BTRFS info (device dm-0): first mount of filesystem 9ca60a92-6b53-4529-adc0-1f4392d2ad56 Sep 9 05:36:53.029737 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Sep 9 05:36:53.034064 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 9 05:36:53.034081 kernel: BTRFS info (device dm-0): enabling free space tree Sep 9 05:36:53.035221 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 9 05:36:53.037239 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Sep 9 05:36:53.039382 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 9 05:36:53.041839 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 9 05:36:53.044296 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Sep 9 05:36:53.073387 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (709) Sep 9 05:36:53.073451 kernel: BTRFS info (device vda6): first mount of filesystem d4e5a7a8-c50a-463e-827d-ca249a0b8b8b Sep 9 05:36:53.073465 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 9 05:36:53.076936 kernel: BTRFS info (device vda6): turning on async discard Sep 9 05:36:53.077009 kernel: BTRFS info (device vda6): enabling free space tree Sep 9 05:36:53.081716 kernel: BTRFS info (device vda6): last unmount of filesystem d4e5a7a8-c50a-463e-827d-ca249a0b8b8b Sep 9 05:36:53.082085 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 9 05:36:53.084311 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... 
Sep 9 05:36:53.166723 ignition[748]: Ignition 2.22.0 Sep 9 05:36:53.166738 ignition[748]: Stage: fetch-offline Sep 9 05:36:53.166775 ignition[748]: no configs at "/usr/lib/ignition/base.d" Sep 9 05:36:53.166784 ignition[748]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 9 05:36:53.166878 ignition[748]: parsed url from cmdline: "" Sep 9 05:36:53.166881 ignition[748]: no config URL provided Sep 9 05:36:53.166886 ignition[748]: reading system config file "/usr/lib/ignition/user.ign" Sep 9 05:36:53.166893 ignition[748]: no config at "/usr/lib/ignition/user.ign" Sep 9 05:36:53.166916 ignition[748]: op(1): [started] loading QEMU firmware config module Sep 9 05:36:53.166921 ignition[748]: op(1): executing: "modprobe" "qemu_fw_cfg" Sep 9 05:36:53.175616 ignition[748]: op(1): [finished] loading QEMU firmware config module Sep 9 05:36:53.187882 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 9 05:36:53.190353 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 9 05:36:53.216664 ignition[748]: parsing config with SHA512: 799674b0406f40d31d7e9e950119b52fddf5d6fa5d1fbe5e9d625eac34dcb75fa93f51cee22d64d2719cf54e3b910944f8cd15caad4c9ec8507448b9faba39ca Sep 9 05:36:53.223263 unknown[748]: fetched base config from "system" Sep 9 05:36:53.223281 unknown[748]: fetched user config from "qemu" Sep 9 05:36:53.223644 ignition[748]: fetch-offline: fetch-offline passed Sep 9 05:36:53.223725 ignition[748]: Ignition finished successfully Sep 9 05:36:53.227100 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Sep 9 05:36:53.251429 systemd-networkd[854]: lo: Link UP Sep 9 05:36:53.251439 systemd-networkd[854]: lo: Gained carrier Sep 9 05:36:53.254153 systemd-networkd[854]: Enumeration completed Sep 9 05:36:53.254341 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 9 05:36:53.256070 systemd-networkd[854]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 9 05:36:53.256080 systemd-networkd[854]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 9 05:36:53.257274 systemd[1]: Reached target network.target - Network. Sep 9 05:36:53.257961 systemd-networkd[854]: eth0: Link UP Sep 9 05:36:53.258092 systemd-networkd[854]: eth0: Gained carrier Sep 9 05:36:53.258101 systemd-networkd[854]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 9 05:36:53.260214 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Sep 9 05:36:53.261105 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Sep 9 05:36:53.274769 systemd-networkd[854]: eth0: DHCPv4 address 10.0.0.118/16, gateway 10.0.0.1 acquired from 10.0.0.1 Sep 9 05:36:53.290791 ignition[859]: Ignition 2.22.0 Sep 9 05:36:53.290802 ignition[859]: Stage: kargs Sep 9 05:36:53.290924 ignition[859]: no configs at "/usr/lib/ignition/base.d" Sep 9 05:36:53.290933 ignition[859]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 9 05:36:53.291627 ignition[859]: kargs: kargs passed Sep 9 05:36:53.291661 ignition[859]: Ignition finished successfully Sep 9 05:36:53.295490 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 9 05:36:53.297505 systemd[1]: Starting ignition-disks.service - Ignition (disks)... 
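The fetch-offline stage above reports the SHA512 of the rendered config before handing it to the later kargs, disks, and files stages. A hedged sketch of what such a config looks like and how its digest is taken; the field names follow the published Ignition v3 spec, while the URL, link target, and unit list are placeholders echoing the files stage later in this log, not the config actually applied here:

    import hashlib
    import json

    # Illustrative Ignition v3-style config (placeholders, not this machine's config).
    config = {
        "ignition": {"version": "3.4.0"},
        "storage": {
            "files": [{
                "path": "/opt/helm-v3.13.2-linux-amd64.tar.gz",
                "contents": {"source": "https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz"},
            }],
            "links": [{
                "path": "/etc/extensions/kubernetes.raw",
                "target": "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw",
            }],
        },
        "systemd": {
            "units": [
                {"name": "prepare-helm.service", "enabled": True, "contents": "..."},
                {"name": "coreos-metadata.service", "enabled": False},
            ],
        },
    }

    raw = json.dumps(config).encode()
    print("parsing config with SHA512:", hashlib.sha512(raw).hexdigest())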
Sep 9 05:36:53.344786 ignition[868]: Ignition 2.22.0 Sep 9 05:36:53.344799 ignition[868]: Stage: disks Sep 9 05:36:53.344950 ignition[868]: no configs at "/usr/lib/ignition/base.d" Sep 9 05:36:53.344960 ignition[868]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 9 05:36:53.345775 ignition[868]: disks: disks passed Sep 9 05:36:53.345821 ignition[868]: Ignition finished successfully Sep 9 05:36:53.350041 systemd[1]: Finished ignition-disks.service - Ignition (disks). Sep 9 05:36:53.352199 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Sep 9 05:36:53.352281 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 9 05:36:53.354275 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 9 05:36:53.354593 systemd[1]: Reached target sysinit.target - System Initialization. Sep 9 05:36:53.355073 systemd[1]: Reached target basic.target - Basic System. Sep 9 05:36:53.361486 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Sep 9 05:36:53.382834 systemd-resolved[263]: Detected conflict on linux IN A 10.0.0.118 Sep 9 05:36:53.382847 systemd-resolved[263]: Hostname conflict, changing published hostname from 'linux' to 'linux10'. Sep 9 05:36:53.384957 systemd-fsck[878]: ROOT: clean, 15/553520 files, 52789/553472 blocks Sep 9 05:36:53.393093 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 9 05:36:53.395519 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 9 05:36:53.497716 kernel: EXT4-fs (vda9): mounted filesystem d2d7815e-fa16-4396-ab9d-ac540c1d8856 r/w with ordered data mode. Quota mode: none. Sep 9 05:36:53.497928 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 9 05:36:53.499940 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 9 05:36:53.503061 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 9 05:36:53.505397 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 9 05:36:53.507232 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Sep 9 05:36:53.507272 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 9 05:36:53.508889 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 9 05:36:53.518358 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Sep 9 05:36:53.521834 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Sep 9 05:36:53.522942 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (886) Sep 9 05:36:53.525477 kernel: BTRFS info (device vda6): first mount of filesystem d4e5a7a8-c50a-463e-827d-ca249a0b8b8b Sep 9 05:36:53.525502 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 9 05:36:53.529713 kernel: BTRFS info (device vda6): turning on async discard Sep 9 05:36:53.529747 kernel: BTRFS info (device vda6): enabling free space tree Sep 9 05:36:53.531787 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
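The fsck and sysroot mount above address the root filesystem by label rather than by device node. The by-label names are udev-maintained symlinks, so resolving one yields the backing device (vda9 for ROOT in this boot). A small sketch, assuming the udev symlink exists:

    import os

    def device_for_label(label: str) -> str:
        # /dev/disk/by-label/* are symlinks created by udev rules;
        # realpath follows them to the actual block device node.
        return os.path.realpath(os.path.join("/dev/disk/by-label", label))

    print(device_for_label("ROOT"))   # e.g. /dev/vda9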
Sep 9 05:36:53.561607 initrd-setup-root[910]: cut: /sysroot/etc/passwd: No such file or directory Sep 9 05:36:53.566184 initrd-setup-root[917]: cut: /sysroot/etc/group: No such file or directory Sep 9 05:36:53.570137 initrd-setup-root[924]: cut: /sysroot/etc/shadow: No such file or directory Sep 9 05:36:53.573977 initrd-setup-root[931]: cut: /sysroot/etc/gshadow: No such file or directory Sep 9 05:36:53.660053 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 9 05:36:53.662169 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Sep 9 05:36:53.663773 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 9 05:36:53.687714 kernel: BTRFS info (device vda6): last unmount of filesystem d4e5a7a8-c50a-463e-827d-ca249a0b8b8b Sep 9 05:36:53.698869 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Sep 9 05:36:53.718322 ignition[999]: INFO : Ignition 2.22.0 Sep 9 05:36:53.718322 ignition[999]: INFO : Stage: mount Sep 9 05:36:53.719973 ignition[999]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 9 05:36:53.719973 ignition[999]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 9 05:36:53.719973 ignition[999]: INFO : mount: mount passed Sep 9 05:36:53.719973 ignition[999]: INFO : Ignition finished successfully Sep 9 05:36:53.726012 systemd[1]: Finished ignition-mount.service - Ignition (mount). Sep 9 05:36:53.728879 systemd[1]: Starting ignition-files.service - Ignition (files)... Sep 9 05:36:54.028143 systemd[1]: sysroot-oem.mount: Deactivated successfully. Sep 9 05:36:54.030080 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 9 05:36:54.056140 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1013) Sep 9 05:36:54.056169 kernel: BTRFS info (device vda6): first mount of filesystem d4e5a7a8-c50a-463e-827d-ca249a0b8b8b Sep 9 05:36:54.056180 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 9 05:36:54.059763 kernel: BTRFS info (device vda6): turning on async discard Sep 9 05:36:54.059783 kernel: BTRFS info (device vda6): enabling free space tree Sep 9 05:36:54.061628 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Sep 9 05:36:54.102799 ignition[1030]: INFO : Ignition 2.22.0 Sep 9 05:36:54.102799 ignition[1030]: INFO : Stage: files Sep 9 05:36:54.104466 ignition[1030]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 9 05:36:54.104466 ignition[1030]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 9 05:36:54.104466 ignition[1030]: DEBUG : files: compiled without relabeling support, skipping Sep 9 05:36:54.108180 ignition[1030]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Sep 9 05:36:54.108180 ignition[1030]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Sep 9 05:36:54.112269 ignition[1030]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Sep 9 05:36:54.113722 ignition[1030]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Sep 9 05:36:54.115347 unknown[1030]: wrote ssh authorized keys file for user: core Sep 9 05:36:54.116435 ignition[1030]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Sep 9 05:36:54.117817 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Sep 9 05:36:54.120057 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 Sep 9 05:36:54.160878 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Sep 9 05:36:54.313448 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Sep 9 05:36:54.313448 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Sep 9 05:36:54.317743 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Sep 9 05:36:54.317743 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Sep 9 05:36:54.317743 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Sep 9 05:36:54.317743 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 9 05:36:54.317743 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 9 05:36:54.317743 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 9 05:36:54.317743 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 9 05:36:54.331850 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Sep 9 05:36:54.331850 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Sep 9 05:36:54.331850 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Sep 9 05:36:54.331850 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link 
"/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Sep 9 05:36:54.331850 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Sep 9 05:36:54.331850 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-x86-64.raw: attempt #1 Sep 9 05:36:54.774317 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Sep 9 05:36:55.096816 systemd-networkd[854]: eth0: Gained IPv6LL Sep 9 05:36:55.164783 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Sep 9 05:36:55.164783 ignition[1030]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Sep 9 05:36:55.168459 ignition[1030]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 9 05:36:55.174751 ignition[1030]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 9 05:36:55.174751 ignition[1030]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Sep 9 05:36:55.174751 ignition[1030]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Sep 9 05:36:55.178920 ignition[1030]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Sep 9 05:36:55.178920 ignition[1030]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Sep 9 05:36:55.182647 ignition[1030]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Sep 9 05:36:55.182647 ignition[1030]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service" Sep 9 05:36:55.201177 ignition[1030]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service" Sep 9 05:36:55.207222 ignition[1030]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service" Sep 9 05:36:55.208786 ignition[1030]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service" Sep 9 05:36:55.208786 ignition[1030]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service" Sep 9 05:36:55.208786 ignition[1030]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service" Sep 9 05:36:55.208786 ignition[1030]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json" Sep 9 05:36:55.208786 ignition[1030]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json" Sep 9 05:36:55.208786 ignition[1030]: INFO : files: files passed Sep 9 05:36:55.208786 ignition[1030]: INFO : Ignition finished successfully Sep 9 05:36:55.216248 systemd[1]: Finished ignition-files.service - Ignition (files). Sep 9 05:36:55.219072 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Sep 9 05:36:55.222896 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... 
Sep 9 05:36:55.230903 systemd[1]: ignition-quench.service: Deactivated successfully. Sep 9 05:36:55.231115 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Sep 9 05:36:55.232944 initrd-setup-root-after-ignition[1059]: grep: /sysroot/oem/oem-release: No such file or directory Sep 9 05:36:55.237591 initrd-setup-root-after-ignition[1061]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 9 05:36:55.237591 initrd-setup-root-after-ignition[1061]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 9 05:36:55.241672 initrd-setup-root-after-ignition[1065]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 9 05:36:55.244954 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 9 05:36:55.246323 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Sep 9 05:36:55.249557 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Sep 9 05:36:55.315303 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Sep 9 05:36:55.315419 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Sep 9 05:36:55.317707 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Sep 9 05:36:55.318733 systemd[1]: Reached target initrd.target - Initrd Default Target. Sep 9 05:36:55.319081 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Sep 9 05:36:55.319845 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Sep 9 05:36:55.352157 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 9 05:36:55.356502 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Sep 9 05:36:55.380203 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Sep 9 05:36:55.382459 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 9 05:36:55.382702 systemd[1]: Stopped target timers.target - Timer Units. Sep 9 05:36:55.386471 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Sep 9 05:36:55.386643 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 9 05:36:55.389937 systemd[1]: Stopped target initrd.target - Initrd Default Target. Sep 9 05:36:55.390114 systemd[1]: Stopped target basic.target - Basic System. Sep 9 05:36:55.392005 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Sep 9 05:36:55.395550 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Sep 9 05:36:55.395758 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Sep 9 05:36:55.400101 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Sep 9 05:36:55.400281 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Sep 9 05:36:55.402384 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Sep 9 05:36:55.404290 systemd[1]: Stopped target sysinit.target - System Initialization. Sep 9 05:36:55.406548 systemd[1]: Stopped target local-fs.target - Local File Systems. Sep 9 05:36:55.410309 systemd[1]: Stopped target swap.target - Swaps. Sep 9 05:36:55.410447 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Sep 9 05:36:55.410629 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. 
Sep 9 05:36:55.414929 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Sep 9 05:36:55.416062 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 9 05:36:55.417072 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Sep 9 05:36:55.419193 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 9 05:36:55.420146 systemd[1]: dracut-initqueue.service: Deactivated successfully. Sep 9 05:36:55.420258 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Sep 9 05:36:55.422941 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Sep 9 05:36:55.423048 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Sep 9 05:36:55.423359 systemd[1]: Stopped target paths.target - Path Units. Sep 9 05:36:55.423615 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Sep 9 05:36:55.432805 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 9 05:36:55.434102 systemd[1]: Stopped target slices.target - Slice Units. Sep 9 05:36:55.436389 systemd[1]: Stopped target sockets.target - Socket Units. Sep 9 05:36:55.438128 systemd[1]: iscsid.socket: Deactivated successfully. Sep 9 05:36:55.438217 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Sep 9 05:36:55.439915 systemd[1]: iscsiuio.socket: Deactivated successfully. Sep 9 05:36:55.439994 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 9 05:36:55.440992 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Sep 9 05:36:55.441103 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 9 05:36:55.443679 systemd[1]: ignition-files.service: Deactivated successfully. Sep 9 05:36:55.443801 systemd[1]: Stopped ignition-files.service - Ignition (files). Sep 9 05:36:55.446658 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Sep 9 05:36:55.448635 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Sep 9 05:36:55.448770 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Sep 9 05:36:55.451806 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Sep 9 05:36:55.455068 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Sep 9 05:36:55.456097 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Sep 9 05:36:55.461168 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Sep 9 05:36:55.461350 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Sep 9 05:36:55.472404 systemd[1]: initrd-cleanup.service: Deactivated successfully. Sep 9 05:36:55.473746 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Sep 9 05:36:55.480063 ignition[1085]: INFO : Ignition 2.22.0 Sep 9 05:36:55.480063 ignition[1085]: INFO : Stage: umount Sep 9 05:36:55.481902 ignition[1085]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 9 05:36:55.481902 ignition[1085]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 9 05:36:55.481902 ignition[1085]: INFO : umount: umount passed Sep 9 05:36:55.481902 ignition[1085]: INFO : Ignition finished successfully Sep 9 05:36:55.485043 systemd[1]: sysroot-boot.mount: Deactivated successfully. Sep 9 05:36:55.485601 systemd[1]: ignition-mount.service: Deactivated successfully. 
Sep 9 05:36:55.485735 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Sep 9 05:36:55.487277 systemd[1]: Stopped target network.target - Network. Sep 9 05:36:55.488427 systemd[1]: ignition-disks.service: Deactivated successfully. Sep 9 05:36:55.488482 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Sep 9 05:36:55.490602 systemd[1]: ignition-kargs.service: Deactivated successfully. Sep 9 05:36:55.490654 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Sep 9 05:36:55.492911 systemd[1]: ignition-setup.service: Deactivated successfully. Sep 9 05:36:55.492964 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Sep 9 05:36:55.494136 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Sep 9 05:36:55.494223 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Sep 9 05:36:55.496427 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Sep 9 05:36:55.499819 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Sep 9 05:36:55.504937 systemd[1]: systemd-resolved.service: Deactivated successfully. Sep 9 05:36:55.505062 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Sep 9 05:36:55.508779 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Sep 9 05:36:55.509258 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Sep 9 05:36:55.509303 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 9 05:36:55.515522 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Sep 9 05:36:55.526666 systemd[1]: systemd-networkd.service: Deactivated successfully. Sep 9 05:36:55.526806 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Sep 9 05:36:55.531008 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Sep 9 05:36:55.531187 systemd[1]: Stopped target network-pre.target - Preparation for Network. Sep 9 05:36:55.532407 systemd[1]: systemd-networkd.socket: Deactivated successfully. Sep 9 05:36:55.532453 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Sep 9 05:36:55.535252 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Sep 9 05:36:55.536456 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Sep 9 05:36:55.536516 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 9 05:36:55.538349 systemd[1]: systemd-sysctl.service: Deactivated successfully. Sep 9 05:36:55.538404 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Sep 9 05:36:55.542449 systemd[1]: systemd-modules-load.service: Deactivated successfully. Sep 9 05:36:55.543870 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Sep 9 05:36:55.545120 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 9 05:36:55.548372 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Sep 9 05:36:55.567439 systemd[1]: systemd-udevd.service: Deactivated successfully. Sep 9 05:36:55.567636 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 9 05:36:55.568803 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Sep 9 05:36:55.568857 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. 
Sep 9 05:36:55.570910 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Sep 9 05:36:55.570955 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Sep 9 05:36:55.572866 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Sep 9 05:36:55.572924 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Sep 9 05:36:55.576135 systemd[1]: dracut-cmdline.service: Deactivated successfully. Sep 9 05:36:55.576195 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Sep 9 05:36:55.577034 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 9 05:36:55.577096 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 9 05:36:55.578742 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Sep 9 05:36:55.583277 systemd[1]: systemd-network-generator.service: Deactivated successfully. Sep 9 05:36:55.583339 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Sep 9 05:36:55.587466 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Sep 9 05:36:55.587522 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 9 05:36:55.590895 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 9 05:36:55.590947 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 05:36:55.618139 systemd[1]: network-cleanup.service: Deactivated successfully. Sep 9 05:36:55.618262 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Sep 9 05:36:55.619155 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Sep 9 05:36:55.619276 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Sep 9 05:36:55.653092 systemd[1]: sysroot-boot.service: Deactivated successfully. Sep 9 05:36:55.653238 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Sep 9 05:36:55.655278 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Sep 9 05:36:55.655978 systemd[1]: initrd-setup-root.service: Deactivated successfully. Sep 9 05:36:55.656036 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Sep 9 05:36:55.658760 systemd[1]: Starting initrd-switch-root.service - Switch Root... Sep 9 05:36:55.682154 systemd[1]: Switching root. Sep 9 05:36:55.727066 systemd-journald[218]: Journal stopped Sep 9 05:36:56.938162 systemd-journald[218]: Received SIGTERM from PID 1 (systemd). Sep 9 05:36:56.938225 kernel: SELinux: policy capability network_peer_controls=1 Sep 9 05:36:56.938239 kernel: SELinux: policy capability open_perms=1 Sep 9 05:36:56.938250 kernel: SELinux: policy capability extended_socket_class=1 Sep 9 05:36:56.938261 kernel: SELinux: policy capability always_check_network=0 Sep 9 05:36:56.938274 kernel: SELinux: policy capability cgroup_seclabel=1 Sep 9 05:36:56.938286 kernel: SELinux: policy capability nnp_nosuid_transition=1 Sep 9 05:36:56.938302 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Sep 9 05:36:56.938313 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Sep 9 05:36:56.938324 kernel: SELinux: policy capability userspace_initial_context=0 Sep 9 05:36:56.938340 kernel: audit: type=1403 audit(1757396216.151:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Sep 9 05:36:56.938352 systemd[1]: Successfully loaded SELinux policy in 66.043ms. Sep 9 05:36:56.938366 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 8.396ms. 
Sep 9 05:36:56.938379 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 9 05:36:56.938392 systemd[1]: Detected virtualization kvm. Sep 9 05:36:56.938406 systemd[1]: Detected architecture x86-64. Sep 9 05:36:56.938419 systemd[1]: Detected first boot. Sep 9 05:36:56.938435 systemd[1]: Initializing machine ID from VM UUID. Sep 9 05:36:56.938450 zram_generator::config[1130]: No configuration found. Sep 9 05:36:56.938472 kernel: Guest personality initialized and is inactive Sep 9 05:36:56.938484 kernel: VMCI host device registered (name=vmci, major=10, minor=125) Sep 9 05:36:56.938495 kernel: Initialized host personality Sep 9 05:36:56.938510 kernel: NET: Registered PF_VSOCK protocol family Sep 9 05:36:56.938522 systemd[1]: Populated /etc with preset unit settings. Sep 9 05:36:56.938537 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Sep 9 05:36:56.938549 systemd[1]: initrd-switch-root.service: Deactivated successfully. Sep 9 05:36:56.938561 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Sep 9 05:36:56.938574 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Sep 9 05:36:56.938586 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Sep 9 05:36:56.938599 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Sep 9 05:36:56.938620 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Sep 9 05:36:56.938632 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Sep 9 05:36:56.938645 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Sep 9 05:36:56.938661 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Sep 9 05:36:56.938673 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Sep 9 05:36:56.938699 systemd[1]: Created slice user.slice - User and Session Slice. Sep 9 05:36:56.938712 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 9 05:36:56.938725 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 9 05:36:56.938737 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Sep 9 05:36:56.938749 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Sep 9 05:36:56.938761 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Sep 9 05:36:56.938776 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 9 05:36:56.938788 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Sep 9 05:36:56.938801 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 9 05:36:56.938813 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 9 05:36:56.938825 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Sep 9 05:36:56.938836 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. 
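On first boot in a VM, systemd seeds the machine ID from the hypervisor-provided UUID ("Initializing machine ID from VM UUID" above). A simplified, assumption-laden sketch of that derivation; the real logic consults several sources, validates the value, and typically needs root to read the DMI file:

    def machine_id_from_vm_uuid(path: str = "/sys/class/dmi/id/product_uuid") -> str:
        # Assumed source file for the VM UUID; a machine ID is 32 lowercase hex chars.
        with open(path) as f:
            return f.read().strip().replace("-", "").lower()

    print(machine_id_from_vm_uuid())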
Sep 9 05:36:56.938848 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Sep 9 05:36:56.938860 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Sep 9 05:36:56.938875 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 9 05:36:56.938887 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 9 05:36:56.938900 systemd[1]: Reached target slices.target - Slice Units. Sep 9 05:36:56.938913 systemd[1]: Reached target swap.target - Swaps. Sep 9 05:36:56.938925 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Sep 9 05:36:56.938937 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Sep 9 05:36:56.938949 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Sep 9 05:36:56.938961 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 9 05:36:56.938973 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 9 05:36:56.938985 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 9 05:36:56.938998 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Sep 9 05:36:56.939011 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Sep 9 05:36:56.939023 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Sep 9 05:36:56.939035 systemd[1]: Mounting media.mount - External Media Directory... Sep 9 05:36:56.939047 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 9 05:36:56.939059 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Sep 9 05:36:56.939077 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Sep 9 05:36:56.939091 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Sep 9 05:36:56.939107 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Sep 9 05:36:56.939120 systemd[1]: Reached target machines.target - Containers. Sep 9 05:36:56.939133 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Sep 9 05:36:56.939145 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 9 05:36:56.939157 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 9 05:36:56.939169 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Sep 9 05:36:56.939182 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 9 05:36:56.939194 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 9 05:36:56.939208 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 9 05:36:56.939220 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Sep 9 05:36:56.939232 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 9 05:36:56.939245 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Sep 9 05:36:56.939256 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Sep 9 05:36:56.939269 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. 
Sep 9 05:36:56.939281 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Sep 9 05:36:56.939293 systemd[1]: Stopped systemd-fsck-usr.service. Sep 9 05:36:56.939306 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 9 05:36:56.939320 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 9 05:36:56.939332 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 9 05:36:56.939343 kernel: fuse: init (API version 7.41) Sep 9 05:36:56.939354 kernel: loop: module loaded Sep 9 05:36:56.939366 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 9 05:36:56.939378 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Sep 9 05:36:56.939391 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Sep 9 05:36:56.939405 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 9 05:36:56.939417 systemd[1]: verity-setup.service: Deactivated successfully. Sep 9 05:36:56.939433 systemd[1]: Stopped verity-setup.service. Sep 9 05:36:56.939452 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 9 05:36:56.939464 kernel: ACPI: bus type drm_connector registered Sep 9 05:36:56.939476 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Sep 9 05:36:56.939490 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Sep 9 05:36:56.939502 systemd[1]: Mounted media.mount - External Media Directory. Sep 9 05:36:56.939514 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Sep 9 05:36:56.939548 systemd-journald[1210]: Collecting audit messages is disabled. Sep 9 05:36:56.939571 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Sep 9 05:36:56.939585 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Sep 9 05:36:56.939597 systemd-journald[1210]: Journal started Sep 9 05:36:56.939628 systemd-journald[1210]: Runtime Journal (/run/log/journal/6c487885abc542cf8778de5c4c49faf1) is 6M, max 48.4M, 42.4M free. Sep 9 05:36:56.685411 systemd[1]: Queued start job for default target multi-user.target. Sep 9 05:36:56.709712 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Sep 9 05:36:56.710310 systemd[1]: systemd-journald.service: Deactivated successfully. Sep 9 05:36:56.942711 systemd[1]: Started systemd-journald.service - Journal Service. Sep 9 05:36:56.943733 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Sep 9 05:36:56.945137 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 9 05:36:56.946589 systemd[1]: modprobe@configfs.service: Deactivated successfully. Sep 9 05:36:56.946841 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Sep 9 05:36:56.948217 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 9 05:36:56.948422 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 9 05:36:56.949781 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 9 05:36:56.949986 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. 
Sep 9 05:36:56.951278 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 9 05:36:56.951477 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 9 05:36:56.953078 systemd[1]: modprobe@fuse.service: Deactivated successfully. Sep 9 05:36:56.953280 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Sep 9 05:36:56.954567 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 9 05:36:56.954790 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 9 05:36:56.956116 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 9 05:36:56.957451 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 9 05:36:56.958928 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Sep 9 05:36:56.960386 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Sep 9 05:36:56.972491 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 9 05:36:56.974874 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Sep 9 05:36:56.976841 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Sep 9 05:36:56.978018 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Sep 9 05:36:56.978045 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 9 05:36:56.979929 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Sep 9 05:36:56.986773 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Sep 9 05:36:56.987877 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 9 05:36:56.989834 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Sep 9 05:36:56.991765 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Sep 9 05:36:56.993778 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 9 05:36:56.995855 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Sep 9 05:36:56.997259 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 9 05:36:56.998525 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 9 05:36:57.001673 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Sep 9 05:36:57.004787 systemd[1]: Starting systemd-sysusers.service - Create System Users... Sep 9 05:36:57.007437 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Sep 9 05:36:57.008783 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Sep 9 05:36:57.011662 systemd-journald[1210]: Time spent on flushing to /var/log/journal/6c487885abc542cf8778de5c4c49faf1 is 22.509ms for 1071 entries. Sep 9 05:36:57.011662 systemd-journald[1210]: System Journal (/var/log/journal/6c487885abc542cf8778de5c4c49faf1) is 8M, max 195.6M, 187.6M free. Sep 9 05:36:57.039418 systemd-journald[1210]: Received client request to flush runtime journal. 
Sep 9 05:36:57.039451 kernel: loop0: detected capacity change from 0 to 110984 Sep 9 05:36:57.020527 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 9 05:36:57.028534 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Sep 9 05:36:57.030033 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Sep 9 05:36:57.032647 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Sep 9 05:36:57.034410 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 9 05:36:57.046715 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Sep 9 05:36:57.046969 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Sep 9 05:36:57.053036 systemd[1]: Finished systemd-sysusers.service - Create System Users. Sep 9 05:36:57.056228 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 9 05:36:57.065705 kernel: loop1: detected capacity change from 0 to 221472 Sep 9 05:36:57.073294 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Sep 9 05:36:57.085142 systemd-tmpfiles[1266]: ACLs are not supported, ignoring. Sep 9 05:36:57.085158 systemd-tmpfiles[1266]: ACLs are not supported, ignoring. Sep 9 05:36:57.089505 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 9 05:36:57.091717 kernel: loop2: detected capacity change from 0 to 128016 Sep 9 05:36:57.117730 kernel: loop3: detected capacity change from 0 to 110984 Sep 9 05:36:57.127704 kernel: loop4: detected capacity change from 0 to 221472 Sep 9 05:36:57.135720 kernel: loop5: detected capacity change from 0 to 128016 Sep 9 05:36:57.144358 (sd-merge)[1274]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'. Sep 9 05:36:57.144922 (sd-merge)[1274]: Merged extensions into '/usr'. Sep 9 05:36:57.150771 systemd[1]: Reload requested from client PID 1249 ('systemd-sysext') (unit systemd-sysext.service)... Sep 9 05:36:57.150787 systemd[1]: Reloading... Sep 9 05:36:57.194714 zram_generator::config[1299]: No configuration found. Sep 9 05:36:57.307733 ldconfig[1244]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Sep 9 05:36:57.400835 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Sep 9 05:36:57.401160 systemd[1]: Reloading finished in 249 ms. Sep 9 05:36:57.435894 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Sep 9 05:36:57.437396 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Sep 9 05:36:57.454277 systemd[1]: Starting ensure-sysext.service... Sep 9 05:36:57.456396 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 9 05:36:57.466669 systemd[1]: Reload requested from client PID 1337 ('systemctl') (unit ensure-sysext.service)... Sep 9 05:36:57.466783 systemd[1]: Reloading... Sep 9 05:36:57.474659 systemd-tmpfiles[1338]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Sep 9 05:36:57.474722 systemd-tmpfiles[1338]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Sep 9 05:36:57.475012 systemd-tmpfiles[1338]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. 
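The (sd-merge) lines above show systemd-sysext discovering the containerd-flatcar, docker-flatcar, and kubernetes extension images and overlaying them onto /usr. A small sketch of the discovery step, with the search-path list assumed from systemd-sysext(8):

    import os

    SEARCH_PATHS = ["/etc/extensions", "/run/extensions", "/var/lib/extensions"]  # assumed list

    def list_extensions() -> list[str]:
        found = []
        for d in SEARCH_PATHS:
            if os.path.isdir(d):
                found.extend(sorted(os.path.join(d, name) for name in os.listdir(d)))
        return found

    print(list_extensions())   # e.g. ['/etc/extensions/kubernetes.raw']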
Sep 9 05:36:57.475262 systemd-tmpfiles[1338]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Sep 9 05:36:57.476188 systemd-tmpfiles[1338]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Sep 9 05:36:57.476460 systemd-tmpfiles[1338]: ACLs are not supported, ignoring. Sep 9 05:36:57.476538 systemd-tmpfiles[1338]: ACLs are not supported, ignoring. Sep 9 05:36:57.480708 systemd-tmpfiles[1338]: Detected autofs mount point /boot during canonicalization of boot. Sep 9 05:36:57.480861 systemd-tmpfiles[1338]: Skipping /boot Sep 9 05:36:57.493119 systemd-tmpfiles[1338]: Detected autofs mount point /boot during canonicalization of boot. Sep 9 05:36:57.493221 systemd-tmpfiles[1338]: Skipping /boot Sep 9 05:36:57.519743 zram_generator::config[1368]: No configuration found. Sep 9 05:36:57.690318 systemd[1]: Reloading finished in 223 ms. Sep 9 05:36:57.715130 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Sep 9 05:36:57.735361 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 9 05:36:57.743871 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 9 05:36:57.746309 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Sep 9 05:36:57.775104 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Sep 9 05:36:57.779863 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 9 05:36:57.782365 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 9 05:36:57.784772 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Sep 9 05:36:57.788453 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 9 05:36:57.788626 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 9 05:36:57.798133 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 9 05:36:57.800441 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 9 05:36:57.805305 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 9 05:36:57.806389 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 9 05:36:57.806498 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 9 05:36:57.808445 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Sep 9 05:36:57.809739 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 9 05:36:57.811032 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 9 05:36:57.811238 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 9 05:36:57.813764 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 9 05:36:57.813977 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 9 05:36:57.817882 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. 
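The duplicate-line warnings above come from two tmpfiles.d entries naming the same path with different settings; as the messages say, systemd-tmpfiles ignores the later duplicates. A sketch of splitting a tmpfiles.d(5) line into its columns (the example line is illustrative only):

    def parse_tmpfiles_line(line: str) -> dict:
        # Columns per tmpfiles.d(5): type, path, mode, user, group, age, argument.
        keys = ("type", "path", "mode", "user", "group", "age", "argument")
        return dict(zip(keys, line.split(maxsplit=6)))

    print(parse_tmpfiles_line("d /run/demo 0755 root root 10d -"))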
Sep 9 05:36:57.825148 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 9 05:36:57.828066 systemd[1]: Starting systemd-update-done.service - Update is Completed... Sep 9 05:36:57.830477 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Sep 9 05:36:57.830999 systemd-udevd[1409]: Using default interface naming scheme 'v255'. Sep 9 05:36:57.832992 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 9 05:36:57.833206 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 9 05:36:57.837434 augenrules[1437]: No rules Sep 9 05:36:57.839028 systemd[1]: audit-rules.service: Deactivated successfully. Sep 9 05:36:57.846265 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 9 05:36:57.854827 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Sep 9 05:36:57.856871 systemd[1]: Finished systemd-update-done.service - Update is Completed. Sep 9 05:36:57.859496 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 9 05:36:57.861849 systemd[1]: Finished ensure-sysext.service. Sep 9 05:36:57.867208 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 9 05:36:57.870387 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 9 05:36:57.871899 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 9 05:36:57.875915 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 9 05:36:57.879092 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 9 05:36:57.888678 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 9 05:36:57.890768 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 9 05:36:57.892913 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 9 05:36:57.892955 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 9 05:36:57.894838 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 9 05:36:57.900848 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Sep 9 05:36:57.903828 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 9 05:36:57.903860 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 9 05:36:57.904263 systemd[1]: Started systemd-userdbd.service - User Database Manager. Sep 9 05:36:57.906944 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 9 05:36:57.907192 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 9 05:36:57.908812 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 9 05:36:57.909048 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. 
Sep 9 05:36:57.912306 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 9 05:36:57.912538 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 9 05:36:57.918648 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 9 05:36:57.918931 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 9 05:36:57.921879 augenrules[1463]: /sbin/augenrules: No change Sep 9 05:36:57.928341 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 9 05:36:57.928756 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 9 05:36:57.929116 augenrules[1507]: No rules Sep 9 05:36:57.931176 systemd[1]: audit-rules.service: Deactivated successfully. Sep 9 05:36:57.931904 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 9 05:36:57.960793 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Sep 9 05:36:58.002776 kernel: mousedev: PS/2 mouse device common for all mice Sep 9 05:36:58.011378 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Sep 9 05:36:58.014171 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 9 05:36:58.019707 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Sep 9 05:36:58.027715 kernel: ACPI: button: Power Button [PWRF] Sep 9 05:36:58.040770 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 9 05:36:58.049084 kernel: i801_smbus 0000:00:1f.3: Enabling SMBus device Sep 9 05:36:58.049387 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Sep 9 05:36:58.049540 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Sep 9 05:36:58.072339 systemd-networkd[1483]: lo: Link UP Sep 9 05:36:58.072348 systemd-networkd[1483]: lo: Gained carrier Sep 9 05:36:58.073922 systemd-networkd[1483]: Enumeration completed Sep 9 05:36:58.074017 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 9 05:36:58.075315 systemd-networkd[1483]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 9 05:36:58.075324 systemd-networkd[1483]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 9 05:36:58.075846 systemd-networkd[1483]: eth0: Link UP Sep 9 05:36:58.076011 systemd-networkd[1483]: eth0: Gained carrier Sep 9 05:36:58.076024 systemd-networkd[1483]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 9 05:36:58.077401 systemd-resolved[1407]: Positive Trust Anchors: Sep 9 05:36:58.077626 systemd-resolved[1407]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 9 05:36:58.077710 systemd-resolved[1407]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 9 05:36:58.077931 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Sep 9 05:36:58.081797 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 9 05:36:58.085798 systemd-networkd[1483]: eth0: DHCPv4 address 10.0.0.118/16, gateway 10.0.0.1 acquired from 10.0.0.1 Sep 9 05:36:58.099586 systemd-resolved[1407]: Defaulting to hostname 'linux'. Sep 9 05:36:58.101132 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 05:36:58.107324 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 9 05:36:58.109168 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Sep 9 05:36:58.111812 systemd[1]: Reached target network.target - Network. Sep 9 05:36:58.112879 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 9 05:36:58.155750 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 9 05:36:58.156035 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 05:36:58.160924 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 05:36:58.181885 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Sep 9 05:36:58.183150 systemd[1]: Reached target time-set.target - System Time Set. Sep 9 05:36:59.384461 systemd-timesyncd[1487]: Contacted time server 10.0.0.1:123 (10.0.0.1). Sep 9 05:36:59.384488 systemd-resolved[1407]: Clock change detected. Flushing caches. Sep 9 05:36:59.384611 systemd-timesyncd[1487]: Initial clock synchronization to Tue 2025-09-09 05:36:59.384371 UTC. Sep 9 05:36:59.390007 kernel: kvm_amd: TSC scaling supported Sep 9 05:36:59.390046 kernel: kvm_amd: Nested Virtualization enabled Sep 9 05:36:59.390059 kernel: kvm_amd: Nested Paging enabled Sep 9 05:36:59.390073 kernel: kvm_amd: LBR virtualization supported Sep 9 05:36:59.391427 kernel: kvm_amd: Virtual VMLOAD VMSAVE supported Sep 9 05:36:59.391451 kernel: kvm_amd: Virtual GIF supported Sep 9 05:36:59.421615 kernel: EDAC MC: Ver: 3.0.0 Sep 9 05:36:59.445179 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 05:36:59.446515 systemd[1]: Reached target sysinit.target - System Initialization. Sep 9 05:36:59.447654 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Sep 9 05:36:59.448866 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 9 05:36:59.450161 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Sep 9 05:36:59.451421 systemd[1]: Started logrotate.timer - Daily rotation of log files. 
Sep 9 05:36:59.452778 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 9 05:36:59.453990 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Sep 9 05:36:59.456662 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Sep 9 05:36:59.456694 systemd[1]: Reached target paths.target - Path Units. Sep 9 05:36:59.457612 systemd[1]: Reached target timers.target - Timer Units. Sep 9 05:36:59.459245 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 9 05:36:59.461805 systemd[1]: Starting docker.socket - Docker Socket for the API... Sep 9 05:36:59.464858 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Sep 9 05:36:59.466244 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Sep 9 05:36:59.467650 systemd[1]: Reached target ssh-access.target - SSH Access Available. Sep 9 05:36:59.472034 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Sep 9 05:36:59.473408 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Sep 9 05:36:59.475328 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 9 05:36:59.477249 systemd[1]: Reached target sockets.target - Socket Units. Sep 9 05:36:59.478210 systemd[1]: Reached target basic.target - Basic System. Sep 9 05:36:59.479190 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 9 05:36:59.479226 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 9 05:36:59.480433 systemd[1]: Starting containerd.service - containerd container runtime... Sep 9 05:36:59.482645 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Sep 9 05:36:59.484695 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Sep 9 05:36:59.490963 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 9 05:36:59.494373 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Sep 9 05:36:59.495362 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 9 05:36:59.496332 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Sep 9 05:36:59.499620 jq[1560]: false Sep 9 05:36:59.499738 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 9 05:36:59.501444 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 9 05:36:59.504687 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Sep 9 05:36:59.507776 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Sep 9 05:36:59.508322 google_oslogin_nss_cache[1562]: oslogin_cache_refresh[1562]: Refreshing passwd entry cache Sep 9 05:36:59.508574 oslogin_cache_refresh[1562]: Refreshing passwd entry cache Sep 9 05:36:59.512533 systemd[1]: Starting systemd-logind.service - User Login Management... Sep 9 05:36:59.514121 extend-filesystems[1561]: Found /dev/vda6 Sep 9 05:36:59.514432 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). 
Sep 9 05:36:59.514894 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 9 05:36:59.515430 systemd[1]: Starting update-engine.service - Update Engine... Sep 9 05:36:59.517638 google_oslogin_nss_cache[1562]: oslogin_cache_refresh[1562]: Failure getting users, quitting Sep 9 05:36:59.517638 google_oslogin_nss_cache[1562]: oslogin_cache_refresh[1562]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 9 05:36:59.517638 google_oslogin_nss_cache[1562]: oslogin_cache_refresh[1562]: Refreshing group entry cache Sep 9 05:36:59.516698 oslogin_cache_refresh[1562]: Failure getting users, quitting Sep 9 05:36:59.516719 oslogin_cache_refresh[1562]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 9 05:36:59.516778 oslogin_cache_refresh[1562]: Refreshing group entry cache Sep 9 05:36:59.520729 extend-filesystems[1561]: Found /dev/vda9 Sep 9 05:36:59.521709 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 9 05:36:59.525783 extend-filesystems[1561]: Checking size of /dev/vda9 Sep 9 05:36:59.526276 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Sep 9 05:36:59.528556 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 9 05:36:59.529330 oslogin_cache_refresh[1562]: Failure getting groups, quitting Sep 9 05:36:59.529753 google_oslogin_nss_cache[1562]: oslogin_cache_refresh[1562]: Failure getting groups, quitting Sep 9 05:36:59.529753 google_oslogin_nss_cache[1562]: oslogin_cache_refresh[1562]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 9 05:36:59.528989 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 9 05:36:59.529341 oslogin_cache_refresh[1562]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 9 05:36:59.529337 systemd[1]: motdgen.service: Deactivated successfully. Sep 9 05:36:59.529945 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 9 05:36:59.531354 jq[1575]: true Sep 9 05:36:59.531444 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Sep 9 05:36:59.532025 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Sep 9 05:36:59.534204 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 9 05:36:59.534638 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. 
Sep 9 05:36:59.547006 extend-filesystems[1561]: Resized partition /dev/vda9 Sep 9 05:36:59.548922 update_engine[1574]: I20250909 05:36:59.548621 1574 main.cc:92] Flatcar Update Engine starting Sep 9 05:36:59.550158 (ntainerd)[1587]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 9 05:36:59.554294 extend-filesystems[1599]: resize2fs 1.47.3 (8-Jul-2025) Sep 9 05:36:59.560629 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks Sep 9 05:36:59.562836 jq[1586]: true Sep 9 05:36:59.583606 kernel: EXT4-fs (vda9): resized filesystem to 1864699 Sep 9 05:36:59.602723 tar[1584]: linux-amd64/helm Sep 9 05:36:59.603516 extend-filesystems[1599]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Sep 9 05:36:59.603516 extend-filesystems[1599]: old_desc_blocks = 1, new_desc_blocks = 1 Sep 9 05:36:59.603516 extend-filesystems[1599]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long. Sep 9 05:36:59.607310 extend-filesystems[1561]: Resized filesystem in /dev/vda9 Sep 9 05:36:59.613344 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 9 05:36:59.613642 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 9 05:36:59.620829 systemd-logind[1571]: Watching system buttons on /dev/input/event2 (Power Button) Sep 9 05:36:59.621180 systemd-logind[1571]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Sep 9 05:36:59.623158 systemd-logind[1571]: New seat seat0. Sep 9 05:36:59.626209 dbus-daemon[1558]: [system] SELinux support is enabled Sep 9 05:36:59.626711 systemd[1]: Started dbus.service - D-Bus System Message Bus. Sep 9 05:36:59.630565 systemd[1]: Started systemd-logind.service - User Login Management. Sep 9 05:36:59.631991 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 9 05:36:59.634608 update_engine[1574]: I20250909 05:36:59.631961 1574 update_check_scheduler.cc:74] Next update check in 8m48s Sep 9 05:36:59.632016 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Sep 9 05:36:59.633273 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 9 05:36:59.633291 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 9 05:36:59.635662 systemd[1]: Started update-engine.service - Update Engine. Sep 9 05:36:59.638404 dbus-daemon[1558]: [system] Successfully activated service 'org.freedesktop.systemd1' Sep 9 05:36:59.638813 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 9 05:36:59.653743 bash[1621]: Updated "/home/core/.ssh/authorized_keys" Sep 9 05:36:59.657441 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 9 05:36:59.659455 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. 
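The extend-filesystems run above grows the root filesystem online with resize2fs: /dev/vda9 goes from 553472 to 1864699 blocks at a 4k block size, so the change roughly works out as

    553472  x 4096 B = 2,267,021,312 B ~ 2.11 GiB  (before)
    1864699 x 4096 B = 7,637,807,104 B ~ 7.11 GiB  (after)

i.e. the ROOT filesystem is grown in place on first boot to use the additional space available on the virtual disk.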
Sep 9 05:36:59.694319 locksmithd[1622]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 9 05:36:59.756429 containerd[1587]: time="2025-09-09T05:36:59Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Sep 9 05:36:59.757545 containerd[1587]: time="2025-09-09T05:36:59.757286869Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Sep 9 05:36:59.768540 containerd[1587]: time="2025-09-09T05:36:59.768482240Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="17.072µs" Sep 9 05:36:59.768540 containerd[1587]: time="2025-09-09T05:36:59.768528667Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Sep 9 05:36:59.768540 containerd[1587]: time="2025-09-09T05:36:59.768548675Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Sep 9 05:36:59.768812 containerd[1587]: time="2025-09-09T05:36:59.768759260Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Sep 9 05:36:59.768812 containerd[1587]: time="2025-09-09T05:36:59.768779969Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Sep 9 05:36:59.768812 containerd[1587]: time="2025-09-09T05:36:59.768804555Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 9 05:36:59.769007 containerd[1587]: time="2025-09-09T05:36:59.768866330Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 9 05:36:59.769007 containerd[1587]: time="2025-09-09T05:36:59.768881880Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 9 05:36:59.769186 containerd[1587]: time="2025-09-09T05:36:59.769161013Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 9 05:36:59.769186 containerd[1587]: time="2025-09-09T05:36:59.769181081Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 9 05:36:59.769230 containerd[1587]: time="2025-09-09T05:36:59.769191310Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 9 05:36:59.769230 containerd[1587]: time="2025-09-09T05:36:59.769199124Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Sep 9 05:36:59.769540 containerd[1587]: time="2025-09-09T05:36:59.769282250Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Sep 9 05:36:59.769540 containerd[1587]: time="2025-09-09T05:36:59.769526398Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 9 05:36:59.769595 containerd[1587]: time="2025-09-09T05:36:59.769553529Z" level=info msg="skip loading plugin" error="lstat 
/var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 9 05:36:59.769595 containerd[1587]: time="2025-09-09T05:36:59.769563077Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Sep 9 05:36:59.769642 containerd[1587]: time="2025-09-09T05:36:59.769623270Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Sep 9 05:36:59.770138 containerd[1587]: time="2025-09-09T05:36:59.770094894Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Sep 9 05:36:59.770278 containerd[1587]: time="2025-09-09T05:36:59.770246979Z" level=info msg="metadata content store policy set" policy=shared Sep 9 05:36:59.778125 containerd[1587]: time="2025-09-09T05:36:59.777996790Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Sep 9 05:36:59.778125 containerd[1587]: time="2025-09-09T05:36:59.778063235Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Sep 9 05:36:59.778125 containerd[1587]: time="2025-09-09T05:36:59.778080086Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Sep 9 05:36:59.778125 containerd[1587]: time="2025-09-09T05:36:59.778091778Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Sep 9 05:36:59.778125 containerd[1587]: time="2025-09-09T05:36:59.778106045Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Sep 9 05:36:59.778125 containerd[1587]: time="2025-09-09T05:36:59.778116464Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Sep 9 05:36:59.778125 containerd[1587]: time="2025-09-09T05:36:59.778127555Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Sep 9 05:36:59.778302 containerd[1587]: time="2025-09-09T05:36:59.778140379Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Sep 9 05:36:59.778302 containerd[1587]: time="2025-09-09T05:36:59.778153694Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Sep 9 05:36:59.778302 containerd[1587]: time="2025-09-09T05:36:59.778165066Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Sep 9 05:36:59.778302 containerd[1587]: time="2025-09-09T05:36:59.778176297Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Sep 9 05:36:59.778302 containerd[1587]: time="2025-09-09T05:36:59.778189151Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Sep 9 05:36:59.778398 containerd[1587]: time="2025-09-09T05:36:59.778330987Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Sep 9 05:36:59.778398 containerd[1587]: time="2025-09-09T05:36:59.778349832Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Sep 9 05:36:59.778398 containerd[1587]: time="2025-09-09T05:36:59.778363217Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Sep 9 05:36:59.778398 
containerd[1587]: time="2025-09-09T05:36:59.778379147Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 9 05:36:59.778398 containerd[1587]: time="2025-09-09T05:36:59.778389236Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 9 05:36:59.778398 containerd[1587]: time="2025-09-09T05:36:59.778399896Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Sep 9 05:36:59.778517 containerd[1587]: time="2025-09-09T05:36:59.778411878Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 9 05:36:59.778517 containerd[1587]: time="2025-09-09T05:36:59.778423991Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 9 05:36:59.778517 containerd[1587]: time="2025-09-09T05:36:59.778439931Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 9 05:36:59.778517 containerd[1587]: time="2025-09-09T05:36:59.778454879Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 9 05:36:59.778517 containerd[1587]: time="2025-09-09T05:36:59.778476610Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 9 05:36:59.778624 containerd[1587]: time="2025-09-09T05:36:59.778547753Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 9 05:36:59.778624 containerd[1587]: time="2025-09-09T05:36:59.778561619Z" level=info msg="Start snapshots syncer" Sep 9 05:36:59.778624 containerd[1587]: time="2025-09-09T05:36:59.778598889Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 9 05:36:59.779293 containerd[1587]: time="2025-09-09T05:36:59.778848667Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 9 05:36:59.779293 containerd[1587]: time="2025-09-09T05:36:59.778914391Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 9 05:36:59.779411 containerd[1587]: time="2025-09-09T05:36:59.778976727Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Sep 9 05:36:59.779411 containerd[1587]: time="2025-09-09T05:36:59.779075012Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Sep 9 05:36:59.779411 containerd[1587]: time="2025-09-09T05:36:59.779096362Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 9 05:36:59.779411 containerd[1587]: time="2025-09-09T05:36:59.779111330Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 9 05:36:59.779411 containerd[1587]: time="2025-09-09T05:36:59.779126108Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 9 05:36:59.779411 containerd[1587]: time="2025-09-09T05:36:59.779143771Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 9 05:36:59.779411 containerd[1587]: time="2025-09-09T05:36:59.779162386Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 9 05:36:59.779411 containerd[1587]: time="2025-09-09T05:36:59.779173787Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 9 05:36:59.779411 containerd[1587]: time="2025-09-09T05:36:59.779197281Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 9 05:36:59.779411 containerd[1587]: 
time="2025-09-09T05:36:59.779208863Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 9 05:36:59.779411 containerd[1587]: time="2025-09-09T05:36:59.779218962Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 9 05:36:59.779411 containerd[1587]: time="2025-09-09T05:36:59.779253597Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 9 05:36:59.779411 containerd[1587]: time="2025-09-09T05:36:59.779267132Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 9 05:36:59.779411 containerd[1587]: time="2025-09-09T05:36:59.779276029Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 9 05:36:59.779673 containerd[1587]: time="2025-09-09T05:36:59.779286108Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 9 05:36:59.779673 containerd[1587]: time="2025-09-09T05:36:59.779294153Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 9 05:36:59.779673 containerd[1587]: time="2025-09-09T05:36:59.779304312Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 9 05:36:59.779673 containerd[1587]: time="2025-09-09T05:36:59.779315543Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 9 05:36:59.779673 containerd[1587]: time="2025-09-09T05:36:59.779333506Z" level=info msg="runtime interface created" Sep 9 05:36:59.779673 containerd[1587]: time="2025-09-09T05:36:59.779342513Z" level=info msg="created NRI interface" Sep 9 05:36:59.779673 containerd[1587]: time="2025-09-09T05:36:59.779360727Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 9 05:36:59.779673 containerd[1587]: time="2025-09-09T05:36:59.779370706Z" level=info msg="Connect containerd service" Sep 9 05:36:59.779673 containerd[1587]: time="2025-09-09T05:36:59.779392287Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 9 05:36:59.780396 containerd[1587]: time="2025-09-09T05:36:59.780193198Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 9 05:36:59.809452 sshd_keygen[1600]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 9 05:36:59.833600 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 9 05:36:59.837280 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 9 05:36:59.855678 systemd[1]: issuegen.service: Deactivated successfully. Sep 9 05:36:59.856681 systemd[1]: Finished issuegen.service - Generate /run/issue. 
Sep 9 05:36:59.859233 containerd[1587]: time="2025-09-09T05:36:59.858567721Z" level=info msg="Start subscribing containerd event" Sep 9 05:36:59.859233 containerd[1587]: time="2025-09-09T05:36:59.858661787Z" level=info msg="Start recovering state" Sep 9 05:36:59.859233 containerd[1587]: time="2025-09-09T05:36:59.858764580Z" level=info msg="Start event monitor" Sep 9 05:36:59.859233 containerd[1587]: time="2025-09-09T05:36:59.858788264Z" level=info msg="Start cni network conf syncer for default" Sep 9 05:36:59.859233 containerd[1587]: time="2025-09-09T05:36:59.858794997Z" level=info msg="Start streaming server" Sep 9 05:36:59.859233 containerd[1587]: time="2025-09-09T05:36:59.858815144Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 9 05:36:59.859233 containerd[1587]: time="2025-09-09T05:36:59.858822749Z" level=info msg="runtime interface starting up..." Sep 9 05:36:59.859233 containerd[1587]: time="2025-09-09T05:36:59.858828820Z" level=info msg="starting plugins..." Sep 9 05:36:59.859233 containerd[1587]: time="2025-09-09T05:36:59.858857023Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 9 05:36:59.859233 containerd[1587]: time="2025-09-09T05:36:59.858915342Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 9 05:36:59.859233 containerd[1587]: time="2025-09-09T05:36:59.859066776Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 9 05:36:59.859233 containerd[1587]: time="2025-09-09T05:36:59.859219172Z" level=info msg="containerd successfully booted in 0.103410s" Sep 9 05:36:59.860040 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 9 05:36:59.861373 systemd[1]: Started containerd.service - containerd container runtime. Sep 9 05:36:59.881529 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 9 05:36:59.884303 tar[1584]: linux-amd64/LICENSE Sep 9 05:36:59.884383 tar[1584]: linux-amd64/README.md Sep 9 05:36:59.884873 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 9 05:36:59.887120 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Sep 9 05:36:59.888314 systemd[1]: Reached target getty.target - Login Prompts. Sep 9 05:36:59.907696 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 9 05:37:01.288795 systemd-networkd[1483]: eth0: Gained IPv6LL Sep 9 05:37:01.291707 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 9 05:37:01.293452 systemd[1]: Reached target network-online.target - Network is Online. Sep 9 05:37:01.295887 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent... Sep 9 05:37:01.298185 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 05:37:01.300324 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 9 05:37:01.321847 systemd[1]: coreos-metadata.service: Deactivated successfully. Sep 9 05:37:01.322141 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent. Sep 9 05:37:01.323924 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 9 05:37:01.326047 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 9 05:37:02.012245 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 05:37:02.013849 systemd[1]: Reached target multi-user.target - Multi-User System. 
Sep 9 05:37:02.015078 systemd[1]: Startup finished in 2.774s (kernel) + 5.509s (initrd) + 4.727s (userspace) = 13.010s. Sep 9 05:37:02.023924 (kubelet)[1692]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 05:37:02.417286 kubelet[1692]: E0909 05:37:02.417164 1692 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 05:37:02.421247 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 05:37:02.421459 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 05:37:02.421876 systemd[1]: kubelet.service: Consumed 954ms CPU time, 266.1M memory peak. Sep 9 05:37:05.477028 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 9 05:37:05.478267 systemd[1]: Started sshd@0-10.0.0.118:22-10.0.0.1:35112.service - OpenSSH per-connection server daemon (10.0.0.1:35112). Sep 9 05:37:05.547471 sshd[1705]: Accepted publickey for core from 10.0.0.1 port 35112 ssh2: RSA SHA256:9+3J2aT7q2koLO1Rle2UX2pTYMxmV9eQF9r8rZDBoIg Sep 9 05:37:05.549536 sshd-session[1705]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:37:05.555942 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 9 05:37:05.557016 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 9 05:37:05.563713 systemd-logind[1571]: New session 1 of user core. Sep 9 05:37:05.581066 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 9 05:37:05.584259 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 9 05:37:05.597871 (systemd)[1710]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 9 05:37:05.600203 systemd-logind[1571]: New session c1 of user core. Sep 9 05:37:05.738991 systemd[1710]: Queued start job for default target default.target. Sep 9 05:37:05.762740 systemd[1710]: Created slice app.slice - User Application Slice. Sep 9 05:37:05.762764 systemd[1710]: Reached target paths.target - Paths. Sep 9 05:37:05.762800 systemd[1710]: Reached target timers.target - Timers. Sep 9 05:37:05.764109 systemd[1710]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 9 05:37:05.774067 systemd[1710]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 9 05:37:05.774120 systemd[1710]: Reached target sockets.target - Sockets. Sep 9 05:37:05.774153 systemd[1710]: Reached target basic.target - Basic System. Sep 9 05:37:05.774197 systemd[1710]: Reached target default.target - Main User Target. Sep 9 05:37:05.774227 systemd[1710]: Startup finished in 167ms. Sep 9 05:37:05.775022 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 9 05:37:05.777101 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 9 05:37:05.846557 systemd[1]: Started sshd@1-10.0.0.118:22-10.0.0.1:35116.service - OpenSSH per-connection server daemon (10.0.0.1:35116). 
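The kubelet exit above is expected at this point: /var/lib/kubelet/config.yaml does not exist yet because the node has not been initialised or joined to a cluster; that file is typically written later by kubeadm or a similar bootstrapper. For orientation only, a minimal KubeletConfiguration of the kind that would land at that path might look like the sketch below; the concrete values are illustrative assumptions, not taken from this machine.

    # /var/lib/kubelet/config.yaml -- minimal sketch, values assumed
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    cgroupDriver: systemd            # matches SystemdCgroup=true in the containerd CRI config above
    clusterDomain: cluster.local     # assumed default
    clusterDNS:
      - 10.96.0.10                   # assumed cluster DNS service IP
    authentication:
      x509:
        clientCAFile: /etc/kubernetes/pki/ca.crt

Once that file is in place, the scheduled restarts of kubelet.service can get past this error, as seen later in the log.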
Sep 9 05:37:05.900120 sshd[1721]: Accepted publickey for core from 10.0.0.1 port 35116 ssh2: RSA SHA256:9+3J2aT7q2koLO1Rle2UX2pTYMxmV9eQF9r8rZDBoIg Sep 9 05:37:05.901401 sshd-session[1721]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:37:05.905433 systemd-logind[1571]: New session 2 of user core. Sep 9 05:37:05.920712 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 9 05:37:05.973195 sshd[1724]: Connection closed by 10.0.0.1 port 35116 Sep 9 05:37:05.973603 sshd-session[1721]: pam_unix(sshd:session): session closed for user core Sep 9 05:37:05.986133 systemd[1]: sshd@1-10.0.0.118:22-10.0.0.1:35116.service: Deactivated successfully. Sep 9 05:37:05.987806 systemd[1]: session-2.scope: Deactivated successfully. Sep 9 05:37:05.988489 systemd-logind[1571]: Session 2 logged out. Waiting for processes to exit. Sep 9 05:37:05.990811 systemd[1]: Started sshd@2-10.0.0.118:22-10.0.0.1:35124.service - OpenSSH per-connection server daemon (10.0.0.1:35124). Sep 9 05:37:05.991554 systemd-logind[1571]: Removed session 2. Sep 9 05:37:06.050210 sshd[1730]: Accepted publickey for core from 10.0.0.1 port 35124 ssh2: RSA SHA256:9+3J2aT7q2koLO1Rle2UX2pTYMxmV9eQF9r8rZDBoIg Sep 9 05:37:06.051447 sshd-session[1730]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:37:06.055719 systemd-logind[1571]: New session 3 of user core. Sep 9 05:37:06.071703 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 9 05:37:06.121804 sshd[1733]: Connection closed by 10.0.0.1 port 35124 Sep 9 05:37:06.122184 sshd-session[1730]: pam_unix(sshd:session): session closed for user core Sep 9 05:37:06.133901 systemd[1]: sshd@2-10.0.0.118:22-10.0.0.1:35124.service: Deactivated successfully. Sep 9 05:37:06.135569 systemd[1]: session-3.scope: Deactivated successfully. Sep 9 05:37:06.136250 systemd-logind[1571]: Session 3 logged out. Waiting for processes to exit. Sep 9 05:37:06.138683 systemd[1]: Started sshd@3-10.0.0.118:22-10.0.0.1:35126.service - OpenSSH per-connection server daemon (10.0.0.1:35126). Sep 9 05:37:06.139186 systemd-logind[1571]: Removed session 3. Sep 9 05:37:06.197435 sshd[1739]: Accepted publickey for core from 10.0.0.1 port 35126 ssh2: RSA SHA256:9+3J2aT7q2koLO1Rle2UX2pTYMxmV9eQF9r8rZDBoIg Sep 9 05:37:06.198632 sshd-session[1739]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:37:06.202826 systemd-logind[1571]: New session 4 of user core. Sep 9 05:37:06.212710 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 9 05:37:06.265944 sshd[1742]: Connection closed by 10.0.0.1 port 35126 Sep 9 05:37:06.266162 sshd-session[1739]: pam_unix(sshd:session): session closed for user core Sep 9 05:37:06.279071 systemd[1]: sshd@3-10.0.0.118:22-10.0.0.1:35126.service: Deactivated successfully. Sep 9 05:37:06.280860 systemd[1]: session-4.scope: Deactivated successfully. Sep 9 05:37:06.281632 systemd-logind[1571]: Session 4 logged out. Waiting for processes to exit. Sep 9 05:37:06.284245 systemd[1]: Started sshd@4-10.0.0.118:22-10.0.0.1:35136.service - OpenSSH per-connection server daemon (10.0.0.1:35136). Sep 9 05:37:06.284966 systemd-logind[1571]: Removed session 4. 
Sep 9 05:37:06.322281 sshd[1748]: Accepted publickey for core from 10.0.0.1 port 35136 ssh2: RSA SHA256:9+3J2aT7q2koLO1Rle2UX2pTYMxmV9eQF9r8rZDBoIg Sep 9 05:37:06.323536 sshd-session[1748]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:37:06.327442 systemd-logind[1571]: New session 5 of user core. Sep 9 05:37:06.336712 systemd[1]: Started session-5.scope - Session 5 of User core. Sep 9 05:37:06.393666 sudo[1752]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 9 05:37:06.393978 sudo[1752]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 05:37:06.410145 sudo[1752]: pam_unix(sudo:session): session closed for user root Sep 9 05:37:06.411754 sshd[1751]: Connection closed by 10.0.0.1 port 35136 Sep 9 05:37:06.412119 sshd-session[1748]: pam_unix(sshd:session): session closed for user core Sep 9 05:37:06.426223 systemd[1]: sshd@4-10.0.0.118:22-10.0.0.1:35136.service: Deactivated successfully. Sep 9 05:37:06.427919 systemd[1]: session-5.scope: Deactivated successfully. Sep 9 05:37:06.428730 systemd-logind[1571]: Session 5 logged out. Waiting for processes to exit. Sep 9 05:37:06.431511 systemd[1]: Started sshd@5-10.0.0.118:22-10.0.0.1:35146.service - OpenSSH per-connection server daemon (10.0.0.1:35146). Sep 9 05:37:06.432268 systemd-logind[1571]: Removed session 5. Sep 9 05:37:06.486173 sshd[1758]: Accepted publickey for core from 10.0.0.1 port 35146 ssh2: RSA SHA256:9+3J2aT7q2koLO1Rle2UX2pTYMxmV9eQF9r8rZDBoIg Sep 9 05:37:06.487421 sshd-session[1758]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:37:06.491976 systemd-logind[1571]: New session 6 of user core. Sep 9 05:37:06.501737 systemd[1]: Started session-6.scope - Session 6 of User core. Sep 9 05:37:06.554917 sudo[1763]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 9 05:37:06.555226 sudo[1763]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 05:37:06.853409 sudo[1763]: pam_unix(sudo:session): session closed for user root Sep 9 05:37:06.859760 sudo[1762]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Sep 9 05:37:06.860063 sudo[1762]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 05:37:06.870012 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 9 05:37:06.913208 augenrules[1785]: No rules Sep 9 05:37:06.914996 systemd[1]: audit-rules.service: Deactivated successfully. Sep 9 05:37:06.915266 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 9 05:37:06.916572 sudo[1762]: pam_unix(sudo:session): session closed for user root Sep 9 05:37:06.918110 sshd[1761]: Connection closed by 10.0.0.1 port 35146 Sep 9 05:37:06.918458 sshd-session[1758]: pam_unix(sshd:session): session closed for user core Sep 9 05:37:06.933180 systemd[1]: sshd@5-10.0.0.118:22-10.0.0.1:35146.service: Deactivated successfully. Sep 9 05:37:06.934859 systemd[1]: session-6.scope: Deactivated successfully. Sep 9 05:37:06.935642 systemd-logind[1571]: Session 6 logged out. Waiting for processes to exit. Sep 9 05:37:06.938047 systemd[1]: Started sshd@6-10.0.0.118:22-10.0.0.1:35148.service - OpenSSH per-connection server daemon (10.0.0.1:35148). Sep 9 05:37:06.938579 systemd-logind[1571]: Removed session 6. 
Sep 9 05:37:06.992317 sshd[1794]: Accepted publickey for core from 10.0.0.1 port 35148 ssh2: RSA SHA256:9+3J2aT7q2koLO1Rle2UX2pTYMxmV9eQF9r8rZDBoIg Sep 9 05:37:06.993619 sshd-session[1794]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:37:06.998019 systemd-logind[1571]: New session 7 of user core. Sep 9 05:37:07.010748 systemd[1]: Started session-7.scope - Session 7 of User core. Sep 9 05:37:07.063534 sudo[1798]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 9 05:37:07.063868 sudo[1798]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 05:37:07.353202 systemd[1]: Starting docker.service - Docker Application Container Engine... Sep 9 05:37:07.370921 (dockerd)[1818]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 9 05:37:07.603619 dockerd[1818]: time="2025-09-09T05:37:07.603432841Z" level=info msg="Starting up" Sep 9 05:37:07.604528 dockerd[1818]: time="2025-09-09T05:37:07.604488340Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Sep 9 05:37:07.618161 dockerd[1818]: time="2025-09-09T05:37:07.618112545Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Sep 9 05:37:07.679616 dockerd[1818]: time="2025-09-09T05:37:07.679524382Z" level=info msg="Loading containers: start." Sep 9 05:37:07.689607 kernel: Initializing XFRM netlink socket Sep 9 05:37:07.960641 systemd-networkd[1483]: docker0: Link UP Sep 9 05:37:07.966463 dockerd[1818]: time="2025-09-09T05:37:07.966407779Z" level=info msg="Loading containers: done." Sep 9 05:37:07.979475 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck691064968-merged.mount: Deactivated successfully. Sep 9 05:37:07.982102 dockerd[1818]: time="2025-09-09T05:37:07.982053444Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 9 05:37:07.982165 dockerd[1818]: time="2025-09-09T05:37:07.982147400Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Sep 9 05:37:07.982271 dockerd[1818]: time="2025-09-09T05:37:07.982238812Z" level=info msg="Initializing buildkit" Sep 9 05:37:08.011415 dockerd[1818]: time="2025-09-09T05:37:08.011356998Z" level=info msg="Completed buildkit initialization" Sep 9 05:37:08.018674 dockerd[1818]: time="2025-09-09T05:37:08.018636067Z" level=info msg="Daemon has completed initialization" Sep 9 05:37:08.018825 dockerd[1818]: time="2025-09-09T05:37:08.018701429Z" level=info msg="API listen on /run/docker.sock" Sep 9 05:37:08.018814 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 9 05:37:08.764763 containerd[1587]: time="2025-09-09T05:37:08.764712279Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.12\"" Sep 9 05:37:09.418961 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1076780876.mount: Deactivated successfully. 
Sep 9 05:37:10.270539 containerd[1587]: time="2025-09-09T05:37:10.270473070Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:37:10.271361 containerd[1587]: time="2025-09-09T05:37:10.271288720Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.12: active requests=0, bytes read=28079631" Sep 9 05:37:10.272608 containerd[1587]: time="2025-09-09T05:37:10.272529656Z" level=info msg="ImageCreate event name:\"sha256:b1963c5b49c1722b8f408deaf83aafca7f48f47fed0ed14e5c10e93cc55974a7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:37:10.274941 containerd[1587]: time="2025-09-09T05:37:10.274909068Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:e9011c3bee8c06ecabd7816e119dca4e448c92f7a78acd891de3d2db1dc6c234\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:37:10.275868 containerd[1587]: time="2025-09-09T05:37:10.275821348Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.12\" with image id \"sha256:b1963c5b49c1722b8f408deaf83aafca7f48f47fed0ed14e5c10e93cc55974a7\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.12\", repo digest \"registry.k8s.io/kube-apiserver@sha256:e9011c3bee8c06ecabd7816e119dca4e448c92f7a78acd891de3d2db1dc6c234\", size \"28076431\" in 1.511068644s" Sep 9 05:37:10.275868 containerd[1587]: time="2025-09-09T05:37:10.275855182Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.12\" returns image reference \"sha256:b1963c5b49c1722b8f408deaf83aafca7f48f47fed0ed14e5c10e93cc55974a7\"" Sep 9 05:37:10.276455 containerd[1587]: time="2025-09-09T05:37:10.276422135Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.12\"" Sep 9 05:37:12.671885 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 9 05:37:12.673422 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 05:37:12.871646 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 05:37:12.876866 (kubelet)[2102]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 05:37:12.913498 kubelet[2102]: E0909 05:37:12.913442 2102 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 05:37:12.920097 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 05:37:12.920310 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 05:37:12.920708 systemd[1]: kubelet.service: Consumed 213ms CPU time, 111M memory peak. 
Sep 9 05:37:12.958968 containerd[1587]: time="2025-09-09T05:37:12.958847381Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:37:13.124845 containerd[1587]: time="2025-09-09T05:37:13.124794021Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.12: active requests=0, bytes read=24714681" Sep 9 05:37:13.126322 containerd[1587]: time="2025-09-09T05:37:13.126255271Z" level=info msg="ImageCreate event name:\"sha256:200c1a99a6f2b9d3b0a6e9b7362663513589341e0e58bc3b953a373efa735dfd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:37:13.131997 containerd[1587]: time="2025-09-09T05:37:13.131955970Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:d2862f94d87320267fddbd55db26556a267aa802e51d6b60f25786b4c428afc8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:37:13.132828 containerd[1587]: time="2025-09-09T05:37:13.132794322Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.12\" with image id \"sha256:200c1a99a6f2b9d3b0a6e9b7362663513589341e0e58bc3b953a373efa735dfd\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.12\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:d2862f94d87320267fddbd55db26556a267aa802e51d6b60f25786b4c428afc8\", size \"26317875\" in 2.856334837s" Sep 9 05:37:13.132828 containerd[1587]: time="2025-09-09T05:37:13.132825671Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.12\" returns image reference \"sha256:200c1a99a6f2b9d3b0a6e9b7362663513589341e0e58bc3b953a373efa735dfd\"" Sep 9 05:37:13.133426 containerd[1587]: time="2025-09-09T05:37:13.133389959Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.12\"" Sep 9 05:37:15.220833 containerd[1587]: time="2025-09-09T05:37:15.220769465Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:37:15.221631 containerd[1587]: time="2025-09-09T05:37:15.221602948Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.12: active requests=0, bytes read=18782427" Sep 9 05:37:15.222716 containerd[1587]: time="2025-09-09T05:37:15.222674376Z" level=info msg="ImageCreate event name:\"sha256:bcdd9599681a9460a5539177a986dbdaf880ac56eeb117ab94adb8f37889efba\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:37:15.225069 containerd[1587]: time="2025-09-09T05:37:15.225038649Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:152943b7e30244f4415fd0a5860a2dccd91660fe983d30a28a10edb0cc8f6756\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:37:15.225927 containerd[1587]: time="2025-09-09T05:37:15.225881540Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.12\" with image id \"sha256:bcdd9599681a9460a5539177a986dbdaf880ac56eeb117ab94adb8f37889efba\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.12\", repo digest \"registry.k8s.io/kube-scheduler@sha256:152943b7e30244f4415fd0a5860a2dccd91660fe983d30a28a10edb0cc8f6756\", size \"20385639\" in 2.092461254s" Sep 9 05:37:15.225927 containerd[1587]: time="2025-09-09T05:37:15.225913470Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.12\" returns image reference \"sha256:bcdd9599681a9460a5539177a986dbdaf880ac56eeb117ab94adb8f37889efba\"" Sep 9 05:37:15.226323 
containerd[1587]: time="2025-09-09T05:37:15.226292541Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.12\"" Sep 9 05:37:16.251146 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3814991284.mount: Deactivated successfully. Sep 9 05:37:17.299250 containerd[1587]: time="2025-09-09T05:37:17.299191405Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:37:17.300024 containerd[1587]: time="2025-09-09T05:37:17.299992958Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.12: active requests=0, bytes read=30384255" Sep 9 05:37:17.301304 containerd[1587]: time="2025-09-09T05:37:17.301263260Z" level=info msg="ImageCreate event name:\"sha256:507cc52f5f78c0cff25e904c76c18e6bfc90982e9cc2aa4dcb19033f21c3f679\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:37:17.303135 containerd[1587]: time="2025-09-09T05:37:17.303104993Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:90aa6b5f4065937521ff8438bc705317485d0be3f8b00a07145e697d92cc2cc6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:37:17.303600 containerd[1587]: time="2025-09-09T05:37:17.303545609Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.12\" with image id \"sha256:507cc52f5f78c0cff25e904c76c18e6bfc90982e9cc2aa4dcb19033f21c3f679\", repo tag \"registry.k8s.io/kube-proxy:v1.31.12\", repo digest \"registry.k8s.io/kube-proxy@sha256:90aa6b5f4065937521ff8438bc705317485d0be3f8b00a07145e697d92cc2cc6\", size \"30383274\" in 2.077228122s" Sep 9 05:37:17.303632 containerd[1587]: time="2025-09-09T05:37:17.303604620Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.12\" returns image reference \"sha256:507cc52f5f78c0cff25e904c76c18e6bfc90982e9cc2aa4dcb19033f21c3f679\"" Sep 9 05:37:17.304048 containerd[1587]: time="2025-09-09T05:37:17.304017334Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Sep 9 05:37:17.828481 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount532872421.mount: Deactivated successfully. 
Sep 9 05:37:18.708534 containerd[1587]: time="2025-09-09T05:37:18.708465865Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:37:18.709367 containerd[1587]: time="2025-09-09T05:37:18.709288918Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565241" Sep 9 05:37:18.710571 containerd[1587]: time="2025-09-09T05:37:18.710530295Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:37:18.713283 containerd[1587]: time="2025-09-09T05:37:18.713255956Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:37:18.714395 containerd[1587]: time="2025-09-09T05:37:18.714350789Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.410295474s" Sep 9 05:37:18.714395 containerd[1587]: time="2025-09-09T05:37:18.714387007Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Sep 9 05:37:18.714804 containerd[1587]: time="2025-09-09T05:37:18.714775034Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Sep 9 05:37:19.125081 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2525504705.mount: Deactivated successfully. 
Sep 9 05:37:19.130635 containerd[1587]: time="2025-09-09T05:37:19.130573426Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 9 05:37:19.131364 containerd[1587]: time="2025-09-09T05:37:19.131331027Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" Sep 9 05:37:19.132473 containerd[1587]: time="2025-09-09T05:37:19.132436549Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 9 05:37:19.134292 containerd[1587]: time="2025-09-09T05:37:19.134255700Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 9 05:37:19.134888 containerd[1587]: time="2025-09-09T05:37:19.134842120Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 419.961708ms" Sep 9 05:37:19.134888 containerd[1587]: time="2025-09-09T05:37:19.134875893Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Sep 9 05:37:19.135330 containerd[1587]: time="2025-09-09T05:37:19.135301792Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\"" Sep 9 05:37:19.664665 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2849465674.mount: Deactivated successfully. Sep 9 05:37:23.170791 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Sep 9 05:37:23.172430 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 05:37:23.643695 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 05:37:23.647694 (kubelet)[2243]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 05:37:23.681940 kubelet[2243]: E0909 05:37:23.681851 2243 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 05:37:23.685842 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 05:37:23.686036 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 05:37:23.686415 systemd[1]: kubelet.service: Consumed 196ms CPU time, 108.9M memory peak. 
Sep 9 05:37:24.222188 containerd[1587]: time="2025-09-09T05:37:24.222121758Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:37:24.223086 containerd[1587]: time="2025-09-09T05:37:24.223050299Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56910709" Sep 9 05:37:24.224320 containerd[1587]: time="2025-09-09T05:37:24.224280445Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:37:24.227047 containerd[1587]: time="2025-09-09T05:37:24.227010384Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:37:24.227977 containerd[1587]: time="2025-09-09T05:37:24.227948593Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 5.092618759s" Sep 9 05:37:24.228029 containerd[1587]: time="2025-09-09T05:37:24.227978159Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\"" Sep 9 05:37:26.496283 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 05:37:26.496505 systemd[1]: kubelet.service: Consumed 196ms CPU time, 108.9M memory peak. Sep 9 05:37:26.499086 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 05:37:26.524891 systemd[1]: Reload requested from client PID 2280 ('systemctl') (unit session-7.scope)... Sep 9 05:37:26.524908 systemd[1]: Reloading... Sep 9 05:37:26.592763 zram_generator::config[2322]: No configuration found. Sep 9 05:37:26.841044 systemd[1]: Reloading finished in 315 ms. Sep 9 05:37:26.903198 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Sep 9 05:37:26.903309 systemd[1]: kubelet.service: Failed with result 'signal'. Sep 9 05:37:26.903634 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 05:37:26.903679 systemd[1]: kubelet.service: Consumed 150ms CPU time, 98.3M memory peak. Sep 9 05:37:26.905155 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 05:37:27.070319 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 05:37:27.080873 (kubelet)[2370]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 9 05:37:27.117637 kubelet[2370]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 9 05:37:27.117637 kubelet[2370]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 9 05:37:27.117637 kubelet[2370]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 9 05:37:27.117637 kubelet[2370]: I0909 05:37:27.117577 2370 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 9 05:37:27.867989 kubelet[2370]: I0909 05:37:27.867935 2370 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Sep 9 05:37:27.867989 kubelet[2370]: I0909 05:37:27.867968 2370 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 9 05:37:27.868245 kubelet[2370]: I0909 05:37:27.868220 2370 server.go:934] "Client rotation is on, will bootstrap in background" Sep 9 05:37:27.891212 kubelet[2370]: E0909 05:37:27.891167 2370 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.0.118:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.118:6443: connect: connection refused" logger="UnhandledError" Sep 9 05:37:27.892471 kubelet[2370]: I0909 05:37:27.892448 2370 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 9 05:37:27.897425 kubelet[2370]: I0909 05:37:27.897405 2370 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 9 05:37:27.903727 kubelet[2370]: I0909 05:37:27.903696 2370 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Sep 9 05:37:27.904248 kubelet[2370]: I0909 05:37:27.904224 2370 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Sep 9 05:37:27.904413 kubelet[2370]: I0909 05:37:27.904373 2370 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 9 05:37:27.904556 kubelet[2370]: I0909 05:37:27.904406 2370 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 9 05:37:27.904671 kubelet[2370]: I0909 05:37:27.904561 2370 topology_manager.go:138] "Creating topology manager with none policy" Sep 9 05:37:27.904671 kubelet[2370]: I0909 05:37:27.904569 2370 container_manager_linux.go:300] "Creating device plugin manager" Sep 9 05:37:27.904733 kubelet[2370]: I0909 05:37:27.904700 2370 state_mem.go:36] "Initialized new in-memory state store" Sep 9 05:37:27.906715 kubelet[2370]: I0909 05:37:27.906691 2370 kubelet.go:408] "Attempting to sync node with API server" Sep 9 05:37:27.906715 kubelet[2370]: I0909 05:37:27.906714 2370 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 9 05:37:27.906809 kubelet[2370]: I0909 05:37:27.906748 2370 kubelet.go:314] "Adding apiserver pod source" Sep 9 05:37:27.906809 kubelet[2370]: I0909 05:37:27.906781 2370 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 9 05:37:28.068611 kubelet[2370]: I0909 05:37:28.068567 2370 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 9 05:37:28.068982 kubelet[2370]: I0909 05:37:28.068968 2370 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 9 05:37:28.069065 kubelet[2370]: W0909 05:37:28.069026 2370 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
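The nodeConfig blob the kubelet dumps above is plain JSON, so the operationally interesting pieces, such as the hard-eviction thresholds, can be pulled out directly. A small Go sketch against a trimmed copy of that blob; the struct names here are local and hypothetical, not kubelet types:

package main

import (
	"encoding/json"
	"fmt"
)

// Just enough shape to read the fields of interest from the logged JSON.
type threshold struct {
	Signal   string
	Operator string
	Value    struct {
		Quantity   *string
		Percentage float64
	}
}

type nodeConfig struct {
	CgroupDriver           string
	HardEvictionThresholds []threshold
}

func main() {
	// Trimmed copy of the nodeConfig JSON the kubelet logged above.
	raw := `{"CgroupDriver":"systemd","HardEvictionThresholds":[
	 {"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0}},
	 {"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1}},
	 {"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15}}]}`

	var cfg nodeConfig
	if err := json.Unmarshal([]byte(raw), &cfg); err != nil {
		panic(err)
	}
	fmt.Println("cgroup driver:", cfg.CgroupDriver)
	for _, t := range cfg.HardEvictionThresholds {
		if t.Value.Quantity != nil {
			fmt.Printf("evict when %s %s %s\n", t.Signal, t.Operator, *t.Value.Quantity)
		} else {
			fmt.Printf("evict when %s %s %.0f%%\n", t.Signal, t.Operator, t.Value.Percentage*100)
		}
	}
}

Read this way, the node evicts pods once available memory drops below 100Mi or the nodefs/imagefs free-space percentages fall under the logged thresholds, with the systemd cgroup driver matching the setting containerd reported earlier.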
Sep 9 05:37:28.070917 kubelet[2370]: I0909 05:37:28.070889 2370 server.go:1274] "Started kubelet" Sep 9 05:37:28.073133 kubelet[2370]: I0909 05:37:28.073095 2370 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Sep 9 05:37:28.074605 kubelet[2370]: I0909 05:37:28.073882 2370 server.go:449] "Adding debug handlers to kubelet server" Sep 9 05:37:28.074605 kubelet[2370]: W0909 05:37:28.073906 2370 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.118:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.118:6443: connect: connection refused Sep 9 05:37:28.074605 kubelet[2370]: E0909 05:37:28.073960 2370 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.118:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.118:6443: connect: connection refused" logger="UnhandledError" Sep 9 05:37:28.075352 kubelet[2370]: W0909 05:37:28.075208 2370 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.118:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.118:6443: connect: connection refused Sep 9 05:37:28.075352 kubelet[2370]: E0909 05:37:28.075245 2370 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.118:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.118:6443: connect: connection refused" logger="UnhandledError" Sep 9 05:37:28.075469 kubelet[2370]: I0909 05:37:28.075311 2370 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 9 05:37:28.075798 kubelet[2370]: I0909 05:37:28.075775 2370 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 9 05:37:28.077192 kubelet[2370]: I0909 05:37:28.077177 2370 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 9 05:37:28.077576 kubelet[2370]: E0909 05:37:28.077558 2370 kubelet.go:1478] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 9 05:37:28.077637 kubelet[2370]: I0909 05:37:28.077619 2370 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 9 05:37:28.079001 kubelet[2370]: E0909 05:37:28.077947 2370 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.118:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.118:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.1863869397de50d8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-09-09 05:37:28.070869208 +0000 UTC m=+0.986611086,LastTimestamp:2025-09-09 05:37:28.070869208 +0000 UTC m=+0.986611086,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Sep 9 05:37:28.079356 kubelet[2370]: E0909 05:37:28.079266 2370 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 9 05:37:28.079356 kubelet[2370]: I0909 05:37:28.079297 2370 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 9 05:37:28.079436 kubelet[2370]: I0909 05:37:28.079422 2370 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Sep 9 05:37:28.079475 kubelet[2370]: I0909 05:37:28.079454 2370 reconciler.go:26] "Reconciler: start to sync state" Sep 9 05:37:28.080020 kubelet[2370]: W0909 05:37:28.079693 2370 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.118:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.118:6443: connect: connection refused Sep 9 05:37:28.080020 kubelet[2370]: E0909 05:37:28.079725 2370 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.118:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.118:6443: connect: connection refused" logger="UnhandledError" Sep 9 05:37:28.080020 kubelet[2370]: I0909 05:37:28.079846 2370 factory.go:221] Registration of the systemd container factory successfully Sep 9 05:37:28.080020 kubelet[2370]: I0909 05:37:28.079904 2370 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 9 05:37:28.080020 kubelet[2370]: E0909 05:37:28.079922 2370 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.118:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.118:6443: connect: connection refused" interval="200ms" Sep 9 05:37:28.080830 kubelet[2370]: I0909 05:37:28.080809 2370 factory.go:221] Registration of the containerd container factory successfully Sep 9 05:37:28.091319 kubelet[2370]: I0909 05:37:28.091281 2370 cpu_manager.go:214] "Starting CPU manager" policy="none" Sep 9 05:37:28.091469 kubelet[2370]: I0909 05:37:28.091441 2370 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Sep 9 05:37:28.091469 
kubelet[2370]: I0909 05:37:28.091466 2370 state_mem.go:36] "Initialized new in-memory state store" Sep 9 05:37:28.144647 kubelet[2370]: I0909 05:37:28.144618 2370 policy_none.go:49] "None policy: Start" Sep 9 05:37:28.145422 kubelet[2370]: I0909 05:37:28.145399 2370 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 9 05:37:28.145458 kubelet[2370]: I0909 05:37:28.145425 2370 state_mem.go:35] "Initializing new in-memory state store" Sep 9 05:37:28.148934 kubelet[2370]: I0909 05:37:28.148896 2370 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 9 05:37:28.150545 kubelet[2370]: I0909 05:37:28.150524 2370 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 9 05:37:28.150545 kubelet[2370]: I0909 05:37:28.150545 2370 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 9 05:37:28.150671 kubelet[2370]: I0909 05:37:28.150563 2370 kubelet.go:2321] "Starting kubelet main sync loop" Sep 9 05:37:28.150671 kubelet[2370]: E0909 05:37:28.150619 2370 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 9 05:37:28.152269 kubelet[2370]: W0909 05:37:28.152232 2370 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.118:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.118:6443: connect: connection refused Sep 9 05:37:28.152269 kubelet[2370]: E0909 05:37:28.152263 2370 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.118:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.118:6443: connect: connection refused" logger="UnhandledError" Sep 9 05:37:28.155271 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 9 05:37:28.168462 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 9 05:37:28.171898 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 9 05:37:28.179653 kubelet[2370]: E0909 05:37:28.179623 2370 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 9 05:37:28.183637 kubelet[2370]: I0909 05:37:28.183600 2370 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 9 05:37:28.183851 kubelet[2370]: I0909 05:37:28.183823 2370 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 9 05:37:28.183912 kubelet[2370]: I0909 05:37:28.183847 2370 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 9 05:37:28.184063 kubelet[2370]: I0909 05:37:28.184042 2370 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 9 05:37:28.185438 kubelet[2370]: E0909 05:37:28.185414 2370 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Sep 9 05:37:28.258986 systemd[1]: Created slice kubepods-burstable-podd4b04db1db121b3dc80f6e397e08f515.slice - libcontainer container kubepods-burstable-podd4b04db1db121b3dc80f6e397e08f515.slice. 
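Every "connection refused" against https://10.0.0.118:6443 in this stretch is the kubelet's informers probing an API server whose static pod has not started yet. The same probe can be made by hand with client-go; a minimal sketch, with certificate verification disabled because this is only a reachability check (endpoint and node name taken from the log):

package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
)

func main() {
	cfg := &rest.Config{
		Host:            "https://10.0.0.118:6443",
		TLSClientConfig: rest.TLSClientConfig{Insecure: true}, // probe only, skip CA checks
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	_, err = cs.CoreV1().Nodes().Get(context.TODO(), "localhost", metav1.GetOptions{})
	fmt.Println(err) // dial error ("connection refused") until kube-apiserver is serving
}

Once the kube-apiserver container started further down comes up, the same call should stop returning a dial error and instead return either the node object or a not-found response, which is consistent with the node registering successfully at 05:37:29.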
Sep 9 05:37:28.279059 systemd[1]: Created slice kubepods-burstable-podfec3f691a145cb26ff55e4af388500b7.slice - libcontainer container kubepods-burstable-podfec3f691a145cb26ff55e4af388500b7.slice. Sep 9 05:37:28.280545 kubelet[2370]: E0909 05:37:28.280506 2370 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.118:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.118:6443: connect: connection refused" interval="400ms" Sep 9 05:37:28.283301 systemd[1]: Created slice kubepods-burstable-pod5dc878868de11c6196259ae42039f4ff.slice - libcontainer container kubepods-burstable-pod5dc878868de11c6196259ae42039f4ff.slice. Sep 9 05:37:28.285105 kubelet[2370]: I0909 05:37:28.285054 2370 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Sep 9 05:37:28.285414 kubelet[2370]: E0909 05:37:28.285387 2370 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.118:6443/api/v1/nodes\": dial tcp 10.0.0.118:6443: connect: connection refused" node="localhost" Sep 9 05:37:28.380750 kubelet[2370]: I0909 05:37:28.380705 2370 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 05:37:28.380750 kubelet[2370]: I0909 05:37:28.380735 2370 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/d4b04db1db121b3dc80f6e397e08f515-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"d4b04db1db121b3dc80f6e397e08f515\") " pod="kube-system/kube-apiserver-localhost" Sep 9 05:37:28.380750 kubelet[2370]: I0909 05:37:28.380775 2370 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/d4b04db1db121b3dc80f6e397e08f515-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"d4b04db1db121b3dc80f6e397e08f515\") " pod="kube-system/kube-apiserver-localhost" Sep 9 05:37:28.380993 kubelet[2370]: I0909 05:37:28.380797 2370 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 05:37:28.380993 kubelet[2370]: I0909 05:37:28.380816 2370 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 05:37:28.380993 kubelet[2370]: I0909 05:37:28.380835 2370 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 05:37:28.380993 kubelet[2370]: I0909 
05:37:28.380855 2370 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 05:37:28.380993 kubelet[2370]: I0909 05:37:28.380869 2370 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5dc878868de11c6196259ae42039f4ff-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"5dc878868de11c6196259ae42039f4ff\") " pod="kube-system/kube-scheduler-localhost" Sep 9 05:37:28.381103 kubelet[2370]: I0909 05:37:28.380884 2370 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/d4b04db1db121b3dc80f6e397e08f515-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"d4b04db1db121b3dc80f6e397e08f515\") " pod="kube-system/kube-apiserver-localhost" Sep 9 05:37:28.487681 kubelet[2370]: I0909 05:37:28.487562 2370 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Sep 9 05:37:28.487953 kubelet[2370]: E0909 05:37:28.487920 2370 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.118:6443/api/v1/nodes\": dial tcp 10.0.0.118:6443: connect: connection refused" node="localhost" Sep 9 05:37:28.576432 containerd[1587]: time="2025-09-09T05:37:28.576381313Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:d4b04db1db121b3dc80f6e397e08f515,Namespace:kube-system,Attempt:0,}" Sep 9 05:37:28.582206 containerd[1587]: time="2025-09-09T05:37:28.582169156Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:fec3f691a145cb26ff55e4af388500b7,Namespace:kube-system,Attempt:0,}" Sep 9 05:37:28.586791 containerd[1587]: time="2025-09-09T05:37:28.586744785Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:5dc878868de11c6196259ae42039f4ff,Namespace:kube-system,Attempt:0,}" Sep 9 05:37:28.606121 containerd[1587]: time="2025-09-09T05:37:28.605838756Z" level=info msg="connecting to shim 6779314ca7bf2a24e54a86750252ae28a095988e47ae083965b176fccf38aa1e" address="unix:///run/containerd/s/151818c919e4e13670ff6183977534d24c081a577d89702339770a7505f409ee" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:37:28.621933 containerd[1587]: time="2025-09-09T05:37:28.621874714Z" level=info msg="connecting to shim 43f63cee6fdfaef02e60d40d5e94ccd5b29d1589a145f6e12543097573df4a16" address="unix:///run/containerd/s/110bff12ae6142d6adc0e6adddccaf06830280cabbe4e4a704e9c4b4d27eca5b" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:37:28.636858 containerd[1587]: time="2025-09-09T05:37:28.635740722Z" level=info msg="connecting to shim 908059a64955b93fa2c3eb33e349bae1953a5557effec83f08cfddeacf6ae12d" address="unix:///run/containerd/s/b4fe278ccc8262c6339a206adde32464694e41f78c4c5f52b2439cc9d26e0693" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:37:28.643949 systemd[1]: Started cri-containerd-6779314ca7bf2a24e54a86750252ae28a095988e47ae083965b176fccf38aa1e.scope - libcontainer container 6779314ca7bf2a24e54a86750252ae28a095988e47ae083965b176fccf38aa1e. 
Sep 9 05:37:28.654327 systemd[1]: Started cri-containerd-43f63cee6fdfaef02e60d40d5e94ccd5b29d1589a145f6e12543097573df4a16.scope - libcontainer container 43f63cee6fdfaef02e60d40d5e94ccd5b29d1589a145f6e12543097573df4a16. Sep 9 05:37:28.657707 systemd[1]: Started cri-containerd-908059a64955b93fa2c3eb33e349bae1953a5557effec83f08cfddeacf6ae12d.scope - libcontainer container 908059a64955b93fa2c3eb33e349bae1953a5557effec83f08cfddeacf6ae12d. Sep 9 05:37:28.681603 kubelet[2370]: E0909 05:37:28.681540 2370 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.118:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.118:6443: connect: connection refused" interval="800ms" Sep 9 05:37:28.705839 containerd[1587]: time="2025-09-09T05:37:28.705151307Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:fec3f691a145cb26ff55e4af388500b7,Namespace:kube-system,Attempt:0,} returns sandbox id \"43f63cee6fdfaef02e60d40d5e94ccd5b29d1589a145f6e12543097573df4a16\"" Sep 9 05:37:28.710503 containerd[1587]: time="2025-09-09T05:37:28.710467765Z" level=info msg="CreateContainer within sandbox \"43f63cee6fdfaef02e60d40d5e94ccd5b29d1589a145f6e12543097573df4a16\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 9 05:37:28.710654 containerd[1587]: time="2025-09-09T05:37:28.710201837Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:d4b04db1db121b3dc80f6e397e08f515,Namespace:kube-system,Attempt:0,} returns sandbox id \"6779314ca7bf2a24e54a86750252ae28a095988e47ae083965b176fccf38aa1e\"" Sep 9 05:37:28.713553 containerd[1587]: time="2025-09-09T05:37:28.713518275Z" level=info msg="CreateContainer within sandbox \"6779314ca7bf2a24e54a86750252ae28a095988e47ae083965b176fccf38aa1e\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 9 05:37:28.717319 containerd[1587]: time="2025-09-09T05:37:28.717279838Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:5dc878868de11c6196259ae42039f4ff,Namespace:kube-system,Attempt:0,} returns sandbox id \"908059a64955b93fa2c3eb33e349bae1953a5557effec83f08cfddeacf6ae12d\"" Sep 9 05:37:28.720345 containerd[1587]: time="2025-09-09T05:37:28.719826954Z" level=info msg="CreateContainer within sandbox \"908059a64955b93fa2c3eb33e349bae1953a5557effec83f08cfddeacf6ae12d\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 9 05:37:28.723322 containerd[1587]: time="2025-09-09T05:37:28.723287813Z" level=info msg="Container 1e863c7cba2b7093f849582614ebdd87d18dc166621c85e890dfff9c5cd1ded6: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:37:28.728626 containerd[1587]: time="2025-09-09T05:37:28.728228236Z" level=info msg="Container 6d6b3308428b736f5df3d985e0c5ba6772d7bffc28fe141f50c5345d3dd41134: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:37:28.737127 containerd[1587]: time="2025-09-09T05:37:28.737086806Z" level=info msg="CreateContainer within sandbox \"43f63cee6fdfaef02e60d40d5e94ccd5b29d1589a145f6e12543097573df4a16\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"1e863c7cba2b7093f849582614ebdd87d18dc166621c85e890dfff9c5cd1ded6\"" Sep 9 05:37:28.737580 containerd[1587]: time="2025-09-09T05:37:28.737547700Z" level=info msg="StartContainer for \"1e863c7cba2b7093f849582614ebdd87d18dc166621c85e890dfff9c5cd1ded6\"" Sep 9 05:37:28.738719 containerd[1587]: 
time="2025-09-09T05:37:28.738646751Z" level=info msg="connecting to shim 1e863c7cba2b7093f849582614ebdd87d18dc166621c85e890dfff9c5cd1ded6" address="unix:///run/containerd/s/110bff12ae6142d6adc0e6adddccaf06830280cabbe4e4a704e9c4b4d27eca5b" protocol=ttrpc version=3 Sep 9 05:37:28.738874 containerd[1587]: time="2025-09-09T05:37:28.738839082Z" level=info msg="CreateContainer within sandbox \"6779314ca7bf2a24e54a86750252ae28a095988e47ae083965b176fccf38aa1e\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"6d6b3308428b736f5df3d985e0c5ba6772d7bffc28fe141f50c5345d3dd41134\"" Sep 9 05:37:28.739403 containerd[1587]: time="2025-09-09T05:37:28.739373864Z" level=info msg="StartContainer for \"6d6b3308428b736f5df3d985e0c5ba6772d7bffc28fe141f50c5345d3dd41134\"" Sep 9 05:37:28.740069 containerd[1587]: time="2025-09-09T05:37:28.740035605Z" level=info msg="Container abacd2295b9c6564d24e3a3b09fa3ba257b3e82e5e6361ffc597681b5cd41433: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:37:28.740407 containerd[1587]: time="2025-09-09T05:37:28.740376494Z" level=info msg="connecting to shim 6d6b3308428b736f5df3d985e0c5ba6772d7bffc28fe141f50c5345d3dd41134" address="unix:///run/containerd/s/151818c919e4e13670ff6183977534d24c081a577d89702339770a7505f409ee" protocol=ttrpc version=3 Sep 9 05:37:28.748014 containerd[1587]: time="2025-09-09T05:37:28.747981323Z" level=info msg="CreateContainer within sandbox \"908059a64955b93fa2c3eb33e349bae1953a5557effec83f08cfddeacf6ae12d\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"abacd2295b9c6564d24e3a3b09fa3ba257b3e82e5e6361ffc597681b5cd41433\"" Sep 9 05:37:28.748661 containerd[1587]: time="2025-09-09T05:37:28.748632685Z" level=info msg="StartContainer for \"abacd2295b9c6564d24e3a3b09fa3ba257b3e82e5e6361ffc597681b5cd41433\"" Sep 9 05:37:28.749917 containerd[1587]: time="2025-09-09T05:37:28.749669118Z" level=info msg="connecting to shim abacd2295b9c6564d24e3a3b09fa3ba257b3e82e5e6361ffc597681b5cd41433" address="unix:///run/containerd/s/b4fe278ccc8262c6339a206adde32464694e41f78c4c5f52b2439cc9d26e0693" protocol=ttrpc version=3 Sep 9 05:37:28.758721 systemd[1]: Started cri-containerd-1e863c7cba2b7093f849582614ebdd87d18dc166621c85e890dfff9c5cd1ded6.scope - libcontainer container 1e863c7cba2b7093f849582614ebdd87d18dc166621c85e890dfff9c5cd1ded6. Sep 9 05:37:28.762064 systemd[1]: Started cri-containerd-6d6b3308428b736f5df3d985e0c5ba6772d7bffc28fe141f50c5345d3dd41134.scope - libcontainer container 6d6b3308428b736f5df3d985e0c5ba6772d7bffc28fe141f50c5345d3dd41134. Sep 9 05:37:28.767268 systemd[1]: Started cri-containerd-abacd2295b9c6564d24e3a3b09fa3ba257b3e82e5e6361ffc597681b5cd41433.scope - libcontainer container abacd2295b9c6564d24e3a3b09fa3ba257b3e82e5e6361ffc597681b5cd41433. 
Sep 9 05:37:28.810622 containerd[1587]: time="2025-09-09T05:37:28.810564707Z" level=info msg="StartContainer for \"6d6b3308428b736f5df3d985e0c5ba6772d7bffc28fe141f50c5345d3dd41134\" returns successfully" Sep 9 05:37:28.816472 containerd[1587]: time="2025-09-09T05:37:28.816434834Z" level=info msg="StartContainer for \"1e863c7cba2b7093f849582614ebdd87d18dc166621c85e890dfff9c5cd1ded6\" returns successfully" Sep 9 05:37:28.821122 containerd[1587]: time="2025-09-09T05:37:28.821081606Z" level=info msg="StartContainer for \"abacd2295b9c6564d24e3a3b09fa3ba257b3e82e5e6361ffc597681b5cd41433\" returns successfully" Sep 9 05:37:28.889581 kubelet[2370]: I0909 05:37:28.889513 2370 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Sep 9 05:37:29.660007 kubelet[2370]: E0909 05:37:29.659966 2370 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Sep 9 05:37:29.818319 kubelet[2370]: I0909 05:37:29.817916 2370 kubelet_node_status.go:75] "Successfully registered node" node="localhost" Sep 9 05:37:29.818319 kubelet[2370]: E0909 05:37:29.817970 2370 kubelet_node_status.go:535] "Error updating node status, will retry" err="error getting node \"localhost\": node \"localhost\" not found" Sep 9 05:37:29.908220 kubelet[2370]: I0909 05:37:29.908185 2370 apiserver.go:52] "Watching apiserver" Sep 9 05:37:29.979778 kubelet[2370]: I0909 05:37:29.979663 2370 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Sep 9 05:37:30.172631 kubelet[2370]: E0909 05:37:30.172582 2370 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost" Sep 9 05:37:31.511300 systemd[1]: Reload requested from client PID 2643 ('systemctl') (unit session-7.scope)... Sep 9 05:37:31.511315 systemd[1]: Reloading... Sep 9 05:37:31.578639 zram_generator::config[2688]: No configuration found. Sep 9 05:37:31.793958 systemd[1]: Reloading finished in 282 ms. Sep 9 05:37:31.821021 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 05:37:31.834924 systemd[1]: kubelet.service: Deactivated successfully. Sep 9 05:37:31.835241 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 05:37:31.835295 systemd[1]: kubelet.service: Consumed 741ms CPU time, 134.4M memory peak. Sep 9 05:37:31.837154 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 05:37:32.029058 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 05:37:32.038934 (kubelet)[2731]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 9 05:37:32.074517 kubelet[2731]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 9 05:37:32.074517 kubelet[2731]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 9 05:37:32.074517 kubelet[2731]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 9 05:37:32.074895 kubelet[2731]: I0909 05:37:32.074505 2731 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 9 05:37:32.080882 kubelet[2731]: I0909 05:37:32.080849 2731 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Sep 9 05:37:32.080882 kubelet[2731]: I0909 05:37:32.080871 2731 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 9 05:37:32.081095 kubelet[2731]: I0909 05:37:32.081078 2731 server.go:934] "Client rotation is on, will bootstrap in background" Sep 9 05:37:32.082200 kubelet[2731]: I0909 05:37:32.082181 2731 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Sep 9 05:37:32.083933 kubelet[2731]: I0909 05:37:32.083918 2731 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 9 05:37:32.088626 kubelet[2731]: I0909 05:37:32.087978 2731 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 9 05:37:32.092241 kubelet[2731]: I0909 05:37:32.092222 2731 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Sep 9 05:37:32.092328 kubelet[2731]: I0909 05:37:32.092315 2731 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Sep 9 05:37:32.092454 kubelet[2731]: I0909 05:37:32.092428 2731 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 9 05:37:32.092621 kubelet[2731]: I0909 05:37:32.092454 2731 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 9 05:37:32.092703 kubelet[2731]: I0909 05:37:32.092625 2731 topology_manager.go:138] "Creating topology manager with none policy" Sep 9 05:37:32.092703 kubelet[2731]: I0909 
05:37:32.092632 2731 container_manager_linux.go:300] "Creating device plugin manager" Sep 9 05:37:32.092703 kubelet[2731]: I0909 05:37:32.092654 2731 state_mem.go:36] "Initialized new in-memory state store" Sep 9 05:37:32.092773 kubelet[2731]: I0909 05:37:32.092754 2731 kubelet.go:408] "Attempting to sync node with API server" Sep 9 05:37:32.092773 kubelet[2731]: I0909 05:37:32.092764 2731 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 9 05:37:32.092813 kubelet[2731]: I0909 05:37:32.092792 2731 kubelet.go:314] "Adding apiserver pod source" Sep 9 05:37:32.092813 kubelet[2731]: I0909 05:37:32.092802 2731 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 9 05:37:32.093563 kubelet[2731]: I0909 05:37:32.093549 2731 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 9 05:37:32.093913 kubelet[2731]: I0909 05:37:32.093894 2731 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 9 05:37:32.094228 kubelet[2731]: I0909 05:37:32.094212 2731 server.go:1274] "Started kubelet" Sep 9 05:37:32.095112 kubelet[2731]: I0909 05:37:32.095077 2731 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Sep 9 05:37:32.096598 kubelet[2731]: I0909 05:37:32.095848 2731 server.go:449] "Adding debug handlers to kubelet server" Sep 9 05:37:32.097697 kubelet[2731]: I0909 05:37:32.097659 2731 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 9 05:37:32.097854 kubelet[2731]: I0909 05:37:32.097834 2731 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 9 05:37:32.099799 kubelet[2731]: E0909 05:37:32.099777 2731 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 9 05:37:32.099988 kubelet[2731]: I0909 05:37:32.099970 2731 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 9 05:37:32.101550 kubelet[2731]: I0909 05:37:32.101514 2731 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 9 05:37:32.105654 kubelet[2731]: I0909 05:37:32.105621 2731 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 9 05:37:32.105907 kubelet[2731]: I0909 05:37:32.105888 2731 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Sep 9 05:37:32.106991 kubelet[2731]: I0909 05:37:32.106707 2731 reconciler.go:26] "Reconciler: start to sync state" Sep 9 05:37:32.107153 kubelet[2731]: I0909 05:37:32.107134 2731 factory.go:221] Registration of the systemd container factory successfully Sep 9 05:37:32.107545 kubelet[2731]: I0909 05:37:32.107256 2731 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 9 05:37:32.109840 kubelet[2731]: I0909 05:37:32.109784 2731 factory.go:221] Registration of the containerd container factory successfully Sep 9 05:37:32.115978 kubelet[2731]: I0909 05:37:32.115653 2731 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 9 05:37:32.116974 kubelet[2731]: I0909 05:37:32.116952 2731 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 9 05:37:32.116974 kubelet[2731]: I0909 05:37:32.116971 2731 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 9 05:37:32.117041 kubelet[2731]: I0909 05:37:32.116986 2731 kubelet.go:2321] "Starting kubelet main sync loop" Sep 9 05:37:32.117041 kubelet[2731]: E0909 05:37:32.117021 2731 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 9 05:37:32.138183 kubelet[2731]: I0909 05:37:32.138155 2731 cpu_manager.go:214] "Starting CPU manager" policy="none" Sep 9 05:37:32.138183 kubelet[2731]: I0909 05:37:32.138174 2731 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Sep 9 05:37:32.138183 kubelet[2731]: I0909 05:37:32.138191 2731 state_mem.go:36] "Initialized new in-memory state store" Sep 9 05:37:32.138363 kubelet[2731]: I0909 05:37:32.138311 2731 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 9 05:37:32.138363 kubelet[2731]: I0909 05:37:32.138320 2731 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 9 05:37:32.138363 kubelet[2731]: I0909 05:37:32.138337 2731 policy_none.go:49] "None policy: Start" Sep 9 05:37:32.138855 kubelet[2731]: I0909 05:37:32.138835 2731 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 9 05:37:32.138904 kubelet[2731]: I0909 05:37:32.138861 2731 state_mem.go:35] "Initializing new in-memory state store" Sep 9 05:37:32.139051 kubelet[2731]: I0909 05:37:32.139026 2731 state_mem.go:75] "Updated machine memory state" Sep 9 05:37:32.143093 kubelet[2731]: I0909 05:37:32.143072 2731 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 9 05:37:32.143237 kubelet[2731]: I0909 05:37:32.143225 2731 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 9 05:37:32.143259 kubelet[2731]: I0909 05:37:32.143238 2731 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 9 05:37:32.143655 kubelet[2731]: I0909 05:37:32.143620 2731 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 9 05:37:32.226614 kubelet[2731]: E0909 05:37:32.225823 2731 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Sep 9 05:37:32.248039 kubelet[2731]: I0909 05:37:32.248004 2731 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Sep 9 05:37:32.254414 kubelet[2731]: I0909 05:37:32.254379 2731 kubelet_node_status.go:111] "Node was previously registered" node="localhost" Sep 9 05:37:32.254514 kubelet[2731]: I0909 05:37:32.254469 2731 kubelet_node_status.go:75] "Successfully registered node" node="localhost" Sep 9 05:37:32.307958 kubelet[2731]: I0909 05:37:32.307926 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 05:37:32.307958 kubelet[2731]: I0909 05:37:32.307951 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: 
\"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 05:37:32.308050 kubelet[2731]: I0909 05:37:32.307970 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5dc878868de11c6196259ae42039f4ff-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"5dc878868de11c6196259ae42039f4ff\") " pod="kube-system/kube-scheduler-localhost" Sep 9 05:37:32.308050 kubelet[2731]: I0909 05:37:32.307985 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/d4b04db1db121b3dc80f6e397e08f515-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"d4b04db1db121b3dc80f6e397e08f515\") " pod="kube-system/kube-apiserver-localhost" Sep 9 05:37:32.308050 kubelet[2731]: I0909 05:37:32.308000 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/d4b04db1db121b3dc80f6e397e08f515-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"d4b04db1db121b3dc80f6e397e08f515\") " pod="kube-system/kube-apiserver-localhost" Sep 9 05:37:32.308050 kubelet[2731]: I0909 05:37:32.308014 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/d4b04db1db121b3dc80f6e397e08f515-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"d4b04db1db121b3dc80f6e397e08f515\") " pod="kube-system/kube-apiserver-localhost" Sep 9 05:37:32.308050 kubelet[2731]: I0909 05:37:32.308032 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 05:37:32.308164 kubelet[2731]: I0909 05:37:32.308050 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 05:37:32.308164 kubelet[2731]: I0909 05:37:32.308098 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 05:37:33.093298 kubelet[2731]: I0909 05:37:33.093241 2731 apiserver.go:52] "Watching apiserver" Sep 9 05:37:33.106486 kubelet[2731]: I0909 05:37:33.106453 2731 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Sep 9 05:37:33.133819 kubelet[2731]: E0909 05:37:33.133786 2731 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" Sep 9 05:37:33.133948 kubelet[2731]: E0909 05:37:33.133786 2731 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Sep 9 
05:37:33.145534 kubelet[2731]: I0909 05:37:33.145474 2731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.145454819 podStartE2EDuration="1.145454819s" podCreationTimestamp="2025-09-09 05:37:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 05:37:33.145235428 +0000 UTC m=+1.102624298" watchObservedRunningTime="2025-09-09 05:37:33.145454819 +0000 UTC m=+1.102843689" Sep 9 05:37:33.158198 kubelet[2731]: I0909 05:37:33.158127 2731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.158108924 podStartE2EDuration="1.158108924s" podCreationTimestamp="2025-09-09 05:37:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 05:37:33.151751533 +0000 UTC m=+1.109140403" watchObservedRunningTime="2025-09-09 05:37:33.158108924 +0000 UTC m=+1.115497784" Sep 9 05:37:38.071559 kubelet[2731]: I0909 05:37:38.071524 2731 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 9 05:37:38.072111 kubelet[2731]: I0909 05:37:38.071944 2731 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 9 05:37:38.072157 containerd[1587]: time="2025-09-09T05:37:38.071802865Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 9 05:37:38.492535 kubelet[2731]: I0909 05:37:38.492472 2731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=7.492453337 podStartE2EDuration="7.492453337s" podCreationTimestamp="2025-09-09 05:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 05:37:33.158381958 +0000 UTC m=+1.115770828" watchObservedRunningTime="2025-09-09 05:37:38.492453337 +0000 UTC m=+6.449842207" Sep 9 05:37:38.499635 systemd[1]: Created slice kubepods-besteffort-pod0a040442_537b_46cc_8bce_6c359b9b80e2.slice - libcontainer container kubepods-besteffort-pod0a040442_537b_46cc_8bce_6c359b9b80e2.slice. 
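The "Updating Pod CIDR" entry earlier in this stretch hands the kubelet a 192.168.0.0/24 range for this node, i.e. a 256-address block for pod IPs. A one-off check with the Go standard library:

package main

import (
	"fmt"
	"net"
)

func main() {
	// Pod CIDR from the "Updating Pod CIDR" entry above.
	_, ipnet, err := net.ParseCIDR("192.168.0.0/24")
	if err != nil {
		panic(err)
	}
	ones, bits := ipnet.Mask.Size()
	fmt.Printf("%s -> %d addresses in the block\n", ipnet, 1<<(bits-ones)) // 256
}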
Sep 9 05:37:38.550922 kubelet[2731]: I0909 05:37:38.550868 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxmzz\" (UniqueName: \"kubernetes.io/projected/0a040442-537b-46cc-8bce-6c359b9b80e2-kube-api-access-qxmzz\") pod \"kube-proxy-hd9br\" (UID: \"0a040442-537b-46cc-8bce-6c359b9b80e2\") " pod="kube-system/kube-proxy-hd9br" Sep 9 05:37:38.550922 kubelet[2731]: I0909 05:37:38.550917 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/0a040442-537b-46cc-8bce-6c359b9b80e2-kube-proxy\") pod \"kube-proxy-hd9br\" (UID: \"0a040442-537b-46cc-8bce-6c359b9b80e2\") " pod="kube-system/kube-proxy-hd9br" Sep 9 05:37:38.551085 kubelet[2731]: I0909 05:37:38.550941 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/0a040442-537b-46cc-8bce-6c359b9b80e2-xtables-lock\") pod \"kube-proxy-hd9br\" (UID: \"0a040442-537b-46cc-8bce-6c359b9b80e2\") " pod="kube-system/kube-proxy-hd9br" Sep 9 05:37:38.551085 kubelet[2731]: I0909 05:37:38.550959 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0a040442-537b-46cc-8bce-6c359b9b80e2-lib-modules\") pod \"kube-proxy-hd9br\" (UID: \"0a040442-537b-46cc-8bce-6c359b9b80e2\") " pod="kube-system/kube-proxy-hd9br" Sep 9 05:37:38.655016 kubelet[2731]: E0909 05:37:38.654978 2731 projected.go:288] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Sep 9 05:37:38.655016 kubelet[2731]: E0909 05:37:38.655010 2731 projected.go:194] Error preparing data for projected volume kube-api-access-qxmzz for pod kube-system/kube-proxy-hd9br: configmap "kube-root-ca.crt" not found Sep 9 05:37:38.655158 kubelet[2731]: E0909 05:37:38.655059 2731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0a040442-537b-46cc-8bce-6c359b9b80e2-kube-api-access-qxmzz podName:0a040442-537b-46cc-8bce-6c359b9b80e2 nodeName:}" failed. No retries permitted until 2025-09-09 05:37:39.155040935 +0000 UTC m=+7.112429805 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-qxmzz" (UniqueName: "kubernetes.io/projected/0a040442-537b-46cc-8bce-6c359b9b80e2-kube-api-access-qxmzz") pod "kube-proxy-hd9br" (UID: "0a040442-537b-46cc-8bce-6c359b9b80e2") : configmap "kube-root-ca.crt" not found Sep 9 05:37:39.064108 systemd[1]: Created slice kubepods-besteffort-pod9cd5130d_36af_4ee2_bc8b_287302ccd189.slice - libcontainer container kubepods-besteffort-pod9cd5130d_36af_4ee2_bc8b_287302ccd189.slice. 
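The projected-volume failure above (configmap "kube-root-ca.crt" not found) is retried rather than fatal: the mount operation is re-queued with a 500ms delay and succeeds once the root CA configmap exists. The retry pattern is exponential backoff; a rough sketch of that pattern with the apimachinery wait helper, where only the initial 500ms comes from the log and the factor and step count are illustrative rather than the kubelet's actual values:

package main

import (
	"fmt"
	"time"

	"k8s.io/apimachinery/pkg/util/wait"
)

func main() {
	// 500ms initial delay from the log; Factor and Steps are illustrative.
	backoff := wait.Backoff{Duration: 500 * time.Millisecond, Factor: 2.0, Steps: 5}
	attempt := 0
	err := wait.ExponentialBackoff(backoff, func() (bool, error) {
		attempt++
		fmt.Printf("attempt %d: configmap kube-root-ca.crt still missing, retrying\n", attempt)
		return false, nil // false, nil => back off and try again
	})
	fmt.Println(err) // times out once the steps are exhausted
}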
Sep 9 05:37:39.155331 kubelet[2731]: I0909 05:37:39.155266 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/9cd5130d-36af-4ee2-bc8b-287302ccd189-var-lib-calico\") pod \"tigera-operator-58fc44c59b-r7tlj\" (UID: \"9cd5130d-36af-4ee2-bc8b-287302ccd189\") " pod="tigera-operator/tigera-operator-58fc44c59b-r7tlj" Sep 9 05:37:39.155331 kubelet[2731]: I0909 05:37:39.155309 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nbk7\" (UniqueName: \"kubernetes.io/projected/9cd5130d-36af-4ee2-bc8b-287302ccd189-kube-api-access-7nbk7\") pod \"tigera-operator-58fc44c59b-r7tlj\" (UID: \"9cd5130d-36af-4ee2-bc8b-287302ccd189\") " pod="tigera-operator/tigera-operator-58fc44c59b-r7tlj" Sep 9 05:37:39.368662 containerd[1587]: time="2025-09-09T05:37:39.368525072Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-r7tlj,Uid:9cd5130d-36af-4ee2-bc8b-287302ccd189,Namespace:tigera-operator,Attempt:0,}" Sep 9 05:37:39.387010 containerd[1587]: time="2025-09-09T05:37:39.386963086Z" level=info msg="connecting to shim 99e676b9d1e0f2fcadf1573d817d3b006ce4880e5979cb227515ef22bde15df5" address="unix:///run/containerd/s/49c5284a544db231dd92d93e14df5ec51a320c0234113d087438f6cdf0b125e1" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:37:39.410161 containerd[1587]: time="2025-09-09T05:37:39.410086385Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-hd9br,Uid:0a040442-537b-46cc-8bce-6c359b9b80e2,Namespace:kube-system,Attempt:0,}" Sep 9 05:37:39.415755 systemd[1]: Started cri-containerd-99e676b9d1e0f2fcadf1573d817d3b006ce4880e5979cb227515ef22bde15df5.scope - libcontainer container 99e676b9d1e0f2fcadf1573d817d3b006ce4880e5979cb227515ef22bde15df5. Sep 9 05:37:39.433779 containerd[1587]: time="2025-09-09T05:37:39.433686192Z" level=info msg="connecting to shim c7a40a151867172979184868d02779f29fa28a882ed35aee0b0e1ae13ec44584" address="unix:///run/containerd/s/74a8c90612c5d41b5c44fbbb7157bfc26e51ce5b8eec3b4c9f26c13ea5b830c2" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:37:39.454215 systemd[1]: Started cri-containerd-c7a40a151867172979184868d02779f29fa28a882ed35aee0b0e1ae13ec44584.scope - libcontainer container c7a40a151867172979184868d02779f29fa28a882ed35aee0b0e1ae13ec44584. 
Sep 9 05:37:39.462543 containerd[1587]: time="2025-09-09T05:37:39.462493128Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-r7tlj,Uid:9cd5130d-36af-4ee2-bc8b-287302ccd189,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"99e676b9d1e0f2fcadf1573d817d3b006ce4880e5979cb227515ef22bde15df5\"" Sep 9 05:37:39.465821 containerd[1587]: time="2025-09-09T05:37:39.465779048Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 9 05:37:39.486778 containerd[1587]: time="2025-09-09T05:37:39.483550331Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-hd9br,Uid:0a040442-537b-46cc-8bce-6c359b9b80e2,Namespace:kube-system,Attempt:0,} returns sandbox id \"c7a40a151867172979184868d02779f29fa28a882ed35aee0b0e1ae13ec44584\"" Sep 9 05:37:39.489809 containerd[1587]: time="2025-09-09T05:37:39.489765601Z" level=info msg="CreateContainer within sandbox \"c7a40a151867172979184868d02779f29fa28a882ed35aee0b0e1ae13ec44584\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 9 05:37:39.502562 containerd[1587]: time="2025-09-09T05:37:39.502536900Z" level=info msg="Container 62805748198b3c1db56893d8b2b1b15dd139d3e8084000d3de7567b74b0b6c1e: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:37:39.510311 containerd[1587]: time="2025-09-09T05:37:39.510267496Z" level=info msg="CreateContainer within sandbox \"c7a40a151867172979184868d02779f29fa28a882ed35aee0b0e1ae13ec44584\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"62805748198b3c1db56893d8b2b1b15dd139d3e8084000d3de7567b74b0b6c1e\"" Sep 9 05:37:39.510725 containerd[1587]: time="2025-09-09T05:37:39.510704879Z" level=info msg="StartContainer for \"62805748198b3c1db56893d8b2b1b15dd139d3e8084000d3de7567b74b0b6c1e\"" Sep 9 05:37:39.512970 containerd[1587]: time="2025-09-09T05:37:39.512936130Z" level=info msg="connecting to shim 62805748198b3c1db56893d8b2b1b15dd139d3e8084000d3de7567b74b0b6c1e" address="unix:///run/containerd/s/74a8c90612c5d41b5c44fbbb7157bfc26e51ce5b8eec3b4c9f26c13ea5b830c2" protocol=ttrpc version=3 Sep 9 05:37:39.536748 systemd[1]: Started cri-containerd-62805748198b3c1db56893d8b2b1b15dd139d3e8084000d3de7567b74b0b6c1e.scope - libcontainer container 62805748198b3c1db56893d8b2b1b15dd139d3e8084000d3de7567b74b0b6c1e. Sep 9 05:37:39.580635 containerd[1587]: time="2025-09-09T05:37:39.580581532Z" level=info msg="StartContainer for \"62805748198b3c1db56893d8b2b1b15dd139d3e8084000d3de7567b74b0b6c1e\" returns successfully" Sep 9 05:37:40.878197 kubelet[2731]: I0909 05:37:40.878128 2731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-hd9br" podStartSLOduration=2.878113281 podStartE2EDuration="2.878113281s" podCreationTimestamp="2025-09-09 05:37:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 05:37:40.146346293 +0000 UTC m=+8.103735173" watchObservedRunningTime="2025-09-09 05:37:40.878113281 +0000 UTC m=+8.835502151" Sep 9 05:37:41.034241 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3669974007.mount: Deactivated successfully. 
Sep 9 05:37:41.422436 containerd[1587]: time="2025-09-09T05:37:41.422345394Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:37:41.423166 containerd[1587]: time="2025-09-09T05:37:41.423112944Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609" Sep 9 05:37:41.424219 containerd[1587]: time="2025-09-09T05:37:41.424178900Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:37:41.426183 containerd[1587]: time="2025-09-09T05:37:41.426141151Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:37:41.426690 containerd[1587]: time="2025-09-09T05:37:41.426643135Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 1.960820736s" Sep 9 05:37:41.426690 containerd[1587]: time="2025-09-09T05:37:41.426684023Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\"" Sep 9 05:37:41.428655 containerd[1587]: time="2025-09-09T05:37:41.428576872Z" level=info msg="CreateContainer within sandbox \"99e676b9d1e0f2fcadf1573d817d3b006ce4880e5979cb227515ef22bde15df5\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 9 05:37:41.435020 containerd[1587]: time="2025-09-09T05:37:41.434976470Z" level=info msg="Container 2f881fd5705107a0ad5b9ce0d0fb3a413f77847536827e643b921b2991ba7721: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:37:41.441329 containerd[1587]: time="2025-09-09T05:37:41.441284434Z" level=info msg="CreateContainer within sandbox \"99e676b9d1e0f2fcadf1573d817d3b006ce4880e5979cb227515ef22bde15df5\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"2f881fd5705107a0ad5b9ce0d0fb3a413f77847536827e643b921b2991ba7721\"" Sep 9 05:37:41.441738 containerd[1587]: time="2025-09-09T05:37:41.441700315Z" level=info msg="StartContainer for \"2f881fd5705107a0ad5b9ce0d0fb3a413f77847536827e643b921b2991ba7721\"" Sep 9 05:37:41.442395 containerd[1587]: time="2025-09-09T05:37:41.442370378Z" level=info msg="connecting to shim 2f881fd5705107a0ad5b9ce0d0fb3a413f77847536827e643b921b2991ba7721" address="unix:///run/containerd/s/49c5284a544db231dd92d93e14df5ec51a320c0234113d087438f6cdf0b125e1" protocol=ttrpc version=3 Sep 9 05:37:41.486723 systemd[1]: Started cri-containerd-2f881fd5705107a0ad5b9ce0d0fb3a413f77847536827e643b921b2991ba7721.scope - libcontainer container 2f881fd5705107a0ad5b9ce0d0fb3a413f77847536827e643b921b2991ba7721. 
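The ImageCreate/"stop pulling"/"Pulled image" sequence above is containerd resolving the operator tag to a digest and recording the pull duration. A minimal sketch of reproducing such a pull with containerd's Go client, assuming the default socket path and the k8s.io namespace the kubelet's images live in (the v1 client import path is assumed here; newer containerd releases move it under a /v2 module):

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// Use the same namespace the CRI plugin stores kubelet-pulled images in.
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	img, err := client.Pull(ctx, "quay.io/tigera/operator:v1.38.6", containerd.WithPullUnpack)
	if err != nil {
		log.Fatal(err)
	}
	// The target descriptor's digest corresponds to the "repo digest" shown in the log.
	fmt.Println(img.Name(), img.Target().Digest, img.Target().Size)
}
```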
Sep 9 05:37:41.514475 containerd[1587]: time="2025-09-09T05:37:41.514437953Z" level=info msg="StartContainer for \"2f881fd5705107a0ad5b9ce0d0fb3a413f77847536827e643b921b2991ba7721\" returns successfully" Sep 9 05:37:44.224557 kubelet[2731]: I0909 05:37:44.224497 2731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-58fc44c59b-r7tlj" podStartSLOduration=3.261339159 podStartE2EDuration="5.224483488s" podCreationTimestamp="2025-09-09 05:37:39 +0000 UTC" firstStartedPulling="2025-09-09 05:37:39.464321421 +0000 UTC m=+7.421710291" lastFinishedPulling="2025-09-09 05:37:41.42746575 +0000 UTC m=+9.384854620" observedRunningTime="2025-09-09 05:37:42.151578806 +0000 UTC m=+10.108967686" watchObservedRunningTime="2025-09-09 05:37:44.224483488 +0000 UTC m=+12.181872348" Sep 9 05:37:45.007093 update_engine[1574]: I20250909 05:37:45.006999 1574 update_attempter.cc:509] Updating boot flags... Sep 9 05:37:46.672739 sudo[1798]: pam_unix(sudo:session): session closed for user root Sep 9 05:37:46.674923 sshd[1797]: Connection closed by 10.0.0.1 port 35148 Sep 9 05:37:46.675822 sshd-session[1794]: pam_unix(sshd:session): session closed for user core Sep 9 05:37:46.681103 systemd[1]: sshd@6-10.0.0.118:22-10.0.0.1:35148.service: Deactivated successfully. Sep 9 05:37:46.688438 systemd[1]: session-7.scope: Deactivated successfully. Sep 9 05:37:46.689730 systemd[1]: session-7.scope: Consumed 4.040s CPU time, 218.8M memory peak. Sep 9 05:37:46.691659 systemd-logind[1571]: Session 7 logged out. Waiting for processes to exit. Sep 9 05:37:46.693412 systemd-logind[1571]: Removed session 7. Sep 9 05:37:48.900207 systemd[1]: Created slice kubepods-besteffort-pod43c598fd_5165_4876_8ce6_6a92a18474e8.slice - libcontainer container kubepods-besteffort-pod43c598fd_5165_4876_8ce6_6a92a18474e8.slice. Sep 9 05:37:48.916910 kubelet[2731]: I0909 05:37:48.916848 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5gmg\" (UniqueName: \"kubernetes.io/projected/43c598fd-5165-4876-8ce6-6a92a18474e8-kube-api-access-n5gmg\") pod \"calico-typha-6875c4fb9b-ln9fj\" (UID: \"43c598fd-5165-4876-8ce6-6a92a18474e8\") " pod="calico-system/calico-typha-6875c4fb9b-ln9fj" Sep 9 05:37:48.917556 kubelet[2731]: I0909 05:37:48.916929 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/43c598fd-5165-4876-8ce6-6a92a18474e8-typha-certs\") pod \"calico-typha-6875c4fb9b-ln9fj\" (UID: \"43c598fd-5165-4876-8ce6-6a92a18474e8\") " pod="calico-system/calico-typha-6875c4fb9b-ln9fj" Sep 9 05:37:48.917556 kubelet[2731]: I0909 05:37:48.916947 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43c598fd-5165-4876-8ce6-6a92a18474e8-tigera-ca-bundle\") pod \"calico-typha-6875c4fb9b-ln9fj\" (UID: \"43c598fd-5165-4876-8ce6-6a92a18474e8\") " pod="calico-system/calico-typha-6875c4fb9b-ln9fj" Sep 9 05:37:49.203268 systemd[1]: Created slice kubepods-besteffort-pod2be073de_7070_4458_8edc_45f2469b1c8a.slice - libcontainer container kubepods-besteffort-pod2be073de_7070_4458_8edc_45f2469b1c8a.slice. 
Sep 9 05:37:49.206053 containerd[1587]: time="2025-09-09T05:37:49.205819959Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6875c4fb9b-ln9fj,Uid:43c598fd-5165-4876-8ce6-6a92a18474e8,Namespace:calico-system,Attempt:0,}" Sep 9 05:37:49.219694 kubelet[2731]: I0909 05:37:49.219638 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/2be073de-7070-4458-8edc-45f2469b1c8a-cni-net-dir\") pod \"calico-node-trsmw\" (UID: \"2be073de-7070-4458-8edc-45f2469b1c8a\") " pod="calico-system/calico-node-trsmw" Sep 9 05:37:49.219694 kubelet[2731]: I0909 05:37:49.219679 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2be073de-7070-4458-8edc-45f2469b1c8a-lib-modules\") pod \"calico-node-trsmw\" (UID: \"2be073de-7070-4458-8edc-45f2469b1c8a\") " pod="calico-system/calico-node-trsmw" Sep 9 05:37:49.219694 kubelet[2731]: I0909 05:37:49.219694 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/2be073de-7070-4458-8edc-45f2469b1c8a-xtables-lock\") pod \"calico-node-trsmw\" (UID: \"2be073de-7070-4458-8edc-45f2469b1c8a\") " pod="calico-system/calico-node-trsmw" Sep 9 05:37:49.219892 kubelet[2731]: I0909 05:37:49.219711 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/2be073de-7070-4458-8edc-45f2469b1c8a-cni-bin-dir\") pod \"calico-node-trsmw\" (UID: \"2be073de-7070-4458-8edc-45f2469b1c8a\") " pod="calico-system/calico-node-trsmw" Sep 9 05:37:49.219892 kubelet[2731]: I0909 05:37:49.219726 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/2be073de-7070-4458-8edc-45f2469b1c8a-node-certs\") pod \"calico-node-trsmw\" (UID: \"2be073de-7070-4458-8edc-45f2469b1c8a\") " pod="calico-system/calico-node-trsmw" Sep 9 05:37:49.219892 kubelet[2731]: I0909 05:37:49.219741 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/2be073de-7070-4458-8edc-45f2469b1c8a-var-run-calico\") pod \"calico-node-trsmw\" (UID: \"2be073de-7070-4458-8edc-45f2469b1c8a\") " pod="calico-system/calico-node-trsmw" Sep 9 05:37:49.219892 kubelet[2731]: I0909 05:37:49.219765 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/2be073de-7070-4458-8edc-45f2469b1c8a-policysync\") pod \"calico-node-trsmw\" (UID: \"2be073de-7070-4458-8edc-45f2469b1c8a\") " pod="calico-system/calico-node-trsmw" Sep 9 05:37:49.219892 kubelet[2731]: I0909 05:37:49.219779 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/2be073de-7070-4458-8edc-45f2469b1c8a-cni-log-dir\") pod \"calico-node-trsmw\" (UID: \"2be073de-7070-4458-8edc-45f2469b1c8a\") " pod="calico-system/calico-node-trsmw" Sep 9 05:37:49.220007 kubelet[2731]: I0909 05:37:49.219795 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: 
\"kubernetes.io/host-path/2be073de-7070-4458-8edc-45f2469b1c8a-var-lib-calico\") pod \"calico-node-trsmw\" (UID: \"2be073de-7070-4458-8edc-45f2469b1c8a\") " pod="calico-system/calico-node-trsmw" Sep 9 05:37:49.220007 kubelet[2731]: I0909 05:37:49.219809 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2be073de-7070-4458-8edc-45f2469b1c8a-tigera-ca-bundle\") pod \"calico-node-trsmw\" (UID: \"2be073de-7070-4458-8edc-45f2469b1c8a\") " pod="calico-system/calico-node-trsmw" Sep 9 05:37:49.220007 kubelet[2731]: I0909 05:37:49.219847 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96sk8\" (UniqueName: \"kubernetes.io/projected/2be073de-7070-4458-8edc-45f2469b1c8a-kube-api-access-96sk8\") pod \"calico-node-trsmw\" (UID: \"2be073de-7070-4458-8edc-45f2469b1c8a\") " pod="calico-system/calico-node-trsmw" Sep 9 05:37:49.220007 kubelet[2731]: I0909 05:37:49.219866 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/2be073de-7070-4458-8edc-45f2469b1c8a-flexvol-driver-host\") pod \"calico-node-trsmw\" (UID: \"2be073de-7070-4458-8edc-45f2469b1c8a\") " pod="calico-system/calico-node-trsmw" Sep 9 05:37:49.321619 kubelet[2731]: E0909 05:37:49.321542 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.321619 kubelet[2731]: W0909 05:37:49.321563 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.321619 kubelet[2731]: E0909 05:37:49.321582 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.321799 kubelet[2731]: E0909 05:37:49.321763 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.321799 kubelet[2731]: W0909 05:37:49.321771 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.321799 kubelet[2731]: E0909 05:37:49.321779 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.322433 kubelet[2731]: E0909 05:37:49.321916 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.322433 kubelet[2731]: W0909 05:37:49.321926 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.322433 kubelet[2731]: E0909 05:37:49.321934 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:37:49.322433 kubelet[2731]: E0909 05:37:49.322146 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.322433 kubelet[2731]: W0909 05:37:49.322153 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.322433 kubelet[2731]: E0909 05:37:49.322161 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.322433 kubelet[2731]: E0909 05:37:49.322314 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.322433 kubelet[2731]: W0909 05:37:49.322321 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.322433 kubelet[2731]: E0909 05:37:49.322328 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.324726 kubelet[2731]: E0909 05:37:49.324704 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.324726 kubelet[2731]: W0909 05:37:49.324720 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.324825 kubelet[2731]: E0909 05:37:49.324731 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.374693 kubelet[2731]: E0909 05:37:49.374652 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.374693 kubelet[2731]: W0909 05:37:49.374679 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.374881 kubelet[2731]: E0909 05:37:49.374715 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.401287 containerd[1587]: time="2025-09-09T05:37:49.401239410Z" level=info msg="connecting to shim aa29b56020d0277d254d44f72b536560cf3f1f434b259310eb522ce9be114b59" address="unix:///run/containerd/s/4882128342cef281c141f575ef4e9759202eab2b3c1a5261828eddf6ee0dcff9" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:37:49.429755 systemd[1]: Started cri-containerd-aa29b56020d0277d254d44f72b536560cf3f1f434b259310eb522ce9be114b59.scope - libcontainer container aa29b56020d0277d254d44f72b536560cf3f1f434b259310eb522ce9be114b59. 
Sep 9 05:37:49.479110 containerd[1587]: time="2025-09-09T05:37:49.478203498Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6875c4fb9b-ln9fj,Uid:43c598fd-5165-4876-8ce6-6a92a18474e8,Namespace:calico-system,Attempt:0,} returns sandbox id \"aa29b56020d0277d254d44f72b536560cf3f1f434b259310eb522ce9be114b59\"" Sep 9 05:37:49.479728 containerd[1587]: time="2025-09-09T05:37:49.479673617Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 9 05:37:49.488431 kubelet[2731]: E0909 05:37:49.488298 2731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fr4sd" podUID="382d750e-f8f8-4cd7-87b5-dfa1289af050" Sep 9 05:37:49.507699 containerd[1587]: time="2025-09-09T05:37:49.507663477Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-trsmw,Uid:2be073de-7070-4458-8edc-45f2469b1c8a,Namespace:calico-system,Attempt:0,}" Sep 9 05:37:49.517233 kubelet[2731]: E0909 05:37:49.517206 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.517233 kubelet[2731]: W0909 05:37:49.517228 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.517343 kubelet[2731]: E0909 05:37:49.517249 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.517458 kubelet[2731]: E0909 05:37:49.517443 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.517458 kubelet[2731]: W0909 05:37:49.517454 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.517536 kubelet[2731]: E0909 05:37:49.517462 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.517658 kubelet[2731]: E0909 05:37:49.517642 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.517658 kubelet[2731]: W0909 05:37:49.517653 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.517717 kubelet[2731]: E0909 05:37:49.517661 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:37:49.517844 kubelet[2731]: E0909 05:37:49.517829 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.517844 kubelet[2731]: W0909 05:37:49.517840 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.517891 kubelet[2731]: E0909 05:37:49.517847 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.518001 kubelet[2731]: E0909 05:37:49.517988 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.518001 kubelet[2731]: W0909 05:37:49.517997 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.518062 kubelet[2731]: E0909 05:37:49.518005 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.518157 kubelet[2731]: E0909 05:37:49.518144 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.518157 kubelet[2731]: W0909 05:37:49.518153 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.518212 kubelet[2731]: E0909 05:37:49.518161 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.518309 kubelet[2731]: E0909 05:37:49.518296 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.518309 kubelet[2731]: W0909 05:37:49.518305 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.518355 kubelet[2731]: E0909 05:37:49.518312 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.518477 kubelet[2731]: E0909 05:37:49.518464 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.518477 kubelet[2731]: W0909 05:37:49.518474 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.518543 kubelet[2731]: E0909 05:37:49.518482 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:37:49.518657 kubelet[2731]: E0909 05:37:49.518640 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.518657 kubelet[2731]: W0909 05:37:49.518651 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.518712 kubelet[2731]: E0909 05:37:49.518659 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.518839 kubelet[2731]: E0909 05:37:49.518824 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.518839 kubelet[2731]: W0909 05:37:49.518834 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.518896 kubelet[2731]: E0909 05:37:49.518841 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.518997 kubelet[2731]: E0909 05:37:49.518983 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.518997 kubelet[2731]: W0909 05:37:49.518994 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.519038 kubelet[2731]: E0909 05:37:49.519002 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.519180 kubelet[2731]: E0909 05:37:49.519164 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.519180 kubelet[2731]: W0909 05:37:49.519173 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.519234 kubelet[2731]: E0909 05:37:49.519189 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.519351 kubelet[2731]: E0909 05:37:49.519334 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.519351 kubelet[2731]: W0909 05:37:49.519343 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.519351 kubelet[2731]: E0909 05:37:49.519351 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:37:49.519506 kubelet[2731]: E0909 05:37:49.519490 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.519506 kubelet[2731]: W0909 05:37:49.519500 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.519506 kubelet[2731]: E0909 05:37:49.519507 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.519676 kubelet[2731]: E0909 05:37:49.519661 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.519676 kubelet[2731]: W0909 05:37:49.519670 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.519725 kubelet[2731]: E0909 05:37:49.519678 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.519852 kubelet[2731]: E0909 05:37:49.519824 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.519852 kubelet[2731]: W0909 05:37:49.519834 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.519852 kubelet[2731]: E0909 05:37:49.519841 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.520004 kubelet[2731]: E0909 05:37:49.519987 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.520004 kubelet[2731]: W0909 05:37:49.519996 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.520004 kubelet[2731]: E0909 05:37:49.520003 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.520199 kubelet[2731]: E0909 05:37:49.520155 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.520199 kubelet[2731]: W0909 05:37:49.520172 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.520199 kubelet[2731]: E0909 05:37:49.520183 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:37:49.520358 kubelet[2731]: E0909 05:37:49.520340 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.520358 kubelet[2731]: W0909 05:37:49.520352 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.520408 kubelet[2731]: E0909 05:37:49.520360 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.520545 kubelet[2731]: E0909 05:37:49.520513 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.520545 kubelet[2731]: W0909 05:37:49.520530 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.520545 kubelet[2731]: E0909 05:37:49.520539 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.522873 kubelet[2731]: E0909 05:37:49.522756 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.522873 kubelet[2731]: W0909 05:37:49.522770 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.522873 kubelet[2731]: E0909 05:37:49.522780 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.522873 kubelet[2731]: I0909 05:37:49.522803 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/382d750e-f8f8-4cd7-87b5-dfa1289af050-registration-dir\") pod \"csi-node-driver-fr4sd\" (UID: \"382d750e-f8f8-4cd7-87b5-dfa1289af050\") " pod="calico-system/csi-node-driver-fr4sd" Sep 9 05:37:49.522985 kubelet[2731]: E0909 05:37:49.522968 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.522985 kubelet[2731]: W0909 05:37:49.522978 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.523029 kubelet[2731]: E0909 05:37:49.523021 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:37:49.523056 kubelet[2731]: I0909 05:37:49.523037 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/382d750e-f8f8-4cd7-87b5-dfa1289af050-varrun\") pod \"csi-node-driver-fr4sd\" (UID: \"382d750e-f8f8-4cd7-87b5-dfa1289af050\") " pod="calico-system/csi-node-driver-fr4sd" Sep 9 05:37:49.523333 kubelet[2731]: E0909 05:37:49.523307 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.523333 kubelet[2731]: W0909 05:37:49.523331 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.523487 kubelet[2731]: E0909 05:37:49.523360 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.523518 kubelet[2731]: I0909 05:37:49.523510 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/382d750e-f8f8-4cd7-87b5-dfa1289af050-kubelet-dir\") pod \"csi-node-driver-fr4sd\" (UID: \"382d750e-f8f8-4cd7-87b5-dfa1289af050\") " pod="calico-system/csi-node-driver-fr4sd" Sep 9 05:37:49.523748 kubelet[2731]: E0909 05:37:49.523724 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.523748 kubelet[2731]: W0909 05:37:49.523737 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.523808 kubelet[2731]: E0909 05:37:49.523761 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.523808 kubelet[2731]: I0909 05:37:49.523775 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/382d750e-f8f8-4cd7-87b5-dfa1289af050-socket-dir\") pod \"csi-node-driver-fr4sd\" (UID: \"382d750e-f8f8-4cd7-87b5-dfa1289af050\") " pod="calico-system/csi-node-driver-fr4sd" Sep 9 05:37:49.523958 kubelet[2731]: E0909 05:37:49.523932 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.523990 kubelet[2731]: W0909 05:37:49.523970 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.524052 kubelet[2731]: E0909 05:37:49.524001 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:37:49.524078 kubelet[2731]: I0909 05:37:49.524069 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2dxw\" (UniqueName: \"kubernetes.io/projected/382d750e-f8f8-4cd7-87b5-dfa1289af050-kube-api-access-v2dxw\") pod \"csi-node-driver-fr4sd\" (UID: \"382d750e-f8f8-4cd7-87b5-dfa1289af050\") " pod="calico-system/csi-node-driver-fr4sd" Sep 9 05:37:49.524134 kubelet[2731]: E0909 05:37:49.524119 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.524134 kubelet[2731]: W0909 05:37:49.524129 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.524186 kubelet[2731]: E0909 05:37:49.524158 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.524295 kubelet[2731]: E0909 05:37:49.524281 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.524295 kubelet[2731]: W0909 05:37:49.524291 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.524339 kubelet[2731]: E0909 05:37:49.524314 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.524469 kubelet[2731]: E0909 05:37:49.524454 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.524469 kubelet[2731]: W0909 05:37:49.524464 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.524523 kubelet[2731]: E0909 05:37:49.524476 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.524694 kubelet[2731]: E0909 05:37:49.524671 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.524694 kubelet[2731]: W0909 05:37:49.524682 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.524851 kubelet[2731]: E0909 05:37:49.524710 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:37:49.524899 kubelet[2731]: E0909 05:37:49.524885 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.524899 kubelet[2731]: W0909 05:37:49.524895 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.525009 kubelet[2731]: E0909 05:37:49.524906 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.525068 kubelet[2731]: E0909 05:37:49.525054 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.525068 kubelet[2731]: W0909 05:37:49.525063 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.525119 kubelet[2731]: E0909 05:37:49.525070 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.525329 kubelet[2731]: E0909 05:37:49.525293 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.525329 kubelet[2731]: W0909 05:37:49.525315 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.525389 kubelet[2731]: E0909 05:37:49.525339 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.525601 kubelet[2731]: E0909 05:37:49.525575 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.525632 kubelet[2731]: W0909 05:37:49.525605 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.525632 kubelet[2731]: E0909 05:37:49.525615 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.525816 kubelet[2731]: E0909 05:37:49.525801 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.525816 kubelet[2731]: W0909 05:37:49.525812 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.525889 kubelet[2731]: E0909 05:37:49.525820 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:37:49.526047 kubelet[2731]: E0909 05:37:49.526012 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.526047 kubelet[2731]: W0909 05:37:49.526024 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.526047 kubelet[2731]: E0909 05:37:49.526033 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.529982 containerd[1587]: time="2025-09-09T05:37:49.529937141Z" level=info msg="connecting to shim 2b0334088b601c21af12356ccd934f3e81960dce6be8e097589fd65bf63ce5b6" address="unix:///run/containerd/s/63605d651c19977c7ac54ca97409e0f6dbe82cd4f1195a870989287585d10985" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:37:49.553922 systemd[1]: Started cri-containerd-2b0334088b601c21af12356ccd934f3e81960dce6be8e097589fd65bf63ce5b6.scope - libcontainer container 2b0334088b601c21af12356ccd934f3e81960dce6be8e097589fd65bf63ce5b6. Sep 9 05:37:49.601236 containerd[1587]: time="2025-09-09T05:37:49.601184399Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-trsmw,Uid:2be073de-7070-4458-8edc-45f2469b1c8a,Namespace:calico-system,Attempt:0,} returns sandbox id \"2b0334088b601c21af12356ccd934f3e81960dce6be8e097589fd65bf63ce5b6\"" Sep 9 05:37:49.625322 kubelet[2731]: E0909 05:37:49.625283 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.625322 kubelet[2731]: W0909 05:37:49.625304 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.625322 kubelet[2731]: E0909 05:37:49.625324 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.625529 kubelet[2731]: E0909 05:37:49.625514 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.625529 kubelet[2731]: W0909 05:37:49.625524 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.625572 kubelet[2731]: E0909 05:37:49.625542 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:37:49.625786 kubelet[2731]: E0909 05:37:49.625760 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.625786 kubelet[2731]: W0909 05:37:49.625773 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.625786 kubelet[2731]: E0909 05:37:49.625785 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.626079 kubelet[2731]: E0909 05:37:49.626046 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.626079 kubelet[2731]: W0909 05:37:49.626070 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.626127 kubelet[2731]: E0909 05:37:49.626095 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.626268 kubelet[2731]: E0909 05:37:49.626252 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.626268 kubelet[2731]: W0909 05:37:49.626263 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.626320 kubelet[2731]: E0909 05:37:49.626274 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.626428 kubelet[2731]: E0909 05:37:49.626415 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.626428 kubelet[2731]: W0909 05:37:49.626425 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.626469 kubelet[2731]: E0909 05:37:49.626436 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.626637 kubelet[2731]: E0909 05:37:49.626620 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.626637 kubelet[2731]: W0909 05:37:49.626633 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.626685 kubelet[2731]: E0909 05:37:49.626646 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:37:49.626843 kubelet[2731]: E0909 05:37:49.626818 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.626843 kubelet[2731]: W0909 05:37:49.626827 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.626843 kubelet[2731]: E0909 05:37:49.626841 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.627011 kubelet[2731]: E0909 05:37:49.626997 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.627011 kubelet[2731]: W0909 05:37:49.627006 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.627080 kubelet[2731]: E0909 05:37:49.627034 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.627247 kubelet[2731]: E0909 05:37:49.627231 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.627247 kubelet[2731]: W0909 05:37:49.627241 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.627302 kubelet[2731]: E0909 05:37:49.627270 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.627435 kubelet[2731]: E0909 05:37:49.627421 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.627435 kubelet[2731]: W0909 05:37:49.627431 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.627482 kubelet[2731]: E0909 05:37:49.627467 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.627647 kubelet[2731]: E0909 05:37:49.627631 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.627647 kubelet[2731]: W0909 05:37:49.627641 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.627695 kubelet[2731]: E0909 05:37:49.627655 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:37:49.627891 kubelet[2731]: E0909 05:37:49.627864 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.627891 kubelet[2731]: W0909 05:37:49.627887 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.627953 kubelet[2731]: E0909 05:37:49.627901 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.628110 kubelet[2731]: E0909 05:37:49.628094 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.628110 kubelet[2731]: W0909 05:37:49.628107 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.628162 kubelet[2731]: E0909 05:37:49.628129 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.628348 kubelet[2731]: E0909 05:37:49.628334 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.628348 kubelet[2731]: W0909 05:37:49.628343 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.628407 kubelet[2731]: E0909 05:37:49.628365 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.628555 kubelet[2731]: E0909 05:37:49.628539 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.628555 kubelet[2731]: W0909 05:37:49.628549 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.628619 kubelet[2731]: E0909 05:37:49.628571 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.628758 kubelet[2731]: E0909 05:37:49.628735 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.628758 kubelet[2731]: W0909 05:37:49.628755 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.628817 kubelet[2731]: E0909 05:37:49.628790 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:37:49.628928 kubelet[2731]: E0909 05:37:49.628914 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.628928 kubelet[2731]: W0909 05:37:49.628924 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.628973 kubelet[2731]: E0909 05:37:49.628948 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.629083 kubelet[2731]: E0909 05:37:49.629069 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.629083 kubelet[2731]: W0909 05:37:49.629078 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.629134 kubelet[2731]: E0909 05:37:49.629091 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.629268 kubelet[2731]: E0909 05:37:49.629254 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.629268 kubelet[2731]: W0909 05:37:49.629264 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.629312 kubelet[2731]: E0909 05:37:49.629277 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.629440 kubelet[2731]: E0909 05:37:49.629426 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.629440 kubelet[2731]: W0909 05:37:49.629436 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.629486 kubelet[2731]: E0909 05:37:49.629448 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.629693 kubelet[2731]: E0909 05:37:49.629677 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.629693 kubelet[2731]: W0909 05:37:49.629688 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.629758 kubelet[2731]: E0909 05:37:49.629701 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:37:49.629885 kubelet[2731]: E0909 05:37:49.629871 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.629885 kubelet[2731]: W0909 05:37:49.629881 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.629938 kubelet[2731]: E0909 05:37:49.629893 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.630083 kubelet[2731]: E0909 05:37:49.630069 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.630083 kubelet[2731]: W0909 05:37:49.630079 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.630133 kubelet[2731]: E0909 05:37:49.630090 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.630358 kubelet[2731]: E0909 05:37:49.630342 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.630358 kubelet[2731]: W0909 05:37:49.630352 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.630412 kubelet[2731]: E0909 05:37:49.630361 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:49.637040 kubelet[2731]: E0909 05:37:49.637004 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:49.637040 kubelet[2731]: W0909 05:37:49.637023 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:49.637040 kubelet[2731]: E0909 05:37:49.637041 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:51.117918 kubelet[2731]: E0909 05:37:51.117863 2731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fr4sd" podUID="382d750e-f8f8-4cd7-87b5-dfa1289af050" Sep 9 05:37:51.872316 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2542535722.mount: Deactivated successfully. 
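The repeated driver-call failures above come from the kubelet's FlexVolume prober: each directory under /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ is treated as a vendor~driver plugin, the executable inside is invoked with the single argument "init", and whatever it prints to stdout is unmarshalled as JSON. Because the nodeagent~uds/uds binary is not present yet, each probe gets empty output and the JSON decode fails with "unexpected end of JSON input". A minimal stub of the shape such a driver is expected to have (an illustrative sketch, not the real uds binary) looks like this:

package main

import (
	"encoding/json"
	"fmt"
	"os"
)

// driverStatus mirrors the JSON envelope the FlexVolume prober expects on stdout.
type driverStatus struct {
	Status       string          `json:"status"`
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func main() {
	// The kubelet runs "<driver> init" when it probes the plugin directory.
	if len(os.Args) > 1 && os.Args[1] == "init" {
		out, _ := json.Marshal(driverStatus{
			Status:       "Success",
			Capabilities: map[string]bool{"attach": false},
		})
		fmt.Println(string(out)) // anything other than valid JSON here reproduces the error above
		return
	}
	// Other verbs (mount, unmount, ...) would answer in the same JSON envelope.
	out, _ := json.Marshal(driverStatus{Status: "Not supported"})
	fmt.Println(string(out))
	os.Exit(1)
}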
Sep 9 05:37:52.306928 containerd[1587]: time="2025-09-09T05:37:52.306874424Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:37:52.307883 containerd[1587]: time="2025-09-09T05:37:52.307840578Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389" Sep 9 05:37:52.309143 containerd[1587]: time="2025-09-09T05:37:52.309105716Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:37:52.311256 containerd[1587]: time="2025-09-09T05:37:52.311216832Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:37:52.311917 containerd[1587]: time="2025-09-09T05:37:52.311868432Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 2.8321707s" Sep 9 05:37:52.311917 containerd[1587]: time="2025-09-09T05:37:52.311911774Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\"" Sep 9 05:37:52.312830 containerd[1587]: time="2025-09-09T05:37:52.312801844Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 9 05:37:52.322559 containerd[1587]: time="2025-09-09T05:37:52.322517397Z" level=info msg="CreateContainer within sandbox \"aa29b56020d0277d254d44f72b536560cf3f1f434b259310eb522ce9be114b59\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 9 05:37:52.330632 containerd[1587]: time="2025-09-09T05:37:52.330595418Z" level=info msg="Container 694e2e0abd57ceec663a281f83957115e2c3f7e33d8701d2e8df64e3dddb45f9: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:37:52.375760 containerd[1587]: time="2025-09-09T05:37:52.375702648Z" level=info msg="CreateContainer within sandbox \"aa29b56020d0277d254d44f72b536560cf3f1f434b259310eb522ce9be114b59\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"694e2e0abd57ceec663a281f83957115e2c3f7e33d8701d2e8df64e3dddb45f9\"" Sep 9 05:37:52.376322 containerd[1587]: time="2025-09-09T05:37:52.376277704Z" level=info msg="StartContainer for \"694e2e0abd57ceec663a281f83957115e2c3f7e33d8701d2e8df64e3dddb45f9\"" Sep 9 05:37:52.377391 containerd[1587]: time="2025-09-09T05:37:52.377368443Z" level=info msg="connecting to shim 694e2e0abd57ceec663a281f83957115e2c3f7e33d8701d2e8df64e3dddb45f9" address="unix:///run/containerd/s/4882128342cef281c141f575ef4e9759202eab2b3c1a5261828eddf6ee0dcff9" protocol=ttrpc version=3 Sep 9 05:37:52.401737 systemd[1]: Started cri-containerd-694e2e0abd57ceec663a281f83957115e2c3f7e33d8701d2e8df64e3dddb45f9.scope - libcontainer container 694e2e0abd57ceec663a281f83957115e2c3f7e33d8701d2e8df64e3dddb45f9. 
Sep 9 05:37:52.447334 containerd[1587]: time="2025-09-09T05:37:52.447289410Z" level=info msg="StartContainer for \"694e2e0abd57ceec663a281f83957115e2c3f7e33d8701d2e8df64e3dddb45f9\" returns successfully" Sep 9 05:37:53.118358 kubelet[2731]: E0909 05:37:53.118293 2731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fr4sd" podUID="382d750e-f8f8-4cd7-87b5-dfa1289af050" Sep 9 05:37:53.252633 kubelet[2731]: E0909 05:37:53.252600 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:53.252633 kubelet[2731]: W0909 05:37:53.252621 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:53.252633 kubelet[2731]: E0909 05:37:53.252640 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:53.252855 kubelet[2731]: E0909 05:37:53.252840 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:53.252855 kubelet[2731]: W0909 05:37:53.252850 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:53.252901 kubelet[2731]: E0909 05:37:53.252858 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:53.253032 kubelet[2731]: E0909 05:37:53.253017 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:53.253032 kubelet[2731]: W0909 05:37:53.253027 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:53.253099 kubelet[2731]: E0909 05:37:53.253035 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:53.253259 kubelet[2731]: E0909 05:37:53.253227 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:53.253259 kubelet[2731]: W0909 05:37:53.253248 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:53.253409 kubelet[2731]: E0909 05:37:53.253272 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:37:53.253549 kubelet[2731]: E0909 05:37:53.253533 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:53.253549 kubelet[2731]: W0909 05:37:53.253543 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:53.253618 kubelet[2731]: E0909 05:37:53.253551 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:53.253746 kubelet[2731]: E0909 05:37:53.253732 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:53.253746 kubelet[2731]: W0909 05:37:53.253741 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:53.253791 kubelet[2731]: E0909 05:37:53.253749 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:53.253950 kubelet[2731]: E0909 05:37:53.253927 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:53.253950 kubelet[2731]: W0909 05:37:53.253938 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:53.253950 kubelet[2731]: E0909 05:37:53.253946 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:53.254120 kubelet[2731]: E0909 05:37:53.254106 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:53.254120 kubelet[2731]: W0909 05:37:53.254115 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:53.254179 kubelet[2731]: E0909 05:37:53.254122 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:53.254323 kubelet[2731]: E0909 05:37:53.254310 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:53.254323 kubelet[2731]: W0909 05:37:53.254319 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:53.254375 kubelet[2731]: E0909 05:37:53.254327 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:37:53.254490 kubelet[2731]: E0909 05:37:53.254477 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:53.254490 kubelet[2731]: W0909 05:37:53.254486 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:53.254535 kubelet[2731]: E0909 05:37:53.254495 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:53.254681 kubelet[2731]: E0909 05:37:53.254667 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:53.254681 kubelet[2731]: W0909 05:37:53.254677 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:53.254743 kubelet[2731]: E0909 05:37:53.254685 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:53.254893 kubelet[2731]: E0909 05:37:53.254865 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:53.254893 kubelet[2731]: W0909 05:37:53.254876 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:53.254893 kubelet[2731]: E0909 05:37:53.254883 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:53.255113 kubelet[2731]: E0909 05:37:53.255056 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:53.255113 kubelet[2731]: W0909 05:37:53.255063 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:53.255113 kubelet[2731]: E0909 05:37:53.255069 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:53.255235 kubelet[2731]: E0909 05:37:53.255218 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:53.255235 kubelet[2731]: W0909 05:37:53.255228 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:53.255235 kubelet[2731]: E0909 05:37:53.255234 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:37:53.255403 kubelet[2731]: E0909 05:37:53.255388 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:53.255403 kubelet[2731]: W0909 05:37:53.255396 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:53.255403 kubelet[2731]: E0909 05:37:53.255403 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:53.255654 kubelet[2731]: E0909 05:37:53.255636 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:53.255654 kubelet[2731]: W0909 05:37:53.255647 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:53.255654 kubelet[2731]: E0909 05:37:53.255654 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:53.255854 kubelet[2731]: E0909 05:37:53.255840 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:53.255854 kubelet[2731]: W0909 05:37:53.255850 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:53.255901 kubelet[2731]: E0909 05:37:53.255879 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:53.256158 kubelet[2731]: E0909 05:37:53.256141 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:53.256158 kubelet[2731]: W0909 05:37:53.256154 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:53.256222 kubelet[2731]: E0909 05:37:53.256170 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:53.256376 kubelet[2731]: E0909 05:37:53.256362 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:53.256376 kubelet[2731]: W0909 05:37:53.256372 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:53.256430 kubelet[2731]: E0909 05:37:53.256386 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:37:53.256686 kubelet[2731]: E0909 05:37:53.256572 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:53.256686 kubelet[2731]: W0909 05:37:53.256596 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:53.256686 kubelet[2731]: E0909 05:37:53.256635 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:53.256871 kubelet[2731]: E0909 05:37:53.256857 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:53.256871 kubelet[2731]: W0909 05:37:53.256867 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:53.256919 kubelet[2731]: E0909 05:37:53.256880 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:53.257072 kubelet[2731]: E0909 05:37:53.257057 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:53.257072 kubelet[2731]: W0909 05:37:53.257066 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:53.257130 kubelet[2731]: E0909 05:37:53.257104 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:53.257257 kubelet[2731]: E0909 05:37:53.257241 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:53.257257 kubelet[2731]: W0909 05:37:53.257253 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:53.257318 kubelet[2731]: E0909 05:37:53.257281 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:53.257477 kubelet[2731]: E0909 05:37:53.257461 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:53.257477 kubelet[2731]: W0909 05:37:53.257471 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:53.257536 kubelet[2731]: E0909 05:37:53.257483 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:37:53.257693 kubelet[2731]: E0909 05:37:53.257673 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:53.257693 kubelet[2731]: W0909 05:37:53.257689 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:53.257770 kubelet[2731]: E0909 05:37:53.257711 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:53.257879 kubelet[2731]: E0909 05:37:53.257866 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:53.257879 kubelet[2731]: W0909 05:37:53.257875 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:53.257934 kubelet[2731]: E0909 05:37:53.257888 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:53.258086 kubelet[2731]: E0909 05:37:53.258063 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:53.258086 kubelet[2731]: W0909 05:37:53.258072 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:53.258141 kubelet[2731]: E0909 05:37:53.258094 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:53.258383 kubelet[2731]: E0909 05:37:53.258359 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:53.258383 kubelet[2731]: W0909 05:37:53.258373 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:53.258512 kubelet[2731]: E0909 05:37:53.258395 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:53.258614 kubelet[2731]: E0909 05:37:53.258572 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:53.258614 kubelet[2731]: W0909 05:37:53.258598 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:53.258614 kubelet[2731]: E0909 05:37:53.258612 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:37:53.258869 kubelet[2731]: E0909 05:37:53.258851 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:53.258869 kubelet[2731]: W0909 05:37:53.258863 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:53.258945 kubelet[2731]: E0909 05:37:53.258878 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:53.259094 kubelet[2731]: E0909 05:37:53.259080 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:53.259094 kubelet[2731]: W0909 05:37:53.259090 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:53.259187 kubelet[2731]: E0909 05:37:53.259104 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:53.259371 kubelet[2731]: E0909 05:37:53.259355 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:53.259371 kubelet[2731]: W0909 05:37:53.259368 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:53.259428 kubelet[2731]: E0909 05:37:53.259398 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:37:53.259625 kubelet[2731]: E0909 05:37:53.259608 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:37:53.259625 kubelet[2731]: W0909 05:37:53.259623 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:37:53.259680 kubelet[2731]: E0909 05:37:53.259635 2731 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:37:53.616324 containerd[1587]: time="2025-09-09T05:37:53.616270117Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:37:53.617025 containerd[1587]: time="2025-09-09T05:37:53.616994094Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660" Sep 9 05:37:53.618183 containerd[1587]: time="2025-09-09T05:37:53.618154614Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:37:53.620091 containerd[1587]: time="2025-09-09T05:37:53.620059790Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:37:53.620637 containerd[1587]: time="2025-09-09T05:37:53.620578548Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 1.307744132s" Sep 9 05:37:53.620665 containerd[1587]: time="2025-09-09T05:37:53.620636859Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\"" Sep 9 05:37:53.622613 containerd[1587]: time="2025-09-09T05:37:53.622518619Z" level=info msg="CreateContainer within sandbox \"2b0334088b601c21af12356ccd934f3e81960dce6be8e097589fd65bf63ce5b6\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 9 05:37:53.631149 containerd[1587]: time="2025-09-09T05:37:53.631089023Z" level=info msg="Container b882f4bbbf8b809fd04087bb6228ba51d4a79200f71ae7e5fd606581193cfaf3: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:37:53.640029 containerd[1587]: time="2025-09-09T05:37:53.639991463Z" level=info msg="CreateContainer within sandbox \"2b0334088b601c21af12356ccd934f3e81960dce6be8e097589fd65bf63ce5b6\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"b882f4bbbf8b809fd04087bb6228ba51d4a79200f71ae7e5fd606581193cfaf3\"" Sep 9 05:37:53.640496 containerd[1587]: time="2025-09-09T05:37:53.640451320Z" level=info msg="StartContainer for \"b882f4bbbf8b809fd04087bb6228ba51d4a79200f71ae7e5fd606581193cfaf3\"" Sep 9 05:37:53.641767 containerd[1587]: time="2025-09-09T05:37:53.641735434Z" level=info msg="connecting to shim b882f4bbbf8b809fd04087bb6228ba51d4a79200f71ae7e5fd606581193cfaf3" address="unix:///run/containerd/s/63605d651c19977c7ac54ca97409e0f6dbe82cd4f1195a870989287585d10985" protocol=ttrpc version=3 Sep 9 05:37:53.659813 systemd[1]: Started cri-containerd-b882f4bbbf8b809fd04087bb6228ba51d4a79200f71ae7e5fd606581193cfaf3.scope - libcontainer container b882f4bbbf8b809fd04087bb6228ba51d4a79200f71ae7e5fd606581193cfaf3. 
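The flexvol-driver container created above runs calico's pod2daemon-flexvol image, which in a typical calico deployment installs the uds FlexVolume driver into the kubelet plugin directory the prober has been complaining about; that purpose is an assumption from calico's usual layout, not something this log states. Until a binary exists at the probed path, every probe fails as logged. A trivial, illustrative check of that path:

package main

import (
	"fmt"
	"os"
)

func main() {
	// Path the kubelet prober reports in the errors above, copied verbatim from the log.
	const driver = "/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds"
	if fi, err := os.Stat(driver); err != nil {
		fmt.Println("driver binary not installed yet:", err)
	} else {
		fmt.Printf("driver present (%d bytes, mode %v); the probe errors above should stop\n", fi.Size(), fi.Mode())
	}
}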
Sep 9 05:37:53.700294 containerd[1587]: time="2025-09-09T05:37:53.700121346Z" level=info msg="StartContainer for \"b882f4bbbf8b809fd04087bb6228ba51d4a79200f71ae7e5fd606581193cfaf3\" returns successfully" Sep 9 05:37:53.709476 systemd[1]: cri-containerd-b882f4bbbf8b809fd04087bb6228ba51d4a79200f71ae7e5fd606581193cfaf3.scope: Deactivated successfully. Sep 9 05:37:53.710058 systemd[1]: cri-containerd-b882f4bbbf8b809fd04087bb6228ba51d4a79200f71ae7e5fd606581193cfaf3.scope: Consumed 35ms CPU time, 6.3M memory peak, 4.6M written to disk. Sep 9 05:37:53.712826 containerd[1587]: time="2025-09-09T05:37:53.712772291Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b882f4bbbf8b809fd04087bb6228ba51d4a79200f71ae7e5fd606581193cfaf3\" id:\"b882f4bbbf8b809fd04087bb6228ba51d4a79200f71ae7e5fd606581193cfaf3\" pid:3446 exited_at:{seconds:1757396273 nanos:712182738}" Sep 9 05:37:53.712981 containerd[1587]: time="2025-09-09T05:37:53.712923987Z" level=info msg="received exit event container_id:\"b882f4bbbf8b809fd04087bb6228ba51d4a79200f71ae7e5fd606581193cfaf3\" id:\"b882f4bbbf8b809fd04087bb6228ba51d4a79200f71ae7e5fd606581193cfaf3\" pid:3446 exited_at:{seconds:1757396273 nanos:712182738}" Sep 9 05:37:53.736907 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b882f4bbbf8b809fd04087bb6228ba51d4a79200f71ae7e5fd606581193cfaf3-rootfs.mount: Deactivated successfully. Sep 9 05:37:54.167705 kubelet[2731]: I0909 05:37:54.167369 2731 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 05:37:54.168851 containerd[1587]: time="2025-09-09T05:37:54.168823718Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 9 05:37:54.181970 kubelet[2731]: I0909 05:37:54.181557 2731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-6875c4fb9b-ln9fj" podStartSLOduration=3.348139051 podStartE2EDuration="6.18153712s" podCreationTimestamp="2025-09-09 05:37:48 +0000 UTC" firstStartedPulling="2025-09-09 05:37:49.479336419 +0000 UTC m=+17.436725289" lastFinishedPulling="2025-09-09 05:37:52.312734488 +0000 UTC m=+20.270123358" observedRunningTime="2025-09-09 05:37:53.17232974 +0000 UTC m=+21.129718610" watchObservedRunningTime="2025-09-09 05:37:54.18153712 +0000 UTC m=+22.138925990" Sep 9 05:37:55.118203 kubelet[2731]: E0909 05:37:55.118142 2731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fr4sd" podUID="382d750e-f8f8-4cd7-87b5-dfa1289af050" Sep 9 05:37:56.675158 containerd[1587]: time="2025-09-09T05:37:56.675101864Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:37:56.675879 containerd[1587]: time="2025-09-09T05:37:56.675847430Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613" Sep 9 05:37:56.676989 containerd[1587]: time="2025-09-09T05:37:56.676945219Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:37:56.678811 containerd[1587]: time="2025-09-09T05:37:56.678779426Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:37:56.679303 containerd[1587]: time="2025-09-09T05:37:56.679273658Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 2.510396529s" Sep 9 05:37:56.679303 containerd[1587]: time="2025-09-09T05:37:56.679298635Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\"" Sep 9 05:37:56.681716 containerd[1587]: time="2025-09-09T05:37:56.681674724Z" level=info msg="CreateContainer within sandbox \"2b0334088b601c21af12356ccd934f3e81960dce6be8e097589fd65bf63ce5b6\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 9 05:37:56.691012 containerd[1587]: time="2025-09-09T05:37:56.690985134Z" level=info msg="Container 59537d71a28f5b1b14ec463ce373e2a703ac851c1cec56f8ade70cfabeda2d1d: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:37:56.699802 containerd[1587]: time="2025-09-09T05:37:56.699765846Z" level=info msg="CreateContainer within sandbox \"2b0334088b601c21af12356ccd934f3e81960dce6be8e097589fd65bf63ce5b6\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"59537d71a28f5b1b14ec463ce373e2a703ac851c1cec56f8ade70cfabeda2d1d\"" Sep 9 05:37:56.701609 containerd[1587]: time="2025-09-09T05:37:56.700141975Z" level=info msg="StartContainer for \"59537d71a28f5b1b14ec463ce373e2a703ac851c1cec56f8ade70cfabeda2d1d\"" Sep 9 05:37:56.701609 containerd[1587]: time="2025-09-09T05:37:56.701376943Z" level=info msg="connecting to shim 59537d71a28f5b1b14ec463ce373e2a703ac851c1cec56f8ade70cfabeda2d1d" address="unix:///run/containerd/s/63605d651c19977c7ac54ca97409e0f6dbe82cd4f1195a870989287585d10985" protocol=ttrpc version=3 Sep 9 05:37:56.720722 systemd[1]: Started cri-containerd-59537d71a28f5b1b14ec463ce373e2a703ac851c1cec56f8ade70cfabeda2d1d.scope - libcontainer container 59537d71a28f5b1b14ec463ce373e2a703ac851c1cec56f8ade70cfabeda2d1d. Sep 9 05:37:56.761031 containerd[1587]: time="2025-09-09T05:37:56.760994682Z" level=info msg="StartContainer for \"59537d71a28f5b1b14ec463ce373e2a703ac851c1cec56f8ade70cfabeda2d1d\" returns successfully" Sep 9 05:37:57.117774 kubelet[2731]: E0909 05:37:57.117620 2731 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fr4sd" podUID="382d750e-f8f8-4cd7-87b5-dfa1289af050" Sep 9 05:37:57.764336 containerd[1587]: time="2025-09-09T05:37:57.764293896Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 9 05:37:57.768425 systemd[1]: cri-containerd-59537d71a28f5b1b14ec463ce373e2a703ac851c1cec56f8ade70cfabeda2d1d.scope: Deactivated successfully. 
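The "failed to reload cni configuration" error above is containerd reacting to a write in /etc/cni/net.d: it rescans the directory, but only files that look like CNI network configs (conventionally *.conf, *.conflist or *.json) count, and calico-kubeconfig is a credentials file rather than a network config, so the reload still finds nothing and the node stays NetworkReady=false, matching the recurring "cni plugin not initialized" pod errors. A rough stdlib approximation of that rescan (a sketch of the behaviour, not containerd's actual code path):

package main

import (
	"fmt"
	"path/filepath"
)

func main() {
	// Approximate the post-event rescan: look for CNI network config files in the conf dir.
	dir := "/etc/cni/net.d"
	var found []string
	for _, pat := range []string{"*.conf", "*.conflist", "*.json"} {
		matches, _ := filepath.Glob(filepath.Join(dir, pat))
		found = append(found, matches...)
	}
	if len(found) == 0 {
		fmt.Println("no network config found in", dir, "- NetworkReady stays false")
		return
	}
	fmt.Println("network configs:", found)
}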
Sep 9 05:37:57.769959 containerd[1587]: time="2025-09-09T05:37:57.769418222Z" level=info msg="received exit event container_id:\"59537d71a28f5b1b14ec463ce373e2a703ac851c1cec56f8ade70cfabeda2d1d\" id:\"59537d71a28f5b1b14ec463ce373e2a703ac851c1cec56f8ade70cfabeda2d1d\" pid:3505 exited_at:{seconds:1757396277 nanos:769182538}" Sep 9 05:37:57.769959 containerd[1587]: time="2025-09-09T05:37:57.769756379Z" level=info msg="TaskExit event in podsandbox handler container_id:\"59537d71a28f5b1b14ec463ce373e2a703ac851c1cec56f8ade70cfabeda2d1d\" id:\"59537d71a28f5b1b14ec463ce373e2a703ac851c1cec56f8ade70cfabeda2d1d\" pid:3505 exited_at:{seconds:1757396277 nanos:769182538}" Sep 9 05:37:57.768896 systemd[1]: cri-containerd-59537d71a28f5b1b14ec463ce373e2a703ac851c1cec56f8ade70cfabeda2d1d.scope: Consumed 541ms CPU time, 179.9M memory peak, 3.4M read from disk, 171.3M written to disk. Sep 9 05:37:57.795836 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-59537d71a28f5b1b14ec463ce373e2a703ac851c1cec56f8ade70cfabeda2d1d-rootfs.mount: Deactivated successfully. Sep 9 05:37:57.864573 kubelet[2731]: I0909 05:37:57.864534 2731 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Sep 9 05:37:58.062167 systemd[1]: Created slice kubepods-burstable-podf006145d_362b_4e7a_b943_f0f12d014871.slice - libcontainer container kubepods-burstable-podf006145d_362b_4e7a_b943_f0f12d014871.slice. Sep 9 05:37:58.068458 systemd[1]: Created slice kubepods-besteffort-podb4ee7a58_3044_4fd9_b673_e51540fd4be9.slice - libcontainer container kubepods-besteffort-podb4ee7a58_3044_4fd9_b673_e51540fd4be9.slice. Sep 9 05:37:58.073031 systemd[1]: Created slice kubepods-burstable-podf552aed4_9ee6_4d96_b587_80305d432ef7.slice - libcontainer container kubepods-burstable-podf552aed4_9ee6_4d96_b587_80305d432ef7.slice. Sep 9 05:37:58.077770 systemd[1]: Created slice kubepods-besteffort-podbdb5173f_1c88_48a1_904a_4de1b9173e2a.slice - libcontainer container kubepods-besteffort-podbdb5173f_1c88_48a1_904a_4de1b9173e2a.slice. Sep 9 05:37:58.083533 systemd[1]: Created slice kubepods-besteffort-podce61c8c6_8da2_4f73_a908_519f05cb6a63.slice - libcontainer container kubepods-besteffort-podce61c8c6_8da2_4f73_a908_519f05cb6a63.slice. Sep 9 05:37:58.088250 systemd[1]: Created slice kubepods-besteffort-podd6d53f50_cd74_44ff_bef9_71572dfb8e98.slice - libcontainer container kubepods-besteffort-podd6d53f50_cd74_44ff_bef9_71572dfb8e98.slice. Sep 9 05:37:58.092941 systemd[1]: Created slice kubepods-besteffort-pode24812f7_a033_4586_8f78_dd439cb8f791.slice - libcontainer container kubepods-besteffort-pode24812f7_a033_4586_8f78_dd439cb8f791.slice. 
Sep 9 05:37:58.182823 containerd[1587]: time="2025-09-09T05:37:58.182773696Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 9 05:37:58.195133 kubelet[2731]: I0909 05:37:58.195089 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xcsd\" (UniqueName: \"kubernetes.io/projected/bdb5173f-1c88-48a1-904a-4de1b9173e2a-kube-api-access-7xcsd\") pod \"calico-kube-controllers-54c44569bd-mj58w\" (UID: \"bdb5173f-1c88-48a1-904a-4de1b9173e2a\") " pod="calico-system/calico-kube-controllers-54c44569bd-mj58w" Sep 9 05:37:58.195133 kubelet[2731]: I0909 05:37:58.195132 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbn7p\" (UniqueName: \"kubernetes.io/projected/b4ee7a58-3044-4fd9-b673-e51540fd4be9-kube-api-access-lbn7p\") pod \"goldmane-7988f88666-45sx9\" (UID: \"b4ee7a58-3044-4fd9-b673-e51540fd4be9\") " pod="calico-system/goldmane-7988f88666-45sx9" Sep 9 05:37:58.195539 kubelet[2731]: I0909 05:37:58.195151 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6gn9\" (UniqueName: \"kubernetes.io/projected/e24812f7-a033-4586-8f78-dd439cb8f791-kube-api-access-l6gn9\") pod \"calico-apiserver-64659bc7c5-hdpgx\" (UID: \"e24812f7-a033-4586-8f78-dd439cb8f791\") " pod="calico-apiserver/calico-apiserver-64659bc7c5-hdpgx" Sep 9 05:37:58.195539 kubelet[2731]: I0909 05:37:58.195167 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c5pb\" (UniqueName: \"kubernetes.io/projected/ce61c8c6-8da2-4f73-a908-519f05cb6a63-kube-api-access-8c5pb\") pod \"whisker-599845b544-mffqz\" (UID: \"ce61c8c6-8da2-4f73-a908-519f05cb6a63\") " pod="calico-system/whisker-599845b544-mffqz" Sep 9 05:37:58.195539 kubelet[2731]: I0909 05:37:58.195185 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/d6d53f50-cd74-44ff-bef9-71572dfb8e98-calico-apiserver-certs\") pod \"calico-apiserver-64659bc7c5-dbld7\" (UID: \"d6d53f50-cd74-44ff-bef9-71572dfb8e98\") " pod="calico-apiserver/calico-apiserver-64659bc7c5-dbld7" Sep 9 05:37:58.195539 kubelet[2731]: I0909 05:37:58.195202 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thbtv\" (UniqueName: \"kubernetes.io/projected/f552aed4-9ee6-4d96-b587-80305d432ef7-kube-api-access-thbtv\") pod \"coredns-7c65d6cfc9-tpcjd\" (UID: \"f552aed4-9ee6-4d96-b587-80305d432ef7\") " pod="kube-system/coredns-7c65d6cfc9-tpcjd" Sep 9 05:37:58.195539 kubelet[2731]: I0909 05:37:58.195300 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zz4w\" (UniqueName: \"kubernetes.io/projected/d6d53f50-cd74-44ff-bef9-71572dfb8e98-kube-api-access-5zz4w\") pod \"calico-apiserver-64659bc7c5-dbld7\" (UID: \"d6d53f50-cd74-44ff-bef9-71572dfb8e98\") " pod="calico-apiserver/calico-apiserver-64659bc7c5-dbld7" Sep 9 05:37:58.195695 kubelet[2731]: I0909 05:37:58.195364 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdb5173f-1c88-48a1-904a-4de1b9173e2a-tigera-ca-bundle\") pod \"calico-kube-controllers-54c44569bd-mj58w\" (UID: \"bdb5173f-1c88-48a1-904a-4de1b9173e2a\") " 
pod="calico-system/calico-kube-controllers-54c44569bd-mj58w" Sep 9 05:37:58.195695 kubelet[2731]: I0909 05:37:58.195379 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/e24812f7-a033-4586-8f78-dd439cb8f791-calico-apiserver-certs\") pod \"calico-apiserver-64659bc7c5-hdpgx\" (UID: \"e24812f7-a033-4586-8f78-dd439cb8f791\") " pod="calico-apiserver/calico-apiserver-64659bc7c5-hdpgx" Sep 9 05:37:58.195695 kubelet[2731]: I0909 05:37:58.195403 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f552aed4-9ee6-4d96-b587-80305d432ef7-config-volume\") pod \"coredns-7c65d6cfc9-tpcjd\" (UID: \"f552aed4-9ee6-4d96-b587-80305d432ef7\") " pod="kube-system/coredns-7c65d6cfc9-tpcjd" Sep 9 05:37:58.195695 kubelet[2731]: I0909 05:37:58.195488 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f006145d-362b-4e7a-b943-f0f12d014871-config-volume\") pod \"coredns-7c65d6cfc9-kzl7q\" (UID: \"f006145d-362b-4e7a-b943-f0f12d014871\") " pod="kube-system/coredns-7c65d6cfc9-kzl7q" Sep 9 05:37:58.195695 kubelet[2731]: I0909 05:37:58.195580 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b4ee7a58-3044-4fd9-b673-e51540fd4be9-goldmane-ca-bundle\") pod \"goldmane-7988f88666-45sx9\" (UID: \"b4ee7a58-3044-4fd9-b673-e51540fd4be9\") " pod="calico-system/goldmane-7988f88666-45sx9" Sep 9 05:37:58.196255 kubelet[2731]: I0909 05:37:58.195888 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4ee7a58-3044-4fd9-b673-e51540fd4be9-config\") pod \"goldmane-7988f88666-45sx9\" (UID: \"b4ee7a58-3044-4fd9-b673-e51540fd4be9\") " pod="calico-system/goldmane-7988f88666-45sx9" Sep 9 05:37:58.196255 kubelet[2731]: I0909 05:37:58.195919 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/b4ee7a58-3044-4fd9-b673-e51540fd4be9-goldmane-key-pair\") pod \"goldmane-7988f88666-45sx9\" (UID: \"b4ee7a58-3044-4fd9-b673-e51540fd4be9\") " pod="calico-system/goldmane-7988f88666-45sx9" Sep 9 05:37:58.196255 kubelet[2731]: I0909 05:37:58.195940 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce61c8c6-8da2-4f73-a908-519f05cb6a63-whisker-ca-bundle\") pod \"whisker-599845b544-mffqz\" (UID: \"ce61c8c6-8da2-4f73-a908-519f05cb6a63\") " pod="calico-system/whisker-599845b544-mffqz" Sep 9 05:37:58.196255 kubelet[2731]: I0909 05:37:58.195961 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjnvx\" (UniqueName: \"kubernetes.io/projected/f006145d-362b-4e7a-b943-f0f12d014871-kube-api-access-hjnvx\") pod \"coredns-7c65d6cfc9-kzl7q\" (UID: \"f006145d-362b-4e7a-b943-f0f12d014871\") " pod="kube-system/coredns-7c65d6cfc9-kzl7q" Sep 9 05:37:58.196255 kubelet[2731]: I0909 05:37:58.195976 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: 
\"kubernetes.io/secret/ce61c8c6-8da2-4f73-a908-519f05cb6a63-whisker-backend-key-pair\") pod \"whisker-599845b544-mffqz\" (UID: \"ce61c8c6-8da2-4f73-a908-519f05cb6a63\") " pod="calico-system/whisker-599845b544-mffqz" Sep 9 05:37:58.367552 containerd[1587]: time="2025-09-09T05:37:58.367440479Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-kzl7q,Uid:f006145d-362b-4e7a-b943-f0f12d014871,Namespace:kube-system,Attempt:0,}" Sep 9 05:37:58.371266 containerd[1587]: time="2025-09-09T05:37:58.371216963Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-45sx9,Uid:b4ee7a58-3044-4fd9-b673-e51540fd4be9,Namespace:calico-system,Attempt:0,}" Sep 9 05:37:58.375876 containerd[1587]: time="2025-09-09T05:37:58.375830324Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-tpcjd,Uid:f552aed4-9ee6-4d96-b587-80305d432ef7,Namespace:kube-system,Attempt:0,}" Sep 9 05:37:58.380269 containerd[1587]: time="2025-09-09T05:37:58.380242535Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-54c44569bd-mj58w,Uid:bdb5173f-1c88-48a1-904a-4de1b9173e2a,Namespace:calico-system,Attempt:0,}" Sep 9 05:37:58.387045 containerd[1587]: time="2025-09-09T05:37:58.387011016Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-599845b544-mffqz,Uid:ce61c8c6-8da2-4f73-a908-519f05cb6a63,Namespace:calico-system,Attempt:0,}" Sep 9 05:37:58.391545 containerd[1587]: time="2025-09-09T05:37:58.391509900Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64659bc7c5-dbld7,Uid:d6d53f50-cd74-44ff-bef9-71572dfb8e98,Namespace:calico-apiserver,Attempt:0,}" Sep 9 05:37:58.396089 containerd[1587]: time="2025-09-09T05:37:58.396046647Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64659bc7c5-hdpgx,Uid:e24812f7-a033-4586-8f78-dd439cb8f791,Namespace:calico-apiserver,Attempt:0,}" Sep 9 05:37:58.546207 containerd[1587]: time="2025-09-09T05:37:58.546089296Z" level=error msg="Failed to destroy network for sandbox \"c88c096a1435fd4eeb575fe67426a9e8e7b59b1311b24267f0ef5fe5cae77c3d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:37:58.568431 containerd[1587]: time="2025-09-09T05:37:58.568338408Z" level=error msg="Failed to destroy network for sandbox \"689b42af5f309b28c667af16e3dcec50d7db7fac42d78691943dcf7a3ed66d8c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:37:58.571739 containerd[1587]: time="2025-09-09T05:37:58.571693368Z" level=error msg="Failed to destroy network for sandbox \"eb77bc6afc78237a3ce10fd666f050a1f908a0ce9440fb9420236be35908c2d1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:37:58.575552 containerd[1587]: time="2025-09-09T05:37:58.575515990Z" level=error msg="Failed to destroy network for sandbox \"069b6be44bd75b16011efb5fef7f66adc8cda635848997c48d28b62a0af397ac\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:37:58.577911 containerd[1587]: 
time="2025-09-09T05:37:58.577870174Z" level=error msg="Failed to destroy network for sandbox \"cdade03fc54ac26bc10cd2c4aa82b52f13e8da81221bacc76cd0f1b0f7cfab55\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:37:58.578191 containerd[1587]: time="2025-09-09T05:37:58.578039283Z" level=error msg="Failed to destroy network for sandbox \"6acdb0e085a5805e45232fbc5f0ce41c5cf62aa0d589eae59c98624b9d6d9a92\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:37:58.579790 containerd[1587]: time="2025-09-09T05:37:58.579752861Z" level=error msg="Failed to destroy network for sandbox \"bafcc8c92115da635f75241effbcad491ba9943939b822af687f5517db871c2b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:37:58.658043 containerd[1587]: time="2025-09-09T05:37:58.657969074Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-54c44569bd-mj58w,Uid:bdb5173f-1c88-48a1-904a-4de1b9173e2a,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c88c096a1435fd4eeb575fe67426a9e8e7b59b1311b24267f0ef5fe5cae77c3d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:37:58.658892 kubelet[2731]: E0909 05:37:58.658790 2731 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c88c096a1435fd4eeb575fe67426a9e8e7b59b1311b24267f0ef5fe5cae77c3d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:37:58.659018 kubelet[2731]: E0909 05:37:58.658944 2731 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c88c096a1435fd4eeb575fe67426a9e8e7b59b1311b24267f0ef5fe5cae77c3d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-54c44569bd-mj58w" Sep 9 05:37:58.659018 kubelet[2731]: E0909 05:37:58.659010 2731 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c88c096a1435fd4eeb575fe67426a9e8e7b59b1311b24267f0ef5fe5cae77c3d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-54c44569bd-mj58w" Sep 9 05:37:58.659095 kubelet[2731]: E0909 05:37:58.659063 2731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-54c44569bd-mj58w_calico-system(bdb5173f-1c88-48a1-904a-4de1b9173e2a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-kube-controllers-54c44569bd-mj58w_calico-system(bdb5173f-1c88-48a1-904a-4de1b9173e2a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c88c096a1435fd4eeb575fe67426a9e8e7b59b1311b24267f0ef5fe5cae77c3d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-54c44569bd-mj58w" podUID="bdb5173f-1c88-48a1-904a-4de1b9173e2a" Sep 9 05:37:58.727426 containerd[1587]: time="2025-09-09T05:37:58.727383436Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-tpcjd,Uid:f552aed4-9ee6-4d96-b587-80305d432ef7,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"689b42af5f309b28c667af16e3dcec50d7db7fac42d78691943dcf7a3ed66d8c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:37:58.727603 kubelet[2731]: E0909 05:37:58.727567 2731 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"689b42af5f309b28c667af16e3dcec50d7db7fac42d78691943dcf7a3ed66d8c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:37:58.727656 kubelet[2731]: E0909 05:37:58.727645 2731 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"689b42af5f309b28c667af16e3dcec50d7db7fac42d78691943dcf7a3ed66d8c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-tpcjd" Sep 9 05:37:58.727691 kubelet[2731]: E0909 05:37:58.727663 2731 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"689b42af5f309b28c667af16e3dcec50d7db7fac42d78691943dcf7a3ed66d8c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-tpcjd" Sep 9 05:37:58.727730 kubelet[2731]: E0909 05:37:58.727698 2731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-tpcjd_kube-system(f552aed4-9ee6-4d96-b587-80305d432ef7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-tpcjd_kube-system(f552aed4-9ee6-4d96-b587-80305d432ef7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"689b42af5f309b28c667af16e3dcec50d7db7fac42d78691943dcf7a3ed66d8c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-tpcjd" podUID="f552aed4-9ee6-4d96-b587-80305d432ef7" Sep 9 05:37:58.728604 containerd[1587]: time="2025-09-09T05:37:58.728536187Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-64659bc7c5-dbld7,Uid:d6d53f50-cd74-44ff-bef9-71572dfb8e98,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"eb77bc6afc78237a3ce10fd666f050a1f908a0ce9440fb9420236be35908c2d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:37:58.728794 kubelet[2731]: E0909 05:37:58.728760 2731 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"eb77bc6afc78237a3ce10fd666f050a1f908a0ce9440fb9420236be35908c2d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:37:58.728794 kubelet[2731]: E0909 05:37:58.728790 2731 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"eb77bc6afc78237a3ce10fd666f050a1f908a0ce9440fb9420236be35908c2d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64659bc7c5-dbld7" Sep 9 05:37:58.728794 kubelet[2731]: E0909 05:37:58.728805 2731 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"eb77bc6afc78237a3ce10fd666f050a1f908a0ce9440fb9420236be35908c2d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64659bc7c5-dbld7" Sep 9 05:37:58.728976 kubelet[2731]: E0909 05:37:58.728834 2731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-64659bc7c5-dbld7_calico-apiserver(d6d53f50-cd74-44ff-bef9-71572dfb8e98)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-64659bc7c5-dbld7_calico-apiserver(d6d53f50-cd74-44ff-bef9-71572dfb8e98)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"eb77bc6afc78237a3ce10fd666f050a1f908a0ce9440fb9420236be35908c2d1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-64659bc7c5-dbld7" podUID="d6d53f50-cd74-44ff-bef9-71572dfb8e98" Sep 9 05:37:58.729911 containerd[1587]: time="2025-09-09T05:37:58.729801972Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-45sx9,Uid:b4ee7a58-3044-4fd9-b673-e51540fd4be9,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"069b6be44bd75b16011efb5fef7f66adc8cda635848997c48d28b62a0af397ac\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:37:58.730114 kubelet[2731]: E0909 05:37:58.730067 2731 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"069b6be44bd75b16011efb5fef7f66adc8cda635848997c48d28b62a0af397ac\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:37:58.730268 kubelet[2731]: E0909 05:37:58.730124 2731 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"069b6be44bd75b16011efb5fef7f66adc8cda635848997c48d28b62a0af397ac\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-45sx9" Sep 9 05:37:58.730268 kubelet[2731]: E0909 05:37:58.730145 2731 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"069b6be44bd75b16011efb5fef7f66adc8cda635848997c48d28b62a0af397ac\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-45sx9" Sep 9 05:37:58.730268 kubelet[2731]: E0909 05:37:58.730198 2731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7988f88666-45sx9_calico-system(b4ee7a58-3044-4fd9-b673-e51540fd4be9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7988f88666-45sx9_calico-system(b4ee7a58-3044-4fd9-b673-e51540fd4be9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"069b6be44bd75b16011efb5fef7f66adc8cda635848997c48d28b62a0af397ac\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7988f88666-45sx9" podUID="b4ee7a58-3044-4fd9-b673-e51540fd4be9" Sep 9 05:37:58.730855 containerd[1587]: time="2025-09-09T05:37:58.730781938Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-599845b544-mffqz,Uid:ce61c8c6-8da2-4f73-a908-519f05cb6a63,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"cdade03fc54ac26bc10cd2c4aa82b52f13e8da81221bacc76cd0f1b0f7cfab55\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:37:58.730979 kubelet[2731]: E0909 05:37:58.730937 2731 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cdade03fc54ac26bc10cd2c4aa82b52f13e8da81221bacc76cd0f1b0f7cfab55\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:37:58.731050 kubelet[2731]: E0909 05:37:58.731029 2731 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cdade03fc54ac26bc10cd2c4aa82b52f13e8da81221bacc76cd0f1b0f7cfab55\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-599845b544-mffqz" Sep 9 
05:37:58.731078 kubelet[2731]: E0909 05:37:58.731050 2731 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cdade03fc54ac26bc10cd2c4aa82b52f13e8da81221bacc76cd0f1b0f7cfab55\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-599845b544-mffqz" Sep 9 05:37:58.731116 kubelet[2731]: E0909 05:37:58.731094 2731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-599845b544-mffqz_calico-system(ce61c8c6-8da2-4f73-a908-519f05cb6a63)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-599845b544-mffqz_calico-system(ce61c8c6-8da2-4f73-a908-519f05cb6a63)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cdade03fc54ac26bc10cd2c4aa82b52f13e8da81221bacc76cd0f1b0f7cfab55\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-599845b544-mffqz" podUID="ce61c8c6-8da2-4f73-a908-519f05cb6a63" Sep 9 05:37:58.732069 containerd[1587]: time="2025-09-09T05:37:58.732026903Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-kzl7q,Uid:f006145d-362b-4e7a-b943-f0f12d014871,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6acdb0e085a5805e45232fbc5f0ce41c5cf62aa0d589eae59c98624b9d6d9a92\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:37:58.732193 kubelet[2731]: E0909 05:37:58.732166 2731 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6acdb0e085a5805e45232fbc5f0ce41c5cf62aa0d589eae59c98624b9d6d9a92\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:37:58.732239 kubelet[2731]: E0909 05:37:58.732192 2731 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6acdb0e085a5805e45232fbc5f0ce41c5cf62aa0d589eae59c98624b9d6d9a92\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-kzl7q" Sep 9 05:37:58.732239 kubelet[2731]: E0909 05:37:58.732207 2731 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6acdb0e085a5805e45232fbc5f0ce41c5cf62aa0d589eae59c98624b9d6d9a92\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-kzl7q" Sep 9 05:37:58.732286 kubelet[2731]: E0909 05:37:58.732239 2731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-kzl7q_kube-system(f006145d-362b-4e7a-b943-f0f12d014871)\" with CreatePodSandboxError: 
\"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-kzl7q_kube-system(f006145d-362b-4e7a-b943-f0f12d014871)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6acdb0e085a5805e45232fbc5f0ce41c5cf62aa0d589eae59c98624b9d6d9a92\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-kzl7q" podUID="f006145d-362b-4e7a-b943-f0f12d014871" Sep 9 05:37:58.733399 containerd[1587]: time="2025-09-09T05:37:58.733363842Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64659bc7c5-hdpgx,Uid:e24812f7-a033-4586-8f78-dd439cb8f791,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"bafcc8c92115da635f75241effbcad491ba9943939b822af687f5517db871c2b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:37:58.733508 kubelet[2731]: E0909 05:37:58.733480 2731 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bafcc8c92115da635f75241effbcad491ba9943939b822af687f5517db871c2b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:37:58.733539 kubelet[2731]: E0909 05:37:58.733506 2731 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bafcc8c92115da635f75241effbcad491ba9943939b822af687f5517db871c2b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64659bc7c5-hdpgx" Sep 9 05:37:58.733539 kubelet[2731]: E0909 05:37:58.733520 2731 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bafcc8c92115da635f75241effbcad491ba9943939b822af687f5517db871c2b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64659bc7c5-hdpgx" Sep 9 05:37:58.733619 kubelet[2731]: E0909 05:37:58.733551 2731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-64659bc7c5-hdpgx_calico-apiserver(e24812f7-a033-4586-8f78-dd439cb8f791)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-64659bc7c5-hdpgx_calico-apiserver(e24812f7-a033-4586-8f78-dd439cb8f791)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bafcc8c92115da635f75241effbcad491ba9943939b822af687f5517db871c2b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-64659bc7c5-hdpgx" podUID="e24812f7-a033-4586-8f78-dd439cb8f791" Sep 9 05:37:59.122872 systemd[1]: Created slice kubepods-besteffort-pod382d750e_f8f8_4cd7_87b5_dfa1289af050.slice - libcontainer container 
kubepods-besteffort-pod382d750e_f8f8_4cd7_87b5_dfa1289af050.slice. Sep 9 05:37:59.125816 containerd[1587]: time="2025-09-09T05:37:59.125780494Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fr4sd,Uid:382d750e-f8f8-4cd7-87b5-dfa1289af050,Namespace:calico-system,Attempt:0,}" Sep 9 05:37:59.175390 containerd[1587]: time="2025-09-09T05:37:59.175333945Z" level=error msg="Failed to destroy network for sandbox \"a3c0238feb716643e79c9bf2dc5fdc8c6d9ee5c77d73d2cd3c6b8da2c220b5f4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:37:59.176719 containerd[1587]: time="2025-09-09T05:37:59.176686933Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fr4sd,Uid:382d750e-f8f8-4cd7-87b5-dfa1289af050,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a3c0238feb716643e79c9bf2dc5fdc8c6d9ee5c77d73d2cd3c6b8da2c220b5f4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:37:59.177070 kubelet[2731]: E0909 05:37:59.177008 2731 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a3c0238feb716643e79c9bf2dc5fdc8c6d9ee5c77d73d2cd3c6b8da2c220b5f4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:37:59.177070 kubelet[2731]: E0909 05:37:59.177083 2731 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a3c0238feb716643e79c9bf2dc5fdc8c6d9ee5c77d73d2cd3c6b8da2c220b5f4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fr4sd" Sep 9 05:37:59.177070 kubelet[2731]: E0909 05:37:59.177102 2731 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a3c0238feb716643e79c9bf2dc5fdc8c6d9ee5c77d73d2cd3c6b8da2c220b5f4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fr4sd" Sep 9 05:37:59.177301 kubelet[2731]: E0909 05:37:59.177139 2731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-fr4sd_calico-system(382d750e-f8f8-4cd7-87b5-dfa1289af050)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-fr4sd_calico-system(382d750e-f8f8-4cd7-87b5-dfa1289af050)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a3c0238feb716643e79c9bf2dc5fdc8c6d9ee5c77d73d2cd3c6b8da2c220b5f4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-fr4sd" podUID="382d750e-f8f8-4cd7-87b5-dfa1289af050" Sep 9 05:37:59.177562 systemd[1]: 
run-netns-cni\x2d5129dc04\x2de07a\x2dfc9f\x2da121\x2d1f5077e88d1b.mount: Deactivated successfully. Sep 9 05:38:05.212732 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount154842632.mount: Deactivated successfully. Sep 9 05:38:05.619626 containerd[1587]: time="2025-09-09T05:38:05.619484888Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:38:05.620435 containerd[1587]: time="2025-09-09T05:38:05.620376905Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 9 05:38:05.621728 containerd[1587]: time="2025-09-09T05:38:05.621694022Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:38:05.623612 containerd[1587]: time="2025-09-09T05:38:05.623558258Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:38:05.624045 containerd[1587]: time="2025-09-09T05:38:05.624001421Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 7.441184194s" Sep 9 05:38:05.624045 containerd[1587]: time="2025-09-09T05:38:05.624040745Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 9 05:38:05.631543 containerd[1587]: time="2025-09-09T05:38:05.631515382Z" level=info msg="CreateContainer within sandbox \"2b0334088b601c21af12356ccd934f3e81960dce6be8e097589fd65bf63ce5b6\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 9 05:38:05.665140 containerd[1587]: time="2025-09-09T05:38:05.665087747Z" level=info msg="Container bb67576a83ed65457588b4c66281ea2b726ce381a59c3ccbff5d88e932e2ad1e: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:38:05.676577 containerd[1587]: time="2025-09-09T05:38:05.676542018Z" level=info msg="CreateContainer within sandbox \"2b0334088b601c21af12356ccd934f3e81960dce6be8e097589fd65bf63ce5b6\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"bb67576a83ed65457588b4c66281ea2b726ce381a59c3ccbff5d88e932e2ad1e\"" Sep 9 05:38:05.677106 containerd[1587]: time="2025-09-09T05:38:05.677032120Z" level=info msg="StartContainer for \"bb67576a83ed65457588b4c66281ea2b726ce381a59c3ccbff5d88e932e2ad1e\"" Sep 9 05:38:05.678640 containerd[1587]: time="2025-09-09T05:38:05.678606761Z" level=info msg="connecting to shim bb67576a83ed65457588b4c66281ea2b726ce381a59c3ccbff5d88e932e2ad1e" address="unix:///run/containerd/s/63605d651c19977c7ac54ca97409e0f6dbe82cd4f1195a870989287585d10985" protocol=ttrpc version=3 Sep 9 05:38:05.701716 systemd[1]: Started cri-containerd-bb67576a83ed65457588b4c66281ea2b726ce381a59c3ccbff5d88e932e2ad1e.scope - libcontainer container bb67576a83ed65457588b4c66281ea2b726ce381a59c3ccbff5d88e932e2ad1e. 
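Every failed sandbox add/delete above shares one root cause: the Calico CNI plugin stats /var/lib/calico/nodename, a file the calico/node container writes only once it is running with /var/lib/calico/ mounted from the host. Until the calico/node image finishes pulling and the container starts (05:38:05), every pod that needs a network keeps failing with CreatePodSandboxError and is retried by the kubelet. A minimal sketch of that readiness condition, assuming only the file path quoted in the error (the function name nodenameReady is mine; the real plugin does much more than this check):

package main

import (
	"errors"
	"fmt"
	"os"
)

// nodenameReady reports whether calico/node has written its nodename file,
// the same condition the CNI plugin keeps failing on in the log above.
func nodenameReady(path string) (bool, error) {
	_, err := os.Stat(path)
	if errors.Is(err, os.ErrNotExist) {
		return false, nil // calico/node not started yet, or /var/lib/calico/ not mounted
	}
	if err != nil {
		return false, err
	}
	return true, nil
}

func main() {
	ok, err := nodenameReady("/var/lib/calico/nodename")
	if err != nil {
		fmt.Println("stat error:", err)
		return
	}
	fmt.Println("calico/node ready:", ok)
}

Consistent with that reading, once calico-node (bb67576a83ed…) starts, the retried sandboxes at 05:38:06 and 05:38:11 get their networks set up.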
Sep 9 05:38:05.755475 containerd[1587]: time="2025-09-09T05:38:05.755435274Z" level=info msg="StartContainer for \"bb67576a83ed65457588b4c66281ea2b726ce381a59c3ccbff5d88e932e2ad1e\" returns successfully" Sep 9 05:38:05.827751 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 9 05:38:05.828330 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Sep 9 05:38:06.040853 kubelet[2731]: I0909 05:38:06.040807 2731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8c5pb\" (UniqueName: \"kubernetes.io/projected/ce61c8c6-8da2-4f73-a908-519f05cb6a63-kube-api-access-8c5pb\") pod \"ce61c8c6-8da2-4f73-a908-519f05cb6a63\" (UID: \"ce61c8c6-8da2-4f73-a908-519f05cb6a63\") " Sep 9 05:38:06.040853 kubelet[2731]: I0909 05:38:06.040858 2731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce61c8c6-8da2-4f73-a908-519f05cb6a63-whisker-ca-bundle\") pod \"ce61c8c6-8da2-4f73-a908-519f05cb6a63\" (UID: \"ce61c8c6-8da2-4f73-a908-519f05cb6a63\") " Sep 9 05:38:06.040853 kubelet[2731]: I0909 05:38:06.040880 2731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ce61c8c6-8da2-4f73-a908-519f05cb6a63-whisker-backend-key-pair\") pod \"ce61c8c6-8da2-4f73-a908-519f05cb6a63\" (UID: \"ce61c8c6-8da2-4f73-a908-519f05cb6a63\") " Sep 9 05:38:06.041783 kubelet[2731]: I0909 05:38:06.041315 2731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce61c8c6-8da2-4f73-a908-519f05cb6a63-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "ce61c8c6-8da2-4f73-a908-519f05cb6a63" (UID: "ce61c8c6-8da2-4f73-a908-519f05cb6a63"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 9 05:38:06.044308 kubelet[2731]: I0909 05:38:06.044262 2731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce61c8c6-8da2-4f73-a908-519f05cb6a63-kube-api-access-8c5pb" (OuterVolumeSpecName: "kube-api-access-8c5pb") pod "ce61c8c6-8da2-4f73-a908-519f05cb6a63" (UID: "ce61c8c6-8da2-4f73-a908-519f05cb6a63"). InnerVolumeSpecName "kube-api-access-8c5pb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 9 05:38:06.044406 kubelet[2731]: I0909 05:38:06.044364 2731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce61c8c6-8da2-4f73-a908-519f05cb6a63-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "ce61c8c6-8da2-4f73-a908-519f05cb6a63" (UID: "ce61c8c6-8da2-4f73-a908-519f05cb6a63"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 9 05:38:06.132303 systemd[1]: Removed slice kubepods-besteffort-podce61c8c6_8da2_4f73_a908_519f05cb6a63.slice - libcontainer container kubepods-besteffort-podce61c8c6_8da2_4f73_a908_519f05cb6a63.slice. 
Sep 9 05:38:06.141412 kubelet[2731]: I0909 05:38:06.141375 2731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8c5pb\" (UniqueName: \"kubernetes.io/projected/ce61c8c6-8da2-4f73-a908-519f05cb6a63-kube-api-access-8c5pb\") on node \"localhost\" DevicePath \"\"" Sep 9 05:38:06.141412 kubelet[2731]: I0909 05:38:06.141402 2731 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce61c8c6-8da2-4f73-a908-519f05cb6a63-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Sep 9 05:38:06.141412 kubelet[2731]: I0909 05:38:06.141411 2731 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ce61c8c6-8da2-4f73-a908-519f05cb6a63-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Sep 9 05:38:06.213741 systemd[1]: var-lib-kubelet-pods-ce61c8c6\x2d8da2\x2d4f73\x2da908\x2d519f05cb6a63-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d8c5pb.mount: Deactivated successfully. Sep 9 05:38:06.214219 systemd[1]: var-lib-kubelet-pods-ce61c8c6\x2d8da2\x2d4f73\x2da908\x2d519f05cb6a63-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 9 05:38:06.214847 kubelet[2731]: I0909 05:38:06.214790 2731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-trsmw" podStartSLOduration=1.193808143 podStartE2EDuration="17.214768349s" podCreationTimestamp="2025-09-09 05:37:49 +0000 UTC" firstStartedPulling="2025-09-09 05:37:49.603694795 +0000 UTC m=+17.561083665" lastFinishedPulling="2025-09-09 05:38:05.62465501 +0000 UTC m=+33.582043871" observedRunningTime="2025-09-09 05:38:06.213830395 +0000 UTC m=+34.171219265" watchObservedRunningTime="2025-09-09 05:38:06.214768349 +0000 UTC m=+34.172157219" Sep 9 05:38:06.267138 systemd[1]: Created slice kubepods-besteffort-podafdacd8f_b283_4c72_93ec_d3d547b4b0ad.slice - libcontainer container kubepods-besteffort-podafdacd8f_b283_4c72_93ec_d3d547b4b0ad.slice. 
Sep 9 05:38:06.444055 kubelet[2731]: I0909 05:38:06.443979 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpx82\" (UniqueName: \"kubernetes.io/projected/afdacd8f-b283-4c72-93ec-d3d547b4b0ad-kube-api-access-zpx82\") pod \"whisker-7c4bd5fd7-j7zbb\" (UID: \"afdacd8f-b283-4c72-93ec-d3d547b4b0ad\") " pod="calico-system/whisker-7c4bd5fd7-j7zbb" Sep 9 05:38:06.444055 kubelet[2731]: I0909 05:38:06.444035 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/afdacd8f-b283-4c72-93ec-d3d547b4b0ad-whisker-ca-bundle\") pod \"whisker-7c4bd5fd7-j7zbb\" (UID: \"afdacd8f-b283-4c72-93ec-d3d547b4b0ad\") " pod="calico-system/whisker-7c4bd5fd7-j7zbb" Sep 9 05:38:06.444055 kubelet[2731]: I0909 05:38:06.444052 2731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/afdacd8f-b283-4c72-93ec-d3d547b4b0ad-whisker-backend-key-pair\") pod \"whisker-7c4bd5fd7-j7zbb\" (UID: \"afdacd8f-b283-4c72-93ec-d3d547b4b0ad\") " pod="calico-system/whisker-7c4bd5fd7-j7zbb" Sep 9 05:38:06.572098 containerd[1587]: time="2025-09-09T05:38:06.572050237Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7c4bd5fd7-j7zbb,Uid:afdacd8f-b283-4c72-93ec-d3d547b4b0ad,Namespace:calico-system,Attempt:0,}" Sep 9 05:38:06.710696 systemd-networkd[1483]: cali590b0112233: Link UP Sep 9 05:38:06.710913 systemd-networkd[1483]: cali590b0112233: Gained carrier Sep 9 05:38:06.724061 containerd[1587]: 2025-09-09 05:38:06.596 [INFO][3880] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 05:38:06.724061 containerd[1587]: 2025-09-09 05:38:06.612 [INFO][3880] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--7c4bd5fd7--j7zbb-eth0 whisker-7c4bd5fd7- calico-system afdacd8f-b283-4c72-93ec-d3d547b4b0ad 884 0 2025-09-09 05:38:06 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:7c4bd5fd7 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-7c4bd5fd7-j7zbb eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali590b0112233 [] [] }} ContainerID="17e26361d3f7c3e9eaad2af4865cb9245e0c0c84966fc0f34fc653d5acf2de3a" Namespace="calico-system" Pod="whisker-7c4bd5fd7-j7zbb" WorkloadEndpoint="localhost-k8s-whisker--7c4bd5fd7--j7zbb-" Sep 9 05:38:06.724061 containerd[1587]: 2025-09-09 05:38:06.612 [INFO][3880] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="17e26361d3f7c3e9eaad2af4865cb9245e0c0c84966fc0f34fc653d5acf2de3a" Namespace="calico-system" Pod="whisker-7c4bd5fd7-j7zbb" WorkloadEndpoint="localhost-k8s-whisker--7c4bd5fd7--j7zbb-eth0" Sep 9 05:38:06.724061 containerd[1587]: 2025-09-09 05:38:06.669 [INFO][3894] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="17e26361d3f7c3e9eaad2af4865cb9245e0c0c84966fc0f34fc653d5acf2de3a" HandleID="k8s-pod-network.17e26361d3f7c3e9eaad2af4865cb9245e0c0c84966fc0f34fc653d5acf2de3a" Workload="localhost-k8s-whisker--7c4bd5fd7--j7zbb-eth0" Sep 9 05:38:06.724547 containerd[1587]: 2025-09-09 05:38:06.669 [INFO][3894] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="17e26361d3f7c3e9eaad2af4865cb9245e0c0c84966fc0f34fc653d5acf2de3a" 
HandleID="k8s-pod-network.17e26361d3f7c3e9eaad2af4865cb9245e0c0c84966fc0f34fc653d5acf2de3a" Workload="localhost-k8s-whisker--7c4bd5fd7--j7zbb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0000db420), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-7c4bd5fd7-j7zbb", "timestamp":"2025-09-09 05:38:06.669378307 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:38:06.724547 containerd[1587]: 2025-09-09 05:38:06.669 [INFO][3894] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:38:06.724547 containerd[1587]: 2025-09-09 05:38:06.670 [INFO][3894] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 05:38:06.724547 containerd[1587]: 2025-09-09 05:38:06.670 [INFO][3894] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 05:38:06.724547 containerd[1587]: 2025-09-09 05:38:06.677 [INFO][3894] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.17e26361d3f7c3e9eaad2af4865cb9245e0c0c84966fc0f34fc653d5acf2de3a" host="localhost" Sep 9 05:38:06.724547 containerd[1587]: 2025-09-09 05:38:06.683 [INFO][3894] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 05:38:06.724547 containerd[1587]: 2025-09-09 05:38:06.687 [INFO][3894] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 05:38:06.724547 containerd[1587]: 2025-09-09 05:38:06.689 [INFO][3894] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 05:38:06.724547 containerd[1587]: 2025-09-09 05:38:06.691 [INFO][3894] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 05:38:06.724547 containerd[1587]: 2025-09-09 05:38:06.691 [INFO][3894] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.17e26361d3f7c3e9eaad2af4865cb9245e0c0c84966fc0f34fc653d5acf2de3a" host="localhost" Sep 9 05:38:06.724812 containerd[1587]: 2025-09-09 05:38:06.692 [INFO][3894] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.17e26361d3f7c3e9eaad2af4865cb9245e0c0c84966fc0f34fc653d5acf2de3a Sep 9 05:38:06.724812 containerd[1587]: 2025-09-09 05:38:06.696 [INFO][3894] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.17e26361d3f7c3e9eaad2af4865cb9245e0c0c84966fc0f34fc653d5acf2de3a" host="localhost" Sep 9 05:38:06.724812 containerd[1587]: 2025-09-09 05:38:06.700 [INFO][3894] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.17e26361d3f7c3e9eaad2af4865cb9245e0c0c84966fc0f34fc653d5acf2de3a" host="localhost" Sep 9 05:38:06.724812 containerd[1587]: 2025-09-09 05:38:06.700 [INFO][3894] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.17e26361d3f7c3e9eaad2af4865cb9245e0c0c84966fc0f34fc653d5acf2de3a" host="localhost" Sep 9 05:38:06.724812 containerd[1587]: 2025-09-09 05:38:06.700 [INFO][3894] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 05:38:06.724812 containerd[1587]: 2025-09-09 05:38:06.700 [INFO][3894] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="17e26361d3f7c3e9eaad2af4865cb9245e0c0c84966fc0f34fc653d5acf2de3a" HandleID="k8s-pod-network.17e26361d3f7c3e9eaad2af4865cb9245e0c0c84966fc0f34fc653d5acf2de3a" Workload="localhost-k8s-whisker--7c4bd5fd7--j7zbb-eth0" Sep 9 05:38:06.724936 containerd[1587]: 2025-09-09 05:38:06.703 [INFO][3880] cni-plugin/k8s.go 418: Populated endpoint ContainerID="17e26361d3f7c3e9eaad2af4865cb9245e0c0c84966fc0f34fc653d5acf2de3a" Namespace="calico-system" Pod="whisker-7c4bd5fd7-j7zbb" WorkloadEndpoint="localhost-k8s-whisker--7c4bd5fd7--j7zbb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--7c4bd5fd7--j7zbb-eth0", GenerateName:"whisker-7c4bd5fd7-", Namespace:"calico-system", SelfLink:"", UID:"afdacd8f-b283-4c72-93ec-d3d547b4b0ad", ResourceVersion:"884", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 38, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7c4bd5fd7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-7c4bd5fd7-j7zbb", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali590b0112233", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:38:06.724936 containerd[1587]: 2025-09-09 05:38:06.703 [INFO][3880] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="17e26361d3f7c3e9eaad2af4865cb9245e0c0c84966fc0f34fc653d5acf2de3a" Namespace="calico-system" Pod="whisker-7c4bd5fd7-j7zbb" WorkloadEndpoint="localhost-k8s-whisker--7c4bd5fd7--j7zbb-eth0" Sep 9 05:38:06.725113 containerd[1587]: 2025-09-09 05:38:06.703 [INFO][3880] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali590b0112233 ContainerID="17e26361d3f7c3e9eaad2af4865cb9245e0c0c84966fc0f34fc653d5acf2de3a" Namespace="calico-system" Pod="whisker-7c4bd5fd7-j7zbb" WorkloadEndpoint="localhost-k8s-whisker--7c4bd5fd7--j7zbb-eth0" Sep 9 05:38:06.725113 containerd[1587]: 2025-09-09 05:38:06.710 [INFO][3880] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="17e26361d3f7c3e9eaad2af4865cb9245e0c0c84966fc0f34fc653d5acf2de3a" Namespace="calico-system" Pod="whisker-7c4bd5fd7-j7zbb" WorkloadEndpoint="localhost-k8s-whisker--7c4bd5fd7--j7zbb-eth0" Sep 9 05:38:06.725156 containerd[1587]: 2025-09-09 05:38:06.712 [INFO][3880] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="17e26361d3f7c3e9eaad2af4865cb9245e0c0c84966fc0f34fc653d5acf2de3a" Namespace="calico-system" Pod="whisker-7c4bd5fd7-j7zbb" WorkloadEndpoint="localhost-k8s-whisker--7c4bd5fd7--j7zbb-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--7c4bd5fd7--j7zbb-eth0", GenerateName:"whisker-7c4bd5fd7-", Namespace:"calico-system", SelfLink:"", UID:"afdacd8f-b283-4c72-93ec-d3d547b4b0ad", ResourceVersion:"884", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 38, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7c4bd5fd7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"17e26361d3f7c3e9eaad2af4865cb9245e0c0c84966fc0f34fc653d5acf2de3a", Pod:"whisker-7c4bd5fd7-j7zbb", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali590b0112233", MAC:"7a:cb:b3:7d:69:dc", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:38:06.725267 containerd[1587]: 2025-09-09 05:38:06.721 [INFO][3880] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="17e26361d3f7c3e9eaad2af4865cb9245e0c0c84966fc0f34fc653d5acf2de3a" Namespace="calico-system" Pod="whisker-7c4bd5fd7-j7zbb" WorkloadEndpoint="localhost-k8s-whisker--7c4bd5fd7--j7zbb-eth0" Sep 9 05:38:06.834030 containerd[1587]: time="2025-09-09T05:38:06.833962826Z" level=info msg="connecting to shim 17e26361d3f7c3e9eaad2af4865cb9245e0c0c84966fc0f34fc653d5acf2de3a" address="unix:///run/containerd/s/fbce43a0f36b58e35a1c5a1c28b30114a625acfe28a2d808f361dc842aa950f6" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:38:06.868722 systemd[1]: Started cri-containerd-17e26361d3f7c3e9eaad2af4865cb9245e0c0c84966fc0f34fc653d5acf2de3a.scope - libcontainer container 17e26361d3f7c3e9eaad2af4865cb9245e0c0c84966fc0f34fc653d5acf2de3a. 
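The ipam/ trace above is the allocation path that repeats for every pod in this log: take the host-wide IPAM lock, confirm this node's affinity to the block 192.168.88.128/26, claim the next free address (here 192.168.88.129 for the whisker pod; .130 and .131 follow at 05:38:11), write the block back, and release the lock. A minimal sketch of walking addresses inside that block with the standard library, assuming a plain in-memory set of used addresses instead of Calico's datastore-backed block (nextFree is my own name):

package main

import (
	"fmt"
	"net/netip"
)

// nextFree returns the first address in the block not marked as used,
// mimicking the "assign 1 address from block" step in the IPAM log.
func nextFree(block netip.Prefix, used map[netip.Addr]bool) (netip.Addr, bool) {
	for a := block.Addr(); block.Contains(a); a = a.Next() {
		if !used[a] {
			return a, true
		}
	}
	return netip.Addr{}, false
}

func main() {
	block := netip.MustParsePrefix("192.168.88.128/26")
	used := map[netip.Addr]bool{
		// Base address skipped in this sketch; the log's first assignment is also .129.
		netip.MustParseAddr("192.168.88.128"): true,
	}
	for i := 0; i < 3; i++ {
		a, ok := nextFree(block, used)
		if !ok {
			break
		}
		used[a] = true
		fmt.Println("claimed", a) // .129, .130, .131 — whisker, coredns, calico-kube-controllers
	}
}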
Sep 9 05:38:06.880795 systemd-resolved[1407]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 05:38:06.913241 containerd[1587]: time="2025-09-09T05:38:06.913188129Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7c4bd5fd7-j7zbb,Uid:afdacd8f-b283-4c72-93ec-d3d547b4b0ad,Namespace:calico-system,Attempt:0,} returns sandbox id \"17e26361d3f7c3e9eaad2af4865cb9245e0c0c84966fc0f34fc653d5acf2de3a\"" Sep 9 05:38:06.914605 containerd[1587]: time="2025-09-09T05:38:06.914557413Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 9 05:38:07.202265 kubelet[2731]: I0909 05:38:07.202209 2731 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 05:38:07.720799 systemd-networkd[1483]: cali590b0112233: Gained IPv6LL Sep 9 05:38:08.121219 kubelet[2731]: I0909 05:38:08.121114 2731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce61c8c6-8da2-4f73-a908-519f05cb6a63" path="/var/lib/kubelet/pods/ce61c8c6-8da2-4f73-a908-519f05cb6a63/volumes" Sep 9 05:38:08.494184 containerd[1587]: time="2025-09-09T05:38:08.494129248Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:38:08.495130 containerd[1587]: time="2025-09-09T05:38:08.495087559Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 9 05:38:08.496311 containerd[1587]: time="2025-09-09T05:38:08.496277596Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:38:08.498452 containerd[1587]: time="2025-09-09T05:38:08.498399574Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:38:08.499024 containerd[1587]: time="2025-09-09T05:38:08.498992419Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 1.583324467s" Sep 9 05:38:08.499024 containerd[1587]: time="2025-09-09T05:38:08.499020231Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 9 05:38:08.502751 containerd[1587]: time="2025-09-09T05:38:08.502602334Z" level=info msg="CreateContainer within sandbox \"17e26361d3f7c3e9eaad2af4865cb9245e0c0c84966fc0f34fc653d5acf2de3a\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 9 05:38:08.508968 containerd[1587]: time="2025-09-09T05:38:08.508929687Z" level=info msg="Container dd128b041cde7028aab2ea59c66cb6d7ead09619623ec9289f83eb3862dd1bbd: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:38:08.516801 containerd[1587]: time="2025-09-09T05:38:08.516755707Z" level=info msg="CreateContainer within sandbox \"17e26361d3f7c3e9eaad2af4865cb9245e0c0c84966fc0f34fc653d5acf2de3a\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"dd128b041cde7028aab2ea59c66cb6d7ead09619623ec9289f83eb3862dd1bbd\"" Sep 9 05:38:08.517246 
containerd[1587]: time="2025-09-09T05:38:08.517213067Z" level=info msg="StartContainer for \"dd128b041cde7028aab2ea59c66cb6d7ead09619623ec9289f83eb3862dd1bbd\"" Sep 9 05:38:08.518206 containerd[1587]: time="2025-09-09T05:38:08.518182578Z" level=info msg="connecting to shim dd128b041cde7028aab2ea59c66cb6d7ead09619623ec9289f83eb3862dd1bbd" address="unix:///run/containerd/s/fbce43a0f36b58e35a1c5a1c28b30114a625acfe28a2d808f361dc842aa950f6" protocol=ttrpc version=3 Sep 9 05:38:08.537713 systemd[1]: Started cri-containerd-dd128b041cde7028aab2ea59c66cb6d7ead09619623ec9289f83eb3862dd1bbd.scope - libcontainer container dd128b041cde7028aab2ea59c66cb6d7ead09619623ec9289f83eb3862dd1bbd. Sep 9 05:38:08.597561 containerd[1587]: time="2025-09-09T05:38:08.597512849Z" level=info msg="StartContainer for \"dd128b041cde7028aab2ea59c66cb6d7ead09619623ec9289f83eb3862dd1bbd\" returns successfully" Sep 9 05:38:08.599351 containerd[1587]: time="2025-09-09T05:38:08.599329383Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 9 05:38:10.750421 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1525750386.mount: Deactivated successfully. Sep 9 05:38:11.082828 containerd[1587]: time="2025-09-09T05:38:11.082692255Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:38:11.084613 containerd[1587]: time="2025-09-09T05:38:11.084550105Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Sep 9 05:38:11.085966 containerd[1587]: time="2025-09-09T05:38:11.085910291Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:38:11.118577 containerd[1587]: time="2025-09-09T05:38:11.118438495Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-54c44569bd-mj58w,Uid:bdb5173f-1c88-48a1-904a-4de1b9173e2a,Namespace:calico-system,Attempt:0,}" Sep 9 05:38:11.118577 containerd[1587]: time="2025-09-09T05:38:11.118559673Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-kzl7q,Uid:f006145d-362b-4e7a-b943-f0f12d014871,Namespace:kube-system,Attempt:0,}" Sep 9 05:38:11.118832 containerd[1587]: time="2025-09-09T05:38:11.118691911Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-45sx9,Uid:b4ee7a58-3044-4fd9-b673-e51540fd4be9,Namespace:calico-system,Attempt:0,}" Sep 9 05:38:11.126459 containerd[1587]: time="2025-09-09T05:38:11.126407968Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:38:11.127343 containerd[1587]: time="2025-09-09T05:38:11.127285567Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 2.527931327s" Sep 9 05:38:11.127343 containerd[1587]: time="2025-09-09T05:38:11.127344347Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference 
\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Sep 9 05:38:11.130320 containerd[1587]: time="2025-09-09T05:38:11.129568246Z" level=info msg="CreateContainer within sandbox \"17e26361d3f7c3e9eaad2af4865cb9245e0c0c84966fc0f34fc653d5acf2de3a\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 9 05:38:11.161097 containerd[1587]: time="2025-09-09T05:38:11.160765298Z" level=info msg="Container 5e9e39564df1ce031d16a1083bd2eddddcaaf8c90e48fe140ceec53a6a49147b: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:38:11.174564 containerd[1587]: time="2025-09-09T05:38:11.174517827Z" level=info msg="CreateContainer within sandbox \"17e26361d3f7c3e9eaad2af4865cb9245e0c0c84966fc0f34fc653d5acf2de3a\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"5e9e39564df1ce031d16a1083bd2eddddcaaf8c90e48fe140ceec53a6a49147b\"" Sep 9 05:38:11.176253 containerd[1587]: time="2025-09-09T05:38:11.176144523Z" level=info msg="StartContainer for \"5e9e39564df1ce031d16a1083bd2eddddcaaf8c90e48fe140ceec53a6a49147b\"" Sep 9 05:38:11.177628 containerd[1587]: time="2025-09-09T05:38:11.177581062Z" level=info msg="connecting to shim 5e9e39564df1ce031d16a1083bd2eddddcaaf8c90e48fe140ceec53a6a49147b" address="unix:///run/containerd/s/fbce43a0f36b58e35a1c5a1c28b30114a625acfe28a2d808f361dc842aa950f6" protocol=ttrpc version=3 Sep 9 05:38:11.259763 systemd[1]: Started cri-containerd-5e9e39564df1ce031d16a1083bd2eddddcaaf8c90e48fe140ceec53a6a49147b.scope - libcontainer container 5e9e39564df1ce031d16a1083bd2eddddcaaf8c90e48fe140ceec53a6a49147b. Sep 9 05:38:11.310324 systemd-networkd[1483]: cali745e6808854: Link UP Sep 9 05:38:11.310944 systemd-networkd[1483]: cali745e6808854: Gained carrier Sep 9 05:38:11.336734 systemd[1]: Started sshd@7-10.0.0.118:22-10.0.0.1:37712.service - OpenSSH per-connection server daemon (10.0.0.1:37712). 
Sep 9 05:38:11.338353 containerd[1587]: 2025-09-09 05:38:11.173 [INFO][4199] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 05:38:11.338353 containerd[1587]: 2025-09-09 05:38:11.187 [INFO][4199] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7c65d6cfc9--kzl7q-eth0 coredns-7c65d6cfc9- kube-system f006145d-362b-4e7a-b943-f0f12d014871 807 0 2025-09-09 05:37:39 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7c65d6cfc9-kzl7q eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali745e6808854 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="51b37661b7e1986cb12ec1204e6e375b0145a7be165b8574429cbbf14979e3ff" Namespace="kube-system" Pod="coredns-7c65d6cfc9-kzl7q" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--kzl7q-" Sep 9 05:38:11.338353 containerd[1587]: 2025-09-09 05:38:11.187 [INFO][4199] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="51b37661b7e1986cb12ec1204e6e375b0145a7be165b8574429cbbf14979e3ff" Namespace="kube-system" Pod="coredns-7c65d6cfc9-kzl7q" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--kzl7q-eth0" Sep 9 05:38:11.338353 containerd[1587]: 2025-09-09 05:38:11.220 [INFO][4237] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="51b37661b7e1986cb12ec1204e6e375b0145a7be165b8574429cbbf14979e3ff" HandleID="k8s-pod-network.51b37661b7e1986cb12ec1204e6e375b0145a7be165b8574429cbbf14979e3ff" Workload="localhost-k8s-coredns--7c65d6cfc9--kzl7q-eth0" Sep 9 05:38:11.338525 containerd[1587]: 2025-09-09 05:38:11.220 [INFO][4237] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="51b37661b7e1986cb12ec1204e6e375b0145a7be165b8574429cbbf14979e3ff" HandleID="k8s-pod-network.51b37661b7e1986cb12ec1204e6e375b0145a7be165b8574429cbbf14979e3ff" Workload="localhost-k8s-coredns--7c65d6cfc9--kzl7q-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004ecd0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7c65d6cfc9-kzl7q", "timestamp":"2025-09-09 05:38:11.220446156 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:38:11.338525 containerd[1587]: 2025-09-09 05:38:11.220 [INFO][4237] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:38:11.338525 containerd[1587]: 2025-09-09 05:38:11.220 [INFO][4237] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 05:38:11.338525 containerd[1587]: 2025-09-09 05:38:11.220 [INFO][4237] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 05:38:11.338525 containerd[1587]: 2025-09-09 05:38:11.245 [INFO][4237] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.51b37661b7e1986cb12ec1204e6e375b0145a7be165b8574429cbbf14979e3ff" host="localhost" Sep 9 05:38:11.338525 containerd[1587]: 2025-09-09 05:38:11.258 [INFO][4237] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 05:38:11.338525 containerd[1587]: 2025-09-09 05:38:11.274 [INFO][4237] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 05:38:11.338525 containerd[1587]: 2025-09-09 05:38:11.276 [INFO][4237] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 05:38:11.338525 containerd[1587]: 2025-09-09 05:38:11.281 [INFO][4237] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 05:38:11.338525 containerd[1587]: 2025-09-09 05:38:11.281 [INFO][4237] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.51b37661b7e1986cb12ec1204e6e375b0145a7be165b8574429cbbf14979e3ff" host="localhost" Sep 9 05:38:11.339191 containerd[1587]: 2025-09-09 05:38:11.286 [INFO][4237] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.51b37661b7e1986cb12ec1204e6e375b0145a7be165b8574429cbbf14979e3ff Sep 9 05:38:11.339191 containerd[1587]: 2025-09-09 05:38:11.293 [INFO][4237] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.51b37661b7e1986cb12ec1204e6e375b0145a7be165b8574429cbbf14979e3ff" host="localhost" Sep 9 05:38:11.339191 containerd[1587]: 2025-09-09 05:38:11.299 [INFO][4237] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.51b37661b7e1986cb12ec1204e6e375b0145a7be165b8574429cbbf14979e3ff" host="localhost" Sep 9 05:38:11.339191 containerd[1587]: 2025-09-09 05:38:11.299 [INFO][4237] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.51b37661b7e1986cb12ec1204e6e375b0145a7be165b8574429cbbf14979e3ff" host="localhost" Sep 9 05:38:11.339191 containerd[1587]: 2025-09-09 05:38:11.299 [INFO][4237] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 05:38:11.339191 containerd[1587]: 2025-09-09 05:38:11.299 [INFO][4237] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="51b37661b7e1986cb12ec1204e6e375b0145a7be165b8574429cbbf14979e3ff" HandleID="k8s-pod-network.51b37661b7e1986cb12ec1204e6e375b0145a7be165b8574429cbbf14979e3ff" Workload="localhost-k8s-coredns--7c65d6cfc9--kzl7q-eth0" Sep 9 05:38:11.339337 containerd[1587]: 2025-09-09 05:38:11.302 [INFO][4199] cni-plugin/k8s.go 418: Populated endpoint ContainerID="51b37661b7e1986cb12ec1204e6e375b0145a7be165b8574429cbbf14979e3ff" Namespace="kube-system" Pod="coredns-7c65d6cfc9-kzl7q" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--kzl7q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--kzl7q-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"f006145d-362b-4e7a-b943-f0f12d014871", ResourceVersion:"807", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 37, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7c65d6cfc9-kzl7q", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali745e6808854", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:38:11.339439 containerd[1587]: 2025-09-09 05:38:11.302 [INFO][4199] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="51b37661b7e1986cb12ec1204e6e375b0145a7be165b8574429cbbf14979e3ff" Namespace="kube-system" Pod="coredns-7c65d6cfc9-kzl7q" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--kzl7q-eth0" Sep 9 05:38:11.339439 containerd[1587]: 2025-09-09 05:38:11.302 [INFO][4199] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali745e6808854 ContainerID="51b37661b7e1986cb12ec1204e6e375b0145a7be165b8574429cbbf14979e3ff" Namespace="kube-system" Pod="coredns-7c65d6cfc9-kzl7q" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--kzl7q-eth0" Sep 9 05:38:11.339439 containerd[1587]: 2025-09-09 05:38:11.310 [INFO][4199] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="51b37661b7e1986cb12ec1204e6e375b0145a7be165b8574429cbbf14979e3ff" Namespace="kube-system" Pod="coredns-7c65d6cfc9-kzl7q" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--kzl7q-eth0" Sep 9 05:38:11.339505 
containerd[1587]: 2025-09-09 05:38:11.310 [INFO][4199] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="51b37661b7e1986cb12ec1204e6e375b0145a7be165b8574429cbbf14979e3ff" Namespace="kube-system" Pod="coredns-7c65d6cfc9-kzl7q" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--kzl7q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--kzl7q-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"f006145d-362b-4e7a-b943-f0f12d014871", ResourceVersion:"807", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 37, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"51b37661b7e1986cb12ec1204e6e375b0145a7be165b8574429cbbf14979e3ff", Pod:"coredns-7c65d6cfc9-kzl7q", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali745e6808854", MAC:"52:63:14:e5:9f:a4", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:38:11.339505 containerd[1587]: 2025-09-09 05:38:11.325 [INFO][4199] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="51b37661b7e1986cb12ec1204e6e375b0145a7be165b8574429cbbf14979e3ff" Namespace="kube-system" Pod="coredns-7c65d6cfc9-kzl7q" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--kzl7q-eth0" Sep 9 05:38:11.419756 sshd[4280]: Accepted publickey for core from 10.0.0.1 port 37712 ssh2: RSA SHA256:9+3J2aT7q2koLO1Rle2UX2pTYMxmV9eQF9r8rZDBoIg Sep 9 05:38:11.421697 sshd-session[4280]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:38:11.426546 systemd-logind[1571]: New session 8 of user core. Sep 9 05:38:11.436837 systemd[1]: Started session-8.scope - Session 8 of User core. 
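In the WorkloadEndpoint dumps above the ports are printed as Go hex literals: Port:0x35 is 53 (the coredns dns and dns-tcp ports) and Port:0x23c1 is 9153 (its metrics port), and the populated StrVal with NumVal 0 shows the protocol was recorded in string form ("UDP"/"TCP"). A one-off decode of the values, copied straight from the dump:

package main

import "fmt"

func main() {
	// Port values copied from the WorkloadEndpointPort dump above.
	for _, p := range []struct {
		name string
		hex  int
	}{{"dns", 0x35}, {"dns-tcp", 0x35}, {"metrics", 0x23c1}} {
		fmt.Printf("%s -> %d\n", p.name, p.hex) // 53, 53, 9153
	}
}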
Sep 9 05:38:11.443350 containerd[1587]: time="2025-09-09T05:38:11.443295897Z" level=info msg="StartContainer for \"5e9e39564df1ce031d16a1083bd2eddddcaaf8c90e48fe140ceec53a6a49147b\" returns successfully" Sep 9 05:38:11.571556 systemd-networkd[1483]: cali84a765f9a61: Link UP Sep 9 05:38:11.571766 systemd-networkd[1483]: cali84a765f9a61: Gained carrier Sep 9 05:38:11.591334 containerd[1587]: 2025-09-09 05:38:11.172 [INFO][4183] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 05:38:11.591334 containerd[1587]: 2025-09-09 05:38:11.186 [INFO][4183] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--54c44569bd--mj58w-eth0 calico-kube-controllers-54c44569bd- calico-system bdb5173f-1c88-48a1-904a-4de1b9173e2a 810 0 2025-09-09 05:37:49 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:54c44569bd projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-54c44569bd-mj58w eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali84a765f9a61 [] [] }} ContainerID="28260eaa35133916e349f8b7faa716344b73ad726215e42df8d075406213f150" Namespace="calico-system" Pod="calico-kube-controllers-54c44569bd-mj58w" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--54c44569bd--mj58w-" Sep 9 05:38:11.591334 containerd[1587]: 2025-09-09 05:38:11.186 [INFO][4183] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="28260eaa35133916e349f8b7faa716344b73ad726215e42df8d075406213f150" Namespace="calico-system" Pod="calico-kube-controllers-54c44569bd-mj58w" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--54c44569bd--mj58w-eth0" Sep 9 05:38:11.591334 containerd[1587]: 2025-09-09 05:38:11.248 [INFO][4243] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="28260eaa35133916e349f8b7faa716344b73ad726215e42df8d075406213f150" HandleID="k8s-pod-network.28260eaa35133916e349f8b7faa716344b73ad726215e42df8d075406213f150" Workload="localhost-k8s-calico--kube--controllers--54c44569bd--mj58w-eth0" Sep 9 05:38:11.591334 containerd[1587]: 2025-09-09 05:38:11.249 [INFO][4243] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="28260eaa35133916e349f8b7faa716344b73ad726215e42df8d075406213f150" HandleID="k8s-pod-network.28260eaa35133916e349f8b7faa716344b73ad726215e42df8d075406213f150" Workload="localhost-k8s-calico--kube--controllers--54c44569bd--mj58w-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004fba0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-54c44569bd-mj58w", "timestamp":"2025-09-09 05:38:11.248617495 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:38:11.591334 containerd[1587]: 2025-09-09 05:38:11.249 [INFO][4243] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:38:11.591334 containerd[1587]: 2025-09-09 05:38:11.299 [INFO][4243] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 05:38:11.591334 containerd[1587]: 2025-09-09 05:38:11.300 [INFO][4243] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 05:38:11.591334 containerd[1587]: 2025-09-09 05:38:11.353 [INFO][4243] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.28260eaa35133916e349f8b7faa716344b73ad726215e42df8d075406213f150" host="localhost" Sep 9 05:38:11.591334 containerd[1587]: 2025-09-09 05:38:11.356 [INFO][4243] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 05:38:11.591334 containerd[1587]: 2025-09-09 05:38:11.375 [INFO][4243] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 05:38:11.591334 containerd[1587]: 2025-09-09 05:38:11.377 [INFO][4243] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 05:38:11.591334 containerd[1587]: 2025-09-09 05:38:11.378 [INFO][4243] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 05:38:11.591334 containerd[1587]: 2025-09-09 05:38:11.378 [INFO][4243] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.28260eaa35133916e349f8b7faa716344b73ad726215e42df8d075406213f150" host="localhost" Sep 9 05:38:11.591334 containerd[1587]: 2025-09-09 05:38:11.380 [INFO][4243] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.28260eaa35133916e349f8b7faa716344b73ad726215e42df8d075406213f150 Sep 9 05:38:11.591334 containerd[1587]: 2025-09-09 05:38:11.521 [INFO][4243] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.28260eaa35133916e349f8b7faa716344b73ad726215e42df8d075406213f150" host="localhost" Sep 9 05:38:11.591334 containerd[1587]: 2025-09-09 05:38:11.562 [INFO][4243] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.28260eaa35133916e349f8b7faa716344b73ad726215e42df8d075406213f150" host="localhost" Sep 9 05:38:11.591334 containerd[1587]: 2025-09-09 05:38:11.562 [INFO][4243] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.28260eaa35133916e349f8b7faa716344b73ad726215e42df8d075406213f150" host="localhost" Sep 9 05:38:11.591334 containerd[1587]: 2025-09-09 05:38:11.562 [INFO][4243] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
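The ipam lines above walk through a single assignment: the host-wide IPAM lock is taken, the host's affinity for block 192.168.88.128/26 is confirmed, the block is loaded, and the next free address (192.168.88.131) is claimed and written back before the lock is released. A minimal, hypothetical sketch of that "claim the next free /32 from an affine block" step follows; it mirrors the logged flow only and is not Calico's implementation.

```go
// Hypothetical sketch of the block-assignment step described in the ipam log
// lines above. The "allocated" set is seeded so the example reproduces the
// .131 result shown in the log; the log itself does not list which lower
// addresses were already in use apart from .130.
package main

import (
	"fmt"
	"net/netip"
)

// nextFree walks the block upward from its first address and returns the
// first address not yet allocated.
func nextFree(block netip.Prefix, allocated map[netip.Addr]bool) (netip.Addr, bool) {
	for a := block.Addr(); block.Contains(a); a = a.Next() {
		if !allocated[a] {
			return a, true
		}
	}
	return netip.Addr{}, false
}

func main() {
	block := netip.MustParsePrefix("192.168.88.128/26")
	allocated := map[netip.Addr]bool{
		netip.MustParseAddr("192.168.88.128"): true,
		netip.MustParseAddr("192.168.88.129"): true,
		netip.MustParseAddr("192.168.88.130"): true, // coredns-7c65d6cfc9-kzl7q, assigned earlier in this log
	}
	if addr, ok := nextFree(block, allocated); ok {
		fmt.Printf("claimed %s from block %s\n", addr, block) // prints 192.168.88.131
	}
}
```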
Sep 9 05:38:11.591334 containerd[1587]: 2025-09-09 05:38:11.562 [INFO][4243] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="28260eaa35133916e349f8b7faa716344b73ad726215e42df8d075406213f150" HandleID="k8s-pod-network.28260eaa35133916e349f8b7faa716344b73ad726215e42df8d075406213f150" Workload="localhost-k8s-calico--kube--controllers--54c44569bd--mj58w-eth0" Sep 9 05:38:11.591962 containerd[1587]: 2025-09-09 05:38:11.567 [INFO][4183] cni-plugin/k8s.go 418: Populated endpoint ContainerID="28260eaa35133916e349f8b7faa716344b73ad726215e42df8d075406213f150" Namespace="calico-system" Pod="calico-kube-controllers-54c44569bd-mj58w" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--54c44569bd--mj58w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--54c44569bd--mj58w-eth0", GenerateName:"calico-kube-controllers-54c44569bd-", Namespace:"calico-system", SelfLink:"", UID:"bdb5173f-1c88-48a1-904a-4de1b9173e2a", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 37, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"54c44569bd", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-54c44569bd-mj58w", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali84a765f9a61", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:38:11.591962 containerd[1587]: 2025-09-09 05:38:11.567 [INFO][4183] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="28260eaa35133916e349f8b7faa716344b73ad726215e42df8d075406213f150" Namespace="calico-system" Pod="calico-kube-controllers-54c44569bd-mj58w" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--54c44569bd--mj58w-eth0" Sep 9 05:38:11.591962 containerd[1587]: 2025-09-09 05:38:11.568 [INFO][4183] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali84a765f9a61 ContainerID="28260eaa35133916e349f8b7faa716344b73ad726215e42df8d075406213f150" Namespace="calico-system" Pod="calico-kube-controllers-54c44569bd-mj58w" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--54c44569bd--mj58w-eth0" Sep 9 05:38:11.591962 containerd[1587]: 2025-09-09 05:38:11.570 [INFO][4183] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="28260eaa35133916e349f8b7faa716344b73ad726215e42df8d075406213f150" Namespace="calico-system" Pod="calico-kube-controllers-54c44569bd-mj58w" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--54c44569bd--mj58w-eth0" Sep 9 05:38:11.591962 containerd[1587]: 2025-09-09 05:38:11.571 [INFO][4183] cni-plugin/k8s.go 446: Added Mac, 
interface name, and active container ID to endpoint ContainerID="28260eaa35133916e349f8b7faa716344b73ad726215e42df8d075406213f150" Namespace="calico-system" Pod="calico-kube-controllers-54c44569bd-mj58w" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--54c44569bd--mj58w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--54c44569bd--mj58w-eth0", GenerateName:"calico-kube-controllers-54c44569bd-", Namespace:"calico-system", SelfLink:"", UID:"bdb5173f-1c88-48a1-904a-4de1b9173e2a", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 37, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"54c44569bd", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"28260eaa35133916e349f8b7faa716344b73ad726215e42df8d075406213f150", Pod:"calico-kube-controllers-54c44569bd-mj58w", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali84a765f9a61", MAC:"36:2a:f6:42:b8:21", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:38:11.591962 containerd[1587]: 2025-09-09 05:38:11.583 [INFO][4183] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="28260eaa35133916e349f8b7faa716344b73ad726215e42df8d075406213f150" Namespace="calico-system" Pod="calico-kube-controllers-54c44569bd-mj58w" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--54c44569bd--mj58w-eth0" Sep 9 05:38:11.686754 sshd[4302]: Connection closed by 10.0.0.1 port 37712 Sep 9 05:38:11.688956 sshd-session[4280]: pam_unix(sshd:session): session closed for user core Sep 9 05:38:11.692505 systemd-networkd[1483]: calied59405e1af: Link UP Sep 9 05:38:11.694071 systemd-networkd[1483]: calied59405e1af: Gained carrier Sep 9 05:38:11.699772 systemd[1]: sshd@7-10.0.0.118:22-10.0.0.1:37712.service: Deactivated successfully. Sep 9 05:38:11.704011 containerd[1587]: time="2025-09-09T05:38:11.702817766Z" level=info msg="connecting to shim 51b37661b7e1986cb12ec1204e6e375b0145a7be165b8574429cbbf14979e3ff" address="unix:///run/containerd/s/fa3d6550c1db407fd40e0bffe08184634961f6c1453eaed98254c46ff2eebbe4" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:38:11.703461 systemd[1]: session-8.scope: Deactivated successfully. Sep 9 05:38:11.705002 systemd-logind[1571]: Session 8 logged out. Waiting for processes to exit. Sep 9 05:38:11.719361 systemd-logind[1571]: Removed session 8. 
Sep 9 05:38:11.725726 containerd[1587]: time="2025-09-09T05:38:11.725660583Z" level=info msg="connecting to shim 28260eaa35133916e349f8b7faa716344b73ad726215e42df8d075406213f150" address="unix:///run/containerd/s/9dc9b9ca335b0c65dc2645616fd2949faa8a35e784f4dd01ccb578f33cfdf93e" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:38:11.725987 containerd[1587]: 2025-09-09 05:38:11.165 [INFO][4181] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 05:38:11.725987 containerd[1587]: 2025-09-09 05:38:11.178 [INFO][4181] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--7988f88666--45sx9-eth0 goldmane-7988f88666- calico-system b4ee7a58-3044-4fd9-b673-e51540fd4be9 814 0 2025-09-09 05:37:49 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7988f88666 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-7988f88666-45sx9 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calied59405e1af [] [] }} ContainerID="841d6051acc7c49041d9ebfa2198a8fdd6016def34129430fbdb86430069a656" Namespace="calico-system" Pod="goldmane-7988f88666-45sx9" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--45sx9-" Sep 9 05:38:11.725987 containerd[1587]: 2025-09-09 05:38:11.179 [INFO][4181] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="841d6051acc7c49041d9ebfa2198a8fdd6016def34129430fbdb86430069a656" Namespace="calico-system" Pod="goldmane-7988f88666-45sx9" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--45sx9-eth0" Sep 9 05:38:11.725987 containerd[1587]: 2025-09-09 05:38:11.273 [INFO][4230] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="841d6051acc7c49041d9ebfa2198a8fdd6016def34129430fbdb86430069a656" HandleID="k8s-pod-network.841d6051acc7c49041d9ebfa2198a8fdd6016def34129430fbdb86430069a656" Workload="localhost-k8s-goldmane--7988f88666--45sx9-eth0" Sep 9 05:38:11.725987 containerd[1587]: 2025-09-09 05:38:11.273 [INFO][4230] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="841d6051acc7c49041d9ebfa2198a8fdd6016def34129430fbdb86430069a656" HandleID="k8s-pod-network.841d6051acc7c49041d9ebfa2198a8fdd6016def34129430fbdb86430069a656" Workload="localhost-k8s-goldmane--7988f88666--45sx9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0004913c0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-7988f88666-45sx9", "timestamp":"2025-09-09 05:38:11.273402391 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:38:11.725987 containerd[1587]: 2025-09-09 05:38:11.281 [INFO][4230] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:38:11.725987 containerd[1587]: 2025-09-09 05:38:11.562 [INFO][4230] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 05:38:11.725987 containerd[1587]: 2025-09-09 05:38:11.562 [INFO][4230] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 05:38:11.725987 containerd[1587]: 2025-09-09 05:38:11.584 [INFO][4230] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.841d6051acc7c49041d9ebfa2198a8fdd6016def34129430fbdb86430069a656" host="localhost" Sep 9 05:38:11.725987 containerd[1587]: 2025-09-09 05:38:11.595 [INFO][4230] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 05:38:11.725987 containerd[1587]: 2025-09-09 05:38:11.599 [INFO][4230] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 05:38:11.725987 containerd[1587]: 2025-09-09 05:38:11.600 [INFO][4230] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 05:38:11.725987 containerd[1587]: 2025-09-09 05:38:11.602 [INFO][4230] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 05:38:11.725987 containerd[1587]: 2025-09-09 05:38:11.602 [INFO][4230] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.841d6051acc7c49041d9ebfa2198a8fdd6016def34129430fbdb86430069a656" host="localhost" Sep 9 05:38:11.725987 containerd[1587]: 2025-09-09 05:38:11.603 [INFO][4230] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.841d6051acc7c49041d9ebfa2198a8fdd6016def34129430fbdb86430069a656 Sep 9 05:38:11.725987 containerd[1587]: 2025-09-09 05:38:11.677 [INFO][4230] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.841d6051acc7c49041d9ebfa2198a8fdd6016def34129430fbdb86430069a656" host="localhost" Sep 9 05:38:11.725987 containerd[1587]: 2025-09-09 05:38:11.685 [INFO][4230] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.841d6051acc7c49041d9ebfa2198a8fdd6016def34129430fbdb86430069a656" host="localhost" Sep 9 05:38:11.725987 containerd[1587]: 2025-09-09 05:38:11.685 [INFO][4230] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.841d6051acc7c49041d9ebfa2198a8fdd6016def34129430fbdb86430069a656" host="localhost" Sep 9 05:38:11.725987 containerd[1587]: 2025-09-09 05:38:11.685 [INFO][4230] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 05:38:11.725987 containerd[1587]: 2025-09-09 05:38:11.685 [INFO][4230] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="841d6051acc7c49041d9ebfa2198a8fdd6016def34129430fbdb86430069a656" HandleID="k8s-pod-network.841d6051acc7c49041d9ebfa2198a8fdd6016def34129430fbdb86430069a656" Workload="localhost-k8s-goldmane--7988f88666--45sx9-eth0" Sep 9 05:38:11.726475 containerd[1587]: 2025-09-09 05:38:11.689 [INFO][4181] cni-plugin/k8s.go 418: Populated endpoint ContainerID="841d6051acc7c49041d9ebfa2198a8fdd6016def34129430fbdb86430069a656" Namespace="calico-system" Pod="goldmane-7988f88666-45sx9" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--45sx9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--7988f88666--45sx9-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"b4ee7a58-3044-4fd9-b673-e51540fd4be9", ResourceVersion:"814", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 37, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-7988f88666-45sx9", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calied59405e1af", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:38:11.726475 containerd[1587]: 2025-09-09 05:38:11.690 [INFO][4181] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="841d6051acc7c49041d9ebfa2198a8fdd6016def34129430fbdb86430069a656" Namespace="calico-system" Pod="goldmane-7988f88666-45sx9" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--45sx9-eth0" Sep 9 05:38:11.726475 containerd[1587]: 2025-09-09 05:38:11.690 [INFO][4181] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calied59405e1af ContainerID="841d6051acc7c49041d9ebfa2198a8fdd6016def34129430fbdb86430069a656" Namespace="calico-system" Pod="goldmane-7988f88666-45sx9" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--45sx9-eth0" Sep 9 05:38:11.726475 containerd[1587]: 2025-09-09 05:38:11.697 [INFO][4181] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="841d6051acc7c49041d9ebfa2198a8fdd6016def34129430fbdb86430069a656" Namespace="calico-system" Pod="goldmane-7988f88666-45sx9" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--45sx9-eth0" Sep 9 05:38:11.726475 containerd[1587]: 2025-09-09 05:38:11.703 [INFO][4181] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="841d6051acc7c49041d9ebfa2198a8fdd6016def34129430fbdb86430069a656" Namespace="calico-system" Pod="goldmane-7988f88666-45sx9" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--45sx9-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--7988f88666--45sx9-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"b4ee7a58-3044-4fd9-b673-e51540fd4be9", ResourceVersion:"814", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 37, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"841d6051acc7c49041d9ebfa2198a8fdd6016def34129430fbdb86430069a656", Pod:"goldmane-7988f88666-45sx9", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calied59405e1af", MAC:"de:31:e1:1a:91:ff", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:38:11.726475 containerd[1587]: 2025-09-09 05:38:11.722 [INFO][4181] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="841d6051acc7c49041d9ebfa2198a8fdd6016def34129430fbdb86430069a656" Namespace="calico-system" Pod="goldmane-7988f88666-45sx9" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--45sx9-eth0" Sep 9 05:38:11.751620 containerd[1587]: time="2025-09-09T05:38:11.751548030Z" level=info msg="connecting to shim 841d6051acc7c49041d9ebfa2198a8fdd6016def34129430fbdb86430069a656" address="unix:///run/containerd/s/465fd37f01ad37b4be23a5cdb050a1fc002312d77d5c3ee86b018767aa132f18" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:38:11.753887 systemd[1]: Started cri-containerd-51b37661b7e1986cb12ec1204e6e375b0145a7be165b8574429cbbf14979e3ff.scope - libcontainer container 51b37661b7e1986cb12ec1204e6e375b0145a7be165b8574429cbbf14979e3ff. Sep 9 05:38:11.757936 systemd[1]: Started cri-containerd-28260eaa35133916e349f8b7faa716344b73ad726215e42df8d075406213f150.scope - libcontainer container 28260eaa35133916e349f8b7faa716344b73ad726215e42df8d075406213f150. Sep 9 05:38:11.772248 systemd-resolved[1407]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 05:38:11.780755 systemd[1]: Started cri-containerd-841d6051acc7c49041d9ebfa2198a8fdd6016def34129430fbdb86430069a656.scope - libcontainer container 841d6051acc7c49041d9ebfa2198a8fdd6016def34129430fbdb86430069a656. 
Sep 9 05:38:11.786956 systemd-resolved[1407]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 05:38:11.805117 containerd[1587]: time="2025-09-09T05:38:11.805007159Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-kzl7q,Uid:f006145d-362b-4e7a-b943-f0f12d014871,Namespace:kube-system,Attempt:0,} returns sandbox id \"51b37661b7e1986cb12ec1204e6e375b0145a7be165b8574429cbbf14979e3ff\"" Sep 9 05:38:11.809304 containerd[1587]: time="2025-09-09T05:38:11.809205126Z" level=info msg="CreateContainer within sandbox \"51b37661b7e1986cb12ec1204e6e375b0145a7be165b8574429cbbf14979e3ff\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 9 05:38:11.811849 systemd-resolved[1407]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 05:38:11.823286 containerd[1587]: time="2025-09-09T05:38:11.823244614Z" level=info msg="Container c8fcd380ed5513463d4d2959fa654079d5b9c92ad5c36084e21da0f425b625a2: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:38:11.831712 containerd[1587]: time="2025-09-09T05:38:11.831581587Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-54c44569bd-mj58w,Uid:bdb5173f-1c88-48a1-904a-4de1b9173e2a,Namespace:calico-system,Attempt:0,} returns sandbox id \"28260eaa35133916e349f8b7faa716344b73ad726215e42df8d075406213f150\"" Sep 9 05:38:11.832411 containerd[1587]: time="2025-09-09T05:38:11.832386139Z" level=info msg="CreateContainer within sandbox \"51b37661b7e1986cb12ec1204e6e375b0145a7be165b8574429cbbf14979e3ff\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"c8fcd380ed5513463d4d2959fa654079d5b9c92ad5c36084e21da0f425b625a2\"" Sep 9 05:38:11.832949 containerd[1587]: time="2025-09-09T05:38:11.832923970Z" level=info msg="StartContainer for \"c8fcd380ed5513463d4d2959fa654079d5b9c92ad5c36084e21da0f425b625a2\"" Sep 9 05:38:11.835106 containerd[1587]: time="2025-09-09T05:38:11.834966917Z" level=info msg="connecting to shim c8fcd380ed5513463d4d2959fa654079d5b9c92ad5c36084e21da0f425b625a2" address="unix:///run/containerd/s/fa3d6550c1db407fd40e0bffe08184634961f6c1453eaed98254c46ff2eebbe4" protocol=ttrpc version=3 Sep 9 05:38:11.835876 containerd[1587]: time="2025-09-09T05:38:11.835853423Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 9 05:38:11.851332 containerd[1587]: time="2025-09-09T05:38:11.851213301Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-45sx9,Uid:b4ee7a58-3044-4fd9-b673-e51540fd4be9,Namespace:calico-system,Attempt:0,} returns sandbox id \"841d6051acc7c49041d9ebfa2198a8fdd6016def34129430fbdb86430069a656\"" Sep 9 05:38:11.874730 systemd[1]: Started cri-containerd-c8fcd380ed5513463d4d2959fa654079d5b9c92ad5c36084e21da0f425b625a2.scope - libcontainer container c8fcd380ed5513463d4d2959fa654079d5b9c92ad5c36084e21da0f425b625a2. 
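Several containerd operations are interleaved in the lines above: RunPodSandbox returning a 64-hex-character sandbox ID, CreateContainer returning a container ID inside that sandbox, and "connecting to shim" giving the shim's unix socket address. When reading such interleaved journal output, it can help to pull the IDs out mechanically; the snippet below is a small illustrative helper (not part of containerd or any tool shown in this log) that extracts them with a regular expression.

```go
// Illustrative log helper: extract the 64-hex sandbox/container IDs from
// journal lines such as "RunPodSandbox ... returns sandbox id ..." and
// "CreateContainer ... returns container id ...".
package main

import (
	"fmt"
	"regexp"
)

var idRe = regexp.MustCompile(`returns (sandbox|container) id \W*([0-9a-f]{64})`)

func main() {
	lines := []string{
		`msg="RunPodSandbox for ... returns sandbox id \"51b37661b7e1986cb12ec1204e6e375b0145a7be165b8574429cbbf14979e3ff\""`,
		`msg="CreateContainer within sandbox ... returns container id \"c8fcd380ed5513463d4d2959fa654079d5b9c92ad5c36084e21da0f425b625a2\""`,
	}
	for _, l := range lines {
		if m := idRe.FindStringSubmatch(l); m != nil {
			fmt.Printf("%-9s %s\n", m[1], m[2]) // kind, then the 64-hex ID
		}
	}
}
```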
Sep 9 05:38:11.907144 containerd[1587]: time="2025-09-09T05:38:11.907100602Z" level=info msg="StartContainer for \"c8fcd380ed5513463d4d2959fa654079d5b9c92ad5c36084e21da0f425b625a2\" returns successfully" Sep 9 05:38:12.118850 containerd[1587]: time="2025-09-09T05:38:12.118718152Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fr4sd,Uid:382d750e-f8f8-4cd7-87b5-dfa1289af050,Namespace:calico-system,Attempt:0,}" Sep 9 05:38:12.118850 containerd[1587]: time="2025-09-09T05:38:12.118733571Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64659bc7c5-dbld7,Uid:d6d53f50-cd74-44ff-bef9-71572dfb8e98,Namespace:calico-apiserver,Attempt:0,}" Sep 9 05:38:12.222477 systemd-networkd[1483]: cali8671a71c8d8: Link UP Sep 9 05:38:12.223803 systemd-networkd[1483]: cali8671a71c8d8: Gained carrier Sep 9 05:38:12.237776 kubelet[2731]: I0909 05:38:12.237613 2731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-kzl7q" podStartSLOduration=33.237576725 podStartE2EDuration="33.237576725s" podCreationTimestamp="2025-09-09 05:37:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 05:38:12.237130376 +0000 UTC m=+40.194519236" watchObservedRunningTime="2025-09-09 05:38:12.237576725 +0000 UTC m=+40.194965595" Sep 9 05:38:12.239686 containerd[1587]: 2025-09-09 05:38:12.146 [INFO][4534] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 05:38:12.239686 containerd[1587]: 2025-09-09 05:38:12.158 [INFO][4534] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--64659bc7c5--dbld7-eth0 calico-apiserver-64659bc7c5- calico-apiserver d6d53f50-cd74-44ff-bef9-71572dfb8e98 818 0 2025-09-09 05:37:47 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:64659bc7c5 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-64659bc7c5-dbld7 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali8671a71c8d8 [] [] }} ContainerID="800222533b91821c2c499b44e17aa70dfec8bc602c66a48655350d973850ca68" Namespace="calico-apiserver" Pod="calico-apiserver-64659bc7c5-dbld7" WorkloadEndpoint="localhost-k8s-calico--apiserver--64659bc7c5--dbld7-" Sep 9 05:38:12.239686 containerd[1587]: 2025-09-09 05:38:12.158 [INFO][4534] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="800222533b91821c2c499b44e17aa70dfec8bc602c66a48655350d973850ca68" Namespace="calico-apiserver" Pod="calico-apiserver-64659bc7c5-dbld7" WorkloadEndpoint="localhost-k8s-calico--apiserver--64659bc7c5--dbld7-eth0" Sep 9 05:38:12.239686 containerd[1587]: 2025-09-09 05:38:12.185 [INFO][4557] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="800222533b91821c2c499b44e17aa70dfec8bc602c66a48655350d973850ca68" HandleID="k8s-pod-network.800222533b91821c2c499b44e17aa70dfec8bc602c66a48655350d973850ca68" Workload="localhost-k8s-calico--apiserver--64659bc7c5--dbld7-eth0" Sep 9 05:38:12.239686 containerd[1587]: 2025-09-09 05:38:12.185 [INFO][4557] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="800222533b91821c2c499b44e17aa70dfec8bc602c66a48655350d973850ca68" 
HandleID="k8s-pod-network.800222533b91821c2c499b44e17aa70dfec8bc602c66a48655350d973850ca68" Workload="localhost-k8s-calico--apiserver--64659bc7c5--dbld7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f530), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-64659bc7c5-dbld7", "timestamp":"2025-09-09 05:38:12.185468837 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:38:12.239686 containerd[1587]: 2025-09-09 05:38:12.185 [INFO][4557] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:38:12.239686 containerd[1587]: 2025-09-09 05:38:12.185 [INFO][4557] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 05:38:12.239686 containerd[1587]: 2025-09-09 05:38:12.185 [INFO][4557] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 05:38:12.239686 containerd[1587]: 2025-09-09 05:38:12.193 [INFO][4557] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.800222533b91821c2c499b44e17aa70dfec8bc602c66a48655350d973850ca68" host="localhost" Sep 9 05:38:12.239686 containerd[1587]: 2025-09-09 05:38:12.197 [INFO][4557] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 05:38:12.239686 containerd[1587]: 2025-09-09 05:38:12.200 [INFO][4557] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 05:38:12.239686 containerd[1587]: 2025-09-09 05:38:12.203 [INFO][4557] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 05:38:12.239686 containerd[1587]: 2025-09-09 05:38:12.205 [INFO][4557] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 05:38:12.239686 containerd[1587]: 2025-09-09 05:38:12.205 [INFO][4557] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.800222533b91821c2c499b44e17aa70dfec8bc602c66a48655350d973850ca68" host="localhost" Sep 9 05:38:12.239686 containerd[1587]: 2025-09-09 05:38:12.206 [INFO][4557] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.800222533b91821c2c499b44e17aa70dfec8bc602c66a48655350d973850ca68 Sep 9 05:38:12.239686 containerd[1587]: 2025-09-09 05:38:12.209 [INFO][4557] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.800222533b91821c2c499b44e17aa70dfec8bc602c66a48655350d973850ca68" host="localhost" Sep 9 05:38:12.239686 containerd[1587]: 2025-09-09 05:38:12.215 [INFO][4557] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.800222533b91821c2c499b44e17aa70dfec8bc602c66a48655350d973850ca68" host="localhost" Sep 9 05:38:12.239686 containerd[1587]: 2025-09-09 05:38:12.215 [INFO][4557] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.800222533b91821c2c499b44e17aa70dfec8bc602c66a48655350d973850ca68" host="localhost" Sep 9 05:38:12.239686 containerd[1587]: 2025-09-09 05:38:12.215 [INFO][4557] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 05:38:12.239686 containerd[1587]: 2025-09-09 05:38:12.215 [INFO][4557] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="800222533b91821c2c499b44e17aa70dfec8bc602c66a48655350d973850ca68" HandleID="k8s-pod-network.800222533b91821c2c499b44e17aa70dfec8bc602c66a48655350d973850ca68" Workload="localhost-k8s-calico--apiserver--64659bc7c5--dbld7-eth0" Sep 9 05:38:12.240336 containerd[1587]: 2025-09-09 05:38:12.219 [INFO][4534] cni-plugin/k8s.go 418: Populated endpoint ContainerID="800222533b91821c2c499b44e17aa70dfec8bc602c66a48655350d973850ca68" Namespace="calico-apiserver" Pod="calico-apiserver-64659bc7c5-dbld7" WorkloadEndpoint="localhost-k8s-calico--apiserver--64659bc7c5--dbld7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--64659bc7c5--dbld7-eth0", GenerateName:"calico-apiserver-64659bc7c5-", Namespace:"calico-apiserver", SelfLink:"", UID:"d6d53f50-cd74-44ff-bef9-71572dfb8e98", ResourceVersion:"818", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 37, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"64659bc7c5", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-64659bc7c5-dbld7", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali8671a71c8d8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:38:12.240336 containerd[1587]: 2025-09-09 05:38:12.219 [INFO][4534] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="800222533b91821c2c499b44e17aa70dfec8bc602c66a48655350d973850ca68" Namespace="calico-apiserver" Pod="calico-apiserver-64659bc7c5-dbld7" WorkloadEndpoint="localhost-k8s-calico--apiserver--64659bc7c5--dbld7-eth0" Sep 9 05:38:12.240336 containerd[1587]: 2025-09-09 05:38:12.219 [INFO][4534] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8671a71c8d8 ContainerID="800222533b91821c2c499b44e17aa70dfec8bc602c66a48655350d973850ca68" Namespace="calico-apiserver" Pod="calico-apiserver-64659bc7c5-dbld7" WorkloadEndpoint="localhost-k8s-calico--apiserver--64659bc7c5--dbld7-eth0" Sep 9 05:38:12.240336 containerd[1587]: 2025-09-09 05:38:12.223 [INFO][4534] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="800222533b91821c2c499b44e17aa70dfec8bc602c66a48655350d973850ca68" Namespace="calico-apiserver" Pod="calico-apiserver-64659bc7c5-dbld7" WorkloadEndpoint="localhost-k8s-calico--apiserver--64659bc7c5--dbld7-eth0" Sep 9 05:38:12.240336 containerd[1587]: 2025-09-09 05:38:12.223 [INFO][4534] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="800222533b91821c2c499b44e17aa70dfec8bc602c66a48655350d973850ca68" Namespace="calico-apiserver" Pod="calico-apiserver-64659bc7c5-dbld7" WorkloadEndpoint="localhost-k8s-calico--apiserver--64659bc7c5--dbld7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--64659bc7c5--dbld7-eth0", GenerateName:"calico-apiserver-64659bc7c5-", Namespace:"calico-apiserver", SelfLink:"", UID:"d6d53f50-cd74-44ff-bef9-71572dfb8e98", ResourceVersion:"818", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 37, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"64659bc7c5", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"800222533b91821c2c499b44e17aa70dfec8bc602c66a48655350d973850ca68", Pod:"calico-apiserver-64659bc7c5-dbld7", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali8671a71c8d8", MAC:"b6:41:25:20:21:c3", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:38:12.240336 containerd[1587]: 2025-09-09 05:38:12.234 [INFO][4534] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="800222533b91821c2c499b44e17aa70dfec8bc602c66a48655350d973850ca68" Namespace="calico-apiserver" Pod="calico-apiserver-64659bc7c5-dbld7" WorkloadEndpoint="localhost-k8s-calico--apiserver--64659bc7c5--dbld7-eth0" Sep 9 05:38:12.291630 kubelet[2731]: I0909 05:38:12.291347 2731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-7c4bd5fd7-j7zbb" podStartSLOduration=2.0775269610000002 podStartE2EDuration="6.291329503s" podCreationTimestamp="2025-09-09 05:38:06 +0000 UTC" firstStartedPulling="2025-09-09 05:38:06.91429487 +0000 UTC m=+34.871683740" lastFinishedPulling="2025-09-09 05:38:11.128097412 +0000 UTC m=+39.085486282" observedRunningTime="2025-09-09 05:38:12.290344883 +0000 UTC m=+40.247733753" watchObservedRunningTime="2025-09-09 05:38:12.291329503 +0000 UTC m=+40.248718373" Sep 9 05:38:12.307611 containerd[1587]: time="2025-09-09T05:38:12.307278894Z" level=info msg="connecting to shim 800222533b91821c2c499b44e17aa70dfec8bc602c66a48655350d973850ca68" address="unix:///run/containerd/s/6132f112f6f827af5417763a82cf7e33a56b3840ee0d0e81e3b0b8f4db835475" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:38:12.350748 systemd[1]: Started cri-containerd-800222533b91821c2c499b44e17aa70dfec8bc602c66a48655350d973850ca68.scope - libcontainer container 800222533b91821c2c499b44e17aa70dfec8bc602c66a48655350d973850ca68. 
Sep 9 05:38:12.373095 systemd-resolved[1407]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 05:38:12.578134 systemd-networkd[1483]: cali53f6f46c44f: Link UP Sep 9 05:38:12.578809 systemd-networkd[1483]: cali53f6f46c44f: Gained carrier Sep 9 05:38:12.583267 containerd[1587]: time="2025-09-09T05:38:12.583225316Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64659bc7c5-dbld7,Uid:d6d53f50-cd74-44ff-bef9-71572dfb8e98,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"800222533b91821c2c499b44e17aa70dfec8bc602c66a48655350d973850ca68\"" Sep 9 05:38:12.603836 containerd[1587]: 2025-09-09 05:38:12.144 [INFO][4529] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 05:38:12.603836 containerd[1587]: 2025-09-09 05:38:12.158 [INFO][4529] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--fr4sd-eth0 csi-node-driver- calico-system 382d750e-f8f8-4cd7-87b5-dfa1289af050 707 0 2025-09-09 05:37:49 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:856c6b598f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-fr4sd eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali53f6f46c44f [] [] }} ContainerID="5c9b76e29549ddc49202cb160f26d1833ac330679f1609b15f0376126a32ea86" Namespace="calico-system" Pod="csi-node-driver-fr4sd" WorkloadEndpoint="localhost-k8s-csi--node--driver--fr4sd-" Sep 9 05:38:12.603836 containerd[1587]: 2025-09-09 05:38:12.158 [INFO][4529] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5c9b76e29549ddc49202cb160f26d1833ac330679f1609b15f0376126a32ea86" Namespace="calico-system" Pod="csi-node-driver-fr4sd" WorkloadEndpoint="localhost-k8s-csi--node--driver--fr4sd-eth0" Sep 9 05:38:12.603836 containerd[1587]: 2025-09-09 05:38:12.189 [INFO][4558] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5c9b76e29549ddc49202cb160f26d1833ac330679f1609b15f0376126a32ea86" HandleID="k8s-pod-network.5c9b76e29549ddc49202cb160f26d1833ac330679f1609b15f0376126a32ea86" Workload="localhost-k8s-csi--node--driver--fr4sd-eth0" Sep 9 05:38:12.603836 containerd[1587]: 2025-09-09 05:38:12.189 [INFO][4558] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5c9b76e29549ddc49202cb160f26d1833ac330679f1609b15f0376126a32ea86" HandleID="k8s-pod-network.5c9b76e29549ddc49202cb160f26d1833ac330679f1609b15f0376126a32ea86" Workload="localhost-k8s-csi--node--driver--fr4sd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000392960), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-fr4sd", "timestamp":"2025-09-09 05:38:12.189112862 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:38:12.603836 containerd[1587]: 2025-09-09 05:38:12.189 [INFO][4558] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:38:12.603836 containerd[1587]: 2025-09-09 05:38:12.215 [INFO][4558] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 05:38:12.603836 containerd[1587]: 2025-09-09 05:38:12.216 [INFO][4558] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 05:38:12.603836 containerd[1587]: 2025-09-09 05:38:12.303 [INFO][4558] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5c9b76e29549ddc49202cb160f26d1833ac330679f1609b15f0376126a32ea86" host="localhost" Sep 9 05:38:12.603836 containerd[1587]: 2025-09-09 05:38:12.312 [INFO][4558] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 05:38:12.603836 containerd[1587]: 2025-09-09 05:38:12.322 [INFO][4558] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 05:38:12.603836 containerd[1587]: 2025-09-09 05:38:12.328 [INFO][4558] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 05:38:12.603836 containerd[1587]: 2025-09-09 05:38:12.335 [INFO][4558] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 05:38:12.603836 containerd[1587]: 2025-09-09 05:38:12.335 [INFO][4558] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.5c9b76e29549ddc49202cb160f26d1833ac330679f1609b15f0376126a32ea86" host="localhost" Sep 9 05:38:12.603836 containerd[1587]: 2025-09-09 05:38:12.338 [INFO][4558] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.5c9b76e29549ddc49202cb160f26d1833ac330679f1609b15f0376126a32ea86 Sep 9 05:38:12.603836 containerd[1587]: 2025-09-09 05:38:12.344 [INFO][4558] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.5c9b76e29549ddc49202cb160f26d1833ac330679f1609b15f0376126a32ea86" host="localhost" Sep 9 05:38:12.603836 containerd[1587]: 2025-09-09 05:38:12.568 [INFO][4558] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.5c9b76e29549ddc49202cb160f26d1833ac330679f1609b15f0376126a32ea86" host="localhost" Sep 9 05:38:12.603836 containerd[1587]: 2025-09-09 05:38:12.568 [INFO][4558] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.5c9b76e29549ddc49202cb160f26d1833ac330679f1609b15f0376126a32ea86" host="localhost" Sep 9 05:38:12.603836 containerd[1587]: 2025-09-09 05:38:12.568 [INFO][4558] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 05:38:12.603836 containerd[1587]: 2025-09-09 05:38:12.568 [INFO][4558] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="5c9b76e29549ddc49202cb160f26d1833ac330679f1609b15f0376126a32ea86" HandleID="k8s-pod-network.5c9b76e29549ddc49202cb160f26d1833ac330679f1609b15f0376126a32ea86" Workload="localhost-k8s-csi--node--driver--fr4sd-eth0" Sep 9 05:38:12.604439 containerd[1587]: 2025-09-09 05:38:12.572 [INFO][4529] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5c9b76e29549ddc49202cb160f26d1833ac330679f1609b15f0376126a32ea86" Namespace="calico-system" Pod="csi-node-driver-fr4sd" WorkloadEndpoint="localhost-k8s-csi--node--driver--fr4sd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--fr4sd-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"382d750e-f8f8-4cd7-87b5-dfa1289af050", ResourceVersion:"707", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 37, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-fr4sd", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali53f6f46c44f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:38:12.604439 containerd[1587]: 2025-09-09 05:38:12.573 [INFO][4529] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="5c9b76e29549ddc49202cb160f26d1833ac330679f1609b15f0376126a32ea86" Namespace="calico-system" Pod="csi-node-driver-fr4sd" WorkloadEndpoint="localhost-k8s-csi--node--driver--fr4sd-eth0" Sep 9 05:38:12.604439 containerd[1587]: 2025-09-09 05:38:12.573 [INFO][4529] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali53f6f46c44f ContainerID="5c9b76e29549ddc49202cb160f26d1833ac330679f1609b15f0376126a32ea86" Namespace="calico-system" Pod="csi-node-driver-fr4sd" WorkloadEndpoint="localhost-k8s-csi--node--driver--fr4sd-eth0" Sep 9 05:38:12.604439 containerd[1587]: 2025-09-09 05:38:12.575 [INFO][4529] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5c9b76e29549ddc49202cb160f26d1833ac330679f1609b15f0376126a32ea86" Namespace="calico-system" Pod="csi-node-driver-fr4sd" WorkloadEndpoint="localhost-k8s-csi--node--driver--fr4sd-eth0" Sep 9 05:38:12.604439 containerd[1587]: 2025-09-09 05:38:12.575 [INFO][4529] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5c9b76e29549ddc49202cb160f26d1833ac330679f1609b15f0376126a32ea86" Namespace="calico-system" Pod="csi-node-driver-fr4sd" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--fr4sd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--fr4sd-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"382d750e-f8f8-4cd7-87b5-dfa1289af050", ResourceVersion:"707", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 37, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"5c9b76e29549ddc49202cb160f26d1833ac330679f1609b15f0376126a32ea86", Pod:"csi-node-driver-fr4sd", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali53f6f46c44f", MAC:"06:b4:07:f8:42:ed", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:38:12.604439 containerd[1587]: 2025-09-09 05:38:12.599 [INFO][4529] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5c9b76e29549ddc49202cb160f26d1833ac330679f1609b15f0376126a32ea86" Namespace="calico-system" Pod="csi-node-driver-fr4sd" WorkloadEndpoint="localhost-k8s-csi--node--driver--fr4sd-eth0" Sep 9 05:38:12.628558 containerd[1587]: time="2025-09-09T05:38:12.626287261Z" level=info msg="connecting to shim 5c9b76e29549ddc49202cb160f26d1833ac330679f1609b15f0376126a32ea86" address="unix:///run/containerd/s/7f841a7bc18e0150695c633a7fc79bf59b4aebc0800a3478cb87df7fd088c0f3" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:38:12.661814 systemd[1]: Started cri-containerd-5c9b76e29549ddc49202cb160f26d1833ac330679f1609b15f0376126a32ea86.scope - libcontainer container 5c9b76e29549ddc49202cb160f26d1833ac330679f1609b15f0376126a32ea86. 
Sep 9 05:38:12.674717 systemd-resolved[1407]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 05:38:12.690473 containerd[1587]: time="2025-09-09T05:38:12.690438564Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fr4sd,Uid:382d750e-f8f8-4cd7-87b5-dfa1289af050,Namespace:calico-system,Attempt:0,} returns sandbox id \"5c9b76e29549ddc49202cb160f26d1833ac330679f1609b15f0376126a32ea86\"" Sep 9 05:38:12.840798 systemd-networkd[1483]: calied59405e1af: Gained IPv6LL Sep 9 05:38:12.969513 systemd-networkd[1483]: cali745e6808854: Gained IPv6LL Sep 9 05:38:13.119242 containerd[1587]: time="2025-09-09T05:38:13.119184592Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64659bc7c5-hdpgx,Uid:e24812f7-a033-4586-8f78-dd439cb8f791,Namespace:calico-apiserver,Attempt:0,}" Sep 9 05:38:13.119963 containerd[1587]: time="2025-09-09T05:38:13.119919573Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-tpcjd,Uid:f552aed4-9ee6-4d96-b587-80305d432ef7,Namespace:kube-system,Attempt:0,}" Sep 9 05:38:13.220745 systemd-networkd[1483]: cali3bd265e66f7: Link UP Sep 9 05:38:13.220962 systemd-networkd[1483]: cali3bd265e66f7: Gained carrier Sep 9 05:38:13.232363 containerd[1587]: 2025-09-09 05:38:13.148 [INFO][4718] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 05:38:13.232363 containerd[1587]: 2025-09-09 05:38:13.158 [INFO][4718] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7c65d6cfc9--tpcjd-eth0 coredns-7c65d6cfc9- kube-system f552aed4-9ee6-4d96-b587-80305d432ef7 816 0 2025-09-09 05:37:39 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7c65d6cfc9-tpcjd eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali3bd265e66f7 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="0601883c74ed1218e1c294ec71e0a2e5fb837a4882b92d84ac414b17a0bbdbda" Namespace="kube-system" Pod="coredns-7c65d6cfc9-tpcjd" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--tpcjd-" Sep 9 05:38:13.232363 containerd[1587]: 2025-09-09 05:38:13.158 [INFO][4718] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0601883c74ed1218e1c294ec71e0a2e5fb837a4882b92d84ac414b17a0bbdbda" Namespace="kube-system" Pod="coredns-7c65d6cfc9-tpcjd" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--tpcjd-eth0" Sep 9 05:38:13.232363 containerd[1587]: 2025-09-09 05:38:13.183 [INFO][4739] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0601883c74ed1218e1c294ec71e0a2e5fb837a4882b92d84ac414b17a0bbdbda" HandleID="k8s-pod-network.0601883c74ed1218e1c294ec71e0a2e5fb837a4882b92d84ac414b17a0bbdbda" Workload="localhost-k8s-coredns--7c65d6cfc9--tpcjd-eth0" Sep 9 05:38:13.232363 containerd[1587]: 2025-09-09 05:38:13.183 [INFO][4739] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="0601883c74ed1218e1c294ec71e0a2e5fb837a4882b92d84ac414b17a0bbdbda" HandleID="k8s-pod-network.0601883c74ed1218e1c294ec71e0a2e5fb837a4882b92d84ac414b17a0bbdbda" Workload="localhost-k8s-coredns--7c65d6cfc9--tpcjd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000189d70), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7c65d6cfc9-tpcjd", 
"timestamp":"2025-09-09 05:38:13.18347963 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:38:13.232363 containerd[1587]: 2025-09-09 05:38:13.183 [INFO][4739] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:38:13.232363 containerd[1587]: 2025-09-09 05:38:13.183 [INFO][4739] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 05:38:13.232363 containerd[1587]: 2025-09-09 05:38:13.183 [INFO][4739] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 05:38:13.232363 containerd[1587]: 2025-09-09 05:38:13.189 [INFO][4739] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0601883c74ed1218e1c294ec71e0a2e5fb837a4882b92d84ac414b17a0bbdbda" host="localhost" Sep 9 05:38:13.232363 containerd[1587]: 2025-09-09 05:38:13.193 [INFO][4739] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 05:38:13.232363 containerd[1587]: 2025-09-09 05:38:13.199 [INFO][4739] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 05:38:13.232363 containerd[1587]: 2025-09-09 05:38:13.200 [INFO][4739] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 05:38:13.232363 containerd[1587]: 2025-09-09 05:38:13.203 [INFO][4739] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 05:38:13.232363 containerd[1587]: 2025-09-09 05:38:13.203 [INFO][4739] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.0601883c74ed1218e1c294ec71e0a2e5fb837a4882b92d84ac414b17a0bbdbda" host="localhost" Sep 9 05:38:13.232363 containerd[1587]: 2025-09-09 05:38:13.204 [INFO][4739] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.0601883c74ed1218e1c294ec71e0a2e5fb837a4882b92d84ac414b17a0bbdbda Sep 9 05:38:13.232363 containerd[1587]: 2025-09-09 05:38:13.209 [INFO][4739] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.0601883c74ed1218e1c294ec71e0a2e5fb837a4882b92d84ac414b17a0bbdbda" host="localhost" Sep 9 05:38:13.232363 containerd[1587]: 2025-09-09 05:38:13.214 [INFO][4739] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 handle="k8s-pod-network.0601883c74ed1218e1c294ec71e0a2e5fb837a4882b92d84ac414b17a0bbdbda" host="localhost" Sep 9 05:38:13.232363 containerd[1587]: 2025-09-09 05:38:13.214 [INFO][4739] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.0601883c74ed1218e1c294ec71e0a2e5fb837a4882b92d84ac414b17a0bbdbda" host="localhost" Sep 9 05:38:13.232363 containerd[1587]: 2025-09-09 05:38:13.214 [INFO][4739] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 05:38:13.232363 containerd[1587]: 2025-09-09 05:38:13.214 [INFO][4739] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="0601883c74ed1218e1c294ec71e0a2e5fb837a4882b92d84ac414b17a0bbdbda" HandleID="k8s-pod-network.0601883c74ed1218e1c294ec71e0a2e5fb837a4882b92d84ac414b17a0bbdbda" Workload="localhost-k8s-coredns--7c65d6cfc9--tpcjd-eth0" Sep 9 05:38:13.233004 containerd[1587]: 2025-09-09 05:38:13.218 [INFO][4718] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0601883c74ed1218e1c294ec71e0a2e5fb837a4882b92d84ac414b17a0bbdbda" Namespace="kube-system" Pod="coredns-7c65d6cfc9-tpcjd" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--tpcjd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--tpcjd-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"f552aed4-9ee6-4d96-b587-80305d432ef7", ResourceVersion:"816", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 37, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7c65d6cfc9-tpcjd", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3bd265e66f7", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:38:13.233004 containerd[1587]: 2025-09-09 05:38:13.218 [INFO][4718] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="0601883c74ed1218e1c294ec71e0a2e5fb837a4882b92d84ac414b17a0bbdbda" Namespace="kube-system" Pod="coredns-7c65d6cfc9-tpcjd" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--tpcjd-eth0" Sep 9 05:38:13.233004 containerd[1587]: 2025-09-09 05:38:13.218 [INFO][4718] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3bd265e66f7 ContainerID="0601883c74ed1218e1c294ec71e0a2e5fb837a4882b92d84ac414b17a0bbdbda" Namespace="kube-system" Pod="coredns-7c65d6cfc9-tpcjd" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--tpcjd-eth0" Sep 9 05:38:13.233004 containerd[1587]: 2025-09-09 05:38:13.221 [INFO][4718] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0601883c74ed1218e1c294ec71e0a2e5fb837a4882b92d84ac414b17a0bbdbda" Namespace="kube-system" Pod="coredns-7c65d6cfc9-tpcjd" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--tpcjd-eth0" Sep 9 05:38:13.233004 
containerd[1587]: 2025-09-09 05:38:13.221 [INFO][4718] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0601883c74ed1218e1c294ec71e0a2e5fb837a4882b92d84ac414b17a0bbdbda" Namespace="kube-system" Pod="coredns-7c65d6cfc9-tpcjd" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--tpcjd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--tpcjd-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"f552aed4-9ee6-4d96-b587-80305d432ef7", ResourceVersion:"816", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 37, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"0601883c74ed1218e1c294ec71e0a2e5fb837a4882b92d84ac414b17a0bbdbda", Pod:"coredns-7c65d6cfc9-tpcjd", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3bd265e66f7", MAC:"ba:87:d7:29:83:28", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:38:13.233004 containerd[1587]: 2025-09-09 05:38:13.229 [INFO][4718] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0601883c74ed1218e1c294ec71e0a2e5fb837a4882b92d84ac414b17a0bbdbda" Namespace="kube-system" Pod="coredns-7c65d6cfc9-tpcjd" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--tpcjd-eth0" Sep 9 05:38:13.254440 containerd[1587]: time="2025-09-09T05:38:13.254375306Z" level=info msg="connecting to shim 0601883c74ed1218e1c294ec71e0a2e5fb837a4882b92d84ac414b17a0bbdbda" address="unix:///run/containerd/s/cdb8f48a47d76e0a2a4166a672357eedd63ae34353e2c519d51dc04ee66648f4" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:38:13.286005 systemd[1]: Started cri-containerd-0601883c74ed1218e1c294ec71e0a2e5fb837a4882b92d84ac414b17a0bbdbda.scope - libcontainer container 0601883c74ed1218e1c294ec71e0a2e5fb837a4882b92d84ac414b17a0bbdbda. 
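The block above shows one complete Calico CNI ADD for coredns-7c65d6cfc9-tpcjd: a request for one IPv4 address, the host-wide IPAM lock being taken, the affinity for block 192.168.88.128/26 being confirmed, 192.168.88.135/26 being claimed under handle k8s-pod-network.<sandbox id>, and the finished WorkloadEndpoint (interface cali3bd265e66f7, MAC ba:87:d7:29:83:28) written back to the datastore; the same sequence repeats below for the calico-apiserver sandbox. A minimal sketch for pulling those claims out of a saved copy of this journal follows; the regular expression is inferred from the excerpt above only, so treat it as an assumption rather than a stable Calico log format.

# Sketch: list Calico IPAM claims ("Successfully claimed IPs") found in a
# saved copy of this journal, fed on stdin. The pattern is derived from the
# lines above and is not a documented Calico log format.
import re
import sys

CLAIM = re.compile(
    r'Successfully claimed IPs: \[(?P<ips>[^\]]+)\]'
    r' block=(?P<block>\S+)'
    r' handle="k8s-pod-network\.(?P<sandbox>[0-9a-f]+)"'
)

text = sys.stdin.read()
for m in CLAIM.finditer(text):
    # e.g. 0601883c74ed  192.168.88.135/26  block=192.168.88.128/26
    print(f"{m['sandbox'][:12]}  {m['ips']}  block={m['block']}")

Run against this excerpt it would report the two claims visible here: sandbox 0601883c74ed… with 192.168.88.135/26 and sandbox 45bb33a38335… with 192.168.88.136/26, both from block 192.168.88.128/26.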
Sep 9 05:38:13.307471 systemd-resolved[1407]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 05:38:13.332535 systemd-networkd[1483]: cali745b3d8270b: Link UP Sep 9 05:38:13.333433 systemd-networkd[1483]: cali745b3d8270b: Gained carrier Sep 9 05:38:13.342513 containerd[1587]: time="2025-09-09T05:38:13.342379208Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-tpcjd,Uid:f552aed4-9ee6-4d96-b587-80305d432ef7,Namespace:kube-system,Attempt:0,} returns sandbox id \"0601883c74ed1218e1c294ec71e0a2e5fb837a4882b92d84ac414b17a0bbdbda\"" Sep 9 05:38:13.348155 containerd[1587]: 2025-09-09 05:38:13.146 [INFO][4708] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 05:38:13.348155 containerd[1587]: 2025-09-09 05:38:13.155 [INFO][4708] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--64659bc7c5--hdpgx-eth0 calico-apiserver-64659bc7c5- calico-apiserver e24812f7-a033-4586-8f78-dd439cb8f791 819 0 2025-09-09 05:37:47 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:64659bc7c5 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-64659bc7c5-hdpgx eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali745b3d8270b [] [] }} ContainerID="45bb33a3833552e1fc18f097a4e49b4e80d002f64db814ed5b51da8f85efcbae" Namespace="calico-apiserver" Pod="calico-apiserver-64659bc7c5-hdpgx" WorkloadEndpoint="localhost-k8s-calico--apiserver--64659bc7c5--hdpgx-" Sep 9 05:38:13.348155 containerd[1587]: 2025-09-09 05:38:13.155 [INFO][4708] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="45bb33a3833552e1fc18f097a4e49b4e80d002f64db814ed5b51da8f85efcbae" Namespace="calico-apiserver" Pod="calico-apiserver-64659bc7c5-hdpgx" WorkloadEndpoint="localhost-k8s-calico--apiserver--64659bc7c5--hdpgx-eth0" Sep 9 05:38:13.348155 containerd[1587]: 2025-09-09 05:38:13.184 [INFO][4737] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="45bb33a3833552e1fc18f097a4e49b4e80d002f64db814ed5b51da8f85efcbae" HandleID="k8s-pod-network.45bb33a3833552e1fc18f097a4e49b4e80d002f64db814ed5b51da8f85efcbae" Workload="localhost-k8s-calico--apiserver--64659bc7c5--hdpgx-eth0" Sep 9 05:38:13.348155 containerd[1587]: 2025-09-09 05:38:13.184 [INFO][4737] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="45bb33a3833552e1fc18f097a4e49b4e80d002f64db814ed5b51da8f85efcbae" HandleID="k8s-pod-network.45bb33a3833552e1fc18f097a4e49b4e80d002f64db814ed5b51da8f85efcbae" Workload="localhost-k8s-calico--apiserver--64659bc7c5--hdpgx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000138430), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-64659bc7c5-hdpgx", "timestamp":"2025-09-09 05:38:13.184817664 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:38:13.348155 containerd[1587]: 2025-09-09 05:38:13.185 [INFO][4737] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 9 05:38:13.348155 containerd[1587]: 2025-09-09 05:38:13.214 [INFO][4737] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 05:38:13.348155 containerd[1587]: 2025-09-09 05:38:13.214 [INFO][4737] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 05:38:13.348155 containerd[1587]: 2025-09-09 05:38:13.291 [INFO][4737] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.45bb33a3833552e1fc18f097a4e49b4e80d002f64db814ed5b51da8f85efcbae" host="localhost" Sep 9 05:38:13.348155 containerd[1587]: 2025-09-09 05:38:13.298 [INFO][4737] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 05:38:13.348155 containerd[1587]: 2025-09-09 05:38:13.303 [INFO][4737] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 05:38:13.348155 containerd[1587]: 2025-09-09 05:38:13.304 [INFO][4737] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 05:38:13.348155 containerd[1587]: 2025-09-09 05:38:13.306 [INFO][4737] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 05:38:13.348155 containerd[1587]: 2025-09-09 05:38:13.306 [INFO][4737] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.45bb33a3833552e1fc18f097a4e49b4e80d002f64db814ed5b51da8f85efcbae" host="localhost" Sep 9 05:38:13.348155 containerd[1587]: 2025-09-09 05:38:13.308 [INFO][4737] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.45bb33a3833552e1fc18f097a4e49b4e80d002f64db814ed5b51da8f85efcbae Sep 9 05:38:13.348155 containerd[1587]: 2025-09-09 05:38:13.313 [INFO][4737] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.45bb33a3833552e1fc18f097a4e49b4e80d002f64db814ed5b51da8f85efcbae" host="localhost" Sep 9 05:38:13.348155 containerd[1587]: 2025-09-09 05:38:13.321 [INFO][4737] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 handle="k8s-pod-network.45bb33a3833552e1fc18f097a4e49b4e80d002f64db814ed5b51da8f85efcbae" host="localhost" Sep 9 05:38:13.348155 containerd[1587]: 2025-09-09 05:38:13.321 [INFO][4737] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.45bb33a3833552e1fc18f097a4e49b4e80d002f64db814ed5b51da8f85efcbae" host="localhost" Sep 9 05:38:13.348155 containerd[1587]: 2025-09-09 05:38:13.321 [INFO][4737] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 05:38:13.348155 containerd[1587]: 2025-09-09 05:38:13.321 [INFO][4737] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="45bb33a3833552e1fc18f097a4e49b4e80d002f64db814ed5b51da8f85efcbae" HandleID="k8s-pod-network.45bb33a3833552e1fc18f097a4e49b4e80d002f64db814ed5b51da8f85efcbae" Workload="localhost-k8s-calico--apiserver--64659bc7c5--hdpgx-eth0" Sep 9 05:38:13.349156 containerd[1587]: 2025-09-09 05:38:13.327 [INFO][4708] cni-plugin/k8s.go 418: Populated endpoint ContainerID="45bb33a3833552e1fc18f097a4e49b4e80d002f64db814ed5b51da8f85efcbae" Namespace="calico-apiserver" Pod="calico-apiserver-64659bc7c5-hdpgx" WorkloadEndpoint="localhost-k8s-calico--apiserver--64659bc7c5--hdpgx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--64659bc7c5--hdpgx-eth0", GenerateName:"calico-apiserver-64659bc7c5-", Namespace:"calico-apiserver", SelfLink:"", UID:"e24812f7-a033-4586-8f78-dd439cb8f791", ResourceVersion:"819", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 37, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"64659bc7c5", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-64659bc7c5-hdpgx", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali745b3d8270b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:38:13.349156 containerd[1587]: 2025-09-09 05:38:13.328 [INFO][4708] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="45bb33a3833552e1fc18f097a4e49b4e80d002f64db814ed5b51da8f85efcbae" Namespace="calico-apiserver" Pod="calico-apiserver-64659bc7c5-hdpgx" WorkloadEndpoint="localhost-k8s-calico--apiserver--64659bc7c5--hdpgx-eth0" Sep 9 05:38:13.349156 containerd[1587]: 2025-09-09 05:38:13.328 [INFO][4708] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali745b3d8270b ContainerID="45bb33a3833552e1fc18f097a4e49b4e80d002f64db814ed5b51da8f85efcbae" Namespace="calico-apiserver" Pod="calico-apiserver-64659bc7c5-hdpgx" WorkloadEndpoint="localhost-k8s-calico--apiserver--64659bc7c5--hdpgx-eth0" Sep 9 05:38:13.349156 containerd[1587]: 2025-09-09 05:38:13.333 [INFO][4708] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="45bb33a3833552e1fc18f097a4e49b4e80d002f64db814ed5b51da8f85efcbae" Namespace="calico-apiserver" Pod="calico-apiserver-64659bc7c5-hdpgx" WorkloadEndpoint="localhost-k8s-calico--apiserver--64659bc7c5--hdpgx-eth0" Sep 9 05:38:13.349156 containerd[1587]: 2025-09-09 05:38:13.334 [INFO][4708] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="45bb33a3833552e1fc18f097a4e49b4e80d002f64db814ed5b51da8f85efcbae" Namespace="calico-apiserver" Pod="calico-apiserver-64659bc7c5-hdpgx" WorkloadEndpoint="localhost-k8s-calico--apiserver--64659bc7c5--hdpgx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--64659bc7c5--hdpgx-eth0", GenerateName:"calico-apiserver-64659bc7c5-", Namespace:"calico-apiserver", SelfLink:"", UID:"e24812f7-a033-4586-8f78-dd439cb8f791", ResourceVersion:"819", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 37, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"64659bc7c5", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"45bb33a3833552e1fc18f097a4e49b4e80d002f64db814ed5b51da8f85efcbae", Pod:"calico-apiserver-64659bc7c5-hdpgx", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali745b3d8270b", MAC:"5e:af:4e:76:9d:15", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:38:13.349156 containerd[1587]: 2025-09-09 05:38:13.344 [INFO][4708] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="45bb33a3833552e1fc18f097a4e49b4e80d002f64db814ed5b51da8f85efcbae" Namespace="calico-apiserver" Pod="calico-apiserver-64659bc7c5-hdpgx" WorkloadEndpoint="localhost-k8s-calico--apiserver--64659bc7c5--hdpgx-eth0" Sep 9 05:38:13.349156 containerd[1587]: time="2025-09-09T05:38:13.349114248Z" level=info msg="CreateContainer within sandbox \"0601883c74ed1218e1c294ec71e0a2e5fb837a4882b92d84ac414b17a0bbdbda\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 9 05:38:13.358967 containerd[1587]: time="2025-09-09T05:38:13.358919125Z" level=info msg="Container fe5b00696bf3ea9ccd40ffe4e5382c596656a35909c22106a45ba06a3d70209b: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:38:13.365654 containerd[1587]: time="2025-09-09T05:38:13.365609482Z" level=info msg="CreateContainer within sandbox \"0601883c74ed1218e1c294ec71e0a2e5fb837a4882b92d84ac414b17a0bbdbda\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"fe5b00696bf3ea9ccd40ffe4e5382c596656a35909c22106a45ba06a3d70209b\"" Sep 9 05:38:13.366196 containerd[1587]: time="2025-09-09T05:38:13.366088281Z" level=info msg="StartContainer for \"fe5b00696bf3ea9ccd40ffe4e5382c596656a35909c22106a45ba06a3d70209b\"" Sep 9 05:38:13.367048 containerd[1587]: time="2025-09-09T05:38:13.367008720Z" level=info msg="connecting to shim fe5b00696bf3ea9ccd40ffe4e5382c596656a35909c22106a45ba06a3d70209b" address="unix:///run/containerd/s/cdb8f48a47d76e0a2a4166a672357eedd63ae34353e2c519d51dc04ee66648f4" protocol=ttrpc version=3 Sep 9 05:38:13.376962 containerd[1587]: time="2025-09-09T05:38:13.376678162Z" level=info msg="connecting to 
shim 45bb33a3833552e1fc18f097a4e49b4e80d002f64db814ed5b51da8f85efcbae" address="unix:///run/containerd/s/c1ac2ea49a78181c9a076a5fd55068cb8b0b522b1027258d517b117a755c7328" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:38:13.387745 systemd[1]: Started cri-containerd-fe5b00696bf3ea9ccd40ffe4e5382c596656a35909c22106a45ba06a3d70209b.scope - libcontainer container fe5b00696bf3ea9ccd40ffe4e5382c596656a35909c22106a45ba06a3d70209b. Sep 9 05:38:13.406812 systemd[1]: Started cri-containerd-45bb33a3833552e1fc18f097a4e49b4e80d002f64db814ed5b51da8f85efcbae.scope - libcontainer container 45bb33a3833552e1fc18f097a4e49b4e80d002f64db814ed5b51da8f85efcbae. Sep 9 05:38:13.422483 systemd-resolved[1407]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 05:38:13.437490 containerd[1587]: time="2025-09-09T05:38:13.437441667Z" level=info msg="StartContainer for \"fe5b00696bf3ea9ccd40ffe4e5382c596656a35909c22106a45ba06a3d70209b\" returns successfully" Sep 9 05:38:13.458715 containerd[1587]: time="2025-09-09T05:38:13.458663569Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64659bc7c5-hdpgx,Uid:e24812f7-a033-4586-8f78-dd439cb8f791,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"45bb33a3833552e1fc18f097a4e49b4e80d002f64db814ed5b51da8f85efcbae\"" Sep 9 05:38:13.480775 systemd-networkd[1483]: cali8671a71c8d8: Gained IPv6LL Sep 9 05:38:13.544743 systemd-networkd[1483]: cali84a765f9a61: Gained IPv6LL Sep 9 05:38:14.258156 kubelet[2731]: I0909 05:38:14.258027 2731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-tpcjd" podStartSLOduration=35.258006412 podStartE2EDuration="35.258006412s" podCreationTimestamp="2025-09-09 05:37:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 05:38:14.257191611 +0000 UTC m=+42.214580471" watchObservedRunningTime="2025-09-09 05:38:14.258006412 +0000 UTC m=+42.215395282" Sep 9 05:38:14.441738 systemd-networkd[1483]: cali53f6f46c44f: Gained IPv6LL Sep 9 05:38:14.447639 containerd[1587]: time="2025-09-09T05:38:14.447601365Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:38:14.448238 containerd[1587]: time="2025-09-09T05:38:14.448187877Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746" Sep 9 05:38:14.449501 containerd[1587]: time="2025-09-09T05:38:14.449467571Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:38:14.451701 containerd[1587]: time="2025-09-09T05:38:14.451620564Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:38:14.456647 containerd[1587]: time="2025-09-09T05:38:14.456572696Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size 
\"52770417\" in 2.620692031s" Sep 9 05:38:14.456647 containerd[1587]: time="2025-09-09T05:38:14.456644761Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\"" Sep 9 05:38:14.462085 containerd[1587]: time="2025-09-09T05:38:14.462050794Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 9 05:38:14.468716 containerd[1587]: time="2025-09-09T05:38:14.468683962Z" level=info msg="CreateContainer within sandbox \"28260eaa35133916e349f8b7faa716344b73ad726215e42df8d075406213f150\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 9 05:38:14.478621 containerd[1587]: time="2025-09-09T05:38:14.478179074Z" level=info msg="Container 9983231b7733c080b533750192e024a0d989f26b1a5338d80a49df41b275f814: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:38:14.494641 containerd[1587]: time="2025-09-09T05:38:14.494557426Z" level=info msg="CreateContainer within sandbox \"28260eaa35133916e349f8b7faa716344b73ad726215e42df8d075406213f150\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"9983231b7733c080b533750192e024a0d989f26b1a5338d80a49df41b275f814\"" Sep 9 05:38:14.495613 containerd[1587]: time="2025-09-09T05:38:14.495137926Z" level=info msg="StartContainer for \"9983231b7733c080b533750192e024a0d989f26b1a5338d80a49df41b275f814\"" Sep 9 05:38:14.496357 containerd[1587]: time="2025-09-09T05:38:14.496329233Z" level=info msg="connecting to shim 9983231b7733c080b533750192e024a0d989f26b1a5338d80a49df41b275f814" address="unix:///run/containerd/s/9dc9b9ca335b0c65dc2645616fd2949faa8a35e784f4dd01ccb578f33cfdf93e" protocol=ttrpc version=3 Sep 9 05:38:14.538835 systemd[1]: Started cri-containerd-9983231b7733c080b533750192e024a0d989f26b1a5338d80a49df41b275f814.scope - libcontainer container 9983231b7733c080b533750192e024a0d989f26b1a5338d80a49df41b275f814. Sep 9 05:38:14.568781 systemd-networkd[1483]: cali3bd265e66f7: Gained IPv6LL Sep 9 05:38:14.832516 containerd[1587]: time="2025-09-09T05:38:14.832394507Z" level=info msg="StartContainer for \"9983231b7733c080b533750192e024a0d989f26b1a5338d80a49df41b275f814\" returns successfully" Sep 9 05:38:15.261510 kubelet[2731]: I0909 05:38:15.261411 2731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-54c44569bd-mj58w" podStartSLOduration=23.636172705 podStartE2EDuration="26.261277023s" podCreationTimestamp="2025-09-09 05:37:49 +0000 UTC" firstStartedPulling="2025-09-09 05:38:11.835167625 +0000 UTC m=+39.792556485" lastFinishedPulling="2025-09-09 05:38:14.460271933 +0000 UTC m=+42.417660803" observedRunningTime="2025-09-09 05:38:15.260735386 +0000 UTC m=+43.218124256" watchObservedRunningTime="2025-09-09 05:38:15.261277023 +0000 UTC m=+43.218665883" Sep 9 05:38:15.400818 systemd-networkd[1483]: cali745b3d8270b: Gained IPv6LL Sep 9 05:38:16.376221 containerd[1587]: time="2025-09-09T05:38:16.376172254Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9983231b7733c080b533750192e024a0d989f26b1a5338d80a49df41b275f814\" id:\"92bd2c74a4ef3ce1d5c4fe14cdf3c8def4343063612c88516e3fbc7e2c905e7f\" pid:5037 exited_at:{seconds:1757396296 nanos:375853345}" Sep 9 05:38:16.648287 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3747955969.mount: Deactivated successfully. 
Sep 9 05:38:16.651634 kubelet[2731]: I0909 05:38:16.651553 2731 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 05:38:16.699058 systemd[1]: Started sshd@8-10.0.0.118:22-10.0.0.1:37724.service - OpenSSH per-connection server daemon (10.0.0.1:37724). Sep 9 05:38:16.741749 containerd[1587]: time="2025-09-09T05:38:16.741702569Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bb67576a83ed65457588b4c66281ea2b726ce381a59c3ccbff5d88e932e2ad1e\" id:\"af74ff8ce2e6193171a1a5ffac0a01c68b49df4e7bdf9c2ee09d9b609463a287\" pid:5061 exit_status:1 exited_at:{seconds:1757396296 nanos:741331843}" Sep 9 05:38:16.771577 sshd[5077]: Accepted publickey for core from 10.0.0.1 port 37724 ssh2: RSA SHA256:9+3J2aT7q2koLO1Rle2UX2pTYMxmV9eQF9r8rZDBoIg Sep 9 05:38:16.773439 sshd-session[5077]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:38:16.780219 systemd-logind[1571]: New session 9 of user core. Sep 9 05:38:16.785780 systemd[1]: Started session-9.scope - Session 9 of User core. Sep 9 05:38:16.827310 containerd[1587]: time="2025-09-09T05:38:16.827251347Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bb67576a83ed65457588b4c66281ea2b726ce381a59c3ccbff5d88e932e2ad1e\" id:\"90173dbdd7ed95f932f9c6e7724fedd11767db655f18614b877ed599a5e66ef5\" pid:5093 exit_status:1 exited_at:{seconds:1757396296 nanos:826693591}" Sep 9 05:38:16.943358 sshd[5104]: Connection closed by 10.0.0.1 port 37724 Sep 9 05:38:16.943836 sshd-session[5077]: pam_unix(sshd:session): session closed for user core Sep 9 05:38:16.948162 systemd[1]: sshd@8-10.0.0.118:22-10.0.0.1:37724.service: Deactivated successfully. Sep 9 05:38:16.950719 systemd[1]: session-9.scope: Deactivated successfully. Sep 9 05:38:16.958772 systemd-logind[1571]: Session 9 logged out. Waiting for processes to exit. Sep 9 05:38:16.960368 systemd-logind[1571]: Removed session 9. 
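The kubelet pod_startup_latency_tracker entries interleaved above (coredns-7c65d6cfc9-tpcjd and calico-kube-controllers-54c44569bd-mj58w so far, with more pods reported further down) each carry a podStartSLOduration plus a quoted podStartE2EDuration and the pull window that explains the difference. A throwaway sketch, assuming only the field layout visible in those lines, that collects the end-to-end durations and sorts them:

# Sketch: summarize kubelet "Observed pod startup duration" entries from this
# journal. Field names are taken from the excerpts here; the helper assumes the
# end-to-end duration always carries a plain "s" suffix, as it does in every
# entry shown in this dump.
import re
import sys

ENTRY = re.compile(
    r'"Observed pod startup duration" pod="(?P<pod>[^"]+)"'
    r'.*?podStartE2EDuration="(?P<e2e>[^"]+)"',
    re.S,
)

def seconds(value: str) -> float:
    # "35.258006412s" -> 35.258006412 (seconds suffix assumed, per this dump)
    return float(value.rstrip("s"))

rows = [(seconds(m["e2e"]), m["pod"]) for m in ENTRY.finditer(sys.stdin.read())]
for e2e, pod in sorted(rows, reverse=True):
    print(f"{e2e:10.3f}s  {pod}")

Against this excerpt the slowest entry is calico-apiserver/calico-apiserver-64659bc7c5-hdpgx at roughly 37.3 s and the fastest calico-system/calico-kube-controllers-54c44569bd-mj58w at roughly 26.3 s.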
Sep 9 05:38:17.602019 containerd[1587]: time="2025-09-09T05:38:17.601962469Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:38:17.602828 containerd[1587]: time="2025-09-09T05:38:17.602790173Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526" Sep 9 05:38:17.604352 containerd[1587]: time="2025-09-09T05:38:17.604305097Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:38:17.606728 containerd[1587]: time="2025-09-09T05:38:17.606690907Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:38:17.607384 containerd[1587]: time="2025-09-09T05:38:17.607342230Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 3.145260398s" Sep 9 05:38:17.607384 containerd[1587]: time="2025-09-09T05:38:17.607380712Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\"" Sep 9 05:38:17.615020 containerd[1587]: time="2025-09-09T05:38:17.614974981Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 9 05:38:17.617467 containerd[1587]: time="2025-09-09T05:38:17.617439929Z" level=info msg="CreateContainer within sandbox \"841d6051acc7c49041d9ebfa2198a8fdd6016def34129430fbdb86430069a656\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 9 05:38:17.627821 containerd[1587]: time="2025-09-09T05:38:17.627775776Z" level=info msg="Container 1c6851ad18298bb7faca80ddf7674b05c6b08cd49e49f944ac633323935999dd: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:38:17.640277 containerd[1587]: time="2025-09-09T05:38:17.640210663Z" level=info msg="CreateContainer within sandbox \"841d6051acc7c49041d9ebfa2198a8fdd6016def34129430fbdb86430069a656\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"1c6851ad18298bb7faca80ddf7674b05c6b08cd49e49f944ac633323935999dd\"" Sep 9 05:38:17.640971 containerd[1587]: time="2025-09-09T05:38:17.640883406Z" level=info msg="StartContainer for \"1c6851ad18298bb7faca80ddf7674b05c6b08cd49e49f944ac633323935999dd\"" Sep 9 05:38:17.642055 containerd[1587]: time="2025-09-09T05:38:17.642027384Z" level=info msg="connecting to shim 1c6851ad18298bb7faca80ddf7674b05c6b08cd49e49f944ac633323935999dd" address="unix:///run/containerd/s/465fd37f01ad37b4be23a5cdb050a1fc002312d77d5c3ee86b018767aa132f18" protocol=ttrpc version=3 Sep 9 05:38:17.664895 systemd[1]: Started cri-containerd-1c6851ad18298bb7faca80ddf7674b05c6b08cd49e49f944ac633323935999dd.scope - libcontainer container 1c6851ad18298bb7faca80ddf7674b05c6b08cd49e49f944ac633323935999dd. 
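Every image pull in this section ends with a containerd line of the form Pulled image \"<ref>\" ... in <duration>: 2.620692031s for kube-controllers earlier, 3.145260398s for goldmane just above, and several more below, one of them reported in milliseconds. A small sketch, tied to exactly that wording and to the \"-escaped quoting used in this dump, that tabulates the durations:

# Sketch: collect containerd image pull durations from this journal dump.
# Quotes inside the containerd message are rendered as \" in this dump; that
# escaping, and the "in <duration>" wording, are assumptions about this
# excerpt rather than a stable containerd format.
import re
import sys

PULLED = re.compile(
    r'Pulled image \\"(?P<image>[^\\"]+)\\".*? in (?P<dur>[0-9.]+(?:ms|s))"',
    re.S,  # entries can wrap across lines in this dump
)

def seconds(value: str) -> float:
    return float(value[:-2]) / 1000.0 if value.endswith("ms") else float(value[:-1])

for m in PULLED.finditer(sys.stdin.read()):
    print(f"{seconds(m['dur']):8.3f}s  {m['image']}")

The 349.400387ms entry further down is the second pull of the calico/apiserver image (only 77 bytes read, per the adjacent ImageUpdate line), which is why the helper has to accept both ms and s suffixes.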
Sep 9 05:38:17.716030 containerd[1587]: time="2025-09-09T05:38:17.715991127Z" level=info msg="StartContainer for \"1c6851ad18298bb7faca80ddf7674b05c6b08cd49e49f944ac633323935999dd\" returns successfully" Sep 9 05:38:18.029580 kubelet[2731]: I0909 05:38:18.028926 2731 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 05:38:19.259039 kubelet[2731]: I0909 05:38:19.259003 2731 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 05:38:19.303398 systemd-networkd[1483]: vxlan.calico: Link UP Sep 9 05:38:19.303412 systemd-networkd[1483]: vxlan.calico: Gained carrier Sep 9 05:38:20.456747 systemd-networkd[1483]: vxlan.calico: Gained IPv6LL Sep 9 05:38:21.197125 containerd[1587]: time="2025-09-09T05:38:21.197072970Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:38:21.197835 containerd[1587]: time="2025-09-09T05:38:21.197813320Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864" Sep 9 05:38:21.199151 containerd[1587]: time="2025-09-09T05:38:21.199118479Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:38:21.201039 containerd[1587]: time="2025-09-09T05:38:21.201012254Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:38:21.201835 containerd[1587]: time="2025-09-09T05:38:21.201790755Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 3.586769568s" Sep 9 05:38:21.201835 containerd[1587]: time="2025-09-09T05:38:21.201833846Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 9 05:38:21.202746 containerd[1587]: time="2025-09-09T05:38:21.202703858Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 9 05:38:21.203718 containerd[1587]: time="2025-09-09T05:38:21.203681564Z" level=info msg="CreateContainer within sandbox \"800222533b91821c2c499b44e17aa70dfec8bc602c66a48655350d973850ca68\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 9 05:38:21.211499 containerd[1587]: time="2025-09-09T05:38:21.211457740Z" level=info msg="Container e64a461d634a35adc46e74fecb84eb17cdf7f4715bc3c1d27bba82ac9ead7077: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:38:21.219389 containerd[1587]: time="2025-09-09T05:38:21.219346207Z" level=info msg="CreateContainer within sandbox \"800222533b91821c2c499b44e17aa70dfec8bc602c66a48655350d973850ca68\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"e64a461d634a35adc46e74fecb84eb17cdf7f4715bc3c1d27bba82ac9ead7077\"" Sep 9 05:38:21.220771 containerd[1587]: time="2025-09-09T05:38:21.219852136Z" level=info msg="StartContainer for \"e64a461d634a35adc46e74fecb84eb17cdf7f4715bc3c1d27bba82ac9ead7077\"" Sep 9 05:38:21.221045 containerd[1587]: 
time="2025-09-09T05:38:21.221022072Z" level=info msg="connecting to shim e64a461d634a35adc46e74fecb84eb17cdf7f4715bc3c1d27bba82ac9ead7077" address="unix:///run/containerd/s/6132f112f6f827af5417763a82cf7e33a56b3840ee0d0e81e3b0b8f4db835475" protocol=ttrpc version=3 Sep 9 05:38:21.266867 systemd[1]: Started cri-containerd-e64a461d634a35adc46e74fecb84eb17cdf7f4715bc3c1d27bba82ac9ead7077.scope - libcontainer container e64a461d634a35adc46e74fecb84eb17cdf7f4715bc3c1d27bba82ac9ead7077. Sep 9 05:38:21.525330 containerd[1587]: time="2025-09-09T05:38:21.524824326Z" level=info msg="StartContainer for \"e64a461d634a35adc46e74fecb84eb17cdf7f4715bc3c1d27bba82ac9ead7077\" returns successfully" Sep 9 05:38:21.970770 systemd[1]: Started sshd@9-10.0.0.118:22-10.0.0.1:59592.service - OpenSSH per-connection server daemon (10.0.0.1:59592). Sep 9 05:38:22.026415 sshd[5381]: Accepted publickey for core from 10.0.0.1 port 59592 ssh2: RSA SHA256:9+3J2aT7q2koLO1Rle2UX2pTYMxmV9eQF9r8rZDBoIg Sep 9 05:38:22.028304 sshd-session[5381]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:38:22.032904 systemd-logind[1571]: New session 10 of user core. Sep 9 05:38:22.039733 systemd[1]: Started session-10.scope - Session 10 of User core. Sep 9 05:38:22.238951 sshd[5385]: Connection closed by 10.0.0.1 port 59592 Sep 9 05:38:22.239200 sshd-session[5381]: pam_unix(sshd:session): session closed for user core Sep 9 05:38:22.242891 systemd[1]: sshd@9-10.0.0.118:22-10.0.0.1:59592.service: Deactivated successfully. Sep 9 05:38:22.245094 systemd[1]: session-10.scope: Deactivated successfully. Sep 9 05:38:22.246607 systemd-logind[1571]: Session 10 logged out. Waiting for processes to exit. Sep 9 05:38:22.248258 systemd-logind[1571]: Removed session 10. Sep 9 05:38:22.645208 kubelet[2731]: I0909 05:38:22.644904 2731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-64659bc7c5-dbld7" podStartSLOduration=27.031823654 podStartE2EDuration="35.644887787s" podCreationTimestamp="2025-09-09 05:37:47 +0000 UTC" firstStartedPulling="2025-09-09 05:38:12.589477391 +0000 UTC m=+40.546866261" lastFinishedPulling="2025-09-09 05:38:21.202541524 +0000 UTC m=+49.159930394" observedRunningTime="2025-09-09 05:38:22.642716311 +0000 UTC m=+50.600105181" watchObservedRunningTime="2025-09-09 05:38:22.644887787 +0000 UTC m=+50.602276657" Sep 9 05:38:22.646179 kubelet[2731]: I0909 05:38:22.645612 2731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-7988f88666-45sx9" podStartSLOduration=27.883468777 podStartE2EDuration="33.645604882s" podCreationTimestamp="2025-09-09 05:37:49 +0000 UTC" firstStartedPulling="2025-09-09 05:38:11.852621538 +0000 UTC m=+39.810010408" lastFinishedPulling="2025-09-09 05:38:17.614757643 +0000 UTC m=+45.572146513" observedRunningTime="2025-09-09 05:38:18.268693626 +0000 UTC m=+46.226082516" watchObservedRunningTime="2025-09-09 05:38:22.645604882 +0000 UTC m=+50.602993752" Sep 9 05:38:22.766891 kubelet[2731]: I0909 05:38:22.766834 2731 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 05:38:22.922108 containerd[1587]: time="2025-09-09T05:38:22.921969502Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1c6851ad18298bb7faca80ddf7674b05c6b08cd49e49f944ac633323935999dd\" id:\"43884264a1d9b41817f8b2ae5e48c62d7eacb26bb66a5162c714d6df35e7f117\" pid:5412 exit_status:1 exited_at:{seconds:1757396302 nanos:920949057}" Sep 9 05:38:23.017915 containerd[1587]: 
time="2025-09-09T05:38:23.017874501Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1c6851ad18298bb7faca80ddf7674b05c6b08cd49e49f944ac633323935999dd\" id:\"13d6b3d697e1febd8f9621e44aca0b29ba0405fe75c0c184a1edf102f3da94ab\" pid:5439 exit_status:1 exited_at:{seconds:1757396303 nanos:17527900}" Sep 9 05:38:23.086421 containerd[1587]: time="2025-09-09T05:38:23.086373998Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:38:23.087312 containerd[1587]: time="2025-09-09T05:38:23.087293894Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527" Sep 9 05:38:23.088550 containerd[1587]: time="2025-09-09T05:38:23.088497874Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:38:23.090905 containerd[1587]: time="2025-09-09T05:38:23.090859576Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:38:23.091603 containerd[1587]: time="2025-09-09T05:38:23.091514245Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 1.888774098s" Sep 9 05:38:23.091603 containerd[1587]: time="2025-09-09T05:38:23.091540654Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\"" Sep 9 05:38:23.099138 containerd[1587]: time="2025-09-09T05:38:23.099039688Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 9 05:38:23.101217 containerd[1587]: time="2025-09-09T05:38:23.101177751Z" level=info msg="CreateContainer within sandbox \"5c9b76e29549ddc49202cb160f26d1833ac330679f1609b15f0376126a32ea86\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 9 05:38:23.211644 containerd[1587]: time="2025-09-09T05:38:23.211496361Z" level=info msg="Container 1cdc847deb88bfa4dd34d294d7c1f3370a5bf562b665977eba74e7febb5d8f61: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:38:23.231520 containerd[1587]: time="2025-09-09T05:38:23.231468353Z" level=info msg="CreateContainer within sandbox \"5c9b76e29549ddc49202cb160f26d1833ac330679f1609b15f0376126a32ea86\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"1cdc847deb88bfa4dd34d294d7c1f3370a5bf562b665977eba74e7febb5d8f61\"" Sep 9 05:38:23.233524 containerd[1587]: time="2025-09-09T05:38:23.233492142Z" level=info msg="StartContainer for \"1cdc847deb88bfa4dd34d294d7c1f3370a5bf562b665977eba74e7febb5d8f61\"" Sep 9 05:38:23.237516 containerd[1587]: time="2025-09-09T05:38:23.236054140Z" level=info msg="connecting to shim 1cdc847deb88bfa4dd34d294d7c1f3370a5bf562b665977eba74e7febb5d8f61" address="unix:///run/containerd/s/7f841a7bc18e0150695c633a7fc79bf59b4aebc0800a3478cb87df7fd088c0f3" protocol=ttrpc version=3 Sep 9 05:38:23.263730 systemd[1]: Started cri-containerd-1cdc847deb88bfa4dd34d294d7c1f3370a5bf562b665977eba74e7febb5d8f61.scope - libcontainer container 
1cdc847deb88bfa4dd34d294d7c1f3370a5bf562b665977eba74e7febb5d8f61. Sep 9 05:38:23.276906 kubelet[2731]: I0909 05:38:23.276869 2731 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 05:38:23.397113 containerd[1587]: time="2025-09-09T05:38:23.397065989Z" level=info msg="StartContainer for \"1cdc847deb88bfa4dd34d294d7c1f3370a5bf562b665977eba74e7febb5d8f61\" returns successfully" Sep 9 05:38:23.446335 containerd[1587]: time="2025-09-09T05:38:23.446279261Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:38:23.446989 containerd[1587]: time="2025-09-09T05:38:23.446936163Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 9 05:38:23.448514 containerd[1587]: time="2025-09-09T05:38:23.448474360Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 349.400387ms" Sep 9 05:38:23.448514 containerd[1587]: time="2025-09-09T05:38:23.448509285Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 9 05:38:23.450183 containerd[1587]: time="2025-09-09T05:38:23.449337961Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 9 05:38:23.450262 containerd[1587]: time="2025-09-09T05:38:23.450228011Z" level=info msg="CreateContainer within sandbox \"45bb33a3833552e1fc18f097a4e49b4e80d002f64db814ed5b51da8f85efcbae\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 9 05:38:23.459829 containerd[1587]: time="2025-09-09T05:38:23.459791740Z" level=info msg="Container 2734f716e4b6ac4a1e9c7f59f797305cbe8567012e1fc045be053552fa6f362f: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:38:23.469109 containerd[1587]: time="2025-09-09T05:38:23.469027372Z" level=info msg="CreateContainer within sandbox \"45bb33a3833552e1fc18f097a4e49b4e80d002f64db814ed5b51da8f85efcbae\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"2734f716e4b6ac4a1e9c7f59f797305cbe8567012e1fc045be053552fa6f362f\"" Sep 9 05:38:23.469664 containerd[1587]: time="2025-09-09T05:38:23.469497415Z" level=info msg="StartContainer for \"2734f716e4b6ac4a1e9c7f59f797305cbe8567012e1fc045be053552fa6f362f\"" Sep 9 05:38:23.470426 containerd[1587]: time="2025-09-09T05:38:23.470402694Z" level=info msg="connecting to shim 2734f716e4b6ac4a1e9c7f59f797305cbe8567012e1fc045be053552fa6f362f" address="unix:///run/containerd/s/c1ac2ea49a78181c9a076a5fd55068cb8b0b522b1027258d517b117a755c7328" protocol=ttrpc version=3 Sep 9 05:38:23.490810 systemd[1]: Started cri-containerd-2734f716e4b6ac4a1e9c7f59f797305cbe8567012e1fc045be053552fa6f362f.scope - libcontainer container 2734f716e4b6ac4a1e9c7f59f797305cbe8567012e1fc045be053552fa6f362f. 
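The connecting to shim lines scattered through this section each name a unix socket under /run/containerd/s/, and read together they show one shim per pod: container fe5b0069… earlier reuses the socket of its coredns sandbox 0601883c… (…/cdb8f48a…), and 2734f716… just above reuses the socket of apiserver sandbox 45bb33a3… (…/c1ac2ea4…). A sketch that makes the grouping explicit, relying only on the wording of those lines as they appear in this dump:

# Sketch: group container/sandbox ids by the containerd shim socket they use,
# based on the 'connecting to shim <id>" address="unix://...' wording in this
# journal. Treat the format as an assumption, not a stable interface.
import re
import sys
from collections import defaultdict

SHIM = re.compile(
    r'connecting to\s+shim (?P<id>[0-9a-f]+)" address="unix://(?P<socket>[^"]+)"'
)

by_socket = defaultdict(list)
for m in SHIM.finditer(sys.stdin.read()):
    by_socket[m["socket"]].append(m["id"][:12])
for socket, ids in by_socket.items():
    print(f"{socket}: {', '.join(ids)}")

Sockets that list only a single id here (for example …/465fd37f… for the goldmane container 1c6851ad…) simply mean the matching sandbox connection happened earlier in the journal than this excerpt.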
Sep 9 05:38:23.534495 containerd[1587]: time="2025-09-09T05:38:23.534457909Z" level=info msg="StartContainer for \"2734f716e4b6ac4a1e9c7f59f797305cbe8567012e1fc045be053552fa6f362f\" returns successfully" Sep 9 05:38:24.310557 kubelet[2731]: I0909 05:38:24.310501 2731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-64659bc7c5-hdpgx" podStartSLOduration=27.321292416 podStartE2EDuration="37.310485206s" podCreationTimestamp="2025-09-09 05:37:47 +0000 UTC" firstStartedPulling="2025-09-09 05:38:13.459945126 +0000 UTC m=+41.417333996" lastFinishedPulling="2025-09-09 05:38:23.449137916 +0000 UTC m=+51.406526786" observedRunningTime="2025-09-09 05:38:24.310116844 +0000 UTC m=+52.267505714" watchObservedRunningTime="2025-09-09 05:38:24.310485206 +0000 UTC m=+52.267874066" Sep 9 05:38:25.108841 containerd[1587]: time="2025-09-09T05:38:25.108792140Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:38:25.109505 containerd[1587]: time="2025-09-09T05:38:25.109479289Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542" Sep 9 05:38:25.110671 containerd[1587]: time="2025-09-09T05:38:25.110628255Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:38:25.112509 containerd[1587]: time="2025-09-09T05:38:25.112477356Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:38:25.112939 containerd[1587]: time="2025-09-09T05:38:25.112915166Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 1.663357985s" Sep 9 05:38:25.112972 containerd[1587]: time="2025-09-09T05:38:25.112940133Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\"" Sep 9 05:38:25.118898 containerd[1587]: time="2025-09-09T05:38:25.118868368Z" level=info msg="CreateContainer within sandbox \"5c9b76e29549ddc49202cb160f26d1833ac330679f1609b15f0376126a32ea86\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 9 05:38:25.127014 containerd[1587]: time="2025-09-09T05:38:25.126972526Z" level=info msg="Container 3a84077f35a732f68020777bec1cee15729603e4cb3a04884c3609481640d2a3: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:38:25.135562 containerd[1587]: time="2025-09-09T05:38:25.135526778Z" level=info msg="CreateContainer within sandbox \"5c9b76e29549ddc49202cb160f26d1833ac330679f1609b15f0376126a32ea86\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"3a84077f35a732f68020777bec1cee15729603e4cb3a04884c3609481640d2a3\"" Sep 9 05:38:25.135971 containerd[1587]: time="2025-09-09T05:38:25.135949141Z" level=info msg="StartContainer for 
\"3a84077f35a732f68020777bec1cee15729603e4cb3a04884c3609481640d2a3\"" Sep 9 05:38:25.137444 containerd[1587]: time="2025-09-09T05:38:25.137419099Z" level=info msg="connecting to shim 3a84077f35a732f68020777bec1cee15729603e4cb3a04884c3609481640d2a3" address="unix:///run/containerd/s/7f841a7bc18e0150695c633a7fc79bf59b4aebc0800a3478cb87df7fd088c0f3" protocol=ttrpc version=3 Sep 9 05:38:25.160722 systemd[1]: Started cri-containerd-3a84077f35a732f68020777bec1cee15729603e4cb3a04884c3609481640d2a3.scope - libcontainer container 3a84077f35a732f68020777bec1cee15729603e4cb3a04884c3609481640d2a3. Sep 9 05:38:25.205801 containerd[1587]: time="2025-09-09T05:38:25.205767958Z" level=info msg="StartContainer for \"3a84077f35a732f68020777bec1cee15729603e4cb3a04884c3609481640d2a3\" returns successfully" Sep 9 05:38:25.314285 kubelet[2731]: I0909 05:38:25.314213 2731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-fr4sd" podStartSLOduration=23.892611883 podStartE2EDuration="36.314194826s" podCreationTimestamp="2025-09-09 05:37:49 +0000 UTC" firstStartedPulling="2025-09-09 05:38:12.692508271 +0000 UTC m=+40.649897141" lastFinishedPulling="2025-09-09 05:38:25.114091214 +0000 UTC m=+53.071480084" observedRunningTime="2025-09-09 05:38:25.304059687 +0000 UTC m=+53.261448557" watchObservedRunningTime="2025-09-09 05:38:25.314194826 +0000 UTC m=+53.271583696" Sep 9 05:38:25.496382 containerd[1587]: time="2025-09-09T05:38:25.496339236Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9983231b7733c080b533750192e024a0d989f26b1a5338d80a49df41b275f814\" id:\"e6bb4a50b7a58981438a0bb571931fda051049cffa5b96319e67c7ae292dfade\" pid:5579 exited_at:{seconds:1757396305 nanos:496133801}" Sep 9 05:38:26.175708 kubelet[2731]: I0909 05:38:26.175668 2731 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 9 05:38:26.175708 kubelet[2731]: I0909 05:38:26.175702 2731 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 9 05:38:27.252386 systemd[1]: Started sshd@10-10.0.0.118:22-10.0.0.1:59606.service - OpenSSH per-connection server daemon (10.0.0.1:59606). Sep 9 05:38:27.321145 sshd[5592]: Accepted publickey for core from 10.0.0.1 port 59606 ssh2: RSA SHA256:9+3J2aT7q2koLO1Rle2UX2pTYMxmV9eQF9r8rZDBoIg Sep 9 05:38:27.322825 sshd-session[5592]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:38:27.326960 systemd-logind[1571]: New session 11 of user core. Sep 9 05:38:27.336721 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 9 05:38:27.468915 sshd[5596]: Connection closed by 10.0.0.1 port 59606 Sep 9 05:38:27.469275 sshd-session[5592]: pam_unix(sshd:session): session closed for user core Sep 9 05:38:27.483314 systemd[1]: sshd@10-10.0.0.118:22-10.0.0.1:59606.service: Deactivated successfully. Sep 9 05:38:27.485715 systemd[1]: session-11.scope: Deactivated successfully. Sep 9 05:38:27.486559 systemd-logind[1571]: Session 11 logged out. Waiting for processes to exit. Sep 9 05:38:27.490934 systemd[1]: Started sshd@11-10.0.0.118:22-10.0.0.1:59620.service - OpenSSH per-connection server daemon (10.0.0.1:59620). Sep 9 05:38:27.491552 systemd-logind[1571]: Removed session 11. 
Sep 9 05:38:27.544101 sshd[5611]: Accepted publickey for core from 10.0.0.1 port 59620 ssh2: RSA SHA256:9+3J2aT7q2koLO1Rle2UX2pTYMxmV9eQF9r8rZDBoIg Sep 9 05:38:27.545855 sshd-session[5611]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:38:27.550836 systemd-logind[1571]: New session 12 of user core. Sep 9 05:38:27.564750 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 9 05:38:27.711712 sshd[5614]: Connection closed by 10.0.0.1 port 59620 Sep 9 05:38:27.712860 sshd-session[5611]: pam_unix(sshd:session): session closed for user core Sep 9 05:38:27.725327 systemd[1]: sshd@11-10.0.0.118:22-10.0.0.1:59620.service: Deactivated successfully. Sep 9 05:38:27.728923 systemd[1]: session-12.scope: Deactivated successfully. Sep 9 05:38:27.731463 systemd-logind[1571]: Session 12 logged out. Waiting for processes to exit. Sep 9 05:38:27.735171 systemd[1]: Started sshd@12-10.0.0.118:22-10.0.0.1:59630.service - OpenSSH per-connection server daemon (10.0.0.1:59630). Sep 9 05:38:27.736472 systemd-logind[1571]: Removed session 12. Sep 9 05:38:27.785362 sshd[5625]: Accepted publickey for core from 10.0.0.1 port 59630 ssh2: RSA SHA256:9+3J2aT7q2koLO1Rle2UX2pTYMxmV9eQF9r8rZDBoIg Sep 9 05:38:27.787339 sshd-session[5625]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:38:27.791601 systemd-logind[1571]: New session 13 of user core. Sep 9 05:38:27.804737 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 9 05:38:27.920919 sshd[5628]: Connection closed by 10.0.0.1 port 59630 Sep 9 05:38:27.921321 sshd-session[5625]: pam_unix(sshd:session): session closed for user core Sep 9 05:38:27.925896 systemd[1]: sshd@12-10.0.0.118:22-10.0.0.1:59630.service: Deactivated successfully. Sep 9 05:38:27.927985 systemd[1]: session-13.scope: Deactivated successfully. Sep 9 05:38:27.928798 systemd-logind[1571]: Session 13 logged out. Waiting for processes to exit. Sep 9 05:38:27.929979 systemd-logind[1571]: Removed session 13. Sep 9 05:38:32.935406 systemd[1]: Started sshd@13-10.0.0.118:22-10.0.0.1:42094.service - OpenSSH per-connection server daemon (10.0.0.1:42094). Sep 9 05:38:32.991738 sshd[5657]: Accepted publickey for core from 10.0.0.1 port 42094 ssh2: RSA SHA256:9+3J2aT7q2koLO1Rle2UX2pTYMxmV9eQF9r8rZDBoIg Sep 9 05:38:32.993168 sshd-session[5657]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:38:32.997501 systemd-logind[1571]: New session 14 of user core. Sep 9 05:38:33.004718 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 9 05:38:33.132039 sshd[5660]: Connection closed by 10.0.0.1 port 42094 Sep 9 05:38:33.132386 sshd-session[5657]: pam_unix(sshd:session): session closed for user core Sep 9 05:38:33.136278 systemd[1]: sshd@13-10.0.0.118:22-10.0.0.1:42094.service: Deactivated successfully. Sep 9 05:38:33.138414 systemd[1]: session-14.scope: Deactivated successfully. Sep 9 05:38:33.139293 systemd-logind[1571]: Session 14 logged out. Waiting for processes to exit. Sep 9 05:38:33.140416 systemd-logind[1571]: Removed session 14. 
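Every SSH episode in this section (sessions 9 through 22) follows the same shape: sshd accepts a publickey for core, pam_unix opens the session, systemd-logind registers session N while systemd starts session-N.scope, and a few entries later the connection closes and the scope is deactivated. A quick sketch that pairs the New session / Removed session events to get per-session wall-clock durations; the timestamp layout (month, day, time with microseconds, no year) is simply what this dump uses, so the parsing is tied to it:

# Sketch: pair systemd-logind 'New session' / 'Removed session' events from
# this journal and print how long each session stayed registered. The
# timestamp format is copied from this dump (no year field).
import re
import sys
from datetime import datetime

EVENT = re.compile(
    r'(?P<ts>[A-Z][a-z]{2} +\d{1,2} \d{2}:\d{2}:\d{2}\.\d+) '
    r'systemd-logind\[\d+\]: (?P<kind>New|Removed) session (?P<sid>\d+)'
)

opened = {}
for m in EVENT.finditer(sys.stdin.read()):
    ts = datetime.strptime(m["ts"], "%b %d %H:%M:%S.%f")  # year defaults to 1900
    if m["kind"] == "New":
        opened[m["sid"]] = ts
    elif m["sid"] in opened:
        print(f"session {m['sid']}: {(ts - opened.pop(m['sid'])).total_seconds():.3f}s")

On this excerpt most sessions last well under a second; session 19, the one whose scope reports consuming 616ms CPU time, stays open for roughly 1.8 s.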
Sep 9 05:38:37.725120 containerd[1587]: time="2025-09-09T05:38:37.725076253Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9983231b7733c080b533750192e024a0d989f26b1a5338d80a49df41b275f814\" id:\"c360ba2859ff88a013afb0d49b4f1940263a607489321cfd1f88e320ca4e962a\" pid:5685 exited_at:{seconds:1757396317 nanos:724784480}" Sep 9 05:38:38.156377 systemd[1]: Started sshd@14-10.0.0.118:22-10.0.0.1:42096.service - OpenSSH per-connection server daemon (10.0.0.1:42096). Sep 9 05:38:38.222976 sshd[5696]: Accepted publickey for core from 10.0.0.1 port 42096 ssh2: RSA SHA256:9+3J2aT7q2koLO1Rle2UX2pTYMxmV9eQF9r8rZDBoIg Sep 9 05:38:38.224644 sshd-session[5696]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:38:38.228876 systemd-logind[1571]: New session 15 of user core. Sep 9 05:38:38.240755 systemd[1]: Started session-15.scope - Session 15 of User core. Sep 9 05:38:38.355179 sshd[5700]: Connection closed by 10.0.0.1 port 42096 Sep 9 05:38:38.355501 sshd-session[5696]: pam_unix(sshd:session): session closed for user core Sep 9 05:38:38.359739 systemd[1]: sshd@14-10.0.0.118:22-10.0.0.1:42096.service: Deactivated successfully. Sep 9 05:38:38.361872 systemd[1]: session-15.scope: Deactivated successfully. Sep 9 05:38:38.362720 systemd-logind[1571]: Session 15 logged out. Waiting for processes to exit. Sep 9 05:38:38.363854 systemd-logind[1571]: Removed session 15. Sep 9 05:38:43.373798 systemd[1]: Started sshd@15-10.0.0.118:22-10.0.0.1:40314.service - OpenSSH per-connection server daemon (10.0.0.1:40314). Sep 9 05:38:43.421165 sshd[5723]: Accepted publickey for core from 10.0.0.1 port 40314 ssh2: RSA SHA256:9+3J2aT7q2koLO1Rle2UX2pTYMxmV9eQF9r8rZDBoIg Sep 9 05:38:43.422858 sshd-session[5723]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:38:43.427703 systemd-logind[1571]: New session 16 of user core. Sep 9 05:38:43.439771 systemd[1]: Started session-16.scope - Session 16 of User core. Sep 9 05:38:43.555622 sshd[5726]: Connection closed by 10.0.0.1 port 40314 Sep 9 05:38:43.556005 sshd-session[5723]: pam_unix(sshd:session): session closed for user core Sep 9 05:38:43.560992 systemd[1]: sshd@15-10.0.0.118:22-10.0.0.1:40314.service: Deactivated successfully. Sep 9 05:38:43.563393 systemd[1]: session-16.scope: Deactivated successfully. Sep 9 05:38:43.564307 systemd-logind[1571]: Session 16 logged out. Waiting for processes to exit. Sep 9 05:38:43.565768 systemd-logind[1571]: Removed session 16. Sep 9 05:38:46.723996 containerd[1587]: time="2025-09-09T05:38:46.723954050Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bb67576a83ed65457588b4c66281ea2b726ce381a59c3ccbff5d88e932e2ad1e\" id:\"2c3e6a7c123cf62b57a3fe9c3a0c78c47ddacef75396f82b24e2793355e01dfd\" pid:5750 exited_at:{seconds:1757396326 nanos:723649045}" Sep 9 05:38:48.575396 systemd[1]: Started sshd@16-10.0.0.118:22-10.0.0.1:40320.service - OpenSSH per-connection server daemon (10.0.0.1:40320). Sep 9 05:38:48.633798 sshd[5764]: Accepted publickey for core from 10.0.0.1 port 40320 ssh2: RSA SHA256:9+3J2aT7q2koLO1Rle2UX2pTYMxmV9eQF9r8rZDBoIg Sep 9 05:38:48.635751 sshd-session[5764]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:38:48.640373 systemd-logind[1571]: New session 17 of user core. Sep 9 05:38:48.651821 systemd[1]: Started session-17.scope - Session 17 of User core. 
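Interleaved with the SSH traffic, a few long-running containers (bb67576a…, 9983231b…, 1c6851ad…) keep producing TaskExit events with a fresh exec id each time; they read like short-lived processes being run inside those containers (health-check style), though the log itself does not say what launched them. Without interpreting them further, a sketch that just counts how many such exits each container_id accumulates; the \"-escaped quoting is copied from how this dump renders containerd messages and is an assumption about this excerpt:

# Sketch: count TaskExit events per container_id in this journal dump.
import re
import sys
from collections import Counter

TASK_EXIT = re.compile(
    r'TaskExit event in podsandbox handler container_id:\\"(?P<cid>[0-9a-f]+)\\"'
)

counts = Counter(m["cid"][:12] for m in TASK_EXIT.finditer(sys.stdin.read()))
for cid, n in counts.most_common():
    print(f"{n:3d}  {cid}")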
Sep 9 05:38:48.782997 sshd[5767]: Connection closed by 10.0.0.1 port 40320
Sep 9 05:38:48.783382 sshd-session[5764]: pam_unix(sshd:session): session closed for user core
Sep 9 05:38:48.792489 systemd[1]: sshd@16-10.0.0.118:22-10.0.0.1:40320.service: Deactivated successfully.
Sep 9 05:38:48.794428 systemd[1]: session-17.scope: Deactivated successfully.
Sep 9 05:38:48.795256 systemd-logind[1571]: Session 17 logged out. Waiting for processes to exit.
Sep 9 05:38:48.797815 systemd[1]: Started sshd@17-10.0.0.118:22-10.0.0.1:40336.service - OpenSSH per-connection server daemon (10.0.0.1:40336).
Sep 9 05:38:48.798781 systemd-logind[1571]: Removed session 17.
Sep 9 05:38:48.848661 sshd[5780]: Accepted publickey for core from 10.0.0.1 port 40336 ssh2: RSA SHA256:9+3J2aT7q2koLO1Rle2UX2pTYMxmV9eQF9r8rZDBoIg
Sep 9 05:38:48.850332 sshd-session[5780]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:38:48.854787 systemd-logind[1571]: New session 18 of user core.
Sep 9 05:38:48.866800 systemd[1]: Started session-18.scope - Session 18 of User core.
Sep 9 05:38:49.176465 sshd[5783]: Connection closed by 10.0.0.1 port 40336
Sep 9 05:38:49.177850 sshd-session[5780]: pam_unix(sshd:session): session closed for user core
Sep 9 05:38:49.191516 systemd[1]: sshd@17-10.0.0.118:22-10.0.0.1:40336.service: Deactivated successfully.
Sep 9 05:38:49.194687 systemd[1]: session-18.scope: Deactivated successfully.
Sep 9 05:38:49.195746 systemd-logind[1571]: Session 18 logged out. Waiting for processes to exit.
Sep 9 05:38:49.199810 systemd[1]: Started sshd@18-10.0.0.118:22-10.0.0.1:40342.service - OpenSSH per-connection server daemon (10.0.0.1:40342).
Sep 9 05:38:49.200986 systemd-logind[1571]: Removed session 18.
Sep 9 05:38:49.265902 sshd[5794]: Accepted publickey for core from 10.0.0.1 port 40342 ssh2: RSA SHA256:9+3J2aT7q2koLO1Rle2UX2pTYMxmV9eQF9r8rZDBoIg
Sep 9 05:38:49.267284 sshd-session[5794]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:38:49.271524 systemd-logind[1571]: New session 19 of user core.
Sep 9 05:38:49.285711 systemd[1]: Started session-19.scope - Session 19 of User core.
Sep 9 05:38:50.017624 containerd[1587]: time="2025-09-09T05:38:50.017560840Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1c6851ad18298bb7faca80ddf7674b05c6b08cd49e49f944ac633323935999dd\" id:\"07d7808e7134d2663482232976b7f9bd1a4077e4206d16dbb02a052196a9e1c8\" pid:5821 exited_at:{seconds:1757396330 nanos:17074931}"
Sep 9 05:38:51.049432 sshd[5797]: Connection closed by 10.0.0.1 port 40342
Sep 9 05:38:51.050655 sshd-session[5794]: pam_unix(sshd:session): session closed for user core
Sep 9 05:38:51.063889 systemd[1]: sshd@18-10.0.0.118:22-10.0.0.1:40342.service: Deactivated successfully.
Sep 9 05:38:51.067205 systemd[1]: session-19.scope: Deactivated successfully.
Sep 9 05:38:51.067769 systemd[1]: session-19.scope: Consumed 616ms CPU time, 74.4M memory peak.
Sep 9 05:38:51.069500 systemd-logind[1571]: Session 19 logged out. Waiting for processes to exit.
Sep 9 05:38:51.074981 systemd[1]: Started sshd@19-10.0.0.118:22-10.0.0.1:43106.service - OpenSSH per-connection server daemon (10.0.0.1:43106).
Sep 9 05:38:51.077022 systemd-logind[1571]: Removed session 19.
Sep 9 05:38:51.128827 sshd[5841]: Accepted publickey for core from 10.0.0.1 port 43106 ssh2: RSA SHA256:9+3J2aT7q2koLO1Rle2UX2pTYMxmV9eQF9r8rZDBoIg
Sep 9 05:38:51.130663 sshd-session[5841]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:38:51.135316 systemd-logind[1571]: New session 20 of user core.
Sep 9 05:38:51.144758 systemd[1]: Started session-20.scope - Session 20 of User core.
Sep 9 05:38:51.871180 sshd[5844]: Connection closed by 10.0.0.1 port 43106
Sep 9 05:38:51.873188 sshd-session[5841]: pam_unix(sshd:session): session closed for user core
Sep 9 05:38:51.885126 systemd[1]: sshd@19-10.0.0.118:22-10.0.0.1:43106.service: Deactivated successfully.
Sep 9 05:38:51.887383 systemd[1]: session-20.scope: Deactivated successfully.
Sep 9 05:38:51.888517 systemd-logind[1571]: Session 20 logged out. Waiting for processes to exit.
Sep 9 05:38:51.892440 systemd[1]: Started sshd@20-10.0.0.118:22-10.0.0.1:43118.service - OpenSSH per-connection server daemon (10.0.0.1:43118).
Sep 9 05:38:51.893441 systemd-logind[1571]: Removed session 20.
Sep 9 05:38:51.942264 sshd[5855]: Accepted publickey for core from 10.0.0.1 port 43118 ssh2: RSA SHA256:9+3J2aT7q2koLO1Rle2UX2pTYMxmV9eQF9r8rZDBoIg
Sep 9 05:38:51.944109 sshd-session[5855]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:38:51.948498 systemd-logind[1571]: New session 21 of user core.
Sep 9 05:38:51.958785 systemd[1]: Started session-21.scope - Session 21 of User core.
Sep 9 05:38:52.072762 sshd[5858]: Connection closed by 10.0.0.1 port 43118
Sep 9 05:38:52.073101 sshd-session[5855]: pam_unix(sshd:session): session closed for user core
Sep 9 05:38:52.077071 systemd[1]: sshd@20-10.0.0.118:22-10.0.0.1:43118.service: Deactivated successfully.
Sep 9 05:38:52.079142 systemd[1]: session-21.scope: Deactivated successfully.
Sep 9 05:38:52.079954 systemd-logind[1571]: Session 21 logged out. Waiting for processes to exit.
Sep 9 05:38:52.081162 systemd-logind[1571]: Removed session 21.
Sep 9 05:38:52.849792 containerd[1587]: time="2025-09-09T05:38:52.849726013Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1c6851ad18298bb7faca80ddf7674b05c6b08cd49e49f944ac633323935999dd\" id:\"158032c110e5f43930f6d6476d871ff2593e535adc9898ddfd49ace8304fc24f\" pid:5885 exited_at:{seconds:1757396332 nanos:849345657}"
Sep 9 05:38:57.085635 systemd[1]: Started sshd@21-10.0.0.118:22-10.0.0.1:43132.service - OpenSSH per-connection server daemon (10.0.0.1:43132).
Sep 9 05:38:57.131500 sshd[5899]: Accepted publickey for core from 10.0.0.1 port 43132 ssh2: RSA SHA256:9+3J2aT7q2koLO1Rle2UX2pTYMxmV9eQF9r8rZDBoIg
Sep 9 05:38:57.133172 sshd-session[5899]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:38:57.137848 systemd-logind[1571]: New session 22 of user core.
Sep 9 05:38:57.148797 systemd[1]: Started session-22.scope - Session 22 of User core.
Sep 9 05:38:57.257142 sshd[5902]: Connection closed by 10.0.0.1 port 43132
Sep 9 05:38:57.257668 sshd-session[5899]: pam_unix(sshd:session): session closed for user core
Sep 9 05:38:57.261792 systemd[1]: sshd@21-10.0.0.118:22-10.0.0.1:43132.service: Deactivated successfully.
Sep 9 05:38:57.263801 systemd[1]: session-22.scope: Deactivated successfully.
Sep 9 05:38:57.264724 systemd-logind[1571]: Session 22 logged out. Waiting for processes to exit.
Sep 9 05:38:57.265946 systemd-logind[1571]: Removed session 22.
Sep 9 05:38:57.925061 kubelet[2731]: I0909 05:38:57.925017 2731 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 9 05:39:02.274446 systemd[1]: Started sshd@22-10.0.0.118:22-10.0.0.1:34684.service - OpenSSH per-connection server daemon (10.0.0.1:34684).
Sep 9 05:39:02.338504 sshd[5928]: Accepted publickey for core from 10.0.0.1 port 34684 ssh2: RSA SHA256:9+3J2aT7q2koLO1Rle2UX2pTYMxmV9eQF9r8rZDBoIg
Sep 9 05:39:02.340346 sshd-session[5928]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:39:02.344846 systemd-logind[1571]: New session 23 of user core.
Sep 9 05:39:02.353729 systemd[1]: Started session-23.scope - Session 23 of User core.
Sep 9 05:39:02.475047 sshd[5931]: Connection closed by 10.0.0.1 port 34684
Sep 9 05:39:02.475517 sshd-session[5928]: pam_unix(sshd:session): session closed for user core
Sep 9 05:39:02.479908 systemd[1]: sshd@22-10.0.0.118:22-10.0.0.1:34684.service: Deactivated successfully.
Sep 9 05:39:02.482042 systemd[1]: session-23.scope: Deactivated successfully.
Sep 9 05:39:02.482903 systemd-logind[1571]: Session 23 logged out. Waiting for processes to exit.
Sep 9 05:39:02.484064 systemd-logind[1571]: Removed session 23.
Sep 9 05:39:07.491737 systemd[1]: Started sshd@23-10.0.0.118:22-10.0.0.1:34700.service - OpenSSH per-connection server daemon (10.0.0.1:34700).
Sep 9 05:39:07.570477 sshd[5945]: Accepted publickey for core from 10.0.0.1 port 34700 ssh2: RSA SHA256:9+3J2aT7q2koLO1Rle2UX2pTYMxmV9eQF9r8rZDBoIg
Sep 9 05:39:07.572151 sshd-session[5945]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:39:07.576817 systemd-logind[1571]: New session 24 of user core.
Sep 9 05:39:07.582767 systemd[1]: Started session-24.scope - Session 24 of User core.
Sep 9 05:39:07.703514 sshd[5948]: Connection closed by 10.0.0.1 port 34700
Sep 9 05:39:07.705534 sshd-session[5945]: pam_unix(sshd:session): session closed for user core
Sep 9 05:39:07.709396 systemd[1]: sshd@23-10.0.0.118:22-10.0.0.1:34700.service: Deactivated successfully.
Sep 9 05:39:07.713402 systemd[1]: session-24.scope: Deactivated successfully.
Sep 9 05:39:07.714953 systemd-logind[1571]: Session 24 logged out. Waiting for processes to exit.
Sep 9 05:39:07.716313 systemd-logind[1571]: Removed session 24.
Sep 9 05:39:07.731976 containerd[1587]: time="2025-09-09T05:39:07.731943705Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9983231b7733c080b533750192e024a0d989f26b1a5338d80a49df41b275f814\" id:\"1b9e87c78a853869ca81161f0350db682d930ac0e9c9fb077321e6165b2897c2\" pid:5968 exited_at:{seconds:1757396347 nanos:731748995}"
Sep 9 05:39:12.718694 systemd[1]: Started sshd@24-10.0.0.118:22-10.0.0.1:39888.service - OpenSSH per-connection server daemon (10.0.0.1:39888).
Sep 9 05:39:12.769063 sshd[5986]: Accepted publickey for core from 10.0.0.1 port 39888 ssh2: RSA SHA256:9+3J2aT7q2koLO1Rle2UX2pTYMxmV9eQF9r8rZDBoIg
Sep 9 05:39:12.770392 sshd-session[5986]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:39:12.775053 systemd-logind[1571]: New session 25 of user core.
Sep 9 05:39:12.783744 systemd[1]: Started session-25.scope - Session 25 of User core.
Sep 9 05:39:12.918089 sshd[5989]: Connection closed by 10.0.0.1 port 39888
Sep 9 05:39:12.918505 sshd-session[5986]: pam_unix(sshd:session): session closed for user core
Sep 9 05:39:12.925567 systemd[1]: sshd@24-10.0.0.118:22-10.0.0.1:39888.service: Deactivated successfully.
Sep 9 05:39:12.928182 systemd[1]: session-25.scope: Deactivated successfully.
Sep 9 05:39:12.928977 systemd-logind[1571]: Session 25 logged out. Waiting for processes to exit.
Sep 9 05:39:12.930772 systemd-logind[1571]: Removed session 25.