Sep 5 06:19:59.839227 kernel: Linux version 6.12.44-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Fri Sep 5 04:19:33 -00 2025 Sep 5 06:19:59.839255 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=4b2174e9c368fa97600991ce20efc370fbbf3ddfce3ea407f50212a1021bd496 Sep 5 06:19:59.839267 kernel: BIOS-provided physical RAM map: Sep 5 06:19:59.839273 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable Sep 5 06:19:59.839280 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000007fffff] usable Sep 5 06:19:59.839286 kernel: BIOS-e820: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS Sep 5 06:19:59.839296 kernel: BIOS-e820: [mem 0x0000000000808000-0x000000000080afff] usable Sep 5 06:19:59.839304 kernel: BIOS-e820: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS Sep 5 06:19:59.839313 kernel: BIOS-e820: [mem 0x000000000080c000-0x0000000000810fff] usable Sep 5 06:19:59.839324 kernel: BIOS-e820: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS Sep 5 06:19:59.839331 kernel: BIOS-e820: [mem 0x0000000000900000-0x000000009bd3efff] usable Sep 5 06:19:59.839337 kernel: BIOS-e820: [mem 0x000000009bd3f000-0x000000009bdfffff] reserved Sep 5 06:19:59.839344 kernel: BIOS-e820: [mem 0x000000009be00000-0x000000009c8ecfff] usable Sep 5 06:19:59.839351 kernel: BIOS-e820: [mem 0x000000009c8ed000-0x000000009cb6cfff] reserved Sep 5 06:19:59.839359 kernel: BIOS-e820: [mem 0x000000009cb6d000-0x000000009cb7efff] ACPI data Sep 5 06:19:59.839368 kernel: BIOS-e820: [mem 0x000000009cb7f000-0x000000009cbfefff] ACPI NVS Sep 5 06:19:59.839378 kernel: BIOS-e820: [mem 0x000000009cbff000-0x000000009ce90fff] usable Sep 5 06:19:59.839385 kernel: BIOS-e820: [mem 0x000000009ce91000-0x000000009ce94fff] reserved Sep 5 06:19:59.839393 kernel: BIOS-e820: [mem 0x000000009ce95000-0x000000009ce96fff] ACPI NVS Sep 5 06:19:59.839400 kernel: BIOS-e820: [mem 0x000000009ce97000-0x000000009cedbfff] usable Sep 5 06:19:59.839407 kernel: BIOS-e820: [mem 0x000000009cedc000-0x000000009cf5ffff] reserved Sep 5 06:19:59.839414 kernel: BIOS-e820: [mem 0x000000009cf60000-0x000000009cffffff] ACPI NVS Sep 5 06:19:59.839421 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved Sep 5 06:19:59.839428 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved Sep 5 06:19:59.839434 kernel: BIOS-e820: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved Sep 5 06:19:59.839444 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved Sep 5 06:19:59.839451 kernel: NX (Execute Disable) protection: active Sep 5 06:19:59.839458 kernel: APIC: Static calls initialized Sep 5 06:19:59.839465 kernel: e820: update [mem 0x9b320018-0x9b329c57] usable ==> usable Sep 5 06:19:59.839483 kernel: e820: update [mem 0x9b2e3018-0x9b31fe57] usable ==> usable Sep 5 06:19:59.839491 kernel: extended physical RAM map: Sep 5 06:19:59.839507 kernel: reserve setup_data: [mem 0x0000000000000000-0x000000000009ffff] usable Sep 5 06:19:59.839515 kernel: reserve setup_data: [mem 0x0000000000100000-0x00000000007fffff] usable Sep 5 06:19:59.839522 kernel: reserve setup_data: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS Sep 5 06:19:59.839529 kernel: reserve setup_data: [mem 
0x0000000000808000-0x000000000080afff] usable Sep 5 06:19:59.839536 kernel: reserve setup_data: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS Sep 5 06:19:59.839547 kernel: reserve setup_data: [mem 0x000000000080c000-0x0000000000810fff] usable Sep 5 06:19:59.839554 kernel: reserve setup_data: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS Sep 5 06:19:59.839561 kernel: reserve setup_data: [mem 0x0000000000900000-0x000000009b2e3017] usable Sep 5 06:19:59.839573 kernel: reserve setup_data: [mem 0x000000009b2e3018-0x000000009b31fe57] usable Sep 5 06:19:59.839589 kernel: reserve setup_data: [mem 0x000000009b31fe58-0x000000009b320017] usable Sep 5 06:19:59.839604 kernel: reserve setup_data: [mem 0x000000009b320018-0x000000009b329c57] usable Sep 5 06:19:59.839618 kernel: reserve setup_data: [mem 0x000000009b329c58-0x000000009bd3efff] usable Sep 5 06:19:59.839626 kernel: reserve setup_data: [mem 0x000000009bd3f000-0x000000009bdfffff] reserved Sep 5 06:19:59.839633 kernel: reserve setup_data: [mem 0x000000009be00000-0x000000009c8ecfff] usable Sep 5 06:19:59.839641 kernel: reserve setup_data: [mem 0x000000009c8ed000-0x000000009cb6cfff] reserved Sep 5 06:19:59.839648 kernel: reserve setup_data: [mem 0x000000009cb6d000-0x000000009cb7efff] ACPI data Sep 5 06:19:59.839655 kernel: reserve setup_data: [mem 0x000000009cb7f000-0x000000009cbfefff] ACPI NVS Sep 5 06:19:59.839663 kernel: reserve setup_data: [mem 0x000000009cbff000-0x000000009ce90fff] usable Sep 5 06:19:59.839670 kernel: reserve setup_data: [mem 0x000000009ce91000-0x000000009ce94fff] reserved Sep 5 06:19:59.839677 kernel: reserve setup_data: [mem 0x000000009ce95000-0x000000009ce96fff] ACPI NVS Sep 5 06:19:59.839685 kernel: reserve setup_data: [mem 0x000000009ce97000-0x000000009cedbfff] usable Sep 5 06:19:59.839694 kernel: reserve setup_data: [mem 0x000000009cedc000-0x000000009cf5ffff] reserved Sep 5 06:19:59.839702 kernel: reserve setup_data: [mem 0x000000009cf60000-0x000000009cffffff] ACPI NVS Sep 5 06:19:59.839709 kernel: reserve setup_data: [mem 0x00000000e0000000-0x00000000efffffff] reserved Sep 5 06:19:59.839716 kernel: reserve setup_data: [mem 0x00000000feffc000-0x00000000feffffff] reserved Sep 5 06:19:59.839724 kernel: reserve setup_data: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved Sep 5 06:19:59.839731 kernel: reserve setup_data: [mem 0x000000fd00000000-0x000000ffffffffff] reserved Sep 5 06:19:59.839743 kernel: efi: EFI v2.7 by EDK II Sep 5 06:19:59.839751 kernel: efi: SMBIOS=0x9c988000 ACPI=0x9cb7e000 ACPI 2.0=0x9cb7e014 MEMATTR=0x9b9e4198 RNG=0x9cb73018 Sep 5 06:19:59.839758 kernel: random: crng init done Sep 5 06:19:59.839768 kernel: efi: Remove mem151: MMIO range=[0xffc00000-0xffffffff] (4MB) from e820 map Sep 5 06:19:59.839775 kernel: e820: remove [mem 0xffc00000-0xffffffff] reserved Sep 5 06:19:59.839787 kernel: secureboot: Secure boot disabled Sep 5 06:19:59.839795 kernel: SMBIOS 2.8 present. 
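The e820 map above is the firmware's inventory of physical address ranges: only the "usable" entries become general-purpose RAM, while "reserved", "ACPI data" and "ACPI NVS" ranges are left alone by the allocator. A minimal sketch (assuming log lines in exactly the format printed above) of how the usable total could be tallied from such a dump:

    import re

    E820_RE = re.compile(r"BIOS-e820: \[mem 0x([0-9a-f]+)-0x([0-9a-f]+)\] (.+)$")

    def usable_bytes(lines):
        # Sum the sizes of every range the firmware marks "usable";
        # e820 ranges are inclusive, hence the +1.
        total = 0
        for line in lines:
            m = E820_RE.search(line)
            if m and m.group(3).strip() == "usable":
                total += int(m.group(2), 16) - int(m.group(1), 16) + 1
        return total

    sample = [
        "BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable",
        "BIOS-e820: [mem 0x0000000000100000-0x00000000007fffff] usable",
    ]
    print(usable_bytes(sample) // 1024, "KiB")  # 640 + 7168 = 7808 KiB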
Sep 5 06:19:59.839802 kernel: DMI: QEMU Standard PC (Q35 + ICH9, 2009), BIOS unknown 02/02/2022 Sep 5 06:19:59.839839 kernel: DMI: Memory slots populated: 1/1 Sep 5 06:19:59.839847 kernel: Hypervisor detected: KVM Sep 5 06:19:59.839854 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00 Sep 5 06:19:59.839861 kernel: kvm-clock: using sched offset of 4996866344 cycles Sep 5 06:19:59.839870 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns Sep 5 06:19:59.839877 kernel: tsc: Detected 2794.748 MHz processor Sep 5 06:19:59.839885 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Sep 5 06:19:59.839892 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Sep 5 06:19:59.839903 kernel: last_pfn = 0x9cedc max_arch_pfn = 0x400000000 Sep 5 06:19:59.839911 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs Sep 5 06:19:59.839918 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Sep 5 06:19:59.839926 kernel: Using GB pages for direct mapping Sep 5 06:19:59.839933 kernel: ACPI: Early table checksum verification disabled Sep 5 06:19:59.839941 kernel: ACPI: RSDP 0x000000009CB7E014 000024 (v02 BOCHS ) Sep 5 06:19:59.839949 kernel: ACPI: XSDT 0x000000009CB7D0E8 000054 (v01 BOCHS BXPC 00000001 01000013) Sep 5 06:19:59.839966 kernel: ACPI: FACP 0x000000009CB79000 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001) Sep 5 06:19:59.839974 kernel: ACPI: DSDT 0x000000009CB7A000 0021BA (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 5 06:19:59.839984 kernel: ACPI: FACS 0x000000009CBDD000 000040 Sep 5 06:19:59.839991 kernel: ACPI: APIC 0x000000009CB78000 000090 (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 5 06:19:59.839999 kernel: ACPI: HPET 0x000000009CB77000 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 5 06:19:59.840006 kernel: ACPI: MCFG 0x000000009CB76000 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 5 06:19:59.840014 kernel: ACPI: WAET 0x000000009CB75000 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 5 06:19:59.840021 kernel: ACPI: BGRT 0x000000009CB74000 000038 (v01 INTEL EDK2 00000002 01000013) Sep 5 06:19:59.840028 kernel: ACPI: Reserving FACP table memory at [mem 0x9cb79000-0x9cb790f3] Sep 5 06:19:59.840036 kernel: ACPI: Reserving DSDT table memory at [mem 0x9cb7a000-0x9cb7c1b9] Sep 5 06:19:59.840045 kernel: ACPI: Reserving FACS table memory at [mem 0x9cbdd000-0x9cbdd03f] Sep 5 06:19:59.840053 kernel: ACPI: Reserving APIC table memory at [mem 0x9cb78000-0x9cb7808f] Sep 5 06:19:59.840060 kernel: ACPI: Reserving HPET table memory at [mem 0x9cb77000-0x9cb77037] Sep 5 06:19:59.840068 kernel: ACPI: Reserving MCFG table memory at [mem 0x9cb76000-0x9cb7603b] Sep 5 06:19:59.840075 kernel: ACPI: Reserving WAET table memory at [mem 0x9cb75000-0x9cb75027] Sep 5 06:19:59.840082 kernel: ACPI: Reserving BGRT table memory at [mem 0x9cb74000-0x9cb74037] Sep 5 06:19:59.840090 kernel: No NUMA configuration found Sep 5 06:19:59.840097 kernel: Faking a node at [mem 0x0000000000000000-0x000000009cedbfff] Sep 5 06:19:59.840104 kernel: NODE_DATA(0) allocated [mem 0x9ce36dc0-0x9ce3dfff] Sep 5 06:19:59.840112 kernel: Zone ranges: Sep 5 06:19:59.840121 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Sep 5 06:19:59.840129 kernel: DMA32 [mem 0x0000000001000000-0x000000009cedbfff] Sep 5 06:19:59.840136 kernel: Normal empty Sep 5 06:19:59.840144 kernel: Device empty Sep 5 06:19:59.840151 kernel: Movable zone start for each node Sep 5 06:19:59.840158 kernel: Early memory node ranges Sep 5 
06:19:59.840166 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff] Sep 5 06:19:59.840173 kernel: node 0: [mem 0x0000000000100000-0x00000000007fffff] Sep 5 06:19:59.840183 kernel: node 0: [mem 0x0000000000808000-0x000000000080afff] Sep 5 06:19:59.840193 kernel: node 0: [mem 0x000000000080c000-0x0000000000810fff] Sep 5 06:19:59.840200 kernel: node 0: [mem 0x0000000000900000-0x000000009bd3efff] Sep 5 06:19:59.840208 kernel: node 0: [mem 0x000000009be00000-0x000000009c8ecfff] Sep 5 06:19:59.840215 kernel: node 0: [mem 0x000000009cbff000-0x000000009ce90fff] Sep 5 06:19:59.840222 kernel: node 0: [mem 0x000000009ce97000-0x000000009cedbfff] Sep 5 06:19:59.840230 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000009cedbfff] Sep 5 06:19:59.840237 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Sep 5 06:19:59.840247 kernel: On node 0, zone DMA: 96 pages in unavailable ranges Sep 5 06:19:59.840264 kernel: On node 0, zone DMA: 8 pages in unavailable ranges Sep 5 06:19:59.840272 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Sep 5 06:19:59.840280 kernel: On node 0, zone DMA: 239 pages in unavailable ranges Sep 5 06:19:59.840288 kernel: On node 0, zone DMA32: 193 pages in unavailable ranges Sep 5 06:19:59.840298 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges Sep 5 06:19:59.840306 kernel: On node 0, zone DMA32: 6 pages in unavailable ranges Sep 5 06:19:59.840313 kernel: On node 0, zone DMA32: 12580 pages in unavailable ranges Sep 5 06:19:59.840321 kernel: ACPI: PM-Timer IO Port: 0x608 Sep 5 06:19:59.840329 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) Sep 5 06:19:59.840338 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 Sep 5 06:19:59.840346 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) Sep 5 06:19:59.840354 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) Sep 5 06:19:59.840362 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Sep 5 06:19:59.840370 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) Sep 5 06:19:59.840377 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) Sep 5 06:19:59.840385 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Sep 5 06:19:59.840393 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000 Sep 5 06:19:59.840400 kernel: TSC deadline timer available Sep 5 06:19:59.840410 kernel: CPU topo: Max. logical packages: 1 Sep 5 06:19:59.840418 kernel: CPU topo: Max. logical dies: 1 Sep 5 06:19:59.840425 kernel: CPU topo: Max. dies per package: 1 Sep 5 06:19:59.840433 kernel: CPU topo: Max. threads per core: 1 Sep 5 06:19:59.840440 kernel: CPU topo: Num. cores per package: 4 Sep 5 06:19:59.840448 kernel: CPU topo: Num. 
threads per package: 4 Sep 5 06:19:59.840456 kernel: CPU topo: Allowing 4 present CPUs plus 0 hotplug CPUs Sep 5 06:19:59.840463 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write() Sep 5 06:19:59.840471 kernel: kvm-guest: KVM setup pv remote TLB flush Sep 5 06:19:59.840479 kernel: kvm-guest: setup PV sched yield Sep 5 06:19:59.840488 kernel: [mem 0x9d000000-0xdfffffff] available for PCI devices Sep 5 06:19:59.840496 kernel: Booting paravirtualized kernel on KVM Sep 5 06:19:59.840504 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Sep 5 06:19:59.840512 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:4 nr_cpu_ids:4 nr_node_ids:1 Sep 5 06:19:59.840520 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u524288 Sep 5 06:19:59.840527 kernel: pcpu-alloc: s207832 r8192 d29736 u524288 alloc=1*2097152 Sep 5 06:19:59.840535 kernel: pcpu-alloc: [0] 0 1 2 3 Sep 5 06:19:59.840543 kernel: kvm-guest: PV spinlocks enabled Sep 5 06:19:59.840552 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear) Sep 5 06:19:59.840561 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=4b2174e9c368fa97600991ce20efc370fbbf3ddfce3ea407f50212a1021bd496 Sep 5 06:19:59.840572 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Sep 5 06:19:59.840582 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Sep 5 06:19:59.840593 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Sep 5 06:19:59.840603 kernel: Fallback order for Node 0: 0 Sep 5 06:19:59.840612 kernel: Built 1 zonelists, mobility grouping on. Total pages: 641450 Sep 5 06:19:59.840619 kernel: Policy zone: DMA32 Sep 5 06:19:59.840627 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Sep 5 06:19:59.840638 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1 Sep 5 06:19:59.840646 kernel: ftrace: allocating 40102 entries in 157 pages Sep 5 06:19:59.840653 kernel: ftrace: allocated 157 pages with 5 groups Sep 5 06:19:59.840661 kernel: Dynamic Preempt: voluntary Sep 5 06:19:59.840669 kernel: rcu: Preemptible hierarchical RCU implementation. Sep 5 06:19:59.840677 kernel: rcu: RCU event tracing is enabled. Sep 5 06:19:59.840685 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4. Sep 5 06:19:59.840693 kernel: Trampoline variant of Tasks RCU enabled. Sep 5 06:19:59.840701 kernel: Rude variant of Tasks RCU enabled. Sep 5 06:19:59.840711 kernel: Tracing variant of Tasks RCU enabled. Sep 5 06:19:59.840719 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Sep 5 06:19:59.840730 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4 Sep 5 06:19:59.840738 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. Sep 5 06:19:59.840746 kernel: RCU Tasks Rude: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. Sep 5 06:19:59.840754 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. 
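The "Kernel command line:" entry above carries the rootflags=rw mount.usrflags=ro pair twice, once prepended and once inside the original arguments; key=value parameters are read left to right, so the repetition is harmless here. A small sketch (an illustration of the key=value convention, not the kernel's actual parser) of splitting such a line into a lookup table:

    def parse_cmdline(cmdline: str) -> dict:
        # partition() leaves the value empty for bare flags without '='.
        params = {}
        for token in cmdline.split():
            key, _, value = token.partition("=")
            params[key] = value   # a later duplicate simply overwrites the earlier one
        return params

    cl = ("rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a "
          "root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected")
    print(parse_cmdline(cl)["root"])      # LABEL=ROOT
    print(parse_cmdline(cl)["console"])   # ttyS0,115200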
Sep 5 06:19:59.840762 kernel: NR_IRQS: 33024, nr_irqs: 456, preallocated irqs: 16 Sep 5 06:19:59.840769 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Sep 5 06:19:59.840779 kernel: Console: colour dummy device 80x25 Sep 5 06:19:59.840787 kernel: printk: legacy console [ttyS0] enabled Sep 5 06:19:59.840795 kernel: ACPI: Core revision 20240827 Sep 5 06:19:59.840803 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns Sep 5 06:19:59.840828 kernel: APIC: Switch to symmetric I/O mode setup Sep 5 06:19:59.840835 kernel: x2apic enabled Sep 5 06:19:59.840843 kernel: APIC: Switched APIC routing to: physical x2apic Sep 5 06:19:59.840851 kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask() Sep 5 06:19:59.840859 kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself() Sep 5 06:19:59.840867 kernel: kvm-guest: setup PV IPIs Sep 5 06:19:59.840878 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 Sep 5 06:19:59.840886 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x2848df6a9de, max_idle_ns: 440795280912 ns Sep 5 06:19:59.840894 kernel: Calibrating delay loop (skipped) preset value.. 5589.49 BogoMIPS (lpj=2794748) Sep 5 06:19:59.840902 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated Sep 5 06:19:59.840909 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127 Sep 5 06:19:59.840917 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0 Sep 5 06:19:59.840925 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Sep 5 06:19:59.840933 kernel: Spectre V2 : Mitigation: Retpolines Sep 5 06:19:59.840943 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT Sep 5 06:19:59.840951 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls Sep 5 06:19:59.840967 kernel: active return thunk: retbleed_return_thunk Sep 5 06:19:59.840974 kernel: RETBleed: Mitigation: untrained return thunk Sep 5 06:19:59.840985 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Sep 5 06:19:59.840993 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Sep 5 06:19:59.841000 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied! Sep 5 06:19:59.841009 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options. Sep 5 06:19:59.841017 kernel: active return thunk: srso_return_thunk Sep 5 06:19:59.841027 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode Sep 5 06:19:59.841035 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Sep 5 06:19:59.841043 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Sep 5 06:19:59.841051 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Sep 5 06:19:59.841059 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Sep 5 06:19:59.841067 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format. Sep 5 06:19:59.841074 kernel: Freeing SMP alternatives memory: 32K Sep 5 06:19:59.841082 kernel: pid_max: default: 32768 minimum: 301 Sep 5 06:19:59.841090 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Sep 5 06:19:59.841099 kernel: landlock: Up and running. Sep 5 06:19:59.841107 kernel: SELinux: Initializing. 
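The preset BogoMIPS above follows directly from the measured TSC frequency: lpj=2794748 is exactly the TSC rate in kHz (2794.748 MHz), and BogoMIPS = lpj * HZ / 500000. A quick cross-check, assuming this kernel ticks at CONFIG_HZ=1000 (an assumption, but the only value that reproduces the logged number):

    lpj = 2_794_748      # loops_per_jiffy from the "Calibrating delay loop" line above
    hz = 1000            # assumed CONFIG_HZ
    print(f"{lpj * hz / 500_000:.2f} BogoMIPS")
    # 5589.50 here; the kernel prints 5589.49 because it truncates instead of rounding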
Sep 5 06:19:59.841115 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Sep 5 06:19:59.841123 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Sep 5 06:19:59.841130 kernel: smpboot: CPU0: AMD EPYC 7402P 24-Core Processor (family: 0x17, model: 0x31, stepping: 0x0) Sep 5 06:19:59.841138 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver. Sep 5 06:19:59.841146 kernel: ... version: 0 Sep 5 06:19:59.841154 kernel: ... bit width: 48 Sep 5 06:19:59.841161 kernel: ... generic registers: 6 Sep 5 06:19:59.841171 kernel: ... value mask: 0000ffffffffffff Sep 5 06:19:59.841179 kernel: ... max period: 00007fffffffffff Sep 5 06:19:59.841186 kernel: ... fixed-purpose events: 0 Sep 5 06:19:59.841194 kernel: ... event mask: 000000000000003f Sep 5 06:19:59.841202 kernel: signal: max sigframe size: 1776 Sep 5 06:19:59.841209 kernel: rcu: Hierarchical SRCU implementation. Sep 5 06:19:59.841217 kernel: rcu: Max phase no-delay instances is 400. Sep 5 06:19:59.841228 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level Sep 5 06:19:59.841235 kernel: smp: Bringing up secondary CPUs ... Sep 5 06:19:59.841245 kernel: smpboot: x86: Booting SMP configuration: Sep 5 06:19:59.841253 kernel: .... node #0, CPUs: #1 #2 #3 Sep 5 06:19:59.841261 kernel: smp: Brought up 1 node, 4 CPUs Sep 5 06:19:59.841268 kernel: smpboot: Total of 4 processors activated (22357.98 BogoMIPS) Sep 5 06:19:59.841276 kernel: Memory: 2422676K/2565800K available (14336K kernel code, 2428K rwdata, 9988K rodata, 54068K init, 2900K bss, 137196K reserved, 0K cma-reserved) Sep 5 06:19:59.841284 kernel: devtmpfs: initialized Sep 5 06:19:59.841292 kernel: x86/mm: Memory block size: 128MB Sep 5 06:19:59.841300 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00800000-0x00807fff] (32768 bytes) Sep 5 06:19:59.841307 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x0080b000-0x0080bfff] (4096 bytes) Sep 5 06:19:59.841317 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00811000-0x008fffff] (978944 bytes) Sep 5 06:19:59.841325 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x9cb7f000-0x9cbfefff] (524288 bytes) Sep 5 06:19:59.841333 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x9ce95000-0x9ce96fff] (8192 bytes) Sep 5 06:19:59.841341 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x9cf60000-0x9cffffff] (655360 bytes) Sep 5 06:19:59.841349 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Sep 5 06:19:59.841357 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear) Sep 5 06:19:59.841365 kernel: pinctrl core: initialized pinctrl subsystem Sep 5 06:19:59.841373 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Sep 5 06:19:59.841390 kernel: audit: initializing netlink subsys (disabled) Sep 5 06:19:59.841403 kernel: audit: type=2000 audit(1757053196.763:1): state=initialized audit_enabled=0 res=1 Sep 5 06:19:59.841412 kernel: thermal_sys: Registered thermal governor 'step_wise' Sep 5 06:19:59.841422 kernel: thermal_sys: Registered thermal governor 'user_space' Sep 5 06:19:59.841432 kernel: cpuidle: using governor menu Sep 5 06:19:59.841465 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Sep 5 06:19:59.841477 kernel: dca service started, version 1.12.1 Sep 5 06:19:59.841487 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff] Sep 5 06:19:59.841497 kernel: PCI: Using configuration type 1 for base access Sep 
5 06:19:59.841510 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. Sep 5 06:19:59.841518 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Sep 5 06:19:59.841526 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Sep 5 06:19:59.841534 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Sep 5 06:19:59.841541 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Sep 5 06:19:59.841549 kernel: ACPI: Added _OSI(Module Device) Sep 5 06:19:59.841557 kernel: ACPI: Added _OSI(Processor Device) Sep 5 06:19:59.841565 kernel: ACPI: Added _OSI(Processor Aggregator Device) Sep 5 06:19:59.841573 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Sep 5 06:19:59.841590 kernel: ACPI: Interpreter enabled Sep 5 06:19:59.841603 kernel: ACPI: PM: (supports S0 S3 S5) Sep 5 06:19:59.841614 kernel: ACPI: Using IOAPIC for interrupt routing Sep 5 06:19:59.841622 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Sep 5 06:19:59.841630 kernel: PCI: Using E820 reservations for host bridge windows Sep 5 06:19:59.841638 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F Sep 5 06:19:59.841645 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Sep 5 06:19:59.841902 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Sep 5 06:19:59.842051 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR] Sep 5 06:19:59.842174 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability] Sep 5 06:19:59.842185 kernel: PCI host bridge to bus 0000:00 Sep 5 06:19:59.842345 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Sep 5 06:19:59.842457 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] Sep 5 06:19:59.842566 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Sep 5 06:19:59.842688 kernel: pci_bus 0000:00: root bus resource [mem 0x9d000000-0xdfffffff window] Sep 5 06:19:59.842824 kernel: pci_bus 0000:00: root bus resource [mem 0xf0000000-0xfebfffff window] Sep 5 06:19:59.842941 kernel: pci_bus 0000:00: root bus resource [mem 0x380000000000-0x3807ffffffff window] Sep 5 06:19:59.843064 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Sep 5 06:19:59.843227 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint Sep 5 06:19:59.843377 kernel: pci 0000:00:01.0: [1234:1111] type 00 class 0x030000 conventional PCI endpoint Sep 5 06:19:59.843523 kernel: pci 0000:00:01.0: BAR 0 [mem 0xc0000000-0xc0ffffff pref] Sep 5 06:19:59.843659 kernel: pci 0000:00:01.0: BAR 2 [mem 0xc1044000-0xc1044fff] Sep 5 06:19:59.843788 kernel: pci 0000:00:01.0: ROM [mem 0xffff0000-0xffffffff pref] Sep 5 06:19:59.843940 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Sep 5 06:19:59.844095 kernel: pci 0000:00:02.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint Sep 5 06:19:59.844219 kernel: pci 0000:00:02.0: BAR 0 [io 0x6100-0x611f] Sep 5 06:19:59.844341 kernel: pci 0000:00:02.0: BAR 1 [mem 0xc1043000-0xc1043fff] Sep 5 06:19:59.844462 kernel: pci 0000:00:02.0: BAR 4 [mem 0x380000000000-0x380000003fff 64bit pref] Sep 5 06:19:59.844611 kernel: pci 0000:00:03.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint Sep 5 06:19:59.844744 kernel: pci 0000:00:03.0: BAR 0 [io 0x6000-0x607f] Sep 5 06:19:59.844900 kernel: pci 0000:00:03.0: BAR 1 [mem 0xc1042000-0xc1042fff] 
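With ECAM based at 0xe0000000 (reported a few lines up), every PCI function gets a fixed 4 KiB window of configuration space at an offset derived from its bus/device/function number. A sketch of that standard ECAM address calculation, checked against the SATA controller 0000:00:1f.2 enumerated above:

    ECAM_BASE = 0xE000_0000

    def ecam_addr(bus: int, dev: int, fn: int, offset: int = 0) -> int:
        # PCIe ECAM layout: bus << 20 | device << 15 | function << 12.
        return ECAM_BASE + (bus << 20) + (dev << 15) + (fn << 12) + offset

    print(hex(ecam_addr(0x00, 0x1F, 2)))   # 0xe00fa000, config space of 0000:00:1f.2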
Sep 5 06:19:59.845060 kernel: pci 0000:00:03.0: BAR 4 [mem 0x380000004000-0x380000007fff 64bit pref] Sep 5 06:19:59.845203 kernel: pci 0000:00:04.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint Sep 5 06:19:59.845329 kernel: pci 0000:00:04.0: BAR 0 [io 0x60e0-0x60ff] Sep 5 06:19:59.845464 kernel: pci 0000:00:04.0: BAR 1 [mem 0xc1041000-0xc1041fff] Sep 5 06:19:59.845600 kernel: pci 0000:00:04.0: BAR 4 [mem 0x380000008000-0x38000000bfff 64bit pref] Sep 5 06:19:59.845733 kernel: pci 0000:00:04.0: ROM [mem 0xfffc0000-0xffffffff pref] Sep 5 06:19:59.845894 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint Sep 5 06:19:59.846029 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO Sep 5 06:19:59.846184 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint Sep 5 06:19:59.846306 kernel: pci 0000:00:1f.2: BAR 4 [io 0x60c0-0x60df] Sep 5 06:19:59.846427 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xc1040000-0xc1040fff] Sep 5 06:19:59.846572 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint Sep 5 06:19:59.846726 kernel: pci 0000:00:1f.3: BAR 4 [io 0x6080-0x60bf] Sep 5 06:19:59.846741 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 Sep 5 06:19:59.846751 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 Sep 5 06:19:59.846761 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Sep 5 06:19:59.846771 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 Sep 5 06:19:59.846781 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10 Sep 5 06:19:59.846791 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10 Sep 5 06:19:59.846806 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11 Sep 5 06:19:59.846844 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11 Sep 5 06:19:59.846854 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16 Sep 5 06:19:59.846864 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17 Sep 5 06:19:59.846875 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18 Sep 5 06:19:59.846885 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19 Sep 5 06:19:59.846896 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20 Sep 5 06:19:59.846907 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21 Sep 5 06:19:59.846918 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22 Sep 5 06:19:59.846933 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23 Sep 5 06:19:59.846944 kernel: iommu: Default domain type: Translated Sep 5 06:19:59.846966 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Sep 5 06:19:59.846977 kernel: efivars: Registered efivars operations Sep 5 06:19:59.846988 kernel: PCI: Using ACPI for IRQ routing Sep 5 06:19:59.846999 kernel: PCI: pci_cache_line_size set to 64 bytes Sep 5 06:19:59.847010 kernel: e820: reserve RAM buffer [mem 0x0080b000-0x008fffff] Sep 5 06:19:59.847019 kernel: e820: reserve RAM buffer [mem 0x00811000-0x008fffff] Sep 5 06:19:59.847030 kernel: e820: reserve RAM buffer [mem 0x9b2e3018-0x9bffffff] Sep 5 06:19:59.847044 kernel: e820: reserve RAM buffer [mem 0x9b320018-0x9bffffff] Sep 5 06:19:59.847055 kernel: e820: reserve RAM buffer [mem 0x9bd3f000-0x9bffffff] Sep 5 06:19:59.847065 kernel: e820: reserve RAM buffer [mem 0x9c8ed000-0x9fffffff] Sep 5 06:19:59.847076 kernel: e820: reserve RAM buffer [mem 0x9ce91000-0x9fffffff] Sep 5 06:19:59.847087 kernel: e820: reserve RAM buffer [mem 
0x9cedc000-0x9fffffff] Sep 5 06:19:59.847266 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device Sep 5 06:19:59.847446 kernel: pci 0000:00:01.0: vgaarb: bridge control possible Sep 5 06:19:59.847594 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Sep 5 06:19:59.847614 kernel: vgaarb: loaded Sep 5 06:19:59.847625 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0 Sep 5 06:19:59.847635 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter Sep 5 06:19:59.847644 kernel: clocksource: Switched to clocksource kvm-clock Sep 5 06:19:59.847654 kernel: VFS: Disk quotas dquot_6.6.0 Sep 5 06:19:59.847665 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Sep 5 06:19:59.847676 kernel: pnp: PnP ACPI init Sep 5 06:19:59.847907 kernel: system 00:05: [mem 0xe0000000-0xefffffff window] has been reserved Sep 5 06:19:59.847933 kernel: pnp: PnP ACPI: found 6 devices Sep 5 06:19:59.847944 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Sep 5 06:19:59.847964 kernel: NET: Registered PF_INET protocol family Sep 5 06:19:59.847975 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Sep 5 06:19:59.847986 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Sep 5 06:19:59.847997 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Sep 5 06:19:59.848008 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Sep 5 06:19:59.848019 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Sep 5 06:19:59.848029 kernel: TCP: Hash tables configured (established 32768 bind 32768) Sep 5 06:19:59.848045 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Sep 5 06:19:59.848056 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Sep 5 06:19:59.848067 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Sep 5 06:19:59.848078 kernel: NET: Registered PF_XDP protocol family Sep 5 06:19:59.848250 kernel: pci 0000:00:04.0: ROM [mem 0xfffc0000-0xffffffff pref]: can't claim; no compatible bridge window Sep 5 06:19:59.848427 kernel: pci 0000:00:04.0: ROM [mem 0x9d000000-0x9d03ffff pref]: assigned Sep 5 06:19:59.848582 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Sep 5 06:19:59.848737 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Sep 5 06:19:59.848917 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Sep 5 06:19:59.849100 kernel: pci_bus 0000:00: resource 7 [mem 0x9d000000-0xdfffffff window] Sep 5 06:19:59.849276 kernel: pci_bus 0000:00: resource 8 [mem 0xf0000000-0xfebfffff window] Sep 5 06:19:59.849425 kernel: pci_bus 0000:00: resource 9 [mem 0x380000000000-0x3807ffffffff window] Sep 5 06:19:59.849441 kernel: PCI: CLS 0 bytes, default 64 Sep 5 06:19:59.849453 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x2848df6a9de, max_idle_ns: 440795280912 ns Sep 5 06:19:59.849465 kernel: Initialise system trusted keyrings Sep 5 06:19:59.849481 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Sep 5 06:19:59.849492 kernel: Key type asymmetric registered Sep 5 06:19:59.849503 kernel: Asymmetric key parser 'x509' registered Sep 5 06:19:59.849514 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Sep 5 06:19:59.849525 kernel: io scheduler mq-deadline registered Sep 5 06:19:59.849536 kernel: io scheduler kyber registered Sep 5 06:19:59.849547 
kernel: io scheduler bfq registered Sep 5 06:19:59.849562 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Sep 5 06:19:59.849573 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Sep 5 06:19:59.849585 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Sep 5 06:19:59.849596 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20 Sep 5 06:19:59.849607 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Sep 5 06:19:59.849617 kernel: 00:03: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Sep 5 06:19:59.849629 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Sep 5 06:19:59.849640 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Sep 5 06:19:59.849651 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Sep 5 06:19:59.849851 kernel: rtc_cmos 00:04: RTC can wake from S4 Sep 5 06:19:59.849869 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Sep 5 06:19:59.850028 kernel: rtc_cmos 00:04: registered as rtc0 Sep 5 06:19:59.850175 kernel: rtc_cmos 00:04: setting system clock to 2025-09-05T06:19:59 UTC (1757053199) Sep 5 06:19:59.850325 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram Sep 5 06:19:59.850341 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled Sep 5 06:19:59.850352 kernel: efifb: probing for efifb Sep 5 06:19:59.850364 kernel: efifb: framebuffer at 0xc0000000, using 4000k, total 4000k Sep 5 06:19:59.850380 kernel: efifb: mode is 1280x800x32, linelength=5120, pages=1 Sep 5 06:19:59.850391 kernel: efifb: scrolling: redraw Sep 5 06:19:59.850402 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 Sep 5 06:19:59.850414 kernel: Console: switching to colour frame buffer device 160x50 Sep 5 06:19:59.850425 kernel: fb0: EFI VGA frame buffer device Sep 5 06:19:59.850436 kernel: pstore: Using crash dump compression: deflate Sep 5 06:19:59.850447 kernel: pstore: Registered efi_pstore as persistent store backend Sep 5 06:19:59.850458 kernel: NET: Registered PF_INET6 protocol family Sep 5 06:19:59.850469 kernel: Segment Routing with IPv6 Sep 5 06:19:59.850484 kernel: In-situ OAM (IOAM) with IPv6 Sep 5 06:19:59.850495 kernel: NET: Registered PF_PACKET protocol family Sep 5 06:19:59.850506 kernel: Key type dns_resolver registered Sep 5 06:19:59.850517 kernel: IPI shorthand broadcast: enabled Sep 5 06:19:59.850529 kernel: sched_clock: Marking stable (3146003746, 164926228)->(3330587400, -19657426) Sep 5 06:19:59.850540 kernel: registered taskstats version 1 Sep 5 06:19:59.850551 kernel: Loading compiled-in X.509 certificates Sep 5 06:19:59.850562 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.44-flatcar: 0a288d3740f799f7923bd7314e999f997bd1026c' Sep 5 06:19:59.850574 kernel: Demotion targets for Node 0: null Sep 5 06:19:59.850588 kernel: Key type .fscrypt registered Sep 5 06:19:59.850599 kernel: Key type fscrypt-provisioning registered Sep 5 06:19:59.850610 kernel: ima: No TPM chip found, activating TPM-bypass! Sep 5 06:19:59.850622 kernel: ima: Allocated hash algorithm: sha1 Sep 5 06:19:59.850633 kernel: ima: No architecture policies found Sep 5 06:19:59.850644 kernel: clk: Disabling unused clocks Sep 5 06:19:59.850655 kernel: Warning: unable to open an initial console. 
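The rtc_cmos line above reports the same instant twice, as a UTC timestamp and as Unix epoch seconds, and the two values are easy to cross-check:

    from datetime import datetime, timezone

    # Epoch value printed by rtc_cmos when it set the system clock.
    print(datetime.fromtimestamp(1757053199, tz=timezone.utc).isoformat())
    # 2025-09-05T06:19:59+00:00, matching the logged "2025-09-05T06:19:59 UTC"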
Sep 5 06:19:59.850667 kernel: Freeing unused kernel image (initmem) memory: 54068K Sep 5 06:19:59.850679 kernel: Write protecting the kernel read-only data: 24576k Sep 5 06:19:59.850693 kernel: Freeing unused kernel image (rodata/data gap) memory: 252K Sep 5 06:19:59.850705 kernel: Run /init as init process Sep 5 06:19:59.850716 kernel: with arguments: Sep 5 06:19:59.850727 kernel: /init Sep 5 06:19:59.850739 kernel: with environment: Sep 5 06:19:59.850750 kernel: HOME=/ Sep 5 06:19:59.850761 kernel: TERM=linux Sep 5 06:19:59.850772 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Sep 5 06:19:59.850785 systemd[1]: Successfully made /usr/ read-only. Sep 5 06:19:59.850803 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 5 06:19:59.850851 systemd[1]: Detected virtualization kvm. Sep 5 06:19:59.850863 systemd[1]: Detected architecture x86-64. Sep 5 06:19:59.850875 systemd[1]: Running in initrd. Sep 5 06:19:59.850887 systemd[1]: No hostname configured, using default hostname. Sep 5 06:19:59.850899 systemd[1]: Hostname set to . Sep 5 06:19:59.850911 systemd[1]: Initializing machine ID from VM UUID. Sep 5 06:19:59.850927 systemd[1]: Queued start job for default target initrd.target. Sep 5 06:19:59.850939 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 5 06:19:59.850951 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 5 06:19:59.850973 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Sep 5 06:19:59.850986 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 5 06:19:59.850998 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Sep 5 06:19:59.851011 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Sep 5 06:19:59.851028 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Sep 5 06:19:59.851041 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Sep 5 06:19:59.851053 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 5 06:19:59.851065 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 5 06:19:59.851077 systemd[1]: Reached target paths.target - Path Units. Sep 5 06:19:59.851089 systemd[1]: Reached target slices.target - Slice Units. Sep 5 06:19:59.851101 systemd[1]: Reached target swap.target - Swaps. Sep 5 06:19:59.851113 systemd[1]: Reached target timers.target - Timer Units. Sep 5 06:19:59.851128 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Sep 5 06:19:59.851141 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 5 06:19:59.851153 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Sep 5 06:19:59.851165 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Sep 5 06:19:59.851177 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. 
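The device units systemd says it is expecting above are escaped forms of the /dev paths listed next to them: the leading slash is dropped, '-' is escaped as \x2d, and the remaining '/' separators become '-'. A minimal sketch of that mapping (the real systemd-escape handles more characters than this):

    def path_to_device_unit(path: str) -> str:
        # Escape '-' first so the '/' separators can safely become '-'.
        body = path.lstrip("/").replace("-", "\\x2d").replace("/", "-")
        return body + ".device"

    print(path_to_device_unit("/dev/disk/by-label/EFI-SYSTEM"))
    # dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device, as in the unit name above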
Sep 5 06:19:59.851189 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 5 06:19:59.851202 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 5 06:19:59.851213 systemd[1]: Reached target sockets.target - Socket Units. Sep 5 06:19:59.851225 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Sep 5 06:19:59.851241 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 5 06:19:59.851253 systemd[1]: Finished network-cleanup.service - Network Cleanup. Sep 5 06:19:59.851269 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Sep 5 06:19:59.851281 systemd[1]: Starting systemd-fsck-usr.service... Sep 5 06:19:59.851294 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 5 06:19:59.851306 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 5 06:19:59.851318 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 5 06:19:59.851330 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Sep 5 06:19:59.851346 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 5 06:19:59.851361 systemd[1]: Finished systemd-fsck-usr.service. Sep 5 06:19:59.851373 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 5 06:19:59.851428 systemd-journald[219]: Collecting audit messages is disabled. Sep 5 06:19:59.851460 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 5 06:19:59.851473 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 5 06:19:59.851486 systemd-journald[219]: Journal started Sep 5 06:19:59.851521 systemd-journald[219]: Runtime Journal (/run/log/journal/7d60e15d556a48d59e4cabbf0fc45d47) is 6M, max 48.4M, 42.4M free. Sep 5 06:19:59.837405 systemd-modules-load[220]: Inserted module 'overlay' Sep 5 06:19:59.860203 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 5 06:19:59.860224 systemd[1]: Started systemd-journald.service - Journal Service. Sep 5 06:19:59.864352 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 5 06:19:59.866115 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Sep 5 06:19:59.869289 kernel: Bridge firewalling registered Sep 5 06:19:59.868390 systemd-modules-load[220]: Inserted module 'br_netfilter' Sep 5 06:19:59.868529 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 5 06:19:59.869711 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 5 06:19:59.877011 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 5 06:19:59.881771 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 5 06:19:59.888283 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 5 06:19:59.891950 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Sep 5 06:19:59.893677 systemd-tmpfiles[242]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. 
Sep 5 06:19:59.898593 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 5 06:19:59.903453 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 5 06:19:59.906273 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 5 06:19:59.914501 dracut-cmdline[257]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=4b2174e9c368fa97600991ce20efc370fbbf3ddfce3ea407f50212a1021bd496 Sep 5 06:19:59.960990 systemd-resolved[267]: Positive Trust Anchors: Sep 5 06:19:59.961007 systemd-resolved[267]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 5 06:19:59.961037 systemd-resolved[267]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 5 06:19:59.963655 systemd-resolved[267]: Defaulting to hostname 'linux'. Sep 5 06:19:59.964829 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 5 06:19:59.992333 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 5 06:20:00.060852 kernel: SCSI subsystem initialized Sep 5 06:20:00.072841 kernel: Loading iSCSI transport class v2.0-870. Sep 5 06:20:00.083845 kernel: iscsi: registered transport (tcp) Sep 5 06:20:00.104931 kernel: iscsi: registered transport (qla4xxx) Sep 5 06:20:00.105030 kernel: QLogic iSCSI HBA Driver Sep 5 06:20:00.127584 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 5 06:20:00.148083 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 5 06:20:00.149654 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 5 06:20:00.211788 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Sep 5 06:20:00.213742 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Sep 5 06:20:00.275882 kernel: raid6: avx2x4 gen() 22231 MB/s Sep 5 06:20:00.292867 kernel: raid6: avx2x2 gen() 20655 MB/s Sep 5 06:20:00.310204 kernel: raid6: avx2x1 gen() 16639 MB/s Sep 5 06:20:00.310293 kernel: raid6: using algorithm avx2x4 gen() 22231 MB/s Sep 5 06:20:00.327956 kernel: raid6: .... xor() 6766 MB/s, rmw enabled Sep 5 06:20:00.328016 kernel: raid6: using avx2x2 recovery algorithm Sep 5 06:20:00.352843 kernel: xor: automatically using best checksumming function avx Sep 5 06:20:00.532878 kernel: Btrfs loaded, zoned=no, fsverity=no Sep 5 06:20:00.543164 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Sep 5 06:20:00.546065 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 5 06:20:00.578886 systemd-udevd[471]: Using default interface naming scheme 'v255'. 
Sep 5 06:20:00.585442 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 5 06:20:00.589163 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Sep 5 06:20:00.620296 dracut-pre-trigger[475]: rd.md=0: removing MD RAID activation Sep 5 06:20:00.657209 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Sep 5 06:20:00.660313 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 5 06:20:00.752191 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 5 06:20:00.756137 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Sep 5 06:20:00.794850 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues Sep 5 06:20:00.808841 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB) Sep 5 06:20:00.813244 kernel: cryptd: max_cpu_qlen set to 1000 Sep 5 06:20:00.818839 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2 Sep 5 06:20:00.830849 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Sep 5 06:20:00.830883 kernel: GPT:9289727 != 19775487 Sep 5 06:20:00.830894 kernel: GPT:Alternate GPT header not at the end of the disk. Sep 5 06:20:00.830905 kernel: GPT:9289727 != 19775487 Sep 5 06:20:00.830914 kernel: GPT: Use GNU Parted to correct GPT errors. Sep 5 06:20:00.830940 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 5 06:20:00.831859 kernel: libata version 3.00 loaded. Sep 5 06:20:00.833830 kernel: AES CTR mode by8 optimization enabled Sep 5 06:20:00.843983 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 5 06:20:00.844309 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 5 06:20:00.849447 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 5 06:20:00.854838 kernel: ahci 0000:00:1f.2: version 3.0 Sep 5 06:20:00.855052 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Sep 5 06:20:00.856965 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Sep 5 06:20:00.858003 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 5 06:20:00.860142 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Sep 5 06:20:00.860329 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Sep 5 06:20:00.862985 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. 
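The GPT warnings above are the usual sign of a disk image that was enlarged after partitioning: the backup GPT header still sits where the original image ended (LBA 9289727) rather than at the current last sector (LBA 19775487). The sizes reported by virtio_blk line up with the block count:

    sectors = 19_775_488        # 512-byte logical blocks reported for vda above
    size = sectors * 512
    print(round(size / 1e9, 1), "GB")       # 10.1 GB  (decimal)
    print(round(size / 2**30, 2), "GiB")    # 9.43 GiB (binary)
    print(sectors - 1)                      # 19775487, where the backup header belongs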
Sep 5 06:20:00.874839 kernel: scsi host0: ahci Sep 5 06:20:00.889898 kernel: scsi host1: ahci Sep 5 06:20:00.890843 kernel: scsi host2: ahci Sep 5 06:20:00.893848 kernel: scsi host3: ahci Sep 5 06:20:00.894105 kernel: scsi host4: ahci Sep 5 06:20:00.894840 kernel: scsi host5: ahci Sep 5 06:20:00.896289 kernel: ata1: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040100 irq 34 lpm-pol 1 Sep 5 06:20:00.896319 kernel: ata2: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040180 irq 34 lpm-pol 1 Sep 5 06:20:00.898075 kernel: ata3: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040200 irq 34 lpm-pol 1 Sep 5 06:20:00.898104 kernel: ata4: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040280 irq 34 lpm-pol 1 Sep 5 06:20:00.899867 kernel: ata5: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040300 irq 34 lpm-pol 1 Sep 5 06:20:00.899896 kernel: ata6: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040380 irq 34 lpm-pol 1 Sep 5 06:20:00.904154 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Sep 5 06:20:00.907119 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 5 06:20:00.919588 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Sep 5 06:20:00.929609 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Sep 5 06:20:00.943750 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Sep 5 06:20:00.945020 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Sep 5 06:20:00.955652 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 5 06:20:00.956784 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 5 06:20:00.956860 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 5 06:20:00.960226 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 5 06:20:00.971115 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 5 06:20:00.971440 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 5 06:20:00.980481 disk-uuid[633]: Primary Header is updated. Sep 5 06:20:00.980481 disk-uuid[633]: Secondary Entries is updated. Sep 5 06:20:00.980481 disk-uuid[633]: Secondary Header is updated. Sep 5 06:20:00.984878 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 5 06:20:01.002272 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
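The six ata ports above follow the standard AHCI register layout: port registers start 0x100 bytes into the ABAR (BAR 5 of 0000:00:1f.2, 0xc1040000 as enumerated earlier) and each port occupies 0x80 bytes, which reproduces the 0xc1040100..0xc1040380 addresses in the log. A small check, including the "port mask 0x3f" popcount:

    ABAR = 0xC104_0000
    PORTS_IMPLEMENTED = 0x3F            # "6/6 ports implemented (port mask 0x3f)"

    print(bin(PORTS_IMPLEMENTED).count("1"), "ports")        # 6 ports
    for i in range(6):
        # AHCI: port i registers at ABAR + 0x100 + 0x80 * i.
        print(f"ata{i + 1}: {hex(ABAR + 0x100 + 0x80 * i)}")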
Sep 5 06:20:01.212186 kernel: ata5: SATA link down (SStatus 0 SControl 300) Sep 5 06:20:01.212268 kernel: ata3: SATA link up 1.5 Gbps (SStatus 113 SControl 300) Sep 5 06:20:01.212283 kernel: ata2: SATA link down (SStatus 0 SControl 300) Sep 5 06:20:01.212297 kernel: ata6: SATA link down (SStatus 0 SControl 300) Sep 5 06:20:01.213838 kernel: ata4: SATA link down (SStatus 0 SControl 300) Sep 5 06:20:01.213864 kernel: ata1: SATA link down (SStatus 0 SControl 300) Sep 5 06:20:01.214847 kernel: ata3.00: LPM support broken, forcing max_power Sep 5 06:20:01.216234 kernel: ata3.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 Sep 5 06:20:01.216249 kernel: ata3.00: applying bridge limits Sep 5 06:20:01.217006 kernel: ata3.00: LPM support broken, forcing max_power Sep 5 06:20:01.217018 kernel: ata3.00: configured for UDMA/100 Sep 5 06:20:01.217840 kernel: scsi 2:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 Sep 5 06:20:01.258845 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray Sep 5 06:20:01.259147 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Sep 5 06:20:01.273086 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Sep 5 06:20:01.662346 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 5 06:20:01.663104 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 5 06:20:01.665636 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 5 06:20:01.668076 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 5 06:20:01.669189 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 5 06:20:01.703581 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 5 06:20:01.993688 disk-uuid[634]: The operation has completed successfully. Sep 5 06:20:01.994991 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 5 06:20:02.024778 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 5 06:20:02.024949 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 5 06:20:02.064014 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 5 06:20:02.090494 sh[668]: Success Sep 5 06:20:02.107867 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Sep 5 06:20:02.107938 kernel: device-mapper: uevent: version 1.0.3 Sep 5 06:20:02.109556 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 5 06:20:02.119853 kernel: device-mapper: verity: sha256 using shash "sha256-ni" Sep 5 06:20:02.151226 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 5 06:20:02.154663 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 5 06:20:02.173768 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Sep 5 06:20:02.182801 kernel: BTRFS: device fsid 98069635-e988-4e04-b156-f40a4a69cf42 devid 1 transid 35 /dev/mapper/usr (253:0) scanned by mount (680) Sep 5 06:20:02.182853 kernel: BTRFS info (device dm-0): first mount of filesystem 98069635-e988-4e04-b156-f40a4a69cf42 Sep 5 06:20:02.182865 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Sep 5 06:20:02.188853 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 5 06:20:02.188888 kernel: BTRFS info (device dm-0): enabling free space tree Sep 5 06:20:02.190251 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. 
Sep 5 06:20:02.191070 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Sep 5 06:20:02.193163 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 5 06:20:02.194336 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 5 06:20:02.198995 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Sep 5 06:20:02.229729 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (713) Sep 5 06:20:02.229799 kernel: BTRFS info (device vda6): first mount of filesystem b74bbc0c-6da1-4206-9f48-c70f629ccdff Sep 5 06:20:02.229876 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 5 06:20:02.236026 kernel: BTRFS info (device vda6): turning on async discard Sep 5 06:20:02.236110 kernel: BTRFS info (device vda6): enabling free space tree Sep 5 06:20:02.241989 kernel: BTRFS info (device vda6): last unmount of filesystem b74bbc0c-6da1-4206-9f48-c70f629ccdff Sep 5 06:20:02.243008 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 5 06:20:02.246082 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 5 06:20:02.336557 ignition[758]: Ignition 2.22.0 Sep 5 06:20:02.336575 ignition[758]: Stage: fetch-offline Sep 5 06:20:02.336613 ignition[758]: no configs at "/usr/lib/ignition/base.d" Sep 5 06:20:02.336625 ignition[758]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 5 06:20:02.336735 ignition[758]: parsed url from cmdline: "" Sep 5 06:20:02.336739 ignition[758]: no config URL provided Sep 5 06:20:02.336746 ignition[758]: reading system config file "/usr/lib/ignition/user.ign" Sep 5 06:20:02.336757 ignition[758]: no config at "/usr/lib/ignition/user.ign" Sep 5 06:20:02.336783 ignition[758]: op(1): [started] loading QEMU firmware config module Sep 5 06:20:02.336791 ignition[758]: op(1): executing: "modprobe" "qemu_fw_cfg" Sep 5 06:20:02.348611 ignition[758]: op(1): [finished] loading QEMU firmware config module Sep 5 06:20:02.353743 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 5 06:20:02.356940 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 5 06:20:02.396013 ignition[758]: parsing config with SHA512: b3d5da728d07875c5600d5b71d243c796f7b76deb100286414ce1f154c5181f932e2ba3b9c6dfec650c5e5512bcf910787931718d8a5b3e4cc52324a5ddff387 Sep 5 06:20:02.400754 unknown[758]: fetched base config from "system" Sep 5 06:20:02.400778 unknown[758]: fetched user config from "qemu" Sep 5 06:20:02.401583 ignition[758]: fetch-offline: fetch-offline passed Sep 5 06:20:02.401700 ignition[758]: Ignition finished successfully Sep 5 06:20:02.405884 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Sep 5 06:20:02.407308 systemd-networkd[857]: lo: Link UP Sep 5 06:20:02.407312 systemd-networkd[857]: lo: Gained carrier Sep 5 06:20:02.408926 systemd-networkd[857]: Enumeration completed Sep 5 06:20:02.409032 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 5 06:20:02.409325 systemd-networkd[857]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 5 06:20:02.409331 systemd-networkd[857]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. 
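Ignition logs the SHA-512 of the rendered config before applying it, so a saved copy of the config can later be checked against the digest in this log; the config contents themselves are not recorded here, so the blob below is only a hypothetical placeholder:

    import hashlib

    def ignition_config_digest(config_bytes: bytes) -> str:
        # Same digest algorithm as the "parsing config with SHA512: ..." line above.
        return hashlib.sha512(config_bytes).hexdigest()

    # Hypothetical config; the real QEMU-provided user config is not shown in the log.
    print(ignition_config_digest(b'{"ignition": {"version": "3.4.0"}}')[:16], "...")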
Sep 5 06:20:02.410725 systemd-networkd[857]: eth0: Link UP Sep 5 06:20:02.410885 systemd[1]: Reached target network.target - Network. Sep 5 06:20:02.410973 systemd-networkd[857]: eth0: Gained carrier Sep 5 06:20:02.410982 systemd-networkd[857]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 5 06:20:02.412724 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Sep 5 06:20:02.413548 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Sep 5 06:20:02.431875 systemd-networkd[857]: eth0: DHCPv4 address 10.0.0.140/16, gateway 10.0.0.1 acquired from 10.0.0.1 Sep 5 06:20:02.450402 ignition[861]: Ignition 2.22.0 Sep 5 06:20:02.450415 ignition[861]: Stage: kargs Sep 5 06:20:02.450549 ignition[861]: no configs at "/usr/lib/ignition/base.d" Sep 5 06:20:02.450562 ignition[861]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 5 06:20:02.451419 ignition[861]: kargs: kargs passed Sep 5 06:20:02.451462 ignition[861]: Ignition finished successfully Sep 5 06:20:02.458910 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 5 06:20:02.461304 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Sep 5 06:20:02.500143 ignition[870]: Ignition 2.22.0 Sep 5 06:20:02.500156 ignition[870]: Stage: disks Sep 5 06:20:02.500285 ignition[870]: no configs at "/usr/lib/ignition/base.d" Sep 5 06:20:02.500295 ignition[870]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 5 06:20:02.501335 ignition[870]: disks: disks passed Sep 5 06:20:02.501384 ignition[870]: Ignition finished successfully Sep 5 06:20:02.505699 systemd[1]: Finished ignition-disks.service - Ignition (disks). Sep 5 06:20:02.507079 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Sep 5 06:20:02.507918 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 5 06:20:02.509949 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 5 06:20:02.512263 systemd[1]: Reached target sysinit.target - System Initialization. Sep 5 06:20:02.514086 systemd[1]: Reached target basic.target - Basic System. Sep 5 06:20:02.517999 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Sep 5 06:20:02.542059 systemd-fsck[880]: ROOT: clean, 15/553520 files, 52789/553472 blocks Sep 5 06:20:02.772697 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 5 06:20:02.775751 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 5 06:20:02.957864 kernel: EXT4-fs (vda9): mounted filesystem 5e58259f-916a-43e8-ae75-b44bea97e14e r/w with ordered data mode. Quota mode: none. Sep 5 06:20:02.958625 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 5 06:20:02.960823 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 5 06:20:02.964141 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 5 06:20:02.966582 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 5 06:20:02.968516 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Sep 5 06:20:02.968564 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). 
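[Editor's note] eth0 is matched by /usr/lib/systemd/network/zz-default.network and acquires 10.0.0.140/16 over DHCP. The file itself is not shown in the log; a catch-all DHCP unit of that kind typically looks roughly like the sketch below (contents assumed, not read from this system):

    # /usr/lib/systemd/network/zz-default.network (typical shape, assumed)
    [Match]
    Name=*

    [Network]
    DHCP=yes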
Sep 5 06:20:02.968589 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 5 06:20:02.977289 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Sep 5 06:20:02.980852 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Sep 5 06:20:02.985931 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (888) Sep 5 06:20:02.985958 kernel: BTRFS info (device vda6): first mount of filesystem b74bbc0c-6da1-4206-9f48-c70f629ccdff Sep 5 06:20:02.985971 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 5 06:20:02.988947 kernel: BTRFS info (device vda6): turning on async discard Sep 5 06:20:02.988972 kernel: BTRFS info (device vda6): enabling free space tree Sep 5 06:20:02.991324 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 5 06:20:03.027207 initrd-setup-root[913]: cut: /sysroot/etc/passwd: No such file or directory Sep 5 06:20:03.031530 initrd-setup-root[920]: cut: /sysroot/etc/group: No such file or directory Sep 5 06:20:03.035712 initrd-setup-root[927]: cut: /sysroot/etc/shadow: No such file or directory Sep 5 06:20:03.040446 initrd-setup-root[934]: cut: /sysroot/etc/gshadow: No such file or directory Sep 5 06:20:03.130537 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 5 06:20:03.133390 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Sep 5 06:20:03.135350 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 5 06:20:03.156833 kernel: BTRFS info (device vda6): last unmount of filesystem b74bbc0c-6da1-4206-9f48-c70f629ccdff Sep 5 06:20:03.172137 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Sep 5 06:20:03.182689 systemd[1]: sysroot-oem.mount: Deactivated successfully. Sep 5 06:20:03.191478 ignition[1002]: INFO : Ignition 2.22.0 Sep 5 06:20:03.191478 ignition[1002]: INFO : Stage: mount Sep 5 06:20:03.193278 ignition[1002]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 5 06:20:03.193278 ignition[1002]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 5 06:20:03.196259 ignition[1002]: INFO : mount: mount passed Sep 5 06:20:03.197129 ignition[1002]: INFO : Ignition finished successfully Sep 5 06:20:03.201068 systemd[1]: Finished ignition-mount.service - Ignition (mount). Sep 5 06:20:03.204399 systemd[1]: Starting ignition-files.service - Ignition (files)... Sep 5 06:20:03.230496 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 5 06:20:03.261884 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1015) Sep 5 06:20:03.261937 kernel: BTRFS info (device vda6): first mount of filesystem b74bbc0c-6da1-4206-9f48-c70f629ccdff Sep 5 06:20:03.261949 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 5 06:20:03.266093 kernel: BTRFS info (device vda6): turning on async discard Sep 5 06:20:03.266123 kernel: BTRFS info (device vda6): enabling free space tree Sep 5 06:20:03.268641 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
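[Editor's note] The "cut: /sysroot/etc/passwd: No such file or directory" messages from initrd-setup-root are harmless on a first boot: the freshly created root filesystem does not yet contain the account databases the script inspects. The script's exact contents are not shown in this log, but messages of that shape come from invocations along these lines (illustrative only):

    # Illustrative: list existing user/group names so baseline files can be set up in the new root.
    # Prints "No such file or directory" when the target file has not been created yet.
    cut -d: -f1 /sysroot/etc/passwd
    cut -d: -f1 /sysroot/etc/group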
Sep 5 06:20:03.312557 ignition[1032]: INFO : Ignition 2.22.0 Sep 5 06:20:03.312557 ignition[1032]: INFO : Stage: files Sep 5 06:20:03.314552 ignition[1032]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 5 06:20:03.314552 ignition[1032]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 5 06:20:03.314552 ignition[1032]: DEBUG : files: compiled without relabeling support, skipping Sep 5 06:20:03.314552 ignition[1032]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Sep 5 06:20:03.314552 ignition[1032]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Sep 5 06:20:03.321540 ignition[1032]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Sep 5 06:20:03.321540 ignition[1032]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Sep 5 06:20:03.321540 ignition[1032]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Sep 5 06:20:03.321540 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Sep 5 06:20:03.321540 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1 Sep 5 06:20:03.317504 unknown[1032]: wrote ssh authorized keys file for user: core Sep 5 06:20:03.637117 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Sep 5 06:20:03.676039 systemd-networkd[857]: eth0: Gained IPv6LL Sep 5 06:20:03.829407 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Sep 5 06:20:03.829407 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Sep 5 06:20:03.833925 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Sep 5 06:20:03.833925 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Sep 5 06:20:03.833925 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Sep 5 06:20:03.833925 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 5 06:20:03.833925 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 5 06:20:03.833925 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 5 06:20:03.833925 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 5 06:20:03.848690 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Sep 5 06:20:03.848690 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Sep 5 06:20:03.848690 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Sep 5 06:20:03.848690 ignition[1032]: INFO : files: 
createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Sep 5 06:20:03.848690 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Sep 5 06:20:03.848690 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-x86-64.raw: attempt #1 Sep 5 06:20:04.294371 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Sep 5 06:20:04.795832 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Sep 5 06:20:04.798346 ignition[1032]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Sep 5 06:20:04.799751 ignition[1032]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 5 06:20:04.803837 ignition[1032]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 5 06:20:04.803837 ignition[1032]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Sep 5 06:20:04.803837 ignition[1032]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Sep 5 06:20:04.809059 ignition[1032]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Sep 5 06:20:04.809059 ignition[1032]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Sep 5 06:20:04.809059 ignition[1032]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Sep 5 06:20:04.809059 ignition[1032]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service" Sep 5 06:20:04.833335 ignition[1032]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service" Sep 5 06:20:04.840428 ignition[1032]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service" Sep 5 06:20:04.842216 ignition[1032]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service" Sep 5 06:20:04.842216 ignition[1032]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service" Sep 5 06:20:04.842216 ignition[1032]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service" Sep 5 06:20:04.842216 ignition[1032]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json" Sep 5 06:20:04.842216 ignition[1032]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json" Sep 5 06:20:04.842216 ignition[1032]: INFO : files: files passed Sep 5 06:20:04.842216 ignition[1032]: INFO : Ignition finished successfully Sep 5 06:20:04.846851 systemd[1]: Finished ignition-files.service - Ignition (files). Sep 5 06:20:04.852203 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Sep 5 06:20:04.855600 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... 
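[Editor's note] The files stage above writes the helm tarball, the YAML manifests, update.conf, the kubernetes sysext image plus its /etc/extensions symlink, and installs and presets the prepare-helm.service and coreos-metadata.service units. In Ignition v3 JSON such entries are declared roughly as follows (a sketch of the schema using paths from this log, not the actual config that was fetched):

    {
      "ignition": { "version": "3.4.0" },
      "storage": {
        "files": [
          { "path": "/opt/helm-v3.17.3-linux-amd64.tar.gz",
            "contents": { "source": "https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz" } }
        ],
        "links": [
          { "path": "/etc/extensions/kubernetes.raw",
            "target": "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" }
        ]
      },
      "systemd": {
        "units": [
          { "name": "prepare-helm.service", "enabled": true, "contents": "[Unit]\n..." },
          { "name": "coreos-metadata.service", "enabled": false }
        ]
      }
    }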
Sep 5 06:20:04.874194 systemd[1]: ignition-quench.service: Deactivated successfully. Sep 5 06:20:04.874364 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Sep 5 06:20:04.878573 initrd-setup-root-after-ignition[1061]: grep: /sysroot/oem/oem-release: No such file or directory Sep 5 06:20:04.881475 initrd-setup-root-after-ignition[1063]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 5 06:20:04.883228 initrd-setup-root-after-ignition[1063]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 5 06:20:04.884846 initrd-setup-root-after-ignition[1067]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 5 06:20:04.885676 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 5 06:20:04.889751 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Sep 5 06:20:04.892404 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Sep 5 06:20:04.954048 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Sep 5 06:20:04.954202 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Sep 5 06:20:04.955636 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Sep 5 06:20:04.958196 systemd[1]: Reached target initrd.target - Initrd Default Target. Sep 5 06:20:04.960509 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Sep 5 06:20:04.964338 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Sep 5 06:20:04.991628 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 5 06:20:04.993534 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Sep 5 06:20:05.016324 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Sep 5 06:20:05.017826 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 5 06:20:05.019170 systemd[1]: Stopped target timers.target - Timer Units. Sep 5 06:20:05.020727 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Sep 5 06:20:05.020935 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 5 06:20:05.026111 systemd[1]: Stopped target initrd.target - Initrd Default Target. Sep 5 06:20:05.027483 systemd[1]: Stopped target basic.target - Basic System. Sep 5 06:20:05.029679 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Sep 5 06:20:05.031645 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Sep 5 06:20:05.032240 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Sep 5 06:20:05.036017 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Sep 5 06:20:05.036347 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Sep 5 06:20:05.036677 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Sep 5 06:20:05.037304 systemd[1]: Stopped target sysinit.target - System Initialization. Sep 5 06:20:05.037622 systemd[1]: Stopped target local-fs.target - Local File Systems. Sep 5 06:20:05.038122 systemd[1]: Stopped target swap.target - Swaps. Sep 5 06:20:05.038545 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Sep 5 06:20:05.038679 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. 
Sep 5 06:20:05.054098 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Sep 5 06:20:05.054317 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 5 06:20:05.056660 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Sep 5 06:20:05.059247 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 5 06:20:05.063225 systemd[1]: dracut-initqueue.service: Deactivated successfully. Sep 5 06:20:05.063378 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Sep 5 06:20:05.066672 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Sep 5 06:20:05.066841 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Sep 5 06:20:05.067980 systemd[1]: Stopped target paths.target - Path Units. Sep 5 06:20:05.068330 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Sep 5 06:20:05.071987 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 5 06:20:05.075051 systemd[1]: Stopped target slices.target - Slice Units. Sep 5 06:20:05.076060 systemd[1]: Stopped target sockets.target - Socket Units. Sep 5 06:20:05.077988 systemd[1]: iscsid.socket: Deactivated successfully. Sep 5 06:20:05.078098 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Sep 5 06:20:05.079031 systemd[1]: iscsiuio.socket: Deactivated successfully. Sep 5 06:20:05.079116 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 5 06:20:05.080710 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Sep 5 06:20:05.080864 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 5 06:20:05.082518 systemd[1]: ignition-files.service: Deactivated successfully. Sep 5 06:20:05.082656 systemd[1]: Stopped ignition-files.service - Ignition (files). Sep 5 06:20:05.088608 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Sep 5 06:20:05.091037 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Sep 5 06:20:05.091191 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Sep 5 06:20:05.093198 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Sep 5 06:20:05.094243 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Sep 5 06:20:05.094421 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Sep 5 06:20:05.098077 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Sep 5 06:20:05.098185 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Sep 5 06:20:05.106692 systemd[1]: initrd-cleanup.service: Deactivated successfully. Sep 5 06:20:05.108081 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Sep 5 06:20:05.125800 ignition[1087]: INFO : Ignition 2.22.0 Sep 5 06:20:05.125800 ignition[1087]: INFO : Stage: umount Sep 5 06:20:05.127651 ignition[1087]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 5 06:20:05.127651 ignition[1087]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 5 06:20:05.127651 ignition[1087]: INFO : umount: umount passed Sep 5 06:20:05.127651 ignition[1087]: INFO : Ignition finished successfully Sep 5 06:20:05.129514 systemd[1]: ignition-mount.service: Deactivated successfully. Sep 5 06:20:05.129665 systemd[1]: Stopped ignition-mount.service - Ignition (mount). 
Sep 5 06:20:05.132264 systemd[1]: sysroot-boot.mount: Deactivated successfully. Sep 5 06:20:05.132779 systemd[1]: Stopped target network.target - Network. Sep 5 06:20:05.132921 systemd[1]: ignition-disks.service: Deactivated successfully. Sep 5 06:20:05.132976 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Sep 5 06:20:05.133251 systemd[1]: ignition-kargs.service: Deactivated successfully. Sep 5 06:20:05.133304 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Sep 5 06:20:05.133584 systemd[1]: ignition-setup.service: Deactivated successfully. Sep 5 06:20:05.133635 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Sep 5 06:20:05.134079 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Sep 5 06:20:05.134121 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Sep 5 06:20:05.134494 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Sep 5 06:20:05.134951 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Sep 5 06:20:05.146849 systemd[1]: systemd-resolved.service: Deactivated successfully. Sep 5 06:20:05.146998 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Sep 5 06:20:05.151650 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Sep 5 06:20:05.152099 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Sep 5 06:20:05.152169 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 5 06:20:05.156879 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Sep 5 06:20:05.161640 systemd[1]: systemd-networkd.service: Deactivated successfully. Sep 5 06:20:05.161785 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Sep 5 06:20:05.166028 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Sep 5 06:20:05.166249 systemd[1]: Stopped target network-pre.target - Preparation for Network. Sep 5 06:20:05.169312 systemd[1]: systemd-networkd.socket: Deactivated successfully. Sep 5 06:20:05.169366 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Sep 5 06:20:05.170276 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Sep 5 06:20:05.171420 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Sep 5 06:20:05.171475 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 5 06:20:05.171806 systemd[1]: systemd-sysctl.service: Deactivated successfully. Sep 5 06:20:05.171890 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Sep 5 06:20:05.177035 systemd[1]: systemd-modules-load.service: Deactivated successfully. Sep 5 06:20:05.177085 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Sep 5 06:20:05.177589 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 5 06:20:05.178953 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Sep 5 06:20:05.201734 systemd[1]: systemd-udevd.service: Deactivated successfully. Sep 5 06:20:05.209045 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 5 06:20:05.210731 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Sep 5 06:20:05.210781 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. 
Sep 5 06:20:05.211616 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Sep 5 06:20:05.211659 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Sep 5 06:20:05.213524 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Sep 5 06:20:05.213581 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Sep 5 06:20:05.214408 systemd[1]: dracut-cmdline.service: Deactivated successfully. Sep 5 06:20:05.214462 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Sep 5 06:20:05.219724 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 5 06:20:05.219780 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 5 06:20:05.223698 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Sep 5 06:20:05.223761 systemd[1]: systemd-network-generator.service: Deactivated successfully. Sep 5 06:20:05.223841 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Sep 5 06:20:05.228079 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Sep 5 06:20:05.228147 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 5 06:20:05.232571 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 5 06:20:05.232644 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 5 06:20:05.237459 systemd[1]: network-cleanup.service: Deactivated successfully. Sep 5 06:20:05.244249 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Sep 5 06:20:05.271366 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Sep 5 06:20:05.271499 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Sep 5 06:20:05.398899 systemd[1]: sysroot-boot.service: Deactivated successfully. Sep 5 06:20:05.399061 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Sep 5 06:20:05.401583 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Sep 5 06:20:05.402350 systemd[1]: initrd-setup-root.service: Deactivated successfully. Sep 5 06:20:05.402405 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Sep 5 06:20:05.408279 systemd[1]: Starting initrd-switch-root.service - Switch Root... Sep 5 06:20:05.426241 systemd[1]: Switching root. Sep 5 06:20:05.470171 systemd-journald[219]: Journal stopped Sep 5 06:20:06.886903 systemd-journald[219]: Received SIGTERM from PID 1 (systemd). Sep 5 06:20:06.887001 kernel: SELinux: policy capability network_peer_controls=1 Sep 5 06:20:06.887019 kernel: SELinux: policy capability open_perms=1 Sep 5 06:20:06.887039 kernel: SELinux: policy capability extended_socket_class=1 Sep 5 06:20:06.887052 kernel: SELinux: policy capability always_check_network=0 Sep 5 06:20:06.887068 kernel: SELinux: policy capability cgroup_seclabel=1 Sep 5 06:20:06.887089 kernel: SELinux: policy capability nnp_nosuid_transition=1 Sep 5 06:20:06.887103 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Sep 5 06:20:06.887114 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Sep 5 06:20:06.887126 kernel: SELinux: policy capability userspace_initial_context=0 Sep 5 06:20:06.887141 kernel: audit: type=1403 audit(1757053206.020:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Sep 5 06:20:06.887154 systemd[1]: Successfully loaded SELinux policy in 58.235ms. Sep 5 06:20:06.887179 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 8.519ms. 
Sep 5 06:20:06.887194 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 5 06:20:06.887207 systemd[1]: Detected virtualization kvm. Sep 5 06:20:06.887219 systemd[1]: Detected architecture x86-64. Sep 5 06:20:06.887233 systemd[1]: Detected first boot. Sep 5 06:20:06.887248 systemd[1]: Initializing machine ID from VM UUID. Sep 5 06:20:06.887261 zram_generator::config[1132]: No configuration found. Sep 5 06:20:06.887282 kernel: Guest personality initialized and is inactive Sep 5 06:20:06.887294 kernel: VMCI host device registered (name=vmci, major=10, minor=125) Sep 5 06:20:06.887305 kernel: Initialized host personality Sep 5 06:20:06.887321 kernel: NET: Registered PF_VSOCK protocol family Sep 5 06:20:06.887335 systemd[1]: Populated /etc with preset unit settings. Sep 5 06:20:06.887348 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Sep 5 06:20:06.887360 systemd[1]: initrd-switch-root.service: Deactivated successfully. Sep 5 06:20:06.887372 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Sep 5 06:20:06.887384 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Sep 5 06:20:06.887399 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Sep 5 06:20:06.887411 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Sep 5 06:20:06.887425 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Sep 5 06:20:06.887438 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Sep 5 06:20:06.887450 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Sep 5 06:20:06.887462 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Sep 5 06:20:06.887474 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Sep 5 06:20:06.887489 systemd[1]: Created slice user.slice - User and Session Slice. Sep 5 06:20:06.887501 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 5 06:20:06.887513 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 5 06:20:06.887526 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Sep 5 06:20:06.887541 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Sep 5 06:20:06.887553 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Sep 5 06:20:06.887566 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 5 06:20:06.887578 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Sep 5 06:20:06.887591 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 5 06:20:06.887603 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 5 06:20:06.887615 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Sep 5 06:20:06.887628 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. 
Sep 5 06:20:06.887642 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Sep 5 06:20:06.887654 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Sep 5 06:20:06.887666 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 5 06:20:06.887679 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 5 06:20:06.887691 systemd[1]: Reached target slices.target - Slice Units. Sep 5 06:20:06.887706 systemd[1]: Reached target swap.target - Swaps. Sep 5 06:20:06.887723 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Sep 5 06:20:06.887736 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Sep 5 06:20:06.887751 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Sep 5 06:20:06.887765 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 5 06:20:06.887787 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 5 06:20:06.887800 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 5 06:20:06.887825 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Sep 5 06:20:06.887849 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Sep 5 06:20:06.887861 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Sep 5 06:20:06.887873 systemd[1]: Mounting media.mount - External Media Directory... Sep 5 06:20:06.887886 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 5 06:20:06.887898 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Sep 5 06:20:06.887914 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Sep 5 06:20:06.887927 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Sep 5 06:20:06.887943 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Sep 5 06:20:06.887955 systemd[1]: Reached target machines.target - Containers. Sep 5 06:20:06.887968 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Sep 5 06:20:06.887980 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 5 06:20:06.887993 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 5 06:20:06.888006 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Sep 5 06:20:06.888021 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 5 06:20:06.888033 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 5 06:20:06.888046 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 5 06:20:06.888058 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Sep 5 06:20:06.888071 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 5 06:20:06.888085 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Sep 5 06:20:06.888098 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Sep 5 06:20:06.888110 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. 
Sep 5 06:20:06.888122 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Sep 5 06:20:06.888136 systemd[1]: Stopped systemd-fsck-usr.service. Sep 5 06:20:06.888149 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 5 06:20:06.888162 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 5 06:20:06.888175 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 5 06:20:06.888187 kernel: loop: module loaded Sep 5 06:20:06.888199 kernel: fuse: init (API version 7.41) Sep 5 06:20:06.888211 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 5 06:20:06.888230 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Sep 5 06:20:06.888243 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Sep 5 06:20:06.888255 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 5 06:20:06.888267 systemd[1]: verity-setup.service: Deactivated successfully. Sep 5 06:20:06.888282 systemd[1]: Stopped verity-setup.service. Sep 5 06:20:06.888295 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 5 06:20:06.888310 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Sep 5 06:20:06.888323 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Sep 5 06:20:06.888335 systemd[1]: Mounted media.mount - External Media Directory. Sep 5 06:20:06.888347 kernel: ACPI: bus type drm_connector registered Sep 5 06:20:06.888366 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Sep 5 06:20:06.888377 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Sep 5 06:20:06.888415 systemd-journald[1200]: Collecting audit messages is disabled. Sep 5 06:20:06.888439 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Sep 5 06:20:06.888451 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 5 06:20:06.888464 systemd-journald[1200]: Journal started Sep 5 06:20:06.888487 systemd-journald[1200]: Runtime Journal (/run/log/journal/7d60e15d556a48d59e4cabbf0fc45d47) is 6M, max 48.4M, 42.4M free. Sep 5 06:20:06.644478 systemd[1]: Queued start job for default target multi-user.target. Sep 5 06:20:06.664414 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Sep 5 06:20:06.665015 systemd[1]: systemd-journald.service: Deactivated successfully. Sep 5 06:20:06.891880 systemd[1]: Started systemd-journald.service - Journal Service. Sep 5 06:20:06.892859 systemd[1]: modprobe@configfs.service: Deactivated successfully. Sep 5 06:20:06.893215 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Sep 5 06:20:06.894868 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Sep 5 06:20:06.896258 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 5 06:20:06.896479 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 5 06:20:06.897894 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 5 06:20:06.898109 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. 
Sep 5 06:20:06.899422 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 5 06:20:06.899636 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 5 06:20:06.901151 systemd[1]: modprobe@fuse.service: Deactivated successfully. Sep 5 06:20:06.901367 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Sep 5 06:20:06.902746 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 5 06:20:06.903027 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 5 06:20:06.904405 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 5 06:20:06.905799 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 5 06:20:06.907335 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Sep 5 06:20:06.908913 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Sep 5 06:20:06.921727 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 5 06:20:06.924190 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Sep 5 06:20:06.926284 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Sep 5 06:20:06.927475 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Sep 5 06:20:06.927502 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 5 06:20:06.929399 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Sep 5 06:20:06.938855 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Sep 5 06:20:06.940256 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 5 06:20:06.941594 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Sep 5 06:20:06.945781 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Sep 5 06:20:06.947127 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 5 06:20:06.948973 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Sep 5 06:20:06.950110 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 5 06:20:06.952948 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 5 06:20:06.958949 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Sep 5 06:20:06.962150 systemd[1]: Starting systemd-sysusers.service - Create System Users... Sep 5 06:20:06.966907 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Sep 5 06:20:06.967435 systemd-journald[1200]: Time spent on flushing to /var/log/journal/7d60e15d556a48d59e4cabbf0fc45d47 is 30.930ms for 1071 entries. Sep 5 06:20:06.967435 systemd-journald[1200]: System Journal (/var/log/journal/7d60e15d556a48d59e4cabbf0fc45d47) is 8M, max 195.6M, 187.6M free. Sep 5 06:20:07.671398 systemd-journald[1200]: Received client request to flush runtime journal. 
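[Editor's note] systemd-journald reports a 6M runtime journal capped around 48M and an 8M persistent journal capped around 195M; those caps are derived from the size of the backing filesystems. If fixed limits are wanted instead, they can be set in journald.conf, for example (values illustrative, not this system's configuration):

    # /etc/systemd/journald.conf (illustrative overrides)
    [Journal]
    Storage=persistent
    RuntimeMaxUse=48M
    SystemMaxUse=200M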
Sep 5 06:20:07.671538 kernel: loop0: detected capacity change from 0 to 111000 Sep 5 06:20:07.671591 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Sep 5 06:20:06.969053 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Sep 5 06:20:06.970687 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Sep 5 06:20:06.975258 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Sep 5 06:20:07.348891 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Sep 5 06:20:07.350570 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 5 06:20:07.354189 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 5 06:20:07.680848 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Sep 5 06:20:07.700230 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Sep 5 06:20:07.702552 systemd[1]: Finished systemd-sysusers.service - Create System Users. Sep 5 06:20:07.704406 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Sep 5 06:20:07.708891 kernel: loop1: detected capacity change from 0 to 128016 Sep 5 06:20:07.710468 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 5 06:20:07.745947 systemd-tmpfiles[1269]: ACLs are not supported, ignoring. Sep 5 06:20:07.746355 systemd-tmpfiles[1269]: ACLs are not supported, ignoring. Sep 5 06:20:07.750681 kernel: loop2: detected capacity change from 0 to 229808 Sep 5 06:20:07.758141 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 5 06:20:07.793853 kernel: loop3: detected capacity change from 0 to 111000 Sep 5 06:20:07.809850 kernel: loop4: detected capacity change from 0 to 128016 Sep 5 06:20:07.820842 kernel: loop5: detected capacity change from 0 to 229808 Sep 5 06:20:07.827588 (sd-merge)[1274]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'. Sep 5 06:20:07.828315 (sd-merge)[1274]: Merged extensions into '/usr'. Sep 5 06:20:07.856950 systemd[1]: Reload requested from client PID 1251 ('systemd-sysext') (unit systemd-sysext.service)... Sep 5 06:20:07.856964 systemd[1]: Reloading... Sep 5 06:20:07.972853 zram_generator::config[1296]: No configuration found. Sep 5 06:20:08.229038 ldconfig[1246]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Sep 5 06:20:08.232939 systemd[1]: Reloading finished in 375 ms. Sep 5 06:20:08.269175 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Sep 5 06:20:08.271629 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Sep 5 06:20:08.293741 systemd[1]: Starting ensure-sysext.service... Sep 5 06:20:08.296792 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 5 06:20:08.327744 systemd[1]: Reload requested from client PID 1337 ('systemctl') (unit ensure-sysext.service)... Sep 5 06:20:08.327892 systemd[1]: Reloading... Sep 5 06:20:08.335043 systemd-tmpfiles[1339]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Sep 5 06:20:08.335497 systemd-tmpfiles[1339]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. 
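[Editor's note] The (sd-merge) lines show systemd-sysext overlaying the containerd-flatcar, docker-flatcar and kubernetes extension images onto /usr. On a running system the same mechanism can be inspected and re-applied with the systemd-sysext tool, for example:

    systemd-sysext list      # extension images found in /etc/extensions, /var/lib/extensions, /usr/lib/extensions
    systemd-sysext status    # which hierarchies (e.g. /usr, /opt) currently have an overlay
    systemd-sysext refresh   # unmerge and re-merge after images were added or removed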
Sep 5 06:20:08.336016 systemd-tmpfiles[1339]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Sep 5 06:20:08.336607 systemd-tmpfiles[1339]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Sep 5 06:20:08.338142 systemd-tmpfiles[1339]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Sep 5 06:20:08.338716 systemd-tmpfiles[1339]: ACLs are not supported, ignoring. Sep 5 06:20:08.338948 systemd-tmpfiles[1339]: ACLs are not supported, ignoring. Sep 5 06:20:08.345185 systemd-tmpfiles[1339]: Detected autofs mount point /boot during canonicalization of boot. Sep 5 06:20:08.345345 systemd-tmpfiles[1339]: Skipping /boot Sep 5 06:20:08.358048 systemd-tmpfiles[1339]: Detected autofs mount point /boot during canonicalization of boot. Sep 5 06:20:08.358163 systemd-tmpfiles[1339]: Skipping /boot Sep 5 06:20:08.397076 zram_generator::config[1366]: No configuration found. Sep 5 06:20:08.607855 systemd[1]: Reloading finished in 279 ms. Sep 5 06:20:08.634324 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Sep 5 06:20:08.652108 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 5 06:20:08.661460 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 5 06:20:08.663985 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Sep 5 06:20:08.666594 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Sep 5 06:20:08.676037 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 5 06:20:08.680000 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 5 06:20:08.682773 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Sep 5 06:20:08.688014 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 5 06:20:08.688192 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 5 06:20:08.690880 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 5 06:20:08.698582 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 5 06:20:08.702242 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 5 06:20:08.703883 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 5 06:20:08.703988 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 5 06:20:08.706264 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Sep 5 06:20:08.707524 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 5 06:20:08.709037 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 5 06:20:08.709292 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 5 06:20:08.736872 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Sep 5 06:20:08.743286 systemd[1]: modprobe@loop.service: Deactivated successfully. 
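[Editor's note] The "Duplicate line for path ..., ignoring" warnings above come from systemd-tmpfiles finding the same path declared in more than one tmpfiles.d fragment; the later duplicates are skipped. Each line in those fragments follows the usual tmpfiles.d format, for example (entries illustrative, not the actual fragment contents):

    # type  path              mode  user   group            age  argument
    d       /var/lib/nfs/sm   0700  statd  statd            -
    d       /var/log/journal  2755  root   systemd-journal  -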
Sep 5 06:20:08.743771 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 5 06:20:08.750126 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 5 06:20:08.750609 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 5 06:20:08.753019 augenrules[1436]: No rules Sep 5 06:20:08.757075 systemd[1]: audit-rules.service: Deactivated successfully. Sep 5 06:20:08.757538 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 5 06:20:08.766460 systemd-udevd[1410]: Using default interface naming scheme 'v255'. Sep 5 06:20:08.768643 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Sep 5 06:20:08.771194 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Sep 5 06:20:08.777873 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 5 06:20:08.778176 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 5 06:20:08.781102 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 5 06:20:08.790396 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 5 06:20:08.798089 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 5 06:20:08.799537 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 5 06:20:08.799782 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 5 06:20:08.802157 systemd[1]: Starting systemd-update-done.service - Update is Completed... Sep 5 06:20:08.804995 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 5 06:20:08.805152 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 5 06:20:08.811740 systemd[1]: Started systemd-userdbd.service - User Database Manager. Sep 5 06:20:08.813939 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 5 06:20:08.816623 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 5 06:20:08.820125 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 5 06:20:08.826124 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 5 06:20:08.826436 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 5 06:20:08.840830 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 5 06:20:08.843903 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 5 06:20:08.845394 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 5 06:20:08.853319 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 5 06:20:08.861777 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 5 06:20:08.866726 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... 
Sep 5 06:20:08.869420 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 5 06:20:08.869582 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 5 06:20:08.878085 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 5 06:20:08.879453 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 5 06:20:08.879594 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 5 06:20:08.882898 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 5 06:20:08.883205 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 5 06:20:08.885692 systemd[1]: Finished systemd-update-done.service - Update is Completed. Sep 5 06:20:08.888301 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 5 06:20:08.889045 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 5 06:20:08.890944 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 5 06:20:08.892056 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 5 06:20:08.898581 systemd-resolved[1408]: Positive Trust Anchors: Sep 5 06:20:08.898608 systemd-resolved[1408]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 5 06:20:08.898648 systemd-resolved[1408]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 5 06:20:08.899364 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 5 06:20:08.899795 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 5 06:20:08.902033 augenrules[1485]: /sbin/augenrules: No change Sep 5 06:20:08.909202 systemd[1]: Finished ensure-sysext.service. Sep 5 06:20:08.917640 systemd-resolved[1408]: Defaulting to hostname 'linux'. Sep 5 06:20:08.921913 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 5 06:20:08.927726 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 5 06:20:08.929151 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 5 06:20:08.929214 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 5 06:20:08.936091 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Sep 5 06:20:08.947151 augenrules[1514]: No rules Sep 5 06:20:08.949588 systemd[1]: audit-rules.service: Deactivated successfully. 
Sep 5 06:20:08.951045 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 5 06:20:08.986741 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Sep 5 06:20:09.026851 kernel: mousedev: PS/2 mouse device common for all mice Sep 5 06:20:09.028954 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Sep 5 06:20:09.032072 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 5 06:20:09.044599 systemd-networkd[1494]: lo: Link UP Sep 5 06:20:09.044888 systemd-networkd[1494]: lo: Gained carrier Sep 5 06:20:09.046906 systemd-networkd[1494]: Enumeration completed Sep 5 06:20:09.047111 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 5 06:20:09.047697 systemd-networkd[1494]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 5 06:20:09.047773 systemd-networkd[1494]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 5 06:20:09.048340 systemd-networkd[1494]: eth0: Link UP Sep 5 06:20:09.048697 systemd-networkd[1494]: eth0: Gained carrier Sep 5 06:20:09.048776 systemd-networkd[1494]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 5 06:20:09.049008 systemd[1]: Reached target network.target - Network. Sep 5 06:20:09.053019 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Sep 5 06:20:09.053865 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Sep 5 06:20:09.056830 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 5 06:20:09.060838 kernel: ACPI: button: Power Button [PWRF] Sep 5 06:20:09.062974 systemd-networkd[1494]: eth0: DHCPv4 address 10.0.0.140/16, gateway 10.0.0.1 acquired from 10.0.0.1 Sep 5 06:20:09.073488 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 5 06:20:09.103096 kernel: i801_smbus 0000:00:1f.3: Enabling SMBus device Sep 5 06:20:09.103484 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Sep 5 06:20:09.103685 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Sep 5 06:20:09.113725 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Sep 5 06:20:09.115772 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Sep 5 06:20:09.115905 systemd-timesyncd[1512]: Contacted time server 10.0.0.1:123 (10.0.0.1). Sep 5 06:20:09.115968 systemd-timesyncd[1512]: Initial clock synchronization to Fri 2025-09-05 06:20:09.368998 UTC. Sep 5 06:20:09.117978 systemd[1]: Reached target sysinit.target - System Initialization. Sep 5 06:20:09.119317 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Sep 5 06:20:09.120664 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 5 06:20:09.121967 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Sep 5 06:20:09.123312 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. 
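[Editor's note] At this point eth0 holds its DHCPv4 lease (10.0.0.140/16 via 10.0.0.1) and systemd-timesyncd has synchronized against the NTP server at 10.0.0.1:123. Once the system is up, the same state can be checked with, for example:

    networkctl status eth0          # link state, addresses, gateway and DNS as seen by systemd-networkd
    resolvectl status               # per-link DNS configuration from systemd-resolved
    timedatectl timesync-status     # NTP server, offset and poll interval from systemd-timesyncd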
Sep 5 06:20:09.124801 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Sep 5 06:20:09.124848 systemd[1]: Reached target paths.target - Path Units. Sep 5 06:20:09.125874 systemd[1]: Reached target time-set.target - System Time Set. Sep 5 06:20:09.127458 systemd[1]: Started logrotate.timer - Daily rotation of log files. Sep 5 06:20:09.128917 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 5 06:20:09.131998 systemd[1]: Reached target timers.target - Timer Units. Sep 5 06:20:09.134182 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 5 06:20:09.137494 systemd[1]: Starting docker.socket - Docker Socket for the API... Sep 5 06:20:09.143571 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Sep 5 06:20:09.148263 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Sep 5 06:20:09.150641 systemd[1]: Reached target ssh-access.target - SSH Access Available. Sep 5 06:20:09.180359 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Sep 5 06:20:09.181933 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Sep 5 06:20:09.183938 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 5 06:20:09.185763 systemd[1]: Reached target sockets.target - Socket Units. Sep 5 06:20:09.187956 systemd[1]: Reached target basic.target - Basic System. Sep 5 06:20:09.189085 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 5 06:20:09.189138 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 5 06:20:09.195949 systemd[1]: Starting containerd.service - containerd container runtime... Sep 5 06:20:09.202744 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Sep 5 06:20:09.236951 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Sep 5 06:20:09.241897 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 5 06:20:09.256160 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Sep 5 06:20:09.257371 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 5 06:20:09.260004 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Sep 5 06:20:09.262513 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 5 06:20:09.307244 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 5 06:20:09.315066 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Sep 5 06:20:09.318143 jq[1558]: false Sep 5 06:20:09.329066 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Sep 5 06:20:09.333037 google_oslogin_nss_cache[1560]: oslogin_cache_refresh[1560]: Refreshing passwd entry cache Sep 5 06:20:09.333624 oslogin_cache_refresh[1560]: Refreshing passwd entry cache Sep 5 06:20:09.339448 extend-filesystems[1559]: Found /dev/vda6 Sep 5 06:20:09.340036 systemd[1]: Starting systemd-logind.service - User Login Management... 
Sep 5 06:20:09.343032 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Sep 5 06:20:09.344447 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 5 06:20:09.345264 systemd[1]: Starting update-engine.service - Update Engine... Sep 5 06:20:09.345498 extend-filesystems[1559]: Found /dev/vda9 Sep 5 06:20:09.359747 extend-filesystems[1559]: Checking size of /dev/vda9 Sep 5 06:20:09.360747 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 5 06:20:09.362474 google_oslogin_nss_cache[1560]: oslogin_cache_refresh[1560]: Failure getting users, quitting Sep 5 06:20:09.362464 oslogin_cache_refresh[1560]: Failure getting users, quitting Sep 5 06:20:09.362587 google_oslogin_nss_cache[1560]: oslogin_cache_refresh[1560]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 5 06:20:09.362587 google_oslogin_nss_cache[1560]: oslogin_cache_refresh[1560]: Refreshing group entry cache Sep 5 06:20:09.362496 oslogin_cache_refresh[1560]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 5 06:20:09.362590 oslogin_cache_refresh[1560]: Refreshing group entry cache Sep 5 06:20:09.367467 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Sep 5 06:20:09.369592 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 5 06:20:09.369742 google_oslogin_nss_cache[1560]: oslogin_cache_refresh[1560]: Failure getting groups, quitting Sep 5 06:20:09.369735 oslogin_cache_refresh[1560]: Failure getting groups, quitting Sep 5 06:20:09.369849 google_oslogin_nss_cache[1560]: oslogin_cache_refresh[1560]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 5 06:20:09.369768 oslogin_cache_refresh[1560]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 5 06:20:09.369925 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 5 06:20:09.370433 systemd[1]: motdgen.service: Deactivated successfully. Sep 5 06:20:09.372541 extend-filesystems[1559]: Resized partition /dev/vda9 Sep 5 06:20:09.379148 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 5 06:20:09.381554 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Sep 5 06:20:09.382269 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Sep 5 06:20:09.385173 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 5 06:20:09.385989 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. 
Sep 5 06:20:09.389559 extend-filesystems[1583]: resize2fs 1.47.2 (1-Jan-2025) Sep 5 06:20:09.406263 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks Sep 5 06:20:09.405162 (ntainerd)[1585]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 5 06:20:09.419338 kernel: kvm_amd: TSC scaling supported Sep 5 06:20:09.419467 kernel: kvm_amd: Nested Virtualization enabled Sep 5 06:20:09.419482 kernel: kvm_amd: Nested Paging enabled Sep 5 06:20:09.419511 kernel: kvm_amd: LBR virtualization supported Sep 5 06:20:09.422502 jq[1579]: true Sep 5 06:20:09.422794 update_engine[1575]: I20250905 06:20:09.421458 1575 main.cc:92] Flatcar Update Engine starting Sep 5 06:20:09.424290 kernel: kvm_amd: Virtual VMLOAD VMSAVE supported Sep 5 06:20:09.424319 kernel: kvm_amd: Virtual GIF supported Sep 5 06:20:09.427239 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 5 06:20:09.437558 tar[1584]: linux-amd64/LICENSE Sep 5 06:20:09.438104 tar[1584]: linux-amd64/helm Sep 5 06:20:09.461842 jq[1600]: true Sep 5 06:20:09.467684 dbus-daemon[1556]: [system] SELinux support is enabled Sep 5 06:20:09.468306 systemd[1]: Started dbus.service - D-Bus System Message Bus. Sep 5 06:20:09.478839 kernel: EXT4-fs (vda9): resized filesystem to 1864699 Sep 5 06:20:09.511161 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 5 06:20:09.511196 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Sep 5 06:20:09.525481 update_engine[1575]: I20250905 06:20:09.514528 1575 update_check_scheduler.cc:74] Next update check in 4m46s Sep 5 06:20:09.512900 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 5 06:20:09.512916 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 5 06:20:09.514446 systemd[1]: Started update-engine.service - Update Engine. Sep 5 06:20:09.519758 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 5 06:20:09.537110 extend-filesystems[1583]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Sep 5 06:20:09.537110 extend-filesystems[1583]: old_desc_blocks = 1, new_desc_blocks = 1 Sep 5 06:20:09.537110 extend-filesystems[1583]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long. Sep 5 06:20:09.541512 extend-filesystems[1559]: Resized filesystem in /dev/vda9 Sep 5 06:20:09.538329 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 5 06:20:09.538745 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 5 06:20:09.560069 systemd-logind[1570]: Watching system buttons on /dev/input/event2 (Power Button) Sep 5 06:20:09.560098 systemd-logind[1570]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Sep 5 06:20:09.561398 systemd-logind[1570]: New seat seat0. Sep 5 06:20:09.565137 systemd[1]: Started systemd-logind.service - User Login Management. Sep 5 06:20:09.610408 bash[1623]: Updated "/home/core/.ssh/authorized_keys" Sep 5 06:20:09.613014 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. 
Sep 5 06:20:09.615203 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. Sep 5 06:20:09.645196 kernel: EDAC MC: Ver: 3.0.0 Sep 5 06:20:09.738281 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 5 06:20:09.741681 locksmithd[1607]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 5 06:20:09.867843 containerd[1585]: time="2025-09-05T06:20:09Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Sep 5 06:20:09.868567 containerd[1585]: time="2025-09-05T06:20:09.868532033Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Sep 5 06:20:09.897594 containerd[1585]: time="2025-09-05T06:20:09.897482185Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="24.234µs" Sep 5 06:20:09.897594 containerd[1585]: time="2025-09-05T06:20:09.897563918Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Sep 5 06:20:09.897594 containerd[1585]: time="2025-09-05T06:20:09.897596930Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Sep 5 06:20:09.898865 containerd[1585]: time="2025-09-05T06:20:09.897896803Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Sep 5 06:20:09.898865 containerd[1585]: time="2025-09-05T06:20:09.897923793Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Sep 5 06:20:09.898865 containerd[1585]: time="2025-09-05T06:20:09.897961935Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 5 06:20:09.898865 containerd[1585]: time="2025-09-05T06:20:09.898149827Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 5 06:20:09.898865 containerd[1585]: time="2025-09-05T06:20:09.898193590Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 5 06:20:09.899373 containerd[1585]: time="2025-09-05T06:20:09.899133462Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 5 06:20:09.899373 containerd[1585]: time="2025-09-05T06:20:09.899207602Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 5 06:20:09.899373 containerd[1585]: time="2025-09-05T06:20:09.899255171Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 5 06:20:09.899373 containerd[1585]: time="2025-09-05T06:20:09.899289916Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Sep 5 06:20:09.901827 containerd[1585]: time="2025-09-05T06:20:09.899750350Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Sep 5 06:20:09.901827 containerd[1585]: time="2025-09-05T06:20:09.900355294Z" 
level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 5 06:20:09.901827 containerd[1585]: time="2025-09-05T06:20:09.900393215Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 5 06:20:09.901827 containerd[1585]: time="2025-09-05T06:20:09.900404116Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Sep 5 06:20:09.901827 containerd[1585]: time="2025-09-05T06:20:09.900450864Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Sep 5 06:20:09.901827 containerd[1585]: time="2025-09-05T06:20:09.900719978Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Sep 5 06:20:09.901827 containerd[1585]: time="2025-09-05T06:20:09.900867074Z" level=info msg="metadata content store policy set" policy=shared Sep 5 06:20:09.907777 containerd[1585]: time="2025-09-05T06:20:09.907589794Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Sep 5 06:20:09.907846 containerd[1585]: time="2025-09-05T06:20:09.907798997Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Sep 5 06:20:09.907846 containerd[1585]: time="2025-09-05T06:20:09.907834934Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Sep 5 06:20:09.907909 containerd[1585]: time="2025-09-05T06:20:09.907853198Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Sep 5 06:20:09.907909 containerd[1585]: time="2025-09-05T06:20:09.907872905Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Sep 5 06:20:09.907909 containerd[1585]: time="2025-09-05T06:20:09.907894035Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Sep 5 06:20:09.907979 containerd[1585]: time="2025-09-05T06:20:09.907914543Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Sep 5 06:20:09.907979 containerd[1585]: time="2025-09-05T06:20:09.907927718Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Sep 5 06:20:09.907979 containerd[1585]: time="2025-09-05T06:20:09.907938618Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Sep 5 06:20:09.907979 containerd[1585]: time="2025-09-05T06:20:09.907949559Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Sep 5 06:20:09.907979 containerd[1585]: time="2025-09-05T06:20:09.907959197Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Sep 5 06:20:09.907979 containerd[1585]: time="2025-09-05T06:20:09.907973544Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Sep 5 06:20:09.908137 containerd[1585]: time="2025-09-05T06:20:09.908107676Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Sep 5 06:20:09.908137 containerd[1585]: time="2025-09-05T06:20:09.908137301Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers 
type=io.containerd.grpc.v1 Sep 5 06:20:09.908370 containerd[1585]: time="2025-09-05T06:20:09.908158811Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Sep 5 06:20:09.908370 containerd[1585]: time="2025-09-05T06:20:09.908170213Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 5 06:20:09.908370 containerd[1585]: time="2025-09-05T06:20:09.908181995Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 5 06:20:09.908370 containerd[1585]: time="2025-09-05T06:20:09.908195951Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Sep 5 06:20:09.908370 containerd[1585]: time="2025-09-05T06:20:09.908229815Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 5 06:20:09.908370 containerd[1585]: time="2025-09-05T06:20:09.908257537Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 5 06:20:09.908370 containerd[1585]: time="2025-09-05T06:20:09.908282403Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 5 06:20:09.908370 containerd[1585]: time="2025-09-05T06:20:09.908295838Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 5 06:20:09.908370 containerd[1585]: time="2025-09-05T06:20:09.908308943Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 5 06:20:09.908673 containerd[1585]: time="2025-09-05T06:20:09.908411285Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 5 06:20:09.908673 containerd[1585]: time="2025-09-05T06:20:09.908428617Z" level=info msg="Start snapshots syncer" Sep 5 06:20:09.908673 containerd[1585]: time="2025-09-05T06:20:09.908471047Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 5 06:20:09.908848 containerd[1585]: time="2025-09-05T06:20:09.908775168Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 5 06:20:09.909058 containerd[1585]: time="2025-09-05T06:20:09.908871198Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 5 06:20:09.912196 containerd[1585]: time="2025-09-05T06:20:09.911662674Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Sep 5 06:20:09.912552 containerd[1585]: time="2025-09-05T06:20:09.912531093Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Sep 5 06:20:09.912650 containerd[1585]: time="2025-09-05T06:20:09.912632413Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 5 06:20:09.912736 containerd[1585]: time="2025-09-05T06:20:09.912703526Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 5 06:20:09.913452 containerd[1585]: time="2025-09-05T06:20:09.912824904Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 5 06:20:09.913523 containerd[1585]: time="2025-09-05T06:20:09.913509618Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 5 06:20:09.913611 containerd[1585]: time="2025-09-05T06:20:09.913579058Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 5 06:20:09.913685 containerd[1585]: time="2025-09-05T06:20:09.913670800Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 5 06:20:09.913834 containerd[1585]: time="2025-09-05T06:20:09.913786798Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 5 06:20:09.913918 containerd[1585]: 
time="2025-09-05T06:20:09.913904950Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 5 06:20:09.914002 containerd[1585]: time="2025-09-05T06:20:09.913984158Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 5 06:20:09.914121 containerd[1585]: time="2025-09-05T06:20:09.914106077Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 5 06:20:09.914416 containerd[1585]: time="2025-09-05T06:20:09.914388256Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 5 06:20:09.914499 containerd[1585]: time="2025-09-05T06:20:09.914467515Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 5 06:20:09.914600 containerd[1585]: time="2025-09-05T06:20:09.914579906Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 5 06:20:09.914714 containerd[1585]: time="2025-09-05T06:20:09.914670866Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 5 06:20:09.914832 containerd[1585]: time="2025-09-05T06:20:09.914786623Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 5 06:20:09.914921 containerd[1585]: time="2025-09-05T06:20:09.914899104Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 5 06:20:09.915014 containerd[1585]: time="2025-09-05T06:20:09.915002138Z" level=info msg="runtime interface created" Sep 5 06:20:09.916009 containerd[1585]: time="2025-09-05T06:20:09.915093649Z" level=info msg="created NRI interface" Sep 5 06:20:09.916009 containerd[1585]: time="2025-09-05T06:20:09.915114478Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 5 06:20:09.916009 containerd[1585]: time="2025-09-05T06:20:09.915153772Z" level=info msg="Connect containerd service" Sep 5 06:20:09.916009 containerd[1585]: time="2025-09-05T06:20:09.915190290Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 5 06:20:09.917886 containerd[1585]: time="2025-09-05T06:20:09.917859197Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 5 06:20:10.216218 containerd[1585]: time="2025-09-05T06:20:10.216024791Z" level=info msg="Start subscribing containerd event" Sep 5 06:20:10.216218 containerd[1585]: time="2025-09-05T06:20:10.216173866Z" level=info msg="Start recovering state" Sep 5 06:20:10.216441 containerd[1585]: time="2025-09-05T06:20:10.216401665Z" level=info msg="Start event monitor" Sep 5 06:20:10.216478 containerd[1585]: time="2025-09-05T06:20:10.216453162Z" level=info msg="Start cni network conf syncer for default" Sep 5 06:20:10.216478 containerd[1585]: time="2025-09-05T06:20:10.216466207Z" level=info msg="Start streaming server" Sep 5 06:20:10.216554 containerd[1585]: time="2025-09-05T06:20:10.216486778Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 5 06:20:10.216554 containerd[1585]: time="2025-09-05T06:20:10.216504422Z" level=info 
msg="runtime interface starting up..." Sep 5 06:20:10.216554 containerd[1585]: time="2025-09-05T06:20:10.216524475Z" level=info msg="starting plugins..." Sep 5 06:20:10.216554 containerd[1585]: time="2025-09-05T06:20:10.216552601Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 5 06:20:10.217349 containerd[1585]: time="2025-09-05T06:20:10.217310091Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 5 06:20:10.217421 containerd[1585]: time="2025-09-05T06:20:10.217396754Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 5 06:20:10.217607 containerd[1585]: time="2025-09-05T06:20:10.217564714Z" level=info msg="containerd successfully booted in 0.350874s" Sep 5 06:20:10.218126 systemd[1]: Started containerd.service - containerd container runtime. Sep 5 06:20:10.227796 tar[1584]: linux-amd64/README.md Sep 5 06:20:10.248392 sshd_keygen[1597]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 5 06:20:10.258371 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 5 06:20:10.280732 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 5 06:20:10.284585 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 5 06:20:10.317090 systemd[1]: issuegen.service: Deactivated successfully. Sep 5 06:20:10.317430 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 5 06:20:10.320674 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 5 06:20:10.362253 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 5 06:20:10.366540 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 5 06:20:10.369113 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Sep 5 06:20:10.370688 systemd[1]: Reached target getty.target - Login Prompts. Sep 5 06:20:10.524553 systemd-networkd[1494]: eth0: Gained IPv6LL Sep 5 06:20:10.528323 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 5 06:20:10.530520 systemd[1]: Reached target network-online.target - Network is Online. Sep 5 06:20:10.533953 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent... Sep 5 06:20:10.537561 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 5 06:20:10.540822 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 5 06:20:10.596746 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 5 06:20:10.627324 systemd[1]: coreos-metadata.service: Deactivated successfully. Sep 5 06:20:10.627668 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent. Sep 5 06:20:10.629474 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 5 06:20:11.972628 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 5 06:20:11.975082 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 5 06:20:11.976718 systemd[1]: Startup finished in 3.213s (kernel) + 6.371s (initrd) + 6.011s (userspace) = 15.596s. 
Sep 5 06:20:11.977367 (kubelet)[1697]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 5 06:20:12.519517 kubelet[1697]: E0905 06:20:12.519447 1697 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 5 06:20:12.523791 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 5 06:20:12.524039 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 5 06:20:12.524469 systemd[1]: kubelet.service: Consumed 1.709s CPU time, 266.1M memory peak. Sep 5 06:20:13.550430 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 5 06:20:13.551813 systemd[1]: Started sshd@0-10.0.0.140:22-10.0.0.1:42310.service - OpenSSH per-connection server daemon (10.0.0.1:42310). Sep 5 06:20:13.634442 sshd[1710]: Accepted publickey for core from 10.0.0.1 port 42310 ssh2: RSA SHA256:HqMeIfrf4KQXAwx0HmR8sqFIxN2NpL8j8iF6H1odrA8 Sep 5 06:20:13.636516 sshd-session[1710]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 06:20:13.643620 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 5 06:20:13.644730 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 5 06:20:13.652695 systemd-logind[1570]: New session 1 of user core. Sep 5 06:20:13.669416 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 5 06:20:13.672396 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 5 06:20:13.692935 (systemd)[1715]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 5 06:20:13.695867 systemd-logind[1570]: New session c1 of user core. Sep 5 06:20:14.009101 systemd[1715]: Queued start job for default target default.target. Sep 5 06:20:14.018332 systemd[1715]: Created slice app.slice - User Application Slice. Sep 5 06:20:14.018365 systemd[1715]: Reached target paths.target - Paths. Sep 5 06:20:14.018418 systemd[1715]: Reached target timers.target - Timers. Sep 5 06:20:14.020168 systemd[1715]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 5 06:20:14.032404 systemd[1715]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 5 06:20:14.032556 systemd[1715]: Reached target sockets.target - Sockets. Sep 5 06:20:14.032595 systemd[1715]: Reached target basic.target - Basic System. Sep 5 06:20:14.032637 systemd[1715]: Reached target default.target - Main User Target. Sep 5 06:20:14.032670 systemd[1715]: Startup finished in 328ms. Sep 5 06:20:14.033035 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 5 06:20:14.034779 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 5 06:20:14.100170 systemd[1]: Started sshd@1-10.0.0.140:22-10.0.0.1:42320.service - OpenSSH per-connection server daemon (10.0.0.1:42320). Sep 5 06:20:14.162701 sshd[1726]: Accepted publickey for core from 10.0.0.1 port 42320 ssh2: RSA SHA256:HqMeIfrf4KQXAwx0HmR8sqFIxN2NpL8j8iF6H1odrA8 Sep 5 06:20:14.164533 sshd-session[1726]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 06:20:14.169148 systemd-logind[1570]: New session 2 of user core. 
Sep 5 06:20:14.185952 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 5 06:20:14.242164 sshd[1729]: Connection closed by 10.0.0.1 port 42320 Sep 5 06:20:14.242571 sshd-session[1726]: pam_unix(sshd:session): session closed for user core Sep 5 06:20:14.251633 systemd[1]: sshd@1-10.0.0.140:22-10.0.0.1:42320.service: Deactivated successfully. Sep 5 06:20:14.253308 systemd[1]: session-2.scope: Deactivated successfully. Sep 5 06:20:14.254168 systemd-logind[1570]: Session 2 logged out. Waiting for processes to exit. Sep 5 06:20:14.256914 systemd[1]: Started sshd@2-10.0.0.140:22-10.0.0.1:42334.service - OpenSSH per-connection server daemon (10.0.0.1:42334). Sep 5 06:20:14.257478 systemd-logind[1570]: Removed session 2. Sep 5 06:20:14.318510 sshd[1735]: Accepted publickey for core from 10.0.0.1 port 42334 ssh2: RSA SHA256:HqMeIfrf4KQXAwx0HmR8sqFIxN2NpL8j8iF6H1odrA8 Sep 5 06:20:14.320005 sshd-session[1735]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 06:20:14.325208 systemd-logind[1570]: New session 3 of user core. Sep 5 06:20:14.338964 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 5 06:20:14.393683 sshd[1739]: Connection closed by 10.0.0.1 port 42334 Sep 5 06:20:14.394277 sshd-session[1735]: pam_unix(sshd:session): session closed for user core Sep 5 06:20:14.404986 systemd[1]: sshd@2-10.0.0.140:22-10.0.0.1:42334.service: Deactivated successfully. Sep 5 06:20:14.407650 systemd[1]: session-3.scope: Deactivated successfully. Sep 5 06:20:14.408634 systemd-logind[1570]: Session 3 logged out. Waiting for processes to exit. Sep 5 06:20:14.412744 systemd[1]: Started sshd@3-10.0.0.140:22-10.0.0.1:42342.service - OpenSSH per-connection server daemon (10.0.0.1:42342). Sep 5 06:20:14.413764 systemd-logind[1570]: Removed session 3. Sep 5 06:20:14.476485 sshd[1745]: Accepted publickey for core from 10.0.0.1 port 42342 ssh2: RSA SHA256:HqMeIfrf4KQXAwx0HmR8sqFIxN2NpL8j8iF6H1odrA8 Sep 5 06:20:14.478459 sshd-session[1745]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 06:20:14.483925 systemd-logind[1570]: New session 4 of user core. Sep 5 06:20:14.495037 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 5 06:20:14.552313 sshd[1748]: Connection closed by 10.0.0.1 port 42342 Sep 5 06:20:14.552787 sshd-session[1745]: pam_unix(sshd:session): session closed for user core Sep 5 06:20:14.566624 systemd[1]: sshd@3-10.0.0.140:22-10.0.0.1:42342.service: Deactivated successfully. Sep 5 06:20:14.569107 systemd[1]: session-4.scope: Deactivated successfully. Sep 5 06:20:14.570016 systemd-logind[1570]: Session 4 logged out. Waiting for processes to exit. Sep 5 06:20:14.573660 systemd[1]: Started sshd@4-10.0.0.140:22-10.0.0.1:42358.service - OpenSSH per-connection server daemon (10.0.0.1:42358). Sep 5 06:20:14.574236 systemd-logind[1570]: Removed session 4. Sep 5 06:20:14.633559 sshd[1754]: Accepted publickey for core from 10.0.0.1 port 42358 ssh2: RSA SHA256:HqMeIfrf4KQXAwx0HmR8sqFIxN2NpL8j8iF6H1odrA8 Sep 5 06:20:14.635135 sshd-session[1754]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 06:20:14.640035 systemd-logind[1570]: New session 5 of user core. Sep 5 06:20:14.656031 systemd[1]: Started session-5.scope - Session 5 of User core. 
Sep 5 06:20:14.716661 sudo[1759]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 5 06:20:14.716997 sudo[1759]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 5 06:20:14.736585 sudo[1759]: pam_unix(sudo:session): session closed for user root Sep 5 06:20:14.738212 sshd[1758]: Connection closed by 10.0.0.1 port 42358 Sep 5 06:20:14.738605 sshd-session[1754]: pam_unix(sshd:session): session closed for user core Sep 5 06:20:14.752353 systemd[1]: sshd@4-10.0.0.140:22-10.0.0.1:42358.service: Deactivated successfully. Sep 5 06:20:14.754146 systemd[1]: session-5.scope: Deactivated successfully. Sep 5 06:20:14.754978 systemd-logind[1570]: Session 5 logged out. Waiting for processes to exit. Sep 5 06:20:14.757739 systemd[1]: Started sshd@5-10.0.0.140:22-10.0.0.1:42368.service - OpenSSH per-connection server daemon (10.0.0.1:42368). Sep 5 06:20:14.758508 systemd-logind[1570]: Removed session 5. Sep 5 06:20:14.820771 sshd[1765]: Accepted publickey for core from 10.0.0.1 port 42368 ssh2: RSA SHA256:HqMeIfrf4KQXAwx0HmR8sqFIxN2NpL8j8iF6H1odrA8 Sep 5 06:20:14.822339 sshd-session[1765]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 06:20:14.827262 systemd-logind[1570]: New session 6 of user core. Sep 5 06:20:14.834988 systemd[1]: Started session-6.scope - Session 6 of User core. Sep 5 06:20:14.890674 sudo[1770]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 5 06:20:14.891525 sudo[1770]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 5 06:20:15.134041 sudo[1770]: pam_unix(sudo:session): session closed for user root Sep 5 06:20:15.141541 sudo[1769]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Sep 5 06:20:15.141931 sudo[1769]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 5 06:20:15.156034 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 5 06:20:15.215446 augenrules[1792]: No rules Sep 5 06:20:15.216583 systemd[1]: audit-rules.service: Deactivated successfully. Sep 5 06:20:15.216886 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 5 06:20:15.218167 sudo[1769]: pam_unix(sudo:session): session closed for user root Sep 5 06:20:15.220075 sshd[1768]: Connection closed by 10.0.0.1 port 42368 Sep 5 06:20:15.220532 sshd-session[1765]: pam_unix(sshd:session): session closed for user core Sep 5 06:20:15.229553 systemd[1]: sshd@5-10.0.0.140:22-10.0.0.1:42368.service: Deactivated successfully. Sep 5 06:20:15.231444 systemd[1]: session-6.scope: Deactivated successfully. Sep 5 06:20:15.232510 systemd-logind[1570]: Session 6 logged out. Waiting for processes to exit. Sep 5 06:20:15.235346 systemd[1]: Started sshd@6-10.0.0.140:22-10.0.0.1:42380.service - OpenSSH per-connection server daemon (10.0.0.1:42380). Sep 5 06:20:15.236380 systemd-logind[1570]: Removed session 6. Sep 5 06:20:15.300715 sshd[1801]: Accepted publickey for core from 10.0.0.1 port 42380 ssh2: RSA SHA256:HqMeIfrf4KQXAwx0HmR8sqFIxN2NpL8j8iF6H1odrA8 Sep 5 06:20:15.302660 sshd-session[1801]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 06:20:15.308162 systemd-logind[1570]: New session 7 of user core. Sep 5 06:20:15.318165 systemd[1]: Started session-7.scope - Session 7 of User core. 
Sep 5 06:20:15.376349 sudo[1805]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 5 06:20:15.376778 sudo[1805]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 5 06:20:16.192175 systemd[1]: Starting docker.service - Docker Application Container Engine... Sep 5 06:20:16.217228 (dockerd)[1825]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 5 06:20:16.844042 dockerd[1825]: time="2025-09-05T06:20:16.843920108Z" level=info msg="Starting up" Sep 5 06:20:16.844969 dockerd[1825]: time="2025-09-05T06:20:16.844918359Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Sep 5 06:20:16.867736 dockerd[1825]: time="2025-09-05T06:20:16.867677215Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Sep 5 06:20:20.196580 dockerd[1825]: time="2025-09-05T06:20:20.196520421Z" level=info msg="Loading containers: start." Sep 5 06:20:20.342864 kernel: Initializing XFRM netlink socket Sep 5 06:20:20.993045 systemd-networkd[1494]: docker0: Link UP Sep 5 06:20:21.182612 dockerd[1825]: time="2025-09-05T06:20:21.182538189Z" level=info msg="Loading containers: done." Sep 5 06:20:21.199482 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck1626168940-merged.mount: Deactivated successfully. Sep 5 06:20:21.312144 dockerd[1825]: time="2025-09-05T06:20:21.311946782Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 5 06:20:21.312144 dockerd[1825]: time="2025-09-05T06:20:21.312104711Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Sep 5 06:20:21.312664 dockerd[1825]: time="2025-09-05T06:20:21.312304451Z" level=info msg="Initializing buildkit" Sep 5 06:20:21.358345 dockerd[1825]: time="2025-09-05T06:20:21.358280256Z" level=info msg="Completed buildkit initialization" Sep 5 06:20:21.364336 dockerd[1825]: time="2025-09-05T06:20:21.364273881Z" level=info msg="Daemon has completed initialization" Sep 5 06:20:21.364520 dockerd[1825]: time="2025-09-05T06:20:21.364425493Z" level=info msg="API listen on /run/docker.sock" Sep 5 06:20:21.364714 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 5 06:20:22.775100 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 5 06:20:22.778344 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 5 06:20:23.016098 containerd[1585]: time="2025-09-05T06:20:23.016020185Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.4\"" Sep 5 06:20:23.110970 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Sep 5 06:20:23.115096 (kubelet)[2051]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 5 06:20:23.353784 kubelet[2051]: E0905 06:20:23.353666 2051 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 5 06:20:23.360649 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 5 06:20:23.360885 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 5 06:20:23.361279 systemd[1]: kubelet.service: Consumed 333ms CPU time, 111.8M memory peak. Sep 5 06:20:24.381799 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3849643483.mount: Deactivated successfully. Sep 5 06:20:26.262154 containerd[1585]: time="2025-09-05T06:20:26.262063425Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:20:26.368275 containerd[1585]: time="2025-09-05T06:20:26.368179767Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.4: active requests=0, bytes read=30078664" Sep 5 06:20:26.413411 containerd[1585]: time="2025-09-05T06:20:26.413314145Z" level=info msg="ImageCreate event name:\"sha256:1f41885d0a91155d5a5e670b2862eed338c7f12b0e8a5bbc88b1ab4a2d505ae8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:20:26.430388 containerd[1585]: time="2025-09-05T06:20:26.430306623Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:0d441d0d347145b3f02f20cb313239cdae86067643d7f70803fab8bac2d28876\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:20:26.431838 containerd[1585]: time="2025-09-05T06:20:26.431731534Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.4\" with image id \"sha256:1f41885d0a91155d5a5e670b2862eed338c7f12b0e8a5bbc88b1ab4a2d505ae8\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.4\", repo digest \"registry.k8s.io/kube-apiserver@sha256:0d441d0d347145b3f02f20cb313239cdae86067643d7f70803fab8bac2d28876\", size \"30075464\" in 3.415631022s" Sep 5 06:20:26.432032 containerd[1585]: time="2025-09-05T06:20:26.431860043Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.4\" returns image reference \"sha256:1f41885d0a91155d5a5e670b2862eed338c7f12b0e8a5bbc88b1ab4a2d505ae8\"" Sep 5 06:20:26.433124 containerd[1585]: time="2025-09-05T06:20:26.433098721Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.4\"" Sep 5 06:20:30.942795 containerd[1585]: time="2025-09-05T06:20:30.942740967Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:20:30.943743 containerd[1585]: time="2025-09-05T06:20:30.943713227Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.4: active requests=0, bytes read=26018066" Sep 5 06:20:30.945348 containerd[1585]: time="2025-09-05T06:20:30.945275132Z" level=info msg="ImageCreate event name:\"sha256:358ab71c1a1ea4846ad0b3dff0d9db6b124236b64bc8a6b79dc874f65dc0d492\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:20:30.948380 containerd[1585]: time="2025-09-05T06:20:30.948346982Z" level=info msg="ImageCreate 
event name:\"registry.k8s.io/kube-controller-manager@sha256:bd22c2af2f30a8f818568b4d5fe131098fdd38267e9e07872cfc33e8f5876bc3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:20:30.953311 containerd[1585]: time="2025-09-05T06:20:30.949710240Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.4\" with image id \"sha256:358ab71c1a1ea4846ad0b3dff0d9db6b124236b64bc8a6b79dc874f65dc0d492\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.4\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:bd22c2af2f30a8f818568b4d5fe131098fdd38267e9e07872cfc33e8f5876bc3\", size \"27646961\" in 4.516577014s" Sep 5 06:20:30.953362 containerd[1585]: time="2025-09-05T06:20:30.953312796Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.4\" returns image reference \"sha256:358ab71c1a1ea4846ad0b3dff0d9db6b124236b64bc8a6b79dc874f65dc0d492\"" Sep 5 06:20:30.953921 containerd[1585]: time="2025-09-05T06:20:30.953875974Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.4\"" Sep 5 06:20:32.214430 containerd[1585]: time="2025-09-05T06:20:32.214380601Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:20:32.215238 containerd[1585]: time="2025-09-05T06:20:32.215184585Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.4: active requests=0, bytes read=20153911" Sep 5 06:20:32.216383 containerd[1585]: time="2025-09-05T06:20:32.216347675Z" level=info msg="ImageCreate event name:\"sha256:ab4ad8a84c3c69c18494ef32fa087b32f7c44d71e6acba463d2c7dda798c3d66\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:20:32.218948 containerd[1585]: time="2025-09-05T06:20:32.218900781Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:71533e5a960e2955a54164905e92dac516ec874a23e0bf31304db82650101a4a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:20:32.220067 containerd[1585]: time="2025-09-05T06:20:32.220036182Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.4\" with image id \"sha256:ab4ad8a84c3c69c18494ef32fa087b32f7c44d71e6acba463d2c7dda798c3d66\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.4\", repo digest \"registry.k8s.io/kube-scheduler@sha256:71533e5a960e2955a54164905e92dac516ec874a23e0bf31304db82650101a4a\", size \"21782824\" in 1.265922969s" Sep 5 06:20:32.220067 containerd[1585]: time="2025-09-05T06:20:32.220067534Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.4\" returns image reference \"sha256:ab4ad8a84c3c69c18494ef32fa087b32f7c44d71e6acba463d2c7dda798c3d66\"" Sep 5 06:20:32.220583 containerd[1585]: time="2025-09-05T06:20:32.220557494Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.4\"" Sep 5 06:20:33.611469 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Sep 5 06:20:33.613521 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 5 06:20:33.873722 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Sep 5 06:20:33.900409 (kubelet)[2129]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 5 06:20:34.801845 kubelet[2129]: E0905 06:20:34.801780 2129 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 5 06:20:34.806120 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 5 06:20:34.806314 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 5 06:20:34.806714 systemd[1]: kubelet.service: Consumed 342ms CPU time, 109.3M memory peak. Sep 5 06:20:35.325845 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount98429534.mount: Deactivated successfully. Sep 5 06:20:37.074056 containerd[1585]: time="2025-09-05T06:20:37.073896979Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:20:37.076761 containerd[1585]: time="2025-09-05T06:20:37.076681296Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.4: active requests=0, bytes read=31899626" Sep 5 06:20:37.078353 containerd[1585]: time="2025-09-05T06:20:37.078293135Z" level=info msg="ImageCreate event name:\"sha256:1b2ea5e018dbbbd2efb8e5c540a6d3c463d77f250d3904429402ee057f09c64e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:20:37.080681 containerd[1585]: time="2025-09-05T06:20:37.080627652Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:bb04e9247da3aaeb96406b4d530a79fc865695b6807353dd1a28871df0d7f837\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:20:37.081370 containerd[1585]: time="2025-09-05T06:20:37.081297935Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.4\" with image id \"sha256:1b2ea5e018dbbbd2efb8e5c540a6d3c463d77f250d3904429402ee057f09c64e\", repo tag \"registry.k8s.io/kube-proxy:v1.33.4\", repo digest \"registry.k8s.io/kube-proxy@sha256:bb04e9247da3aaeb96406b4d530a79fc865695b6807353dd1a28871df0d7f837\", size \"31898645\" in 4.860706901s" Sep 5 06:20:37.081370 containerd[1585]: time="2025-09-05T06:20:37.081350489Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.4\" returns image reference \"sha256:1b2ea5e018dbbbd2efb8e5c540a6d3c463d77f250d3904429402ee057f09c64e\"" Sep 5 06:20:37.082311 containerd[1585]: time="2025-09-05T06:20:37.082099478Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" Sep 5 06:20:38.556029 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount759578328.mount: Deactivated successfully. 
Sep 5 06:20:40.486564 containerd[1585]: time="2025-09-05T06:20:40.486475121Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:20:40.487783 containerd[1585]: time="2025-09-05T06:20:40.487697048Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=20942238" Sep 5 06:20:40.489569 containerd[1585]: time="2025-09-05T06:20:40.489529541Z" level=info msg="ImageCreate event name:\"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:20:40.492718 containerd[1585]: time="2025-09-05T06:20:40.492680547Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:20:40.493876 containerd[1585]: time="2025-09-05T06:20:40.493834216Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"20939036\" in 3.411677589s" Sep 5 06:20:40.493876 containerd[1585]: time="2025-09-05T06:20:40.493871096Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\"" Sep 5 06:20:40.494540 containerd[1585]: time="2025-09-05T06:20:40.494512518Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Sep 5 06:20:41.809340 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount681742923.mount: Deactivated successfully. 
Sep 5 06:20:41.816856 containerd[1585]: time="2025-09-05T06:20:41.816800537Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 5 06:20:41.817620 containerd[1585]: time="2025-09-05T06:20:41.817579273Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" Sep 5 06:20:41.818937 containerd[1585]: time="2025-09-05T06:20:41.818898059Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 5 06:20:41.821600 containerd[1585]: time="2025-09-05T06:20:41.821514783Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 5 06:20:41.822517 containerd[1585]: time="2025-09-05T06:20:41.822442563Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 1.327896994s" Sep 5 06:20:41.822517 containerd[1585]: time="2025-09-05T06:20:41.822495332Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Sep 5 06:20:41.823145 containerd[1585]: time="2025-09-05T06:20:41.823095341Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\"" Sep 5 06:20:42.406308 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2097424307.mount: Deactivated successfully. 
Sep 5 06:20:44.159898 containerd[1585]: time="2025-09-05T06:20:44.159803254Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:20:44.160566 containerd[1585]: time="2025-09-05T06:20:44.160510502Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=58377871" Sep 5 06:20:44.161874 containerd[1585]: time="2025-09-05T06:20:44.161839662Z" level=info msg="ImageCreate event name:\"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:20:44.164579 containerd[1585]: time="2025-09-05T06:20:44.164519032Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:20:44.165710 containerd[1585]: time="2025-09-05T06:20:44.165677049Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"58938593\" in 2.342531154s" Sep 5 06:20:44.165710 containerd[1585]: time="2025-09-05T06:20:44.165710409Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\"" Sep 5 06:20:44.870718 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Sep 5 06:20:44.873120 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 5 06:20:45.087272 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 5 06:20:45.112130 (kubelet)[2288]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 5 06:20:45.151926 kubelet[2288]: E0905 06:20:45.151747 2288 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 5 06:20:45.156595 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 5 06:20:45.156825 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 5 06:20:45.157281 systemd[1]: kubelet.service: Consumed 226ms CPU time, 110M memory peak. Sep 5 06:20:46.807281 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 5 06:20:46.807500 systemd[1]: kubelet.service: Consumed 226ms CPU time, 110M memory peak. Sep 5 06:20:46.810346 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 5 06:20:46.842923 systemd[1]: Reload requested from client PID 2304 ('systemctl') (unit session-7.scope)... Sep 5 06:20:46.842944 systemd[1]: Reloading... Sep 5 06:20:46.946896 zram_generator::config[2350]: No configuration found. Sep 5 06:20:48.035303 systemd[1]: Reloading finished in 1191 ms. Sep 5 06:20:48.122042 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Sep 5 06:20:48.122173 systemd[1]: kubelet.service: Failed with result 'signal'. 
Sep 5 06:20:48.122557 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 5 06:20:48.122610 systemd[1]: kubelet.service: Consumed 172ms CPU time, 98.2M memory peak. Sep 5 06:20:48.124563 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 5 06:20:48.380638 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 5 06:20:48.398469 (kubelet)[2395]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 5 06:20:48.445958 kubelet[2395]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 5 06:20:48.445958 kubelet[2395]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 5 06:20:48.445958 kubelet[2395]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 5 06:20:48.446483 kubelet[2395]: I0905 06:20:48.446013 2395 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 5 06:20:49.609559 kubelet[2395]: I0905 06:20:49.609464 2395 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Sep 5 06:20:49.609559 kubelet[2395]: I0905 06:20:49.609527 2395 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 5 06:20:49.610265 kubelet[2395]: I0905 06:20:49.609977 2395 server.go:956] "Client rotation is on, will bootstrap in background" Sep 5 06:20:50.546108 kubelet[2395]: E0905 06:20:50.546019 2395 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.0.0.140:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.140:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Sep 5 06:20:50.546330 kubelet[2395]: I0905 06:20:50.546264 2395 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 5 06:20:50.553052 kubelet[2395]: I0905 06:20:50.553022 2395 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 5 06:20:50.561042 kubelet[2395]: I0905 06:20:50.560993 2395 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 5 06:20:50.561404 kubelet[2395]: I0905 06:20:50.561347 2395 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 5 06:20:50.561694 kubelet[2395]: I0905 06:20:50.561387 2395 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 5 06:20:50.561694 kubelet[2395]: I0905 06:20:50.561632 2395 topology_manager.go:138] "Creating topology manager with none policy" Sep 5 06:20:50.561694 kubelet[2395]: I0905 06:20:50.561643 2395 container_manager_linux.go:303] "Creating device plugin manager" Sep 5 06:20:50.562554 kubelet[2395]: I0905 06:20:50.562514 2395 state_mem.go:36] "Initialized new in-memory state store" Sep 5 06:20:50.564790 kubelet[2395]: I0905 06:20:50.564758 2395 kubelet.go:480] "Attempting to sync node with API server" Sep 5 06:20:50.564866 kubelet[2395]: I0905 06:20:50.564841 2395 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 5 06:20:50.564899 kubelet[2395]: I0905 06:20:50.564879 2395 kubelet.go:386] "Adding apiserver pod source" Sep 5 06:20:50.564930 kubelet[2395]: I0905 06:20:50.564921 2395 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 5 06:20:50.573567 kubelet[2395]: I0905 06:20:50.573418 2395 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 5 06:20:50.574358 kubelet[2395]: I0905 06:20:50.574290 2395 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Sep 5 06:20:50.575256 kubelet[2395]: E0905 06:20:50.575175 2395 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.0.140:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.140:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Sep 5 
06:20:50.575389 kubelet[2395]: W0905 06:20:50.575370 2395 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Sep 5 06:20:50.576515 kubelet[2395]: E0905 06:20:50.576475 2395 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.0.140:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.140:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Sep 5 06:20:50.579918 kubelet[2395]: I0905 06:20:50.579891 2395 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 5 06:20:50.579989 kubelet[2395]: I0905 06:20:50.579948 2395 server.go:1289] "Started kubelet" Sep 5 06:20:50.580186 kubelet[2395]: I0905 06:20:50.580124 2395 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 5 06:20:50.582965 kubelet[2395]: I0905 06:20:50.582940 2395 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 5 06:20:50.583096 kubelet[2395]: I0905 06:20:50.583064 2395 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 5 06:20:50.583191 kubelet[2395]: I0905 06:20:50.582952 2395 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 5 06:20:50.584306 kubelet[2395]: I0905 06:20:50.584275 2395 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Sep 5 06:20:50.585186 kubelet[2395]: I0905 06:20:50.585156 2395 server.go:317] "Adding debug handlers to kubelet server" Sep 5 06:20:50.587068 kubelet[2395]: I0905 06:20:50.585964 2395 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 5 06:20:50.587068 kubelet[2395]: I0905 06:20:50.586038 2395 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 5 06:20:50.587068 kubelet[2395]: I0905 06:20:50.586081 2395 reconciler.go:26] "Reconciler: start to sync state" Sep 5 06:20:50.587068 kubelet[2395]: E0905 06:20:50.586421 2395 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.0.140:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.140:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Sep 5 06:20:50.587068 kubelet[2395]: E0905 06:20:50.584014 2395 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.140:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.140:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.18624e9f4442e488 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-09-05 06:20:50.579915912 +0000 UTC m=+2.176133504,LastTimestamp:2025-09-05 06:20:50.579915912 +0000 UTC m=+2.176133504,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Sep 5 06:20:50.587068 kubelet[2395]: E0905 06:20:50.587038 2395 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 5 
06:20:50.587668 kubelet[2395]: E0905 06:20:50.587622 2395 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.140:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.140:6443: connect: connection refused" interval="200ms" Sep 5 06:20:50.587909 kubelet[2395]: E0905 06:20:50.587873 2395 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 5 06:20:50.587963 kubelet[2395]: I0905 06:20:50.587919 2395 factory.go:223] Registration of the containerd container factory successfully Sep 5 06:20:50.587963 kubelet[2395]: I0905 06:20:50.587933 2395 factory.go:223] Registration of the systemd container factory successfully Sep 5 06:20:50.588042 kubelet[2395]: I0905 06:20:50.588019 2395 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 5 06:20:50.604448 kubelet[2395]: I0905 06:20:50.604420 2395 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 5 06:20:50.604448 kubelet[2395]: I0905 06:20:50.604436 2395 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 5 06:20:50.604448 kubelet[2395]: I0905 06:20:50.604454 2395 state_mem.go:36] "Initialized new in-memory state store" Sep 5 06:20:50.607729 kubelet[2395]: I0905 06:20:50.607676 2395 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Sep 5 06:20:50.608252 kubelet[2395]: I0905 06:20:50.608219 2395 policy_none.go:49] "None policy: Start" Sep 5 06:20:50.608252 kubelet[2395]: I0905 06:20:50.608248 2395 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 5 06:20:50.608338 kubelet[2395]: I0905 06:20:50.608263 2395 state_mem.go:35] "Initializing new in-memory state store" Sep 5 06:20:50.609829 kubelet[2395]: I0905 06:20:50.609547 2395 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Sep 5 06:20:50.609829 kubelet[2395]: I0905 06:20:50.609572 2395 status_manager.go:230] "Starting to sync pod status with apiserver" Sep 5 06:20:50.609829 kubelet[2395]: I0905 06:20:50.609594 2395 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Sep 5 06:20:50.609829 kubelet[2395]: I0905 06:20:50.609603 2395 kubelet.go:2436] "Starting kubelet main sync loop" Sep 5 06:20:50.609829 kubelet[2395]: E0905 06:20:50.609687 2395 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 5 06:20:50.612330 kubelet[2395]: E0905 06:20:50.612292 2395 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.0.140:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.140:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Sep 5 06:20:50.619744 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 5 06:20:50.632175 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 5 06:20:50.635592 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Sep 5 06:20:50.658295 kubelet[2395]: E0905 06:20:50.658249 2395 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Sep 5 06:20:50.658777 kubelet[2395]: I0905 06:20:50.658760 2395 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 5 06:20:50.658881 kubelet[2395]: I0905 06:20:50.658776 2395 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 5 06:20:50.659558 kubelet[2395]: I0905 06:20:50.659040 2395 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 5 06:20:50.659937 kubelet[2395]: E0905 06:20:50.659904 2395 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Sep 5 06:20:50.660059 kubelet[2395]: E0905 06:20:50.660042 2395 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Sep 5 06:20:50.722645 systemd[1]: Created slice kubepods-burstable-pod733869d080fc73b0d4a8990046979896.slice - libcontainer container kubepods-burstable-pod733869d080fc73b0d4a8990046979896.slice. Sep 5 06:20:50.731830 kubelet[2395]: E0905 06:20:50.731779 2395 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 5 06:20:50.735775 systemd[1]: Created slice kubepods-burstable-pod8de7187202bee21b84740a213836f615.slice - libcontainer container kubepods-burstable-pod8de7187202bee21b84740a213836f615.slice. Sep 5 06:20:50.737625 kubelet[2395]: E0905 06:20:50.737595 2395 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 5 06:20:50.749928 systemd[1]: Created slice kubepods-burstable-podd75e6f6978d9f275ea19380916c9cccd.slice - libcontainer container kubepods-burstable-podd75e6f6978d9f275ea19380916c9cccd.slice. 
Sep 5 06:20:50.751740 kubelet[2395]: E0905 06:20:50.751706 2395 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 5 06:20:50.761028 kubelet[2395]: I0905 06:20:50.760990 2395 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 5 06:20:50.761353 kubelet[2395]: E0905 06:20:50.761331 2395 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.140:6443/api/v1/nodes\": dial tcp 10.0.0.140:6443: connect: connection refused" node="localhost" Sep 5 06:20:50.788255 kubelet[2395]: E0905 06:20:50.788185 2395 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.140:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.140:6443: connect: connection refused" interval="400ms" Sep 5 06:20:50.887623 kubelet[2395]: I0905 06:20:50.887582 2395 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/733869d080fc73b0d4a8990046979896-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"733869d080fc73b0d4a8990046979896\") " pod="kube-system/kube-apiserver-localhost" Sep 5 06:20:50.887623 kubelet[2395]: I0905 06:20:50.887622 2395 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/733869d080fc73b0d4a8990046979896-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"733869d080fc73b0d4a8990046979896\") " pod="kube-system/kube-apiserver-localhost" Sep 5 06:20:50.887780 kubelet[2395]: I0905 06:20:50.887640 2395 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/733869d080fc73b0d4a8990046979896-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"733869d080fc73b0d4a8990046979896\") " pod="kube-system/kube-apiserver-localhost" Sep 5 06:20:50.887780 kubelet[2395]: I0905 06:20:50.887658 2395 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Sep 5 06:20:50.887780 kubelet[2395]: I0905 06:20:50.887704 2395 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Sep 5 06:20:50.887780 kubelet[2395]: I0905 06:20:50.887726 2395 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Sep 5 06:20:50.887780 kubelet[2395]: I0905 06:20:50.887745 2395 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: 
\"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Sep 5 06:20:50.887916 kubelet[2395]: I0905 06:20:50.887764 2395 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Sep 5 06:20:50.887916 kubelet[2395]: I0905 06:20:50.887783 2395 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d75e6f6978d9f275ea19380916c9cccd-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"d75e6f6978d9f275ea19380916c9cccd\") " pod="kube-system/kube-scheduler-localhost" Sep 5 06:20:50.963730 kubelet[2395]: I0905 06:20:50.963685 2395 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 5 06:20:50.964086 kubelet[2395]: E0905 06:20:50.964046 2395 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.140:6443/api/v1/nodes\": dial tcp 10.0.0.140:6443: connect: connection refused" node="localhost" Sep 5 06:20:51.034130 containerd[1585]: time="2025-09-05T06:20:51.034060730Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:733869d080fc73b0d4a8990046979896,Namespace:kube-system,Attempt:0,}" Sep 5 06:20:51.038690 containerd[1585]: time="2025-09-05T06:20:51.038644192Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:8de7187202bee21b84740a213836f615,Namespace:kube-system,Attempt:0,}" Sep 5 06:20:51.053681 containerd[1585]: time="2025-09-05T06:20:51.053532702Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:d75e6f6978d9f275ea19380916c9cccd,Namespace:kube-system,Attempt:0,}" Sep 5 06:20:51.077042 containerd[1585]: time="2025-09-05T06:20:51.076869654Z" level=info msg="connecting to shim 58d983655cb81490bfc39660a9d9ca7bbc97a5385afd934f1994bb90a4534430" address="unix:///run/containerd/s/464ffd1681f2f5f8ceed0f4584449a9872b37db2d8ddbd4fb25400364431aeb4" namespace=k8s.io protocol=ttrpc version=3 Sep 5 06:20:51.082876 containerd[1585]: time="2025-09-05T06:20:51.082824794Z" level=info msg="connecting to shim 8d520782aeb5033ac6500327ca635ea125ebc97aaa36cc9567db6874eda0528d" address="unix:///run/containerd/s/92525c51b7a69b3f48c49d7c74ff5caf8f4a699dac0790508d4c3b6eb3a93f03" namespace=k8s.io protocol=ttrpc version=3 Sep 5 06:20:51.136652 containerd[1585]: time="2025-09-05T06:20:51.136589545Z" level=info msg="connecting to shim f9a405e67ad804e483bf478f78017e9f8f92646a33f9671e03929496f4e0a4fe" address="unix:///run/containerd/s/ec82f8976d7db02d63a2e097783845a8d88ed6d8e74ea6c7934b24e9d80d71c2" namespace=k8s.io protocol=ttrpc version=3 Sep 5 06:20:51.164154 systemd[1]: Started cri-containerd-58d983655cb81490bfc39660a9d9ca7bbc97a5385afd934f1994bb90a4534430.scope - libcontainer container 58d983655cb81490bfc39660a9d9ca7bbc97a5385afd934f1994bb90a4534430. Sep 5 06:20:51.169260 systemd[1]: Started cri-containerd-8d520782aeb5033ac6500327ca635ea125ebc97aaa36cc9567db6874eda0528d.scope - libcontainer container 8d520782aeb5033ac6500327ca635ea125ebc97aaa36cc9567db6874eda0528d. 
Sep 5 06:20:51.172098 systemd[1]: Started cri-containerd-f9a405e67ad804e483bf478f78017e9f8f92646a33f9671e03929496f4e0a4fe.scope - libcontainer container f9a405e67ad804e483bf478f78017e9f8f92646a33f9671e03929496f4e0a4fe. Sep 5 06:20:51.223489 kubelet[2395]: E0905 06:20:51.223284 2395 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.140:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.140:6443: connect: connection refused" interval="800ms" Sep 5 06:20:51.306395 containerd[1585]: time="2025-09-05T06:20:51.306335128Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:733869d080fc73b0d4a8990046979896,Namespace:kube-system,Attempt:0,} returns sandbox id \"58d983655cb81490bfc39660a9d9ca7bbc97a5385afd934f1994bb90a4534430\"" Sep 5 06:20:51.309905 containerd[1585]: time="2025-09-05T06:20:51.309619306Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:8de7187202bee21b84740a213836f615,Namespace:kube-system,Attempt:0,} returns sandbox id \"8d520782aeb5033ac6500327ca635ea125ebc97aaa36cc9567db6874eda0528d\"" Sep 5 06:20:51.314417 containerd[1585]: time="2025-09-05T06:20:51.314375004Z" level=info msg="CreateContainer within sandbox \"58d983655cb81490bfc39660a9d9ca7bbc97a5385afd934f1994bb90a4534430\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 5 06:20:51.316972 containerd[1585]: time="2025-09-05T06:20:51.316929144Z" level=info msg="CreateContainer within sandbox \"8d520782aeb5033ac6500327ca635ea125ebc97aaa36cc9567db6874eda0528d\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 5 06:20:51.352423 containerd[1585]: time="2025-09-05T06:20:51.352342838Z" level=info msg="Container 024d7ee1468c2d811306d8312f52ff2b75a1873ec349d2a172386ce36949e3fd: CDI devices from CRI Config.CDIDevices: []" Sep 5 06:20:51.352571 containerd[1585]: time="2025-09-05T06:20:51.352427047Z" level=info msg="Container 74953e10cfd45136c554455e190bae26ccfa5325e51bc2474528575ef47d10dd: CDI devices from CRI Config.CDIDevices: []" Sep 5 06:20:51.364562 containerd[1585]: time="2025-09-05T06:20:51.364516096Z" level=info msg="CreateContainer within sandbox \"8d520782aeb5033ac6500327ca635ea125ebc97aaa36cc9567db6874eda0528d\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"024d7ee1468c2d811306d8312f52ff2b75a1873ec349d2a172386ce36949e3fd\"" Sep 5 06:20:51.364788 containerd[1585]: time="2025-09-05T06:20:51.364752617Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:d75e6f6978d9f275ea19380916c9cccd,Namespace:kube-system,Attempt:0,} returns sandbox id \"f9a405e67ad804e483bf478f78017e9f8f92646a33f9671e03929496f4e0a4fe\"" Sep 5 06:20:51.365394 containerd[1585]: time="2025-09-05T06:20:51.365357224Z" level=info msg="StartContainer for \"024d7ee1468c2d811306d8312f52ff2b75a1873ec349d2a172386ce36949e3fd\"" Sep 5 06:20:51.365894 kubelet[2395]: I0905 06:20:51.365861 2395 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 5 06:20:51.366212 kubelet[2395]: E0905 06:20:51.366190 2395 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.140:6443/api/v1/nodes\": dial tcp 10.0.0.140:6443: connect: connection refused" node="localhost" Sep 5 06:20:51.366952 containerd[1585]: time="2025-09-05T06:20:51.366907461Z" level=info msg="connecting to shim 
024d7ee1468c2d811306d8312f52ff2b75a1873ec349d2a172386ce36949e3fd" address="unix:///run/containerd/s/92525c51b7a69b3f48c49d7c74ff5caf8f4a699dac0790508d4c3b6eb3a93f03" protocol=ttrpc version=3 Sep 5 06:20:51.370011 containerd[1585]: time="2025-09-05T06:20:51.369983913Z" level=info msg="CreateContainer within sandbox \"f9a405e67ad804e483bf478f78017e9f8f92646a33f9671e03929496f4e0a4fe\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 5 06:20:51.370952 containerd[1585]: time="2025-09-05T06:20:51.370919411Z" level=info msg="CreateContainer within sandbox \"58d983655cb81490bfc39660a9d9ca7bbc97a5385afd934f1994bb90a4534430\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"74953e10cfd45136c554455e190bae26ccfa5325e51bc2474528575ef47d10dd\"" Sep 5 06:20:51.371239 containerd[1585]: time="2025-09-05T06:20:51.371200331Z" level=info msg="StartContainer for \"74953e10cfd45136c554455e190bae26ccfa5325e51bc2474528575ef47d10dd\"" Sep 5 06:20:51.372148 containerd[1585]: time="2025-09-05T06:20:51.372121007Z" level=info msg="connecting to shim 74953e10cfd45136c554455e190bae26ccfa5325e51bc2474528575ef47d10dd" address="unix:///run/containerd/s/464ffd1681f2f5f8ceed0f4584449a9872b37db2d8ddbd4fb25400364431aeb4" protocol=ttrpc version=3 Sep 5 06:20:51.381325 containerd[1585]: time="2025-09-05T06:20:51.380856924Z" level=info msg="Container 35c384d66625ba34bccb276597d5bf64d604d6b7af06ac2101e1b03707ea9a9a: CDI devices from CRI Config.CDIDevices: []" Sep 5 06:20:51.393047 containerd[1585]: time="2025-09-05T06:20:51.392990383Z" level=info msg="CreateContainer within sandbox \"f9a405e67ad804e483bf478f78017e9f8f92646a33f9671e03929496f4e0a4fe\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"35c384d66625ba34bccb276597d5bf64d604d6b7af06ac2101e1b03707ea9a9a\"" Sep 5 06:20:51.393584 containerd[1585]: time="2025-09-05T06:20:51.393542842Z" level=info msg="StartContainer for \"35c384d66625ba34bccb276597d5bf64d604d6b7af06ac2101e1b03707ea9a9a\"" Sep 5 06:20:51.394095 systemd[1]: Started cri-containerd-024d7ee1468c2d811306d8312f52ff2b75a1873ec349d2a172386ce36949e3fd.scope - libcontainer container 024d7ee1468c2d811306d8312f52ff2b75a1873ec349d2a172386ce36949e3fd. Sep 5 06:20:51.399229 systemd[1]: Started cri-containerd-74953e10cfd45136c554455e190bae26ccfa5325e51bc2474528575ef47d10dd.scope - libcontainer container 74953e10cfd45136c554455e190bae26ccfa5325e51bc2474528575ef47d10dd. 
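
While the API server at 10.0.0.140:6443 is still refusing connections, the node-lease controller keeps retrying, and the interval it logs doubles on each failure: 200ms, then 400ms, then 800ms in the entries above. A Python sketch of that doubling; the ceiling used here is a placeholder and is not something this log demonstrates:

    # Doubling retry interval, as suggested by the "will retry ... interval" messages above.
    # The 7.0s ceiling is a hypothetical cap chosen for the sketch, not observed in this log.
    def retry_intervals(start=0.2, factor=2.0, ceiling=7.0, attempts=6):
        interval = start
        for _ in range(attempts):
            yield interval
            interval = min(interval * factor, ceiling)

    print([f"{i:.1f}s" for i in retry_intervals()])
    # ['0.2s', '0.4s', '0.8s', '1.6s', '3.2s', '6.4s']
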
Sep 5 06:20:51.401196 kubelet[2395]: E0905 06:20:51.401151 2395 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.0.140:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.140:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Sep 5 06:20:51.405615 kubelet[2395]: E0905 06:20:51.405539 2395 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.0.140:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.140:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Sep 5 06:20:51.427430 containerd[1585]: time="2025-09-05T06:20:51.427302967Z" level=info msg="connecting to shim 35c384d66625ba34bccb276597d5bf64d604d6b7af06ac2101e1b03707ea9a9a" address="unix:///run/containerd/s/ec82f8976d7db02d63a2e097783845a8d88ed6d8e74ea6c7934b24e9d80d71c2" protocol=ttrpc version=3 Sep 5 06:20:51.456137 kubelet[2395]: E0905 06:20:51.456077 2395 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.0.140:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.140:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Sep 5 06:20:51.456956 systemd[1]: Started cri-containerd-35c384d66625ba34bccb276597d5bf64d604d6b7af06ac2101e1b03707ea9a9a.scope - libcontainer container 35c384d66625ba34bccb276597d5bf64d604d6b7af06ac2101e1b03707ea9a9a. Sep 5 06:20:51.484179 containerd[1585]: time="2025-09-05T06:20:51.484132163Z" level=info msg="StartContainer for \"74953e10cfd45136c554455e190bae26ccfa5325e51bc2474528575ef47d10dd\" returns successfully" Sep 5 06:20:51.497977 kubelet[2395]: E0905 06:20:51.497908 2395 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.0.140:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.140:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Sep 5 06:20:51.519319 containerd[1585]: time="2025-09-05T06:20:51.519262702Z" level=info msg="StartContainer for \"024d7ee1468c2d811306d8312f52ff2b75a1873ec349d2a172386ce36949e3fd\" returns successfully" Sep 5 06:20:51.578861 containerd[1585]: time="2025-09-05T06:20:51.572962688Z" level=info msg="StartContainer for \"35c384d66625ba34bccb276597d5bf64d604d6b7af06ac2101e1b03707ea9a9a\" returns successfully" Sep 5 06:20:51.619798 kubelet[2395]: E0905 06:20:51.619743 2395 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 5 06:20:51.628068 kubelet[2395]: E0905 06:20:51.628022 2395 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 5 06:20:51.629241 kubelet[2395]: E0905 06:20:51.629202 2395 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 5 06:20:52.167725 kubelet[2395]: I0905 06:20:52.167678 2395 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 5 06:20:52.635760 kubelet[2395]: E0905 06:20:52.635583 2395 
kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 5 06:20:52.636441 kubelet[2395]: E0905 06:20:52.636319 2395 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 5 06:20:53.636380 kubelet[2395]: E0905 06:20:53.636339 2395 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 5 06:20:53.871051 kubelet[2395]: E0905 06:20:53.871003 2395 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Sep 5 06:20:54.321760 kubelet[2395]: I0905 06:20:54.321681 2395 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Sep 5 06:20:54.321760 kubelet[2395]: E0905 06:20:54.321745 2395 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"localhost\": node \"localhost\" not found" Sep 5 06:20:54.382887 update_engine[1575]: I20250905 06:20:54.382735 1575 update_attempter.cc:509] Updating boot flags... Sep 5 06:20:54.389096 kubelet[2395]: I0905 06:20:54.388939 2395 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 5 06:20:54.582995 kubelet[2395]: I0905 06:20:54.582835 2395 apiserver.go:52] "Watching apiserver" Sep 5 06:20:54.586722 kubelet[2395]: I0905 06:20:54.586690 2395 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 5 06:20:54.746368 kubelet[2395]: E0905 06:20:54.745998 2395 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-localhost" Sep 5 06:20:54.746368 kubelet[2395]: I0905 06:20:54.746032 2395 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 5 06:20:54.747774 kubelet[2395]: E0905 06:20:54.747730 2395 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost" Sep 5 06:20:54.747774 kubelet[2395]: I0905 06:20:54.747758 2395 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Sep 5 06:20:54.749691 kubelet[2395]: E0905 06:20:54.749647 2395 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-localhost" Sep 5 06:20:54.831435 kubelet[2395]: I0905 06:20:54.831396 2395 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Sep 5 06:20:54.833646 kubelet[2395]: E0905 06:20:54.833544 2395 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-localhost" Sep 5 06:20:57.838019 systemd[1]: Reload requested from client PID 2694 ('systemctl') (unit session-7.scope)... Sep 5 06:20:57.838044 systemd[1]: Reloading... Sep 5 06:20:57.974855 zram_generator::config[2740]: No configuration found. 
Sep 5 06:20:58.254517 systemd[1]: Reloading finished in 415 ms. Sep 5 06:20:58.288357 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 5 06:20:58.305187 systemd[1]: kubelet.service: Deactivated successfully. Sep 5 06:20:58.305657 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 5 06:20:58.305744 systemd[1]: kubelet.service: Consumed 2.004s CPU time, 133.9M memory peak. Sep 5 06:20:58.308498 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 5 06:20:58.568828 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 5 06:20:58.586497 (kubelet)[2782]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 5 06:20:58.656492 kubelet[2782]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 5 06:20:58.656492 kubelet[2782]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 5 06:20:58.656492 kubelet[2782]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 5 06:20:58.657104 kubelet[2782]: I0905 06:20:58.656503 2782 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 5 06:20:58.667357 kubelet[2782]: I0905 06:20:58.667289 2782 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Sep 5 06:20:58.667357 kubelet[2782]: I0905 06:20:58.667337 2782 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 5 06:20:58.667678 kubelet[2782]: I0905 06:20:58.667651 2782 server.go:956] "Client rotation is on, will bootstrap in background" Sep 5 06:20:58.669361 kubelet[2782]: I0905 06:20:58.669340 2782 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Sep 5 06:20:58.672165 kubelet[2782]: I0905 06:20:58.672097 2782 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 5 06:20:58.681303 kubelet[2782]: I0905 06:20:58.681257 2782 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 5 06:20:58.688761 kubelet[2782]: I0905 06:20:58.688693 2782 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 5 06:20:58.689299 kubelet[2782]: I0905 06:20:58.689231 2782 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 5 06:20:58.689638 kubelet[2782]: I0905 06:20:58.689278 2782 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 5 06:20:58.689761 kubelet[2782]: I0905 06:20:58.689651 2782 topology_manager.go:138] "Creating topology manager with none policy" Sep 5 06:20:58.689761 kubelet[2782]: I0905 06:20:58.689666 2782 container_manager_linux.go:303] "Creating device plugin manager" Sep 5 06:20:58.689761 kubelet[2782]: I0905 06:20:58.689727 2782 state_mem.go:36] "Initialized new in-memory state store" Sep 5 06:20:58.690024 kubelet[2782]: I0905 06:20:58.689986 2782 kubelet.go:480] "Attempting to sync node with API server" Sep 5 06:20:58.690024 kubelet[2782]: I0905 06:20:58.690014 2782 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 5 06:20:58.690140 kubelet[2782]: I0905 06:20:58.690050 2782 kubelet.go:386] "Adding apiserver pod source" Sep 5 06:20:58.690140 kubelet[2782]: I0905 06:20:58.690073 2782 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 5 06:20:58.693428 kubelet[2782]: I0905 06:20:58.693388 2782 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 5 06:20:58.694302 kubelet[2782]: I0905 06:20:58.694273 2782 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Sep 5 06:20:58.699036 kubelet[2782]: I0905 06:20:58.698999 2782 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 5 06:20:58.699165 kubelet[2782]: I0905 06:20:58.699052 2782 server.go:1289] "Started kubelet" Sep 5 06:20:58.700773 kubelet[2782]: I0905 06:20:58.700748 2782 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 5 06:20:58.708367 kubelet[2782]: I0905 06:20:58.708321 2782 
volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 5 06:20:58.708660 kubelet[2782]: I0905 06:20:58.708405 2782 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 5 06:20:58.708660 kubelet[2782]: I0905 06:20:58.708507 2782 reconciler.go:26] "Reconciler: start to sync state" Sep 5 06:20:58.708850 kubelet[2782]: E0905 06:20:58.708776 2782 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 5 06:20:58.708918 kubelet[2782]: I0905 06:20:58.708830 2782 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Sep 5 06:20:58.709790 kubelet[2782]: I0905 06:20:58.709744 2782 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 5 06:20:58.710177 kubelet[2782]: I0905 06:20:58.710141 2782 server.go:317] "Adding debug handlers to kubelet server" Sep 5 06:20:58.713770 kubelet[2782]: E0905 06:20:58.713725 2782 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 5 06:20:58.713770 kubelet[2782]: I0905 06:20:58.713734 2782 factory.go:223] Registration of the systemd container factory successfully Sep 5 06:20:58.714051 kubelet[2782]: I0905 06:20:58.713878 2782 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 5 06:20:58.714681 kubelet[2782]: I0905 06:20:58.714433 2782 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 5 06:20:58.714920 kubelet[2782]: I0905 06:20:58.714893 2782 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 5 06:20:58.715395 kubelet[2782]: I0905 06:20:58.715356 2782 factory.go:223] Registration of the containerd container factory successfully Sep 5 06:20:58.720696 kubelet[2782]: I0905 06:20:58.720659 2782 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Sep 5 06:20:58.722279 kubelet[2782]: I0905 06:20:58.722263 2782 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Sep 5 06:20:58.722381 kubelet[2782]: I0905 06:20:58.722369 2782 status_manager.go:230] "Starting to sync pod status with apiserver" Sep 5 06:20:58.722457 kubelet[2782]: I0905 06:20:58.722445 2782 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
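
The nodeConfig dump logged during this kubelet start embeds the hard eviction thresholds as JSON. Pulling them out makes them easier to read: memory.available < 100Mi, nodefs.available < 10%, nodefs.inodesFree < 5%, imagefs.available < 15%, imagefs.inodesFree < 5%. A small Python sketch that decodes that fragment (copied from the log, with the GracePeriod/MinReclaim fields omitted for brevity) into a readable listing:

    import json

    # HardEvictionThresholds fragment as it appears in the nodeConfig dump above.
    thresholds = json.loads("""
    [{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0}},
     {"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1}},
     {"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05}},
     {"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15}},
     {"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05}}]
    """)

    for t in thresholds:
        v = t["Value"]
        limit = v["Quantity"] if v["Quantity"] else f'{v["Percentage"]:.0%}'
        print(f'{t["Signal"]:<22} {t["Operator"]} {limit}')
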
Sep 5 06:20:58.722511 kubelet[2782]: I0905 06:20:58.722503 2782 kubelet.go:2436] "Starting kubelet main sync loop" Sep 5 06:20:58.722641 kubelet[2782]: E0905 06:20:58.722611 2782 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 5 06:20:58.768221 kubelet[2782]: I0905 06:20:58.768181 2782 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 5 06:20:58.768221 kubelet[2782]: I0905 06:20:58.768203 2782 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 5 06:20:58.768221 kubelet[2782]: I0905 06:20:58.768231 2782 state_mem.go:36] "Initialized new in-memory state store" Sep 5 06:20:58.768455 kubelet[2782]: I0905 06:20:58.768416 2782 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 5 06:20:58.768455 kubelet[2782]: I0905 06:20:58.768429 2782 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 5 06:20:58.768455 kubelet[2782]: I0905 06:20:58.768455 2782 policy_none.go:49] "None policy: Start" Sep 5 06:20:58.768521 kubelet[2782]: I0905 06:20:58.768467 2782 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 5 06:20:58.768521 kubelet[2782]: I0905 06:20:58.768480 2782 state_mem.go:35] "Initializing new in-memory state store" Sep 5 06:20:58.768633 kubelet[2782]: I0905 06:20:58.768619 2782 state_mem.go:75] "Updated machine memory state" Sep 5 06:20:58.774283 kubelet[2782]: E0905 06:20:58.774218 2782 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Sep 5 06:20:58.774620 kubelet[2782]: I0905 06:20:58.774574 2782 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 5 06:20:58.774689 kubelet[2782]: I0905 06:20:58.774619 2782 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 5 06:20:58.775262 kubelet[2782]: I0905 06:20:58.775132 2782 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 5 06:20:58.776247 kubelet[2782]: E0905 06:20:58.776186 2782 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Sep 5 06:20:58.825165 kubelet[2782]: I0905 06:20:58.824523 2782 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Sep 5 06:20:58.825165 kubelet[2782]: I0905 06:20:58.824879 2782 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 5 06:20:58.825948 kubelet[2782]: I0905 06:20:58.825471 2782 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 5 06:20:58.878173 kubelet[2782]: I0905 06:20:58.878111 2782 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 5 06:20:58.910262 kubelet[2782]: I0905 06:20:58.910165 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d75e6f6978d9f275ea19380916c9cccd-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"d75e6f6978d9f275ea19380916c9cccd\") " pod="kube-system/kube-scheduler-localhost" Sep 5 06:20:58.910262 kubelet[2782]: I0905 06:20:58.910231 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Sep 5 06:20:58.910262 kubelet[2782]: I0905 06:20:58.910281 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Sep 5 06:20:58.910509 kubelet[2782]: I0905 06:20:58.910303 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Sep 5 06:20:58.910509 kubelet[2782]: I0905 06:20:58.910327 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/733869d080fc73b0d4a8990046979896-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"733869d080fc73b0d4a8990046979896\") " pod="kube-system/kube-apiserver-localhost" Sep 5 06:20:58.910509 kubelet[2782]: I0905 06:20:58.910347 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/733869d080fc73b0d4a8990046979896-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"733869d080fc73b0d4a8990046979896\") " pod="kube-system/kube-apiserver-localhost" Sep 5 06:20:58.910509 kubelet[2782]: I0905 06:20:58.910367 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/733869d080fc73b0d4a8990046979896-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"733869d080fc73b0d4a8990046979896\") " pod="kube-system/kube-apiserver-localhost" Sep 5 06:20:58.910509 kubelet[2782]: I0905 06:20:58.910395 2782 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Sep 5 06:20:58.910718 kubelet[2782]: I0905 06:20:58.910434 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Sep 5 06:20:59.564126 kubelet[2782]: I0905 06:20:59.564081 2782 kubelet_node_status.go:124] "Node was previously registered" node="localhost" Sep 5 06:20:59.564401 kubelet[2782]: I0905 06:20:59.564193 2782 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Sep 5 06:20:59.692347 kubelet[2782]: I0905 06:20:59.692292 2782 apiserver.go:52] "Watching apiserver" Sep 5 06:20:59.709063 kubelet[2782]: I0905 06:20:59.709038 2782 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 5 06:20:59.740910 kubelet[2782]: I0905 06:20:59.740163 2782 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 5 06:20:59.740910 kubelet[2782]: I0905 06:20:59.740299 2782 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Sep 5 06:21:00.094686 kubelet[2782]: E0905 06:21:00.094013 2782 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" Sep 5 06:21:00.094686 kubelet[2782]: E0905 06:21:00.094129 2782 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" already exists" pod="kube-system/kube-controller-manager-localhost" Sep 5 06:21:00.094686 kubelet[2782]: I0905 06:21:00.093793 2782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=2.093774102 podStartE2EDuration="2.093774102s" podCreationTimestamp="2025-09-05 06:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-05 06:21:00.093757587 +0000 UTC m=+1.500999475" watchObservedRunningTime="2025-09-05 06:21:00.093774102 +0000 UTC m=+1.501015980" Sep 5 06:21:00.399485 kubelet[2782]: I0905 06:21:00.399411 2782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=2.399388616 podStartE2EDuration="2.399388616s" podCreationTimestamp="2025-09-05 06:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-05 06:21:00.320959313 +0000 UTC m=+1.728201191" watchObservedRunningTime="2025-09-05 06:21:00.399388616 +0000 UTC m=+1.806630494" Sep 5 06:21:01.228843 kubelet[2782]: I0905 06:21:01.228699 2782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=3.228678317 podStartE2EDuration="3.228678317s" podCreationTimestamp="2025-09-05 06:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-09-05 06:21:00.399762936 +0000 UTC m=+1.807004814" watchObservedRunningTime="2025-09-05 06:21:01.228678317 +0000 UTC m=+2.635920196" Sep 5 06:21:04.771837 kubelet[2782]: I0905 06:21:04.771785 2782 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 5 06:21:04.772394 containerd[1585]: time="2025-09-05T06:21:04.772214136Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 5 06:21:04.772644 kubelet[2782]: I0905 06:21:04.772470 2782 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 5 06:21:05.292512 systemd[1]: Created slice kubepods-besteffort-podd1945b0e_1439_4135_9016_192540761539.slice - libcontainer container kubepods-besteffort-podd1945b0e_1439_4135_9016_192540761539.slice. Sep 5 06:21:05.345046 kubelet[2782]: I0905 06:21:05.344988 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/d1945b0e-1439-4135-9016-192540761539-kube-proxy\") pod \"kube-proxy-vznj9\" (UID: \"d1945b0e-1439-4135-9016-192540761539\") " pod="kube-system/kube-proxy-vznj9" Sep 5 06:21:05.345046 kubelet[2782]: I0905 06:21:05.345032 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/d1945b0e-1439-4135-9016-192540761539-xtables-lock\") pod \"kube-proxy-vznj9\" (UID: \"d1945b0e-1439-4135-9016-192540761539\") " pod="kube-system/kube-proxy-vznj9" Sep 5 06:21:05.345046 kubelet[2782]: I0905 06:21:05.345057 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d1945b0e-1439-4135-9016-192540761539-lib-modules\") pod \"kube-proxy-vznj9\" (UID: \"d1945b0e-1439-4135-9016-192540761539\") " pod="kube-system/kube-proxy-vznj9" Sep 5 06:21:05.345280 kubelet[2782]: I0905 06:21:05.345079 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2dcl\" (UniqueName: \"kubernetes.io/projected/d1945b0e-1439-4135-9016-192540761539-kube-api-access-t2dcl\") pod \"kube-proxy-vznj9\" (UID: \"d1945b0e-1439-4135-9016-192540761539\") " pod="kube-system/kube-proxy-vznj9" Sep 5 06:21:05.493628 systemd[1]: Created slice kubepods-besteffort-pod806f3c95_9e05_4820_bb98_e53f8d3c1f78.slice - libcontainer container kubepods-besteffort-pod806f3c95_9e05_4820_bb98_e53f8d3c1f78.slice. 
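
The systemd cgroup driver places each pod in a slice under kubepods.slice, named from the QoS class and the pod UID with dashes converted to underscores; that is why the kube-proxy pod (UID d1945b0e-1439-4135-9016-192540761539) shows up above as kubepods-besteffort-podd1945b0e_1439_4135_9016_192540761539.slice. A Python sketch of that naming, consistent with the "Created slice kubepods-..." messages in this log (escaping of any more exotic characters is left to systemd and not modeled here):

    # Reconstruct the slice name the kubelet's systemd cgroup driver uses for a pod.
    def pod_slice(qos: str, pod_uid: str) -> str:
        qos_part = "" if qos == "guaranteed" else f"-{qos}"   # guaranteed pods sit directly under kubepods.slice
        return f"kubepods{qos_part}-pod{pod_uid.replace('-', '_')}.slice"

    print(pod_slice("besteffort", "d1945b0e-1439-4135-9016-192540761539"))
    # kubepods-besteffort-podd1945b0e_1439_4135_9016_192540761539.slice
    print(pod_slice("burstable", "733869d080fc73b0d4a8990046979896"))
    # kubepods-burstable-pod733869d080fc73b0d4a8990046979896.slice
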
Sep 5 06:21:05.546494 kubelet[2782]: I0905 06:21:05.546353 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64mkd\" (UniqueName: \"kubernetes.io/projected/806f3c95-9e05-4820-bb98-e53f8d3c1f78-kube-api-access-64mkd\") pod \"tigera-operator-755d956888-m6z9h\" (UID: \"806f3c95-9e05-4820-bb98-e53f8d3c1f78\") " pod="tigera-operator/tigera-operator-755d956888-m6z9h" Sep 5 06:21:05.546494 kubelet[2782]: I0905 06:21:05.546400 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/806f3c95-9e05-4820-bb98-e53f8d3c1f78-var-lib-calico\") pod \"tigera-operator-755d956888-m6z9h\" (UID: \"806f3c95-9e05-4820-bb98-e53f8d3c1f78\") " pod="tigera-operator/tigera-operator-755d956888-m6z9h" Sep 5 06:21:05.612083 containerd[1585]: time="2025-09-05T06:21:05.612031219Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-vznj9,Uid:d1945b0e-1439-4135-9016-192540761539,Namespace:kube-system,Attempt:0,}" Sep 5 06:21:05.676022 containerd[1585]: time="2025-09-05T06:21:05.675962092Z" level=info msg="connecting to shim b89155b41593e5cd59deaa4475b9fc8785a2b76a253c9bf8422604caedb433a8" address="unix:///run/containerd/s/7595dd56eed98e6131f2d33f80d689688e8b6f550cde31b6b4b7c0dd84937211" namespace=k8s.io protocol=ttrpc version=3 Sep 5 06:21:05.703004 systemd[1]: Started cri-containerd-b89155b41593e5cd59deaa4475b9fc8785a2b76a253c9bf8422604caedb433a8.scope - libcontainer container b89155b41593e5cd59deaa4475b9fc8785a2b76a253c9bf8422604caedb433a8. Sep 5 06:21:05.733980 containerd[1585]: time="2025-09-05T06:21:05.733927509Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-vznj9,Uid:d1945b0e-1439-4135-9016-192540761539,Namespace:kube-system,Attempt:0,} returns sandbox id \"b89155b41593e5cd59deaa4475b9fc8785a2b76a253c9bf8422604caedb433a8\"" Sep 5 06:21:05.740322 containerd[1585]: time="2025-09-05T06:21:05.740280074Z" level=info msg="CreateContainer within sandbox \"b89155b41593e5cd59deaa4475b9fc8785a2b76a253c9bf8422604caedb433a8\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 5 06:21:05.752012 containerd[1585]: time="2025-09-05T06:21:05.751958305Z" level=info msg="Container 1164c28e1f099d0f159128aec875f869830270f0b485065e425370ad68988bc6: CDI devices from CRI Config.CDIDevices: []" Sep 5 06:21:05.767062 containerd[1585]: time="2025-09-05T06:21:05.767009761Z" level=info msg="CreateContainer within sandbox \"b89155b41593e5cd59deaa4475b9fc8785a2b76a253c9bf8422604caedb433a8\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"1164c28e1f099d0f159128aec875f869830270f0b485065e425370ad68988bc6\"" Sep 5 06:21:05.767704 containerd[1585]: time="2025-09-05T06:21:05.767669572Z" level=info msg="StartContainer for \"1164c28e1f099d0f159128aec875f869830270f0b485065e425370ad68988bc6\"" Sep 5 06:21:05.769228 containerd[1585]: time="2025-09-05T06:21:05.769193820Z" level=info msg="connecting to shim 1164c28e1f099d0f159128aec875f869830270f0b485065e425370ad68988bc6" address="unix:///run/containerd/s/7595dd56eed98e6131f2d33f80d689688e8b6f550cde31b6b4b7c0dd84937211" protocol=ttrpc version=3 Sep 5 06:21:05.788986 systemd[1]: Started cri-containerd-1164c28e1f099d0f159128aec875f869830270f0b485065e425370ad68988bc6.scope - libcontainer container 1164c28e1f099d0f159128aec875f869830270f0b485065e425370ad68988bc6. 
Sep 5 06:21:05.797650 containerd[1585]: time="2025-09-05T06:21:05.797491323Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-m6z9h,Uid:806f3c95-9e05-4820-bb98-e53f8d3c1f78,Namespace:tigera-operator,Attempt:0,}" Sep 5 06:21:05.818867 containerd[1585]: time="2025-09-05T06:21:05.818783904Z" level=info msg="connecting to shim 03b1fc72fb692836654a4a5e96c36d7cf287340201b7ab4bbb012fe3e88533cd" address="unix:///run/containerd/s/f319f7e35c63fec673d2dca5cb418c5fa04724f010a2eff53b724a6bb9bcd83e" namespace=k8s.io protocol=ttrpc version=3 Sep 5 06:21:05.843671 containerd[1585]: time="2025-09-05T06:21:05.843605491Z" level=info msg="StartContainer for \"1164c28e1f099d0f159128aec875f869830270f0b485065e425370ad68988bc6\" returns successfully" Sep 5 06:21:05.852535 systemd[1]: Started cri-containerd-03b1fc72fb692836654a4a5e96c36d7cf287340201b7ab4bbb012fe3e88533cd.scope - libcontainer container 03b1fc72fb692836654a4a5e96c36d7cf287340201b7ab4bbb012fe3e88533cd. Sep 5 06:21:05.904120 containerd[1585]: time="2025-09-05T06:21:05.904081370Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-m6z9h,Uid:806f3c95-9e05-4820-bb98-e53f8d3c1f78,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"03b1fc72fb692836654a4a5e96c36d7cf287340201b7ab4bbb012fe3e88533cd\"" Sep 5 06:21:05.906372 containerd[1585]: time="2025-09-05T06:21:05.905837285Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 5 06:21:06.470775 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1933544824.mount: Deactivated successfully. Sep 5 06:21:06.763109 kubelet[2782]: I0905 06:21:06.762837 2782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-vznj9" podStartSLOduration=2.762800778 podStartE2EDuration="2.762800778s" podCreationTimestamp="2025-09-05 06:21:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-05 06:21:06.762545612 +0000 UTC m=+8.169787490" watchObservedRunningTime="2025-09-05 06:21:06.762800778 +0000 UTC m=+8.170042656" Sep 5 06:21:09.287628 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount328457473.mount: Deactivated successfully. 
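The two tmpmount units that systemd reports as deactivated are ordinary .mount units whose names encode the mounted path with systemd's escaping rules: "/" becomes "-" and a literal "-" in the path is written as "\x2d" (other characters get further \xXX escapes that do not occur here). A short sketch, under that assumption, recovering the path from such a unit name:

    # Sketch: undo systemd's path escaping for .mount unit names like the ones logged above.
    # Handles only the '/' -> '-' and '-' -> '\x2d' rules seen here, not the full \xXX scheme.
    def mount_unit_to_path(unit: str) -> str:
        name = unit.removesuffix(".mount")
        # '-' separates path components; the four-character sequence '\x2d' is a real dash.
        return "/" + name.replace("-", "/").replace(r"\x2d", "-")

    print(mount_unit_to_path(r"var-lib-containerd-tmpmounts-containerd\x2dmount1933544824.mount"))
    # -> /var/lib/containerd/tmpmounts/containerd-mount1933544824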
Sep 5 06:21:10.450770 containerd[1585]: time="2025-09-05T06:21:10.450683476Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:21:10.475264 containerd[1585]: time="2025-09-05T06:21:10.475187144Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609" Sep 5 06:21:10.488844 containerd[1585]: time="2025-09-05T06:21:10.488761513Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:21:10.514436 containerd[1585]: time="2025-09-05T06:21:10.514351909Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:21:10.515052 containerd[1585]: time="2025-09-05T06:21:10.515008322Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 4.60913859s" Sep 5 06:21:10.515142 containerd[1585]: time="2025-09-05T06:21:10.515051831Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\"" Sep 5 06:21:10.567664 containerd[1585]: time="2025-09-05T06:21:10.567583444Z" level=info msg="CreateContainer within sandbox \"03b1fc72fb692836654a4a5e96c36d7cf287340201b7ab4bbb012fe3e88533cd\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 5 06:21:10.672078 containerd[1585]: time="2025-09-05T06:21:10.672020360Z" level=info msg="Container 5f3985b0c9ad05037eeaf946497ea20d78350bf83ce7d8aec4030dbd977ae666: CDI devices from CRI Config.CDIDevices: []" Sep 5 06:21:10.677705 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1852587015.mount: Deactivated successfully. Sep 5 06:21:10.898141 containerd[1585]: time="2025-09-05T06:21:10.898084365Z" level=info msg="CreateContainer within sandbox \"03b1fc72fb692836654a4a5e96c36d7cf287340201b7ab4bbb012fe3e88533cd\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"5f3985b0c9ad05037eeaf946497ea20d78350bf83ce7d8aec4030dbd977ae666\"" Sep 5 06:21:10.898761 containerd[1585]: time="2025-09-05T06:21:10.898643801Z" level=info msg="StartContainer for \"5f3985b0c9ad05037eeaf946497ea20d78350bf83ce7d8aec4030dbd977ae666\"" Sep 5 06:21:10.900164 containerd[1585]: time="2025-09-05T06:21:10.899502726Z" level=info msg="connecting to shim 5f3985b0c9ad05037eeaf946497ea20d78350bf83ce7d8aec4030dbd977ae666" address="unix:///run/containerd/s/f319f7e35c63fec673d2dca5cb418c5fa04724f010a2eff53b724a6bb9bcd83e" protocol=ttrpc version=3 Sep 5 06:21:10.967075 systemd[1]: Started cri-containerd-5f3985b0c9ad05037eeaf946497ea20d78350bf83ce7d8aec4030dbd977ae666.scope - libcontainer container 5f3985b0c9ad05037eeaf946497ea20d78350bf83ce7d8aec4030dbd977ae666. 
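The pull of quay.io/tigera/operator:v1.38.6 finishes with a reported image size of "25058604" (bytes) after 4.60913859s of pulling, so the effective pull rate follows directly from those two figures. A back-of-the-envelope sketch using only the numbers printed in these entries:

    # Sketch: effective pull throughput from the figures containerd logs above.
    size_bytes = 25_058_604      # size "25058604" in the PullImage result
    pull_seconds = 4.60913859    # "in 4.60913859s"
    rate = size_bytes / pull_seconds
    print(f"{rate / 1e6:.2f} MB/s ({rate / 2**20:.2f} MiB/s)")
    # roughly 5.44 MB/s (5.19 MiB/s) for this pull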
Sep 5 06:21:11.237256 containerd[1585]: time="2025-09-05T06:21:11.237099046Z" level=info msg="StartContainer for \"5f3985b0c9ad05037eeaf946497ea20d78350bf83ce7d8aec4030dbd977ae666\" returns successfully" Sep 5 06:21:16.622048 sudo[1805]: pam_unix(sudo:session): session closed for user root Sep 5 06:21:16.624851 sshd[1804]: Connection closed by 10.0.0.1 port 42380 Sep 5 06:21:16.626198 sshd-session[1801]: pam_unix(sshd:session): session closed for user core Sep 5 06:21:16.636536 systemd[1]: sshd@6-10.0.0.140:22-10.0.0.1:42380.service: Deactivated successfully. Sep 5 06:21:16.642890 systemd[1]: session-7.scope: Deactivated successfully. Sep 5 06:21:16.644872 systemd[1]: session-7.scope: Consumed 5.917s CPU time, 227.6M memory peak. Sep 5 06:21:16.646507 systemd-logind[1570]: Session 7 logged out. Waiting for processes to exit. Sep 5 06:21:16.651501 systemd-logind[1570]: Removed session 7. Sep 5 06:21:20.703711 kubelet[2782]: I0905 06:21:20.703611 2782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-m6z9h" podStartSLOduration=11.092891166 podStartE2EDuration="15.70359267s" podCreationTimestamp="2025-09-05 06:21:05 +0000 UTC" firstStartedPulling="2025-09-05 06:21:05.905287811 +0000 UTC m=+7.312529689" lastFinishedPulling="2025-09-05 06:21:10.515989315 +0000 UTC m=+11.923231193" observedRunningTime="2025-09-05 06:21:11.775858599 +0000 UTC m=+13.183100477" watchObservedRunningTime="2025-09-05 06:21:20.70359267 +0000 UTC m=+22.110834548" Sep 5 06:21:20.722439 systemd[1]: Created slice kubepods-besteffort-pod17aeba91_5f6e_4863_b094_b1322e805f33.slice - libcontainer container kubepods-besteffort-pod17aeba91_5f6e_4863_b094_b1322e805f33.slice. Sep 5 06:21:20.860945 kubelet[2782]: I0905 06:21:20.860847 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17aeba91-5f6e-4863-b094-b1322e805f33-tigera-ca-bundle\") pod \"calico-typha-7c9644548d-vzhrr\" (UID: \"17aeba91-5f6e-4863-b094-b1322e805f33\") " pod="calico-system/calico-typha-7c9644548d-vzhrr" Sep 5 06:21:20.860945 kubelet[2782]: I0905 06:21:20.860923 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lwkj\" (UniqueName: \"kubernetes.io/projected/17aeba91-5f6e-4863-b094-b1322e805f33-kube-api-access-8lwkj\") pod \"calico-typha-7c9644548d-vzhrr\" (UID: \"17aeba91-5f6e-4863-b094-b1322e805f33\") " pod="calico-system/calico-typha-7c9644548d-vzhrr" Sep 5 06:21:20.860945 kubelet[2782]: I0905 06:21:20.860953 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/17aeba91-5f6e-4863-b094-b1322e805f33-typha-certs\") pod \"calico-typha-7c9644548d-vzhrr\" (UID: \"17aeba91-5f6e-4863-b094-b1322e805f33\") " pod="calico-system/calico-typha-7c9644548d-vzhrr" Sep 5 06:21:21.032420 containerd[1585]: time="2025-09-05T06:21:21.032282544Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7c9644548d-vzhrr,Uid:17aeba91-5f6e-4863-b094-b1322e805f33,Namespace:calico-system,Attempt:0,}" Sep 5 06:21:21.061061 systemd[1]: Created slice kubepods-besteffort-pod703cccdc_4845_4624_88f5_32cc362bbcc9.slice - libcontainer container kubepods-besteffort-pod703cccdc_4845_4624_88f5_32cc362bbcc9.slice. 
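The pod_startup_latency_tracker figures are consistent with the timestamps already present in this log: for tigera-operator-755d956888-m6z9h, podStartE2EDuration (15.70359267s) equals watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration (11.092891166s) is that value minus the image-pull window (lastFinishedPulling minus firstStartedPulling); for kube-proxy-vznj9 the pull timestamps are the Go zero value (0001-01-01), i.e. nothing was pulled, so the two durations coincide. A small check of that arithmetic, assuming this reading of the kubelet's fields:

    # Sketch: re-derive the startup durations reported by pod_startup_latency_tracker.go
    # from the tigera-operator timestamps above; the field interpretation is an assumption.
    from datetime import datetime, timezone

    def ts(s):
        # "2025-09-05 06:21:05.905287811 +0000 UTC" -> aware datetime (nanoseconds truncated)
        date, clock = s.split(" +0000 UTC")[0].split(" ")
        if "." in clock:
            clock = clock[:len("HH:MM:SS.") + 6]   # datetime carries microseconds at most
        return datetime.fromisoformat(f"{date}T{clock}").replace(tzinfo=timezone.utc)

    created   = ts("2025-09-05 06:21:05 +0000 UTC")
    pull_from = ts("2025-09-05 06:21:05.905287811 +0000 UTC")
    pull_to   = ts("2025-09-05 06:21:10.515989315 +0000 UTC")
    running   = ts("2025-09-05 06:21:20.70359267 +0000 UTC")

    e2e = (running - created).total_seconds()
    slo = e2e - (pull_to - pull_from).total_seconds()
    print(f"E2E ~ {e2e:.6f}s, SLO ~ {slo:.6f}s")   # ~15.7036s and ~11.0929s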
Sep 5 06:21:21.074839 containerd[1585]: time="2025-09-05T06:21:21.074766917Z" level=info msg="connecting to shim c98d4e97baa9124566a48ca64865d5d789bc03ab763f080b1dd3dc205a048dcb" address="unix:///run/containerd/s/bc8184b56e2551990b73c8c1c74c72e47c69b110d68eab20e7cdda34d0c02b34" namespace=k8s.io protocol=ttrpc version=3 Sep 5 06:21:21.104943 systemd[1]: Started cri-containerd-c98d4e97baa9124566a48ca64865d5d789bc03ab763f080b1dd3dc205a048dcb.scope - libcontainer container c98d4e97baa9124566a48ca64865d5d789bc03ab763f080b1dd3dc205a048dcb. Sep 5 06:21:21.158192 containerd[1585]: time="2025-09-05T06:21:21.158122610Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7c9644548d-vzhrr,Uid:17aeba91-5f6e-4863-b094-b1322e805f33,Namespace:calico-system,Attempt:0,} returns sandbox id \"c98d4e97baa9124566a48ca64865d5d789bc03ab763f080b1dd3dc205a048dcb\"" Sep 5 06:21:21.159767 containerd[1585]: time="2025-09-05T06:21:21.159744071Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 5 06:21:21.162999 kubelet[2782]: I0905 06:21:21.162965 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/703cccdc-4845-4624-88f5-32cc362bbcc9-tigera-ca-bundle\") pod \"calico-node-wqdqv\" (UID: \"703cccdc-4845-4624-88f5-32cc362bbcc9\") " pod="calico-system/calico-node-wqdqv" Sep 5 06:21:21.163103 kubelet[2782]: I0905 06:21:21.163005 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/703cccdc-4845-4624-88f5-32cc362bbcc9-cni-bin-dir\") pod \"calico-node-wqdqv\" (UID: \"703cccdc-4845-4624-88f5-32cc362bbcc9\") " pod="calico-system/calico-node-wqdqv" Sep 5 06:21:21.163103 kubelet[2782]: I0905 06:21:21.163021 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/703cccdc-4845-4624-88f5-32cc362bbcc9-lib-modules\") pod \"calico-node-wqdqv\" (UID: \"703cccdc-4845-4624-88f5-32cc362bbcc9\") " pod="calico-system/calico-node-wqdqv" Sep 5 06:21:21.163103 kubelet[2782]: I0905 06:21:21.163041 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/703cccdc-4845-4624-88f5-32cc362bbcc9-node-certs\") pod \"calico-node-wqdqv\" (UID: \"703cccdc-4845-4624-88f5-32cc362bbcc9\") " pod="calico-system/calico-node-wqdqv" Sep 5 06:21:21.163103 kubelet[2782]: I0905 06:21:21.163057 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/703cccdc-4845-4624-88f5-32cc362bbcc9-xtables-lock\") pod \"calico-node-wqdqv\" (UID: \"703cccdc-4845-4624-88f5-32cc362bbcc9\") " pod="calico-system/calico-node-wqdqv" Sep 5 06:21:21.163103 kubelet[2782]: I0905 06:21:21.163082 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/703cccdc-4845-4624-88f5-32cc362bbcc9-policysync\") pod \"calico-node-wqdqv\" (UID: \"703cccdc-4845-4624-88f5-32cc362bbcc9\") " pod="calico-system/calico-node-wqdqv" Sep 5 06:21:21.163231 kubelet[2782]: I0905 06:21:21.163097 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: 
\"kubernetes.io/host-path/703cccdc-4845-4624-88f5-32cc362bbcc9-cni-net-dir\") pod \"calico-node-wqdqv\" (UID: \"703cccdc-4845-4624-88f5-32cc362bbcc9\") " pod="calico-system/calico-node-wqdqv" Sep 5 06:21:21.163231 kubelet[2782]: I0905 06:21:21.163126 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/703cccdc-4845-4624-88f5-32cc362bbcc9-var-run-calico\") pod \"calico-node-wqdqv\" (UID: \"703cccdc-4845-4624-88f5-32cc362bbcc9\") " pod="calico-system/calico-node-wqdqv" Sep 5 06:21:21.163231 kubelet[2782]: I0905 06:21:21.163143 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/703cccdc-4845-4624-88f5-32cc362bbcc9-var-lib-calico\") pod \"calico-node-wqdqv\" (UID: \"703cccdc-4845-4624-88f5-32cc362bbcc9\") " pod="calico-system/calico-node-wqdqv" Sep 5 06:21:21.163231 kubelet[2782]: I0905 06:21:21.163158 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/703cccdc-4845-4624-88f5-32cc362bbcc9-cni-log-dir\") pod \"calico-node-wqdqv\" (UID: \"703cccdc-4845-4624-88f5-32cc362bbcc9\") " pod="calico-system/calico-node-wqdqv" Sep 5 06:21:21.163231 kubelet[2782]: I0905 06:21:21.163179 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/703cccdc-4845-4624-88f5-32cc362bbcc9-flexvol-driver-host\") pod \"calico-node-wqdqv\" (UID: \"703cccdc-4845-4624-88f5-32cc362bbcc9\") " pod="calico-system/calico-node-wqdqv" Sep 5 06:21:21.163342 kubelet[2782]: I0905 06:21:21.163207 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dznjg\" (UniqueName: \"kubernetes.io/projected/703cccdc-4845-4624-88f5-32cc362bbcc9-kube-api-access-dznjg\") pod \"calico-node-wqdqv\" (UID: \"703cccdc-4845-4624-88f5-32cc362bbcc9\") " pod="calico-system/calico-node-wqdqv" Sep 5 06:21:21.266033 kubelet[2782]: E0905 06:21:21.265881 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.266033 kubelet[2782]: W0905 06:21:21.265912 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.266033 kubelet[2782]: E0905 06:21:21.265936 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.266880 kubelet[2782]: E0905 06:21:21.266849 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.266880 kubelet[2782]: W0905 06:21:21.266864 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.266880 kubelet[2782]: E0905 06:21:21.266874 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:21:21.269100 kubelet[2782]: E0905 06:21:21.267779 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.269100 kubelet[2782]: W0905 06:21:21.267789 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.269100 kubelet[2782]: E0905 06:21:21.267799 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.269100 kubelet[2782]: E0905 06:21:21.268034 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.269100 kubelet[2782]: W0905 06:21:21.268042 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.269100 kubelet[2782]: E0905 06:21:21.268050 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.269516 kubelet[2782]: E0905 06:21:21.269432 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.269516 kubelet[2782]: W0905 06:21:21.269451 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.269516 kubelet[2782]: E0905 06:21:21.269462 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.270920 kubelet[2782]: E0905 06:21:21.270139 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.270920 kubelet[2782]: W0905 06:21:21.270887 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.270920 kubelet[2782]: E0905 06:21:21.270901 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.272008 kubelet[2782]: E0905 06:21:21.271612 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.272008 kubelet[2782]: W0905 06:21:21.271626 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.272008 kubelet[2782]: E0905 06:21:21.271636 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:21:21.272669 kubelet[2782]: E0905 06:21:21.272645 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.272669 kubelet[2782]: W0905 06:21:21.272659 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.273952 kubelet[2782]: E0905 06:21:21.272686 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.273952 kubelet[2782]: E0905 06:21:21.273890 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.273952 kubelet[2782]: W0905 06:21:21.273900 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.274052 kubelet[2782]: E0905 06:21:21.273938 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.274420 kubelet[2782]: E0905 06:21:21.274384 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.274420 kubelet[2782]: W0905 06:21:21.274412 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.274571 kubelet[2782]: E0905 06:21:21.274442 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.281564 kubelet[2782]: E0905 06:21:21.281531 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.281564 kubelet[2782]: W0905 06:21:21.281557 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.281657 kubelet[2782]: E0905 06:21:21.281581 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:21:21.292416 kubelet[2782]: E0905 06:21:21.292149 2782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fmvm2" podUID="184974df-56dd-43e9-9fda-6672bf4bc449" Sep 5 06:21:21.363873 kubelet[2782]: E0905 06:21:21.363786 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.364156 kubelet[2782]: W0905 06:21:21.363957 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.364156 kubelet[2782]: E0905 06:21:21.363990 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.364547 kubelet[2782]: E0905 06:21:21.364436 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.364547 kubelet[2782]: W0905 06:21:21.364452 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.364547 kubelet[2782]: E0905 06:21:21.364463 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.364936 kubelet[2782]: E0905 06:21:21.364799 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.364936 kubelet[2782]: W0905 06:21:21.364832 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.364936 kubelet[2782]: E0905 06:21:21.364842 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.365387 kubelet[2782]: E0905 06:21:21.365342 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.365387 kubelet[2782]: W0905 06:21:21.365376 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.365574 kubelet[2782]: E0905 06:21:21.365422 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:21:21.365836 kubelet[2782]: E0905 06:21:21.365800 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.365836 kubelet[2782]: W0905 06:21:21.365833 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.365918 kubelet[2782]: E0905 06:21:21.365845 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.366107 kubelet[2782]: E0905 06:21:21.366087 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.366107 kubelet[2782]: W0905 06:21:21.366102 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.366194 kubelet[2782]: E0905 06:21:21.366113 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.366374 kubelet[2782]: E0905 06:21:21.366353 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.366374 kubelet[2782]: W0905 06:21:21.366367 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.366374 kubelet[2782]: E0905 06:21:21.366376 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.366915 kubelet[2782]: E0905 06:21:21.366739 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.366915 kubelet[2782]: W0905 06:21:21.366760 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.366915 kubelet[2782]: E0905 06:21:21.366772 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.367229 kubelet[2782]: E0905 06:21:21.367184 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.367229 kubelet[2782]: W0905 06:21:21.367210 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.367229 kubelet[2782]: E0905 06:21:21.367220 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:21:21.367431 kubelet[2782]: E0905 06:21:21.367395 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.367431 kubelet[2782]: W0905 06:21:21.367402 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.367431 kubelet[2782]: E0905 06:21:21.367410 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.367671 containerd[1585]: time="2025-09-05T06:21:21.367518207Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-wqdqv,Uid:703cccdc-4845-4624-88f5-32cc362bbcc9,Namespace:calico-system,Attempt:0,}" Sep 5 06:21:21.367710 kubelet[2782]: E0905 06:21:21.367670 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.367710 kubelet[2782]: W0905 06:21:21.367678 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.367710 kubelet[2782]: E0905 06:21:21.367687 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.368204 kubelet[2782]: E0905 06:21:21.367867 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.368204 kubelet[2782]: W0905 06:21:21.367879 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.368204 kubelet[2782]: E0905 06:21:21.367887 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.368204 kubelet[2782]: E0905 06:21:21.368086 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.368204 kubelet[2782]: W0905 06:21:21.368094 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.368204 kubelet[2782]: E0905 06:21:21.368101 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:21:21.368395 kubelet[2782]: E0905 06:21:21.368263 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.368395 kubelet[2782]: W0905 06:21:21.368271 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.368395 kubelet[2782]: E0905 06:21:21.368279 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.368488 kubelet[2782]: E0905 06:21:21.368460 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.368488 kubelet[2782]: W0905 06:21:21.368468 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.368488 kubelet[2782]: E0905 06:21:21.368476 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.368903 kubelet[2782]: E0905 06:21:21.368704 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.368903 kubelet[2782]: W0905 06:21:21.368717 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.368903 kubelet[2782]: E0905 06:21:21.368726 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.369133 kubelet[2782]: E0905 06:21:21.369104 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.369133 kubelet[2782]: W0905 06:21:21.369121 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.369133 kubelet[2782]: E0905 06:21:21.369130 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.369396 kubelet[2782]: E0905 06:21:21.369373 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.369396 kubelet[2782]: W0905 06:21:21.369389 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.369396 kubelet[2782]: E0905 06:21:21.369398 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:21:21.369700 kubelet[2782]: E0905 06:21:21.369654 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.369700 kubelet[2782]: W0905 06:21:21.369698 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.369775 kubelet[2782]: E0905 06:21:21.369709 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.370008 kubelet[2782]: E0905 06:21:21.369965 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.370008 kubelet[2782]: W0905 06:21:21.369980 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.370008 kubelet[2782]: E0905 06:21:21.369989 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.371384 kubelet[2782]: E0905 06:21:21.371357 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.371384 kubelet[2782]: W0905 06:21:21.371375 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.371384 kubelet[2782]: E0905 06:21:21.371385 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.371505 kubelet[2782]: I0905 06:21:21.371420 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/184974df-56dd-43e9-9fda-6672bf4bc449-kubelet-dir\") pod \"csi-node-driver-fmvm2\" (UID: \"184974df-56dd-43e9-9fda-6672bf4bc449\") " pod="calico-system/csi-node-driver-fmvm2" Sep 5 06:21:21.371852 kubelet[2782]: E0905 06:21:21.371708 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.371852 kubelet[2782]: W0905 06:21:21.371722 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.371852 kubelet[2782]: E0905 06:21:21.371732 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:21:21.371852 kubelet[2782]: I0905 06:21:21.371768 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/184974df-56dd-43e9-9fda-6672bf4bc449-socket-dir\") pod \"csi-node-driver-fmvm2\" (UID: \"184974df-56dd-43e9-9fda-6672bf4bc449\") " pod="calico-system/csi-node-driver-fmvm2" Sep 5 06:21:21.372523 kubelet[2782]: E0905 06:21:21.371996 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.372523 kubelet[2782]: W0905 06:21:21.372010 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.372523 kubelet[2782]: E0905 06:21:21.372018 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.372523 kubelet[2782]: I0905 06:21:21.372312 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/184974df-56dd-43e9-9fda-6672bf4bc449-registration-dir\") pod \"csi-node-driver-fmvm2\" (UID: \"184974df-56dd-43e9-9fda-6672bf4bc449\") " pod="calico-system/csi-node-driver-fmvm2" Sep 5 06:21:21.372873 kubelet[2782]: E0905 06:21:21.372792 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.372952 kubelet[2782]: W0905 06:21:21.372915 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.373093 kubelet[2782]: E0905 06:21:21.372973 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.373093 kubelet[2782]: I0905 06:21:21.373080 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/184974df-56dd-43e9-9fda-6672bf4bc449-varrun\") pod \"csi-node-driver-fmvm2\" (UID: \"184974df-56dd-43e9-9fda-6672bf4bc449\") " pod="calico-system/csi-node-driver-fmvm2" Sep 5 06:21:21.373676 kubelet[2782]: E0905 06:21:21.373438 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.373676 kubelet[2782]: W0905 06:21:21.373455 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.373676 kubelet[2782]: E0905 06:21:21.373465 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:21:21.373968 kubelet[2782]: E0905 06:21:21.373936 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.374010 kubelet[2782]: W0905 06:21:21.373973 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.374034 kubelet[2782]: E0905 06:21:21.374008 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.374320 kubelet[2782]: E0905 06:21:21.374290 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.374320 kubelet[2782]: W0905 06:21:21.374305 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.374320 kubelet[2782]: E0905 06:21:21.374315 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.374592 kubelet[2782]: E0905 06:21:21.374538 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.374894 kubelet[2782]: W0905 06:21:21.374580 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.374940 kubelet[2782]: E0905 06:21:21.374894 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.375348 kubelet[2782]: E0905 06:21:21.375327 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.375348 kubelet[2782]: W0905 06:21:21.375338 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.375572 kubelet[2782]: E0905 06:21:21.375543 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:21:21.375880 kubelet[2782]: I0905 06:21:21.375854 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpc8r\" (UniqueName: \"kubernetes.io/projected/184974df-56dd-43e9-9fda-6672bf4bc449-kube-api-access-fpc8r\") pod \"csi-node-driver-fmvm2\" (UID: \"184974df-56dd-43e9-9fda-6672bf4bc449\") " pod="calico-system/csi-node-driver-fmvm2" Sep 5 06:21:21.377962 kubelet[2782]: E0905 06:21:21.377941 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.377962 kubelet[2782]: W0905 06:21:21.377955 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.378060 kubelet[2782]: E0905 06:21:21.377967 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.378324 kubelet[2782]: E0905 06:21:21.378282 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.378324 kubelet[2782]: W0905 06:21:21.378298 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.378324 kubelet[2782]: E0905 06:21:21.378313 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.378565 kubelet[2782]: E0905 06:21:21.378540 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.378565 kubelet[2782]: W0905 06:21:21.378555 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.378565 kubelet[2782]: E0905 06:21:21.378565 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.378910 kubelet[2782]: E0905 06:21:21.378871 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.378910 kubelet[2782]: W0905 06:21:21.378886 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.378910 kubelet[2782]: E0905 06:21:21.378896 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:21:21.379117 kubelet[2782]: E0905 06:21:21.379098 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.379250 kubelet[2782]: W0905 06:21:21.379124 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.379250 kubelet[2782]: E0905 06:21:21.379136 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.379375 kubelet[2782]: E0905 06:21:21.379365 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.379375 kubelet[2782]: W0905 06:21:21.379375 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.379510 kubelet[2782]: E0905 06:21:21.379387 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.393053 containerd[1585]: time="2025-09-05T06:21:21.392995771Z" level=info msg="connecting to shim 5ea8d65aa53a248caa487ba17e71657efcbbea8951ff8b41a207a8157cb62539" address="unix:///run/containerd/s/7d6166a1c6bb772089047bfa35911cb17a835a5eb362c79ccb4bf4e3c7aa93d9" namespace=k8s.io protocol=ttrpc version=3 Sep 5 06:21:21.422956 systemd[1]: Started cri-containerd-5ea8d65aa53a248caa487ba17e71657efcbbea8951ff8b41a207a8157cb62539.scope - libcontainer container 5ea8d65aa53a248caa487ba17e71657efcbbea8951ff8b41a207a8157cb62539. Sep 5 06:21:21.480012 kubelet[2782]: E0905 06:21:21.479967 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.480012 kubelet[2782]: W0905 06:21:21.480001 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.480170 kubelet[2782]: E0905 06:21:21.480028 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.480354 kubelet[2782]: E0905 06:21:21.480326 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.480389 kubelet[2782]: W0905 06:21:21.480352 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.480389 kubelet[2782]: E0905 06:21:21.480381 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:21:21.480716 kubelet[2782]: E0905 06:21:21.480683 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.480716 kubelet[2782]: W0905 06:21:21.480704 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.480776 kubelet[2782]: E0905 06:21:21.480718 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.481194 kubelet[2782]: E0905 06:21:21.481160 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.481194 kubelet[2782]: W0905 06:21:21.481181 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.481194 kubelet[2782]: E0905 06:21:21.481193 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.481446 kubelet[2782]: E0905 06:21:21.481414 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.481446 kubelet[2782]: W0905 06:21:21.481433 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.481492 kubelet[2782]: E0905 06:21:21.481447 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.481735 kubelet[2782]: E0905 06:21:21.481713 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.481735 kubelet[2782]: W0905 06:21:21.481729 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.481790 kubelet[2782]: E0905 06:21:21.481770 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.482080 kubelet[2782]: E0905 06:21:21.482046 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.482080 kubelet[2782]: W0905 06:21:21.482066 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.482080 kubelet[2782]: E0905 06:21:21.482079 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:21:21.482372 kubelet[2782]: E0905 06:21:21.482352 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.482372 kubelet[2782]: W0905 06:21:21.482368 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.482425 kubelet[2782]: E0905 06:21:21.482383 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.482650 kubelet[2782]: E0905 06:21:21.482630 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.482650 kubelet[2782]: W0905 06:21:21.482645 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.482707 kubelet[2782]: E0905 06:21:21.482658 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.482943 kubelet[2782]: E0905 06:21:21.482924 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.482943 kubelet[2782]: W0905 06:21:21.482940 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.482989 kubelet[2782]: E0905 06:21:21.482953 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.483226 kubelet[2782]: E0905 06:21:21.483206 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.483226 kubelet[2782]: W0905 06:21:21.483221 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.483281 kubelet[2782]: E0905 06:21:21.483233 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.483572 kubelet[2782]: E0905 06:21:21.483554 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.483572 kubelet[2782]: W0905 06:21:21.483568 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.483639 kubelet[2782]: E0905 06:21:21.483578 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:21:21.483794 kubelet[2782]: E0905 06:21:21.483774 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.483794 kubelet[2782]: W0905 06:21:21.483787 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.483794 kubelet[2782]: E0905 06:21:21.483796 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.484023 kubelet[2782]: E0905 06:21:21.484010 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.484023 kubelet[2782]: W0905 06:21:21.484019 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.484073 kubelet[2782]: E0905 06:21:21.484029 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.484190 kubelet[2782]: E0905 06:21:21.484177 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.484190 kubelet[2782]: W0905 06:21:21.484186 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.484242 kubelet[2782]: E0905 06:21:21.484193 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.484466 kubelet[2782]: E0905 06:21:21.484357 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.484466 kubelet[2782]: W0905 06:21:21.484370 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.484466 kubelet[2782]: E0905 06:21:21.484380 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.484593 kubelet[2782]: E0905 06:21:21.484581 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.484716 kubelet[2782]: W0905 06:21:21.484632 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.484716 kubelet[2782]: E0905 06:21:21.484655 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:21:21.485085 kubelet[2782]: E0905 06:21:21.485048 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.485085 kubelet[2782]: W0905 06:21:21.485057 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.485085 kubelet[2782]: E0905 06:21:21.485066 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.485268 kubelet[2782]: E0905 06:21:21.485249 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.485268 kubelet[2782]: W0905 06:21:21.485260 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.485268 kubelet[2782]: E0905 06:21:21.485269 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.485462 kubelet[2782]: E0905 06:21:21.485426 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.485462 kubelet[2782]: W0905 06:21:21.485435 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.485462 kubelet[2782]: E0905 06:21:21.485445 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.485724 kubelet[2782]: E0905 06:21:21.485704 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.485724 kubelet[2782]: W0905 06:21:21.485717 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.485852 kubelet[2782]: E0905 06:21:21.485727 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.486011 kubelet[2782]: E0905 06:21:21.485994 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.486011 kubelet[2782]: W0905 06:21:21.486007 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.486062 kubelet[2782]: E0905 06:21:21.486016 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:21:21.486190 kubelet[2782]: E0905 06:21:21.486171 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.486190 kubelet[2782]: W0905 06:21:21.486183 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.486190 kubelet[2782]: E0905 06:21:21.486190 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.486342 kubelet[2782]: E0905 06:21:21.486327 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.486342 kubelet[2782]: W0905 06:21:21.486337 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.486395 kubelet[2782]: E0905 06:21:21.486348 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.486554 kubelet[2782]: E0905 06:21:21.486537 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.486554 kubelet[2782]: W0905 06:21:21.486548 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.486605 kubelet[2782]: E0905 06:21:21.486557 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:21.492800 kubelet[2782]: E0905 06:21:21.492747 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:21.492800 kubelet[2782]: W0905 06:21:21.492786 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:21.492897 kubelet[2782]: E0905 06:21:21.492829 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:21:21.661112 containerd[1585]: time="2025-09-05T06:21:21.661070568Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-wqdqv,Uid:703cccdc-4845-4624-88f5-32cc362bbcc9,Namespace:calico-system,Attempt:0,} returns sandbox id \"5ea8d65aa53a248caa487ba17e71657efcbbea8951ff8b41a207a8157cb62539\"" Sep 5 06:21:22.723927 kubelet[2782]: E0905 06:21:22.723849 2782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fmvm2" podUID="184974df-56dd-43e9-9fda-6672bf4bc449" Sep 5 06:21:23.189832 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3994615568.mount: Deactivated successfully. Sep 5 06:21:24.725068 kubelet[2782]: E0905 06:21:24.723370 2782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fmvm2" podUID="184974df-56dd-43e9-9fda-6672bf4bc449" Sep 5 06:21:25.331045 containerd[1585]: time="2025-09-05T06:21:25.330991183Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:21:25.332691 containerd[1585]: time="2025-09-05T06:21:25.332634821Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389" Sep 5 06:21:25.335219 containerd[1585]: time="2025-09-05T06:21:25.334205355Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:21:25.336460 containerd[1585]: time="2025-09-05T06:21:25.336392067Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:21:25.336831 containerd[1585]: time="2025-09-05T06:21:25.336771046Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 4.176998027s" Sep 5 06:21:25.336831 containerd[1585]: time="2025-09-05T06:21:25.336832527Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\"" Sep 5 06:21:25.338015 containerd[1585]: time="2025-09-05T06:21:25.337904035Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 5 06:21:25.355340 containerd[1585]: time="2025-09-05T06:21:25.355292191Z" level=info msg="CreateContainer within sandbox \"c98d4e97baa9124566a48ca64865d5d789bc03ab763f080b1dd3dc205a048dcb\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 5 06:21:25.362881 containerd[1585]: time="2025-09-05T06:21:25.362841481Z" level=info msg="Container fd3f41234e8918dde238fb5f7a9071bb88998abd53c785d634b664b93f0c9804: CDI devices from CRI Config.CDIDevices: []" Sep 5 06:21:25.370715 containerd[1585]: 
time="2025-09-05T06:21:25.370675063Z" level=info msg="CreateContainer within sandbox \"c98d4e97baa9124566a48ca64865d5d789bc03ab763f080b1dd3dc205a048dcb\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"fd3f41234e8918dde238fb5f7a9071bb88998abd53c785d634b664b93f0c9804\"" Sep 5 06:21:25.371223 containerd[1585]: time="2025-09-05T06:21:25.371189429Z" level=info msg="StartContainer for \"fd3f41234e8918dde238fb5f7a9071bb88998abd53c785d634b664b93f0c9804\"" Sep 5 06:21:25.372297 containerd[1585]: time="2025-09-05T06:21:25.372273211Z" level=info msg="connecting to shim fd3f41234e8918dde238fb5f7a9071bb88998abd53c785d634b664b93f0c9804" address="unix:///run/containerd/s/bc8184b56e2551990b73c8c1c74c72e47c69b110d68eab20e7cdda34d0c02b34" protocol=ttrpc version=3 Sep 5 06:21:25.399971 systemd[1]: Started cri-containerd-fd3f41234e8918dde238fb5f7a9071bb88998abd53c785d634b664b93f0c9804.scope - libcontainer container fd3f41234e8918dde238fb5f7a9071bb88998abd53c785d634b664b93f0c9804. Sep 5 06:21:25.455734 containerd[1585]: time="2025-09-05T06:21:25.455693961Z" level=info msg="StartContainer for \"fd3f41234e8918dde238fb5f7a9071bb88998abd53c785d634b664b93f0c9804\" returns successfully" Sep 5 06:21:25.809805 kubelet[2782]: I0905 06:21:25.809717 2782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-7c9644548d-vzhrr" podStartSLOduration=1.631290173 podStartE2EDuration="5.809695085s" podCreationTimestamp="2025-09-05 06:21:20 +0000 UTC" firstStartedPulling="2025-09-05 06:21:21.159348325 +0000 UTC m=+22.566590203" lastFinishedPulling="2025-09-05 06:21:25.337753237 +0000 UTC m=+26.744995115" observedRunningTime="2025-09-05 06:21:25.809360052 +0000 UTC m=+27.216601940" watchObservedRunningTime="2025-09-05 06:21:25.809695085 +0000 UTC m=+27.216936963" Sep 5 06:21:25.894507 kubelet[2782]: E0905 06:21:25.894448 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:25.894507 kubelet[2782]: W0905 06:21:25.894477 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:25.894507 kubelet[2782]: E0905 06:21:25.894502 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:25.894803 kubelet[2782]: E0905 06:21:25.894783 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:25.894803 kubelet[2782]: W0905 06:21:25.894795 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:25.894884 kubelet[2782]: E0905 06:21:25.894839 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:21:25.895071 kubelet[2782]: E0905 06:21:25.895053 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:25.895071 kubelet[2782]: W0905 06:21:25.895065 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:25.895154 kubelet[2782]: E0905 06:21:25.895073 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:25.895341 kubelet[2782]: E0905 06:21:25.895300 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:25.895341 kubelet[2782]: W0905 06:21:25.895317 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:25.895341 kubelet[2782]: E0905 06:21:25.895328 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:25.895559 kubelet[2782]: E0905 06:21:25.895541 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:25.895559 kubelet[2782]: W0905 06:21:25.895553 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:25.895621 kubelet[2782]: E0905 06:21:25.895562 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:25.895757 kubelet[2782]: E0905 06:21:25.895738 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:25.895757 kubelet[2782]: W0905 06:21:25.895753 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:25.895839 kubelet[2782]: E0905 06:21:25.895762 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:25.895991 kubelet[2782]: E0905 06:21:25.895975 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:25.895991 kubelet[2782]: W0905 06:21:25.895985 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:25.896053 kubelet[2782]: E0905 06:21:25.895994 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:21:25.896234 kubelet[2782]: E0905 06:21:25.896216 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:25.896234 kubelet[2782]: W0905 06:21:25.896231 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:25.896285 kubelet[2782]: E0905 06:21:25.896242 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:25.896537 kubelet[2782]: E0905 06:21:25.896501 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:25.896575 kubelet[2782]: W0905 06:21:25.896534 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:25.896575 kubelet[2782]: E0905 06:21:25.896564 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:25.896845 kubelet[2782]: E0905 06:21:25.896828 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:25.896845 kubelet[2782]: W0905 06:21:25.896840 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:25.896902 kubelet[2782]: E0905 06:21:25.896850 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:25.897029 kubelet[2782]: E0905 06:21:25.897013 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:25.897029 kubelet[2782]: W0905 06:21:25.897024 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:25.897078 kubelet[2782]: E0905 06:21:25.897032 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:25.897243 kubelet[2782]: E0905 06:21:25.897227 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:25.897243 kubelet[2782]: W0905 06:21:25.897239 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:25.897296 kubelet[2782]: E0905 06:21:25.897247 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:21:25.897483 kubelet[2782]: E0905 06:21:25.897466 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:25.897483 kubelet[2782]: W0905 06:21:25.897479 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:25.897531 kubelet[2782]: E0905 06:21:25.897490 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:25.897706 kubelet[2782]: E0905 06:21:25.897690 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:25.897706 kubelet[2782]: W0905 06:21:25.897702 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:25.897758 kubelet[2782]: E0905 06:21:25.897711 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:25.897942 kubelet[2782]: E0905 06:21:25.897923 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:25.897942 kubelet[2782]: W0905 06:21:25.897936 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:25.898006 kubelet[2782]: E0905 06:21:25.897945 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:25.911664 kubelet[2782]: E0905 06:21:25.911604 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:25.911664 kubelet[2782]: W0905 06:21:25.911638 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:25.911664 kubelet[2782]: E0905 06:21:25.911668 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:25.912799 kubelet[2782]: E0905 06:21:25.912768 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:25.912799 kubelet[2782]: W0905 06:21:25.912781 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:25.912799 kubelet[2782]: E0905 06:21:25.912790 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:21:25.914031 kubelet[2782]: E0905 06:21:25.913139 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:25.914031 kubelet[2782]: W0905 06:21:25.913157 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:25.914031 kubelet[2782]: E0905 06:21:25.913166 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:25.914031 kubelet[2782]: E0905 06:21:25.913490 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:25.914031 kubelet[2782]: W0905 06:21:25.913499 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:25.914031 kubelet[2782]: E0905 06:21:25.913508 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:25.914031 kubelet[2782]: E0905 06:21:25.913782 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:25.914031 kubelet[2782]: W0905 06:21:25.913805 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:25.914031 kubelet[2782]: E0905 06:21:25.913853 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:25.914306 kubelet[2782]: E0905 06:21:25.914154 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:25.914306 kubelet[2782]: W0905 06:21:25.914167 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:25.914306 kubelet[2782]: E0905 06:21:25.914192 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:25.914521 kubelet[2782]: E0905 06:21:25.914501 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:25.914521 kubelet[2782]: W0905 06:21:25.914518 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:25.914521 kubelet[2782]: E0905 06:21:25.914531 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:21:25.914858 kubelet[2782]: E0905 06:21:25.914838 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:25.914858 kubelet[2782]: W0905 06:21:25.914854 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:25.914946 kubelet[2782]: E0905 06:21:25.914864 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:25.915159 kubelet[2782]: E0905 06:21:25.915123 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:25.915159 kubelet[2782]: W0905 06:21:25.915136 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:25.915159 kubelet[2782]: E0905 06:21:25.915147 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:25.915736 kubelet[2782]: E0905 06:21:25.915716 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:25.915736 kubelet[2782]: W0905 06:21:25.915731 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:25.915941 kubelet[2782]: E0905 06:21:25.915743 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:25.916034 kubelet[2782]: E0905 06:21:25.916012 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:25.916034 kubelet[2782]: W0905 06:21:25.916028 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:25.916177 kubelet[2782]: E0905 06:21:25.916039 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:25.916328 kubelet[2782]: E0905 06:21:25.916306 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:25.916328 kubelet[2782]: W0905 06:21:25.916322 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:25.916550 kubelet[2782]: E0905 06:21:25.916342 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:21:25.916615 kubelet[2782]: E0905 06:21:25.916596 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:25.916615 kubelet[2782]: W0905 06:21:25.916611 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:25.916784 kubelet[2782]: E0905 06:21:25.916623 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:25.916930 kubelet[2782]: E0905 06:21:25.916892 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:25.916930 kubelet[2782]: W0905 06:21:25.916908 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:25.916930 kubelet[2782]: E0905 06:21:25.916919 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:25.917372 kubelet[2782]: E0905 06:21:25.917344 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:25.917372 kubelet[2782]: W0905 06:21:25.917368 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:25.917606 kubelet[2782]: E0905 06:21:25.917391 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:25.917636 kubelet[2782]: E0905 06:21:25.917610 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:25.917636 kubelet[2782]: W0905 06:21:25.917619 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:25.917636 kubelet[2782]: E0905 06:21:25.917628 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:25.917995 kubelet[2782]: E0905 06:21:25.917970 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:25.917995 kubelet[2782]: W0905 06:21:25.917989 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:25.918066 kubelet[2782]: E0905 06:21:25.918001 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:21:25.918235 kubelet[2782]: E0905 06:21:25.918217 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:25.918235 kubelet[2782]: W0905 06:21:25.918234 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:25.918288 kubelet[2782]: E0905 06:21:25.918245 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:26.723922 kubelet[2782]: E0905 06:21:26.723864 2782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fmvm2" podUID="184974df-56dd-43e9-9fda-6672bf4bc449" Sep 5 06:21:26.798885 kubelet[2782]: I0905 06:21:26.798842 2782 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 5 06:21:26.803113 kubelet[2782]: E0905 06:21:26.803083 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:26.803113 kubelet[2782]: W0905 06:21:26.803105 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:26.803278 kubelet[2782]: E0905 06:21:26.803129 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:26.803352 kubelet[2782]: E0905 06:21:26.803336 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:26.803381 kubelet[2782]: W0905 06:21:26.803350 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:26.803381 kubelet[2782]: E0905 06:21:26.803362 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:26.803605 kubelet[2782]: E0905 06:21:26.803579 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:26.803647 kubelet[2782]: W0905 06:21:26.803606 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:26.803647 kubelet[2782]: E0905 06:21:26.803622 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:21:26.803902 kubelet[2782]: E0905 06:21:26.803884 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:26.803902 kubelet[2782]: W0905 06:21:26.803898 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:26.803978 kubelet[2782]: E0905 06:21:26.803908 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:26.804189 kubelet[2782]: E0905 06:21:26.804159 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:26.804189 kubelet[2782]: W0905 06:21:26.804184 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:26.804316 kubelet[2782]: E0905 06:21:26.804196 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:26.804444 kubelet[2782]: E0905 06:21:26.804427 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:26.804444 kubelet[2782]: W0905 06:21:26.804440 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:26.804545 kubelet[2782]: E0905 06:21:26.804450 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:26.805109 kubelet[2782]: E0905 06:21:26.804846 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:26.805109 kubelet[2782]: W0905 06:21:26.804862 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:26.805109 kubelet[2782]: E0905 06:21:26.804874 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:26.805109 kubelet[2782]: E0905 06:21:26.805099 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:26.805109 kubelet[2782]: W0905 06:21:26.805110 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:26.805308 kubelet[2782]: E0905 06:21:26.805125 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:21:26.805358 kubelet[2782]: E0905 06:21:26.805350 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:26.805430 kubelet[2782]: W0905 06:21:26.805360 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:26.805430 kubelet[2782]: E0905 06:21:26.805387 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:26.805727 kubelet[2782]: E0905 06:21:26.805673 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:26.805727 kubelet[2782]: W0905 06:21:26.805698 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:26.805727 kubelet[2782]: E0905 06:21:26.805720 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:26.806170 kubelet[2782]: E0905 06:21:26.806020 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:26.806170 kubelet[2782]: W0905 06:21:26.806031 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:26.806170 kubelet[2782]: E0905 06:21:26.806045 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:26.806287 kubelet[2782]: E0905 06:21:26.806275 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:26.806287 kubelet[2782]: W0905 06:21:26.806284 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:26.806361 kubelet[2782]: E0905 06:21:26.806297 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:26.806530 kubelet[2782]: E0905 06:21:26.806513 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:26.806530 kubelet[2782]: W0905 06:21:26.806522 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:26.806685 kubelet[2782]: E0905 06:21:26.806531 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:21:26.806836 kubelet[2782]: E0905 06:21:26.806804 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:26.806836 kubelet[2782]: W0905 06:21:26.806834 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:26.806941 kubelet[2782]: E0905 06:21:26.806843 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:26.807041 kubelet[2782]: E0905 06:21:26.807024 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:26.807041 kubelet[2782]: W0905 06:21:26.807034 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:26.807122 kubelet[2782]: E0905 06:21:26.807043 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:26.819938 kubelet[2782]: E0905 06:21:26.819878 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:26.819938 kubelet[2782]: W0905 06:21:26.819915 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:26.819938 kubelet[2782]: E0905 06:21:26.819943 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:26.820576 kubelet[2782]: E0905 06:21:26.820224 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:26.820576 kubelet[2782]: W0905 06:21:26.820236 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:26.820576 kubelet[2782]: E0905 06:21:26.820247 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:26.820675 kubelet[2782]: E0905 06:21:26.820630 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:26.820675 kubelet[2782]: W0905 06:21:26.820664 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:26.820758 kubelet[2782]: E0905 06:21:26.820691 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:21:26.820948 kubelet[2782]: E0905 06:21:26.820931 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:26.820948 kubelet[2782]: W0905 06:21:26.820942 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:26.821031 kubelet[2782]: E0905 06:21:26.820954 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:26.821214 kubelet[2782]: E0905 06:21:26.821196 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:26.821214 kubelet[2782]: W0905 06:21:26.821206 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:26.821214 kubelet[2782]: E0905 06:21:26.821215 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:26.821537 kubelet[2782]: E0905 06:21:26.821518 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:26.821537 kubelet[2782]: W0905 06:21:26.821528 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:26.821537 kubelet[2782]: E0905 06:21:26.821537 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:26.821804 kubelet[2782]: E0905 06:21:26.821786 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:26.821804 kubelet[2782]: W0905 06:21:26.821796 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:26.821804 kubelet[2782]: E0905 06:21:26.821805 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:26.822099 kubelet[2782]: E0905 06:21:26.822079 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:26.822099 kubelet[2782]: W0905 06:21:26.822094 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:26.822203 kubelet[2782]: E0905 06:21:26.822111 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:21:26.822336 kubelet[2782]: E0905 06:21:26.822312 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:26.822336 kubelet[2782]: W0905 06:21:26.822329 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:26.822445 kubelet[2782]: E0905 06:21:26.822345 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:26.822585 kubelet[2782]: E0905 06:21:26.822567 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:26.822585 kubelet[2782]: W0905 06:21:26.822580 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:26.822648 kubelet[2782]: E0905 06:21:26.822591 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:26.822840 kubelet[2782]: E0905 06:21:26.822820 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:26.822840 kubelet[2782]: W0905 06:21:26.822834 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:26.822918 kubelet[2782]: E0905 06:21:26.822849 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:26.823205 kubelet[2782]: E0905 06:21:26.823185 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:26.823205 kubelet[2782]: W0905 06:21:26.823203 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:26.823263 kubelet[2782]: E0905 06:21:26.823217 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:26.823493 kubelet[2782]: E0905 06:21:26.823467 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:26.823493 kubelet[2782]: W0905 06:21:26.823486 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:26.823589 kubelet[2782]: E0905 06:21:26.823501 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:21:26.823840 kubelet[2782]: E0905 06:21:26.823822 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:26.823840 kubelet[2782]: W0905 06:21:26.823835 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:26.823910 kubelet[2782]: E0905 06:21:26.823846 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:26.824155 kubelet[2782]: E0905 06:21:26.824119 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:26.824155 kubelet[2782]: W0905 06:21:26.824140 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:26.824237 kubelet[2782]: E0905 06:21:26.824160 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:26.825204 kubelet[2782]: E0905 06:21:26.825173 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:26.825204 kubelet[2782]: W0905 06:21:26.825197 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:26.825267 kubelet[2782]: E0905 06:21:26.825208 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:26.825483 kubelet[2782]: E0905 06:21:26.825452 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:26.825483 kubelet[2782]: W0905 06:21:26.825473 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:26.825553 kubelet[2782]: E0905 06:21:26.825489 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 06:21:26.825866 kubelet[2782]: E0905 06:21:26.825850 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 06:21:26.825866 kubelet[2782]: W0905 06:21:26.825863 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 06:21:26.825931 kubelet[2782]: E0905 06:21:26.825874 2782 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 06:21:27.091142 containerd[1585]: time="2025-09-05T06:21:27.091003011Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:21:27.091858 containerd[1585]: time="2025-09-05T06:21:27.091762670Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660" Sep 5 06:21:27.093332 containerd[1585]: time="2025-09-05T06:21:27.093293209Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:21:27.095662 containerd[1585]: time="2025-09-05T06:21:27.095331429Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:21:27.096036 containerd[1585]: time="2025-09-05T06:21:27.096004917Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 1.758067306s" Sep 5 06:21:27.096036 containerd[1585]: time="2025-09-05T06:21:27.096033644Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\"" Sep 5 06:21:27.106152 containerd[1585]: time="2025-09-05T06:21:27.104046318Z" level=info msg="CreateContainer within sandbox \"5ea8d65aa53a248caa487ba17e71657efcbbea8951ff8b41a207a8157cb62539\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 5 06:21:27.119973 containerd[1585]: time="2025-09-05T06:21:27.119904083Z" level=info msg="Container 2cebef73609379ac150389edcb6c00e223c9205be560efeea009267fdfbb3c3b: CDI devices from CRI Config.CDIDevices: []" Sep 5 06:21:27.130371 containerd[1585]: time="2025-09-05T06:21:27.130311921Z" level=info msg="CreateContainer within sandbox \"5ea8d65aa53a248caa487ba17e71657efcbbea8951ff8b41a207a8157cb62539\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"2cebef73609379ac150389edcb6c00e223c9205be560efeea009267fdfbb3c3b\"" Sep 5 06:21:27.130952 containerd[1585]: time="2025-09-05T06:21:27.130920150Z" level=info msg="StartContainer for \"2cebef73609379ac150389edcb6c00e223c9205be560efeea009267fdfbb3c3b\"" Sep 5 06:21:27.132592 containerd[1585]: time="2025-09-05T06:21:27.132557620Z" level=info msg="connecting to shim 2cebef73609379ac150389edcb6c00e223c9205be560efeea009267fdfbb3c3b" address="unix:///run/containerd/s/7d6166a1c6bb772089047bfa35911cb17a835a5eb362c79ccb4bf4e3c7aa93d9" protocol=ttrpc version=3 Sep 5 06:21:27.161946 systemd[1]: Started cri-containerd-2cebef73609379ac150389edcb6c00e223c9205be560efeea009267fdfbb3c3b.scope - libcontainer container 2cebef73609379ac150389edcb6c00e223c9205be560efeea009267fdfbb3c3b. 
Sep 5 06:21:27.214940 containerd[1585]: time="2025-09-05T06:21:27.214851990Z" level=info msg="StartContainer for \"2cebef73609379ac150389edcb6c00e223c9205be560efeea009267fdfbb3c3b\" returns successfully" Sep 5 06:21:27.226123 systemd[1]: cri-containerd-2cebef73609379ac150389edcb6c00e223c9205be560efeea009267fdfbb3c3b.scope: Deactivated successfully. Sep 5 06:21:27.229083 containerd[1585]: time="2025-09-05T06:21:27.229040885Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2cebef73609379ac150389edcb6c00e223c9205be560efeea009267fdfbb3c3b\" id:\"2cebef73609379ac150389edcb6c00e223c9205be560efeea009267fdfbb3c3b\" pid:3514 exited_at:{seconds:1757053287 nanos:228569795}" Sep 5 06:21:27.229253 containerd[1585]: time="2025-09-05T06:21:27.229150741Z" level=info msg="received exit event container_id:\"2cebef73609379ac150389edcb6c00e223c9205be560efeea009267fdfbb3c3b\" id:\"2cebef73609379ac150389edcb6c00e223c9205be560efeea009267fdfbb3c3b\" pid:3514 exited_at:{seconds:1757053287 nanos:228569795}" Sep 5 06:21:27.252505 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-2cebef73609379ac150389edcb6c00e223c9205be560efeea009267fdfbb3c3b-rootfs.mount: Deactivated successfully. Sep 5 06:21:28.723597 kubelet[2782]: E0905 06:21:28.723533 2782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fmvm2" podUID="184974df-56dd-43e9-9fda-6672bf4bc449" Sep 5 06:21:28.806654 containerd[1585]: time="2025-09-05T06:21:28.806617395Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 5 06:21:30.723584 kubelet[2782]: E0905 06:21:30.723504 2782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fmvm2" podUID="184974df-56dd-43e9-9fda-6672bf4bc449" Sep 5 06:21:32.723379 kubelet[2782]: E0905 06:21:32.723301 2782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fmvm2" podUID="184974df-56dd-43e9-9fda-6672bf4bc449" Sep 5 06:21:33.520876 containerd[1585]: time="2025-09-05T06:21:33.520822171Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:21:33.521872 containerd[1585]: time="2025-09-05T06:21:33.521842103Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613" Sep 5 06:21:33.523501 containerd[1585]: time="2025-09-05T06:21:33.523414760Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:21:33.526201 containerd[1585]: time="2025-09-05T06:21:33.526160560Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:21:33.526753 containerd[1585]: time="2025-09-05T06:21:33.526715619Z" level=info msg="Pulled image 
\"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 4.720062934s" Sep 5 06:21:33.526753 containerd[1585]: time="2025-09-05T06:21:33.526747692Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\"" Sep 5 06:21:33.532434 containerd[1585]: time="2025-09-05T06:21:33.532380218Z" level=info msg="CreateContainer within sandbox \"5ea8d65aa53a248caa487ba17e71657efcbbea8951ff8b41a207a8157cb62539\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 5 06:21:33.545074 containerd[1585]: time="2025-09-05T06:21:33.545000780Z" level=info msg="Container 93445fdbafdafd58ef1e61c52e71db16b6e32c22d07fc62164700e317f231cc3: CDI devices from CRI Config.CDIDevices: []" Sep 5 06:21:33.556819 containerd[1585]: time="2025-09-05T06:21:33.556755122Z" level=info msg="CreateContainer within sandbox \"5ea8d65aa53a248caa487ba17e71657efcbbea8951ff8b41a207a8157cb62539\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"93445fdbafdafd58ef1e61c52e71db16b6e32c22d07fc62164700e317f231cc3\"" Sep 5 06:21:33.557452 containerd[1585]: time="2025-09-05T06:21:33.557407583Z" level=info msg="StartContainer for \"93445fdbafdafd58ef1e61c52e71db16b6e32c22d07fc62164700e317f231cc3\"" Sep 5 06:21:33.558966 containerd[1585]: time="2025-09-05T06:21:33.558940111Z" level=info msg="connecting to shim 93445fdbafdafd58ef1e61c52e71db16b6e32c22d07fc62164700e317f231cc3" address="unix:///run/containerd/s/7d6166a1c6bb772089047bfa35911cb17a835a5eb362c79ccb4bf4e3c7aa93d9" protocol=ttrpc version=3 Sep 5 06:21:33.584026 systemd[1]: Started cri-containerd-93445fdbafdafd58ef1e61c52e71db16b6e32c22d07fc62164700e317f231cc3.scope - libcontainer container 93445fdbafdafd58ef1e61c52e71db16b6e32c22d07fc62164700e317f231cc3. Sep 5 06:21:33.660582 containerd[1585]: time="2025-09-05T06:21:33.660535057Z" level=info msg="StartContainer for \"93445fdbafdafd58ef1e61c52e71db16b6e32c22d07fc62164700e317f231cc3\" returns successfully" Sep 5 06:21:34.724041 kubelet[2782]: E0905 06:21:34.723965 2782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fmvm2" podUID="184974df-56dd-43e9-9fda-6672bf4bc449" Sep 5 06:21:35.317322 systemd[1]: cri-containerd-93445fdbafdafd58ef1e61c52e71db16b6e32c22d07fc62164700e317f231cc3.scope: Deactivated successfully. Sep 5 06:21:35.317770 systemd[1]: cri-containerd-93445fdbafdafd58ef1e61c52e71db16b6e32c22d07fc62164700e317f231cc3.scope: Consumed 668ms CPU time, 184.3M memory peak, 3.3M read from disk, 171.3M written to disk. 
Sep 5 06:21:35.319145 containerd[1585]: time="2025-09-05T06:21:35.319081325Z" level=info msg="received exit event container_id:\"93445fdbafdafd58ef1e61c52e71db16b6e32c22d07fc62164700e317f231cc3\" id:\"93445fdbafdafd58ef1e61c52e71db16b6e32c22d07fc62164700e317f231cc3\" pid:3574 exited_at:{seconds:1757053295 nanos:318846935}" Sep 5 06:21:35.319613 containerd[1585]: time="2025-09-05T06:21:35.319187292Z" level=info msg="TaskExit event in podsandbox handler container_id:\"93445fdbafdafd58ef1e61c52e71db16b6e32c22d07fc62164700e317f231cc3\" id:\"93445fdbafdafd58ef1e61c52e71db16b6e32c22d07fc62164700e317f231cc3\" pid:3574 exited_at:{seconds:1757053295 nanos:318846935}" Sep 5 06:21:35.341684 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-93445fdbafdafd58ef1e61c52e71db16b6e32c22d07fc62164700e317f231cc3-rootfs.mount: Deactivated successfully. Sep 5 06:21:35.346644 kubelet[2782]: I0905 06:21:35.346614 2782 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Sep 5 06:21:35.989032 systemd[1]: Created slice kubepods-burstable-pod6db321e6_384e_461f_b494_486fb3abaa39.slice - libcontainer container kubepods-burstable-pod6db321e6_384e_461f_b494_486fb3abaa39.slice. Sep 5 06:21:35.996996 systemd[1]: Created slice kubepods-besteffort-pod90d4fc6b_e10d_48c1_a100_581b1e45bea7.slice - libcontainer container kubepods-besteffort-pod90d4fc6b_e10d_48c1_a100_581b1e45bea7.slice. Sep 5 06:21:36.007198 systemd[1]: Created slice kubepods-besteffort-pode538d525_2f5b_4679_bb7b_2d7b2bfcc773.slice - libcontainer container kubepods-besteffort-pode538d525_2f5b_4679_bb7b_2d7b2bfcc773.slice. Sep 5 06:21:36.016115 systemd[1]: Created slice kubepods-burstable-podcbe88b9e_342d_4d7f_93cf_136f2f4d94ec.slice - libcontainer container kubepods-burstable-podcbe88b9e_342d_4d7f_93cf_136f2f4d94ec.slice. Sep 5 06:21:36.023465 systemd[1]: Created slice kubepods-besteffort-pod7d8d19a7_2cb7_4172_9d5e_ed2788eb3120.slice - libcontainer container kubepods-besteffort-pod7d8d19a7_2cb7_4172_9d5e_ed2788eb3120.slice. Sep 5 06:21:36.029606 systemd[1]: Created slice kubepods-besteffort-pod26a5a23e_432c_4622_993d_2e73ea07fb80.slice - libcontainer container kubepods-besteffort-pod26a5a23e_432c_4622_993d_2e73ea07fb80.slice. 
Sep 5 06:21:36.031322 kubelet[2782]: I0905 06:21:36.031277 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlkw4\" (UniqueName: \"kubernetes.io/projected/26a5a23e-432c-4622-993d-2e73ea07fb80-kube-api-access-xlkw4\") pod \"calico-apiserver-cf68b5bff-zqmnv\" (UID: \"26a5a23e-432c-4622-993d-2e73ea07fb80\") " pod="calico-apiserver/calico-apiserver-cf68b5bff-zqmnv" Sep 5 06:21:36.031797 kubelet[2782]: I0905 06:21:36.031323 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pfxh\" (UniqueName: \"kubernetes.io/projected/90d4fc6b-e10d-48c1-a100-581b1e45bea7-kube-api-access-5pfxh\") pod \"whisker-65988bcf59-9jlv5\" (UID: \"90d4fc6b-e10d-48c1-a100-581b1e45bea7\") " pod="calico-system/whisker-65988bcf59-9jlv5" Sep 5 06:21:36.031797 kubelet[2782]: I0905 06:21:36.031342 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxcsw\" (UniqueName: \"kubernetes.io/projected/6db321e6-384e-461f-b494-486fb3abaa39-kube-api-access-xxcsw\") pod \"coredns-674b8bbfcf-pbptv\" (UID: \"6db321e6-384e-461f-b494-486fb3abaa39\") " pod="kube-system/coredns-674b8bbfcf-pbptv" Sep 5 06:21:36.031797 kubelet[2782]: I0905 06:21:36.031360 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/26a5a23e-432c-4622-993d-2e73ea07fb80-calico-apiserver-certs\") pod \"calico-apiserver-cf68b5bff-zqmnv\" (UID: \"26a5a23e-432c-4622-993d-2e73ea07fb80\") " pod="calico-apiserver/calico-apiserver-cf68b5bff-zqmnv" Sep 5 06:21:36.031797 kubelet[2782]: I0905 06:21:36.031375 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cbe88b9e-342d-4d7f-93cf-136f2f4d94ec-config-volume\") pod \"coredns-674b8bbfcf-xhgtj\" (UID: \"cbe88b9e-342d-4d7f-93cf-136f2f4d94ec\") " pod="kube-system/coredns-674b8bbfcf-xhgtj" Sep 5 06:21:36.031797 kubelet[2782]: I0905 06:21:36.031390 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/164bbda0-024b-4642-aebe-7f8294f38db0-config\") pod \"goldmane-54d579b49d-kg56r\" (UID: \"164bbda0-024b-4642-aebe-7f8294f38db0\") " pod="calico-system/goldmane-54d579b49d-kg56r" Sep 5 06:21:36.031941 kubelet[2782]: I0905 06:21:36.031403 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/164bbda0-024b-4642-aebe-7f8294f38db0-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-kg56r\" (UID: \"164bbda0-024b-4642-aebe-7f8294f38db0\") " pod="calico-system/goldmane-54d579b49d-kg56r" Sep 5 06:21:36.031941 kubelet[2782]: I0905 06:21:36.031426 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d8d19a7-2cb7-4172-9d5e-ed2788eb3120-tigera-ca-bundle\") pod \"calico-kube-controllers-7b8d7b8bf-wqzvf\" (UID: \"7d8d19a7-2cb7-4172-9d5e-ed2788eb3120\") " pod="calico-system/calico-kube-controllers-7b8d7b8bf-wqzvf" Sep 5 06:21:36.031941 kubelet[2782]: I0905 06:21:36.031446 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: 
\"kubernetes.io/secret/164bbda0-024b-4642-aebe-7f8294f38db0-goldmane-key-pair\") pod \"goldmane-54d579b49d-kg56r\" (UID: \"164bbda0-024b-4642-aebe-7f8294f38db0\") " pod="calico-system/goldmane-54d579b49d-kg56r" Sep 5 06:21:36.031941 kubelet[2782]: I0905 06:21:36.031476 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx627\" (UniqueName: \"kubernetes.io/projected/7d8d19a7-2cb7-4172-9d5e-ed2788eb3120-kube-api-access-rx627\") pod \"calico-kube-controllers-7b8d7b8bf-wqzvf\" (UID: \"7d8d19a7-2cb7-4172-9d5e-ed2788eb3120\") " pod="calico-system/calico-kube-controllers-7b8d7b8bf-wqzvf" Sep 5 06:21:36.031941 kubelet[2782]: I0905 06:21:36.031506 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/e538d525-2f5b-4679-bb7b-2d7b2bfcc773-calico-apiserver-certs\") pod \"calico-apiserver-cf68b5bff-qcvfw\" (UID: \"e538d525-2f5b-4679-bb7b-2d7b2bfcc773\") " pod="calico-apiserver/calico-apiserver-cf68b5bff-qcvfw" Sep 5 06:21:36.032868 kubelet[2782]: I0905 06:21:36.031533 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wrsf\" (UniqueName: \"kubernetes.io/projected/cbe88b9e-342d-4d7f-93cf-136f2f4d94ec-kube-api-access-8wrsf\") pod \"coredns-674b8bbfcf-xhgtj\" (UID: \"cbe88b9e-342d-4d7f-93cf-136f2f4d94ec\") " pod="kube-system/coredns-674b8bbfcf-xhgtj" Sep 5 06:21:36.032868 kubelet[2782]: I0905 06:21:36.031562 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz2sz\" (UniqueName: \"kubernetes.io/projected/164bbda0-024b-4642-aebe-7f8294f38db0-kube-api-access-lz2sz\") pod \"goldmane-54d579b49d-kg56r\" (UID: \"164bbda0-024b-4642-aebe-7f8294f38db0\") " pod="calico-system/goldmane-54d579b49d-kg56r" Sep 5 06:21:36.032868 kubelet[2782]: I0905 06:21:36.031595 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5p84\" (UniqueName: \"kubernetes.io/projected/e538d525-2f5b-4679-bb7b-2d7b2bfcc773-kube-api-access-x5p84\") pod \"calico-apiserver-cf68b5bff-qcvfw\" (UID: \"e538d525-2f5b-4679-bb7b-2d7b2bfcc773\") " pod="calico-apiserver/calico-apiserver-cf68b5bff-qcvfw" Sep 5 06:21:36.032868 kubelet[2782]: I0905 06:21:36.031612 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90d4fc6b-e10d-48c1-a100-581b1e45bea7-whisker-ca-bundle\") pod \"whisker-65988bcf59-9jlv5\" (UID: \"90d4fc6b-e10d-48c1-a100-581b1e45bea7\") " pod="calico-system/whisker-65988bcf59-9jlv5" Sep 5 06:21:36.032868 kubelet[2782]: I0905 06:21:36.031630 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6db321e6-384e-461f-b494-486fb3abaa39-config-volume\") pod \"coredns-674b8bbfcf-pbptv\" (UID: \"6db321e6-384e-461f-b494-486fb3abaa39\") " pod="kube-system/coredns-674b8bbfcf-pbptv" Sep 5 06:21:36.032992 kubelet[2782]: I0905 06:21:36.031650 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/90d4fc6b-e10d-48c1-a100-581b1e45bea7-whisker-backend-key-pair\") pod \"whisker-65988bcf59-9jlv5\" (UID: \"90d4fc6b-e10d-48c1-a100-581b1e45bea7\") " 
pod="calico-system/whisker-65988bcf59-9jlv5" Sep 5 06:21:36.038974 systemd[1]: Created slice kubepods-besteffort-pod164bbda0_024b_4642_aebe_7f8294f38db0.slice - libcontainer container kubepods-besteffort-pod164bbda0_024b_4642_aebe_7f8294f38db0.slice. Sep 5 06:21:36.296029 containerd[1585]: time="2025-09-05T06:21:36.295918417Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-pbptv,Uid:6db321e6-384e-461f-b494-486fb3abaa39,Namespace:kube-system,Attempt:0,}" Sep 5 06:21:36.301662 containerd[1585]: time="2025-09-05T06:21:36.301611462Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-65988bcf59-9jlv5,Uid:90d4fc6b-e10d-48c1-a100-581b1e45bea7,Namespace:calico-system,Attempt:0,}" Sep 5 06:21:36.311582 containerd[1585]: time="2025-09-05T06:21:36.311524755Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-cf68b5bff-qcvfw,Uid:e538d525-2f5b-4679-bb7b-2d7b2bfcc773,Namespace:calico-apiserver,Attempt:0,}" Sep 5 06:21:36.321646 containerd[1585]: time="2025-09-05T06:21:36.321387520Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-xhgtj,Uid:cbe88b9e-342d-4d7f-93cf-136f2f4d94ec,Namespace:kube-system,Attempt:0,}" Sep 5 06:21:36.327142 containerd[1585]: time="2025-09-05T06:21:36.327064222Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7b8d7b8bf-wqzvf,Uid:7d8d19a7-2cb7-4172-9d5e-ed2788eb3120,Namespace:calico-system,Attempt:0,}" Sep 5 06:21:36.337692 containerd[1585]: time="2025-09-05T06:21:36.337655103Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-cf68b5bff-zqmnv,Uid:26a5a23e-432c-4622-993d-2e73ea07fb80,Namespace:calico-apiserver,Attempt:0,}" Sep 5 06:21:36.349509 containerd[1585]: time="2025-09-05T06:21:36.348036073Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-kg56r,Uid:164bbda0-024b-4642-aebe-7f8294f38db0,Namespace:calico-system,Attempt:0,}" Sep 5 06:21:36.392963 containerd[1585]: time="2025-09-05T06:21:36.392890728Z" level=error msg="Failed to destroy network for sandbox \"277e3faf017b0fabce15fb32671596c8db90778cbf545b563d321ac7d6a482c9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 06:21:36.394731 containerd[1585]: time="2025-09-05T06:21:36.394675435Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-pbptv,Uid:6db321e6-384e-461f-b494-486fb3abaa39,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"277e3faf017b0fabce15fb32671596c8db90778cbf545b563d321ac7d6a482c9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 06:21:36.395543 kubelet[2782]: E0905 06:21:36.395487 2782 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"277e3faf017b0fabce15fb32671596c8db90778cbf545b563d321ac7d6a482c9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 06:21:36.395601 kubelet[2782]: E0905 06:21:36.395588 2782 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"277e3faf017b0fabce15fb32671596c8db90778cbf545b563d321ac7d6a482c9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-pbptv" Sep 5 06:21:36.395633 kubelet[2782]: E0905 06:21:36.395610 2782 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"277e3faf017b0fabce15fb32671596c8db90778cbf545b563d321ac7d6a482c9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-pbptv" Sep 5 06:21:36.395710 kubelet[2782]: E0905 06:21:36.395674 2782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-pbptv_kube-system(6db321e6-384e-461f-b494-486fb3abaa39)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-pbptv_kube-system(6db321e6-384e-461f-b494-486fb3abaa39)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"277e3faf017b0fabce15fb32671596c8db90778cbf545b563d321ac7d6a482c9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-pbptv" podUID="6db321e6-384e-461f-b494-486fb3abaa39" Sep 5 06:21:36.397501 systemd[1]: run-netns-cni\x2d15c992cf\x2d23fd\x2d0faf\x2dee36\x2de40799014b08.mount: Deactivated successfully. Sep 5 06:21:36.411350 containerd[1585]: time="2025-09-05T06:21:36.411291542Z" level=error msg="Failed to destroy network for sandbox \"f9f59bc6ba4a96cadb126a7556551ef42fd75df1a54d9bb0cbf0cd4681e244c7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 06:21:36.416095 systemd[1]: run-netns-cni\x2d637f4a48\x2d0b84\x2d9c9f\x2d36b5\x2db857dc1aaf39.mount: Deactivated successfully. 
Sep 5 06:21:36.419077 containerd[1585]: time="2025-09-05T06:21:36.418991829Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-65988bcf59-9jlv5,Uid:90d4fc6b-e10d-48c1-a100-581b1e45bea7,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f9f59bc6ba4a96cadb126a7556551ef42fd75df1a54d9bb0cbf0cd4681e244c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 06:21:36.419364 kubelet[2782]: E0905 06:21:36.419308 2782 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f9f59bc6ba4a96cadb126a7556551ef42fd75df1a54d9bb0cbf0cd4681e244c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 06:21:36.419466 kubelet[2782]: E0905 06:21:36.419391 2782 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f9f59bc6ba4a96cadb126a7556551ef42fd75df1a54d9bb0cbf0cd4681e244c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-65988bcf59-9jlv5" Sep 5 06:21:36.419466 kubelet[2782]: E0905 06:21:36.419418 2782 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f9f59bc6ba4a96cadb126a7556551ef42fd75df1a54d9bb0cbf0cd4681e244c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-65988bcf59-9jlv5" Sep 5 06:21:36.419625 kubelet[2782]: E0905 06:21:36.419476 2782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-65988bcf59-9jlv5_calico-system(90d4fc6b-e10d-48c1-a100-581b1e45bea7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-65988bcf59-9jlv5_calico-system(90d4fc6b-e10d-48c1-a100-581b1e45bea7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f9f59bc6ba4a96cadb126a7556551ef42fd75df1a54d9bb0cbf0cd4681e244c7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-65988bcf59-9jlv5" podUID="90d4fc6b-e10d-48c1-a100-581b1e45bea7" Sep 5 06:21:36.447132 containerd[1585]: time="2025-09-05T06:21:36.446993945Z" level=error msg="Failed to destroy network for sandbox \"4951dd26094326c1ac4cfe28eb0a23ccaa7f1ea4b79cd67f418a40001dece61e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 06:21:36.447322 containerd[1585]: time="2025-09-05T06:21:36.447247581Z" level=error msg="Failed to destroy network for sandbox \"475194349186bc93edaf4b1402aa8377e9b71e6a85dacc1eb794a2eca5875b7f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and 
has mounted /var/lib/calico/" Sep 5 06:21:36.449475 containerd[1585]: time="2025-09-05T06:21:36.449439106Z" level=error msg="Failed to destroy network for sandbox \"9647e3c7f7ebf075033e3f8b45cf10d6516902d4ffc1c7003be8b03fdcfd3f4c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 06:21:36.457578 containerd[1585]: time="2025-09-05T06:21:36.457541000Z" level=error msg="Failed to destroy network for sandbox \"1cfe055c1a2e1591b916ffbc5c0de3e01db28cc9488429a9aa13794da54aab11\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 06:21:36.460515 containerd[1585]: time="2025-09-05T06:21:36.460466432Z" level=error msg="Failed to destroy network for sandbox \"1bb473b0308c7ba8b6cea259c2e25c9d437fd19a3a6bd95c95684e2dc36d609f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 06:21:36.483834 containerd[1585]: time="2025-09-05T06:21:36.483778679Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-cf68b5bff-qcvfw,Uid:e538d525-2f5b-4679-bb7b-2d7b2bfcc773,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4951dd26094326c1ac4cfe28eb0a23ccaa7f1ea4b79cd67f418a40001dece61e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 06:21:36.484092 kubelet[2782]: E0905 06:21:36.484039 2782 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4951dd26094326c1ac4cfe28eb0a23ccaa7f1ea4b79cd67f418a40001dece61e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 06:21:36.484141 kubelet[2782]: E0905 06:21:36.484124 2782 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4951dd26094326c1ac4cfe28eb0a23ccaa7f1ea4b79cd67f418a40001dece61e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-cf68b5bff-qcvfw" Sep 5 06:21:36.484180 kubelet[2782]: E0905 06:21:36.484147 2782 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4951dd26094326c1ac4cfe28eb0a23ccaa7f1ea4b79cd67f418a40001dece61e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-cf68b5bff-qcvfw" Sep 5 06:21:36.484266 kubelet[2782]: E0905 06:21:36.484205 2782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-cf68b5bff-qcvfw_calico-apiserver(e538d525-2f5b-4679-bb7b-2d7b2bfcc773)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-apiserver-cf68b5bff-qcvfw_calico-apiserver(e538d525-2f5b-4679-bb7b-2d7b2bfcc773)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4951dd26094326c1ac4cfe28eb0a23ccaa7f1ea4b79cd67f418a40001dece61e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-cf68b5bff-qcvfw" podUID="e538d525-2f5b-4679-bb7b-2d7b2bfcc773" Sep 5 06:21:36.485128 containerd[1585]: time="2025-09-05T06:21:36.485087614Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-kg56r,Uid:164bbda0-024b-4642-aebe-7f8294f38db0,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"475194349186bc93edaf4b1402aa8377e9b71e6a85dacc1eb794a2eca5875b7f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 06:21:36.485477 kubelet[2782]: E0905 06:21:36.485279 2782 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"475194349186bc93edaf4b1402aa8377e9b71e6a85dacc1eb794a2eca5875b7f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 06:21:36.485477 kubelet[2782]: E0905 06:21:36.485339 2782 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"475194349186bc93edaf4b1402aa8377e9b71e6a85dacc1eb794a2eca5875b7f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-kg56r" Sep 5 06:21:36.485477 kubelet[2782]: E0905 06:21:36.485377 2782 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"475194349186bc93edaf4b1402aa8377e9b71e6a85dacc1eb794a2eca5875b7f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-kg56r" Sep 5 06:21:36.485649 kubelet[2782]: E0905 06:21:36.485468 2782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-kg56r_calico-system(164bbda0-024b-4642-aebe-7f8294f38db0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-kg56r_calico-system(164bbda0-024b-4642-aebe-7f8294f38db0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"475194349186bc93edaf4b1402aa8377e9b71e6a85dacc1eb794a2eca5875b7f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-kg56r" podUID="164bbda0-024b-4642-aebe-7f8294f38db0" Sep 5 06:21:36.486137 containerd[1585]: time="2025-09-05T06:21:36.486015932Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-674b8bbfcf-xhgtj,Uid:cbe88b9e-342d-4d7f-93cf-136f2f4d94ec,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9647e3c7f7ebf075033e3f8b45cf10d6516902d4ffc1c7003be8b03fdcfd3f4c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 06:21:36.486260 kubelet[2782]: E0905 06:21:36.486236 2782 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9647e3c7f7ebf075033e3f8b45cf10d6516902d4ffc1c7003be8b03fdcfd3f4c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 06:21:36.486305 kubelet[2782]: E0905 06:21:36.486270 2782 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9647e3c7f7ebf075033e3f8b45cf10d6516902d4ffc1c7003be8b03fdcfd3f4c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-xhgtj" Sep 5 06:21:36.486305 kubelet[2782]: E0905 06:21:36.486286 2782 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9647e3c7f7ebf075033e3f8b45cf10d6516902d4ffc1c7003be8b03fdcfd3f4c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-xhgtj" Sep 5 06:21:36.486355 kubelet[2782]: E0905 06:21:36.486325 2782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-xhgtj_kube-system(cbe88b9e-342d-4d7f-93cf-136f2f4d94ec)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-xhgtj_kube-system(cbe88b9e-342d-4d7f-93cf-136f2f4d94ec)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9647e3c7f7ebf075033e3f8b45cf10d6516902d4ffc1c7003be8b03fdcfd3f4c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-xhgtj" podUID="cbe88b9e-342d-4d7f-93cf-136f2f4d94ec" Sep 5 06:21:36.487105 containerd[1585]: time="2025-09-05T06:21:36.487072522Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-cf68b5bff-zqmnv,Uid:26a5a23e-432c-4622-993d-2e73ea07fb80,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1cfe055c1a2e1591b916ffbc5c0de3e01db28cc9488429a9aa13794da54aab11\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 06:21:36.487215 kubelet[2782]: E0905 06:21:36.487191 2782 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1cfe055c1a2e1591b916ffbc5c0de3e01db28cc9488429a9aa13794da54aab11\": plugin type=\"calico\" 
failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 06:21:36.487257 kubelet[2782]: E0905 06:21:36.487221 2782 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1cfe055c1a2e1591b916ffbc5c0de3e01db28cc9488429a9aa13794da54aab11\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-cf68b5bff-zqmnv" Sep 5 06:21:36.487257 kubelet[2782]: E0905 06:21:36.487234 2782 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1cfe055c1a2e1591b916ffbc5c0de3e01db28cc9488429a9aa13794da54aab11\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-cf68b5bff-zqmnv" Sep 5 06:21:36.487307 kubelet[2782]: E0905 06:21:36.487267 2782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-cf68b5bff-zqmnv_calico-apiserver(26a5a23e-432c-4622-993d-2e73ea07fb80)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-cf68b5bff-zqmnv_calico-apiserver(26a5a23e-432c-4622-993d-2e73ea07fb80)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1cfe055c1a2e1591b916ffbc5c0de3e01db28cc9488429a9aa13794da54aab11\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-cf68b5bff-zqmnv" podUID="26a5a23e-432c-4622-993d-2e73ea07fb80" Sep 5 06:21:36.488501 containerd[1585]: time="2025-09-05T06:21:36.488457426Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7b8d7b8bf-wqzvf,Uid:7d8d19a7-2cb7-4172-9d5e-ed2788eb3120,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1bb473b0308c7ba8b6cea259c2e25c9d437fd19a3a6bd95c95684e2dc36d609f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 06:21:36.488653 kubelet[2782]: E0905 06:21:36.488619 2782 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1bb473b0308c7ba8b6cea259c2e25c9d437fd19a3a6bd95c95684e2dc36d609f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 06:21:36.488687 kubelet[2782]: E0905 06:21:36.488655 2782 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1bb473b0308c7ba8b6cea259c2e25c9d437fd19a3a6bd95c95684e2dc36d609f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7b8d7b8bf-wqzvf" Sep 5 
06:21:36.488687 kubelet[2782]: E0905 06:21:36.488672 2782 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1bb473b0308c7ba8b6cea259c2e25c9d437fd19a3a6bd95c95684e2dc36d609f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7b8d7b8bf-wqzvf" Sep 5 06:21:36.488737 kubelet[2782]: E0905 06:21:36.488708 2782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7b8d7b8bf-wqzvf_calico-system(7d8d19a7-2cb7-4172-9d5e-ed2788eb3120)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7b8d7b8bf-wqzvf_calico-system(7d8d19a7-2cb7-4172-9d5e-ed2788eb3120)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1bb473b0308c7ba8b6cea259c2e25c9d437fd19a3a6bd95c95684e2dc36d609f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7b8d7b8bf-wqzvf" podUID="7d8d19a7-2cb7-4172-9d5e-ed2788eb3120" Sep 5 06:21:36.729554 systemd[1]: Created slice kubepods-besteffort-pod184974df_56dd_43e9_9fda_6672bf4bc449.slice - libcontainer container kubepods-besteffort-pod184974df_56dd_43e9_9fda_6672bf4bc449.slice. Sep 5 06:21:36.731606 containerd[1585]: time="2025-09-05T06:21:36.731562450Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fmvm2,Uid:184974df-56dd-43e9-9fda-6672bf4bc449,Namespace:calico-system,Attempt:0,}" Sep 5 06:21:36.782389 containerd[1585]: time="2025-09-05T06:21:36.782334630Z" level=error msg="Failed to destroy network for sandbox \"6aed47dc7db383eac89dc7a6ac52241dd82a6f55b47d30e4ae6434cc6f0df8d8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 06:21:36.783828 containerd[1585]: time="2025-09-05T06:21:36.783767618Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fmvm2,Uid:184974df-56dd-43e9-9fda-6672bf4bc449,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6aed47dc7db383eac89dc7a6ac52241dd82a6f55b47d30e4ae6434cc6f0df8d8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 06:21:36.784105 kubelet[2782]: E0905 06:21:36.784051 2782 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6aed47dc7db383eac89dc7a6ac52241dd82a6f55b47d30e4ae6434cc6f0df8d8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 06:21:36.784179 kubelet[2782]: E0905 06:21:36.784124 2782 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6aed47dc7db383eac89dc7a6ac52241dd82a6f55b47d30e4ae6434cc6f0df8d8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fmvm2" Sep 5 06:21:36.784179 kubelet[2782]: E0905 06:21:36.784148 2782 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6aed47dc7db383eac89dc7a6ac52241dd82a6f55b47d30e4ae6434cc6f0df8d8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fmvm2" Sep 5 06:21:36.784244 kubelet[2782]: E0905 06:21:36.784216 2782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-fmvm2_calico-system(184974df-56dd-43e9-9fda-6672bf4bc449)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-fmvm2_calico-system(184974df-56dd-43e9-9fda-6672bf4bc449)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6aed47dc7db383eac89dc7a6ac52241dd82a6f55b47d30e4ae6434cc6f0df8d8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-fmvm2" podUID="184974df-56dd-43e9-9fda-6672bf4bc449" Sep 5 06:21:36.824981 containerd[1585]: time="2025-09-05T06:21:36.824940008Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 5 06:21:37.343035 systemd[1]: run-netns-cni\x2d2f8f1e51\x2de5c1\x2db015\x2da572\x2d1e4d4ba477d0.mount: Deactivated successfully. Sep 5 06:21:37.343148 systemd[1]: run-netns-cni\x2ded99b167\x2d27ca\x2d1c4a\x2d20fb\x2d7b0591f59df1.mount: Deactivated successfully. Sep 5 06:21:37.343217 systemd[1]: run-netns-cni\x2d25346c2c\x2d2a43\x2dce8f\x2d2e37\x2dc85324be0fb0.mount: Deactivated successfully. Sep 5 06:21:37.343285 systemd[1]: run-netns-cni\x2dd74f85b4\x2d4c89\x2d631b\x2d3ce1\x2d125398151fe5.mount: Deactivated successfully. Sep 5 06:21:37.343363 systemd[1]: run-netns-cni\x2d68e1ad92\x2df582\x2d9ad2\x2d54c6\x2dbc981f01491f.mount: Deactivated successfully. Sep 5 06:21:41.528019 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount495595078.mount: Deactivated successfully. 
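Every sandbox failure in this stretch has the same root cause spelled out in the error text: the Calico CNI plugin reads the node name from /var/lib/calico/nodename, a file that the calico-node container writes once it is running, and calico-node has not started yet (its image is only being pulled at the end of this block). A minimal sketch of that lookup, with the path taken from the error message and everything else (names, fallback behaviour) assumed for illustration:

    // nodename_sketch.go - illustrative reproduction of the check behind
    // "stat /var/lib/calico/nodename: no such file or directory".
    package main

    import (
        "fmt"
        "os"
        "strings"
    )

    func nodenameFromFile(path string) (string, error) {
        data, err := os.ReadFile(path)
        if err != nil {
            // Until calico-node runs and writes this file, every CNI ADD/DEL fails here.
            return "", fmt.Errorf("%w: check that the calico/node container is running and has mounted /var/lib/calico/", err)
        }
        return strings.TrimSpace(string(data)), nil
    }

    func main() {
        name, err := nodenameFromFile("/var/lib/calico/nodename")
        if err != nil {
            fmt.Fprintln(os.Stderr, err)
            os.Exit(1)
        }
        fmt.Println("node name:", name)
    }

Once the calico-node container started in the following entries is up and the file exists, the pending sandboxes (coredns, whisker, calico-apiserver, goldmane, csi-node-driver) can be retried, as the later successful whisker sandbox setup shows.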
Sep 5 06:21:43.197192 containerd[1585]: time="2025-09-05T06:21:43.196934890Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 5 06:21:43.197192 containerd[1585]: time="2025-09-05T06:21:43.196995147Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:21:43.198348 containerd[1585]: time="2025-09-05T06:21:43.198309214Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:21:43.201827 containerd[1585]: time="2025-09-05T06:21:43.201482642Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:21:43.201992 containerd[1585]: time="2025-09-05T06:21:43.201960637Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 6.37698637s" Sep 5 06:21:43.202032 containerd[1585]: time="2025-09-05T06:21:43.201993801Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 5 06:21:43.225963 containerd[1585]: time="2025-09-05T06:21:43.225923134Z" level=info msg="CreateContainer within sandbox \"5ea8d65aa53a248caa487ba17e71657efcbbea8951ff8b41a207a8157cb62539\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 5 06:21:43.240024 containerd[1585]: time="2025-09-05T06:21:43.238925161Z" level=info msg="Container 75519761823d64aa67cd2fdda3d05cc513baf8dab048b99acb05471a8e3bf4d3: CDI devices from CRI Config.CDIDevices: []" Sep 5 06:21:43.240984 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1214453804.mount: Deactivated successfully. Sep 5 06:21:43.250123 containerd[1585]: time="2025-09-05T06:21:43.250081534Z" level=info msg="CreateContainer within sandbox \"5ea8d65aa53a248caa487ba17e71657efcbbea8951ff8b41a207a8157cb62539\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"75519761823d64aa67cd2fdda3d05cc513baf8dab048b99acb05471a8e3bf4d3\"" Sep 5 06:21:43.250562 containerd[1585]: time="2025-09-05T06:21:43.250529980Z" level=info msg="StartContainer for \"75519761823d64aa67cd2fdda3d05cc513baf8dab048b99acb05471a8e3bf4d3\"" Sep 5 06:21:43.252773 containerd[1585]: time="2025-09-05T06:21:43.252620673Z" level=info msg="connecting to shim 75519761823d64aa67cd2fdda3d05cc513baf8dab048b99acb05471a8e3bf4d3" address="unix:///run/containerd/s/7d6166a1c6bb772089047bfa35911cb17a835a5eb362c79ccb4bf4e3c7aa93d9" protocol=ttrpc version=3 Sep 5 06:21:43.281975 systemd[1]: Started cri-containerd-75519761823d64aa67cd2fdda3d05cc513baf8dab048b99acb05471a8e3bf4d3.scope - libcontainer container 75519761823d64aa67cd2fdda3d05cc513baf8dab048b99acb05471a8e3bf4d3. Sep 5 06:21:43.364213 containerd[1585]: time="2025-09-05T06:21:43.364148156Z" level=info msg="StartContainer for \"75519761823d64aa67cd2fdda3d05cc513baf8dab048b99acb05471a8e3bf4d3\" returns successfully" Sep 5 06:21:43.473056 kernel: wireguard: WireGuard 1.0.0 loaded. 
See www.wireguard.com for information. Sep 5 06:21:43.473720 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Sep 5 06:21:43.483599 systemd[1]: Started sshd@7-10.0.0.140:22-10.0.0.1:60160.service - OpenSSH per-connection server daemon (10.0.0.1:60160). Sep 5 06:21:43.553362 sshd[3932]: Accepted publickey for core from 10.0.0.1 port 60160 ssh2: RSA SHA256:HqMeIfrf4KQXAwx0HmR8sqFIxN2NpL8j8iF6H1odrA8 Sep 5 06:21:43.554635 sshd-session[3932]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 06:21:43.563115 systemd-logind[1570]: New session 8 of user core. Sep 5 06:21:43.572042 systemd[1]: Started session-8.scope - Session 8 of User core. Sep 5 06:21:43.679906 kubelet[2782]: I0905 06:21:43.679794 2782 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/90d4fc6b-e10d-48c1-a100-581b1e45bea7-whisker-backend-key-pair\") pod \"90d4fc6b-e10d-48c1-a100-581b1e45bea7\" (UID: \"90d4fc6b-e10d-48c1-a100-581b1e45bea7\") " Sep 5 06:21:43.679906 kubelet[2782]: I0905 06:21:43.679849 2782 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pfxh\" (UniqueName: \"kubernetes.io/projected/90d4fc6b-e10d-48c1-a100-581b1e45bea7-kube-api-access-5pfxh\") pod \"90d4fc6b-e10d-48c1-a100-581b1e45bea7\" (UID: \"90d4fc6b-e10d-48c1-a100-581b1e45bea7\") " Sep 5 06:21:43.679906 kubelet[2782]: I0905 06:21:43.679884 2782 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90d4fc6b-e10d-48c1-a100-581b1e45bea7-whisker-ca-bundle\") pod \"90d4fc6b-e10d-48c1-a100-581b1e45bea7\" (UID: \"90d4fc6b-e10d-48c1-a100-581b1e45bea7\") " Sep 5 06:21:43.683064 kubelet[2782]: I0905 06:21:43.682934 2782 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90d4fc6b-e10d-48c1-a100-581b1e45bea7-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "90d4fc6b-e10d-48c1-a100-581b1e45bea7" (UID: "90d4fc6b-e10d-48c1-a100-581b1e45bea7"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Sep 5 06:21:43.687010 kubelet[2782]: I0905 06:21:43.686949 2782 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90d4fc6b-e10d-48c1-a100-581b1e45bea7-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "90d4fc6b-e10d-48c1-a100-581b1e45bea7" (UID: "90d4fc6b-e10d-48c1-a100-581b1e45bea7"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 5 06:21:43.687187 kubelet[2782]: I0905 06:21:43.687104 2782 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90d4fc6b-e10d-48c1-a100-581b1e45bea7-kube-api-access-5pfxh" (OuterVolumeSpecName: "kube-api-access-5pfxh") pod "90d4fc6b-e10d-48c1-a100-581b1e45bea7" (UID: "90d4fc6b-e10d-48c1-a100-581b1e45bea7"). InnerVolumeSpecName "kube-api-access-5pfxh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 5 06:21:43.766743 sshd[3943]: Connection closed by 10.0.0.1 port 60160 Sep 5 06:21:43.768989 sshd-session[3932]: pam_unix(sshd:session): session closed for user core Sep 5 06:21:43.773457 systemd-logind[1570]: Session 8 logged out. Waiting for processes to exit. Sep 5 06:21:43.773977 systemd[1]: sshd@7-10.0.0.140:22-10.0.0.1:60160.service: Deactivated successfully. 
Sep 5 06:21:43.776324 systemd[1]: session-8.scope: Deactivated successfully. Sep 5 06:21:43.779453 systemd-logind[1570]: Removed session 8. Sep 5 06:21:43.780931 kubelet[2782]: I0905 06:21:43.780891 2782 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90d4fc6b-e10d-48c1-a100-581b1e45bea7-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Sep 5 06:21:43.780931 kubelet[2782]: I0905 06:21:43.780921 2782 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/90d4fc6b-e10d-48c1-a100-581b1e45bea7-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Sep 5 06:21:43.780931 kubelet[2782]: I0905 06:21:43.780931 2782 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5pfxh\" (UniqueName: \"kubernetes.io/projected/90d4fc6b-e10d-48c1-a100-581b1e45bea7-kube-api-access-5pfxh\") on node \"localhost\" DevicePath \"\"" Sep 5 06:21:43.874892 systemd[1]: Removed slice kubepods-besteffort-pod90d4fc6b_e10d_48c1_a100_581b1e45bea7.slice - libcontainer container kubepods-besteffort-pod90d4fc6b_e10d_48c1_a100_581b1e45bea7.slice. Sep 5 06:21:43.956318 kubelet[2782]: I0905 06:21:43.956239 2782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-wqdqv" podStartSLOduration=1.414245326 podStartE2EDuration="22.955836281s" podCreationTimestamp="2025-09-05 06:21:21 +0000 UTC" firstStartedPulling="2025-09-05 06:21:21.662057148 +0000 UTC m=+23.069299026" lastFinishedPulling="2025-09-05 06:21:43.203648103 +0000 UTC m=+44.610889981" observedRunningTime="2025-09-05 06:21:43.945050573 +0000 UTC m=+45.352292451" watchObservedRunningTime="2025-09-05 06:21:43.955836281 +0000 UTC m=+45.363078159" Sep 5 06:21:44.005625 systemd[1]: Created slice kubepods-besteffort-podb55f3c88_0f18_4efd_a4dd_d2eeded0c332.slice - libcontainer container kubepods-besteffort-podb55f3c88_0f18_4efd_a4dd_d2eeded0c332.slice. 
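The pod startup numbers reported for calico-node-wqdqv are internally consistent: podStartE2EDuration spans podCreationTimestamp (06:21:21) to watchObservedRunningTime (06:21:43.955836281), and podStartSLOduration is that span minus the image-pull window (firstStartedPulling to lastFinishedPulling). A small check of the arithmetic, using only timestamps copied from the entry above (the file name and the interpretation of which observed-running timestamp is used are assumptions that happen to match the numbers exactly):

    // slo_math_sketch.go - reproduces the duration bookkeeping in the
    // pod_startup_latency_tracker entry above.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        created, _ := time.Parse(time.RFC3339, "2025-09-05T06:21:21Z")
        firstPull, _ := time.Parse(time.RFC3339Nano, "2025-09-05T06:21:21.662057148Z")
        lastPull, _ := time.Parse(time.RFC3339Nano, "2025-09-05T06:21:43.203648103Z")
        watched, _ := time.Parse(time.RFC3339Nano, "2025-09-05T06:21:43.955836281Z")

        e2e := watched.Sub(created)     // 22.955836281s, matching podStartE2EDuration
        pull := lastPull.Sub(firstPull) // 21.541590955s spent pulling the calico/node image
        fmt.Println("E2E:", e2e, "pull:", pull, "SLO:", e2e-pull) // SLO: 1.414245326s
    }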
Sep 5 06:21:44.065687 containerd[1585]: time="2025-09-05T06:21:44.065539528Z" level=info msg="TaskExit event in podsandbox handler container_id:\"75519761823d64aa67cd2fdda3d05cc513baf8dab048b99acb05471a8e3bf4d3\" id:\"e6c9bd7015828b9e1161bb963bbbfb01a30729e19db0bb3f876be5dfbd952d32\" pid:3985 exit_status:1 exited_at:{seconds:1757053304 nanos:65134958}" Sep 5 06:21:44.083069 kubelet[2782]: I0905 06:21:44.082969 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/b55f3c88-0f18-4efd-a4dd-d2eeded0c332-whisker-backend-key-pair\") pod \"whisker-69b9887498-zbvfm\" (UID: \"b55f3c88-0f18-4efd-a4dd-d2eeded0c332\") " pod="calico-system/whisker-69b9887498-zbvfm" Sep 5 06:21:44.083069 kubelet[2782]: I0905 06:21:44.083007 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b55f3c88-0f18-4efd-a4dd-d2eeded0c332-whisker-ca-bundle\") pod \"whisker-69b9887498-zbvfm\" (UID: \"b55f3c88-0f18-4efd-a4dd-d2eeded0c332\") " pod="calico-system/whisker-69b9887498-zbvfm" Sep 5 06:21:44.083069 kubelet[2782]: I0905 06:21:44.083071 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkgc7\" (UniqueName: \"kubernetes.io/projected/b55f3c88-0f18-4efd-a4dd-d2eeded0c332-kube-api-access-zkgc7\") pod \"whisker-69b9887498-zbvfm\" (UID: \"b55f3c88-0f18-4efd-a4dd-d2eeded0c332\") " pod="calico-system/whisker-69b9887498-zbvfm" Sep 5 06:21:44.211650 systemd[1]: var-lib-kubelet-pods-90d4fc6b\x2de10d\x2d48c1\x2da100\x2d581b1e45bea7-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d5pfxh.mount: Deactivated successfully. Sep 5 06:21:44.211779 systemd[1]: var-lib-kubelet-pods-90d4fc6b\x2de10d\x2d48c1\x2da100\x2d581b1e45bea7-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Sep 5 06:21:44.312705 containerd[1585]: time="2025-09-05T06:21:44.312380523Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-69b9887498-zbvfm,Uid:b55f3c88-0f18-4efd-a4dd-d2eeded0c332,Namespace:calico-system,Attempt:0,}" Sep 5 06:21:44.485798 systemd-networkd[1494]: cali00c4a4c43cb: Link UP Sep 5 06:21:44.486262 systemd-networkd[1494]: cali00c4a4c43cb: Gained carrier Sep 5 06:21:44.499574 containerd[1585]: 2025-09-05 06:21:44.337 [INFO][4000] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 5 06:21:44.499574 containerd[1585]: 2025-09-05 06:21:44.357 [INFO][4000] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--69b9887498--zbvfm-eth0 whisker-69b9887498- calico-system b55f3c88-0f18-4efd-a4dd-d2eeded0c332 954 0 2025-09-05 06:21:43 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:69b9887498 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-69b9887498-zbvfm eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali00c4a4c43cb [] [] }} ContainerID="6cdaf92853b540c3735ef038ac95ad6c9293cdf8ed8d5d47dc78cd7a8b9a95da" Namespace="calico-system" Pod="whisker-69b9887498-zbvfm" WorkloadEndpoint="localhost-k8s-whisker--69b9887498--zbvfm-" Sep 5 06:21:44.499574 containerd[1585]: 2025-09-05 06:21:44.357 [INFO][4000] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6cdaf92853b540c3735ef038ac95ad6c9293cdf8ed8d5d47dc78cd7a8b9a95da" Namespace="calico-system" Pod="whisker-69b9887498-zbvfm" WorkloadEndpoint="localhost-k8s-whisker--69b9887498--zbvfm-eth0" Sep 5 06:21:44.499574 containerd[1585]: 2025-09-05 06:21:44.427 [INFO][4017] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6cdaf92853b540c3735ef038ac95ad6c9293cdf8ed8d5d47dc78cd7a8b9a95da" HandleID="k8s-pod-network.6cdaf92853b540c3735ef038ac95ad6c9293cdf8ed8d5d47dc78cd7a8b9a95da" Workload="localhost-k8s-whisker--69b9887498--zbvfm-eth0" Sep 5 06:21:44.499863 containerd[1585]: 2025-09-05 06:21:44.428 [INFO][4017] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="6cdaf92853b540c3735ef038ac95ad6c9293cdf8ed8d5d47dc78cd7a8b9a95da" HandleID="k8s-pod-network.6cdaf92853b540c3735ef038ac95ad6c9293cdf8ed8d5d47dc78cd7a8b9a95da" Workload="localhost-k8s-whisker--69b9887498--zbvfm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003ce910), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-69b9887498-zbvfm", "timestamp":"2025-09-05 06:21:44.427366286 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 5 06:21:44.499863 containerd[1585]: 2025-09-05 06:21:44.428 [INFO][4017] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 06:21:44.499863 containerd[1585]: 2025-09-05 06:21:44.428 [INFO][4017] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 5 06:21:44.499863 containerd[1585]: 2025-09-05 06:21:44.428 [INFO][4017] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 5 06:21:44.499863 containerd[1585]: 2025-09-05 06:21:44.441 [INFO][4017] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6cdaf92853b540c3735ef038ac95ad6c9293cdf8ed8d5d47dc78cd7a8b9a95da" host="localhost" Sep 5 06:21:44.499863 containerd[1585]: 2025-09-05 06:21:44.450 [INFO][4017] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 5 06:21:44.499863 containerd[1585]: 2025-09-05 06:21:44.455 [INFO][4017] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 5 06:21:44.499863 containerd[1585]: 2025-09-05 06:21:44.457 [INFO][4017] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 5 06:21:44.499863 containerd[1585]: 2025-09-05 06:21:44.459 [INFO][4017] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 5 06:21:44.499863 containerd[1585]: 2025-09-05 06:21:44.459 [INFO][4017] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.6cdaf92853b540c3735ef038ac95ad6c9293cdf8ed8d5d47dc78cd7a8b9a95da" host="localhost" Sep 5 06:21:44.500096 containerd[1585]: 2025-09-05 06:21:44.464 [INFO][4017] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.6cdaf92853b540c3735ef038ac95ad6c9293cdf8ed8d5d47dc78cd7a8b9a95da Sep 5 06:21:44.500096 containerd[1585]: 2025-09-05 06:21:44.468 [INFO][4017] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.6cdaf92853b540c3735ef038ac95ad6c9293cdf8ed8d5d47dc78cd7a8b9a95da" host="localhost" Sep 5 06:21:44.500096 containerd[1585]: 2025-09-05 06:21:44.474 [INFO][4017] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.6cdaf92853b540c3735ef038ac95ad6c9293cdf8ed8d5d47dc78cd7a8b9a95da" host="localhost" Sep 5 06:21:44.500096 containerd[1585]: 2025-09-05 06:21:44.474 [INFO][4017] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.6cdaf92853b540c3735ef038ac95ad6c9293cdf8ed8d5d47dc78cd7a8b9a95da" host="localhost" Sep 5 06:21:44.500096 containerd[1585]: 2025-09-05 06:21:44.474 [INFO][4017] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 5 06:21:44.500096 containerd[1585]: 2025-09-05 06:21:44.474 [INFO][4017] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="6cdaf92853b540c3735ef038ac95ad6c9293cdf8ed8d5d47dc78cd7a8b9a95da" HandleID="k8s-pod-network.6cdaf92853b540c3735ef038ac95ad6c9293cdf8ed8d5d47dc78cd7a8b9a95da" Workload="localhost-k8s-whisker--69b9887498--zbvfm-eth0" Sep 5 06:21:44.500228 containerd[1585]: 2025-09-05 06:21:44.478 [INFO][4000] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6cdaf92853b540c3735ef038ac95ad6c9293cdf8ed8d5d47dc78cd7a8b9a95da" Namespace="calico-system" Pod="whisker-69b9887498-zbvfm" WorkloadEndpoint="localhost-k8s-whisker--69b9887498--zbvfm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--69b9887498--zbvfm-eth0", GenerateName:"whisker-69b9887498-", Namespace:"calico-system", SelfLink:"", UID:"b55f3c88-0f18-4efd-a4dd-d2eeded0c332", ResourceVersion:"954", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 6, 21, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"69b9887498", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-69b9887498-zbvfm", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali00c4a4c43cb", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 06:21:44.500228 containerd[1585]: 2025-09-05 06:21:44.478 [INFO][4000] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="6cdaf92853b540c3735ef038ac95ad6c9293cdf8ed8d5d47dc78cd7a8b9a95da" Namespace="calico-system" Pod="whisker-69b9887498-zbvfm" WorkloadEndpoint="localhost-k8s-whisker--69b9887498--zbvfm-eth0" Sep 5 06:21:44.500305 containerd[1585]: 2025-09-05 06:21:44.478 [INFO][4000] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali00c4a4c43cb ContainerID="6cdaf92853b540c3735ef038ac95ad6c9293cdf8ed8d5d47dc78cd7a8b9a95da" Namespace="calico-system" Pod="whisker-69b9887498-zbvfm" WorkloadEndpoint="localhost-k8s-whisker--69b9887498--zbvfm-eth0" Sep 5 06:21:44.500305 containerd[1585]: 2025-09-05 06:21:44.486 [INFO][4000] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6cdaf92853b540c3735ef038ac95ad6c9293cdf8ed8d5d47dc78cd7a8b9a95da" Namespace="calico-system" Pod="whisker-69b9887498-zbvfm" WorkloadEndpoint="localhost-k8s-whisker--69b9887498--zbvfm-eth0" Sep 5 06:21:44.500350 containerd[1585]: 2025-09-05 06:21:44.487 [INFO][4000] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6cdaf92853b540c3735ef038ac95ad6c9293cdf8ed8d5d47dc78cd7a8b9a95da" Namespace="calico-system" Pod="whisker-69b9887498-zbvfm" WorkloadEndpoint="localhost-k8s-whisker--69b9887498--zbvfm-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--69b9887498--zbvfm-eth0", GenerateName:"whisker-69b9887498-", Namespace:"calico-system", SelfLink:"", UID:"b55f3c88-0f18-4efd-a4dd-d2eeded0c332", ResourceVersion:"954", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 6, 21, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"69b9887498", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"6cdaf92853b540c3735ef038ac95ad6c9293cdf8ed8d5d47dc78cd7a8b9a95da", Pod:"whisker-69b9887498-zbvfm", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali00c4a4c43cb", MAC:"4e:99:a9:67:37:f2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 06:21:44.500396 containerd[1585]: 2025-09-05 06:21:44.496 [INFO][4000] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6cdaf92853b540c3735ef038ac95ad6c9293cdf8ed8d5d47dc78cd7a8b9a95da" Namespace="calico-system" Pod="whisker-69b9887498-zbvfm" WorkloadEndpoint="localhost-k8s-whisker--69b9887498--zbvfm-eth0" Sep 5 06:21:44.726079 kubelet[2782]: I0905 06:21:44.726019 2782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90d4fc6b-e10d-48c1-a100-581b1e45bea7" path="/var/lib/kubelet/pods/90d4fc6b-e10d-48c1-a100-581b1e45bea7/volumes" Sep 5 06:21:44.945589 containerd[1585]: time="2025-09-05T06:21:44.945542327Z" level=info msg="TaskExit event in podsandbox handler container_id:\"75519761823d64aa67cd2fdda3d05cc513baf8dab048b99acb05471a8e3bf4d3\" id:\"5d06a7a1c7abf53cf83c20e8a929d1d334c68ef31cceed2c1a97620f6b37b7cb\" pid:4054 exit_status:1 exited_at:{seconds:1757053304 nanos:945217183}" Sep 5 06:21:45.106241 containerd[1585]: time="2025-09-05T06:21:45.105021468Z" level=info msg="connecting to shim 6cdaf92853b540c3735ef038ac95ad6c9293cdf8ed8d5d47dc78cd7a8b9a95da" address="unix:///run/containerd/s/7decb934e993e1665d051ae4648aebe77ef55fd933e2dff2ad999807373b71b3" namespace=k8s.io protocol=ttrpc version=3 Sep 5 06:21:45.133228 systemd[1]: Started cri-containerd-6cdaf92853b540c3735ef038ac95ad6c9293cdf8ed8d5d47dc78cd7a8b9a95da.scope - libcontainer container 6cdaf92853b540c3735ef038ac95ad6c9293cdf8ed8d5d47dc78cd7a8b9a95da. 
Sep 5 06:21:45.159105 systemd-resolved[1408]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 5 06:21:45.224230 containerd[1585]: time="2025-09-05T06:21:45.224107293Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-69b9887498-zbvfm,Uid:b55f3c88-0f18-4efd-a4dd-d2eeded0c332,Namespace:calico-system,Attempt:0,} returns sandbox id \"6cdaf92853b540c3735ef038ac95ad6c9293cdf8ed8d5d47dc78cd7a8b9a95da\"" Sep 5 06:21:45.225893 containerd[1585]: time="2025-09-05T06:21:45.225755550Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 5 06:21:45.783582 kubelet[2782]: I0905 06:21:45.783522 2782 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 5 06:21:45.948017 systemd-networkd[1494]: cali00c4a4c43cb: Gained IPv6LL Sep 5 06:21:46.657090 systemd-networkd[1494]: vxlan.calico: Link UP Sep 5 06:21:46.657102 systemd-networkd[1494]: vxlan.calico: Gained carrier Sep 5 06:21:47.069209 containerd[1585]: time="2025-09-05T06:21:47.069087681Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 5 06:21:47.072864 containerd[1585]: time="2025-09-05T06:21:47.072805625Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 1.847010788s" Sep 5 06:21:47.072864 containerd[1585]: time="2025-09-05T06:21:47.072859364Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 5 06:21:47.075875 containerd[1585]: time="2025-09-05T06:21:47.075831763Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:21:47.076707 containerd[1585]: time="2025-09-05T06:21:47.076523999Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:21:47.077900 containerd[1585]: time="2025-09-05T06:21:47.077823354Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:21:47.125774 containerd[1585]: time="2025-09-05T06:21:47.125737889Z" level=info msg="CreateContainer within sandbox \"6cdaf92853b540c3735ef038ac95ad6c9293cdf8ed8d5d47dc78cd7a8b9a95da\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 5 06:21:47.135315 containerd[1585]: time="2025-09-05T06:21:47.135278356Z" level=info msg="Container 8ea97f9408090ddced2c59709c7de1138503a988dbc45958e5d367de14e18405: CDI devices from CRI Config.CDIDevices: []" Sep 5 06:21:47.142881 containerd[1585]: time="2025-09-05T06:21:47.142847198Z" level=info msg="CreateContainer within sandbox \"6cdaf92853b540c3735ef038ac95ad6c9293cdf8ed8d5d47dc78cd7a8b9a95da\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"8ea97f9408090ddced2c59709c7de1138503a988dbc45958e5d367de14e18405\"" Sep 5 06:21:47.143572 containerd[1585]: time="2025-09-05T06:21:47.143387414Z" level=info msg="StartContainer for 
\"8ea97f9408090ddced2c59709c7de1138503a988dbc45958e5d367de14e18405\"" Sep 5 06:21:47.144428 containerd[1585]: time="2025-09-05T06:21:47.144395041Z" level=info msg="connecting to shim 8ea97f9408090ddced2c59709c7de1138503a988dbc45958e5d367de14e18405" address="unix:///run/containerd/s/7decb934e993e1665d051ae4648aebe77ef55fd933e2dff2ad999807373b71b3" protocol=ttrpc version=3 Sep 5 06:21:47.178934 systemd[1]: Started cri-containerd-8ea97f9408090ddced2c59709c7de1138503a988dbc45958e5d367de14e18405.scope - libcontainer container 8ea97f9408090ddced2c59709c7de1138503a988dbc45958e5d367de14e18405. Sep 5 06:21:47.253863 containerd[1585]: time="2025-09-05T06:21:47.253803673Z" level=info msg="StartContainer for \"8ea97f9408090ddced2c59709c7de1138503a988dbc45958e5d367de14e18405\" returns successfully" Sep 5 06:21:47.255113 containerd[1585]: time="2025-09-05T06:21:47.255068515Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 5 06:21:47.724061 containerd[1585]: time="2025-09-05T06:21:47.724018329Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-kg56r,Uid:164bbda0-024b-4642-aebe-7f8294f38db0,Namespace:calico-system,Attempt:0,}" Sep 5 06:21:47.724245 containerd[1585]: time="2025-09-05T06:21:47.724019882Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fmvm2,Uid:184974df-56dd-43e9-9fda-6672bf4bc449,Namespace:calico-system,Attempt:0,}" Sep 5 06:21:47.724245 containerd[1585]: time="2025-09-05T06:21:47.724018279Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-cf68b5bff-qcvfw,Uid:e538d525-2f5b-4679-bb7b-2d7b2bfcc773,Namespace:calico-apiserver,Attempt:0,}" Sep 5 06:21:47.739994 systemd-networkd[1494]: vxlan.calico: Gained IPv6LL Sep 5 06:21:47.849426 systemd-networkd[1494]: cali7aead5335b9: Link UP Sep 5 06:21:47.849934 systemd-networkd[1494]: cali7aead5335b9: Gained carrier Sep 5 06:21:47.863070 containerd[1585]: 2025-09-05 06:21:47.768 [INFO][4378] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--54d579b49d--kg56r-eth0 goldmane-54d579b49d- calico-system 164bbda0-024b-4642-aebe-7f8294f38db0 862 0 2025-09-05 06:21:20 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-54d579b49d-kg56r eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali7aead5335b9 [] [] }} ContainerID="8dd1e5c8da23745cb382a65ccee19945d5ea27b07cee451c1cc8b861df355fa9" Namespace="calico-system" Pod="goldmane-54d579b49d-kg56r" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--kg56r-" Sep 5 06:21:47.863070 containerd[1585]: 2025-09-05 06:21:47.768 [INFO][4378] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8dd1e5c8da23745cb382a65ccee19945d5ea27b07cee451c1cc8b861df355fa9" Namespace="calico-system" Pod="goldmane-54d579b49d-kg56r" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--kg56r-eth0" Sep 5 06:21:47.863070 containerd[1585]: 2025-09-05 06:21:47.808 [INFO][4422] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8dd1e5c8da23745cb382a65ccee19945d5ea27b07cee451c1cc8b861df355fa9" HandleID="k8s-pod-network.8dd1e5c8da23745cb382a65ccee19945d5ea27b07cee451c1cc8b861df355fa9" Workload="localhost-k8s-goldmane--54d579b49d--kg56r-eth0" Sep 5 06:21:47.863330 containerd[1585]: 
2025-09-05 06:21:47.808 [INFO][4422] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8dd1e5c8da23745cb382a65ccee19945d5ea27b07cee451c1cc8b861df355fa9" HandleID="k8s-pod-network.8dd1e5c8da23745cb382a65ccee19945d5ea27b07cee451c1cc8b861df355fa9" Workload="localhost-k8s-goldmane--54d579b49d--kg56r-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f730), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-54d579b49d-kg56r", "timestamp":"2025-09-05 06:21:47.808083844 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 5 06:21:47.863330 containerd[1585]: 2025-09-05 06:21:47.808 [INFO][4422] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 06:21:47.863330 containerd[1585]: 2025-09-05 06:21:47.808 [INFO][4422] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 5 06:21:47.863330 containerd[1585]: 2025-09-05 06:21:47.809 [INFO][4422] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 5 06:21:47.863330 containerd[1585]: 2025-09-05 06:21:47.817 [INFO][4422] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8dd1e5c8da23745cb382a65ccee19945d5ea27b07cee451c1cc8b861df355fa9" host="localhost" Sep 5 06:21:47.863330 containerd[1585]: 2025-09-05 06:21:47.822 [INFO][4422] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 5 06:21:47.863330 containerd[1585]: 2025-09-05 06:21:47.827 [INFO][4422] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 5 06:21:47.863330 containerd[1585]: 2025-09-05 06:21:47.829 [INFO][4422] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 5 06:21:47.863330 containerd[1585]: 2025-09-05 06:21:47.831 [INFO][4422] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 5 06:21:47.863330 containerd[1585]: 2025-09-05 06:21:47.831 [INFO][4422] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.8dd1e5c8da23745cb382a65ccee19945d5ea27b07cee451c1cc8b861df355fa9" host="localhost" Sep 5 06:21:47.863549 containerd[1585]: 2025-09-05 06:21:47.832 [INFO][4422] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.8dd1e5c8da23745cb382a65ccee19945d5ea27b07cee451c1cc8b861df355fa9 Sep 5 06:21:47.863549 containerd[1585]: 2025-09-05 06:21:47.836 [INFO][4422] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.8dd1e5c8da23745cb382a65ccee19945d5ea27b07cee451c1cc8b861df355fa9" host="localhost" Sep 5 06:21:47.863549 containerd[1585]: 2025-09-05 06:21:47.842 [INFO][4422] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.8dd1e5c8da23745cb382a65ccee19945d5ea27b07cee451c1cc8b861df355fa9" host="localhost" Sep 5 06:21:47.863549 containerd[1585]: 2025-09-05 06:21:47.842 [INFO][4422] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.8dd1e5c8da23745cb382a65ccee19945d5ea27b07cee451c1cc8b861df355fa9" host="localhost" Sep 5 06:21:47.863549 containerd[1585]: 2025-09-05 06:21:47.842 [INFO][4422] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 5 06:21:47.863549 containerd[1585]: 2025-09-05 06:21:47.842 [INFO][4422] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="8dd1e5c8da23745cb382a65ccee19945d5ea27b07cee451c1cc8b861df355fa9" HandleID="k8s-pod-network.8dd1e5c8da23745cb382a65ccee19945d5ea27b07cee451c1cc8b861df355fa9" Workload="localhost-k8s-goldmane--54d579b49d--kg56r-eth0" Sep 5 06:21:47.863669 containerd[1585]: 2025-09-05 06:21:47.847 [INFO][4378] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8dd1e5c8da23745cb382a65ccee19945d5ea27b07cee451c1cc8b861df355fa9" Namespace="calico-system" Pod="goldmane-54d579b49d-kg56r" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--kg56r-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--54d579b49d--kg56r-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"164bbda0-024b-4642-aebe-7f8294f38db0", ResourceVersion:"862", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 6, 21, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-54d579b49d-kg56r", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali7aead5335b9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 06:21:47.863669 containerd[1585]: 2025-09-05 06:21:47.847 [INFO][4378] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="8dd1e5c8da23745cb382a65ccee19945d5ea27b07cee451c1cc8b861df355fa9" Namespace="calico-system" Pod="goldmane-54d579b49d-kg56r" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--kg56r-eth0" Sep 5 06:21:47.863749 containerd[1585]: 2025-09-05 06:21:47.847 [INFO][4378] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7aead5335b9 ContainerID="8dd1e5c8da23745cb382a65ccee19945d5ea27b07cee451c1cc8b861df355fa9" Namespace="calico-system" Pod="goldmane-54d579b49d-kg56r" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--kg56r-eth0" Sep 5 06:21:47.863749 containerd[1585]: 2025-09-05 06:21:47.850 [INFO][4378] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8dd1e5c8da23745cb382a65ccee19945d5ea27b07cee451c1cc8b861df355fa9" Namespace="calico-system" Pod="goldmane-54d579b49d-kg56r" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--kg56r-eth0" Sep 5 06:21:47.863796 containerd[1585]: 2025-09-05 06:21:47.851 [INFO][4378] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8dd1e5c8da23745cb382a65ccee19945d5ea27b07cee451c1cc8b861df355fa9" Namespace="calico-system" Pod="goldmane-54d579b49d-kg56r" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--kg56r-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--54d579b49d--kg56r-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"164bbda0-024b-4642-aebe-7f8294f38db0", ResourceVersion:"862", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 6, 21, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"8dd1e5c8da23745cb382a65ccee19945d5ea27b07cee451c1cc8b861df355fa9", Pod:"goldmane-54d579b49d-kg56r", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali7aead5335b9", MAC:"e2:36:06:c2:b6:ff", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 06:21:47.863917 containerd[1585]: 2025-09-05 06:21:47.860 [INFO][4378] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8dd1e5c8da23745cb382a65ccee19945d5ea27b07cee451c1cc8b861df355fa9" Namespace="calico-system" Pod="goldmane-54d579b49d-kg56r" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--kg56r-eth0" Sep 5 06:21:47.888477 containerd[1585]: time="2025-09-05T06:21:47.888428710Z" level=info msg="connecting to shim 8dd1e5c8da23745cb382a65ccee19945d5ea27b07cee451c1cc8b861df355fa9" address="unix:///run/containerd/s/ec334388f11cafdee212e36d7e0637b0b5dd92cbafb441a5451631336d0e7712" namespace=k8s.io protocol=ttrpc version=3 Sep 5 06:21:47.920017 systemd[1]: Started cri-containerd-8dd1e5c8da23745cb382a65ccee19945d5ea27b07cee451c1cc8b861df355fa9.scope - libcontainer container 8dd1e5c8da23745cb382a65ccee19945d5ea27b07cee451c1cc8b861df355fa9. 
Sep 5 06:21:47.936187 systemd-resolved[1408]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 5 06:21:47.954704 systemd-networkd[1494]: calib08fe801d71: Link UP Sep 5 06:21:47.955323 systemd-networkd[1494]: calib08fe801d71: Gained carrier Sep 5 06:21:47.972838 containerd[1585]: 2025-09-05 06:21:47.781 [INFO][4390] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--fmvm2-eth0 csi-node-driver- calico-system 184974df-56dd-43e9-9fda-6672bf4bc449 739 0 2025-09-05 06:21:21 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-fmvm2 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calib08fe801d71 [] [] }} ContainerID="c22ffdaabf2085436ff13476b7228183f171e36ef59869c7f37ce5d357cdc86a" Namespace="calico-system" Pod="csi-node-driver-fmvm2" WorkloadEndpoint="localhost-k8s-csi--node--driver--fmvm2-" Sep 5 06:21:47.972838 containerd[1585]: 2025-09-05 06:21:47.781 [INFO][4390] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c22ffdaabf2085436ff13476b7228183f171e36ef59869c7f37ce5d357cdc86a" Namespace="calico-system" Pod="csi-node-driver-fmvm2" WorkloadEndpoint="localhost-k8s-csi--node--driver--fmvm2-eth0" Sep 5 06:21:47.972838 containerd[1585]: 2025-09-05 06:21:47.820 [INFO][4434] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c22ffdaabf2085436ff13476b7228183f171e36ef59869c7f37ce5d357cdc86a" HandleID="k8s-pod-network.c22ffdaabf2085436ff13476b7228183f171e36ef59869c7f37ce5d357cdc86a" Workload="localhost-k8s-csi--node--driver--fmvm2-eth0" Sep 5 06:21:47.973037 containerd[1585]: 2025-09-05 06:21:47.820 [INFO][4434] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c22ffdaabf2085436ff13476b7228183f171e36ef59869c7f37ce5d357cdc86a" HandleID="k8s-pod-network.c22ffdaabf2085436ff13476b7228183f171e36ef59869c7f37ce5d357cdc86a" Workload="localhost-k8s-csi--node--driver--fmvm2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000584b50), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-fmvm2", "timestamp":"2025-09-05 06:21:47.820138695 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 5 06:21:47.973037 containerd[1585]: 2025-09-05 06:21:47.820 [INFO][4434] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 06:21:47.973037 containerd[1585]: 2025-09-05 06:21:47.842 [INFO][4434] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 5 06:21:47.973037 containerd[1585]: 2025-09-05 06:21:47.843 [INFO][4434] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 5 06:21:47.973037 containerd[1585]: 2025-09-05 06:21:47.916 [INFO][4434] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c22ffdaabf2085436ff13476b7228183f171e36ef59869c7f37ce5d357cdc86a" host="localhost" Sep 5 06:21:47.973037 containerd[1585]: 2025-09-05 06:21:47.923 [INFO][4434] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 5 06:21:47.973037 containerd[1585]: 2025-09-05 06:21:47.927 [INFO][4434] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 5 06:21:47.973037 containerd[1585]: 2025-09-05 06:21:47.929 [INFO][4434] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 5 06:21:47.973037 containerd[1585]: 2025-09-05 06:21:47.931 [INFO][4434] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 5 06:21:47.973037 containerd[1585]: 2025-09-05 06:21:47.932 [INFO][4434] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.c22ffdaabf2085436ff13476b7228183f171e36ef59869c7f37ce5d357cdc86a" host="localhost" Sep 5 06:21:47.973356 containerd[1585]: 2025-09-05 06:21:47.933 [INFO][4434] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c22ffdaabf2085436ff13476b7228183f171e36ef59869c7f37ce5d357cdc86a Sep 5 06:21:47.973356 containerd[1585]: 2025-09-05 06:21:47.937 [INFO][4434] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.c22ffdaabf2085436ff13476b7228183f171e36ef59869c7f37ce5d357cdc86a" host="localhost" Sep 5 06:21:47.973356 containerd[1585]: 2025-09-05 06:21:47.943 [INFO][4434] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.c22ffdaabf2085436ff13476b7228183f171e36ef59869c7f37ce5d357cdc86a" host="localhost" Sep 5 06:21:47.973356 containerd[1585]: 2025-09-05 06:21:47.943 [INFO][4434] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.c22ffdaabf2085436ff13476b7228183f171e36ef59869c7f37ce5d357cdc86a" host="localhost" Sep 5 06:21:47.973356 containerd[1585]: 2025-09-05 06:21:47.943 [INFO][4434] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 5 06:21:47.973356 containerd[1585]: 2025-09-05 06:21:47.943 [INFO][4434] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="c22ffdaabf2085436ff13476b7228183f171e36ef59869c7f37ce5d357cdc86a" HandleID="k8s-pod-network.c22ffdaabf2085436ff13476b7228183f171e36ef59869c7f37ce5d357cdc86a" Workload="localhost-k8s-csi--node--driver--fmvm2-eth0" Sep 5 06:21:47.973475 containerd[1585]: 2025-09-05 06:21:47.947 [INFO][4390] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c22ffdaabf2085436ff13476b7228183f171e36ef59869c7f37ce5d357cdc86a" Namespace="calico-system" Pod="csi-node-driver-fmvm2" WorkloadEndpoint="localhost-k8s-csi--node--driver--fmvm2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--fmvm2-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"184974df-56dd-43e9-9fda-6672bf4bc449", ResourceVersion:"739", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 6, 21, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-fmvm2", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calib08fe801d71", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 06:21:47.973532 containerd[1585]: 2025-09-05 06:21:47.948 [INFO][4390] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="c22ffdaabf2085436ff13476b7228183f171e36ef59869c7f37ce5d357cdc86a" Namespace="calico-system" Pod="csi-node-driver-fmvm2" WorkloadEndpoint="localhost-k8s-csi--node--driver--fmvm2-eth0" Sep 5 06:21:47.973532 containerd[1585]: 2025-09-05 06:21:47.948 [INFO][4390] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib08fe801d71 ContainerID="c22ffdaabf2085436ff13476b7228183f171e36ef59869c7f37ce5d357cdc86a" Namespace="calico-system" Pod="csi-node-driver-fmvm2" WorkloadEndpoint="localhost-k8s-csi--node--driver--fmvm2-eth0" Sep 5 06:21:47.973532 containerd[1585]: 2025-09-05 06:21:47.955 [INFO][4390] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c22ffdaabf2085436ff13476b7228183f171e36ef59869c7f37ce5d357cdc86a" Namespace="calico-system" Pod="csi-node-driver-fmvm2" WorkloadEndpoint="localhost-k8s-csi--node--driver--fmvm2-eth0" Sep 5 06:21:47.973601 containerd[1585]: 2025-09-05 06:21:47.956 [INFO][4390] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c22ffdaabf2085436ff13476b7228183f171e36ef59869c7f37ce5d357cdc86a" Namespace="calico-system" Pod="csi-node-driver-fmvm2" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--fmvm2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--fmvm2-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"184974df-56dd-43e9-9fda-6672bf4bc449", ResourceVersion:"739", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 6, 21, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"c22ffdaabf2085436ff13476b7228183f171e36ef59869c7f37ce5d357cdc86a", Pod:"csi-node-driver-fmvm2", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calib08fe801d71", MAC:"be:2d:b3:27:b2:9f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 06:21:47.973651 containerd[1585]: 2025-09-05 06:21:47.966 [INFO][4390] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c22ffdaabf2085436ff13476b7228183f171e36ef59869c7f37ce5d357cdc86a" Namespace="calico-system" Pod="csi-node-driver-fmvm2" WorkloadEndpoint="localhost-k8s-csi--node--driver--fmvm2-eth0" Sep 5 06:21:47.980679 containerd[1585]: time="2025-09-05T06:21:47.980598154Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-kg56r,Uid:164bbda0-024b-4642-aebe-7f8294f38db0,Namespace:calico-system,Attempt:0,} returns sandbox id \"8dd1e5c8da23745cb382a65ccee19945d5ea27b07cee451c1cc8b861df355fa9\"" Sep 5 06:21:48.006060 containerd[1585]: time="2025-09-05T06:21:48.006003610Z" level=info msg="connecting to shim c22ffdaabf2085436ff13476b7228183f171e36ef59869c7f37ce5d357cdc86a" address="unix:///run/containerd/s/a9b091bb5299d4a100543273f50985c88b7678a4362b363a9641006ba3596c31" namespace=k8s.io protocol=ttrpc version=3 Sep 5 06:21:48.035984 systemd[1]: Started cri-containerd-c22ffdaabf2085436ff13476b7228183f171e36ef59869c7f37ce5d357cdc86a.scope - libcontainer container c22ffdaabf2085436ff13476b7228183f171e36ef59869c7f37ce5d357cdc86a. 
Sep 5 06:21:48.052720 systemd-resolved[1408]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 5 06:21:48.072654 systemd-networkd[1494]: calicdbfd8a7c74: Link UP Sep 5 06:21:48.073739 systemd-networkd[1494]: calicdbfd8a7c74: Gained carrier Sep 5 06:21:48.092018 containerd[1585]: 2025-09-05 06:21:47.780 [INFO][4398] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--cf68b5bff--qcvfw-eth0 calico-apiserver-cf68b5bff- calico-apiserver e538d525-2f5b-4679-bb7b-2d7b2bfcc773 858 0 2025-09-05 06:21:18 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:cf68b5bff projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-cf68b5bff-qcvfw eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calicdbfd8a7c74 [] [] }} ContainerID="0f19292639f365239c2b7e5a9da7ccd1741e3253ef7ee17824c2b6191a4eaf2f" Namespace="calico-apiserver" Pod="calico-apiserver-cf68b5bff-qcvfw" WorkloadEndpoint="localhost-k8s-calico--apiserver--cf68b5bff--qcvfw-" Sep 5 06:21:48.092018 containerd[1585]: 2025-09-05 06:21:47.780 [INFO][4398] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0f19292639f365239c2b7e5a9da7ccd1741e3253ef7ee17824c2b6191a4eaf2f" Namespace="calico-apiserver" Pod="calico-apiserver-cf68b5bff-qcvfw" WorkloadEndpoint="localhost-k8s-calico--apiserver--cf68b5bff--qcvfw-eth0" Sep 5 06:21:48.092018 containerd[1585]: 2025-09-05 06:21:47.825 [INFO][4428] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0f19292639f365239c2b7e5a9da7ccd1741e3253ef7ee17824c2b6191a4eaf2f" HandleID="k8s-pod-network.0f19292639f365239c2b7e5a9da7ccd1741e3253ef7ee17824c2b6191a4eaf2f" Workload="localhost-k8s-calico--apiserver--cf68b5bff--qcvfw-eth0" Sep 5 06:21:48.092513 containerd[1585]: 2025-09-05 06:21:47.825 [INFO][4428] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="0f19292639f365239c2b7e5a9da7ccd1741e3253ef7ee17824c2b6191a4eaf2f" HandleID="k8s-pod-network.0f19292639f365239c2b7e5a9da7ccd1741e3253ef7ee17824c2b6191a4eaf2f" Workload="localhost-k8s-calico--apiserver--cf68b5bff--qcvfw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f490), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-cf68b5bff-qcvfw", "timestamp":"2025-09-05 06:21:47.825601986 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 5 06:21:48.092513 containerd[1585]: 2025-09-05 06:21:47.825 [INFO][4428] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 06:21:48.092513 containerd[1585]: 2025-09-05 06:21:47.943 [INFO][4428] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 5 06:21:48.092513 containerd[1585]: 2025-09-05 06:21:47.943 [INFO][4428] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 5 06:21:48.092513 containerd[1585]: 2025-09-05 06:21:48.019 [INFO][4428] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0f19292639f365239c2b7e5a9da7ccd1741e3253ef7ee17824c2b6191a4eaf2f" host="localhost" Sep 5 06:21:48.092513 containerd[1585]: 2025-09-05 06:21:48.025 [INFO][4428] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 5 06:21:48.092513 containerd[1585]: 2025-09-05 06:21:48.031 [INFO][4428] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 5 06:21:48.092513 containerd[1585]: 2025-09-05 06:21:48.033 [INFO][4428] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 5 06:21:48.092513 containerd[1585]: 2025-09-05 06:21:48.035 [INFO][4428] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 5 06:21:48.092513 containerd[1585]: 2025-09-05 06:21:48.035 [INFO][4428] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.0f19292639f365239c2b7e5a9da7ccd1741e3253ef7ee17824c2b6191a4eaf2f" host="localhost" Sep 5 06:21:48.092790 containerd[1585]: 2025-09-05 06:21:48.036 [INFO][4428] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.0f19292639f365239c2b7e5a9da7ccd1741e3253ef7ee17824c2b6191a4eaf2f Sep 5 06:21:48.092790 containerd[1585]: 2025-09-05 06:21:48.041 [INFO][4428] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.0f19292639f365239c2b7e5a9da7ccd1741e3253ef7ee17824c2b6191a4eaf2f" host="localhost" Sep 5 06:21:48.092790 containerd[1585]: 2025-09-05 06:21:48.050 [INFO][4428] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.0f19292639f365239c2b7e5a9da7ccd1741e3253ef7ee17824c2b6191a4eaf2f" host="localhost" Sep 5 06:21:48.092790 containerd[1585]: 2025-09-05 06:21:48.050 [INFO][4428] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.0f19292639f365239c2b7e5a9da7ccd1741e3253ef7ee17824c2b6191a4eaf2f" host="localhost" Sep 5 06:21:48.092790 containerd[1585]: 2025-09-05 06:21:48.051 [INFO][4428] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 5 06:21:48.092790 containerd[1585]: 2025-09-05 06:21:48.051 [INFO][4428] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="0f19292639f365239c2b7e5a9da7ccd1741e3253ef7ee17824c2b6191a4eaf2f" HandleID="k8s-pod-network.0f19292639f365239c2b7e5a9da7ccd1741e3253ef7ee17824c2b6191a4eaf2f" Workload="localhost-k8s-calico--apiserver--cf68b5bff--qcvfw-eth0" Sep 5 06:21:48.092928 containerd[1585]: 2025-09-05 06:21:48.064 [INFO][4398] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0f19292639f365239c2b7e5a9da7ccd1741e3253ef7ee17824c2b6191a4eaf2f" Namespace="calico-apiserver" Pod="calico-apiserver-cf68b5bff-qcvfw" WorkloadEndpoint="localhost-k8s-calico--apiserver--cf68b5bff--qcvfw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--cf68b5bff--qcvfw-eth0", GenerateName:"calico-apiserver-cf68b5bff-", Namespace:"calico-apiserver", SelfLink:"", UID:"e538d525-2f5b-4679-bb7b-2d7b2bfcc773", ResourceVersion:"858", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 6, 21, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"cf68b5bff", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-cf68b5bff-qcvfw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calicdbfd8a7c74", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 06:21:48.092984 containerd[1585]: 2025-09-05 06:21:48.064 [INFO][4398] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="0f19292639f365239c2b7e5a9da7ccd1741e3253ef7ee17824c2b6191a4eaf2f" Namespace="calico-apiserver" Pod="calico-apiserver-cf68b5bff-qcvfw" WorkloadEndpoint="localhost-k8s-calico--apiserver--cf68b5bff--qcvfw-eth0" Sep 5 06:21:48.092984 containerd[1585]: 2025-09-05 06:21:48.064 [INFO][4398] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calicdbfd8a7c74 ContainerID="0f19292639f365239c2b7e5a9da7ccd1741e3253ef7ee17824c2b6191a4eaf2f" Namespace="calico-apiserver" Pod="calico-apiserver-cf68b5bff-qcvfw" WorkloadEndpoint="localhost-k8s-calico--apiserver--cf68b5bff--qcvfw-eth0" Sep 5 06:21:48.092984 containerd[1585]: 2025-09-05 06:21:48.073 [INFO][4398] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0f19292639f365239c2b7e5a9da7ccd1741e3253ef7ee17824c2b6191a4eaf2f" Namespace="calico-apiserver" Pod="calico-apiserver-cf68b5bff-qcvfw" WorkloadEndpoint="localhost-k8s-calico--apiserver--cf68b5bff--qcvfw-eth0" Sep 5 06:21:48.093051 containerd[1585]: 2025-09-05 06:21:48.074 [INFO][4398] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="0f19292639f365239c2b7e5a9da7ccd1741e3253ef7ee17824c2b6191a4eaf2f" Namespace="calico-apiserver" Pod="calico-apiserver-cf68b5bff-qcvfw" WorkloadEndpoint="localhost-k8s-calico--apiserver--cf68b5bff--qcvfw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--cf68b5bff--qcvfw-eth0", GenerateName:"calico-apiserver-cf68b5bff-", Namespace:"calico-apiserver", SelfLink:"", UID:"e538d525-2f5b-4679-bb7b-2d7b2bfcc773", ResourceVersion:"858", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 6, 21, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"cf68b5bff", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"0f19292639f365239c2b7e5a9da7ccd1741e3253ef7ee17824c2b6191a4eaf2f", Pod:"calico-apiserver-cf68b5bff-qcvfw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calicdbfd8a7c74", MAC:"0e:1f:7f:ce:78:c8", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 06:21:48.093128 containerd[1585]: 2025-09-05 06:21:48.086 [INFO][4398] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0f19292639f365239c2b7e5a9da7ccd1741e3253ef7ee17824c2b6191a4eaf2f" Namespace="calico-apiserver" Pod="calico-apiserver-cf68b5bff-qcvfw" WorkloadEndpoint="localhost-k8s-calico--apiserver--cf68b5bff--qcvfw-eth0" Sep 5 06:21:48.093128 containerd[1585]: time="2025-09-05T06:21:48.093045572Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fmvm2,Uid:184974df-56dd-43e9-9fda-6672bf4bc449,Namespace:calico-system,Attempt:0,} returns sandbox id \"c22ffdaabf2085436ff13476b7228183f171e36ef59869c7f37ce5d357cdc86a\"" Sep 5 06:21:48.129340 containerd[1585]: time="2025-09-05T06:21:48.129288301Z" level=info msg="connecting to shim 0f19292639f365239c2b7e5a9da7ccd1741e3253ef7ee17824c2b6191a4eaf2f" address="unix:///run/containerd/s/8ee52e66a1cb8b227e257a32cbb2819d826fc1fb37eee76ef4d1d6536f5681e9" namespace=k8s.io protocol=ttrpc version=3 Sep 5 06:21:48.161014 systemd[1]: Started cri-containerd-0f19292639f365239c2b7e5a9da7ccd1741e3253ef7ee17824c2b6191a4eaf2f.scope - libcontainer container 0f19292639f365239c2b7e5a9da7ccd1741e3253ef7ee17824c2b6191a4eaf2f. 
Sep 5 06:21:48.174222 systemd-resolved[1408]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 5 06:21:48.207568 containerd[1585]: time="2025-09-05T06:21:48.207517813Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-cf68b5bff-qcvfw,Uid:e538d525-2f5b-4679-bb7b-2d7b2bfcc773,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"0f19292639f365239c2b7e5a9da7ccd1741e3253ef7ee17824c2b6191a4eaf2f\"" Sep 5 06:21:48.723969 containerd[1585]: time="2025-09-05T06:21:48.723908050Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-xhgtj,Uid:cbe88b9e-342d-4d7f-93cf-136f2f4d94ec,Namespace:kube-system,Attempt:0,}" Sep 5 06:21:48.724433 containerd[1585]: time="2025-09-05T06:21:48.724340539Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7b8d7b8bf-wqzvf,Uid:7d8d19a7-2cb7-4172-9d5e-ed2788eb3120,Namespace:calico-system,Attempt:0,}" Sep 5 06:21:48.779139 systemd[1]: Started sshd@8-10.0.0.140:22-10.0.0.1:60176.service - OpenSSH per-connection server daemon (10.0.0.1:60176). Sep 5 06:21:48.839178 systemd-networkd[1494]: cali3d9556587c8: Link UP Sep 5 06:21:48.840308 systemd-networkd[1494]: cali3d9556587c8: Gained carrier Sep 5 06:21:48.853519 containerd[1585]: 2025-09-05 06:21:48.767 [INFO][4628] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--7b8d7b8bf--wqzvf-eth0 calico-kube-controllers-7b8d7b8bf- calico-system 7d8d19a7-2cb7-4172-9d5e-ed2788eb3120 859 0 2025-09-05 06:21:21 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7b8d7b8bf projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-7b8d7b8bf-wqzvf eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali3d9556587c8 [] [] }} ContainerID="cca08677e55226ec7ca2b462cc43a154648b5904ff7adea6b090eb58a3d707f6" Namespace="calico-system" Pod="calico-kube-controllers-7b8d7b8bf-wqzvf" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7b8d7b8bf--wqzvf-" Sep 5 06:21:48.853519 containerd[1585]: 2025-09-05 06:21:48.767 [INFO][4628] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="cca08677e55226ec7ca2b462cc43a154648b5904ff7adea6b090eb58a3d707f6" Namespace="calico-system" Pod="calico-kube-controllers-7b8d7b8bf-wqzvf" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7b8d7b8bf--wqzvf-eth0" Sep 5 06:21:48.853519 containerd[1585]: 2025-09-05 06:21:48.792 [INFO][4651] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cca08677e55226ec7ca2b462cc43a154648b5904ff7adea6b090eb58a3d707f6" HandleID="k8s-pod-network.cca08677e55226ec7ca2b462cc43a154648b5904ff7adea6b090eb58a3d707f6" Workload="localhost-k8s-calico--kube--controllers--7b8d7b8bf--wqzvf-eth0" Sep 5 06:21:48.853792 containerd[1585]: 2025-09-05 06:21:48.793 [INFO][4651] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="cca08677e55226ec7ca2b462cc43a154648b5904ff7adea6b090eb58a3d707f6" HandleID="k8s-pod-network.cca08677e55226ec7ca2b462cc43a154648b5904ff7adea6b090eb58a3d707f6" Workload="localhost-k8s-calico--kube--controllers--7b8d7b8bf--wqzvf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0001a48a0), 
Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-7b8d7b8bf-wqzvf", "timestamp":"2025-09-05 06:21:48.792746374 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 5 06:21:48.853792 containerd[1585]: 2025-09-05 06:21:48.793 [INFO][4651] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 06:21:48.853792 containerd[1585]: 2025-09-05 06:21:48.793 [INFO][4651] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 5 06:21:48.853792 containerd[1585]: 2025-09-05 06:21:48.793 [INFO][4651] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 5 06:21:48.853792 containerd[1585]: 2025-09-05 06:21:48.799 [INFO][4651] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.cca08677e55226ec7ca2b462cc43a154648b5904ff7adea6b090eb58a3d707f6" host="localhost" Sep 5 06:21:48.853792 containerd[1585]: 2025-09-05 06:21:48.802 [INFO][4651] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 5 06:21:48.853792 containerd[1585]: 2025-09-05 06:21:48.805 [INFO][4651] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 5 06:21:48.853792 containerd[1585]: 2025-09-05 06:21:48.807 [INFO][4651] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 5 06:21:48.853792 containerd[1585]: 2025-09-05 06:21:48.809 [INFO][4651] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 5 06:21:48.853792 containerd[1585]: 2025-09-05 06:21:48.810 [INFO][4651] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.cca08677e55226ec7ca2b462cc43a154648b5904ff7adea6b090eb58a3d707f6" host="localhost" Sep 5 06:21:48.854982 containerd[1585]: 2025-09-05 06:21:48.811 [INFO][4651] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.cca08677e55226ec7ca2b462cc43a154648b5904ff7adea6b090eb58a3d707f6 Sep 5 06:21:48.854982 containerd[1585]: 2025-09-05 06:21:48.818 [INFO][4651] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.cca08677e55226ec7ca2b462cc43a154648b5904ff7adea6b090eb58a3d707f6" host="localhost" Sep 5 06:21:48.854982 containerd[1585]: 2025-09-05 06:21:48.825 [INFO][4651] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.cca08677e55226ec7ca2b462cc43a154648b5904ff7adea6b090eb58a3d707f6" host="localhost" Sep 5 06:21:48.854982 containerd[1585]: 2025-09-05 06:21:48.825 [INFO][4651] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.cca08677e55226ec7ca2b462cc43a154648b5904ff7adea6b090eb58a3d707f6" host="localhost" Sep 5 06:21:48.854982 containerd[1585]: 2025-09-05 06:21:48.826 [INFO][4651] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 5 06:21:48.854982 containerd[1585]: 2025-09-05 06:21:48.826 [INFO][4651] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="cca08677e55226ec7ca2b462cc43a154648b5904ff7adea6b090eb58a3d707f6" HandleID="k8s-pod-network.cca08677e55226ec7ca2b462cc43a154648b5904ff7adea6b090eb58a3d707f6" Workload="localhost-k8s-calico--kube--controllers--7b8d7b8bf--wqzvf-eth0" Sep 5 06:21:48.855110 sshd[4657]: Accepted publickey for core from 10.0.0.1 port 60176 ssh2: RSA SHA256:HqMeIfrf4KQXAwx0HmR8sqFIxN2NpL8j8iF6H1odrA8 Sep 5 06:21:48.855426 containerd[1585]: 2025-09-05 06:21:48.831 [INFO][4628] cni-plugin/k8s.go 418: Populated endpoint ContainerID="cca08677e55226ec7ca2b462cc43a154648b5904ff7adea6b090eb58a3d707f6" Namespace="calico-system" Pod="calico-kube-controllers-7b8d7b8bf-wqzvf" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7b8d7b8bf--wqzvf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--7b8d7b8bf--wqzvf-eth0", GenerateName:"calico-kube-controllers-7b8d7b8bf-", Namespace:"calico-system", SelfLink:"", UID:"7d8d19a7-2cb7-4172-9d5e-ed2788eb3120", ResourceVersion:"859", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 6, 21, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7b8d7b8bf", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-7b8d7b8bf-wqzvf", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali3d9556587c8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 06:21:48.855489 containerd[1585]: 2025-09-05 06:21:48.832 [INFO][4628] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="cca08677e55226ec7ca2b462cc43a154648b5904ff7adea6b090eb58a3d707f6" Namespace="calico-system" Pod="calico-kube-controllers-7b8d7b8bf-wqzvf" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7b8d7b8bf--wqzvf-eth0" Sep 5 06:21:48.855489 containerd[1585]: 2025-09-05 06:21:48.832 [INFO][4628] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3d9556587c8 ContainerID="cca08677e55226ec7ca2b462cc43a154648b5904ff7adea6b090eb58a3d707f6" Namespace="calico-system" Pod="calico-kube-controllers-7b8d7b8bf-wqzvf" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7b8d7b8bf--wqzvf-eth0" Sep 5 06:21:48.855489 containerd[1585]: 2025-09-05 06:21:48.840 [INFO][4628] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="cca08677e55226ec7ca2b462cc43a154648b5904ff7adea6b090eb58a3d707f6" Namespace="calico-system" Pod="calico-kube-controllers-7b8d7b8bf-wqzvf" 
WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7b8d7b8bf--wqzvf-eth0" Sep 5 06:21:48.855554 containerd[1585]: 2025-09-05 06:21:48.841 [INFO][4628] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="cca08677e55226ec7ca2b462cc43a154648b5904ff7adea6b090eb58a3d707f6" Namespace="calico-system" Pod="calico-kube-controllers-7b8d7b8bf-wqzvf" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7b8d7b8bf--wqzvf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--7b8d7b8bf--wqzvf-eth0", GenerateName:"calico-kube-controllers-7b8d7b8bf-", Namespace:"calico-system", SelfLink:"", UID:"7d8d19a7-2cb7-4172-9d5e-ed2788eb3120", ResourceVersion:"859", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 6, 21, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7b8d7b8bf", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"cca08677e55226ec7ca2b462cc43a154648b5904ff7adea6b090eb58a3d707f6", Pod:"calico-kube-controllers-7b8d7b8bf-wqzvf", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali3d9556587c8", MAC:"02:24:50:55:75:43", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 06:21:48.855607 containerd[1585]: 2025-09-05 06:21:48.849 [INFO][4628] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="cca08677e55226ec7ca2b462cc43a154648b5904ff7adea6b090eb58a3d707f6" Namespace="calico-system" Pod="calico-kube-controllers-7b8d7b8bf-wqzvf" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7b8d7b8bf--wqzvf-eth0" Sep 5 06:21:48.856487 sshd-session[4657]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 06:21:48.862249 systemd-logind[1570]: New session 9 of user core. Sep 5 06:21:48.867143 systemd[1]: Started session-9.scope - Session 9 of User core. Sep 5 06:21:48.882371 containerd[1585]: time="2025-09-05T06:21:48.881636448Z" level=info msg="connecting to shim cca08677e55226ec7ca2b462cc43a154648b5904ff7adea6b090eb58a3d707f6" address="unix:///run/containerd/s/67274d3d49e4ff9f18b90c7e91db447d74dd4127a23c65d628f8e4f4b048c7cd" namespace=k8s.io protocol=ttrpc version=3 Sep 5 06:21:48.919126 systemd[1]: Started cri-containerd-cca08677e55226ec7ca2b462cc43a154648b5904ff7adea6b090eb58a3d707f6.scope - libcontainer container cca08677e55226ec7ca2b462cc43a154648b5904ff7adea6b090eb58a3d707f6. 
Sep 5 06:21:48.936673 systemd-networkd[1494]: calib8e21a313b6: Link UP Sep 5 06:21:48.937975 systemd-networkd[1494]: calib8e21a313b6: Gained carrier Sep 5 06:21:48.939765 systemd-resolved[1408]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 5 06:21:48.963047 containerd[1585]: 2025-09-05 06:21:48.761 [INFO][4615] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--674b8bbfcf--xhgtj-eth0 coredns-674b8bbfcf- kube-system cbe88b9e-342d-4d7f-93cf-136f2f4d94ec 861 0 2025-09-05 06:21:05 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-674b8bbfcf-xhgtj eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calib8e21a313b6 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="28e07e8e7a68c72cad4dcd0bedd24189a48cb02bcfade47aeeaf03922d6d6d6a" Namespace="kube-system" Pod="coredns-674b8bbfcf-xhgtj" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--xhgtj-" Sep 5 06:21:48.963047 containerd[1585]: 2025-09-05 06:21:48.761 [INFO][4615] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="28e07e8e7a68c72cad4dcd0bedd24189a48cb02bcfade47aeeaf03922d6d6d6a" Namespace="kube-system" Pod="coredns-674b8bbfcf-xhgtj" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--xhgtj-eth0" Sep 5 06:21:48.963047 containerd[1585]: 2025-09-05 06:21:48.798 [INFO][4644] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="28e07e8e7a68c72cad4dcd0bedd24189a48cb02bcfade47aeeaf03922d6d6d6a" HandleID="k8s-pod-network.28e07e8e7a68c72cad4dcd0bedd24189a48cb02bcfade47aeeaf03922d6d6d6a" Workload="localhost-k8s-coredns--674b8bbfcf--xhgtj-eth0" Sep 5 06:21:48.963270 containerd[1585]: 2025-09-05 06:21:48.798 [INFO][4644] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="28e07e8e7a68c72cad4dcd0bedd24189a48cb02bcfade47aeeaf03922d6d6d6a" HandleID="k8s-pod-network.28e07e8e7a68c72cad4dcd0bedd24189a48cb02bcfade47aeeaf03922d6d6d6a" Workload="localhost-k8s-coredns--674b8bbfcf--xhgtj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f560), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-674b8bbfcf-xhgtj", "timestamp":"2025-09-05 06:21:48.798362373 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 5 06:21:48.963270 containerd[1585]: 2025-09-05 06:21:48.798 [INFO][4644] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 06:21:48.963270 containerd[1585]: 2025-09-05 06:21:48.825 [INFO][4644] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 5 06:21:48.963270 containerd[1585]: 2025-09-05 06:21:48.826 [INFO][4644] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 5 06:21:48.963270 containerd[1585]: 2025-09-05 06:21:48.900 [INFO][4644] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.28e07e8e7a68c72cad4dcd0bedd24189a48cb02bcfade47aeeaf03922d6d6d6a" host="localhost" Sep 5 06:21:48.963270 containerd[1585]: 2025-09-05 06:21:48.905 [INFO][4644] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 5 06:21:48.963270 containerd[1585]: 2025-09-05 06:21:48.908 [INFO][4644] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 5 06:21:48.963270 containerd[1585]: 2025-09-05 06:21:48.911 [INFO][4644] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 5 06:21:48.963270 containerd[1585]: 2025-09-05 06:21:48.914 [INFO][4644] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 5 06:21:48.963270 containerd[1585]: 2025-09-05 06:21:48.914 [INFO][4644] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.28e07e8e7a68c72cad4dcd0bedd24189a48cb02bcfade47aeeaf03922d6d6d6a" host="localhost" Sep 5 06:21:48.963483 containerd[1585]: 2025-09-05 06:21:48.916 [INFO][4644] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.28e07e8e7a68c72cad4dcd0bedd24189a48cb02bcfade47aeeaf03922d6d6d6a Sep 5 06:21:48.963483 containerd[1585]: 2025-09-05 06:21:48.918 [INFO][4644] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.28e07e8e7a68c72cad4dcd0bedd24189a48cb02bcfade47aeeaf03922d6d6d6a" host="localhost" Sep 5 06:21:48.963483 containerd[1585]: 2025-09-05 06:21:48.926 [INFO][4644] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.28e07e8e7a68c72cad4dcd0bedd24189a48cb02bcfade47aeeaf03922d6d6d6a" host="localhost" Sep 5 06:21:48.963483 containerd[1585]: 2025-09-05 06:21:48.926 [INFO][4644] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.28e07e8e7a68c72cad4dcd0bedd24189a48cb02bcfade47aeeaf03922d6d6d6a" host="localhost" Sep 5 06:21:48.963483 containerd[1585]: 2025-09-05 06:21:48.926 [INFO][4644] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 5 06:21:48.963483 containerd[1585]: 2025-09-05 06:21:48.926 [INFO][4644] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="28e07e8e7a68c72cad4dcd0bedd24189a48cb02bcfade47aeeaf03922d6d6d6a" HandleID="k8s-pod-network.28e07e8e7a68c72cad4dcd0bedd24189a48cb02bcfade47aeeaf03922d6d6d6a" Workload="localhost-k8s-coredns--674b8bbfcf--xhgtj-eth0" Sep 5 06:21:48.963620 containerd[1585]: 2025-09-05 06:21:48.931 [INFO][4615] cni-plugin/k8s.go 418: Populated endpoint ContainerID="28e07e8e7a68c72cad4dcd0bedd24189a48cb02bcfade47aeeaf03922d6d6d6a" Namespace="kube-system" Pod="coredns-674b8bbfcf-xhgtj" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--xhgtj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--xhgtj-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"cbe88b9e-342d-4d7f-93cf-136f2f4d94ec", ResourceVersion:"861", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 6, 21, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-674b8bbfcf-xhgtj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib8e21a313b6", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 06:21:48.963690 containerd[1585]: 2025-09-05 06:21:48.931 [INFO][4615] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="28e07e8e7a68c72cad4dcd0bedd24189a48cb02bcfade47aeeaf03922d6d6d6a" Namespace="kube-system" Pod="coredns-674b8bbfcf-xhgtj" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--xhgtj-eth0" Sep 5 06:21:48.963690 containerd[1585]: 2025-09-05 06:21:48.931 [INFO][4615] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib8e21a313b6 ContainerID="28e07e8e7a68c72cad4dcd0bedd24189a48cb02bcfade47aeeaf03922d6d6d6a" Namespace="kube-system" Pod="coredns-674b8bbfcf-xhgtj" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--xhgtj-eth0" Sep 5 06:21:48.963690 containerd[1585]: 2025-09-05 06:21:48.938 [INFO][4615] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="28e07e8e7a68c72cad4dcd0bedd24189a48cb02bcfade47aeeaf03922d6d6d6a" Namespace="kube-system" Pod="coredns-674b8bbfcf-xhgtj" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--xhgtj-eth0" Sep 5 06:21:48.963761 
containerd[1585]: 2025-09-05 06:21:48.939 [INFO][4615] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="28e07e8e7a68c72cad4dcd0bedd24189a48cb02bcfade47aeeaf03922d6d6d6a" Namespace="kube-system" Pod="coredns-674b8bbfcf-xhgtj" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--xhgtj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--xhgtj-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"cbe88b9e-342d-4d7f-93cf-136f2f4d94ec", ResourceVersion:"861", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 6, 21, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"28e07e8e7a68c72cad4dcd0bedd24189a48cb02bcfade47aeeaf03922d6d6d6a", Pod:"coredns-674b8bbfcf-xhgtj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib8e21a313b6", MAC:"5e:ea:4c:93:bf:e1", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 06:21:48.963761 containerd[1585]: 2025-09-05 06:21:48.951 [INFO][4615] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="28e07e8e7a68c72cad4dcd0bedd24189a48cb02bcfade47aeeaf03922d6d6d6a" Namespace="kube-system" Pod="coredns-674b8bbfcf-xhgtj" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--xhgtj-eth0" Sep 5 06:21:48.998889 containerd[1585]: time="2025-09-05T06:21:48.997561692Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7b8d7b8bf-wqzvf,Uid:7d8d19a7-2cb7-4172-9d5e-ed2788eb3120,Namespace:calico-system,Attempt:0,} returns sandbox id \"cca08677e55226ec7ca2b462cc43a154648b5904ff7adea6b090eb58a3d707f6\"" Sep 5 06:21:49.006866 containerd[1585]: time="2025-09-05T06:21:49.006771520Z" level=info msg="connecting to shim 28e07e8e7a68c72cad4dcd0bedd24189a48cb02bcfade47aeeaf03922d6d6d6a" address="unix:///run/containerd/s/2c6d72255dd463b8dc54845b9faa800dac885b8a8a23ab9ac500b9285d8763fb" namespace=k8s.io protocol=ttrpc version=3 Sep 5 06:21:49.030949 systemd[1]: Started cri-containerd-28e07e8e7a68c72cad4dcd0bedd24189a48cb02bcfade47aeeaf03922d6d6d6a.scope - libcontainer container 28e07e8e7a68c72cad4dcd0bedd24189a48cb02bcfade47aeeaf03922d6d6d6a. 
Sep 5 06:21:49.049388 systemd-resolved[1408]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 5 06:21:49.053837 sshd[4675]: Connection closed by 10.0.0.1 port 60176 Sep 5 06:21:49.054206 sshd-session[4657]: pam_unix(sshd:session): session closed for user core Sep 5 06:21:49.060427 systemd[1]: sshd@8-10.0.0.140:22-10.0.0.1:60176.service: Deactivated successfully. Sep 5 06:21:49.062846 systemd[1]: session-9.scope: Deactivated successfully. Sep 5 06:21:49.063767 systemd-logind[1570]: Session 9 logged out. Waiting for processes to exit. Sep 5 06:21:49.066120 systemd-logind[1570]: Removed session 9. Sep 5 06:21:49.082692 containerd[1585]: time="2025-09-05T06:21:49.082652259Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-xhgtj,Uid:cbe88b9e-342d-4d7f-93cf-136f2f4d94ec,Namespace:kube-system,Attempt:0,} returns sandbox id \"28e07e8e7a68c72cad4dcd0bedd24189a48cb02bcfade47aeeaf03922d6d6d6a\"" Sep 5 06:21:49.089859 containerd[1585]: time="2025-09-05T06:21:49.089802077Z" level=info msg="CreateContainer within sandbox \"28e07e8e7a68c72cad4dcd0bedd24189a48cb02bcfade47aeeaf03922d6d6d6a\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 5 06:21:49.103714 containerd[1585]: time="2025-09-05T06:21:49.103656632Z" level=info msg="Container 493f393b3ab9d9351e76c0a88ba5356bbcdd8358761cd97a88e37b9342390d43: CDI devices from CRI Config.CDIDevices: []" Sep 5 06:21:49.110669 containerd[1585]: time="2025-09-05T06:21:49.110636286Z" level=info msg="CreateContainer within sandbox \"28e07e8e7a68c72cad4dcd0bedd24189a48cb02bcfade47aeeaf03922d6d6d6a\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"493f393b3ab9d9351e76c0a88ba5356bbcdd8358761cd97a88e37b9342390d43\"" Sep 5 06:21:49.111574 containerd[1585]: time="2025-09-05T06:21:49.111514379Z" level=info msg="StartContainer for \"493f393b3ab9d9351e76c0a88ba5356bbcdd8358761cd97a88e37b9342390d43\"" Sep 5 06:21:49.112400 containerd[1585]: time="2025-09-05T06:21:49.112374329Z" level=info msg="connecting to shim 493f393b3ab9d9351e76c0a88ba5356bbcdd8358761cd97a88e37b9342390d43" address="unix:///run/containerd/s/2c6d72255dd463b8dc54845b9faa800dac885b8a8a23ab9ac500b9285d8763fb" protocol=ttrpc version=3 Sep 5 06:21:49.133976 systemd[1]: Started cri-containerd-493f393b3ab9d9351e76c0a88ba5356bbcdd8358761cd97a88e37b9342390d43.scope - libcontainer container 493f393b3ab9d9351e76c0a88ba5356bbcdd8358761cd97a88e37b9342390d43. Sep 5 06:21:49.180972 containerd[1585]: time="2025-09-05T06:21:49.180928102Z" level=info msg="StartContainer for \"493f393b3ab9d9351e76c0a88ba5356bbcdd8358761cd97a88e37b9342390d43\" returns successfully" Sep 5 06:21:49.211992 systemd-networkd[1494]: calicdbfd8a7c74: Gained IPv6LL Sep 5 06:21:49.596108 systemd-networkd[1494]: cali7aead5335b9: Gained IPv6LL Sep 5 06:21:49.596433 systemd-networkd[1494]: calib08fe801d71: Gained IPv6LL Sep 5 06:21:49.724690 containerd[1585]: time="2025-09-05T06:21:49.724632474Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-cf68b5bff-zqmnv,Uid:26a5a23e-432c-4622-993d-2e73ea07fb80,Namespace:calico-apiserver,Attempt:0,}" Sep 5 06:21:49.750665 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount659187275.mount: Deactivated successfully. 
Sep 5 06:21:49.954877 kubelet[2782]: I0905 06:21:49.954798 2782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-xhgtj" podStartSLOduration=44.954780323 podStartE2EDuration="44.954780323s" podCreationTimestamp="2025-09-05 06:21:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-05 06:21:49.947122344 +0000 UTC m=+51.354364212" watchObservedRunningTime="2025-09-05 06:21:49.954780323 +0000 UTC m=+51.362022201" Sep 5 06:21:50.004043 containerd[1585]: time="2025-09-05T06:21:50.003989338Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:21:50.004397 containerd[1585]: time="2025-09-05T06:21:50.004354756Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Sep 5 06:21:50.006847 containerd[1585]: time="2025-09-05T06:21:50.005709824Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:21:50.011921 containerd[1585]: time="2025-09-05T06:21:50.011886319Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:21:50.012889 containerd[1585]: time="2025-09-05T06:21:50.012854491Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 2.75774387s" Sep 5 06:21:50.012945 containerd[1585]: time="2025-09-05T06:21:50.012891520Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Sep 5 06:21:50.017281 containerd[1585]: time="2025-09-05T06:21:50.017250570Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 5 06:21:50.026747 containerd[1585]: time="2025-09-05T06:21:50.025901546Z" level=info msg="CreateContainer within sandbox \"6cdaf92853b540c3735ef038ac95ad6c9293cdf8ed8d5d47dc78cd7a8b9a95da\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 5 06:21:50.039590 containerd[1585]: time="2025-09-05T06:21:50.039541189Z" level=info msg="Container 69b510c3ef78dc6c5d0c4a134168b675db97d9455a73f43f8243232f75f510d8: CDI devices from CRI Config.CDIDevices: []" Sep 5 06:21:50.061707 containerd[1585]: time="2025-09-05T06:21:50.061636086Z" level=info msg="CreateContainer within sandbox \"6cdaf92853b540c3735ef038ac95ad6c9293cdf8ed8d5d47dc78cd7a8b9a95da\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"69b510c3ef78dc6c5d0c4a134168b675db97d9455a73f43f8243232f75f510d8\"" Sep 5 06:21:50.066123 containerd[1585]: time="2025-09-05T06:21:50.066074712Z" level=info msg="StartContainer for \"69b510c3ef78dc6c5d0c4a134168b675db97d9455a73f43f8243232f75f510d8\"" Sep 5 06:21:50.071017 containerd[1585]: time="2025-09-05T06:21:50.070985044Z" level=info msg="connecting to shim 
69b510c3ef78dc6c5d0c4a134168b675db97d9455a73f43f8243232f75f510d8" address="unix:///run/containerd/s/7decb934e993e1665d051ae4648aebe77ef55fd933e2dff2ad999807373b71b3" protocol=ttrpc version=3 Sep 5 06:21:50.121250 systemd-networkd[1494]: cali3e9ff311974: Link UP Sep 5 06:21:50.121972 systemd[1]: Started cri-containerd-69b510c3ef78dc6c5d0c4a134168b675db97d9455a73f43f8243232f75f510d8.scope - libcontainer container 69b510c3ef78dc6c5d0c4a134168b675db97d9455a73f43f8243232f75f510d8. Sep 5 06:21:50.126868 systemd-networkd[1494]: cali3e9ff311974: Gained carrier Sep 5 06:21:50.142343 containerd[1585]: 2025-09-05 06:21:49.997 [INFO][4829] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--cf68b5bff--zqmnv-eth0 calico-apiserver-cf68b5bff- calico-apiserver 26a5a23e-432c-4622-993d-2e73ea07fb80 860 0 2025-09-05 06:21:18 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:cf68b5bff projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-cf68b5bff-zqmnv eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali3e9ff311974 [] [] }} ContainerID="af0ef7bbd35d968654c1bac4c7ac7a4e1d3c43d2fe9a03953f506a0944a978f2" Namespace="calico-apiserver" Pod="calico-apiserver-cf68b5bff-zqmnv" WorkloadEndpoint="localhost-k8s-calico--apiserver--cf68b5bff--zqmnv-" Sep 5 06:21:50.142343 containerd[1585]: 2025-09-05 06:21:49.997 [INFO][4829] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="af0ef7bbd35d968654c1bac4c7ac7a4e1d3c43d2fe9a03953f506a0944a978f2" Namespace="calico-apiserver" Pod="calico-apiserver-cf68b5bff-zqmnv" WorkloadEndpoint="localhost-k8s-calico--apiserver--cf68b5bff--zqmnv-eth0" Sep 5 06:21:50.142343 containerd[1585]: 2025-09-05 06:21:50.032 [INFO][4848] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="af0ef7bbd35d968654c1bac4c7ac7a4e1d3c43d2fe9a03953f506a0944a978f2" HandleID="k8s-pod-network.af0ef7bbd35d968654c1bac4c7ac7a4e1d3c43d2fe9a03953f506a0944a978f2" Workload="localhost-k8s-calico--apiserver--cf68b5bff--zqmnv-eth0" Sep 5 06:21:50.142343 containerd[1585]: 2025-09-05 06:21:50.033 [INFO][4848] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="af0ef7bbd35d968654c1bac4c7ac7a4e1d3c43d2fe9a03953f506a0944a978f2" HandleID="k8s-pod-network.af0ef7bbd35d968654c1bac4c7ac7a4e1d3c43d2fe9a03953f506a0944a978f2" Workload="localhost-k8s-calico--apiserver--cf68b5bff--zqmnv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002c7030), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-cf68b5bff-zqmnv", "timestamp":"2025-09-05 06:21:50.032906746 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 5 06:21:50.142343 containerd[1585]: 2025-09-05 06:21:50.033 [INFO][4848] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 06:21:50.142343 containerd[1585]: 2025-09-05 06:21:50.033 [INFO][4848] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 5 06:21:50.142343 containerd[1585]: 2025-09-05 06:21:50.033 [INFO][4848] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 5 06:21:50.142343 containerd[1585]: 2025-09-05 06:21:50.042 [INFO][4848] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.af0ef7bbd35d968654c1bac4c7ac7a4e1d3c43d2fe9a03953f506a0944a978f2" host="localhost" Sep 5 06:21:50.142343 containerd[1585]: 2025-09-05 06:21:50.058 [INFO][4848] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 5 06:21:50.142343 containerd[1585]: 2025-09-05 06:21:50.074 [INFO][4848] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 5 06:21:50.142343 containerd[1585]: 2025-09-05 06:21:50.078 [INFO][4848] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 5 06:21:50.142343 containerd[1585]: 2025-09-05 06:21:50.084 [INFO][4848] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 5 06:21:50.142343 containerd[1585]: 2025-09-05 06:21:50.085 [INFO][4848] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.af0ef7bbd35d968654c1bac4c7ac7a4e1d3c43d2fe9a03953f506a0944a978f2" host="localhost" Sep 5 06:21:50.142343 containerd[1585]: 2025-09-05 06:21:50.087 [INFO][4848] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.af0ef7bbd35d968654c1bac4c7ac7a4e1d3c43d2fe9a03953f506a0944a978f2 Sep 5 06:21:50.142343 containerd[1585]: 2025-09-05 06:21:50.095 [INFO][4848] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.af0ef7bbd35d968654c1bac4c7ac7a4e1d3c43d2fe9a03953f506a0944a978f2" host="localhost" Sep 5 06:21:50.142343 containerd[1585]: 2025-09-05 06:21:50.105 [INFO][4848] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 handle="k8s-pod-network.af0ef7bbd35d968654c1bac4c7ac7a4e1d3c43d2fe9a03953f506a0944a978f2" host="localhost" Sep 5 06:21:50.142343 containerd[1585]: 2025-09-05 06:21:50.105 [INFO][4848] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.af0ef7bbd35d968654c1bac4c7ac7a4e1d3c43d2fe9a03953f506a0944a978f2" host="localhost" Sep 5 06:21:50.142343 containerd[1585]: 2025-09-05 06:21:50.105 [INFO][4848] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 5 06:21:50.142343 containerd[1585]: 2025-09-05 06:21:50.105 [INFO][4848] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="af0ef7bbd35d968654c1bac4c7ac7a4e1d3c43d2fe9a03953f506a0944a978f2" HandleID="k8s-pod-network.af0ef7bbd35d968654c1bac4c7ac7a4e1d3c43d2fe9a03953f506a0944a978f2" Workload="localhost-k8s-calico--apiserver--cf68b5bff--zqmnv-eth0" Sep 5 06:21:50.143208 containerd[1585]: 2025-09-05 06:21:50.113 [INFO][4829] cni-plugin/k8s.go 418: Populated endpoint ContainerID="af0ef7bbd35d968654c1bac4c7ac7a4e1d3c43d2fe9a03953f506a0944a978f2" Namespace="calico-apiserver" Pod="calico-apiserver-cf68b5bff-zqmnv" WorkloadEndpoint="localhost-k8s-calico--apiserver--cf68b5bff--zqmnv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--cf68b5bff--zqmnv-eth0", GenerateName:"calico-apiserver-cf68b5bff-", Namespace:"calico-apiserver", SelfLink:"", UID:"26a5a23e-432c-4622-993d-2e73ea07fb80", ResourceVersion:"860", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 6, 21, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"cf68b5bff", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-cf68b5bff-zqmnv", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali3e9ff311974", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 06:21:50.143208 containerd[1585]: 2025-09-05 06:21:50.113 [INFO][4829] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="af0ef7bbd35d968654c1bac4c7ac7a4e1d3c43d2fe9a03953f506a0944a978f2" Namespace="calico-apiserver" Pod="calico-apiserver-cf68b5bff-zqmnv" WorkloadEndpoint="localhost-k8s-calico--apiserver--cf68b5bff--zqmnv-eth0" Sep 5 06:21:50.143208 containerd[1585]: 2025-09-05 06:21:50.115 [INFO][4829] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3e9ff311974 ContainerID="af0ef7bbd35d968654c1bac4c7ac7a4e1d3c43d2fe9a03953f506a0944a978f2" Namespace="calico-apiserver" Pod="calico-apiserver-cf68b5bff-zqmnv" WorkloadEndpoint="localhost-k8s-calico--apiserver--cf68b5bff--zqmnv-eth0" Sep 5 06:21:50.143208 containerd[1585]: 2025-09-05 06:21:50.126 [INFO][4829] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="af0ef7bbd35d968654c1bac4c7ac7a4e1d3c43d2fe9a03953f506a0944a978f2" Namespace="calico-apiserver" Pod="calico-apiserver-cf68b5bff-zqmnv" WorkloadEndpoint="localhost-k8s-calico--apiserver--cf68b5bff--zqmnv-eth0" Sep 5 06:21:50.143208 containerd[1585]: 2025-09-05 06:21:50.127 [INFO][4829] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="af0ef7bbd35d968654c1bac4c7ac7a4e1d3c43d2fe9a03953f506a0944a978f2" Namespace="calico-apiserver" Pod="calico-apiserver-cf68b5bff-zqmnv" WorkloadEndpoint="localhost-k8s-calico--apiserver--cf68b5bff--zqmnv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--cf68b5bff--zqmnv-eth0", GenerateName:"calico-apiserver-cf68b5bff-", Namespace:"calico-apiserver", SelfLink:"", UID:"26a5a23e-432c-4622-993d-2e73ea07fb80", ResourceVersion:"860", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 6, 21, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"cf68b5bff", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"af0ef7bbd35d968654c1bac4c7ac7a4e1d3c43d2fe9a03953f506a0944a978f2", Pod:"calico-apiserver-cf68b5bff-zqmnv", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali3e9ff311974", MAC:"7e:45:18:3e:63:49", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 06:21:50.143208 containerd[1585]: 2025-09-05 06:21:50.138 [INFO][4829] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="af0ef7bbd35d968654c1bac4c7ac7a4e1d3c43d2fe9a03953f506a0944a978f2" Namespace="calico-apiserver" Pod="calico-apiserver-cf68b5bff-zqmnv" WorkloadEndpoint="localhost-k8s-calico--apiserver--cf68b5bff--zqmnv-eth0" Sep 5 06:21:50.171737 containerd[1585]: time="2025-09-05T06:21:50.171690308Z" level=info msg="connecting to shim af0ef7bbd35d968654c1bac4c7ac7a4e1d3c43d2fe9a03953f506a0944a978f2" address="unix:///run/containerd/s/16b2aef0cd553e773e939a8f3615bae8b70be1729ffeaa61eec0ec77dd9af8c5" namespace=k8s.io protocol=ttrpc version=3 Sep 5 06:21:50.172000 systemd-networkd[1494]: cali3d9556587c8: Gained IPv6LL Sep 5 06:21:50.199115 containerd[1585]: time="2025-09-05T06:21:50.198430706Z" level=info msg="StartContainer for \"69b510c3ef78dc6c5d0c4a134168b675db97d9455a73f43f8243232f75f510d8\" returns successfully" Sep 5 06:21:50.205023 systemd[1]: Started cri-containerd-af0ef7bbd35d968654c1bac4c7ac7a4e1d3c43d2fe9a03953f506a0944a978f2.scope - libcontainer container af0ef7bbd35d968654c1bac4c7ac7a4e1d3c43d2fe9a03953f506a0944a978f2. 
Sep 5 06:21:50.223039 systemd-resolved[1408]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 5 06:21:50.258545 containerd[1585]: time="2025-09-05T06:21:50.258493595Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-cf68b5bff-zqmnv,Uid:26a5a23e-432c-4622-993d-2e73ea07fb80,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"af0ef7bbd35d968654c1bac4c7ac7a4e1d3c43d2fe9a03953f506a0944a978f2\"" Sep 5 06:21:50.491984 systemd-networkd[1494]: calib8e21a313b6: Gained IPv6LL Sep 5 06:21:51.324015 systemd-networkd[1494]: cali3e9ff311974: Gained IPv6LL Sep 5 06:21:51.724571 containerd[1585]: time="2025-09-05T06:21:51.724524531Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-pbptv,Uid:6db321e6-384e-461f-b494-486fb3abaa39,Namespace:kube-system,Attempt:0,}" Sep 5 06:21:51.875019 systemd-networkd[1494]: cali891c9cbe0ad: Link UP Sep 5 06:21:51.876381 systemd-networkd[1494]: cali891c9cbe0ad: Gained carrier Sep 5 06:21:51.890894 containerd[1585]: 2025-09-05 06:21:51.761 [INFO][4948] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--674b8bbfcf--pbptv-eth0 coredns-674b8bbfcf- kube-system 6db321e6-384e-461f-b494-486fb3abaa39 852 0 2025-09-05 06:21:05 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-674b8bbfcf-pbptv eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali891c9cbe0ad [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="ed6a5ebc150869acbda82f238628520f5f4ae163e31e106b8b2aaa842be79982" Namespace="kube-system" Pod="coredns-674b8bbfcf-pbptv" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--pbptv-" Sep 5 06:21:51.890894 containerd[1585]: 2025-09-05 06:21:51.761 [INFO][4948] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ed6a5ebc150869acbda82f238628520f5f4ae163e31e106b8b2aaa842be79982" Namespace="kube-system" Pod="coredns-674b8bbfcf-pbptv" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--pbptv-eth0" Sep 5 06:21:51.890894 containerd[1585]: 2025-09-05 06:21:51.790 [INFO][4962] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ed6a5ebc150869acbda82f238628520f5f4ae163e31e106b8b2aaa842be79982" HandleID="k8s-pod-network.ed6a5ebc150869acbda82f238628520f5f4ae163e31e106b8b2aaa842be79982" Workload="localhost-k8s-coredns--674b8bbfcf--pbptv-eth0" Sep 5 06:21:51.890894 containerd[1585]: 2025-09-05 06:21:51.794 [INFO][4962] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ed6a5ebc150869acbda82f238628520f5f4ae163e31e106b8b2aaa842be79982" HandleID="k8s-pod-network.ed6a5ebc150869acbda82f238628520f5f4ae163e31e106b8b2aaa842be79982" Workload="localhost-k8s-coredns--674b8bbfcf--pbptv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002df0f0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-674b8bbfcf-pbptv", "timestamp":"2025-09-05 06:21:51.790061459 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 5 06:21:51.890894 containerd[1585]: 2025-09-05 06:21:51.794 [INFO][4962] ipam/ipam_plugin.go 353: About to 
acquire host-wide IPAM lock. Sep 5 06:21:51.890894 containerd[1585]: 2025-09-05 06:21:51.794 [INFO][4962] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 5 06:21:51.890894 containerd[1585]: 2025-09-05 06:21:51.794 [INFO][4962] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 5 06:21:51.890894 containerd[1585]: 2025-09-05 06:21:51.801 [INFO][4962] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ed6a5ebc150869acbda82f238628520f5f4ae163e31e106b8b2aaa842be79982" host="localhost" Sep 5 06:21:51.890894 containerd[1585]: 2025-09-05 06:21:51.807 [INFO][4962] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 5 06:21:51.890894 containerd[1585]: 2025-09-05 06:21:51.811 [INFO][4962] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 5 06:21:51.890894 containerd[1585]: 2025-09-05 06:21:51.812 [INFO][4962] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 5 06:21:51.890894 containerd[1585]: 2025-09-05 06:21:51.815 [INFO][4962] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 5 06:21:51.890894 containerd[1585]: 2025-09-05 06:21:51.815 [INFO][4962] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.ed6a5ebc150869acbda82f238628520f5f4ae163e31e106b8b2aaa842be79982" host="localhost" Sep 5 06:21:51.890894 containerd[1585]: 2025-09-05 06:21:51.816 [INFO][4962] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.ed6a5ebc150869acbda82f238628520f5f4ae163e31e106b8b2aaa842be79982 Sep 5 06:21:51.890894 containerd[1585]: 2025-09-05 06:21:51.838 [INFO][4962] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.ed6a5ebc150869acbda82f238628520f5f4ae163e31e106b8b2aaa842be79982" host="localhost" Sep 5 06:21:51.890894 containerd[1585]: 2025-09-05 06:21:51.869 [INFO][4962] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 handle="k8s-pod-network.ed6a5ebc150869acbda82f238628520f5f4ae163e31e106b8b2aaa842be79982" host="localhost" Sep 5 06:21:51.890894 containerd[1585]: 2025-09-05 06:21:51.869 [INFO][4962] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.ed6a5ebc150869acbda82f238628520f5f4ae163e31e106b8b2aaa842be79982" host="localhost" Sep 5 06:21:51.890894 containerd[1585]: 2025-09-05 06:21:51.869 [INFO][4962] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 5 06:21:51.890894 containerd[1585]: 2025-09-05 06:21:51.869 [INFO][4962] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="ed6a5ebc150869acbda82f238628520f5f4ae163e31e106b8b2aaa842be79982" HandleID="k8s-pod-network.ed6a5ebc150869acbda82f238628520f5f4ae163e31e106b8b2aaa842be79982" Workload="localhost-k8s-coredns--674b8bbfcf--pbptv-eth0" Sep 5 06:21:51.891518 containerd[1585]: 2025-09-05 06:21:51.872 [INFO][4948] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ed6a5ebc150869acbda82f238628520f5f4ae163e31e106b8b2aaa842be79982" Namespace="kube-system" Pod="coredns-674b8bbfcf-pbptv" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--pbptv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--pbptv-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"6db321e6-384e-461f-b494-486fb3abaa39", ResourceVersion:"852", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 6, 21, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-674b8bbfcf-pbptv", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali891c9cbe0ad", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 06:21:51.891518 containerd[1585]: 2025-09-05 06:21:51.872 [INFO][4948] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="ed6a5ebc150869acbda82f238628520f5f4ae163e31e106b8b2aaa842be79982" Namespace="kube-system" Pod="coredns-674b8bbfcf-pbptv" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--pbptv-eth0" Sep 5 06:21:51.891518 containerd[1585]: 2025-09-05 06:21:51.872 [INFO][4948] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali891c9cbe0ad ContainerID="ed6a5ebc150869acbda82f238628520f5f4ae163e31e106b8b2aaa842be79982" Namespace="kube-system" Pod="coredns-674b8bbfcf-pbptv" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--pbptv-eth0" Sep 5 06:21:51.891518 containerd[1585]: 2025-09-05 06:21:51.875 [INFO][4948] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ed6a5ebc150869acbda82f238628520f5f4ae163e31e106b8b2aaa842be79982" Namespace="kube-system" Pod="coredns-674b8bbfcf-pbptv" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--pbptv-eth0" Sep 5 06:21:51.891518 
containerd[1585]: 2025-09-05 06:21:51.877 [INFO][4948] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ed6a5ebc150869acbda82f238628520f5f4ae163e31e106b8b2aaa842be79982" Namespace="kube-system" Pod="coredns-674b8bbfcf-pbptv" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--pbptv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--pbptv-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"6db321e6-384e-461f-b494-486fb3abaa39", ResourceVersion:"852", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 6, 21, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"ed6a5ebc150869acbda82f238628520f5f4ae163e31e106b8b2aaa842be79982", Pod:"coredns-674b8bbfcf-pbptv", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali891c9cbe0ad", MAC:"0a:dc:a7:f9:8d:e7", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 06:21:51.891518 containerd[1585]: 2025-09-05 06:21:51.886 [INFO][4948] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ed6a5ebc150869acbda82f238628520f5f4ae163e31e106b8b2aaa842be79982" Namespace="kube-system" Pod="coredns-674b8bbfcf-pbptv" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--pbptv-eth0" Sep 5 06:21:51.892244 kubelet[2782]: I0905 06:21:51.892189 2782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-69b9887498-zbvfm" podStartSLOduration=4.101125817 podStartE2EDuration="8.892098775s" podCreationTimestamp="2025-09-05 06:21:43 +0000 UTC" firstStartedPulling="2025-09-05 06:21:45.225505722 +0000 UTC m=+46.632747601" lastFinishedPulling="2025-09-05 06:21:50.016478681 +0000 UTC m=+51.423720559" observedRunningTime="2025-09-05 06:21:50.996554864 +0000 UTC m=+52.403796732" watchObservedRunningTime="2025-09-05 06:21:51.892098775 +0000 UTC m=+53.299340653" Sep 5 06:21:51.920167 containerd[1585]: time="2025-09-05T06:21:51.920122248Z" level=info msg="connecting to shim ed6a5ebc150869acbda82f238628520f5f4ae163e31e106b8b2aaa842be79982" address="unix:///run/containerd/s/6f7f84f54331da58ec79b94d0f9bee054f1cdfb514f687d2f0b80f6bf77b1eef" namespace=k8s.io protocol=ttrpc version=3 Sep 5 06:21:51.950965 systemd[1]: Started 
cri-containerd-ed6a5ebc150869acbda82f238628520f5f4ae163e31e106b8b2aaa842be79982.scope - libcontainer container ed6a5ebc150869acbda82f238628520f5f4ae163e31e106b8b2aaa842be79982. Sep 5 06:21:51.964299 systemd-resolved[1408]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 5 06:21:51.993981 containerd[1585]: time="2025-09-05T06:21:51.993868435Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-pbptv,Uid:6db321e6-384e-461f-b494-486fb3abaa39,Namespace:kube-system,Attempt:0,} returns sandbox id \"ed6a5ebc150869acbda82f238628520f5f4ae163e31e106b8b2aaa842be79982\"" Sep 5 06:21:52.000060 containerd[1585]: time="2025-09-05T06:21:52.000028706Z" level=info msg="CreateContainer within sandbox \"ed6a5ebc150869acbda82f238628520f5f4ae163e31e106b8b2aaa842be79982\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 5 06:21:52.010174 containerd[1585]: time="2025-09-05T06:21:52.009656043Z" level=info msg="Container f612bdb8ba7c3cdfdfe468d07a6a9877e95043f261577962b5a1219213c32206: CDI devices from CRI Config.CDIDevices: []" Sep 5 06:21:52.015796 containerd[1585]: time="2025-09-05T06:21:52.015755606Z" level=info msg="CreateContainer within sandbox \"ed6a5ebc150869acbda82f238628520f5f4ae163e31e106b8b2aaa842be79982\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"f612bdb8ba7c3cdfdfe468d07a6a9877e95043f261577962b5a1219213c32206\"" Sep 5 06:21:52.016199 containerd[1585]: time="2025-09-05T06:21:52.016166158Z" level=info msg="StartContainer for \"f612bdb8ba7c3cdfdfe468d07a6a9877e95043f261577962b5a1219213c32206\"" Sep 5 06:21:52.017098 containerd[1585]: time="2025-09-05T06:21:52.017061169Z" level=info msg="connecting to shim f612bdb8ba7c3cdfdfe468d07a6a9877e95043f261577962b5a1219213c32206" address="unix:///run/containerd/s/6f7f84f54331da58ec79b94d0f9bee054f1cdfb514f687d2f0b80f6bf77b1eef" protocol=ttrpc version=3 Sep 5 06:21:52.036972 systemd[1]: Started cri-containerd-f612bdb8ba7c3cdfdfe468d07a6a9877e95043f261577962b5a1219213c32206.scope - libcontainer container f612bdb8ba7c3cdfdfe468d07a6a9877e95043f261577962b5a1219213c32206. Sep 5 06:21:52.070831 containerd[1585]: time="2025-09-05T06:21:52.070719727Z" level=info msg="StartContainer for \"f612bdb8ba7c3cdfdfe468d07a6a9877e95043f261577962b5a1219213c32206\" returns successfully" Sep 5 06:21:52.926194 kubelet[2782]: I0905 06:21:52.926072 2782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-pbptv" podStartSLOduration=47.926053277 podStartE2EDuration="47.926053277s" podCreationTimestamp="2025-09-05 06:21:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-05 06:21:52.925663895 +0000 UTC m=+54.332905783" watchObservedRunningTime="2025-09-05 06:21:52.926053277 +0000 UTC m=+54.333295155" Sep 5 06:21:52.994634 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1197523504.mount: Deactivated successfully. Sep 5 06:21:53.245040 systemd-networkd[1494]: cali891c9cbe0ad: Gained IPv6LL Sep 5 06:21:54.066997 systemd[1]: Started sshd@9-10.0.0.140:22-10.0.0.1:33540.service - OpenSSH per-connection server daemon (10.0.0.1:33540). 
Sep 5 06:21:54.072834 containerd[1585]: time="2025-09-05T06:21:54.072138083Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:21:54.073566 containerd[1585]: time="2025-09-05T06:21:54.073535403Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526" Sep 5 06:21:54.074797 containerd[1585]: time="2025-09-05T06:21:54.074771233Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:21:54.080216 containerd[1585]: time="2025-09-05T06:21:54.079909665Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:21:54.082015 containerd[1585]: time="2025-09-05T06:21:54.081827884Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 4.064533834s" Sep 5 06:21:54.082362 containerd[1585]: time="2025-09-05T06:21:54.082329127Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\"" Sep 5 06:21:54.084259 containerd[1585]: time="2025-09-05T06:21:54.084226248Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 5 06:21:54.088455 containerd[1585]: time="2025-09-05T06:21:54.088378134Z" level=info msg="CreateContainer within sandbox \"8dd1e5c8da23745cb382a65ccee19945d5ea27b07cee451c1cc8b861df355fa9\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 5 06:21:54.100240 containerd[1585]: time="2025-09-05T06:21:54.100133239Z" level=info msg="Container 49989e3e8db6d49ed804513522cb5a70ea775b2a0fa61242d86a81a72dc35da5: CDI devices from CRI Config.CDIDevices: []" Sep 5 06:21:54.294045 sshd[5084]: Accepted publickey for core from 10.0.0.1 port 33540 ssh2: RSA SHA256:HqMeIfrf4KQXAwx0HmR8sqFIxN2NpL8j8iF6H1odrA8 Sep 5 06:21:54.296013 sshd-session[5084]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 06:21:54.302506 systemd-logind[1570]: New session 10 of user core. Sep 5 06:21:54.312963 systemd[1]: Started session-10.scope - Session 10 of User core. Sep 5 06:21:54.615348 sshd[5087]: Connection closed by 10.0.0.1 port 33540 Sep 5 06:21:54.616066 sshd-session[5084]: pam_unix(sshd:session): session closed for user core Sep 5 06:21:54.623432 systemd-logind[1570]: Session 10 logged out. Waiting for processes to exit. Sep 5 06:21:54.624033 systemd[1]: sshd@9-10.0.0.140:22-10.0.0.1:33540.service: Deactivated successfully. Sep 5 06:21:54.627418 systemd[1]: session-10.scope: Deactivated successfully. 
Sep 5 06:21:54.627869 containerd[1585]: time="2025-09-05T06:21:54.627456559Z" level=info msg="CreateContainer within sandbox \"8dd1e5c8da23745cb382a65ccee19945d5ea27b07cee451c1cc8b861df355fa9\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"49989e3e8db6d49ed804513522cb5a70ea775b2a0fa61242d86a81a72dc35da5\"" Sep 5 06:21:54.630194 containerd[1585]: time="2025-09-05T06:21:54.629972150Z" level=info msg="StartContainer for \"49989e3e8db6d49ed804513522cb5a70ea775b2a0fa61242d86a81a72dc35da5\"" Sep 5 06:21:54.632850 containerd[1585]: time="2025-09-05T06:21:54.631859022Z" level=info msg="connecting to shim 49989e3e8db6d49ed804513522cb5a70ea775b2a0fa61242d86a81a72dc35da5" address="unix:///run/containerd/s/ec334388f11cafdee212e36d7e0637b0b5dd92cbafb441a5451631336d0e7712" protocol=ttrpc version=3 Sep 5 06:21:54.630870 systemd-logind[1570]: Removed session 10. Sep 5 06:21:54.692493 systemd[1]: Started cri-containerd-49989e3e8db6d49ed804513522cb5a70ea775b2a0fa61242d86a81a72dc35da5.scope - libcontainer container 49989e3e8db6d49ed804513522cb5a70ea775b2a0fa61242d86a81a72dc35da5. Sep 5 06:21:54.880782 containerd[1585]: time="2025-09-05T06:21:54.880736739Z" level=info msg="StartContainer for \"49989e3e8db6d49ed804513522cb5a70ea775b2a0fa61242d86a81a72dc35da5\" returns successfully" Sep 5 06:21:55.007028 containerd[1585]: time="2025-09-05T06:21:55.006987626Z" level=info msg="TaskExit event in podsandbox handler container_id:\"49989e3e8db6d49ed804513522cb5a70ea775b2a0fa61242d86a81a72dc35da5\" id:\"68266d9aabae0a57422c9d65ecb508479922258aa911e51327de08b46093f1dc\" pid:5148 exit_status:1 exited_at:{seconds:1757053315 nanos:6524143}" Sep 5 06:21:55.064153 kubelet[2782]: I0905 06:21:55.064062 2782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-kg56r" podStartSLOduration=28.963336886 podStartE2EDuration="35.064039011s" podCreationTimestamp="2025-09-05 06:21:20 +0000 UTC" firstStartedPulling="2025-09-05 06:21:47.982684951 +0000 UTC m=+49.389926829" lastFinishedPulling="2025-09-05 06:21:54.083387086 +0000 UTC m=+55.490628954" observedRunningTime="2025-09-05 06:21:55.061992769 +0000 UTC m=+56.469234657" watchObservedRunningTime="2025-09-05 06:21:55.064039011 +0000 UTC m=+56.471280889" Sep 5 06:21:56.001614 containerd[1585]: time="2025-09-05T06:21:56.001558270Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:21:56.002732 containerd[1585]: time="2025-09-05T06:21:56.002685612Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527" Sep 5 06:21:56.003902 containerd[1585]: time="2025-09-05T06:21:56.003876985Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:21:56.006057 containerd[1585]: time="2025-09-05T06:21:56.006019601Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:21:56.006441 containerd[1585]: time="2025-09-05T06:21:56.006417123Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest 
\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 1.922158886s" Sep 5 06:21:56.006492 containerd[1585]: time="2025-09-05T06:21:56.006445506Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\"" Sep 5 06:21:56.008169 containerd[1585]: time="2025-09-05T06:21:56.007984196Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 5 06:21:56.008169 containerd[1585]: time="2025-09-05T06:21:56.007997080Z" level=info msg="TaskExit event in podsandbox handler container_id:\"49989e3e8db6d49ed804513522cb5a70ea775b2a0fa61242d86a81a72dc35da5\" id:\"ab71d43cf84163f89727bb9ad3182389f7a82fb7dcec4ee551e2da8f2e41a14c\" pid:5179 exit_status:1 exited_at:{seconds:1757053316 nanos:7520000}" Sep 5 06:21:56.012127 containerd[1585]: time="2025-09-05T06:21:56.012095284Z" level=info msg="CreateContainer within sandbox \"c22ffdaabf2085436ff13476b7228183f171e36ef59869c7f37ce5d357cdc86a\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 5 06:21:56.024430 containerd[1585]: time="2025-09-05T06:21:56.024396128Z" level=info msg="Container be2a0d4bc6ab22aafd3555008de21d07baefd751b7ecb32cbe232901c50d04c3: CDI devices from CRI Config.CDIDevices: []" Sep 5 06:21:56.033114 containerd[1585]: time="2025-09-05T06:21:56.033087099Z" level=info msg="CreateContainer within sandbox \"c22ffdaabf2085436ff13476b7228183f171e36ef59869c7f37ce5d357cdc86a\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"be2a0d4bc6ab22aafd3555008de21d07baefd751b7ecb32cbe232901c50d04c3\"" Sep 5 06:21:56.033667 containerd[1585]: time="2025-09-05T06:21:56.033618191Z" level=info msg="StartContainer for \"be2a0d4bc6ab22aafd3555008de21d07baefd751b7ecb32cbe232901c50d04c3\"" Sep 5 06:21:56.035545 containerd[1585]: time="2025-09-05T06:21:56.035511492Z" level=info msg="connecting to shim be2a0d4bc6ab22aafd3555008de21d07baefd751b7ecb32cbe232901c50d04c3" address="unix:///run/containerd/s/a9b091bb5299d4a100543273f50985c88b7678a4362b363a9641006ba3596c31" protocol=ttrpc version=3 Sep 5 06:21:56.057956 systemd[1]: Started cri-containerd-be2a0d4bc6ab22aafd3555008de21d07baefd751b7ecb32cbe232901c50d04c3.scope - libcontainer container be2a0d4bc6ab22aafd3555008de21d07baefd751b7ecb32cbe232901c50d04c3. 
Sep 5 06:21:56.108916 containerd[1585]: time="2025-09-05T06:21:56.108863986Z" level=info msg="StartContainer for \"be2a0d4bc6ab22aafd3555008de21d07baefd751b7ecb32cbe232901c50d04c3\" returns successfully" Sep 5 06:21:56.999151 containerd[1585]: time="2025-09-05T06:21:56.999071416Z" level=info msg="TaskExit event in podsandbox handler container_id:\"49989e3e8db6d49ed804513522cb5a70ea775b2a0fa61242d86a81a72dc35da5\" id:\"52385417edd4f855e99cea97fddb86f372513a8b671e5b47c614308faa1ca129\" pid:5231 exit_status:1 exited_at:{seconds:1757053316 nanos:998704502}" Sep 5 06:21:58.612313 containerd[1585]: time="2025-09-05T06:21:58.612246542Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:21:58.613053 containerd[1585]: time="2025-09-05T06:21:58.613019758Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864" Sep 5 06:21:58.614242 containerd[1585]: time="2025-09-05T06:21:58.614183123Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:21:58.616371 containerd[1585]: time="2025-09-05T06:21:58.616322832Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:21:58.616977 containerd[1585]: time="2025-09-05T06:21:58.616941490Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 2.608924133s" Sep 5 06:21:58.616977 containerd[1585]: time="2025-09-05T06:21:58.616973819Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 5 06:21:58.618887 containerd[1585]: time="2025-09-05T06:21:58.618348831Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 5 06:21:58.623488 containerd[1585]: time="2025-09-05T06:21:58.623456348Z" level=info msg="CreateContainer within sandbox \"0f19292639f365239c2b7e5a9da7ccd1741e3253ef7ee17824c2b6191a4eaf2f\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 5 06:21:58.631833 containerd[1585]: time="2025-09-05T06:21:58.631774497Z" level=info msg="Container e4d7e7737ad6674bcc823b2944010bda924ec8200f60af9060064a023b1385ca: CDI devices from CRI Config.CDIDevices: []" Sep 5 06:21:58.640087 containerd[1585]: time="2025-09-05T06:21:58.640049226Z" level=info msg="CreateContainer within sandbox \"0f19292639f365239c2b7e5a9da7ccd1741e3253ef7ee17824c2b6191a4eaf2f\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"e4d7e7737ad6674bcc823b2944010bda924ec8200f60af9060064a023b1385ca\"" Sep 5 06:21:58.640947 containerd[1585]: time="2025-09-05T06:21:58.640914383Z" level=info msg="StartContainer for \"e4d7e7737ad6674bcc823b2944010bda924ec8200f60af9060064a023b1385ca\"" Sep 5 06:21:58.641904 containerd[1585]: time="2025-09-05T06:21:58.641877684Z" level=info msg="connecting to shim 
e4d7e7737ad6674bcc823b2944010bda924ec8200f60af9060064a023b1385ca" address="unix:///run/containerd/s/8ee52e66a1cb8b227e257a32cbb2819d826fc1fb37eee76ef4d1d6536f5681e9" protocol=ttrpc version=3 Sep 5 06:21:58.674984 systemd[1]: Started cri-containerd-e4d7e7737ad6674bcc823b2944010bda924ec8200f60af9060064a023b1385ca.scope - libcontainer container e4d7e7737ad6674bcc823b2944010bda924ec8200f60af9060064a023b1385ca. Sep 5 06:21:58.728943 containerd[1585]: time="2025-09-05T06:21:58.728885112Z" level=info msg="StartContainer for \"e4d7e7737ad6674bcc823b2944010bda924ec8200f60af9060064a023b1385ca\" returns successfully" Sep 5 06:21:58.939694 kubelet[2782]: I0905 06:21:58.939303 2782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-cf68b5bff-qcvfw" podStartSLOduration=30.530127768 podStartE2EDuration="40.939283692s" podCreationTimestamp="2025-09-05 06:21:18 +0000 UTC" firstStartedPulling="2025-09-05 06:21:48.208738025 +0000 UTC m=+49.615979903" lastFinishedPulling="2025-09-05 06:21:58.617893949 +0000 UTC m=+60.025135827" observedRunningTime="2025-09-05 06:21:58.939083569 +0000 UTC m=+60.346325447" watchObservedRunningTime="2025-09-05 06:21:58.939283692 +0000 UTC m=+60.346525570" Sep 5 06:21:59.638892 systemd[1]: Started sshd@10-10.0.0.140:22-10.0.0.1:33542.service - OpenSSH per-connection server daemon (10.0.0.1:33542). Sep 5 06:21:59.709967 sshd[5302]: Accepted publickey for core from 10.0.0.1 port 33542 ssh2: RSA SHA256:HqMeIfrf4KQXAwx0HmR8sqFIxN2NpL8j8iF6H1odrA8 Sep 5 06:21:59.712242 sshd-session[5302]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 06:21:59.716779 systemd-logind[1570]: New session 11 of user core. Sep 5 06:21:59.725945 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 5 06:21:59.860077 sshd[5305]: Connection closed by 10.0.0.1 port 33542 Sep 5 06:21:59.860468 sshd-session[5302]: pam_unix(sshd:session): session closed for user core Sep 5 06:21:59.876804 systemd[1]: sshd@10-10.0.0.140:22-10.0.0.1:33542.service: Deactivated successfully. Sep 5 06:21:59.878991 systemd[1]: session-11.scope: Deactivated successfully. Sep 5 06:21:59.879754 systemd-logind[1570]: Session 11 logged out. Waiting for processes to exit. Sep 5 06:21:59.881541 systemd-logind[1570]: Removed session 11. Sep 5 06:21:59.883271 systemd[1]: Started sshd@11-10.0.0.140:22-10.0.0.1:33550.service - OpenSSH per-connection server daemon (10.0.0.1:33550). Sep 5 06:21:59.930981 kubelet[2782]: I0905 06:21:59.930940 2782 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 5 06:21:59.941313 sshd[5319]: Accepted publickey for core from 10.0.0.1 port 33550 ssh2: RSA SHA256:HqMeIfrf4KQXAwx0HmR8sqFIxN2NpL8j8iF6H1odrA8 Sep 5 06:21:59.942995 sshd-session[5319]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 06:21:59.947942 systemd-logind[1570]: New session 12 of user core. Sep 5 06:21:59.954948 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 5 06:22:00.114405 sshd[5322]: Connection closed by 10.0.0.1 port 33550 Sep 5 06:22:00.116469 sshd-session[5319]: pam_unix(sshd:session): session closed for user core Sep 5 06:22:00.126212 systemd[1]: sshd@11-10.0.0.140:22-10.0.0.1:33550.service: Deactivated successfully. Sep 5 06:22:00.129599 systemd[1]: session-12.scope: Deactivated successfully. Sep 5 06:22:00.132994 systemd-logind[1570]: Session 12 logged out. Waiting for processes to exit. 
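The pod_startup_latency_tracker entry above for calico-apiserver-cf68b5bff-qcvfw is internally consistent if podStartE2EDuration is observedRunningTime minus podCreationTimestamp, and podStartSLOduration is that figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling). The short sketch below just redoes that arithmetic with the logged values; the decomposition is inferred from the numbers, not quoted from kubelet source.

    def secs(minute: int, second: float) -> float:
        """Seconds since 06:00:00 UTC on 2025-09-05 (all four stamps share that hour)."""
        return minute * 60 + second

    created  = secs(21, 18.0)           # podCreationTimestamp 06:21:18
    pull_beg = secs(21, 48.208738025)   # firstStartedPulling
    pull_end = secs(21, 58.617893949)   # lastFinishedPulling
    running  = secs(21, 58.939283692)   # observedRunningTime

    e2e = running - created             # 40.939283692 -> podStartE2EDuration
    slo = e2e - (pull_end - pull_beg)   # 30.530127768 -> podStartSLOduration
    print(round(e2e, 9), round(slo, 9))

The other "Observed pod startup duration" entries in this log reconcile the same way, to within the small difference between their wall-clock and monotonic (m=+...) stamps.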
Sep 5 06:22:00.137007 systemd[1]: Started sshd@12-10.0.0.140:22-10.0.0.1:55480.service - OpenSSH per-connection server daemon (10.0.0.1:55480). Sep 5 06:22:00.139482 systemd-logind[1570]: Removed session 12. Sep 5 06:22:00.186881 sshd[5333]: Accepted publickey for core from 10.0.0.1 port 55480 ssh2: RSA SHA256:HqMeIfrf4KQXAwx0HmR8sqFIxN2NpL8j8iF6H1odrA8 Sep 5 06:22:00.188685 sshd-session[5333]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 06:22:00.193715 systemd-logind[1570]: New session 13 of user core. Sep 5 06:22:00.203994 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 5 06:22:00.328934 sshd[5336]: Connection closed by 10.0.0.1 port 55480 Sep 5 06:22:00.329334 sshd-session[5333]: pam_unix(sshd:session): session closed for user core Sep 5 06:22:00.334878 systemd[1]: sshd@12-10.0.0.140:22-10.0.0.1:55480.service: Deactivated successfully. Sep 5 06:22:00.337207 systemd[1]: session-13.scope: Deactivated successfully. Sep 5 06:22:00.339180 systemd-logind[1570]: Session 13 logged out. Waiting for processes to exit. Sep 5 06:22:00.340728 systemd-logind[1570]: Removed session 13. Sep 5 06:22:02.085687 containerd[1585]: time="2025-09-05T06:22:02.085580676Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:22:02.086350 containerd[1585]: time="2025-09-05T06:22:02.086284957Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746" Sep 5 06:22:02.087488 containerd[1585]: time="2025-09-05T06:22:02.087442991Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:22:02.090404 containerd[1585]: time="2025-09-05T06:22:02.090343605Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:22:02.091310 containerd[1585]: time="2025-09-05T06:22:02.091228776Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 3.472839642s" Sep 5 06:22:02.091310 containerd[1585]: time="2025-09-05T06:22:02.091300441Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\"" Sep 5 06:22:02.092496 containerd[1585]: time="2025-09-05T06:22:02.092471218Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 5 06:22:02.106469 containerd[1585]: time="2025-09-05T06:22:02.106428192Z" level=info msg="CreateContainer within sandbox \"cca08677e55226ec7ca2b462cc43a154648b5904ff7adea6b090eb58a3d707f6\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 5 06:22:02.116084 containerd[1585]: time="2025-09-05T06:22:02.116053767Z" level=info msg="Container 1020380adf3ebdb07c806add7d45925416134036db97449a3096f3fca0a232e1: CDI devices from CRI Config.CDIDevices: []" Sep 5 06:22:02.124765 containerd[1585]: 
time="2025-09-05T06:22:02.124717286Z" level=info msg="CreateContainer within sandbox \"cca08677e55226ec7ca2b462cc43a154648b5904ff7adea6b090eb58a3d707f6\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"1020380adf3ebdb07c806add7d45925416134036db97449a3096f3fca0a232e1\"" Sep 5 06:22:02.125575 containerd[1585]: time="2025-09-05T06:22:02.125340727Z" level=info msg="StartContainer for \"1020380adf3ebdb07c806add7d45925416134036db97449a3096f3fca0a232e1\"" Sep 5 06:22:02.126840 containerd[1585]: time="2025-09-05T06:22:02.126784015Z" level=info msg="connecting to shim 1020380adf3ebdb07c806add7d45925416134036db97449a3096f3fca0a232e1" address="unix:///run/containerd/s/67274d3d49e4ff9f18b90c7e91db447d74dd4127a23c65d628f8e4f4b048c7cd" protocol=ttrpc version=3 Sep 5 06:22:02.155957 systemd[1]: Started cri-containerd-1020380adf3ebdb07c806add7d45925416134036db97449a3096f3fca0a232e1.scope - libcontainer container 1020380adf3ebdb07c806add7d45925416134036db97449a3096f3fca0a232e1. Sep 5 06:22:02.208117 containerd[1585]: time="2025-09-05T06:22:02.208075240Z" level=info msg="StartContainer for \"1020380adf3ebdb07c806add7d45925416134036db97449a3096f3fca0a232e1\" returns successfully" Sep 5 06:22:02.536516 containerd[1585]: time="2025-09-05T06:22:02.536443833Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:22:02.537179 containerd[1585]: time="2025-09-05T06:22:02.537113840Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 5 06:22:02.539180 containerd[1585]: time="2025-09-05T06:22:02.539149090Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 446.645491ms" Sep 5 06:22:02.539180 containerd[1585]: time="2025-09-05T06:22:02.539178004Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 5 06:22:02.540269 containerd[1585]: time="2025-09-05T06:22:02.540223396Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 5 06:22:02.545428 containerd[1585]: time="2025-09-05T06:22:02.545362551Z" level=info msg="CreateContainer within sandbox \"af0ef7bbd35d968654c1bac4c7ac7a4e1d3c43d2fe9a03953f506a0944a978f2\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 5 06:22:02.554208 containerd[1585]: time="2025-09-05T06:22:02.554170101Z" level=info msg="Container 48eb955064f4092df7d7fd2f11842eb8bae4f63294f5281d1817bb4b128c1736: CDI devices from CRI Config.CDIDevices: []" Sep 5 06:22:02.564314 containerd[1585]: time="2025-09-05T06:22:02.564273332Z" level=info msg="CreateContainer within sandbox \"af0ef7bbd35d968654c1bac4c7ac7a4e1d3c43d2fe9a03953f506a0944a978f2\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"48eb955064f4092df7d7fd2f11842eb8bae4f63294f5281d1817bb4b128c1736\"" Sep 5 06:22:02.564931 containerd[1585]: time="2025-09-05T06:22:02.564738866Z" level=info msg="StartContainer for \"48eb955064f4092df7d7fd2f11842eb8bae4f63294f5281d1817bb4b128c1736\"" Sep 5 06:22:02.565739 containerd[1585]: 
time="2025-09-05T06:22:02.565716220Z" level=info msg="connecting to shim 48eb955064f4092df7d7fd2f11842eb8bae4f63294f5281d1817bb4b128c1736" address="unix:///run/containerd/s/16b2aef0cd553e773e939a8f3615bae8b70be1729ffeaa61eec0ec77dd9af8c5" protocol=ttrpc version=3 Sep 5 06:22:02.595986 systemd[1]: Started cri-containerd-48eb955064f4092df7d7fd2f11842eb8bae4f63294f5281d1817bb4b128c1736.scope - libcontainer container 48eb955064f4092df7d7fd2f11842eb8bae4f63294f5281d1817bb4b128c1736. Sep 5 06:22:02.650856 containerd[1585]: time="2025-09-05T06:22:02.650801736Z" level=info msg="StartContainer for \"48eb955064f4092df7d7fd2f11842eb8bae4f63294f5281d1817bb4b128c1736\" returns successfully" Sep 5 06:22:03.006278 kubelet[2782]: I0905 06:22:03.006213 2782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-7b8d7b8bf-wqzvf" podStartSLOduration=28.913324522 podStartE2EDuration="42.006196338s" podCreationTimestamp="2025-09-05 06:21:21 +0000 UTC" firstStartedPulling="2025-09-05 06:21:48.999370452 +0000 UTC m=+50.406612320" lastFinishedPulling="2025-09-05 06:22:02.092242248 +0000 UTC m=+63.499484136" observedRunningTime="2025-09-05 06:22:03.005668998 +0000 UTC m=+64.412910876" watchObservedRunningTime="2025-09-05 06:22:03.006196338 +0000 UTC m=+64.413438216" Sep 5 06:22:03.068624 kubelet[2782]: I0905 06:22:03.068474 2782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-cf68b5bff-zqmnv" podStartSLOduration=32.788331758 podStartE2EDuration="45.06845474s" podCreationTimestamp="2025-09-05 06:21:18 +0000 UTC" firstStartedPulling="2025-09-05 06:21:50.259922901 +0000 UTC m=+51.667164779" lastFinishedPulling="2025-09-05 06:22:02.540045863 +0000 UTC m=+63.947287761" observedRunningTime="2025-09-05 06:22:03.06808732 +0000 UTC m=+64.475329198" watchObservedRunningTime="2025-09-05 06:22:03.06845474 +0000 UTC m=+64.475696608" Sep 5 06:22:03.948253 kubelet[2782]: I0905 06:22:03.948178 2782 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 5 06:22:04.022181 containerd[1585]: time="2025-09-05T06:22:04.022127641Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1020380adf3ebdb07c806add7d45925416134036db97449a3096f3fca0a232e1\" id:\"993a21271930d4ca040e2e7d0ee928b6dcfb95d7bd746e9c2d568e95e427b467\" pid:5451 exited_at:{seconds:1757053324 nanos:21685499}" Sep 5 06:22:04.711611 containerd[1585]: time="2025-09-05T06:22:04.711509072Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:22:04.712338 containerd[1585]: time="2025-09-05T06:22:04.712242552Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542" Sep 5 06:22:04.713725 containerd[1585]: time="2025-09-05T06:22:04.713670416Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:22:04.716113 containerd[1585]: time="2025-09-05T06:22:04.716050772Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 06:22:04.716866 containerd[1585]: time="2025-09-05T06:22:04.716778330Z" level=info msg="Pulled image 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 2.176518786s" Sep 5 06:22:04.716927 containerd[1585]: time="2025-09-05T06:22:04.716868880Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\"" Sep 5 06:22:04.727837 containerd[1585]: time="2025-09-05T06:22:04.727758505Z" level=info msg="CreateContainer within sandbox \"c22ffdaabf2085436ff13476b7228183f171e36ef59869c7f37ce5d357cdc86a\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 5 06:22:04.745899 containerd[1585]: time="2025-09-05T06:22:04.745796008Z" level=info msg="Container f8092a89f2863a1544d409b106965b8a24386454ae1ed76c5adbb981d03e448c: CDI devices from CRI Config.CDIDevices: []" Sep 5 06:22:04.762997 containerd[1585]: time="2025-09-05T06:22:04.762927176Z" level=info msg="CreateContainer within sandbox \"c22ffdaabf2085436ff13476b7228183f171e36ef59869c7f37ce5d357cdc86a\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"f8092a89f2863a1544d409b106965b8a24386454ae1ed76c5adbb981d03e448c\"" Sep 5 06:22:04.763625 containerd[1585]: time="2025-09-05T06:22:04.763576517Z" level=info msg="StartContainer for \"f8092a89f2863a1544d409b106965b8a24386454ae1ed76c5adbb981d03e448c\"" Sep 5 06:22:04.765266 containerd[1585]: time="2025-09-05T06:22:04.765224295Z" level=info msg="connecting to shim f8092a89f2863a1544d409b106965b8a24386454ae1ed76c5adbb981d03e448c" address="unix:///run/containerd/s/a9b091bb5299d4a100543273f50985c88b7678a4362b363a9641006ba3596c31" protocol=ttrpc version=3 Sep 5 06:22:04.806142 systemd[1]: Started cri-containerd-f8092a89f2863a1544d409b106965b8a24386454ae1ed76c5adbb981d03e448c.scope - libcontainer container f8092a89f2863a1544d409b106965b8a24386454ae1ed76c5adbb981d03e448c. Sep 5 06:22:05.110759 containerd[1585]: time="2025-09-05T06:22:05.110547627Z" level=info msg="StartContainer for \"f8092a89f2863a1544d409b106965b8a24386454ae1ed76c5adbb981d03e448c\" returns successfully" Sep 5 06:22:05.345104 systemd[1]: Started sshd@13-10.0.0.140:22-10.0.0.1:55482.service - OpenSSH per-connection server daemon (10.0.0.1:55482). Sep 5 06:22:05.423279 sshd[5502]: Accepted publickey for core from 10.0.0.1 port 55482 ssh2: RSA SHA256:HqMeIfrf4KQXAwx0HmR8sqFIxN2NpL8j8iF6H1odrA8 Sep 5 06:22:05.426342 sshd-session[5502]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 06:22:05.432637 systemd-logind[1570]: New session 14 of user core. Sep 5 06:22:05.442959 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 5 06:22:05.604987 sshd[5505]: Connection closed by 10.0.0.1 port 55482 Sep 5 06:22:05.605474 sshd-session[5502]: pam_unix(sshd:session): session closed for user core Sep 5 06:22:05.610606 systemd[1]: sshd@13-10.0.0.140:22-10.0.0.1:55482.service: Deactivated successfully. Sep 5 06:22:05.613414 systemd[1]: session-14.scope: Deactivated successfully. Sep 5 06:22:05.615417 systemd-logind[1570]: Session 14 logged out. Waiting for processes to exit. Sep 5 06:22:05.618307 systemd-logind[1570]: Removed session 14. 
Sep 5 06:22:05.816030 kubelet[2782]: I0905 06:22:05.815898 2782 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 5 06:22:05.817792 kubelet[2782]: I0905 06:22:05.817756 2782 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 5 06:22:06.140183 kubelet[2782]: I0905 06:22:06.139782 2782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-fmvm2" podStartSLOduration=28.514233954 podStartE2EDuration="45.139758603s" podCreationTimestamp="2025-09-05 06:21:21 +0000 UTC" firstStartedPulling="2025-09-05 06:21:48.097498723 +0000 UTC m=+49.504740591" lastFinishedPulling="2025-09-05 06:22:04.723023372 +0000 UTC m=+66.130265240" observedRunningTime="2025-09-05 06:22:06.13931168 +0000 UTC m=+67.546553548" watchObservedRunningTime="2025-09-05 06:22:06.139758603 +0000 UTC m=+67.547000481" Sep 5 06:22:07.675092 kubelet[2782]: I0905 06:22:07.675027 2782 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 5 06:22:10.622003 systemd[1]: Started sshd@14-10.0.0.140:22-10.0.0.1:55450.service - OpenSSH per-connection server daemon (10.0.0.1:55450). Sep 5 06:22:10.688724 sshd[5529]: Accepted publickey for core from 10.0.0.1 port 55450 ssh2: RSA SHA256:HqMeIfrf4KQXAwx0HmR8sqFIxN2NpL8j8iF6H1odrA8 Sep 5 06:22:10.690119 sshd-session[5529]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 06:22:10.694944 systemd-logind[1570]: New session 15 of user core. Sep 5 06:22:10.705967 systemd[1]: Started session-15.scope - Session 15 of User core. Sep 5 06:22:10.832934 sshd[5532]: Connection closed by 10.0.0.1 port 55450 Sep 5 06:22:10.833273 sshd-session[5529]: pam_unix(sshd:session): session closed for user core Sep 5 06:22:10.838418 systemd[1]: sshd@14-10.0.0.140:22-10.0.0.1:55450.service: Deactivated successfully. Sep 5 06:22:10.840629 systemd[1]: session-15.scope: Deactivated successfully. Sep 5 06:22:10.841453 systemd-logind[1570]: Session 15 logged out. Waiting for processes to exit. Sep 5 06:22:10.842568 systemd-logind[1570]: Removed session 15. Sep 5 06:22:13.723752 kubelet[2782]: E0905 06:22:13.723702 2782 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 5 06:22:14.955965 containerd[1585]: time="2025-09-05T06:22:14.955913015Z" level=info msg="TaskExit event in podsandbox handler container_id:\"75519761823d64aa67cd2fdda3d05cc513baf8dab048b99acb05471a8e3bf4d3\" id:\"3a540ea407946341c24038bcf2b22ef3db3cf053f43a19598903b2fc846dfc9d\" pid:5558 exited_at:{seconds:1757053334 nanos:955523075}" Sep 5 06:22:15.849439 systemd[1]: Started sshd@15-10.0.0.140:22-10.0.0.1:55462.service - OpenSSH per-connection server daemon (10.0.0.1:55462). Sep 5 06:22:15.898118 sshd[5574]: Accepted publickey for core from 10.0.0.1 port 55462 ssh2: RSA SHA256:HqMeIfrf4KQXAwx0HmR8sqFIxN2NpL8j8iF6H1odrA8 Sep 5 06:22:15.900034 sshd-session[5574]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 06:22:15.904655 systemd-logind[1570]: New session 16 of user core. Sep 5 06:22:15.915957 systemd[1]: Started session-16.scope - Session 16 of User core. 
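The TaskExit events in this log record exited_at as a protobuf-style {seconds, nanos} pair. A short conversion of the pair from the 06:22:14 event above back to RFC 3339, with the two values copied from that entry and the helper name chosen for illustration:

    from datetime import datetime, timezone

    def exited_at(seconds: int, nanos: int) -> str:
        """Render a {seconds, nanos} pair as an RFC 3339 UTC timestamp."""
        t = datetime.fromtimestamp(seconds, tz=timezone.utc)
        return t.strftime("%Y-%m-%dT%H:%M:%S") + f".{nanos:09d}Z"

    print(exited_at(1757053334, 955523075))  # 2025-09-05T06:22:14.955523075Z

The result agrees with the Sep 5 06:22:14 journald prefix and the containerd time field on the same entry.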
Sep 5 06:22:16.051002 sshd[5577]: Connection closed by 10.0.0.1 port 55462 Sep 5 06:22:16.051352 sshd-session[5574]: pam_unix(sshd:session): session closed for user core Sep 5 06:22:16.056307 systemd[1]: sshd@15-10.0.0.140:22-10.0.0.1:55462.service: Deactivated successfully. Sep 5 06:22:16.058576 systemd[1]: session-16.scope: Deactivated successfully. Sep 5 06:22:16.059486 systemd-logind[1570]: Session 16 logged out. Waiting for processes to exit. Sep 5 06:22:16.060842 systemd-logind[1570]: Removed session 16. Sep 5 06:22:19.361287 containerd[1585]: time="2025-09-05T06:22:19.361237456Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1020380adf3ebdb07c806add7d45925416134036db97449a3096f3fca0a232e1\" id:\"f51a1bc06a1896019ab877e6484a2d86fc2a86c23efb7655403b7cea04db7454\" pid:5605 exited_at:{seconds:1757053339 nanos:360871130}" Sep 5 06:22:20.431590 kubelet[2782]: I0905 06:22:20.431525 2782 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 5 06:22:21.067739 systemd[1]: Started sshd@16-10.0.0.140:22-10.0.0.1:55118.service - OpenSSH per-connection server daemon (10.0.0.1:55118). Sep 5 06:22:21.118481 sshd[5619]: Accepted publickey for core from 10.0.0.1 port 55118 ssh2: RSA SHA256:HqMeIfrf4KQXAwx0HmR8sqFIxN2NpL8j8iF6H1odrA8 Sep 5 06:22:21.120216 sshd-session[5619]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 06:22:21.124574 systemd-logind[1570]: New session 17 of user core. Sep 5 06:22:21.131957 systemd[1]: Started session-17.scope - Session 17 of User core. Sep 5 06:22:21.254044 sshd[5622]: Connection closed by 10.0.0.1 port 55118 Sep 5 06:22:21.254400 sshd-session[5619]: pam_unix(sshd:session): session closed for user core Sep 5 06:22:21.271701 systemd[1]: sshd@16-10.0.0.140:22-10.0.0.1:55118.service: Deactivated successfully. Sep 5 06:22:21.273802 systemd[1]: session-17.scope: Deactivated successfully. Sep 5 06:22:21.274604 systemd-logind[1570]: Session 17 logged out. Waiting for processes to exit. Sep 5 06:22:21.278142 systemd[1]: Started sshd@17-10.0.0.140:22-10.0.0.1:55124.service - OpenSSH per-connection server daemon (10.0.0.1:55124). Sep 5 06:22:21.279310 systemd-logind[1570]: Removed session 17. Sep 5 06:22:21.334792 sshd[5635]: Accepted publickey for core from 10.0.0.1 port 55124 ssh2: RSA SHA256:HqMeIfrf4KQXAwx0HmR8sqFIxN2NpL8j8iF6H1odrA8 Sep 5 06:22:21.336033 sshd-session[5635]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 06:22:21.340475 systemd-logind[1570]: New session 18 of user core. Sep 5 06:22:21.347946 systemd[1]: Started session-18.scope - Session 18 of User core. Sep 5 06:22:21.697014 sshd[5638]: Connection closed by 10.0.0.1 port 55124 Sep 5 06:22:21.698057 sshd-session[5635]: pam_unix(sshd:session): session closed for user core Sep 5 06:22:21.707294 systemd[1]: Started sshd@18-10.0.0.140:22-10.0.0.1:55130.service - OpenSSH per-connection server daemon (10.0.0.1:55130). Sep 5 06:22:21.711865 systemd[1]: sshd@17-10.0.0.140:22-10.0.0.1:55124.service: Deactivated successfully. Sep 5 06:22:21.721382 systemd[1]: session-18.scope: Deactivated successfully. Sep 5 06:22:21.730086 systemd-logind[1570]: Session 18 logged out. Waiting for processes to exit. Sep 5 06:22:21.734226 systemd-logind[1570]: Removed session 18. 
Sep 5 06:22:21.784839 sshd[5647]: Accepted publickey for core from 10.0.0.1 port 55130 ssh2: RSA SHA256:HqMeIfrf4KQXAwx0HmR8sqFIxN2NpL8j8iF6H1odrA8 Sep 5 06:22:21.786220 sshd-session[5647]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 06:22:21.792063 systemd-logind[1570]: New session 19 of user core. Sep 5 06:22:21.799050 systemd[1]: Started session-19.scope - Session 19 of User core. Sep 5 06:22:22.464747 sshd[5654]: Connection closed by 10.0.0.1 port 55130 Sep 5 06:22:22.467188 sshd-session[5647]: pam_unix(sshd:session): session closed for user core Sep 5 06:22:22.479947 systemd[1]: sshd@18-10.0.0.140:22-10.0.0.1:55130.service: Deactivated successfully. Sep 5 06:22:22.484505 systemd[1]: session-19.scope: Deactivated successfully. Sep 5 06:22:22.487009 systemd-logind[1570]: Session 19 logged out. Waiting for processes to exit. Sep 5 06:22:22.490615 systemd[1]: Started sshd@19-10.0.0.140:22-10.0.0.1:55136.service - OpenSSH per-connection server daemon (10.0.0.1:55136). Sep 5 06:22:22.492683 systemd-logind[1570]: Removed session 19. Sep 5 06:22:22.548179 sshd[5673]: Accepted publickey for core from 10.0.0.1 port 55136 ssh2: RSA SHA256:HqMeIfrf4KQXAwx0HmR8sqFIxN2NpL8j8iF6H1odrA8 Sep 5 06:22:22.550326 sshd-session[5673]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 06:22:22.555859 systemd-logind[1570]: New session 20 of user core. Sep 5 06:22:22.561011 systemd[1]: Started session-20.scope - Session 20 of User core. Sep 5 06:22:22.781907 sshd[5677]: Connection closed by 10.0.0.1 port 55136 Sep 5 06:22:22.783107 sshd-session[5673]: pam_unix(sshd:session): session closed for user core Sep 5 06:22:22.795317 systemd[1]: sshd@19-10.0.0.140:22-10.0.0.1:55136.service: Deactivated successfully. Sep 5 06:22:22.797635 systemd[1]: session-20.scope: Deactivated successfully. Sep 5 06:22:22.798757 systemd-logind[1570]: Session 20 logged out. Waiting for processes to exit. Sep 5 06:22:22.802073 systemd[1]: Started sshd@20-10.0.0.140:22-10.0.0.1:55144.service - OpenSSH per-connection server daemon (10.0.0.1:55144). Sep 5 06:22:22.802766 systemd-logind[1570]: Removed session 20. Sep 5 06:22:22.860684 sshd[5689]: Accepted publickey for core from 10.0.0.1 port 55144 ssh2: RSA SHA256:HqMeIfrf4KQXAwx0HmR8sqFIxN2NpL8j8iF6H1odrA8 Sep 5 06:22:22.863051 sshd-session[5689]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 06:22:22.868504 systemd-logind[1570]: New session 21 of user core. Sep 5 06:22:22.876973 systemd[1]: Started session-21.scope - Session 21 of User core. Sep 5 06:22:23.125397 sshd[5692]: Connection closed by 10.0.0.1 port 55144 Sep 5 06:22:23.125743 sshd-session[5689]: pam_unix(sshd:session): session closed for user core Sep 5 06:22:23.129793 systemd[1]: sshd@20-10.0.0.140:22-10.0.0.1:55144.service: Deactivated successfully. Sep 5 06:22:23.132016 systemd[1]: session-21.scope: Deactivated successfully. Sep 5 06:22:23.132996 systemd-logind[1570]: Session 21 logged out. Waiting for processes to exit. Sep 5 06:22:23.134092 systemd-logind[1570]: Removed session 21. 
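The SHA256:HqMeIfrf... string repeated in the Accepted-publickey lines is OpenSSH's key fingerprint: the unpadded base64 of a SHA-256 digest over the raw public-key blob, the same value ssh-keygen -lf prints. A minimal recomputation from an authorized_keys-style line; the key material itself never appears in this log, so the commented input below is a placeholder.

    import base64, hashlib

    def ssh_fingerprint(authorized_keys_line: str) -> str:
        """SHA256:... fingerprint of the key blob in an 'ssh-rsa AAAA... comment' line."""
        blob_b64 = authorized_keys_line.split()[1]
        digest = hashlib.sha256(base64.b64decode(blob_b64)).digest()
        return "SHA256:" + base64.b64encode(digest).decode().rstrip("=")

    # ssh_fingerprint("ssh-rsa AAAAB3NzaC1yc2E... core") would return the
    # SHA256:... value sshd logs for that key (placeholder input shown here).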
Sep 5 06:22:25.724200 kubelet[2782]: E0905 06:22:25.724129 2782 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 5 06:22:27.007790 containerd[1585]: time="2025-09-05T06:22:27.007742652Z" level=info msg="TaskExit event in podsandbox handler container_id:\"49989e3e8db6d49ed804513522cb5a70ea775b2a0fa61242d86a81a72dc35da5\" id:\"4656fe2ec62baa8564e1bef2278a20d36201964c6a71da6a5dec6db88f28a421\" pid:5715 exited_at:{seconds:1757053347 nanos:7433222}" Sep 5 06:22:28.141666 systemd[1]: Started sshd@21-10.0.0.140:22-10.0.0.1:55146.service - OpenSSH per-connection server daemon (10.0.0.1:55146). Sep 5 06:22:28.193495 sshd[5735]: Accepted publickey for core from 10.0.0.1 port 55146 ssh2: RSA SHA256:HqMeIfrf4KQXAwx0HmR8sqFIxN2NpL8j8iF6H1odrA8 Sep 5 06:22:28.194859 sshd-session[5735]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 06:22:28.199613 systemd-logind[1570]: New session 22 of user core. Sep 5 06:22:28.208943 systemd[1]: Started session-22.scope - Session 22 of User core. Sep 5 06:22:28.319735 sshd[5740]: Connection closed by 10.0.0.1 port 55146 Sep 5 06:22:28.320112 sshd-session[5735]: pam_unix(sshd:session): session closed for user core Sep 5 06:22:28.324252 systemd[1]: sshd@21-10.0.0.140:22-10.0.0.1:55146.service: Deactivated successfully. Sep 5 06:22:28.326431 systemd[1]: session-22.scope: Deactivated successfully. Sep 5 06:22:28.327391 systemd-logind[1570]: Session 22 logged out. Waiting for processes to exit. Sep 5 06:22:28.328689 systemd-logind[1570]: Removed session 22. Sep 5 06:22:33.333627 systemd[1]: Started sshd@22-10.0.0.140:22-10.0.0.1:54608.service - OpenSSH per-connection server daemon (10.0.0.1:54608). Sep 5 06:22:33.389692 sshd[5755]: Accepted publickey for core from 10.0.0.1 port 54608 ssh2: RSA SHA256:HqMeIfrf4KQXAwx0HmR8sqFIxN2NpL8j8iF6H1odrA8 Sep 5 06:22:33.391012 sshd-session[5755]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 06:22:33.395433 systemd-logind[1570]: New session 23 of user core. Sep 5 06:22:33.406999 systemd[1]: Started session-23.scope - Session 23 of User core. Sep 5 06:22:33.511854 sshd[5758]: Connection closed by 10.0.0.1 port 54608 Sep 5 06:22:33.512221 sshd-session[5755]: pam_unix(sshd:session): session closed for user core Sep 5 06:22:33.516205 systemd[1]: sshd@22-10.0.0.140:22-10.0.0.1:54608.service: Deactivated successfully. Sep 5 06:22:33.518307 systemd[1]: session-23.scope: Deactivated successfully. Sep 5 06:22:33.520581 systemd-logind[1570]: Session 23 logged out. Waiting for processes to exit. Sep 5 06:22:33.521469 systemd-logind[1570]: Removed session 23. 
Sep 5 06:22:33.724217 kubelet[2782]: E0905 06:22:33.724173 2782 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 5 06:22:33.992927 containerd[1585]: time="2025-09-05T06:22:33.992792572Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1020380adf3ebdb07c806add7d45925416134036db97449a3096f3fca0a232e1\" id:\"c321feccd64bf1164287a83835f901937cdc14782dbe94bcc978e4541828dd30\" pid:5783 exited_at:{seconds:1757053353 nanos:992403498}" Sep 5 06:22:35.723627 kubelet[2782]: E0905 06:22:35.723558 2782 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 5 06:22:38.530238 systemd[1]: Started sshd@23-10.0.0.140:22-10.0.0.1:54622.service - OpenSSH per-connection server daemon (10.0.0.1:54622). Sep 5 06:22:38.621783 sshd[5796]: Accepted publickey for core from 10.0.0.1 port 54622 ssh2: RSA SHA256:HqMeIfrf4KQXAwx0HmR8sqFIxN2NpL8j8iF6H1odrA8 Sep 5 06:22:38.623541 sshd-session[5796]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 5 06:22:38.628053 systemd-logind[1570]: New session 24 of user core. Sep 5 06:22:38.640044 systemd[1]: Started session-24.scope - Session 24 of User core. Sep 5 06:22:38.814490 sshd[5800]: Connection closed by 10.0.0.1 port 54622 Sep 5 06:22:38.814746 sshd-session[5796]: pam_unix(sshd:session): session closed for user core Sep 5 06:22:38.821511 systemd[1]: sshd@23-10.0.0.140:22-10.0.0.1:54622.service: Deactivated successfully. Sep 5 06:22:38.823760 systemd[1]: session-24.scope: Deactivated successfully. Sep 5 06:22:38.824703 systemd-logind[1570]: Session 24 logged out. Waiting for processes to exit. Sep 5 06:22:38.825832 systemd-logind[1570]: Removed session 24.
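The recurring dns.go:153 warning above appears when the node's resolv.conf lists more nameservers than kubelet's resolver limit, which is assumed here to be three entries (the glibc MAXNS convention); kubelet applies only the first three, which is why exactly 1.1.1.1 1.0.0.1 8.8.8.8 show up in the message. A sketch of that truncation; the limit of three is an assumption, and the fourth entry below is hypothetical since it never appears in this log.

    MAX_NAMESERVERS = 3  # assumed resolver limit (glibc MAXNS convention)

    def applied_nameservers(configured: list[str]) -> list[str]:
        """Keep only the first entries, as the 'Nameserver limits exceeded' warning describes."""
        return configured[:MAX_NAMESERVERS]

    configured = ["1.1.1.1", "1.0.0.1", "8.8.8.8", "9.9.9.9"]  # 9.9.9.9 is hypothetical
    print(" ".join(applied_nameservers(configured)))  # -> 1.1.1.1 1.0.0.1 8.8.8.8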