May 13 23:52:12.084065 kernel: Linux version 6.6.89-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Tue May 13 22:08:35 -00 2025
May 13 23:52:12.084095 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=8b3c5774a4242053287d41edc0d029958b7c22c131f7dd36b16a68182354e130
May 13 23:52:12.084110 kernel: BIOS-provided physical RAM map:
May 13 23:52:12.084119 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable
May 13 23:52:12.084127 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000007fffff] usable
May 13 23:52:12.084136 kernel: BIOS-e820: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS
May 13 23:52:12.084146 kernel: BIOS-e820: [mem 0x0000000000808000-0x000000000080afff] usable
May 13 23:52:12.084155 kernel: BIOS-e820: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS
May 13 23:52:12.084164 kernel: BIOS-e820: [mem 0x000000000080c000-0x0000000000810fff] usable
May 13 23:52:12.084173 kernel: BIOS-e820: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS
May 13 23:52:12.084182 kernel: BIOS-e820: [mem 0x0000000000900000-0x000000009bd3efff] usable
May 13 23:52:12.084195 kernel: BIOS-e820: [mem 0x000000009bd3f000-0x000000009bdfffff] reserved
May 13 23:52:12.084209 kernel: BIOS-e820: [mem 0x000000009be00000-0x000000009c8ecfff] usable
May 13 23:52:12.084219 kernel: BIOS-e820: [mem 0x000000009c8ed000-0x000000009cb6cfff] reserved
May 13 23:52:12.084233 kernel: BIOS-e820: [mem 0x000000009cb6d000-0x000000009cb7efff] ACPI data
May 13 23:52:12.084243 kernel: BIOS-e820: [mem 0x000000009cb7f000-0x000000009cbfefff] ACPI NVS
May 13 23:52:12.084258 kernel: BIOS-e820: [mem 0x000000009cbff000-0x000000009ce91fff] usable
May 13 23:52:12.084267 kernel: BIOS-e820: [mem 0x000000009ce92000-0x000000009ce95fff] reserved
May 13 23:52:12.084277 kernel: BIOS-e820: [mem 0x000000009ce96000-0x000000009ce97fff] ACPI NVS
May 13 23:52:12.084286 kernel: BIOS-e820: [mem 0x000000009ce98000-0x000000009cedbfff] usable
May 13 23:52:12.084295 kernel: BIOS-e820: [mem 0x000000009cedc000-0x000000009cf5ffff] reserved
May 13 23:52:12.084305 kernel: BIOS-e820: [mem 0x000000009cf60000-0x000000009cffffff] ACPI NVS
May 13 23:52:12.084314 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved
May 13 23:52:12.084324 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
May 13 23:52:12.084334 kernel: BIOS-e820: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
May 13 23:52:12.084343 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
May 13 23:52:12.084353 kernel: NX (Execute Disable) protection: active
May 13 23:52:12.084367 kernel: APIC: Static calls initialized
May 13 23:52:12.084377 kernel: e820: update [mem 0x9b351018-0x9b35ac57] usable ==> usable
May 13 23:52:12.084387 kernel: e820: update [mem 0x9b351018-0x9b35ac57] usable ==> usable
May 13 23:52:12.084492 kernel: e820: update [mem 0x9b314018-0x9b350e57] usable ==> usable
May 13 23:52:12.084504 kernel: e820: update [mem 0x9b314018-0x9b350e57] usable ==> usable
May 13 23:52:12.084513 kernel: extended physical RAM map:
May 13 23:52:12.084523 kernel: reserve setup_data: [mem 0x0000000000000000-0x000000000009ffff] usable
May 13 23:52:12.084532 kernel: reserve setup_data: [mem 0x0000000000100000-0x00000000007fffff] usable
May 13 23:52:12.084542 kernel: reserve setup_data: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS
May 13 23:52:12.084551 kernel: reserve setup_data: [mem 0x0000000000808000-0x000000000080afff] usable
May 13 23:52:12.084561 kernel: reserve setup_data: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS
May 13 23:52:12.084571 kernel: reserve setup_data: [mem 0x000000000080c000-0x0000000000810fff] usable
May 13 23:52:12.084588 kernel: reserve setup_data: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS
May 13 23:52:12.084603 kernel: reserve setup_data: [mem 0x0000000000900000-0x000000009b314017] usable
May 13 23:52:12.084613 kernel: reserve setup_data: [mem 0x000000009b314018-0x000000009b350e57] usable
May 13 23:52:12.084624 kernel: reserve setup_data: [mem 0x000000009b350e58-0x000000009b351017] usable
May 13 23:52:12.084635 kernel: reserve setup_data: [mem 0x000000009b351018-0x000000009b35ac57] usable
May 13 23:52:12.084645 kernel: reserve setup_data: [mem 0x000000009b35ac58-0x000000009bd3efff] usable
May 13 23:52:12.084665 kernel: reserve setup_data: [mem 0x000000009bd3f000-0x000000009bdfffff] reserved
May 13 23:52:12.084676 kernel: reserve setup_data: [mem 0x000000009be00000-0x000000009c8ecfff] usable
May 13 23:52:12.084686 kernel: reserve setup_data: [mem 0x000000009c8ed000-0x000000009cb6cfff] reserved
May 13 23:52:12.084697 kernel: reserve setup_data: [mem 0x000000009cb6d000-0x000000009cb7efff] ACPI data
May 13 23:52:12.084707 kernel: reserve setup_data: [mem 0x000000009cb7f000-0x000000009cbfefff] ACPI NVS
May 13 23:52:12.084717 kernel: reserve setup_data: [mem 0x000000009cbff000-0x000000009ce91fff] usable
May 13 23:52:12.084727 kernel: reserve setup_data: [mem 0x000000009ce92000-0x000000009ce95fff] reserved
May 13 23:52:12.084737 kernel: reserve setup_data: [mem 0x000000009ce96000-0x000000009ce97fff] ACPI NVS
May 13 23:52:12.084748 kernel: reserve setup_data: [mem 0x000000009ce98000-0x000000009cedbfff] usable
May 13 23:52:12.084758 kernel: reserve setup_data: [mem 0x000000009cedc000-0x000000009cf5ffff] reserved
May 13 23:52:12.084773 kernel: reserve setup_data: [mem 0x000000009cf60000-0x000000009cffffff] ACPI NVS
May 13 23:52:12.084784 kernel: reserve setup_data: [mem 0x00000000e0000000-0x00000000efffffff] reserved
May 13 23:52:12.084795 kernel: reserve setup_data: [mem 0x00000000feffc000-0x00000000feffffff] reserved
May 13 23:52:12.084810 kernel: reserve setup_data: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
May 13 23:52:12.084820 kernel: reserve setup_data: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
May 13 23:52:12.084830 kernel: efi: EFI v2.7 by EDK II
May 13 23:52:12.084841 kernel: efi: SMBIOS=0x9c988000 ACPI=0x9cb7e000 ACPI 2.0=0x9cb7e014 MEMATTR=0x9ba0d198 RNG=0x9cb73018
May 13 23:52:12.084851 kernel: random: crng init done
May 13 23:52:12.084862 kernel: efi: Remove mem142: MMIO range=[0xffc00000-0xffffffff] (4MB) from e820 map
May 13 23:52:12.084873 kernel: e820: remove [mem 0xffc00000-0xffffffff] reserved
May 13 23:52:12.084887 kernel: secureboot: Secure boot disabled
May 13 23:52:12.084903 kernel: SMBIOS 2.8 present.
May 13 23:52:12.084913 kernel: DMI: QEMU Standard PC (Q35 + ICH9, 2009), BIOS unknown 02/02/2022
May 13 23:52:12.084924 kernel: Hypervisor detected: KVM
May 13 23:52:12.084935 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
May 13 23:52:12.084946 kernel: kvm-clock: using sched offset of 3695297131 cycles
May 13 23:52:12.084958 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
May 13 23:52:12.084980 kernel: tsc: Detected 2794.748 MHz processor
May 13 23:52:12.084990 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
May 13 23:52:12.085002 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
May 13 23:52:12.085013 kernel: last_pfn = 0x9cedc max_arch_pfn = 0x400000000
May 13 23:52:12.085029 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs
May 13 23:52:12.085040 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
May 13 23:52:12.085051 kernel: Using GB pages for direct mapping
May 13 23:52:12.085062 kernel: ACPI: Early table checksum verification disabled
May 13 23:52:12.085073 kernel: ACPI: RSDP 0x000000009CB7E014 000024 (v02 BOCHS )
May 13 23:52:12.085084 kernel: ACPI: XSDT 0x000000009CB7D0E8 000054 (v01 BOCHS BXPC 00000001 01000013)
May 13 23:52:12.085096 kernel: ACPI: FACP 0x000000009CB79000 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
May 13 23:52:12.085107 kernel: ACPI: DSDT 0x000000009CB7A000 0021A8 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 13 23:52:12.085118 kernel: ACPI: FACS 0x000000009CBDD000 000040
May 13 23:52:12.085133 kernel: ACPI: APIC 0x000000009CB78000 000090 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 13 23:52:12.085145 kernel: ACPI: HPET 0x000000009CB77000 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 13 23:52:12.085156 kernel: ACPI: MCFG 0x000000009CB76000 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 13 23:52:12.085168 kernel: ACPI: WAET 0x000000009CB75000 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 13 23:52:12.085179 kernel: ACPI: BGRT 0x000000009CB74000 000038 (v01 INTEL EDK2 00000002 01000013)
May 13 23:52:12.085190 kernel: ACPI: Reserving FACP table memory at [mem 0x9cb79000-0x9cb790f3]
May 13 23:52:12.085201 kernel: ACPI: Reserving DSDT table memory at [mem 0x9cb7a000-0x9cb7c1a7]
May 13 23:52:12.085212 kernel: ACPI: Reserving FACS table memory at [mem 0x9cbdd000-0x9cbdd03f]
May 13 23:52:12.085223 kernel: ACPI: Reserving APIC table memory at [mem 0x9cb78000-0x9cb7808f]
May 13 23:52:12.085238 kernel: ACPI: Reserving HPET table memory at [mem 0x9cb77000-0x9cb77037]
May 13 23:52:12.085249 kernel: ACPI: Reserving MCFG table memory at [mem 0x9cb76000-0x9cb7603b]
May 13 23:52:12.085261 kernel: ACPI: Reserving WAET table memory at [mem 0x9cb75000-0x9cb75027]
May 13 23:52:12.085271 kernel: ACPI: Reserving BGRT table memory at [mem 0x9cb74000-0x9cb74037]
May 13 23:52:12.085282 kernel: No NUMA configuration found
May 13 23:52:12.085293 kernel: Faking a node at [mem 0x0000000000000000-0x000000009cedbfff]
May 13 23:52:12.085305 kernel: NODE_DATA(0) allocated [mem 0x9ce3a000-0x9ce3ffff]
May 13 23:52:12.085315 kernel: Zone ranges:
May 13 23:52:12.085326 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
May 13 23:52:12.085341 kernel: DMA32 [mem 0x0000000001000000-0x000000009cedbfff]
May 13 23:52:12.085352 kernel: Normal empty
May 13 23:52:12.085368 kernel: Movable zone start for each node
May 13 23:52:12.085379 kernel: Early memory node ranges
May 13 23:52:12.085390 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff]
May 13 23:52:12.085417 kernel: node 0: [mem 0x0000000000100000-0x00000000007fffff]
May 13 23:52:12.085429 kernel: node 0: [mem 0x0000000000808000-0x000000000080afff]
May 13 23:52:12.085439 kernel: node 0: [mem 0x000000000080c000-0x0000000000810fff]
May 13 23:52:12.085451 kernel: node 0: [mem 0x0000000000900000-0x000000009bd3efff]
May 13 23:52:12.085466 kernel: node 0: [mem 0x000000009be00000-0x000000009c8ecfff]
May 13 23:52:12.085478 kernel: node 0: [mem 0x000000009cbff000-0x000000009ce91fff]
May 13 23:52:12.085489 kernel: node 0: [mem 0x000000009ce98000-0x000000009cedbfff]
May 13 23:52:12.085500 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000009cedbfff]
May 13 23:52:12.085511 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
May 13 23:52:12.085522 kernel: On node 0, zone DMA: 96 pages in unavailable ranges
May 13 23:52:12.085546 kernel: On node 0, zone DMA: 8 pages in unavailable ranges
May 13 23:52:12.085561 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
May 13 23:52:12.085571 kernel: On node 0, zone DMA: 239 pages in unavailable ranges
May 13 23:52:12.085582 kernel: On node 0, zone DMA32: 193 pages in unavailable ranges
May 13 23:52:12.085592 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges
May 13 23:52:12.085608 kernel: On node 0, zone DMA32: 6 pages in unavailable ranges
May 13 23:52:12.085623 kernel: On node 0, zone DMA32: 12580 pages in unavailable ranges
May 13 23:52:12.085634 kernel: ACPI: PM-Timer IO Port: 0x608
May 13 23:52:12.085645 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
May 13 23:52:12.085655 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
May 13 23:52:12.085665 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
May 13 23:52:12.085679 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
May 13 23:52:12.085689 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
May 13 23:52:12.085700 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
May 13 23:52:12.085711 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
May 13 23:52:12.085719 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
May 13 23:52:12.085727 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
May 13 23:52:12.085735 kernel: TSC deadline timer available
May 13 23:52:12.085743 kernel: smpboot: Allowing 4 CPUs, 0 hotplug CPUs
May 13 23:52:12.085751 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
May 13 23:52:12.085762 kernel: kvm-guest: KVM setup pv remote TLB flush
May 13 23:52:12.085770 kernel: kvm-guest: setup PV sched yield
May 13 23:52:12.085778 kernel: [mem 0x9d000000-0xdfffffff] available for PCI devices
May 13 23:52:12.085786 kernel: Booting paravirtualized kernel on KVM
May 13 23:52:12.085794 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
May 13 23:52:12.085802 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:4 nr_cpu_ids:4 nr_node_ids:1
May 13 23:52:12.085810 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u524288
May 13 23:52:12.085818 kernel: pcpu-alloc: s197032 r8192 d32344 u524288 alloc=1*2097152
May 13 23:52:12.085826 kernel: pcpu-alloc: [0] 0 1 2 3
May 13 23:52:12.085837 kernel: kvm-guest: PV spinlocks enabled
May 13 23:52:12.085845 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
May 13 23:52:12.085854 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=8b3c5774a4242053287d41edc0d029958b7c22c131f7dd36b16a68182354e130
May 13 23:52:12.085862 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
May 13 23:52:12.085870 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
May 13 23:52:12.085882 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
May 13 23:52:12.085890 kernel: Fallback order for Node 0: 0
May 13 23:52:12.085898 kernel: Built 1 zonelists, mobility grouping on. Total pages: 629460
May 13 23:52:12.085911 kernel: Policy zone: DMA32
May 13 23:52:12.085922 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
May 13 23:52:12.085933 kernel: Memory: 2385672K/2565800K available (14336K kernel code, 2296K rwdata, 25068K rodata, 43604K init, 1468K bss, 179872K reserved, 0K cma-reserved)
May 13 23:52:12.085942 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
May 13 23:52:12.085950 kernel: ftrace: allocating 37993 entries in 149 pages
May 13 23:52:12.085958 kernel: ftrace: allocated 149 pages with 4 groups
May 13 23:52:12.085975 kernel: Dynamic Preempt: voluntary
May 13 23:52:12.085985 kernel: rcu: Preemptible hierarchical RCU implementation.
May 13 23:52:12.085995 kernel: rcu: RCU event tracing is enabled.
May 13 23:52:12.086008 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
May 13 23:52:12.086017 kernel: Trampoline variant of Tasks RCU enabled.
May 13 23:52:12.086025 kernel: Rude variant of Tasks RCU enabled.
May 13 23:52:12.086033 kernel: Tracing variant of Tasks RCU enabled.
May 13 23:52:12.086041 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
May 13 23:52:12.086049 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
May 13 23:52:12.086057 kernel: NR_IRQS: 33024, nr_irqs: 456, preallocated irqs: 16
May 13 23:52:12.086065 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
May 13 23:52:12.086073 kernel: Console: colour dummy device 80x25
May 13 23:52:12.086081 kernel: printk: console [ttyS0] enabled
May 13 23:52:12.086091 kernel: ACPI: Core revision 20230628
May 13 23:52:12.086100 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
May 13 23:52:12.086108 kernel: APIC: Switch to symmetric I/O mode setup
May 13 23:52:12.086116 kernel: x2apic enabled
May 13 23:52:12.086124 kernel: APIC: Switched APIC routing to: physical x2apic
May 13 23:52:12.086135 kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask()
May 13 23:52:12.086143 kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself()
May 13 23:52:12.086151 kernel: kvm-guest: setup PV IPIs
May 13 23:52:12.086159 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
May 13 23:52:12.086169 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
May 13 23:52:12.086177 kernel: Calibrating delay loop (skipped) preset value.. 5589.49 BogoMIPS (lpj=2794748)
May 13 23:52:12.086185 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
May 13 23:52:12.086193 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
May 13 23:52:12.086201 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
May 13 23:52:12.086210 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
May 13 23:52:12.086218 kernel: Spectre V2 : Mitigation: Retpolines
May 13 23:52:12.086226 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
May 13 23:52:12.086234 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
May 13 23:52:12.086244 kernel: RETBleed: Mitigation: untrained return thunk
May 13 23:52:12.086253 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
May 13 23:52:12.086261 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
May 13 23:52:12.086269 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
May 13 23:52:12.086278 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
May 13 23:52:12.086286 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
May 13 23:52:12.086296 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
May 13 23:52:12.086304 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
May 13 23:52:12.086315 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
May 13 23:52:12.086323 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
May 13 23:52:12.086331 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
May 13 23:52:12.086339 kernel: Freeing SMP alternatives memory: 32K
May 13 23:52:12.086347 kernel: pid_max: default: 32768 minimum: 301
May 13 23:52:12.086355 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
May 13 23:52:12.086363 kernel: landlock: Up and running.
May 13 23:52:12.086371 kernel: SELinux: Initializing.
May 13 23:52:12.086379 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
May 13 23:52:12.086390 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
May 13 23:52:12.086421 kernel: smpboot: CPU0: AMD EPYC 7402P 24-Core Processor (family: 0x17, model: 0x31, stepping: 0x0)
May 13 23:52:12.086429 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
May 13 23:52:12.086437 kernel: RCU Tasks Rude: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
May 13 23:52:12.086445 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
May 13 23:52:12.086454 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
May 13 23:52:12.086462 kernel: ... version: 0
May 13 23:52:12.086470 kernel: ... bit width: 48
May 13 23:52:12.086478 kernel: ... generic registers: 6
May 13 23:52:12.086489 kernel: ... value mask: 0000ffffffffffff
May 13 23:52:12.086497 kernel: ... max period: 00007fffffffffff
May 13 23:52:12.086505 kernel: ... fixed-purpose events: 0
May 13 23:52:12.086513 kernel: ... event mask: 000000000000003f
May 13 23:52:12.086521 kernel: signal: max sigframe size: 1776
May 13 23:52:12.086529 kernel: rcu: Hierarchical SRCU implementation.
May 13 23:52:12.086537 kernel: rcu: Max phase no-delay instances is 400.
May 13 23:52:12.086545 kernel: smp: Bringing up secondary CPUs ...
May 13 23:52:12.086552 kernel: smpboot: x86: Booting SMP configuration:
May 13 23:52:12.086563 kernel: .... node #0, CPUs: #1 #2 #3
May 13 23:52:12.086571 kernel: smp: Brought up 1 node, 4 CPUs
May 13 23:52:12.086579 kernel: smpboot: Max logical packages: 1
May 13 23:52:12.086587 kernel: smpboot: Total of 4 processors activated (22357.98 BogoMIPS)
May 13 23:52:12.086595 kernel: devtmpfs: initialized
May 13 23:52:12.086602 kernel: x86/mm: Memory block size: 128MB
May 13 23:52:12.086610 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00800000-0x00807fff] (32768 bytes)
May 13 23:52:12.086618 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x0080b000-0x0080bfff] (4096 bytes)
May 13 23:52:12.086627 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00811000-0x008fffff] (978944 bytes)
May 13 23:52:12.086637 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x9cb7f000-0x9cbfefff] (524288 bytes)
May 13 23:52:12.086645 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x9ce96000-0x9ce97fff] (8192 bytes)
May 13 23:52:12.086653 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x9cf60000-0x9cffffff] (655360 bytes)
May 13 23:52:12.086662 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
May 13 23:52:12.086670 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
May 13 23:52:12.086678 kernel: pinctrl core: initialized pinctrl subsystem
May 13 23:52:12.086685 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
May 13 23:52:12.086693 kernel: audit: initializing netlink subsys (disabled)
May 13 23:52:12.086701 kernel: audit: type=2000 audit(1747180331.306:1): state=initialized audit_enabled=0 res=1
May 13 23:52:12.086712 kernel: thermal_sys: Registered thermal governor 'step_wise'
May 13 23:52:12.086720 kernel: thermal_sys: Registered thermal governor 'user_space'
May 13 23:52:12.086728 kernel: cpuidle: using governor menu
May 13 23:52:12.086736 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
May 13 23:52:12.086744 kernel: dca service started, version 1.12.1
May 13 23:52:12.086752 kernel: PCI: MMCONFIG for domain 0000 [bus 00-ff] at [mem 0xe0000000-0xefffffff] (base 0xe0000000)
May 13 23:52:12.086760 kernel: PCI: Using configuration type 1 for base access
May 13 23:52:12.086768 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
May 13 23:52:12.086776 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
May 13 23:52:12.086787 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
May 13 23:52:12.086795 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
May 13 23:52:12.086803 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
May 13 23:52:12.086810 kernel: ACPI: Added _OSI(Module Device)
May 13 23:52:12.086818 kernel: ACPI: Added _OSI(Processor Device)
May 13 23:52:12.086826 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
May 13 23:52:12.086834 kernel: ACPI: Added _OSI(Processor Aggregator Device)
May 13 23:52:12.086842 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
May 13 23:52:12.086850 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
May 13 23:52:12.086861 kernel: ACPI: Interpreter enabled
May 13 23:52:12.086869 kernel: ACPI: PM: (supports S0 S3 S5)
May 13 23:52:12.086877 kernel: ACPI: Using IOAPIC for interrupt routing
May 13 23:52:12.086885 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
May 13 23:52:12.086893 kernel: PCI: Using E820 reservations for host bridge windows
May 13 23:52:12.086901 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
May 13 23:52:12.086912 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
May 13 23:52:12.087167 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
May 13 23:52:12.087316 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
May 13 23:52:12.087467 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
May 13 23:52:12.087479 kernel: PCI host bridge to bus 0000:00
May 13 23:52:12.087616 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
May 13 23:52:12.087745 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
May 13 23:52:12.087868 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
May 13 23:52:12.088011 kernel: pci_bus 0000:00: root bus resource [mem 0x9d000000-0xdfffffff window]
May 13 23:52:12.088141 kernel: pci_bus 0000:00: root bus resource [mem 0xf0000000-0xfebfffff window]
May 13 23:52:12.088262 kernel: pci_bus 0000:00: root bus resource [mem 0x380000000000-0x3807ffffffff window]
May 13 23:52:12.088448 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
May 13 23:52:12.088695 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000
May 13 23:52:12.088850 kernel: pci 0000:00:01.0: [1234:1111] type 00 class 0x030000
May 13 23:52:12.089011 kernel: pci 0000:00:01.0: reg 0x10: [mem 0xc0000000-0xc0ffffff pref]
May 13 23:52:12.089156 kernel: pci 0000:00:01.0: reg 0x18: [mem 0xc1044000-0xc1044fff]
May 13 23:52:12.089288 kernel: pci 0000:00:01.0: reg 0x30: [mem 0xffff0000-0xffffffff pref]
May 13 23:52:12.089497 kernel: pci 0000:00:01.0: BAR 0: assigned to efifb
May 13 23:52:12.089668 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
May 13 23:52:12.089832 kernel: pci 0000:00:02.0: [1af4:1005] type 00 class 0x00ff00
May 13 23:52:12.090011 kernel: pci 0000:00:02.0: reg 0x10: [io 0x6100-0x611f]
May 13 23:52:12.090243 kernel: pci 0000:00:02.0: reg 0x14: [mem 0xc1043000-0xc1043fff]
May 13 23:52:12.090485 kernel: pci 0000:00:02.0: reg 0x20: [mem 0x380000000000-0x380000003fff 64bit pref]
May 13 23:52:12.090708 kernel: pci 0000:00:03.0: [1af4:1001] type 00 class 0x010000
May 13 23:52:12.090870 kernel: pci 0000:00:03.0: reg 0x10: [io 0x6000-0x607f]
May 13 23:52:12.091048 kernel: pci 0000:00:03.0: reg 0x14: [mem 0xc1042000-0xc1042fff]
May 13 23:52:12.091200 kernel: pci 0000:00:03.0: reg 0x20: [mem 0x380000004000-0x380000007fff 64bit pref]
May 13 23:52:12.091382 kernel: pci 0000:00:04.0: [1af4:1000] type 00 class 0x020000
May 13 23:52:12.091550 kernel: pci 0000:00:04.0: reg 0x10: [io 0x60e0-0x60ff]
May 13 23:52:12.091690 kernel: pci 0000:00:04.0: reg 0x14: [mem 0xc1041000-0xc1041fff]
May 13 23:52:12.091820 kernel: pci 0000:00:04.0: reg 0x20: [mem 0x380000008000-0x38000000bfff 64bit pref]
May 13 23:52:12.091993 kernel: pci 0000:00:04.0: reg 0x30: [mem 0xfffc0000-0xffffffff pref]
May 13 23:52:12.092222 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100
May 13 23:52:12.092422 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
May 13 23:52:12.092653 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601
May 13 23:52:12.092806 kernel: pci 0000:00:1f.2: reg 0x20: [io 0x60c0-0x60df]
May 13 23:52:12.092940 kernel: pci 0000:00:1f.2: reg 0x24: [mem 0xc1040000-0xc1040fff]
May 13 23:52:12.093104 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500
May 13 23:52:12.093240 kernel: pci 0000:00:1f.3: reg 0x20: [io 0x6080-0x60bf]
May 13 23:52:12.093252 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
May 13 23:52:12.093260 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
May 13 23:52:12.093269 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
May 13 23:52:12.093277 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
May 13 23:52:12.093291 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
May 13 23:52:12.093302 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
May 13 23:52:12.093313 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
May 13 23:52:12.093323 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
May 13 23:52:12.093335 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
May 13 23:52:12.093344 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
May 13 23:52:12.093352 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
May 13 23:52:12.093361 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
May 13 23:52:12.093369 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
May 13 23:52:12.093381 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
May 13 23:52:12.093389 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
May 13 23:52:12.093412 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
May 13 23:52:12.093420 kernel: iommu: Default domain type: Translated
May 13 23:52:12.093428 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
May 13 23:52:12.093436 kernel: efivars: Registered efivars operations
May 13 23:52:12.093444 kernel: PCI: Using ACPI for IRQ routing
May 13 23:52:12.093452 kernel: PCI: pci_cache_line_size set to 64 bytes
May 13 23:52:12.093461 kernel: e820: reserve RAM buffer [mem 0x0080b000-0x008fffff]
May 13 23:52:12.093472 kernel: e820: reserve RAM buffer [mem 0x00811000-0x008fffff]
May 13 23:52:12.093480 kernel: e820: reserve RAM buffer [mem 0x9b314018-0x9bffffff]
May 13 23:52:12.093488 kernel: e820: reserve RAM buffer [mem 0x9b351018-0x9bffffff]
May 13 23:52:12.093496 kernel: e820: reserve RAM buffer [mem 0x9bd3f000-0x9bffffff]
May 13 23:52:12.093504 kernel: e820: reserve RAM buffer [mem 0x9c8ed000-0x9fffffff]
May 13 23:52:12.093512 kernel: e820: reserve RAM buffer [mem 0x9ce92000-0x9fffffff]
May 13 23:52:12.093520 kernel: e820: reserve RAM buffer [mem 0x9cedc000-0x9fffffff]
May 13 23:52:12.093732 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
May 13 23:52:12.093882 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
May 13 23:52:12.094034 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
May 13 23:52:12.094045 kernel: vgaarb: loaded
May 13 23:52:12.094054 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
May 13 23:52:12.094062 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
May 13 23:52:12.094070 kernel: clocksource: Switched to clocksource kvm-clock
May 13 23:52:12.094078 kernel: VFS: Disk quotas dquot_6.6.0
May 13 23:52:12.094087 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
May 13 23:52:12.094095 kernel: pnp: PnP ACPI init
May 13 23:52:12.094269 kernel: system 00:05: [mem 0xe0000000-0xefffffff window] has been reserved
May 13 23:52:12.094286 kernel: pnp: PnP ACPI: found 6 devices
May 13 23:52:12.094297 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
May 13 23:52:12.094308 kernel: NET: Registered PF_INET protocol family
May 13 23:52:12.094320 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
May 13 23:52:12.094348 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
May 13 23:52:12.094359 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
May 13 23:52:12.094368 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
May 13 23:52:12.094379 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
May 13 23:52:12.094387 kernel: TCP: Hash tables configured (established 32768 bind 32768)
May 13 23:52:12.094409 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
May 13 23:52:12.094418 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
May 13 23:52:12.094426 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
May 13 23:52:12.094434 kernel: NET: Registered PF_XDP protocol family
May 13 23:52:12.094579 kernel: pci 0000:00:04.0: can't claim BAR 6 [mem 0xfffc0000-0xffffffff pref]: no compatible bridge window
May 13 23:52:12.094732 kernel: pci 0000:00:04.0: BAR 6: assigned [mem 0x9d000000-0x9d03ffff pref]
May 13 23:52:12.094860 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
May 13 23:52:12.095040 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
May 13 23:52:12.095185 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
May 13 23:52:12.095352 kernel: pci_bus 0000:00: resource 7 [mem 0x9d000000-0xdfffffff window]
May 13 23:52:12.095554 kernel: pci_bus 0000:00: resource 8 [mem 0xf0000000-0xfebfffff window]
May 13 23:52:12.095674 kernel: pci_bus 0000:00: resource 9 [mem 0x380000000000-0x3807ffffffff window]
May 13 23:52:12.095685 kernel: PCI: CLS 0 bytes, default 64
May 13 23:52:12.095694 kernel: Initialise system trusted keyrings
May 13 23:52:12.095703 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
May 13 23:52:12.095717 kernel: Key type asymmetric registered
May 13 23:52:12.095726 kernel: Asymmetric key parser 'x509' registered
May 13 23:52:12.095734 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251)
May 13 23:52:12.095743 kernel: io scheduler mq-deadline registered
May 13 23:52:12.095751 kernel: io scheduler kyber registered
May 13 23:52:12.095759 kernel: io scheduler bfq registered
May 13 23:52:12.095768 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
May 13 23:52:12.095777 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
May 13 23:52:12.095785 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23
May 13 23:52:12.095797 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20
May 13 23:52:12.095808 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
May 13 23:52:12.095816 kernel: 00:03: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
May 13 23:52:12.095825 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
May 13 23:52:12.095833 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
May 13 23:52:12.095842 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
May 13 23:52:12.096001 kernel: rtc_cmos 00:04: RTC can wake from S4
May 13 23:52:12.096016 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
May 13 23:52:12.096139 kernel: rtc_cmos 00:04: registered as rtc0
May 13 23:52:12.096263 kernel: rtc_cmos 00:04: setting system clock to 2025-05-13T23:52:11 UTC (1747180331)
May 13 23:52:12.096437 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
May 13 23:52:12.096450 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
May 13 23:52:12.096460 kernel: efifb: probing for efifb
May 13 23:52:12.096468 kernel: efifb: framebuffer at 0xc0000000, using 4000k, total 4000k
May 13 23:52:12.096481 kernel: efifb: mode is 1280x800x32, linelength=5120, pages=1
May 13 23:52:12.096490 kernel: efifb: scrolling: redraw
May 13 23:52:12.096498 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0
May 13 23:52:12.096507 kernel: Console: switching to colour frame buffer device 160x50
May 13 23:52:12.096515 kernel: fb0: EFI VGA frame buffer device
May 13 23:52:12.096523 kernel: pstore: Using crash dump compression: deflate
May 13 23:52:12.096535 kernel: pstore: Registered efi_pstore as persistent store backend
May 13 23:52:12.096543 kernel: NET: Registered PF_INET6 protocol family
May 13 23:52:12.096552 kernel: Segment Routing with IPv6
May 13 23:52:12.096563 kernel: In-situ OAM (IOAM) with IPv6
May 13 23:52:12.096571 kernel: NET: Registered PF_PACKET protocol family
May 13 23:52:12.096580 kernel: Key type dns_resolver registered
May 13 23:52:12.096588 kernel: IPI shorthand broadcast: enabled
May 13 23:52:12.096596 kernel: sched_clock: Marking stable (1267003898, 160076242)->(1496347841, -69267701)
May 13 23:52:12.096605 kernel: registered taskstats version 1
May 13 23:52:12.096613 kernel: Loading compiled-in X.509 certificates
May 13 23:52:12.096622 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.89-flatcar: 166efda032ca4d6e9037c569aca9b53585ee6f94'
May 13 23:52:12.096630 kernel: Key type .fscrypt registered
May 13 23:52:12.096641 kernel: Key type fscrypt-provisioning registered
May 13 23:52:12.096649 kernel: ima: No TPM chip found, activating TPM-bypass!
May 13 23:52:12.096658 kernel: ima: Allocated hash algorithm: sha1
May 13 23:52:12.096666 kernel: ima: No architecture policies found
May 13 23:52:12.096675 kernel: clk: Disabling unused clocks
May 13 23:52:12.096683 kernel: Freeing unused kernel image (initmem) memory: 43604K
May 13 23:52:12.096691 kernel: Write protecting the kernel read-only data: 40960k
May 13 23:52:12.096700 kernel: Freeing unused kernel image (rodata/data gap) memory: 1556K
May 13 23:52:12.096709 kernel: Run /init as init process
May 13 23:52:12.096720 kernel: with arguments:
May 13 23:52:12.096728 kernel: /init
May 13 23:52:12.096736 kernel: with environment:
May 13 23:52:12.096745 kernel: HOME=/
May 13 23:52:12.096753 kernel: TERM=linux
May 13 23:52:12.096761 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
May 13 23:52:12.096771 systemd[1]: Successfully made /usr/ read-only.
May 13 23:52:12.096783 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
May 13 23:52:12.096795 systemd[1]: Detected virtualization kvm.
May 13 23:52:12.096804 systemd[1]: Detected architecture x86-64.
May 13 23:52:12.096813 systemd[1]: Running in initrd.
May 13 23:52:12.096821 systemd[1]: No hostname configured, using default hostname.
May 13 23:52:12.096831 systemd[1]: Hostname set to .
May 13 23:52:12.096839 systemd[1]: Initializing machine ID from VM UUID.
May 13 23:52:12.096848 systemd[1]: Queued start job for default target initrd.target.
May 13 23:52:12.096858 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 13 23:52:12.096869 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 13 23:52:12.096879 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
May 13 23:52:12.096888 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
May 13 23:52:12.096897 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
May 13 23:52:12.096907 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
May 13 23:52:12.096918 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
May 13 23:52:12.096929 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
May 13 23:52:12.096938 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 13 23:52:12.096947 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
May 13 23:52:12.096956 systemd[1]: Reached target paths.target - Path Units.
May 13 23:52:12.096975 systemd[1]: Reached target slices.target - Slice Units.
May 13 23:52:12.096984 systemd[1]: Reached target swap.target - Swaps.
May 13 23:52:12.096992 systemd[1]: Reached target timers.target - Timer Units.
May 13 23:52:12.097002 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
May 13 23:52:12.097010 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
May 13 23:52:12.097022 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
May 13 23:52:12.097031 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
May 13 23:52:12.097041 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
May 13 23:52:12.097050 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
May 13 23:52:12.097059 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
May 13 23:52:12.097067 systemd[1]: Reached target sockets.target - Socket Units.
May 13 23:52:12.097076 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
May 13 23:52:12.097085 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
May 13 23:52:12.097097 systemd[1]: Finished network-cleanup.service - Network Cleanup.
May 13 23:52:12.097106 systemd[1]: Starting systemd-fsck-usr.service...
May 13 23:52:12.097115 systemd[1]: Starting systemd-journald.service - Journal Service...
May 13 23:52:12.097124 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
May 13 23:52:12.097133 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 13 23:52:12.097142 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
May 13 23:52:12.097151 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
May 13 23:52:12.097163 systemd[1]: Finished systemd-fsck-usr.service.
May 13 23:52:12.097197 systemd-journald[191]: Collecting audit messages is disabled.
May 13 23:52:12.097222 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
May 13 23:52:12.097232 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
May 13 23:52:12.097241 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
May 13 23:52:12.097250 systemd-journald[191]: Journal started
May 13 23:52:12.097270 systemd-journald[191]: Runtime Journal (/run/log/journal/f735f820bb334cd4908f99293722fc37) is 6M, max 48.2M, 42.2M free.
May 13 23:52:12.090712 systemd-modules-load[193]: Inserted module 'overlay'
May 13 23:52:12.100693 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
May 13 23:52:12.103788 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
May 13 23:52:12.130438 systemd[1]: Started systemd-journald.service - Journal Service.
May 13 23:52:12.138430 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
May 13 23:52:12.138656 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
May 13 23:52:12.142410 kernel: Bridge firewalling registered
May 13 23:52:12.142412 systemd-modules-load[193]: Inserted module 'br_netfilter'
May 13 23:52:12.143770 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
May 13 23:52:12.145120 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
May 13 23:52:12.150837 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
May 13 23:52:12.153648 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
May 13 23:52:12.157067 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
May 13 23:52:12.170946 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
May 13 23:52:12.183209 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
May 13 23:52:12.185278 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
May 13 23:52:12.191553 dracut-cmdline[227]: dracut-dracut-053
May 13 23:52:12.195375 dracut-cmdline[227]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=8b3c5774a4242053287d41edc0d029958b7c22c131f7dd36b16a68182354e130
May 13 23:52:12.236100 systemd-resolved[236]: Positive Trust Anchors:
May 13 23:52:12.236118 systemd-resolved[236]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
May 13 23:52:12.236150 systemd-resolved[236]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
May 13 23:52:12.238784 systemd-resolved[236]: Defaulting to hostname 'linux'.
May 13 23:52:12.240148 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
May 13 23:52:12.267004 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
May 13 23:52:12.376453 kernel: SCSI subsystem initialized
May 13 23:52:12.389441 kernel: Loading iSCSI transport class v2.0-870.
May 13 23:52:12.401450 kernel: iscsi: registered transport (tcp)
May 13 23:52:12.457471 kernel: iscsi: registered transport (qla4xxx)
May 13 23:52:12.457562 kernel: QLogic iSCSI HBA Driver
May 13 23:52:12.517202 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
May 13 23:52:12.520478 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
May 13 23:52:12.556886 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
May 13 23:52:12.556944 kernel: device-mapper: uevent: version 1.0.3
May 13 23:52:12.558013 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
May 13 23:52:12.601424 kernel: raid6: avx2x4 gen() 29570 MB/s
May 13 23:52:12.618447 kernel: raid6: avx2x2 gen() 28384 MB/s
May 13 23:52:12.635557 kernel: raid6: avx2x1 gen() 21062 MB/s
May 13 23:52:12.635645 kernel: raid6: using algorithm avx2x4 gen() 29570 MB/s
May 13 23:52:12.653612 kernel: raid6: .... xor() 7053 MB/s, rmw enabled
May 13 23:52:12.653666 kernel: raid6: using avx2x2 recovery algorithm
May 13 23:52:12.675442 kernel: xor: automatically using best checksumming function avx
May 13 23:52:12.848462 kernel: Btrfs loaded, zoned=no, fsverity=no
May 13 23:52:12.864975 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
May 13 23:52:12.869130 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
May 13 23:52:12.900784 systemd-udevd[415]: Using default interface naming scheme 'v255'.
May 13 23:52:12.907335 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
May 13 23:52:12.921841 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
May 13 23:52:12.951893 dracut-pre-trigger[427]: rd.md=0: removing MD RAID activation
May 13 23:52:12.999339 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
May 13 23:52:13.004292 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
May 13 23:52:13.102240 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
May 13 23:52:13.110119 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
May 13 23:52:13.135544 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
May 13 23:52:13.148416 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues
May 13 23:52:13.156293 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB)
May 13 23:52:13.156523 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
May 13 23:52:13.156538 kernel: GPT:9289727 != 19775487
May 13 23:52:13.156549 kernel: GPT:Alternate GPT header not at the end of the disk.
May 13 23:52:13.156568 kernel: GPT:9289727 != 19775487
May 13 23:52:13.156579 kernel: GPT: Use GNU Parted to correct GPT errors.
May 13 23:52:13.156590 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
May 13 23:52:13.140509 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
May 13 23:52:13.145003 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
May 13 23:52:13.152130 systemd[1]: Reached target remote-fs.target - Remote File Systems.
May 13 23:52:13.157796 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
May 13 23:52:13.162080 kernel: cryptd: max_cpu_qlen set to 1000
May 13 23:52:13.179426 kernel: libata version 3.00 loaded.
May 13 23:52:13.193041 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
May 13 23:52:13.193972 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
May 13 23:52:13.213614 kernel: AVX2 version of gcm_enc/dec engaged.
May 13 23:52:13.213680 kernel: AES CTR mode by8 optimization enabled
May 13 23:52:13.213564 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
May 13 23:52:13.215519 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
May 13 23:52:13.222961 kernel: ahci 0000:00:1f.2: version 3.0
May 13 23:52:13.215623 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
May 13 23:52:13.225234 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16
May 13 23:52:13.217725 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
May 13 23:52:13.228519 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 13 23:52:13.228842 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
May 13 23:52:13.229140 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
May 13 23:52:13.239746 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/vda6 scanned by (udev-worker) (467)
May 13 23:52:13.239769 kernel: ahci 0000:00:1f.2: AHCI 0001.0000 32 slots 6 ports 1.5 Gbps 0x3f impl SATA mode
May 13 23:52:13.240005 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only
May 13 23:52:13.240162 kernel: BTRFS: device fsid d2fbd39e-42cb-4ccb-87ec-99f56cfe77f8 devid 1 transid 39 /dev/vda3 scanned by (udev-worker) (465)
May 13 23:52:13.243163 kernel: scsi host0: ahci
May 13 23:52:13.243460 kernel: scsi host1: ahci
May 13 23:52:13.244696 kernel: scsi host2: ahci
May 13 23:52:13.244987 kernel: scsi host3: ahci
May 13 23:52:13.245205 kernel: scsi host4: ahci
May 13 23:52:13.246552 kernel: scsi host5: ahci
May 13 23:52:13.246811 kernel: ata1: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040100 irq 34
May 13 23:52:13.248746 kernel: ata2: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040180 irq 34
May 13 23:52:13.248785 kernel: ata3: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040200 irq 34
May 13 23:52:13.250028 kernel: ata4: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040280 irq 34
May 13 23:52:13.251729 kernel: ata5: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040300 irq 34
May 13 23:52:13.251758 kernel: ata6: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040380 irq 34
May 13 23:52:13.282249 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT.
May 13 23:52:13.332264 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM.
May 13 23:52:13.342042 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A.
May 13 23:52:13.343451 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132.
May 13 23:52:13.355924 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
May 13 23:52:13.379267 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
May 13 23:52:13.380728 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
May 13 23:52:13.380800 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
May 13 23:52:13.383692 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
May 13 23:52:13.396291 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 13 23:52:13.397852 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
May 13 23:52:13.434015 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
May 13 23:52:13.435441 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
May 13 23:52:13.472297 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
May 13 23:52:13.539875 disk-uuid[559]: Primary Header is updated.
May 13 23:52:13.539875 disk-uuid[559]: Secondary Entries is updated.
May 13 23:52:13.539875 disk-uuid[559]: Secondary Header is updated.
May 13 23:52:13.545437 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
May 13 23:52:13.551477 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
May 13 23:52:13.561443 kernel: ata6: SATA link down (SStatus 0 SControl 300)
May 13 23:52:13.561564 kernel: ata2: SATA link down (SStatus 0 SControl 300)
May 13 23:52:13.566485 kernel: ata5: SATA link down (SStatus 0 SControl 300)
May 13 23:52:13.566560 kernel: ata1: SATA link down (SStatus 0 SControl 300)
May 13 23:52:13.569974 kernel: ata4: SATA link down (SStatus 0 SControl 300)
May 13 23:52:13.570038 kernel: ata3: SATA link up 1.5 Gbps (SStatus 113 SControl 300)
May 13 23:52:13.571829 kernel: ata3.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
May 13 23:52:13.571910 kernel: ata3.00: applying bridge limits
May 13 23:52:13.572601 kernel: ata3.00: configured for UDMA/100
May 13 23:52:13.574119 kernel: scsi 2:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5
May 13 23:52:13.654606 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
May 13 23:52:13.654936 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
May 13 23:52:13.668431 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0
May 13 23:52:14.552344 disk-uuid[572]: The operation has completed successfully.
May 13 23:52:14.553876 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
May 13 23:52:14.592788 systemd[1]: disk-uuid.service: Deactivated successfully.
May 13 23:52:14.592942 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
May 13 23:52:14.631924 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
May 13 23:52:14.650227 sh[599]: Success
May 13 23:52:14.664440 kernel: device-mapper: verity: sha256 using implementation "sha256-ni"
May 13 23:52:14.708188 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
May 13 23:52:14.712171 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
May 13 23:52:14.728573 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
May 13 23:52:14.741577 kernel: BTRFS info (device dm-0): first mount of filesystem d2fbd39e-42cb-4ccb-87ec-99f56cfe77f8
May 13 23:52:14.741634 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
May 13 23:52:14.741650 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
May 13 23:52:14.742831 kernel: BTRFS info (device dm-0): disabling log replay at mount time
May 13 23:52:14.743755 kernel: BTRFS info (device dm-0): using free space tree
May 13 23:52:14.748965 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
May 13 23:52:14.749716 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
May 13 23:52:14.750672 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
May 13 23:52:14.755164 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
May 13 23:52:14.788058 kernel: BTRFS info (device vda6): first mount of filesystem c0e200fb-7321-4d2d-86ff-b28bdae5fafc
May 13 23:52:14.788135 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
May 13 23:52:14.788147 kernel: BTRFS info (device vda6): using free space tree
May 13 23:52:14.791425 kernel: BTRFS info (device vda6): auto enabling async discard
May 13 23:52:14.796430 kernel: BTRFS info (device vda6): last unmount of filesystem c0e200fb-7321-4d2d-86ff-b28bdae5fafc
May 13 23:52:14.881790 systemd[1]: Finished ignition-setup.service - Ignition (setup).
May 13 23:52:14.883591 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
May 13 23:52:14.895715 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
May 13 23:52:14.900740 systemd[1]: Starting systemd-networkd.service - Network Configuration...
May 13 23:52:14.970091 systemd-networkd[777]: lo: Link UP
May 13 23:52:14.970712 systemd-networkd[777]: lo: Gained carrier
May 13 23:52:14.972564 systemd-networkd[777]: Enumeration completed
May 13 23:52:14.972709 systemd[1]: Started systemd-networkd.service - Network Configuration.
May 13 23:52:14.972959 systemd-networkd[777]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
May 13 23:52:14.972964 systemd-networkd[777]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
May 13 23:52:14.975191 systemd-networkd[777]: eth0: Link UP
May 13 23:52:14.985022 ignition[772]: Ignition 2.20.0
May 13 23:52:14.975196 systemd-networkd[777]: eth0: Gained carrier
May 13 23:52:14.985031 ignition[772]: Stage: fetch-offline
May 13 23:52:14.975206 systemd-networkd[777]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
May 13 23:52:14.985071 ignition[772]: no configs at "/usr/lib/ignition/base.d"
May 13 23:52:14.976683 systemd[1]: Reached target network.target - Network.
May 13 23:52:14.985082 ignition[772]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
May 13 23:52:14.993538 systemd-networkd[777]: eth0: DHCPv4 address 10.0.0.42/16, gateway 10.0.0.1 acquired from 10.0.0.1
May 13 23:52:14.985193 ignition[772]: parsed url from cmdline: ""
May 13 23:52:14.985197 ignition[772]: no config URL provided
May 13 23:52:14.985203 ignition[772]: reading system config file "/usr/lib/ignition/user.ign"
May 13 23:52:14.985215 ignition[772]: no config at "/usr/lib/ignition/user.ign"
May 13 23:52:14.985245 ignition[772]: op(1): [started] loading QEMU firmware config module
May 13 23:52:14.985251 ignition[772]: op(1): executing: "modprobe" "qemu_fw_cfg"
May 13 23:52:15.007781 ignition[772]: op(1): [finished] loading QEMU firmware config module
May 13 23:52:15.056992 ignition[772]: parsing config with SHA512: dbb52e1e3b3af00e1bbac8f1f2b8df7d919c8c4efb22183d97b5c11f0cf360b584c88d78158a0836856af8effa4030b112abf767629b164e5a846764cb51b11e
May 13 23:52:15.070471 unknown[772]: fetched base config from "system"
May 13 23:52:15.070840 unknown[772]: fetched user config from "qemu"
May 13 23:52:15.071416 ignition[772]: fetch-offline: fetch-offline passed
May 13 23:52:15.071507 ignition[772]: Ignition finished successfully
May 13 23:52:15.076694 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
May 13 23:52:15.078178 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json).
May 13 23:52:15.079056 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
May 13 23:52:15.109032 ignition[791]: Ignition 2.20.0
May 13 23:52:15.109048 ignition[791]: Stage: kargs
May 13 23:52:15.109256 ignition[791]: no configs at "/usr/lib/ignition/base.d"
May 13 23:52:15.109273 ignition[791]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
May 13 23:52:15.113530 ignition[791]: kargs: kargs passed
May 13 23:52:15.113582 ignition[791]: Ignition finished successfully
May 13 23:52:15.118219 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
May 13 23:52:15.120868 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
May 13 23:52:15.148674 ignition[799]: Ignition 2.20.0
May 13 23:52:15.148696 ignition[799]: Stage: disks
May 13 23:52:15.148896 ignition[799]: no configs at "/usr/lib/ignition/base.d"
May 13 23:52:15.148909 ignition[799]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
May 13 23:52:15.152739 ignition[799]: disks: disks passed
May 13 23:52:15.152808 ignition[799]: Ignition finished successfully
May 13 23:52:15.156496 systemd[1]: Finished ignition-disks.service - Ignition (disks).
May 13 23:52:15.158053 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
May 13 23:52:15.160145 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
May 13 23:52:15.161493 systemd[1]: Reached target local-fs.target - Local File Systems.
May 13 23:52:15.163665 systemd[1]: Reached target sysinit.target - System Initialization.
May 13 23:52:15.166100 systemd[1]: Reached target basic.target - Basic System.
May 13 23:52:15.169005 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
May 13 23:52:15.197607 systemd-fsck[809]: ROOT: clean, 14/553520 files, 52654/553472 blocks
May 13 23:52:15.269820 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
May 13 23:52:15.273617 systemd[1]: Mounting sysroot.mount - /sysroot...
May 13 23:52:15.392435 kernel: EXT4-fs (vda9): mounted filesystem c413e98b-da35-46b1-9852-45706e1b1f52 r/w with ordered data mode. Quota mode: none.
May 13 23:52:15.393470 systemd[1]: Mounted sysroot.mount - /sysroot.
May 13 23:52:15.396299 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
May 13 23:52:15.400310 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
May 13 23:52:15.420733 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
May 13 23:52:15.422850 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
May 13 23:52:15.422930 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
May 13 23:52:15.422964 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
May 13 23:52:15.433715 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
May 13 23:52:15.439615 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/vda6 scanned by mount (817)
May 13 23:52:15.439649 kernel: BTRFS info (device vda6): first mount of filesystem c0e200fb-7321-4d2d-86ff-b28bdae5fafc
May 13 23:52:15.439664 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
May 13 23:52:15.439677 kernel: BTRFS info (device vda6): using free space tree
May 13 23:52:15.441180 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
May 13 23:52:15.443512 kernel: BTRFS info (device vda6): auto enabling async discard
May 13 23:52:15.444699 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
May 13 23:52:15.495509 initrd-setup-root[841]: cut: /sysroot/etc/passwd: No such file or directory
May 13 23:52:15.524886 initrd-setup-root[848]: cut: /sysroot/etc/group: No such file or directory
May 13 23:52:15.529593 initrd-setup-root[855]: cut: /sysroot/etc/shadow: No such file or directory
May 13 23:52:15.553845 initrd-setup-root[862]: cut: /sysroot/etc/gshadow: No such file or directory
May 13 23:52:15.676955 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
May 13 23:52:15.681443 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
May 13 23:52:15.684388 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
May 13 23:52:15.706465 kernel: BTRFS info (device vda6): last unmount of filesystem c0e200fb-7321-4d2d-86ff-b28bdae5fafc
May 13 23:52:15.723753 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
May 13 23:52:15.741234 systemd[1]: sysroot-oem.mount: Deactivated successfully.
May 13 23:52:15.769328 ignition[931]: INFO : Ignition 2.20.0
May 13 23:52:15.769328 ignition[931]: INFO : Stage: mount
May 13 23:52:15.771606 ignition[931]: INFO : no configs at "/usr/lib/ignition/base.d"
May 13 23:52:15.771606 ignition[931]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
May 13 23:52:15.775149 ignition[931]: INFO : mount: mount passed
May 13 23:52:15.776118 ignition[931]: INFO : Ignition finished successfully
May 13 23:52:15.779835 systemd[1]: Finished ignition-mount.service - Ignition (mount).
May 13 23:52:15.782600 systemd[1]: Starting ignition-files.service - Ignition (files)...
May 13 23:52:15.807187 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
May 13 23:52:15.839445 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/vda6 scanned by mount (944)
May 13 23:52:15.842183 kernel: BTRFS info (device vda6): first mount of filesystem c0e200fb-7321-4d2d-86ff-b28bdae5fafc
May 13 23:52:15.842230 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
May 13 23:52:15.842246 kernel: BTRFS info (device vda6): using free space tree
May 13 23:52:15.846434 kernel: BTRFS info (device vda6): auto enabling async discard
May 13 23:52:15.848752 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
May 13 23:52:15.899744 ignition[961]: INFO : Ignition 2.20.0
May 13 23:52:15.899744 ignition[961]: INFO : Stage: files
May 13 23:52:15.902337 ignition[961]: INFO : no configs at "/usr/lib/ignition/base.d"
May 13 23:52:15.902337 ignition[961]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
May 13 23:52:15.902337 ignition[961]: DEBUG : files: compiled without relabeling support, skipping
May 13 23:52:15.906624 ignition[961]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
May 13 23:52:15.906624 ignition[961]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
May 13 23:52:15.912331 ignition[961]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
May 13 23:52:15.914066 ignition[961]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
May 13 23:52:15.915896 unknown[961]: wrote ssh authorized keys file for user: core
May 13 23:52:15.917356 ignition[961]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
May 13 23:52:15.918988 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
May 13 23:52:15.918988 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1
May 13 23:52:15.964512 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
May 13 23:52:16.140153 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
May 13 23:52:16.140153 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
May 13 23:52:16.144498 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
May 13 23:52:16.144498 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
May 13 23:52:16.144498 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
May 13 23:52:16.144498 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
May 13 23:52:16.144498 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
May 13 23:52:16.144498 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
May 13 23:52:16.144498 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
May 13 23:52:16.144498 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
May 13 23:52:16.144498 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
May 13 23:52:16.144498 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw"
May 13 23:52:16.144498 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw"
May 13 23:52:16.144498 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw"
May 13 23:52:16.144498 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.31.0-x86-64.raw: attempt #1
May 13 23:52:16.614333 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
May 13 23:52:16.760599 systemd-networkd[777]: eth0: Gained IPv6LL
May 13 23:52:17.393303 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw"
May 13 23:52:17.393303 ignition[961]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
May 13 23:52:17.436283 ignition[961]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
May 13 23:52:17.439249 ignition[961]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
May 13 23:52:17.439249 ignition[961]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
May 13 23:52:17.439249 ignition[961]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
May 13 23:52:17.444739 ignition[961]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
May 13 23:52:17.447256 ignition[961]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
May 13 23:52:17.447256 ignition[961]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
May 13 23:52:17.501197 ignition[961]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service"
May 13 23:52:17.521550 ignition[961]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service"
May 13 23:52:17.540902 ignition[961]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service"
May 13 23:52:17.543334 ignition[961]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service"
May 13 23:52:17.543334 ignition[961]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service"
May 13 23:52:17.546780 ignition[961]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service"
May 13 23:52:17.565656 ignition[961]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json"
May 13 23:52:17.567958 ignition[961]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json"
May 13 23:52:17.570228 ignition[961]: INFO : files: files passed
May 13 23:52:17.571366 ignition[961]: INFO : Ignition finished successfully
May 13 23:52:17.575598 systemd[1]: Finished ignition-files.service - Ignition (files).
May 13 23:52:17.578569 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
May 13 23:52:17.581807 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
May 13 23:52:17.601472 systemd[1]: ignition-quench.service: Deactivated successfully.
May 13 23:52:17.601731 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
May 13 23:52:17.649657 initrd-setup-root-after-ignition[990]: grep: /sysroot/oem/oem-release: No such file or directory
May 13 23:52:17.654531 initrd-setup-root-after-ignition[992]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
May 13 23:52:17.654531 initrd-setup-root-after-ignition[992]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
May 13 23:52:17.658575 initrd-setup-root-after-ignition[996]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
May 13 23:52:17.662360 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
May 13 23:52:17.662803 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
May 13 23:52:17.668309 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
May 13 23:52:17.731201 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
May 13 23:52:17.731379 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
May 13 23:52:17.785207 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
May 13 23:52:17.787388 systemd[1]: Reached target initrd.target - Initrd Default Target.
May 13 23:52:17.826822 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
May 13 23:52:17.828014 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
May 13 23:52:17.891154 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
May 13 23:52:17.904076 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
May 13 23:52:17.925304 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
May 13 23:52:17.926927 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
May 13 23:52:17.929847 systemd[1]: Stopped target timers.target - Timer Units.
May 13 23:52:17.932246 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
May 13 23:52:17.932373 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
May 13 23:52:17.935934 systemd[1]: Stopped target initrd.target - Initrd Default Target.
May 13 23:52:17.938560 systemd[1]: Stopped target basic.target - Basic System.
May 13 23:52:17.940860 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
May 13 23:52:17.970974 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
May 13 23:52:17.975072 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
May 13 23:52:17.977422 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
May 13 23:52:17.977676 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
May 13 23:52:17.979917 systemd[1]: Stopped target sysinit.target - System Initialization.
May 13 23:52:17.984093 systemd[1]: Stopped target local-fs.target - Local File Systems.
May 13 23:52:17.985307 systemd[1]: Stopped target swap.target - Swaps.
May 13 23:52:17.988715 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
May 13 23:52:17.988974 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
May 13 23:52:17.995720 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
May 13 23:52:17.995968 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 13 23:52:17.999923 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
May 13 23:52:18.001238 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 13 23:52:18.001800 systemd[1]: dracut-initqueue.service: Deactivated successfully.
May 13 23:52:18.001993 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
May 13 23:52:18.008752 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
May 13 23:52:18.009009 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
May 13 23:52:18.010438 systemd[1]: Stopped target paths.target - Path Units.
May 13 23:52:18.010997 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
May 13 23:52:18.048539 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 13 23:52:18.050324 systemd[1]: Stopped target slices.target - Slice Units.
May 13 23:52:18.053315 systemd[1]: Stopped target sockets.target - Socket Units.
May 13 23:52:18.053505 systemd[1]: iscsid.socket: Deactivated successfully.
May 13 23:52:18.053615 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
May 13 23:52:18.056855 systemd[1]: iscsiuio.socket: Deactivated successfully.
May 13 23:52:18.056952 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
May 13 23:52:18.058001 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
May 13 23:52:18.058135 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
May 13 23:52:18.060141 systemd[1]: ignition-files.service: Deactivated successfully.
May 13 23:52:18.060256 systemd[1]: Stopped ignition-files.service - Ignition (files).
May 13 23:52:18.064435 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
May 13 23:52:18.105173 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
May 13 23:52:18.108078 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
May 13 23:52:18.108244 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
May 13 23:52:18.109284 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
May 13 23:52:18.109486 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
May 13 23:52:18.129689 systemd[1]: initrd-cleanup.service: Deactivated successfully.
May 13 23:52:18.129868 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
May 13 23:52:18.142708 ignition[1017]: INFO : Ignition 2.20.0
May 13 23:52:18.142708 ignition[1017]: INFO : Stage: umount
May 13 23:52:18.168592 ignition[1017]: INFO : no configs at "/usr/lib/ignition/base.d"
May 13 23:52:18.168592 ignition[1017]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
May 13 23:52:18.168592 ignition[1017]: INFO : umount: umount passed
May 13 23:52:18.168592 ignition[1017]: INFO : Ignition finished successfully
May 13 23:52:18.175003 systemd[1]: ignition-mount.service: Deactivated successfully.
May 13 23:52:18.175164 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
May 13 23:52:18.176446 systemd[1]: Stopped target network.target - Network.
May 13 23:52:18.179262 systemd[1]: ignition-disks.service: Deactivated successfully.
May 13 23:52:18.179329 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
May 13 23:52:18.184831 systemd[1]: ignition-kargs.service: Deactivated successfully.
May 13 23:52:18.184937 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
May 13 23:52:18.185832 systemd[1]: ignition-setup.service: Deactivated successfully.
May 13 23:52:18.185900 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
May 13 23:52:18.189114 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
May 13 23:52:18.189172 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
May 13 23:52:18.189945 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
May 13 23:52:18.190339 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
May 13 23:52:18.192102 systemd[1]: sysroot-boot.mount: Deactivated successfully.
May 13 23:52:18.212069 systemd[1]: systemd-resolved.service: Deactivated successfully.
May 13 23:52:18.212220 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
May 13 23:52:18.235787 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
May 13 23:52:18.236031 systemd[1]: systemd-networkd.service: Deactivated successfully.
May 13 23:52:18.236162 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
May 13 23:52:18.241253 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
May 13 23:52:18.242343 systemd[1]: systemd-networkd.socket: Deactivated successfully.
May 13 23:52:18.242538 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
May 13 23:52:18.246028 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
May 13 23:52:18.247875 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
May 13 23:52:18.247981 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
May 13 23:52:18.250603 systemd[1]: systemd-sysctl.service: Deactivated successfully.
May 13 23:52:18.250666 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
May 13 23:52:18.253224 systemd[1]: systemd-modules-load.service: Deactivated successfully.
May 13 23:52:18.253280 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
May 13 23:52:18.255970 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
May 13 23:52:18.256027 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
May 13 23:52:18.293563 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
May 13 23:52:18.294872 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
May 13 23:52:18.294959 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
May 13 23:52:18.308250 systemd[1]: network-cleanup.service: Deactivated successfully.
May 13 23:52:18.308437 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
May 13 23:52:18.312665 systemd[1]: systemd-udevd.service: Deactivated successfully.
May 13 23:52:18.312876 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
May 13 23:52:18.315675 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
May 13 23:52:18.315732 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
May 13 23:52:18.318190 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
May 13 23:52:18.318236 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
May 13 23:52:18.319562 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
May 13 23:52:18.319624 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
May 13 23:52:18.322359 systemd[1]: dracut-cmdline.service: Deactivated successfully.
May 13 23:52:18.322478 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
May 13 23:52:18.375650 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
May 13 23:52:18.375772 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
May 13 23:52:18.379451 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
May 13 23:52:18.380844 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
May 13 23:52:18.380919 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
May 13 23:52:18.383677 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
May 13 23:52:18.383748 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
May 13 23:52:18.384845 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
May 13 23:52:18.384899 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
May 13 23:52:18.388978 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
May 13 23:52:18.389053 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
May 13 23:52:18.393967 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
May 13 23:52:18.394049 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
May 13 23:52:18.411420 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
May 13 23:52:18.411541 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
May 13 23:52:18.791549 systemd[1]: sysroot-boot.service: Deactivated successfully.
May 13 23:52:18.791736 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
May 13 23:52:18.792169 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
May 13 23:52:18.795627 systemd[1]: initrd-setup-root.service: Deactivated successfully.
May 13 23:52:18.795714 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
May 13 23:52:18.797018 systemd[1]: Starting initrd-switch-root.service - Switch Root...
May 13 23:52:18.815627 systemd[1]: Switching root.
May 13 23:52:18.855467 systemd-journald[191]: Journal stopped
May 13 23:52:20.682498 systemd-journald[191]: Received SIGTERM from PID 1 (systemd).
May 13 23:52:20.682571 kernel: SELinux: policy capability network_peer_controls=1
May 13 23:52:20.682587 kernel: SELinux: policy capability open_perms=1
May 13 23:52:20.682599 kernel: SELinux: policy capability extended_socket_class=1
May 13 23:52:20.682611 kernel: SELinux: policy capability always_check_network=0
May 13 23:52:20.682627 kernel: SELinux: policy capability cgroup_seclabel=1
May 13 23:52:20.682639 kernel: SELinux: policy capability nnp_nosuid_transition=1
May 13 23:52:20.682651 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
May 13 23:52:20.682670 kernel: SELinux: policy capability ioctl_skip_cloexec=0
May 13 23:52:20.682682 kernel: audit: type=1403 audit(1747180339.592:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
May 13 23:52:20.682697 systemd[1]: Successfully loaded SELinux policy in 53.968ms.
May 13 23:52:20.682719 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 15.523ms.
May 13 23:52:20.682740 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
May 13 23:52:20.682754 systemd[1]: Detected virtualization kvm.
May 13 23:52:20.682769 systemd[1]: Detected architecture x86-64.
May 13 23:52:20.682782 systemd[1]: Detected first boot.
May 13 23:52:20.682801 systemd[1]: Initializing machine ID from VM UUID.
May 13 23:52:20.682814 zram_generator::config[1064]: No configuration found.
May 13 23:52:20.682833 kernel: Guest personality initialized and is inactive
May 13 23:52:20.682850 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
May 13 23:52:20.682862 kernel: Initialized host personality
May 13 23:52:20.682874 kernel: NET: Registered PF_VSOCK protocol family
May 13 23:52:20.682886 systemd[1]: Populated /etc with preset unit settings.
May 13 23:52:20.682903 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
May 13 23:52:20.682916 systemd[1]: initrd-switch-root.service: Deactivated successfully.
May 13 23:52:20.682929 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
May 13 23:52:20.682942 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
May 13 23:52:20.682955 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
May 13 23:52:20.682967 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
May 13 23:52:20.682980 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
May 13 23:52:20.682992 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
May 13 23:52:20.683009 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
May 13 23:52:20.683022 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
May 13 23:52:20.683035 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
May 13 23:52:20.683048 systemd[1]: Created slice user.slice - User and Session Slice.
May 13 23:52:20.683060 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 13 23:52:20.683073 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 13 23:52:20.683086 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
May 13 23:52:20.683104 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
May 13 23:52:20.683117 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
May 13 23:52:20.683132 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
May 13 23:52:20.683145 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
May 13 23:52:20.683157 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 13 23:52:20.683171 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
May 13 23:52:20.683183 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
May 13 23:52:20.683196 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
May 13 23:52:20.683209 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
May 13 23:52:20.683224 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
May 13 23:52:20.683237 systemd[1]: Reached target remote-fs.target - Remote File Systems.
May 13 23:52:20.683249 systemd[1]: Reached target slices.target - Slice Units.
May 13 23:52:20.683262 systemd[1]: Reached target swap.target - Swaps.
May 13 23:52:20.683274 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
May 13 23:52:20.683288 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
May 13 23:52:20.683301 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
May 13 23:52:20.683313 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
May 13 23:52:20.683326 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
May 13 23:52:20.683338 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
May 13 23:52:20.683354 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
May 13 23:52:20.683367 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
May 13 23:52:20.683379 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
May 13 23:52:20.684535 systemd[1]: Mounting media.mount - External Media Directory...
May 13 23:52:20.684561 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 13 23:52:20.684575 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
May 13 23:52:20.684598 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
May 13 23:52:20.684622 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
May 13 23:52:20.684654 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
May 13 23:52:20.684670 systemd[1]: Reached target machines.target - Containers.
May 13 23:52:20.684686 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
May 13 23:52:20.684703 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 13 23:52:20.684740 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
May 13 23:52:20.684759 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
May 13 23:52:20.684776 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
May 13 23:52:20.684793 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
May 13 23:52:20.684808 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
May 13 23:52:20.684825 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
May 13 23:52:20.684838 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
May 13 23:52:20.684851 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
May 13 23:52:20.684863 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
May 13 23:52:20.684876 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
May 13 23:52:20.684889 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
May 13 23:52:20.684901 systemd[1]: Stopped systemd-fsck-usr.service.
May 13 23:52:20.684915 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 13 23:52:20.684930 systemd[1]: Starting systemd-journald.service - Journal Service...
May 13 23:52:20.684943 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
May 13 23:52:20.684956 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
May 13 23:52:20.684969 kernel: loop: module loaded
May 13 23:52:20.684982 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
May 13 23:52:20.684995 kernel: fuse: init (API version 7.39)
May 13 23:52:20.685010 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
May 13 23:52:20.685023 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
May 13 23:52:20.685036 systemd[1]: verity-setup.service: Deactivated successfully.
May 13 23:52:20.685048 systemd[1]: Stopped verity-setup.service.
May 13 23:52:20.685063 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 13 23:52:20.685076 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
May 13 23:52:20.685089 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
May 13 23:52:20.685104 systemd[1]: Mounted media.mount - External Media Directory.
May 13 23:52:20.685138 systemd-journald[1128]: Collecting audit messages is disabled.
May 13 23:52:20.685161 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
May 13 23:52:20.685174 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
May 13 23:52:20.685187 systemd-journald[1128]: Journal started
May 13 23:52:20.685213 systemd-journald[1128]: Runtime Journal (/run/log/journal/f735f820bb334cd4908f99293722fc37) is 6M, max 48.2M, 42.2M free.
May 13 23:52:20.336280 systemd[1]: Queued start job for default target multi-user.target.
May 13 23:52:20.351257 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
May 13 23:52:20.351880 systemd[1]: systemd-journald.service: Deactivated successfully.
May 13 23:52:20.743607 systemd[1]: Started systemd-journald.service - Journal Service.
May 13 23:52:20.744770 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
May 13 23:52:20.746171 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
May 13 23:52:20.747831 systemd[1]: modprobe@configfs.service: Deactivated successfully.
May 13 23:52:20.748139 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
May 13 23:52:20.749790 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
May 13 23:52:20.750017 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
May 13 23:52:20.751549 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
May 13 23:52:20.751783 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
May 13 23:52:20.753597 systemd[1]: modprobe@fuse.service: Deactivated successfully.
May 13 23:52:20.753927 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. May 13 23:52:20.755429 kernel: ACPI: bus type drm_connector registered May 13 23:52:20.756087 systemd[1]: modprobe@loop.service: Deactivated successfully. May 13 23:52:20.756320 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 13 23:52:20.757895 systemd[1]: modprobe@drm.service: Deactivated successfully. May 13 23:52:20.758141 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 13 23:52:20.759645 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 13 23:52:20.761470 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. May 13 23:52:20.763288 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. May 13 23:52:20.765120 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. May 13 23:52:20.781383 systemd[1]: Reached target network-pre.target - Preparation for Network. May 13 23:52:20.784534 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... May 13 23:52:20.786988 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... May 13 23:52:20.788387 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). May 13 23:52:20.788560 systemd[1]: Reached target local-fs.target - Local File Systems. May 13 23:52:20.810947 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. May 13 23:52:20.822608 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... May 13 23:52:20.825564 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... May 13 23:52:20.827590 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. 
May 13 23:52:20.831748 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... May 13 23:52:20.838268 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... May 13 23:52:20.839759 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 13 23:52:20.841162 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... May 13 23:52:20.842521 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 13 23:52:20.843620 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... May 13 23:52:20.846025 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... May 13 23:52:20.848454 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... May 13 23:52:20.859629 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 13 23:52:20.861150 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. May 13 23:52:20.862569 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. May 13 23:52:20.868156 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... May 13 23:52:20.893471 udevadm[1183]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in. May 13 23:52:20.974887 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. May 13 23:52:20.980621 kernel: loop0: detected capacity change from 0 to 151640 May 13 23:52:20.980739 systemd-journald[1128]: Time spent on flushing to /var/log/journal/f735f820bb334cd4908f99293722fc37 is 18.524ms for 1065 entries. 
May 13 23:52:20.980739 systemd-journald[1128]: System Journal (/var/log/journal/f735f820bb334cd4908f99293722fc37) is 8M, max 195.6M, 187.6M free. May 13 23:52:21.268039 systemd-journald[1128]: Received client request to flush runtime journal. May 13 23:52:21.268088 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher May 13 23:52:21.268125 kernel: loop1: detected capacity change from 0 to 109808 May 13 23:52:21.268144 kernel: loop2: detected capacity change from 0 to 205544 May 13 23:52:21.268162 kernel: loop3: detected capacity change from 0 to 151640 May 13 23:52:21.268180 kernel: loop4: detected capacity change from 0 to 109808 May 13 23:52:20.985697 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 13 23:52:21.089633 systemd-tmpfiles[1177]: ACLs are not supported, ignoring. May 13 23:52:21.089652 systemd-tmpfiles[1177]: ACLs are not supported, ignoring. May 13 23:52:21.097489 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 13 23:52:21.250034 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. May 13 23:52:21.252093 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. May 13 23:52:21.255554 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. May 13 23:52:21.259219 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... May 13 23:52:21.264569 systemd[1]: Starting systemd-sysusers.service - Create System Users... May 13 23:52:21.273430 kernel: loop5: detected capacity change from 0 to 205544 May 13 23:52:21.274991 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. May 13 23:52:21.391791 (sd-merge)[1200]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'. May 13 23:52:21.392639 (sd-merge)[1200]: Merged extensions into '/usr'. 
May 13 23:52:21.401141 systemd[1]: Reload requested from client PID 1176 ('systemd-sysext') (unit systemd-sysext.service)... May 13 23:52:21.401448 systemd[1]: Reloading... May 13 23:52:21.462503 zram_generator::config[1233]: No configuration found. May 13 23:52:21.559652 ldconfig[1164]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. May 13 23:52:21.640868 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 13 23:52:21.711896 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. May 13 23:52:21.712065 systemd[1]: Reloading finished in 309 ms. May 13 23:52:21.744213 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. May 13 23:52:21.745962 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. May 13 23:52:21.747736 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. May 13 23:52:21.749475 systemd[1]: Finished systemd-sysusers.service - Create System Users. May 13 23:52:21.773230 systemd[1]: Starting ensure-sysext.service... May 13 23:52:21.792718 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 13 23:52:21.795130 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 13 23:52:21.818159 systemd-tmpfiles[1278]: ACLs are not supported, ignoring. May 13 23:52:21.818180 systemd-tmpfiles[1278]: ACLs are not supported, ignoring. May 13 23:52:21.825290 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 13 23:52:21.886897 systemd-tmpfiles[1279]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. 
May 13 23:52:21.887150 systemd-tmpfiles[1279]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. May 13 23:52:21.888094 systemd-tmpfiles[1279]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. May 13 23:52:21.888355 systemd-tmpfiles[1279]: ACLs are not supported, ignoring. May 13 23:52:21.888457 systemd-tmpfiles[1279]: ACLs are not supported, ignoring. May 13 23:52:21.892882 systemd-tmpfiles[1279]: Detected autofs mount point /boot during canonicalization of boot. May 13 23:52:21.892896 systemd-tmpfiles[1279]: Skipping /boot May 13 23:52:21.895927 systemd[1]: Reload requested from client PID 1277 ('systemctl') (unit ensure-sysext.service)... May 13 23:52:21.895947 systemd[1]: Reloading... May 13 23:52:21.908572 systemd-tmpfiles[1279]: Detected autofs mount point /boot during canonicalization of boot. May 13 23:52:21.908591 systemd-tmpfiles[1279]: Skipping /boot May 13 23:52:21.954428 zram_generator::config[1310]: No configuration found. May 13 23:52:22.083199 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 13 23:52:22.153042 systemd[1]: Reloading finished in 256 ms. May 13 23:52:22.169232 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. May 13 23:52:22.193057 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 13 23:52:22.205569 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 13 23:52:22.209933 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... May 13 23:52:22.213790 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... May 13 23:52:22.228726 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... 
May 13 23:52:22.233316 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 13 23:52:22.238005 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... May 13 23:52:22.243446 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 13 23:52:22.243677 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 13 23:52:22.248715 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 13 23:52:22.256993 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 13 23:52:22.267970 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 13 23:52:22.269370 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 13 23:52:22.269535 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 13 23:52:22.272214 systemd[1]: Starting systemd-userdbd.service - User Database Manager... May 13 23:52:22.273687 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 13 23:52:22.277730 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. May 13 23:52:22.281653 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 13 23:52:22.281977 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 13 23:52:22.282757 systemd-udevd[1353]: Using default interface naming scheme 'v255'. May 13 23:52:22.286301 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. 
May 13 23:52:22.287454 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 13 23:52:22.309565 systemd[1]: modprobe@loop.service: Deactivated successfully. May 13 23:52:22.309878 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 13 23:52:22.320949 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 13 23:52:22.321539 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 13 23:52:22.323039 augenrules[1382]: No rules May 13 23:52:22.324364 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 13 23:52:22.327865 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 13 23:52:22.337105 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 13 23:52:22.359854 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 13 23:52:22.360142 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 13 23:52:22.365103 systemd[1]: Starting systemd-update-done.service - Update is Completed... May 13 23:52:22.366552 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 13 23:52:22.368086 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 13 23:52:22.370353 systemd[1]: audit-rules.service: Deactivated successfully. May 13 23:52:22.371035 systemd[1]: Finished audit-rules.service - Load Audit Rules. 
May 13 23:52:22.378026 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. May 13 23:52:22.399326 systemd[1]: Started systemd-userdbd.service - User Database Manager. May 13 23:52:22.404427 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 13 23:52:22.404848 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 13 23:52:22.408288 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 13 23:52:22.408731 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 13 23:52:22.412380 systemd[1]: modprobe@loop.service: Deactivated successfully. May 13 23:52:22.413500 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 13 23:52:22.417487 systemd[1]: Finished systemd-update-done.service - Update is Completed. May 13 23:52:22.435379 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. May 13 23:52:22.450544 systemd[1]: Finished ensure-sysext.service. May 13 23:52:22.460588 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. May 13 23:52:22.461348 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 13 23:52:22.463712 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 13 23:52:22.464978 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 13 23:52:22.466529 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 13 23:52:22.473594 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... May 13 23:52:22.480424 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (1391) May 13 23:52:22.484712 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... 
May 13 23:52:22.489289 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 13 23:52:22.491042 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 13 23:52:22.491099 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 13 23:52:22.494286 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 13 23:52:22.501595 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... May 13 23:52:22.503035 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). May 13 23:52:22.503077 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 13 23:52:22.503891 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 13 23:52:22.504153 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 13 23:52:22.507360 systemd[1]: modprobe@drm.service: Deactivated successfully. May 13 23:52:22.507676 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 13 23:52:22.509390 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 13 23:52:22.509956 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 13 23:52:22.522834 augenrules[1428]: /sbin/augenrules: No change May 13 23:52:22.523902 systemd[1]: modprobe@loop.service: Deactivated successfully. May 13 23:52:22.524175 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. 
May 13 23:52:22.545421 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input2 May 13 23:52:22.544211 systemd[1]: audit-rules.service: Deactivated successfully. May 13 23:52:22.545599 augenrules[1456]: No rules May 13 23:52:22.547500 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 13 23:52:22.554437 kernel: ACPI: button: Power Button [PWRF] May 13 23:52:22.562772 systemd-resolved[1352]: Positive Trust Anchors: May 13 23:52:22.562790 systemd-resolved[1352]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 13 23:52:22.562823 systemd-resolved[1352]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 13 23:52:22.571106 systemd-resolved[1352]: Defaulting to hostname 'linux'. May 13 23:52:22.573892 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. May 13 23:52:22.591963 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 13 23:52:22.597123 kernel: i801_smbus 0000:00:1f.3: Enabling SMBus device May 13 23:52:22.597462 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt May 13 23:52:22.597655 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI) May 13 23:52:22.599955 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD May 13 23:52:22.594254 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. 
May 13 23:52:22.602621 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... May 13 23:52:22.604541 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 13 23:52:22.604634 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 13 23:52:22.617468 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3 May 13 23:52:22.647528 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. May 13 23:52:22.685313 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. May 13 23:52:22.687761 systemd[1]: Reached target time-set.target - System Time Set. May 13 23:52:22.701788 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 13 23:52:22.706788 systemd-networkd[1435]: lo: Link UP May 13 23:52:22.706802 systemd-networkd[1435]: lo: Gained carrier May 13 23:52:22.708035 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 13 23:52:22.708356 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 13 23:52:22.711659 systemd-networkd[1435]: Enumeration completed May 13 23:52:22.712163 systemd-networkd[1435]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 13 23:52:22.712174 systemd-networkd[1435]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 13 23:52:22.712634 systemd[1]: Started systemd-networkd.service - Network Configuration. 
May 13 23:52:22.716023 systemd-networkd[1435]: eth0: Link UP May 13 23:52:22.716048 systemd-networkd[1435]: eth0: Gained carrier May 13 23:52:22.716086 systemd-networkd[1435]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 13 23:52:22.724904 systemd[1]: Reached target network.target - Network. May 13 23:52:22.731544 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... May 13 23:52:22.733461 systemd-networkd[1435]: eth0: DHCPv4 address 10.0.0.42/16, gateway 10.0.0.1 acquired from 10.0.0.1 May 13 23:52:22.735090 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... May 13 23:52:22.735373 systemd-timesyncd[1441]: Network configuration changed, trying to establish connection. May 13 23:52:23.725927 systemd-timesyncd[1441]: Contacted time server 10.0.0.1:123 (10.0.0.1). May 13 23:52:23.725994 systemd-timesyncd[1441]: Initial clock synchronization to Tue 2025-05-13 23:52:23.725767 UTC. May 13 23:52:23.726139 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 13 23:52:23.726294 systemd-resolved[1352]: Clock change detected. Flushing caches. May 13 23:52:23.740997 kernel: mousedev: PS/2 mouse device common for all mice May 13 23:52:23.756411 kernel: kvm_amd: TSC scaling supported May 13 23:52:23.756494 kernel: kvm_amd: Nested Virtualization enabled May 13 23:52:23.756513 kernel: kvm_amd: Nested Paging enabled May 13 23:52:23.756527 kernel: kvm_amd: LBR virtualization supported May 13 23:52:23.757521 kernel: kvm_amd: Virtual VMLOAD VMSAVE supported May 13 23:52:23.758223 kernel: kvm_amd: Virtual GIF supported May 13 23:52:23.759730 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. 
May 13 23:52:23.788973 kernel: EDAC MC: Ver: 3.0.0 May 13 23:52:23.823418 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 13 23:52:23.837595 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. May 13 23:52:23.842358 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... May 13 23:52:23.872751 lvm[1486]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. May 13 23:52:23.909698 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. May 13 23:52:23.911601 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. May 13 23:52:23.912891 systemd[1]: Reached target sysinit.target - System Initialization. May 13 23:52:23.914317 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. May 13 23:52:23.915695 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. May 13 23:52:23.917486 systemd[1]: Started logrotate.timer - Daily rotation of log files. May 13 23:52:23.918758 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. May 13 23:52:23.920066 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. May 13 23:52:23.921430 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). May 13 23:52:23.921473 systemd[1]: Reached target paths.target - Path Units. May 13 23:52:23.922600 systemd[1]: Reached target timers.target - Timer Units. May 13 23:52:23.924788 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. May 13 23:52:23.928105 systemd[1]: Starting docker.socket - Docker Socket for the API... 
May 13 23:52:23.932229 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). May 13 23:52:23.934021 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). May 13 23:52:23.935571 systemd[1]: Reached target ssh-access.target - SSH Access Available. May 13 23:52:23.939467 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. May 13 23:52:23.941112 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. May 13 23:52:23.944853 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... May 13 23:52:23.947200 systemd[1]: Listening on docker.socket - Docker Socket for the API. May 13 23:52:23.948659 systemd[1]: Reached target sockets.target - Socket Units. May 13 23:52:23.949850 systemd[1]: Reached target basic.target - Basic System. May 13 23:52:23.950955 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. May 13 23:52:23.950994 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. May 13 23:52:23.960619 systemd[1]: Starting containerd.service - containerd container runtime... May 13 23:52:23.963728 systemd[1]: Starting dbus.service - D-Bus System Message Bus... May 13 23:52:23.966065 lvm[1490]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. May 13 23:52:23.966578 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... May 13 23:52:23.972177 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... May 13 23:52:23.973599 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). May 13 23:52:23.974962 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... 
May 13 23:52:23.979615 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... May 13 23:52:23.982334 jq[1493]: false May 13 23:52:24.004059 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... May 13 23:52:24.008859 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... May 13 23:52:24.020955 dbus-daemon[1492]: [system] SELinux support is enabled May 13 23:52:24.023085 systemd[1]: Starting systemd-logind.service - User Login Management... May 13 23:52:24.029208 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). May 13 23:52:24.030818 extend-filesystems[1494]: Found loop3 May 13 23:52:24.030818 extend-filesystems[1494]: Found loop4 May 13 23:52:24.030818 extend-filesystems[1494]: Found loop5 May 13 23:52:24.030818 extend-filesystems[1494]: Found sr0 May 13 23:52:24.030818 extend-filesystems[1494]: Found vda May 13 23:52:24.030818 extend-filesystems[1494]: Found vda1 May 13 23:52:24.030818 extend-filesystems[1494]: Found vda2 May 13 23:52:24.030818 extend-filesystems[1494]: Found vda3 May 13 23:52:24.030818 extend-filesystems[1494]: Found usr May 13 23:52:24.030818 extend-filesystems[1494]: Found vda4 May 13 23:52:24.030818 extend-filesystems[1494]: Found vda6 May 13 23:52:24.030818 extend-filesystems[1494]: Found vda7 May 13 23:52:24.030818 extend-filesystems[1494]: Found vda9 May 13 23:52:24.030818 extend-filesystems[1494]: Checking size of /dev/vda9 May 13 23:52:24.030083 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. May 13 23:52:24.035240 systemd[1]: Starting update-engine.service - Update Engine... May 13 23:52:24.049009 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... 
May 13 23:52:24.053256 extend-filesystems[1494]: Resized partition /dev/vda9 May 13 23:52:24.061184 systemd[1]: Started dbus.service - D-Bus System Message Bus. May 13 23:52:24.069093 jq[1512]: true May 13 23:52:24.073455 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (1401) May 13 23:52:24.070696 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. May 13 23:52:24.076108 extend-filesystems[1515]: resize2fs 1.47.2 (1-Jan-2025) May 13 23:52:24.076572 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. May 13 23:52:24.076961 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. May 13 23:52:24.077427 systemd[1]: motdgen.service: Deactivated successfully. May 13 23:52:24.077764 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. May 13 23:52:24.082695 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. May 13 23:52:24.085076 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. May 13 23:52:24.090061 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks May 13 23:52:24.091097 update_engine[1508]: I20250513 23:52:24.090988 1508 main.cc:92] Flatcar Update Engine starting May 13 23:52:24.095200 update_engine[1508]: I20250513 23:52:24.095134 1508 update_check_scheduler.cc:74] Next update check in 10m48s May 13 23:52:24.105648 jq[1518]: true May 13 23:52:24.120489 (ntainerd)[1526]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR May 13 23:52:24.145248 systemd[1]: Started update-engine.service - Update Engine. May 13 23:52:24.147031 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). 
May 13 23:52:24.147074 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. May 13 23:52:24.148922 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). May 13 23:52:24.148957 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. May 13 23:52:24.154793 systemd[1]: Started locksmithd.service - Cluster reboot manager. May 13 23:52:24.197785 systemd-logind[1504]: Watching system buttons on /dev/input/event1 (Power Button) May 13 23:52:24.197822 systemd-logind[1504]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) May 13 23:52:24.200650 tar[1517]: linux-amd64/helm May 13 23:52:24.201088 locksmithd[1545]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" May 13 23:52:24.202278 systemd-logind[1504]: New seat seat0. May 13 23:52:24.204339 systemd[1]: Started systemd-logind.service - User Login Management. May 13 23:52:24.246584 sshd_keygen[1511]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 May 13 23:52:24.279727 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. May 13 23:52:24.283906 systemd[1]: Starting issuegen.service - Generate /run/issue... May 13 23:52:24.305278 systemd[1]: issuegen.service: Deactivated successfully. May 13 23:52:24.305606 systemd[1]: Finished issuegen.service - Generate /run/issue. May 13 23:52:24.329553 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... May 13 23:52:24.407920 kernel: EXT4-fs (vda9): resized filesystem to 1864699 May 13 23:52:24.416236 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. May 13 23:52:24.420588 systemd[1]: Started getty@tty1.service - Getty on tty1. May 13 23:52:24.424107 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. 
May 13 23:52:24.426866 systemd[1]: Reached target getty.target - Login Prompts. May 13 23:52:24.533495 extend-filesystems[1515]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required May 13 23:52:24.533495 extend-filesystems[1515]: old_desc_blocks = 1, new_desc_blocks = 1 May 13 23:52:24.533495 extend-filesystems[1515]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long. May 13 23:52:24.540781 extend-filesystems[1494]: Resized filesystem in /dev/vda9 May 13 23:52:24.536042 systemd[1]: extend-filesystems.service: Deactivated successfully. May 13 23:52:24.536421 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. May 13 23:52:24.547480 bash[1544]: Updated "/home/core/.ssh/authorized_keys" May 13 23:52:24.550238 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. May 13 23:52:24.553054 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. May 13 23:52:24.603266 containerd[1526]: time="2025-05-13T23:52:24Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 May 13 23:52:24.605072 containerd[1526]: time="2025-05-13T23:52:24.605027424Z" level=info msg="starting containerd" revision=88aa2f531d6c2922003cc7929e51daf1c14caa0a version=v2.0.1 May 13 23:52:24.616609 containerd[1526]: time="2025-05-13T23:52:24.616553714Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="8.737µs" May 13 23:52:24.616609 containerd[1526]: time="2025-05-13T23:52:24.616600362Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 May 13 23:52:24.616729 containerd[1526]: time="2025-05-13T23:52:24.616630388Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 May 13 
23:52:24.617468 containerd[1526]: time="2025-05-13T23:52:24.616927054Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 May 13 23:52:24.617468 containerd[1526]: time="2025-05-13T23:52:24.617123513Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 May 13 23:52:24.617468 containerd[1526]: time="2025-05-13T23:52:24.617182023Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 13 23:52:24.617468 containerd[1526]: time="2025-05-13T23:52:24.617279796Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 13 23:52:24.617468 containerd[1526]: time="2025-05-13T23:52:24.617296257Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 13 23:52:24.617788 containerd[1526]: time="2025-05-13T23:52:24.617689905Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 13 23:52:24.617788 containerd[1526]: time="2025-05-13T23:52:24.617723609Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 13 23:52:24.617788 containerd[1526]: time="2025-05-13T23:52:24.617740731Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 13 23:52:24.617788 containerd[1526]: time="2025-05-13T23:52:24.617753685Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 May 13 23:52:24.617943 containerd[1526]: time="2025-05-13T23:52:24.617923203Z" 
level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 May 13 23:52:24.618391 containerd[1526]: time="2025-05-13T23:52:24.618348411Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 13 23:52:24.618440 containerd[1526]: time="2025-05-13T23:52:24.618392213Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 13 23:52:24.618440 containerd[1526]: time="2025-05-13T23:52:24.618409305Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 May 13 23:52:24.618507 containerd[1526]: time="2025-05-13T23:52:24.618460751Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 May 13 23:52:24.618815 containerd[1526]: time="2025-05-13T23:52:24.618758129Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 May 13 23:52:24.618971 containerd[1526]: time="2025-05-13T23:52:24.618932075Z" level=info msg="metadata content store policy set" policy=shared May 13 23:52:24.699372 tar[1517]: linux-amd64/LICENSE May 13 23:52:24.699508 tar[1517]: linux-amd64/README.md May 13 23:52:24.722793 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. 
May 13 23:52:24.935534 containerd[1526]: time="2025-05-13T23:52:24.935366726Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 May 13 23:52:24.935534 containerd[1526]: time="2025-05-13T23:52:24.935475290Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 May 13 23:52:24.935534 containerd[1526]: time="2025-05-13T23:52:24.935498523Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 May 13 23:52:24.935534 containerd[1526]: time="2025-05-13T23:52:24.935516197Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 May 13 23:52:24.935534 containerd[1526]: time="2025-05-13T23:52:24.935534100Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 May 13 23:52:24.935724 containerd[1526]: time="2025-05-13T23:52:24.935549990Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 May 13 23:52:24.935724 containerd[1526]: time="2025-05-13T23:52:24.935579535Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 May 13 23:52:24.935724 containerd[1526]: time="2025-05-13T23:52:24.935598130Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 May 13 23:52:24.935724 containerd[1526]: time="2025-05-13T23:52:24.935614220Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 May 13 23:52:24.935724 containerd[1526]: time="2025-05-13T23:52:24.935630381Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 May 13 23:52:24.935724 containerd[1526]: time="2025-05-13T23:52:24.935646251Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 May 13 
23:52:24.935724 containerd[1526]: time="2025-05-13T23:52:24.935663112Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 May 13 23:52:24.935992 containerd[1526]: time="2025-05-13T23:52:24.935945201Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 May 13 23:52:24.936029 containerd[1526]: time="2025-05-13T23:52:24.935993001Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 May 13 23:52:24.936029 containerd[1526]: time="2025-05-13T23:52:24.936010203Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 May 13 23:52:24.936029 containerd[1526]: time="2025-05-13T23:52:24.936024781Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 May 13 23:52:24.936102 containerd[1526]: time="2025-05-13T23:52:24.936040029Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 May 13 23:52:24.936102 containerd[1526]: time="2025-05-13T23:52:24.936055759Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 May 13 23:52:24.936102 containerd[1526]: time="2025-05-13T23:52:24.936073342Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 May 13 23:52:24.936102 containerd[1526]: time="2025-05-13T23:52:24.936087298Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 May 13 23:52:24.936233 containerd[1526]: time="2025-05-13T23:52:24.936101925Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 May 13 23:52:24.936233 containerd[1526]: time="2025-05-13T23:52:24.936116883Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 May 13 23:52:24.936233 containerd[1526]: time="2025-05-13T23:52:24.936144365Z" 
level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 May 13 23:52:24.936319 containerd[1526]: time="2025-05-13T23:52:24.936239614Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" May 13 23:52:24.936319 containerd[1526]: time="2025-05-13T23:52:24.936260112Z" level=info msg="Start snapshots syncer" May 13 23:52:24.936319 containerd[1526]: time="2025-05-13T23:52:24.936298083Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 May 13 23:52:24.936718 containerd[1526]: time="2025-05-13T23:52:24.936652969Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\"
:true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" May 13 23:52:24.936849 containerd[1526]: time="2025-05-13T23:52:24.936728591Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 May 13 23:52:24.936849 containerd[1526]: time="2025-05-13T23:52:24.936814071Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 May 13 23:52:24.937033 containerd[1526]: time="2025-05-13T23:52:24.936990021Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 May 13 23:52:24.937033 containerd[1526]: time="2025-05-13T23:52:24.937025287Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 May 13 23:52:24.937033 containerd[1526]: time="2025-05-13T23:52:24.937041818Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 May 13 23:52:24.937033 containerd[1526]: time="2025-05-13T23:52:24.937055103Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 May 13 23:52:24.937298 containerd[1526]: time="2025-05-13T23:52:24.937071263Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 May 13 23:52:24.937298 containerd[1526]: time="2025-05-13T23:52:24.937087253Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 May 13 23:52:24.937298 containerd[1526]: time="2025-05-13T23:52:24.937102402Z" level=info msg="loading plugin" 
id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 May 13 23:52:24.937298 containerd[1526]: time="2025-05-13T23:52:24.937178184Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 May 13 23:52:24.937298 containerd[1526]: time="2025-05-13T23:52:24.937202149Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 May 13 23:52:24.937298 containerd[1526]: time="2025-05-13T23:52:24.937215484Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 May 13 23:52:24.937298 containerd[1526]: time="2025-05-13T23:52:24.937283692Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 13 23:52:24.937477 containerd[1526]: time="2025-05-13T23:52:24.937305042Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 13 23:52:24.937477 containerd[1526]: time="2025-05-13T23:52:24.937320150Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 13 23:52:24.937477 containerd[1526]: time="2025-05-13T23:52:24.937335068Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 13 23:52:24.937477 containerd[1526]: time="2025-05-13T23:52:24.937346870Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 May 13 23:52:24.937477 containerd[1526]: time="2025-05-13T23:52:24.937360296Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 May 13 23:52:24.937477 containerd[1526]: time="2025-05-13T23:52:24.937374372Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 May 13 
23:52:24.937477 containerd[1526]: time="2025-05-13T23:52:24.937397796Z" level=info msg="runtime interface created" May 13 23:52:24.937477 containerd[1526]: time="2025-05-13T23:52:24.937406202Z" level=info msg="created NRI interface" May 13 23:52:24.937477 containerd[1526]: time="2025-05-13T23:52:24.937418114Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 May 13 23:52:24.937477 containerd[1526]: time="2025-05-13T23:52:24.937434425Z" level=info msg="Connect containerd service" May 13 23:52:24.937477 containerd[1526]: time="2025-05-13T23:52:24.937463579Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" May 13 23:52:24.939402 containerd[1526]: time="2025-05-13T23:52:24.939359395Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 13 23:52:25.042314 containerd[1526]: time="2025-05-13T23:52:25.042249428Z" level=info msg="Start subscribing containerd event" May 13 23:52:25.042446 containerd[1526]: time="2025-05-13T23:52:25.042328346Z" level=info msg="Start recovering state" May 13 23:52:25.042540 containerd[1526]: time="2025-05-13T23:52:25.042482234Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc May 13 23:52:25.042573 containerd[1526]: time="2025-05-13T23:52:25.042488727Z" level=info msg="Start event monitor" May 13 23:52:25.042598 containerd[1526]: time="2025-05-13T23:52:25.042575089Z" level=info msg="Start cni network conf syncer for default" May 13 23:52:25.042598 containerd[1526]: time="2025-05-13T23:52:25.042583785Z" level=info msg=serving... 
address=/run/containerd/containerd.sock May 13 23:52:25.042681 containerd[1526]: time="2025-05-13T23:52:25.042588073Z" level=info msg="Start streaming server" May 13 23:52:25.042681 containerd[1526]: time="2025-05-13T23:52:25.042643717Z" level=info msg="Registered namespace \"k8s.io\" with NRI" May 13 23:52:25.042681 containerd[1526]: time="2025-05-13T23:52:25.042653526Z" level=info msg="runtime interface starting up..." May 13 23:52:25.042681 containerd[1526]: time="2025-05-13T23:52:25.042660579Z" level=info msg="starting plugins..." May 13 23:52:25.042764 containerd[1526]: time="2025-05-13T23:52:25.042691677Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" May 13 23:52:25.042909 containerd[1526]: time="2025-05-13T23:52:25.042866225Z" level=info msg="containerd successfully booted in 0.440325s" May 13 23:52:25.043014 systemd[1]: Started containerd.service - containerd container runtime. May 13 23:52:25.044841 systemd-networkd[1435]: eth0: Gained IPv6LL May 13 23:52:25.049438 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. May 13 23:52:25.051683 systemd[1]: Reached target network-online.target - Network is Online. May 13 23:52:25.055178 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent... May 13 23:52:25.058693 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:52:25.067825 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... May 13 23:52:25.091594 systemd[1]: coreos-metadata.service: Deactivated successfully. May 13 23:52:25.091918 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent. May 13 23:52:25.093633 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. May 13 23:52:25.100852 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. May 13 23:52:25.898942 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 13 23:52:26.106066 systemd[1]: Reached target multi-user.target - Multi-User System. May 13 23:52:26.106489 (kubelet)[1619]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 23:52:26.107961 systemd[1]: Startup finished in 1.474s (kernel) + 7.782s (initrd) + 5.580s (userspace) = 14.837s. May 13 23:52:26.520262 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. May 13 23:52:26.522206 systemd[1]: Started sshd@0-10.0.0.42:22-10.0.0.1:33696.service - OpenSSH per-connection server daemon (10.0.0.1:33696). May 13 23:52:26.613665 sshd[1631]: Accepted publickey for core from 10.0.0.1 port 33696 ssh2: RSA SHA256:UIO2GBLwcS3ioFABoZ1D2izRTMNLszi+/rE/G21mFYQ May 13 23:52:26.616206 sshd-session[1631]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:52:26.629741 systemd-logind[1504]: New session 1 of user core. May 13 23:52:26.631559 systemd[1]: Created slice user-500.slice - User Slice of UID 500. May 13 23:52:26.633112 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... May 13 23:52:26.664965 kubelet[1619]: E0513 23:52:26.664757 1619 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 23:52:26.668815 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 23:52:26.669055 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 23:52:26.669435 systemd[1]: kubelet.service: Consumed 1.237s CPU time, 238.5M memory peak. May 13 23:52:26.670001 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. 
May 13 23:52:26.674369 systemd[1]: Starting user@500.service - User Manager for UID 500... May 13 23:52:26.698281 (systemd)[1636]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) May 13 23:52:26.701645 systemd-logind[1504]: New session c1 of user core. May 13 23:52:26.867595 systemd[1636]: Queued start job for default target default.target. May 13 23:52:26.878520 systemd[1636]: Created slice app.slice - User Application Slice. May 13 23:52:26.878554 systemd[1636]: Reached target paths.target - Paths. May 13 23:52:26.878612 systemd[1636]: Reached target timers.target - Timers. May 13 23:52:26.880308 systemd[1636]: Starting dbus.socket - D-Bus User Message Bus Socket... May 13 23:52:26.893832 systemd[1636]: Listening on dbus.socket - D-Bus User Message Bus Socket. May 13 23:52:26.894023 systemd[1636]: Reached target sockets.target - Sockets. May 13 23:52:26.894096 systemd[1636]: Reached target basic.target - Basic System. May 13 23:52:26.894200 systemd[1636]: Reached target default.target - Main User Target. May 13 23:52:26.894242 systemd[1636]: Startup finished in 184ms. May 13 23:52:26.894660 systemd[1]: Started user@500.service - User Manager for UID 500. May 13 23:52:26.896727 systemd[1]: Started session-1.scope - Session 1 of User core. May 13 23:52:26.964277 systemd[1]: Started sshd@1-10.0.0.42:22-10.0.0.1:33708.service - OpenSSH per-connection server daemon (10.0.0.1:33708). May 13 23:52:27.027402 sshd[1647]: Accepted publickey for core from 10.0.0.1 port 33708 ssh2: RSA SHA256:UIO2GBLwcS3ioFABoZ1D2izRTMNLszi+/rE/G21mFYQ May 13 23:52:27.029191 sshd-session[1647]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:52:27.033825 systemd-logind[1504]: New session 2 of user core. May 13 23:52:27.046336 systemd[1]: Started session-2.scope - Session 2 of User core. 
May 13 23:52:27.102455 sshd[1649]: Connection closed by 10.0.0.1 port 33708 May 13 23:52:27.102792 sshd-session[1647]: pam_unix(sshd:session): session closed for user core May 13 23:52:27.128182 systemd[1]: sshd@1-10.0.0.42:22-10.0.0.1:33708.service: Deactivated successfully. May 13 23:52:27.131117 systemd[1]: session-2.scope: Deactivated successfully. May 13 23:52:27.133693 systemd-logind[1504]: Session 2 logged out. Waiting for processes to exit. May 13 23:52:27.135812 systemd[1]: Started sshd@2-10.0.0.42:22-10.0.0.1:33724.service - OpenSSH per-connection server daemon (10.0.0.1:33724). May 13 23:52:27.136792 systemd-logind[1504]: Removed session 2. May 13 23:52:27.199838 sshd[1654]: Accepted publickey for core from 10.0.0.1 port 33724 ssh2: RSA SHA256:UIO2GBLwcS3ioFABoZ1D2izRTMNLszi+/rE/G21mFYQ May 13 23:52:27.201957 sshd-session[1654]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:52:27.207950 systemd-logind[1504]: New session 3 of user core. May 13 23:52:27.217121 systemd[1]: Started session-3.scope - Session 3 of User core. May 13 23:52:27.270915 sshd[1657]: Connection closed by 10.0.0.1 port 33724 May 13 23:52:27.271347 sshd-session[1654]: pam_unix(sshd:session): session closed for user core May 13 23:52:27.280211 systemd[1]: sshd@2-10.0.0.42:22-10.0.0.1:33724.service: Deactivated successfully. May 13 23:52:27.282207 systemd[1]: session-3.scope: Deactivated successfully. May 13 23:52:27.284347 systemd-logind[1504]: Session 3 logged out. Waiting for processes to exit. May 13 23:52:27.286012 systemd[1]: Started sshd@3-10.0.0.42:22-10.0.0.1:33728.service - OpenSSH per-connection server daemon (10.0.0.1:33728). May 13 23:52:27.287310 systemd-logind[1504]: Removed session 3. 
May 13 23:52:27.341461 sshd[1662]: Accepted publickey for core from 10.0.0.1 port 33728 ssh2: RSA SHA256:UIO2GBLwcS3ioFABoZ1D2izRTMNLszi+/rE/G21mFYQ May 13 23:52:27.343383 sshd-session[1662]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:52:27.350151 systemd-logind[1504]: New session 4 of user core. May 13 23:52:27.365308 systemd[1]: Started session-4.scope - Session 4 of User core. May 13 23:52:27.423472 sshd[1665]: Connection closed by 10.0.0.1 port 33728 May 13 23:52:27.423732 sshd-session[1662]: pam_unix(sshd:session): session closed for user core May 13 23:52:27.441970 systemd[1]: sshd@3-10.0.0.42:22-10.0.0.1:33728.service: Deactivated successfully. May 13 23:52:27.444003 systemd[1]: session-4.scope: Deactivated successfully. May 13 23:52:27.446074 systemd-logind[1504]: Session 4 logged out. Waiting for processes to exit. May 13 23:52:27.447490 systemd[1]: Started sshd@4-10.0.0.42:22-10.0.0.1:33736.service - OpenSSH per-connection server daemon (10.0.0.1:33736). May 13 23:52:27.448516 systemd-logind[1504]: Removed session 4. May 13 23:52:27.517971 sshd[1670]: Accepted publickey for core from 10.0.0.1 port 33736 ssh2: RSA SHA256:UIO2GBLwcS3ioFABoZ1D2izRTMNLszi+/rE/G21mFYQ May 13 23:52:27.519995 sshd-session[1670]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:52:27.525756 systemd-logind[1504]: New session 5 of user core. May 13 23:52:27.536134 systemd[1]: Started session-5.scope - Session 5 of User core. 
May 13 23:52:27.596968 sudo[1674]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 May 13 23:52:27.597415 sudo[1674]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 13 23:52:27.618137 sudo[1674]: pam_unix(sudo:session): session closed for user root May 13 23:52:27.620246 sshd[1673]: Connection closed by 10.0.0.1 port 33736 May 13 23:52:27.621080 sshd-session[1670]: pam_unix(sshd:session): session closed for user core May 13 23:52:27.634403 systemd[1]: sshd@4-10.0.0.42:22-10.0.0.1:33736.service: Deactivated successfully. May 13 23:52:27.637286 systemd[1]: session-5.scope: Deactivated successfully. May 13 23:52:27.639850 systemd-logind[1504]: Session 5 logged out. Waiting for processes to exit. May 13 23:52:27.642020 systemd[1]: Started sshd@5-10.0.0.42:22-10.0.0.1:33750.service - OpenSSH per-connection server daemon (10.0.0.1:33750). May 13 23:52:27.643228 systemd-logind[1504]: Removed session 5. May 13 23:52:27.698800 sshd[1679]: Accepted publickey for core from 10.0.0.1 port 33750 ssh2: RSA SHA256:UIO2GBLwcS3ioFABoZ1D2izRTMNLszi+/rE/G21mFYQ May 13 23:52:27.700960 sshd-session[1679]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:52:27.706932 systemd-logind[1504]: New session 6 of user core. May 13 23:52:27.716120 systemd[1]: Started session-6.scope - Session 6 of User core. 
May 13 23:52:27.772530 sudo[1684]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules May 13 23:52:27.772908 sudo[1684]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 13 23:52:27.777607 sudo[1684]: pam_unix(sudo:session): session closed for user root May 13 23:52:27.785349 sudo[1683]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules May 13 23:52:27.785699 sudo[1683]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 13 23:52:27.800013 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 13 23:52:27.856556 augenrules[1706]: No rules May 13 23:52:27.857722 systemd[1]: audit-rules.service: Deactivated successfully. May 13 23:52:27.858117 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 13 23:52:27.859656 sudo[1683]: pam_unix(sudo:session): session closed for user root May 13 23:52:27.861926 sshd[1682]: Connection closed by 10.0.0.1 port 33750 May 13 23:52:27.862397 sshd-session[1679]: pam_unix(sshd:session): session closed for user core May 13 23:52:27.877930 systemd[1]: sshd@5-10.0.0.42:22-10.0.0.1:33750.service: Deactivated successfully. May 13 23:52:27.880609 systemd[1]: session-6.scope: Deactivated successfully. May 13 23:52:27.882853 systemd-logind[1504]: Session 6 logged out. Waiting for processes to exit. May 13 23:52:27.885177 systemd[1]: Started sshd@6-10.0.0.42:22-10.0.0.1:33752.service - OpenSSH per-connection server daemon (10.0.0.1:33752). May 13 23:52:27.886720 systemd-logind[1504]: Removed session 6. May 13 23:52:27.942356 sshd[1714]: Accepted publickey for core from 10.0.0.1 port 33752 ssh2: RSA SHA256:UIO2GBLwcS3ioFABoZ1D2izRTMNLszi+/rE/G21mFYQ May 13 23:52:27.944404 sshd-session[1714]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:52:27.950004 systemd-logind[1504]: New session 7 of user core. 
May 13 23:52:27.957425 systemd[1]: Started session-7.scope - Session 7 of User core. May 13 23:52:28.014346 sudo[1718]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh May 13 23:52:28.014723 sudo[1718]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 13 23:52:28.390696 systemd[1]: Starting docker.service - Docker Application Container Engine... May 13 23:52:28.403678 (dockerd)[1738]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU May 13 23:52:28.717542 dockerd[1738]: time="2025-05-13T23:52:28.717348423Z" level=info msg="Starting up" May 13 23:52:28.718859 dockerd[1738]: time="2025-05-13T23:52:28.718817488Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" May 13 23:52:29.740900 dockerd[1738]: time="2025-05-13T23:52:29.740789275Z" level=info msg="Loading containers: start." May 13 23:52:29.986450 kernel: Initializing XFRM netlink socket May 13 23:52:30.080753 systemd-networkd[1435]: docker0: Link UP May 13 23:52:30.582612 dockerd[1738]: time="2025-05-13T23:52:30.582542915Z" level=info msg="Loading containers: done." 
May 13 23:52:30.615551 dockerd[1738]: time="2025-05-13T23:52:30.615471339Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 May 13 23:52:30.615798 dockerd[1738]: time="2025-05-13T23:52:30.615599219Z" level=info msg="Docker daemon" commit=c710b88579fcb5e0d53f96dcae976d79323b9166 containerd-snapshotter=false storage-driver=overlay2 version=27.4.1 May 13 23:52:30.615798 dockerd[1738]: time="2025-05-13T23:52:30.615780659Z" level=info msg="Daemon has completed initialization" May 13 23:52:30.685660 dockerd[1738]: time="2025-05-13T23:52:30.685547362Z" level=info msg="API listen on /run/docker.sock" May 13 23:52:30.685810 systemd[1]: Started docker.service - Docker Application Container Engine. May 13 23:52:31.703653 containerd[1526]: time="2025-05-13T23:52:31.703604035Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.8\"" May 13 23:52:33.022445 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3765358517.mount: Deactivated successfully. 
May 13 23:52:35.390131 containerd[1526]: time="2025-05-13T23:52:35.390044363Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:52:35.406812 containerd[1526]: time="2025-05-13T23:52:35.406692210Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.8: active requests=0, bytes read=27960987" May 13 23:52:35.429081 containerd[1526]: time="2025-05-13T23:52:35.429033826Z" level=info msg="ImageCreate event name:\"sha256:e6d208e868a9ca7f89efcb0d5bddc55a62df551cb4fb39c5099a2fe7b0e33adc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:52:35.484372 containerd[1526]: time="2025-05-13T23:52:35.484296353Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:30090db6a7d53799163ce82dae9e8ddb645fd47db93f2ec9da0cc787fd825625\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:52:35.485289 containerd[1526]: time="2025-05-13T23:52:35.485250342Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.8\" with image id \"sha256:e6d208e868a9ca7f89efcb0d5bddc55a62df551cb4fb39c5099a2fe7b0e33adc\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.8\", repo digest \"registry.k8s.io/kube-apiserver@sha256:30090db6a7d53799163ce82dae9e8ddb645fd47db93f2ec9da0cc787fd825625\", size \"27957787\" in 3.781604398s" May 13 23:52:35.485289 containerd[1526]: time="2025-05-13T23:52:35.485283695Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.8\" returns image reference \"sha256:e6d208e868a9ca7f89efcb0d5bddc55a62df551cb4fb39c5099a2fe7b0e33adc\"" May 13 23:52:35.486978 containerd[1526]: time="2025-05-13T23:52:35.486951804Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.8\"" May 13 23:52:36.869643 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. 
May 13 23:52:36.872319 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 23:52:37.470560 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 23:52:37.488276 (kubelet)[2008]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 13 23:52:37.605649 kubelet[2008]: E0513 23:52:37.605545 2008 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 13 23:52:37.612295 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 13 23:52:37.612502 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 13 23:52:37.612937 systemd[1]: kubelet.service: Consumed 324ms CPU time, 98.5M memory peak.
May 13 23:52:39.068554 containerd[1526]: time="2025-05-13T23:52:39.068459348Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:52:39.200044 containerd[1526]: time="2025-05-13T23:52:39.199916524Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.8: active requests=0, bytes read=24713776"
May 13 23:52:39.237250 containerd[1526]: time="2025-05-13T23:52:39.237160624Z" level=info msg="ImageCreate event name:\"sha256:fbda0bc3bc4bb93c8b2d8627a9aa8d945c200b51e48c88f9b837dde628fc7c8f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:52:39.300068 containerd[1526]: time="2025-05-13T23:52:39.299986427Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:29eaddc64792a689df48506e78bbc641d063ac8bb92d2e66ae2ad05977420747\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:52:39.302534 containerd[1526]: time="2025-05-13T23:52:39.302388462Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.8\" with image id \"sha256:fbda0bc3bc4bb93c8b2d8627a9aa8d945c200b51e48c88f9b837dde628fc7c8f\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.8\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:29eaddc64792a689df48506e78bbc641d063ac8bb92d2e66ae2ad05977420747\", size \"26202149\" in 3.815400871s"
May 13 23:52:39.302534 containerd[1526]: time="2025-05-13T23:52:39.302451370Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.8\" returns image reference \"sha256:fbda0bc3bc4bb93c8b2d8627a9aa8d945c200b51e48c88f9b837dde628fc7c8f\""
May 13 23:52:39.303090 containerd[1526]: time="2025-05-13T23:52:39.303056576Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.8\""
May 13 23:52:44.384044 containerd[1526]: time="2025-05-13T23:52:44.383953336Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:52:44.490220 containerd[1526]: time="2025-05-13T23:52:44.490112219Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.8: active requests=0, bytes read=18780386"
May 13 23:52:44.578186 containerd[1526]: time="2025-05-13T23:52:44.578110499Z" level=info msg="ImageCreate event name:\"sha256:2a9c646db0be37003c2b50605a252f7139145411d9e4e0badd8ae07f56ce5eb8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:52:44.624130 containerd[1526]: time="2025-05-13T23:52:44.624032936Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:22994a2632e81059720480b9f6bdeb133b08d58492d0b36dfd6e9768b159b22a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:52:44.625309 containerd[1526]: time="2025-05-13T23:52:44.625248567Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.8\" with image id \"sha256:2a9c646db0be37003c2b50605a252f7139145411d9e4e0badd8ae07f56ce5eb8\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.8\", repo digest \"registry.k8s.io/kube-scheduler@sha256:22994a2632e81059720480b9f6bdeb133b08d58492d0b36dfd6e9768b159b22a\", size \"20268777\" in 5.322152036s"
May 13 23:52:44.625309 containerd[1526]: time="2025-05-13T23:52:44.625290245Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.8\" returns image reference \"sha256:2a9c646db0be37003c2b50605a252f7139145411d9e4e0badd8ae07f56ce5eb8\""
May 13 23:52:44.625908 containerd[1526]: time="2025-05-13T23:52:44.625845797Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.8\""
May 13 23:52:47.619480 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
May 13 23:52:47.621393 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 23:52:48.025764 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 23:52:48.031032 (kubelet)[2033]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 13 23:52:48.071484 kubelet[2033]: E0513 23:52:48.071411 2033 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 13 23:52:48.076221 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 13 23:52:48.076492 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 13 23:52:48.077008 systemd[1]: kubelet.service: Consumed 214ms CPU time, 95.8M memory peak.
May 13 23:52:49.746175 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3071808134.mount: Deactivated successfully.
May 13 23:52:51.935740 containerd[1526]: time="2025-05-13T23:52:51.935660700Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:52:51.996964 containerd[1526]: time="2025-05-13T23:52:51.996831319Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.8: active requests=0, bytes read=30354625"
May 13 23:52:52.046571 containerd[1526]: time="2025-05-13T23:52:52.046500033Z" level=info msg="ImageCreate event name:\"sha256:7d73f013cedcf301aef42272c93e4c1174dab1a8eccd96840091ef04b63480f2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:52:52.105695 containerd[1526]: time="2025-05-13T23:52:52.105637478Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:dd0c9a37670f209947b1ed880f06a2e93e1d41da78c037f52f94b13858769838\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:52:52.106449 containerd[1526]: time="2025-05-13T23:52:52.106355595Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.8\" with image id \"sha256:7d73f013cedcf301aef42272c93e4c1174dab1a8eccd96840091ef04b63480f2\", repo tag \"registry.k8s.io/kube-proxy:v1.31.8\", repo digest \"registry.k8s.io/kube-proxy@sha256:dd0c9a37670f209947b1ed880f06a2e93e1d41da78c037f52f94b13858769838\", size \"30353644\" in 7.480440999s"
May 13 23:52:52.106567 containerd[1526]: time="2025-05-13T23:52:52.106453579Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.8\" returns image reference \"sha256:7d73f013cedcf301aef42272c93e4c1174dab1a8eccd96840091ef04b63480f2\""
May 13 23:52:52.107129 containerd[1526]: time="2025-05-13T23:52:52.107085394Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\""
May 13 23:52:52.783449 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2443674504.mount: Deactivated successfully.
May 13 23:52:54.308683 containerd[1526]: time="2025-05-13T23:52:54.308589597Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:52:54.313228 containerd[1526]: time="2025-05-13T23:52:54.313138339Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=18185761"
May 13 23:52:54.318620 containerd[1526]: time="2025-05-13T23:52:54.318528699Z" level=info msg="ImageCreate event name:\"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:52:54.327780 containerd[1526]: time="2025-05-13T23:52:54.327737763Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:52:54.328695 containerd[1526]: time="2025-05-13T23:52:54.328637340Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"18182961\" in 2.221508244s"
May 13 23:52:54.328695 containerd[1526]: time="2025-05-13T23:52:54.328688877Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\""
May 13 23:52:54.329286 containerd[1526]: time="2025-05-13T23:52:54.329240632Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
May 13 23:52:55.352579 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount358159662.mount: Deactivated successfully.
May 13 23:52:58.119431 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
May 13 23:52:58.121409 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 23:52:58.305372 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 23:52:58.323283 (kubelet)[2106]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 13 23:52:58.420927 kubelet[2106]: E0513 23:52:58.419309 2106 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 13 23:52:58.423237 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 13 23:52:58.423467 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 13 23:52:58.423862 systemd[1]: kubelet.service: Consumed 268ms CPU time, 99.9M memory peak.
May 13 23:52:59.312835 containerd[1526]: time="2025-05-13T23:52:59.312729085Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
May 13 23:52:59.598397 containerd[1526]: time="2025-05-13T23:52:59.598105542Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138"
May 13 23:52:59.623476 containerd[1526]: time="2025-05-13T23:52:59.623408292Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
May 13 23:52:59.722308 containerd[1526]: time="2025-05-13T23:52:59.722212646Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
May 13 23:52:59.723046 containerd[1526]: time="2025-05-13T23:52:59.722984005Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 5.393708196s"
May 13 23:52:59.723046 containerd[1526]: time="2025-05-13T23:52:59.723037868Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\""
May 13 23:52:59.723748 containerd[1526]: time="2025-05-13T23:52:59.723695078Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\""
May 13 23:53:04.937605 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2336296961.mount: Deactivated successfully.
May 13 23:53:08.619389 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
May 13 23:53:08.621408 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 23:53:08.817447 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 23:53:08.833177 (kubelet)[2130]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 13 23:53:09.275493 update_engine[1508]: I20250513 23:53:09.275352 1508 update_attempter.cc:509] Updating boot flags...
May 13 23:53:09.788628 kubelet[2130]: E0513 23:53:09.788545 2130 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 13 23:53:09.792904 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 13 23:53:09.793185 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 13 23:53:09.793641 systemd[1]: kubelet.service: Consumed 227ms CPU time, 98.3M memory peak.
May 13 23:53:09.861929 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (2144)
May 13 23:53:09.952202 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (2143)
May 13 23:53:09.989917 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (2143)
May 13 23:53:19.869363 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5.
May 13 23:53:19.871483 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 23:53:20.057383 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 23:53:20.076441 (kubelet)[2205]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 13 23:53:20.110564 kubelet[2205]: E0513 23:53:20.110434 2205 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 13 23:53:20.114918 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 13 23:53:20.115205 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 13 23:53:20.115717 systemd[1]: kubelet.service: Consumed 203ms CPU time, 97.8M memory peak.
May 13 23:53:21.388138 containerd[1526]: time="2025-05-13T23:53:21.388043905Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:53:21.405409 containerd[1526]: time="2025-05-13T23:53:21.405313543Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56780013"
May 13 23:53:21.497514 containerd[1526]: time="2025-05-13T23:53:21.497425901Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:53:21.561709 containerd[1526]: time="2025-05-13T23:53:21.561624449Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:53:21.562855 containerd[1526]: time="2025-05-13T23:53:21.562802861Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 21.839061324s"
May 13 23:53:21.563023 containerd[1526]: time="2025-05-13T23:53:21.562859387Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\""
May 13 23:53:25.223601 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 23:53:25.223820 systemd[1]: kubelet.service: Consumed 203ms CPU time, 97.8M memory peak.
May 13 23:53:25.226699 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 23:53:25.263094 systemd[1]: Reload requested from client PID 2243 ('systemctl') (unit session-7.scope)...
May 13 23:53:25.263123 systemd[1]: Reloading...
May 13 23:53:25.388948 zram_generator::config[2290]: No configuration found.
May 13 23:53:25.931648 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 13 23:53:26.040054 systemd[1]: Reloading finished in 776 ms.
May 13 23:53:26.098546 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 23:53:26.101591 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 23:53:26.105360 systemd[1]: kubelet.service: Deactivated successfully.
May 13 23:53:26.105731 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 23:53:26.105777 systemd[1]: kubelet.service: Consumed 159ms CPU time, 83.6M memory peak.
May 13 23:53:26.107767 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 23:53:26.406629 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 23:53:26.421379 (kubelet)[2336]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
May 13 23:53:26.457171 kubelet[2336]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 13 23:53:26.457171 kubelet[2336]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
May 13 23:53:26.457171 kubelet[2336]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 13 23:53:26.457700 kubelet[2336]: I0513 23:53:26.457228 2336 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
May 13 23:53:27.293908 kubelet[2336]: I0513 23:53:27.293834 2336 server.go:486] "Kubelet version" kubeletVersion="v1.31.0"
May 13 23:53:27.293908 kubelet[2336]: I0513 23:53:27.293871 2336 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
May 13 23:53:27.294137 kubelet[2336]: I0513 23:53:27.294118 2336 server.go:929] "Client rotation is on, will bootstrap in background"
May 13 23:53:27.543221 kubelet[2336]: I0513 23:53:27.543144 2336 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
May 13 23:53:27.549376 kubelet[2336]: E0513 23:53:27.549231 2336 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.0.42:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.42:6443: connect: connection refused" logger="UnhandledError"
May 13 23:53:27.578561 kubelet[2336]: I0513 23:53:27.578528 2336 server.go:1426] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
May 13 23:53:27.593086 kubelet[2336]: I0513 23:53:27.593043 2336 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
May 13 23:53:27.597864 kubelet[2336]: I0513 23:53:27.597836 2336 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
May 13 23:53:27.598113 kubelet[2336]: I0513 23:53:27.598067 2336 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
May 13 23:53:27.598301 kubelet[2336]: I0513 23:53:27.598110 2336 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
May 13 23:53:27.598418 kubelet[2336]: I0513 23:53:27.598316 2336 topology_manager.go:138] "Creating topology manager with none policy"
May 13 23:53:27.598418 kubelet[2336]: I0513 23:53:27.598324 2336 container_manager_linux.go:300] "Creating device plugin manager"
May 13 23:53:27.598474 kubelet[2336]: I0513 23:53:27.598465 2336 state_mem.go:36] "Initialized new in-memory state store"
May 13 23:53:27.601552 kubelet[2336]: I0513 23:53:27.601525 2336 kubelet.go:408] "Attempting to sync node with API server"
May 13 23:53:27.601619 kubelet[2336]: I0513 23:53:27.601561 2336 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
May 13 23:53:27.601619 kubelet[2336]: I0513 23:53:27.601602 2336 kubelet.go:314] "Adding apiserver pod source"
May 13 23:53:27.601690 kubelet[2336]: I0513 23:53:27.601625 2336 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
May 13 23:53:27.617082 kubelet[2336]: I0513 23:53:27.617032 2336 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1"
May 13 23:53:27.617082 kubelet[2336]: W0513 23:53:27.616862 2336 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.42:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.42:6443: connect: connection refused
May 13 23:53:27.617299 kubelet[2336]: E0513 23:53:27.617122 2336 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.42:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.42:6443: connect: connection refused" logger="UnhandledError"
May 13 23:53:27.621557 kubelet[2336]: W0513 23:53:27.621502 2336 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.42:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.42:6443: connect: connection refused
May 13 23:53:27.621619 kubelet[2336]: E0513 23:53:27.621552 2336 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.42:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.42:6443: connect: connection refused" logger="UnhandledError"
May 13 23:53:27.623794 kubelet[2336]: I0513 23:53:27.623740 2336 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
May 13 23:53:27.623918 kubelet[2336]: W0513 23:53:27.623850 2336 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
May 13 23:53:27.624557 kubelet[2336]: I0513 23:53:27.624527 2336 server.go:1269] "Started kubelet"
May 13 23:53:27.625921 kubelet[2336]: I0513 23:53:27.625894 2336 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
May 13 23:53:27.631590 kubelet[2336]: I0513 23:53:27.631548 2336 volume_manager.go:289] "Starting Kubelet Volume Manager"
May 13 23:53:27.631713 kubelet[2336]: I0513 23:53:27.631693 2336 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
May 13 23:53:27.631803 kubelet[2336]: I0513 23:53:27.631781 2336 reconciler.go:26] "Reconciler: start to sync state"
May 13 23:53:27.632294 kubelet[2336]: W0513 23:53:27.632235 2336 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.42:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.42:6443: connect: connection refused
May 13 23:53:27.632360 kubelet[2336]: E0513 23:53:27.632289 2336 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.42:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.42:6443: connect: connection refused" logger="UnhandledError"
May 13 23:53:27.632747 kubelet[2336]: I0513 23:53:27.632706 2336 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
May 13 23:53:27.634176 kubelet[2336]: I0513 23:53:27.633402 2336 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
May 13 23:53:27.634176 kubelet[2336]: I0513 23:53:27.633680 2336 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
May 13 23:53:27.634291 kubelet[2336]: I0513 23:53:27.634257 2336 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
May 13 23:53:27.634416 kubelet[2336]: E0513 23:53:27.634394 2336 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
May 13 23:53:27.634596 kubelet[2336]: I0513 23:53:27.634569 2336 server.go:460] "Adding debug handlers to kubelet server"
May 13 23:53:27.635547 kubelet[2336]: E0513 23:53:27.635511 2336 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.42:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.42:6443: connect: connection refused" interval="200ms"
May 13 23:53:27.636083 kubelet[2336]: E0513 23:53:27.636061 2336 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
May 13 23:53:27.636947 kubelet[2336]: I0513 23:53:27.636305 2336 factory.go:221] Registration of the containerd container factory successfully
May 13 23:53:27.636947 kubelet[2336]: I0513 23:53:27.636326 2336 factory.go:221] Registration of the systemd container factory successfully
May 13 23:53:27.636947 kubelet[2336]: I0513 23:53:27.636398 2336 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
May 13 23:53:27.651536 kubelet[2336]: E0513 23:53:27.648394 2336 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.42:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.42:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.183f3b52e2fb1af0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-05-13 23:53:27.624506096 +0000 UTC m=+1.198897089,LastTimestamp:2025-05-13 23:53:27.624506096 +0000 UTC m=+1.198897089,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}"
May 13 23:53:27.652870 kubelet[2336]: I0513 23:53:27.652566 2336 cpu_manager.go:214] "Starting CPU manager" policy="none"
May 13 23:53:27.652870 kubelet[2336]: I0513 23:53:27.652581 2336 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
May 13 23:53:27.652870 kubelet[2336]: I0513 23:53:27.652602 2336 state_mem.go:36] "Initialized new in-memory state store"
May 13 23:53:27.655527 kubelet[2336]: I0513 23:53:27.655469 2336 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
May 13 23:53:27.657406 kubelet[2336]: I0513 23:53:27.657375 2336 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
May 13 23:53:27.657406 kubelet[2336]: I0513 23:53:27.657410 2336 status_manager.go:217] "Starting to sync pod status with apiserver"
May 13 23:53:27.657607 kubelet[2336]: I0513 23:53:27.657433 2336 kubelet.go:2321] "Starting kubelet main sync loop"
May 13 23:53:27.657607 kubelet[2336]: E0513 23:53:27.657482 2336 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
May 13 23:53:27.658858 kubelet[2336]: W0513 23:53:27.658375 2336 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.42:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.42:6443: connect: connection refused
May 13 23:53:27.658858 kubelet[2336]: E0513 23:53:27.658408 2336 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.42:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.42:6443: connect: connection refused" logger="UnhandledError"
May 13 23:53:27.735312 kubelet[2336]: E0513 23:53:27.735250 2336 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
May 13 23:53:27.758698 kubelet[2336]: E0513 23:53:27.758580 2336 kubelet.go:2345] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
May 13 23:53:27.836213 kubelet[2336]: E0513 23:53:27.836037 2336 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
May 13 23:53:27.836481 kubelet[2336]: E0513 23:53:27.836439 2336 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.42:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.42:6443: connect: connection refused" interval="400ms"
May 13 23:53:27.937184 kubelet[2336]: E0513 23:53:27.937070 2336 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
May 13 23:53:27.959420 kubelet[2336]: E0513 23:53:27.959332 2336 kubelet.go:2345] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
May 13 23:53:28.038195 kubelet[2336]: E0513 23:53:28.038105 2336 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
May 13 23:53:28.093470 kubelet[2336]: I0513 23:53:28.093258 2336 policy_none.go:49] "None policy: Start"
May 13 23:53:28.094330 kubelet[2336]: I0513 23:53:28.094304 2336 memory_manager.go:170] "Starting memorymanager" policy="None"
May 13 23:53:28.094394 kubelet[2336]: I0513 23:53:28.094339 2336 state_mem.go:35] "Initializing new in-memory state store"
May 13 23:53:28.138614 kubelet[2336]: E0513 23:53:28.138548 2336 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
May 13 23:53:28.229071 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
May 13 23:53:28.238175 kubelet[2336]: E0513 23:53:28.237835 2336 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.42:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.42:6443: connect: connection refused" interval="800ms"
May 13 23:53:28.238709 kubelet[2336]: E0513 23:53:28.238648 2336 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
May 13 23:53:28.244642 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
May 13 23:53:28.248493 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
May 13 23:53:28.262602 kubelet[2336]: I0513 23:53:28.262374 2336 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
May 13 23:53:28.262819 kubelet[2336]: I0513 23:53:28.262728 2336 eviction_manager.go:189] "Eviction manager: starting control loop"
May 13 23:53:28.262819 kubelet[2336]: I0513 23:53:28.262742 2336 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
May 13 23:53:28.263180 kubelet[2336]: I0513 23:53:28.263110 2336 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
May 13 23:53:28.266835 kubelet[2336]: E0513 23:53:28.265476 2336 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found"
May 13 23:53:28.363907 kubelet[2336]: I0513 23:53:28.363762 2336 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
May 13 23:53:28.364399 kubelet[2336]: E0513 23:53:28.364177 2336 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.42:6443/api/v1/nodes\": dial tcp 10.0.0.42:6443: connect: connection
refused" node="localhost" May 13 23:53:28.370918 systemd[1]: Created slice kubepods-burstable-pod0613557c150e4f35d1f3f822b5f32ff1.slice - libcontainer container kubepods-burstable-pod0613557c150e4f35d1f3f822b5f32ff1.slice. May 13 23:53:28.387872 systemd[1]: Created slice kubepods-burstable-podd8ce3728d84437a34e84e908e4ec7d17.slice - libcontainer container kubepods-burstable-podd8ce3728d84437a34e84e908e4ec7d17.slice. May 13 23:53:28.392453 systemd[1]: Created slice kubepods-burstable-podd4a6b755cb4739fbca401212ebb82b6d.slice - libcontainer container kubepods-burstable-podd4a6b755cb4739fbca401212ebb82b6d.slice. May 13 23:53:28.436072 kubelet[2336]: I0513 23:53:28.436000 2336 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/d8ce3728d84437a34e84e908e4ec7d17-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"d8ce3728d84437a34e84e908e4ec7d17\") " pod="kube-system/kube-apiserver-localhost" May 13 23:53:28.436072 kubelet[2336]: I0513 23:53:28.436053 2336 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/d8ce3728d84437a34e84e908e4ec7d17-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"d8ce3728d84437a34e84e908e4ec7d17\") " pod="kube-system/kube-apiserver-localhost" May 13 23:53:28.436072 kubelet[2336]: I0513 23:53:28.436079 2336 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/d4a6b755cb4739fbca401212ebb82b6d-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"d4a6b755cb4739fbca401212ebb82b6d\") " pod="kube-system/kube-controller-manager-localhost" May 13 23:53:28.436329 kubelet[2336]: I0513 23:53:28.436102 2336 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: 
\"kubernetes.io/host-path/d4a6b755cb4739fbca401212ebb82b6d-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"d4a6b755cb4739fbca401212ebb82b6d\") " pod="kube-system/kube-controller-manager-localhost" May 13 23:53:28.436329 kubelet[2336]: I0513 23:53:28.436123 2336 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/d8ce3728d84437a34e84e908e4ec7d17-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"d8ce3728d84437a34e84e908e4ec7d17\") " pod="kube-system/kube-apiserver-localhost" May 13 23:53:28.436329 kubelet[2336]: I0513 23:53:28.436155 2336 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/d4a6b755cb4739fbca401212ebb82b6d-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"d4a6b755cb4739fbca401212ebb82b6d\") " pod="kube-system/kube-controller-manager-localhost" May 13 23:53:28.436329 kubelet[2336]: I0513 23:53:28.436202 2336 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d4a6b755cb4739fbca401212ebb82b6d-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"d4a6b755cb4739fbca401212ebb82b6d\") " pod="kube-system/kube-controller-manager-localhost" May 13 23:53:28.436329 kubelet[2336]: I0513 23:53:28.436227 2336 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/d4a6b755cb4739fbca401212ebb82b6d-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"d4a6b755cb4739fbca401212ebb82b6d\") " pod="kube-system/kube-controller-manager-localhost" May 13 23:53:28.436458 kubelet[2336]: I0513 23:53:28.436249 2336 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/0613557c150e4f35d1f3f822b5f32ff1-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"0613557c150e4f35d1f3f822b5f32ff1\") " pod="kube-system/kube-scheduler-localhost" May 13 23:53:28.471766 kubelet[2336]: W0513 23:53:28.471660 2336 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.42:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.42:6443: connect: connection refused May 13 23:53:28.471766 kubelet[2336]: E0513 23:53:28.471744 2336 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.42:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.42:6443: connect: connection refused" logger="UnhandledError" May 13 23:53:28.480233 kubelet[2336]: W0513 23:53:28.480179 2336 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.42:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.42:6443: connect: connection refused May 13 23:53:28.480233 kubelet[2336]: E0513 23:53:28.480218 2336 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.42:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.42:6443: connect: connection refused" logger="UnhandledError" May 13 23:53:28.566198 kubelet[2336]: I0513 23:53:28.566164 2336 kubelet_node_status.go:72] "Attempting to register node" node="localhost" May 13 23:53:28.566570 kubelet[2336]: E0513 23:53:28.566531 2336 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.42:6443/api/v1/nodes\": dial tcp 10.0.0.42:6443: connect: connection refused" 
node="localhost" May 13 23:53:28.688097 containerd[1526]: time="2025-05-13T23:53:28.688044148Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:0613557c150e4f35d1f3f822b5f32ff1,Namespace:kube-system,Attempt:0,}" May 13 23:53:28.691554 containerd[1526]: time="2025-05-13T23:53:28.691524940Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:d8ce3728d84437a34e84e908e4ec7d17,Namespace:kube-system,Attempt:0,}" May 13 23:53:28.696403 containerd[1526]: time="2025-05-13T23:53:28.696352686Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:d4a6b755cb4739fbca401212ebb82b6d,Namespace:kube-system,Attempt:0,}" May 13 23:53:28.883084 kubelet[2336]: W0513 23:53:28.882992 2336 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.42:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.42:6443: connect: connection refused May 13 23:53:28.883084 kubelet[2336]: E0513 23:53:28.883085 2336 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.42:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.42:6443: connect: connection refused" logger="UnhandledError" May 13 23:53:28.968340 kubelet[2336]: I0513 23:53:28.968172 2336 kubelet_node_status.go:72] "Attempting to register node" node="localhost" May 13 23:53:28.968636 kubelet[2336]: E0513 23:53:28.968588 2336 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.42:6443/api/v1/nodes\": dial tcp 10.0.0.42:6443: connect: connection refused" node="localhost" May 13 23:53:29.038658 kubelet[2336]: E0513 23:53:29.038597 2336 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://10.0.0.42:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.42:6443: connect: connection refused" interval="1.6s" May 13 23:53:29.073483 kubelet[2336]: W0513 23:53:29.073396 2336 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.42:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.42:6443: connect: connection refused May 13 23:53:29.073483 kubelet[2336]: E0513 23:53:29.073478 2336 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.42:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.42:6443: connect: connection refused" logger="UnhandledError" May 13 23:53:29.688907 kubelet[2336]: E0513 23:53:29.688852 2336 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.0.42:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.42:6443: connect: connection refused" logger="UnhandledError" May 13 23:53:29.769910 kubelet[2336]: I0513 23:53:29.769808 2336 kubelet_node_status.go:72] "Attempting to register node" node="localhost" May 13 23:53:29.770253 kubelet[2336]: E0513 23:53:29.770215 2336 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.42:6443/api/v1/nodes\": dial tcp 10.0.0.42:6443: connect: connection refused" node="localhost" May 13 23:53:30.481618 kubelet[2336]: W0513 23:53:30.481518 2336 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.42:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.42:6443: connect: 
connection refused May 13 23:53:30.481618 kubelet[2336]: E0513 23:53:30.481597 2336 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.42:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.42:6443: connect: connection refused" logger="UnhandledError" May 13 23:53:30.639434 kubelet[2336]: E0513 23:53:30.639338 2336 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.42:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.42:6443: connect: connection refused" interval="3.2s" May 13 23:53:30.751743 containerd[1526]: time="2025-05-13T23:53:30.751579947Z" level=info msg="connecting to shim c00ac80df000aa7141374a144e5cda9d9dbfaed0f89785dc8a6c40556ed8370d" address="unix:///run/containerd/s/1d101dcf447062f8dfdf4f617d572ce12f567c12a01a305f0a72e81a5c45d36a" namespace=k8s.io protocol=ttrpc version=3 May 13 23:53:30.777088 containerd[1526]: time="2025-05-13T23:53:30.777023185Z" level=info msg="connecting to shim 4b59f9e8fce91fdfce462b89cd05b64cbbb1499b392dee3e85ccef6d56ab26ba" address="unix:///run/containerd/s/81a10607a4bc387427ce320bf4f91f63f9b9622c44803b4e3a9e0cc2c5f2c84a" namespace=k8s.io protocol=ttrpc version=3 May 13 23:53:30.781234 systemd[1]: Started cri-containerd-c00ac80df000aa7141374a144e5cda9d9dbfaed0f89785dc8a6c40556ed8370d.scope - libcontainer container c00ac80df000aa7141374a144e5cda9d9dbfaed0f89785dc8a6c40556ed8370d. May 13 23:53:30.812283 systemd[1]: Started cri-containerd-4b59f9e8fce91fdfce462b89cd05b64cbbb1499b392dee3e85ccef6d56ab26ba.scope - libcontainer container 4b59f9e8fce91fdfce462b89cd05b64cbbb1499b392dee3e85ccef6d56ab26ba. 
May 13 23:53:30.831140 containerd[1526]: time="2025-05-13T23:53:30.831067784Z" level=info msg="connecting to shim 850e59f274047118448feb5efa4c21e707c4a2fa512c3e186729859403f2fae3" address="unix:///run/containerd/s/fa1f174954c47ef03295ccce74b74bb20501c2e1b87b923c2287ad690848f5cd" namespace=k8s.io protocol=ttrpc version=3 May 13 23:53:30.859037 systemd[1]: Started cri-containerd-850e59f274047118448feb5efa4c21e707c4a2fa512c3e186729859403f2fae3.scope - libcontainer container 850e59f274047118448feb5efa4c21e707c4a2fa512c3e186729859403f2fae3. May 13 23:53:30.885564 containerd[1526]: time="2025-05-13T23:53:30.885506523Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:0613557c150e4f35d1f3f822b5f32ff1,Namespace:kube-system,Attempt:0,} returns sandbox id \"c00ac80df000aa7141374a144e5cda9d9dbfaed0f89785dc8a6c40556ed8370d\"" May 13 23:53:30.888966 containerd[1526]: time="2025-05-13T23:53:30.888425314Z" level=info msg="CreateContainer within sandbox \"c00ac80df000aa7141374a144e5cda9d9dbfaed0f89785dc8a6c40556ed8370d\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" May 13 23:53:30.925653 containerd[1526]: time="2025-05-13T23:53:30.925582511Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:d4a6b755cb4739fbca401212ebb82b6d,Namespace:kube-system,Attempt:0,} returns sandbox id \"4b59f9e8fce91fdfce462b89cd05b64cbbb1499b392dee3e85ccef6d56ab26ba\"" May 13 23:53:30.928676 containerd[1526]: time="2025-05-13T23:53:30.928614806Z" level=info msg="CreateContainer within sandbox \"4b59f9e8fce91fdfce462b89cd05b64cbbb1499b392dee3e85ccef6d56ab26ba\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" May 13 23:53:31.098418 containerd[1526]: time="2025-05-13T23:53:31.098219422Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:d8ce3728d84437a34e84e908e4ec7d17,Namespace:kube-system,Attempt:0,} returns sandbox id 
\"850e59f274047118448feb5efa4c21e707c4a2fa512c3e186729859403f2fae3\"" May 13 23:53:31.100956 containerd[1526]: time="2025-05-13T23:53:31.100864838Z" level=info msg="CreateContainer within sandbox \"850e59f274047118448feb5efa4c21e707c4a2fa512c3e186729859403f2fae3\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" May 13 23:53:31.226261 kubelet[2336]: E0513 23:53:31.226069 2336 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.42:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.42:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.183f3b52e2fb1af0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-05-13 23:53:27.624506096 +0000 UTC m=+1.198897089,LastTimestamp:2025-05-13 23:53:27.624506096 +0000 UTC m=+1.198897089,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" May 13 23:53:31.372028 kubelet[2336]: I0513 23:53:31.371975 2336 kubelet_node_status.go:72] "Attempting to register node" node="localhost" May 13 23:53:31.372555 kubelet[2336]: E0513 23:53:31.372506 2336 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.42:6443/api/v1/nodes\": dial tcp 10.0.0.42:6443: connect: connection refused" node="localhost" May 13 23:53:31.491814 containerd[1526]: time="2025-05-13T23:53:31.491730925Z" level=info msg="Container b1e933844f3abace03614feb93c3b20675a8b9330d673c67757b981dcb4fa1f4: CDI devices from CRI Config.CDIDevices: []" May 13 23:53:31.535272 kubelet[2336]: W0513 23:53:31.535211 2336 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://10.0.0.42:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.42:6443: connect: connection refused May 13 23:53:31.535272 kubelet[2336]: E0513 23:53:31.535260 2336 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.42:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.42:6443: connect: connection refused" logger="UnhandledError" May 13 23:53:31.653244 kubelet[2336]: W0513 23:53:31.653025 2336 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.42:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.42:6443: connect: connection refused May 13 23:53:31.653244 kubelet[2336]: E0513 23:53:31.653097 2336 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.42:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.42:6443: connect: connection refused" logger="UnhandledError" May 13 23:53:31.760228 containerd[1526]: time="2025-05-13T23:53:31.760155808Z" level=info msg="Container 6cb5bc1244c23b17fc9be8ca28019204c9d38faaa8dc33f1d70912a69a925475: CDI devices from CRI Config.CDIDevices: []" May 13 23:53:31.804235 kubelet[2336]: W0513 23:53:31.804178 2336 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.42:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.42:6443: connect: connection refused May 13 23:53:31.804235 kubelet[2336]: E0513 23:53:31.804232 2336 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list 
*v1.RuntimeClass: Get \"https://10.0.0.42:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.42:6443: connect: connection refused" logger="UnhandledError" May 13 23:53:32.020343 containerd[1526]: time="2025-05-13T23:53:32.020122625Z" level=info msg="Container 5e2cee7618ab72b8efda16658fe22b0a4aa8a10b9384daa360ef09ecdb343262: CDI devices from CRI Config.CDIDevices: []" May 13 23:53:32.479701 containerd[1526]: time="2025-05-13T23:53:32.479622612Z" level=info msg="CreateContainer within sandbox \"c00ac80df000aa7141374a144e5cda9d9dbfaed0f89785dc8a6c40556ed8370d\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"b1e933844f3abace03614feb93c3b20675a8b9330d673c67757b981dcb4fa1f4\"" May 13 23:53:32.480514 containerd[1526]: time="2025-05-13T23:53:32.480462411Z" level=info msg="StartContainer for \"b1e933844f3abace03614feb93c3b20675a8b9330d673c67757b981dcb4fa1f4\"" May 13 23:53:32.481637 containerd[1526]: time="2025-05-13T23:53:32.481614779Z" level=info msg="connecting to shim b1e933844f3abace03614feb93c3b20675a8b9330d673c67757b981dcb4fa1f4" address="unix:///run/containerd/s/1d101dcf447062f8dfdf4f617d572ce12f567c12a01a305f0a72e81a5c45d36a" protocol=ttrpc version=3 May 13 23:53:32.504078 systemd[1]: Started cri-containerd-b1e933844f3abace03614feb93c3b20675a8b9330d673c67757b981dcb4fa1f4.scope - libcontainer container b1e933844f3abace03614feb93c3b20675a8b9330d673c67757b981dcb4fa1f4. 
May 13 23:53:32.905485 containerd[1526]: time="2025-05-13T23:53:32.905438509Z" level=info msg="CreateContainer within sandbox \"4b59f9e8fce91fdfce462b89cd05b64cbbb1499b392dee3e85ccef6d56ab26ba\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"6cb5bc1244c23b17fc9be8ca28019204c9d38faaa8dc33f1d70912a69a925475\"" May 13 23:53:32.906551 containerd[1526]: time="2025-05-13T23:53:32.906220961Z" level=info msg="StartContainer for \"b1e933844f3abace03614feb93c3b20675a8b9330d673c67757b981dcb4fa1f4\" returns successfully" May 13 23:53:32.906551 containerd[1526]: time="2025-05-13T23:53:32.906285231Z" level=info msg="StartContainer for \"6cb5bc1244c23b17fc9be8ca28019204c9d38faaa8dc33f1d70912a69a925475\"" May 13 23:53:32.907759 containerd[1526]: time="2025-05-13T23:53:32.907643065Z" level=info msg="connecting to shim 6cb5bc1244c23b17fc9be8ca28019204c9d38faaa8dc33f1d70912a69a925475" address="unix:///run/containerd/s/81a10607a4bc387427ce320bf4f91f63f9b9622c44803b4e3a9e0cc2c5f2c84a" protocol=ttrpc version=3 May 13 23:53:32.934060 systemd[1]: Started cri-containerd-6cb5bc1244c23b17fc9be8ca28019204c9d38faaa8dc33f1d70912a69a925475.scope - libcontainer container 6cb5bc1244c23b17fc9be8ca28019204c9d38faaa8dc33f1d70912a69a925475. 
May 13 23:53:33.223581 containerd[1526]: time="2025-05-13T23:53:33.222218765Z" level=info msg="CreateContainer within sandbox \"850e59f274047118448feb5efa4c21e707c4a2fa512c3e186729859403f2fae3\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"5e2cee7618ab72b8efda16658fe22b0a4aa8a10b9384daa360ef09ecdb343262\"" May 13 23:53:33.224135 containerd[1526]: time="2025-05-13T23:53:33.224111404Z" level=info msg="StartContainer for \"6cb5bc1244c23b17fc9be8ca28019204c9d38faaa8dc33f1d70912a69a925475\" returns successfully" May 13 23:53:33.224354 containerd[1526]: time="2025-05-13T23:53:33.224144807Z" level=info msg="StartContainer for \"5e2cee7618ab72b8efda16658fe22b0a4aa8a10b9384daa360ef09ecdb343262\"" May 13 23:53:33.225908 containerd[1526]: time="2025-05-13T23:53:33.225860703Z" level=info msg="connecting to shim 5e2cee7618ab72b8efda16658fe22b0a4aa8a10b9384daa360ef09ecdb343262" address="unix:///run/containerd/s/fa1f174954c47ef03295ccce74b74bb20501c2e1b87b923c2287ad690848f5cd" protocol=ttrpc version=3 May 13 23:53:33.253080 systemd[1]: Started cri-containerd-5e2cee7618ab72b8efda16658fe22b0a4aa8a10b9384daa360ef09ecdb343262.scope - libcontainer container 5e2cee7618ab72b8efda16658fe22b0a4aa8a10b9384daa360ef09ecdb343262. 
May 13 23:53:33.501901 containerd[1526]: time="2025-05-13T23:53:33.501708241Z" level=info msg="StartContainer for \"5e2cee7618ab72b8efda16658fe22b0a4aa8a10b9384daa360ef09ecdb343262\" returns successfully" May 13 23:53:34.604180 kubelet[2336]: I0513 23:53:34.604119 2336 kubelet_node_status.go:72] "Attempting to register node" node="localhost" May 13 23:53:34.614840 kubelet[2336]: E0513 23:53:34.614764 2336 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" May 13 23:53:34.815867 kubelet[2336]: I0513 23:53:34.815822 2336 kubelet_node_status.go:75] "Successfully registered node" node="localhost" May 13 23:53:34.815867 kubelet[2336]: E0513 23:53:34.815857 2336 kubelet_node_status.go:535] "Error updating node status, will retry" err="error getting node \"localhost\": node \"localhost\" not found" May 13 23:53:35.470513 kubelet[2336]: E0513 23:53:35.470446 2336 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 23:53:35.571146 kubelet[2336]: E0513 23:53:35.571077 2336 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 23:53:35.671688 kubelet[2336]: E0513 23:53:35.671615 2336 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 23:53:35.772577 kubelet[2336]: E0513 23:53:35.772425 2336 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 23:53:35.873244 kubelet[2336]: E0513 23:53:35.873173 2336 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 23:53:35.974230 kubelet[2336]: E0513 23:53:35.974163 2336 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 23:53:36.075361 kubelet[2336]: E0513 23:53:36.075179 2336 
kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 23:53:36.175925 kubelet[2336]: E0513 23:53:36.175781 2336 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 23:53:36.276962 kubelet[2336]: E0513 23:53:36.276838 2336 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 23:53:36.377803 kubelet[2336]: E0513 23:53:36.377757 2336 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 23:53:36.478720 kubelet[2336]: E0513 23:53:36.478644 2336 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 23:53:36.579568 kubelet[2336]: E0513 23:53:36.579506 2336 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 23:53:36.679843 kubelet[2336]: E0513 23:53:36.679681 2336 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 23:53:36.780667 kubelet[2336]: E0513 23:53:36.780574 2336 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 23:53:36.881768 kubelet[2336]: E0513 23:53:36.881690 2336 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 23:53:36.982208 kubelet[2336]: E0513 23:53:36.982029 2336 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 23:53:37.082950 kubelet[2336]: E0513 23:53:37.082839 2336 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 23:53:37.183911 kubelet[2336]: E0513 23:53:37.183815 2336 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" 
not found" May 13 23:53:37.284340 kubelet[2336]: E0513 23:53:37.284172 2336 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 23:53:37.385059 kubelet[2336]: E0513 23:53:37.384992 2336 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 23:53:37.485910 kubelet[2336]: E0513 23:53:37.485839 2336 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 23:53:37.587148 kubelet[2336]: E0513 23:53:37.586975 2336 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 23:53:37.687771 kubelet[2336]: E0513 23:53:37.687700 2336 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 23:53:38.614110 kubelet[2336]: I0513 23:53:38.614065 2336 apiserver.go:52] "Watching apiserver" May 13 23:53:38.631818 kubelet[2336]: I0513 23:53:38.631778 2336 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" May 13 23:53:42.892168 kubelet[2336]: I0513 23:53:42.891977 2336 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.891958678 podStartE2EDuration="1.891958678s" podCreationTimestamp="2025-05-13 23:53:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 23:53:42.778806762 +0000 UTC m=+16.353197765" watchObservedRunningTime="2025-05-13 23:53:42.891958678 +0000 UTC m=+16.466349671" May 13 23:53:43.167535 kubelet[2336]: I0513 23:53:43.167334 2336 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=2.167313707 podStartE2EDuration="2.167313707s" podCreationTimestamp="2025-05-13 23:53:41 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 23:53:42.892126584 +0000 UTC m=+16.466517577" watchObservedRunningTime="2025-05-13 23:53:43.167313707 +0000 UTC m=+16.741704700" May 13 23:53:43.167535 kubelet[2336]: I0513 23:53:43.167425 2336 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.1674207380000001 podStartE2EDuration="1.167420738s" podCreationTimestamp="2025-05-13 23:53:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 23:53:43.167263022 +0000 UTC m=+16.741654025" watchObservedRunningTime="2025-05-13 23:53:43.167420738 +0000 UTC m=+16.741811731" May 13 23:53:48.830179 systemd[1]: Reload requested from client PID 2611 ('systemctl') (unit session-7.scope)... May 13 23:53:48.830196 systemd[1]: Reloading... May 13 23:53:48.944008 zram_generator::config[2655]: No configuration found. May 13 23:53:49.111306 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 13 23:53:49.260939 systemd[1]: Reloading finished in 430 ms. May 13 23:53:49.292635 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:53:49.306444 systemd[1]: kubelet.service: Deactivated successfully. May 13 23:53:49.306811 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 13 23:53:49.306868 systemd[1]: kubelet.service: Consumed 1.997s CPU time, 121.1M memory peak. May 13 23:53:49.309096 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:53:49.518404 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 13 23:53:49.531278 (kubelet)[2701]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 13 23:53:49.576558 kubelet[2701]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 13 23:53:49.576558 kubelet[2701]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. May 13 23:53:49.576558 kubelet[2701]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 13 23:53:49.576558 kubelet[2701]: I0513 23:53:49.576248 2701 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 13 23:53:49.585106 kubelet[2701]: I0513 23:53:49.584948 2701 server.go:486] "Kubelet version" kubeletVersion="v1.31.0" May 13 23:53:49.585106 kubelet[2701]: I0513 23:53:49.584998 2701 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 13 23:53:49.585383 kubelet[2701]: I0513 23:53:49.585336 2701 server.go:929] "Client rotation is on, will bootstrap in background" May 13 23:53:49.587082 kubelet[2701]: I0513 23:53:49.587033 2701 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
May 13 23:53:49.590023 kubelet[2701]: I0513 23:53:49.589941 2701 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 13 23:53:49.595372 kubelet[2701]: I0513 23:53:49.595323 2701 server.go:1426] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 13 23:53:49.603012 kubelet[2701]: I0513 23:53:49.602844 2701 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" May 13 23:53:49.603012 kubelet[2701]: I0513 23:53:49.603022 2701 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" May 13 23:53:49.603276 kubelet[2701]: I0513 23:53:49.603150 2701 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 13 23:53:49.603518 kubelet[2701]: I0513 23:53:49.603187 2701 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"image
fs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} May 13 23:53:49.603518 kubelet[2701]: I0513 23:53:49.603432 2701 topology_manager.go:138] "Creating topology manager with none policy" May 13 23:53:49.603518 kubelet[2701]: I0513 23:53:49.603444 2701 container_manager_linux.go:300] "Creating device plugin manager" May 13 23:53:49.603518 kubelet[2701]: I0513 23:53:49.603482 2701 state_mem.go:36] "Initialized new in-memory state store" May 13 23:53:49.603940 kubelet[2701]: I0513 23:53:49.603633 2701 kubelet.go:408] "Attempting to sync node with API server" May 13 23:53:49.603940 kubelet[2701]: I0513 23:53:49.603647 2701 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" May 13 23:53:49.603940 kubelet[2701]: I0513 23:53:49.603692 2701 kubelet.go:314] "Adding apiserver pod source" May 13 23:53:49.603940 kubelet[2701]: I0513 23:53:49.603704 2701 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 13 23:53:49.608503 kubelet[2701]: I0513 23:53:49.605010 2701 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" May 13 23:53:49.608503 kubelet[2701]: I0513 23:53:49.605524 2701 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 13 23:53:49.608503 kubelet[2701]: I0513 23:53:49.606079 2701 server.go:1269] "Started kubelet" May 13 23:53:49.608503 kubelet[2701]: I0513 23:53:49.606192 2701 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 May 13 
23:53:49.608503 kubelet[2701]: I0513 23:53:49.606417 2701 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 13 23:53:49.608503 kubelet[2701]: I0513 23:53:49.606865 2701 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 13 23:53:49.608503 kubelet[2701]: I0513 23:53:49.607366 2701 server.go:460] "Adding debug handlers to kubelet server" May 13 23:53:49.614396 kubelet[2701]: I0513 23:53:49.611844 2701 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 13 23:53:49.614396 kubelet[2701]: I0513 23:53:49.612981 2701 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 13 23:53:49.616377 kubelet[2701]: I0513 23:53:49.616336 2701 volume_manager.go:289] "Starting Kubelet Volume Manager" May 13 23:53:49.616560 kubelet[2701]: I0513 23:53:49.616525 2701 desired_state_of_world_populator.go:146] "Desired state populator starts to run" May 13 23:53:49.616827 kubelet[2701]: I0513 23:53:49.616801 2701 reconciler.go:26] "Reconciler: start to sync state" May 13 23:53:49.618562 kubelet[2701]: E0513 23:53:49.618075 2701 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 23:53:49.620122 kubelet[2701]: I0513 23:53:49.620095 2701 factory.go:221] Registration of the systemd container factory successfully May 13 23:53:49.620391 kubelet[2701]: E0513 23:53:49.620348 2701 kubelet.go:1478] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 13 23:53:49.620865 kubelet[2701]: I0513 23:53:49.620813 2701 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 13 23:53:49.623004 kubelet[2701]: I0513 23:53:49.622970 2701 factory.go:221] Registration of the containerd container factory successfully May 13 23:53:49.633602 kubelet[2701]: I0513 23:53:49.633556 2701 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 13 23:53:49.636727 kubelet[2701]: I0513 23:53:49.636313 2701 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" May 13 23:53:49.636727 kubelet[2701]: I0513 23:53:49.636393 2701 status_manager.go:217] "Starting to sync pod status with apiserver" May 13 23:53:49.636826 kubelet[2701]: I0513 23:53:49.636648 2701 kubelet.go:2321] "Starting kubelet main sync loop" May 13 23:53:49.636948 kubelet[2701]: E0513 23:53:49.636819 2701 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 13 23:53:49.661831 kubelet[2701]: I0513 23:53:49.661789 2701 cpu_manager.go:214] "Starting CPU manager" policy="none" May 13 23:53:49.661831 kubelet[2701]: I0513 23:53:49.661815 2701 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" May 13 23:53:49.661831 kubelet[2701]: I0513 23:53:49.661841 2701 state_mem.go:36] "Initialized new in-memory state store" May 13 23:53:49.662110 kubelet[2701]: I0513 23:53:49.662087 2701 state_mem.go:88] "Updated default CPUSet" cpuSet="" May 13 23:53:49.662154 kubelet[2701]: I0513 23:53:49.662109 2701 state_mem.go:96] "Updated CPUSet assignments" assignments={} May 13 23:53:49.662154 kubelet[2701]: I0513 23:53:49.662135 2701 policy_none.go:49] "None policy: Start" May 13 23:53:49.663000 
kubelet[2701]: I0513 23:53:49.662946 2701 memory_manager.go:170] "Starting memorymanager" policy="None" May 13 23:53:49.663000 kubelet[2701]: I0513 23:53:49.662980 2701 state_mem.go:35] "Initializing new in-memory state store" May 13 23:53:49.663216 kubelet[2701]: I0513 23:53:49.663183 2701 state_mem.go:75] "Updated machine memory state" May 13 23:53:49.668859 kubelet[2701]: I0513 23:53:49.668808 2701 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 13 23:53:49.669093 kubelet[2701]: I0513 23:53:49.669072 2701 eviction_manager.go:189] "Eviction manager: starting control loop" May 13 23:53:49.669416 kubelet[2701]: I0513 23:53:49.669092 2701 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 13 23:53:49.669416 kubelet[2701]: I0513 23:53:49.669296 2701 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 13 23:53:49.770887 kubelet[2701]: I0513 23:53:49.770754 2701 kubelet_node_status.go:72] "Attempting to register node" node="localhost" May 13 23:53:49.817425 kubelet[2701]: I0513 23:53:49.817374 2701 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/d4a6b755cb4739fbca401212ebb82b6d-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"d4a6b755cb4739fbca401212ebb82b6d\") " pod="kube-system/kube-controller-manager-localhost" May 13 23:53:49.817425 kubelet[2701]: I0513 23:53:49.817416 2701 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d4a6b755cb4739fbca401212ebb82b6d-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"d4a6b755cb4739fbca401212ebb82b6d\") " pod="kube-system/kube-controller-manager-localhost" May 13 23:53:49.817425 kubelet[2701]: I0513 23:53:49.817435 2701 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/d4a6b755cb4739fbca401212ebb82b6d-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"d4a6b755cb4739fbca401212ebb82b6d\") " pod="kube-system/kube-controller-manager-localhost" May 13 23:53:49.817692 kubelet[2701]: I0513 23:53:49.817451 2701 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/d8ce3728d84437a34e84e908e4ec7d17-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"d8ce3728d84437a34e84e908e4ec7d17\") " pod="kube-system/kube-apiserver-localhost" May 13 23:53:49.817692 kubelet[2701]: I0513 23:53:49.817467 2701 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/d8ce3728d84437a34e84e908e4ec7d17-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"d8ce3728d84437a34e84e908e4ec7d17\") " pod="kube-system/kube-apiserver-localhost" May 13 23:53:49.817692 kubelet[2701]: I0513 23:53:49.817480 2701 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/d8ce3728d84437a34e84e908e4ec7d17-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"d8ce3728d84437a34e84e908e4ec7d17\") " pod="kube-system/kube-apiserver-localhost" May 13 23:53:49.817692 kubelet[2701]: I0513 23:53:49.817553 2701 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/d4a6b755cb4739fbca401212ebb82b6d-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"d4a6b755cb4739fbca401212ebb82b6d\") " pod="kube-system/kube-controller-manager-localhost" May 13 23:53:49.817692 kubelet[2701]: I0513 23:53:49.817598 2701 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/d4a6b755cb4739fbca401212ebb82b6d-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"d4a6b755cb4739fbca401212ebb82b6d\") " pod="kube-system/kube-controller-manager-localhost" May 13 23:53:49.817816 kubelet[2701]: I0513 23:53:49.817623 2701 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/0613557c150e4f35d1f3f822b5f32ff1-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"0613557c150e4f35d1f3f822b5f32ff1\") " pod="kube-system/kube-scheduler-localhost" May 13 23:53:49.953089 kubelet[2701]: E0513 23:53:49.952999 2701 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" May 13 23:53:49.963336 kubelet[2701]: E0513 23:53:49.962899 2701 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" May 13 23:53:49.963579 kubelet[2701]: E0513 23:53:49.963514 2701 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-localhost\" already exists" pod="kube-system/kube-controller-manager-localhost" May 13 23:53:50.195720 kubelet[2701]: I0513 23:53:50.195663 2701 kubelet_node_status.go:111] "Node was previously registered" node="localhost" May 13 23:53:50.195947 kubelet[2701]: I0513 23:53:50.195777 2701 kubelet_node_status.go:75] "Successfully registered node" node="localhost" May 13 23:53:50.604310 kubelet[2701]: I0513 23:53:50.604171 2701 apiserver.go:52] "Watching apiserver" May 13 23:53:50.617655 kubelet[2701]: I0513 23:53:50.617533 2701 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" May 13 23:53:54.383440 kubelet[2701]: I0513 23:53:54.383223 2701 
kuberuntime_manager.go:1633] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" May 13 23:53:54.383973 kubelet[2701]: I0513 23:53:54.383845 2701 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" May 13 23:53:54.384012 containerd[1526]: time="2025-05-13T23:53:54.383522859Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." May 13 23:53:55.507959 systemd[1]: Created slice kubepods-besteffort-pod43726185_0801_4425_8400_caaef8d10db4.slice - libcontainer container kubepods-besteffort-pod43726185_0801_4425_8400_caaef8d10db4.slice. May 13 23:53:55.554667 kubelet[2701]: I0513 23:53:55.554592 2701 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/43726185-0801-4425-8400-caaef8d10db4-xtables-lock\") pod \"kube-proxy-rdvvr\" (UID: \"43726185-0801-4425-8400-caaef8d10db4\") " pod="kube-system/kube-proxy-rdvvr" May 13 23:53:55.554667 kubelet[2701]: I0513 23:53:55.554645 2701 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/43726185-0801-4425-8400-caaef8d10db4-kube-proxy\") pod \"kube-proxy-rdvvr\" (UID: \"43726185-0801-4425-8400-caaef8d10db4\") " pod="kube-system/kube-proxy-rdvvr" May 13 23:53:55.554667 kubelet[2701]: I0513 23:53:55.554663 2701 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/43726185-0801-4425-8400-caaef8d10db4-lib-modules\") pod \"kube-proxy-rdvvr\" (UID: \"43726185-0801-4425-8400-caaef8d10db4\") " pod="kube-system/kube-proxy-rdvvr" May 13 23:53:55.554667 kubelet[2701]: I0513 23:53:55.554678 2701 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsb4h\" (UniqueName: 
\"kubernetes.io/projected/43726185-0801-4425-8400-caaef8d10db4-kube-api-access-hsb4h\") pod \"kube-proxy-rdvvr\" (UID: \"43726185-0801-4425-8400-caaef8d10db4\") " pod="kube-system/kube-proxy-rdvvr" May 13 23:53:56.121600 containerd[1526]: time="2025-05-13T23:53:56.121528006Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-rdvvr,Uid:43726185-0801-4425-8400-caaef8d10db4,Namespace:kube-system,Attempt:0,}" May 13 23:53:56.443157 containerd[1526]: time="2025-05-13T23:53:56.442733956Z" level=info msg="connecting to shim 6a80969182fe27820bec93016d2399b20a3a9e4f005b257512bd548f7b6554cb" address="unix:///run/containerd/s/f40b550999aa207f6245dfdd8c2f856d0d35e94d39bca5c7b326f45b06ac0121" namespace=k8s.io protocol=ttrpc version=3 May 13 23:53:56.475161 systemd[1]: Started cri-containerd-6a80969182fe27820bec93016d2399b20a3a9e4f005b257512bd548f7b6554cb.scope - libcontainer container 6a80969182fe27820bec93016d2399b20a3a9e4f005b257512bd548f7b6554cb. May 13 23:53:56.513153 containerd[1526]: time="2025-05-13T23:53:56.513084458Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-rdvvr,Uid:43726185-0801-4425-8400-caaef8d10db4,Namespace:kube-system,Attempt:0,} returns sandbox id \"6a80969182fe27820bec93016d2399b20a3a9e4f005b257512bd548f7b6554cb\"" May 13 23:53:56.516129 containerd[1526]: time="2025-05-13T23:53:56.516077175Z" level=info msg="CreateContainer within sandbox \"6a80969182fe27820bec93016d2399b20a3a9e4f005b257512bd548f7b6554cb\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" May 13 23:53:56.775996 containerd[1526]: time="2025-05-13T23:53:56.775799885Z" level=info msg="Container 43f8be3f14ed1376197ad19f9eefb23487b4a7010bf8d8cf6e8113ff1ba71377: CDI devices from CRI Config.CDIDevices: []" May 13 23:53:57.036018 containerd[1526]: time="2025-05-13T23:53:57.035842192Z" level=info msg="CreateContainer within sandbox \"6a80969182fe27820bec93016d2399b20a3a9e4f005b257512bd548f7b6554cb\" for 
&ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"43f8be3f14ed1376197ad19f9eefb23487b4a7010bf8d8cf6e8113ff1ba71377\"" May 13 23:53:57.038915 containerd[1526]: time="2025-05-13T23:53:57.036481853Z" level=info msg="StartContainer for \"43f8be3f14ed1376197ad19f9eefb23487b4a7010bf8d8cf6e8113ff1ba71377\"" May 13 23:53:57.038915 containerd[1526]: time="2025-05-13T23:53:57.038315956Z" level=info msg="connecting to shim 43f8be3f14ed1376197ad19f9eefb23487b4a7010bf8d8cf6e8113ff1ba71377" address="unix:///run/containerd/s/f40b550999aa207f6245dfdd8c2f856d0d35e94d39bca5c7b326f45b06ac0121" protocol=ttrpc version=3 May 13 23:53:57.068179 systemd[1]: Started cri-containerd-43f8be3f14ed1376197ad19f9eefb23487b4a7010bf8d8cf6e8113ff1ba71377.scope - libcontainer container 43f8be3f14ed1376197ad19f9eefb23487b4a7010bf8d8cf6e8113ff1ba71377. May 13 23:53:57.238719 containerd[1526]: time="2025-05-13T23:53:57.238620828Z" level=info msg="StartContainer for \"43f8be3f14ed1376197ad19f9eefb23487b4a7010bf8d8cf6e8113ff1ba71377\" returns successfully" May 13 23:53:59.242915 kubelet[2701]: I0513 23:53:59.240448 2701 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-rdvvr" podStartSLOduration=4.240427268 podStartE2EDuration="4.240427268s" podCreationTimestamp="2025-05-13 23:53:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 23:53:58.071185833 +0000 UTC m=+8.531732059" watchObservedRunningTime="2025-05-13 23:53:59.240427268 +0000 UTC m=+9.700973484" May 13 23:54:00.365451 systemd[1]: Created slice kubepods-besteffort-pod5330ad30_7161_4f90_9ecb_22ec3b728c98.slice - libcontainer container kubepods-besteffort-pod5330ad30_7161_4f90_9ecb_22ec3b728c98.slice. 
May 13 23:54:00.491550 kubelet[2701]: I0513 23:54:00.491470 2701 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/5330ad30-7161-4f90-9ecb-22ec3b728c98-var-lib-calico\") pod \"tigera-operator-6f6897fdc5-vr4fl\" (UID: \"5330ad30-7161-4f90-9ecb-22ec3b728c98\") " pod="tigera-operator/tigera-operator-6f6897fdc5-vr4fl" May 13 23:54:00.491550 kubelet[2701]: I0513 23:54:00.491525 2701 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txc6x\" (UniqueName: \"kubernetes.io/projected/5330ad30-7161-4f90-9ecb-22ec3b728c98-kube-api-access-txc6x\") pod \"tigera-operator-6f6897fdc5-vr4fl\" (UID: \"5330ad30-7161-4f90-9ecb-22ec3b728c98\") " pod="tigera-operator/tigera-operator-6f6897fdc5-vr4fl" May 13 23:54:00.969094 containerd[1526]: time="2025-05-13T23:54:00.969034913Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6f6897fdc5-vr4fl,Uid:5330ad30-7161-4f90-9ecb-22ec3b728c98,Namespace:tigera-operator,Attempt:0,}" May 13 23:54:01.710094 containerd[1526]: time="2025-05-13T23:54:01.710008678Z" level=info msg="connecting to shim 8c5f2faf1abeac26ccacdacac705bed5e58c45cb59bd69619a7bd305d88e11da" address="unix:///run/containerd/s/eddf6eef0a9c77cd00b24fa9326845693533b1c1c50c284c06fe38ddfcd9e681" namespace=k8s.io protocol=ttrpc version=3 May 13 23:54:01.769214 systemd[1]: Started cri-containerd-8c5f2faf1abeac26ccacdacac705bed5e58c45cb59bd69619a7bd305d88e11da.scope - libcontainer container 8c5f2faf1abeac26ccacdacac705bed5e58c45cb59bd69619a7bd305d88e11da. 
May 13 23:54:01.906966 containerd[1526]: time="2025-05-13T23:54:01.906869915Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6f6897fdc5-vr4fl,Uid:5330ad30-7161-4f90-9ecb-22ec3b728c98,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"8c5f2faf1abeac26ccacdacac705bed5e58c45cb59bd69619a7bd305d88e11da\"" May 13 23:54:01.909277 containerd[1526]: time="2025-05-13T23:54:01.909216115Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.7\"" May 13 23:54:03.510177 sudo[1718]: pam_unix(sudo:session): session closed for user root May 13 23:54:03.511758 sshd[1717]: Connection closed by 10.0.0.1 port 33752 May 13 23:54:03.513286 sshd-session[1714]: pam_unix(sshd:session): session closed for user core May 13 23:54:03.519460 systemd[1]: sshd@6-10.0.0.42:22-10.0.0.1:33752.service: Deactivated successfully. May 13 23:54:03.522293 systemd[1]: session-7.scope: Deactivated successfully. May 13 23:54:03.522537 systemd[1]: session-7.scope: Consumed 5.547s CPU time, 213.7M memory peak. May 13 23:54:03.523842 systemd-logind[1504]: Session 7 logged out. Waiting for processes to exit. May 13 23:54:03.524775 systemd-logind[1504]: Removed session 7. May 13 23:54:08.909070 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3984941297.mount: Deactivated successfully. 
May 13 23:54:11.137211 containerd[1526]: time="2025-05-13T23:54:11.137117828Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:54:11.184487 containerd[1526]: time="2025-05-13T23:54:11.184370299Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.7: active requests=0, bytes read=22002662" May 13 23:54:11.249448 containerd[1526]: time="2025-05-13T23:54:11.249324401Z" level=info msg="ImageCreate event name:\"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:54:11.349708 containerd[1526]: time="2025-05-13T23:54:11.349603546Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:a4a44422d8f2a14e0aaea2031ccb5580f2bf68218c9db444450c1888743305e9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:54:11.350487 containerd[1526]: time="2025-05-13T23:54:11.350442718Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.7\" with image id \"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\", repo tag \"quay.io/tigera/operator:v1.36.7\", repo digest \"quay.io/tigera/operator@sha256:a4a44422d8f2a14e0aaea2031ccb5580f2bf68218c9db444450c1888743305e9\", size \"21998657\" in 9.44117321s" May 13 23:54:11.350487 containerd[1526]: time="2025-05-13T23:54:11.350482894Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.7\" returns image reference \"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\"" May 13 23:54:11.352303 containerd[1526]: time="2025-05-13T23:54:11.352277541Z" level=info msg="CreateContainer within sandbox \"8c5f2faf1abeac26ccacdacac705bed5e58c45cb59bd69619a7bd305d88e11da\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" May 13 23:54:12.210260 containerd[1526]: time="2025-05-13T23:54:12.210196873Z" level=info msg="Container 
5bcea4628f0c90d6757e5c315bca79c8a9bcc2c43d1d46a53d835fa8cdc43b0a: CDI devices from CRI Config.CDIDevices: []" May 13 23:54:12.808626 containerd[1526]: time="2025-05-13T23:54:12.808551741Z" level=info msg="CreateContainer within sandbox \"8c5f2faf1abeac26ccacdacac705bed5e58c45cb59bd69619a7bd305d88e11da\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"5bcea4628f0c90d6757e5c315bca79c8a9bcc2c43d1d46a53d835fa8cdc43b0a\"" May 13 23:54:12.809228 containerd[1526]: time="2025-05-13T23:54:12.809185889Z" level=info msg="StartContainer for \"5bcea4628f0c90d6757e5c315bca79c8a9bcc2c43d1d46a53d835fa8cdc43b0a\"" May 13 23:54:12.810325 containerd[1526]: time="2025-05-13T23:54:12.810296992Z" level=info msg="connecting to shim 5bcea4628f0c90d6757e5c315bca79c8a9bcc2c43d1d46a53d835fa8cdc43b0a" address="unix:///run/containerd/s/eddf6eef0a9c77cd00b24fa9326845693533b1c1c50c284c06fe38ddfcd9e681" protocol=ttrpc version=3 May 13 23:54:12.836018 systemd[1]: Started cri-containerd-5bcea4628f0c90d6757e5c315bca79c8a9bcc2c43d1d46a53d835fa8cdc43b0a.scope - libcontainer container 5bcea4628f0c90d6757e5c315bca79c8a9bcc2c43d1d46a53d835fa8cdc43b0a. 
May 13 23:54:13.002391 containerd[1526]: time="2025-05-13T23:54:13.002328910Z" level=info msg="StartContainer for \"5bcea4628f0c90d6757e5c315bca79c8a9bcc2c43d1d46a53d835fa8cdc43b0a\" returns successfully" May 13 23:54:13.882295 kubelet[2701]: I0513 23:54:13.882236 2701 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6f6897fdc5-vr4fl" podStartSLOduration=4.439208338 podStartE2EDuration="13.882198414s" podCreationTimestamp="2025-05-13 23:54:00 +0000 UTC" firstStartedPulling="2025-05-13 23:54:01.908115465 +0000 UTC m=+12.368661671" lastFinishedPulling="2025-05-13 23:54:11.351105541 +0000 UTC m=+21.811651747" observedRunningTime="2025-05-13 23:54:13.88206085 +0000 UTC m=+24.342607076" watchObservedRunningTime="2025-05-13 23:54:13.882198414 +0000 UTC m=+24.342744650" May 13 23:54:22.056705 systemd[1]: Created slice kubepods-besteffort-pod7450de43_f9ed_46d3_9696_04db0352cecf.slice - libcontainer container kubepods-besteffort-pod7450de43_f9ed_46d3_9696_04db0352cecf.slice. 
May 13 23:54:22.130693 kubelet[2701]: I0513 23:54:22.130630 2701 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7450de43-f9ed-46d3-9696-04db0352cecf-tigera-ca-bundle\") pod \"calico-typha-75fd9d79c9-m8klp\" (UID: \"7450de43-f9ed-46d3-9696-04db0352cecf\") " pod="calico-system/calico-typha-75fd9d79c9-m8klp" May 13 23:54:22.130693 kubelet[2701]: I0513 23:54:22.130692 2701 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/7450de43-f9ed-46d3-9696-04db0352cecf-typha-certs\") pod \"calico-typha-75fd9d79c9-m8klp\" (UID: \"7450de43-f9ed-46d3-9696-04db0352cecf\") " pod="calico-system/calico-typha-75fd9d79c9-m8klp" May 13 23:54:22.131308 kubelet[2701]: I0513 23:54:22.130719 2701 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjrg2\" (UniqueName: \"kubernetes.io/projected/7450de43-f9ed-46d3-9696-04db0352cecf-kube-api-access-jjrg2\") pod \"calico-typha-75fd9d79c9-m8klp\" (UID: \"7450de43-f9ed-46d3-9696-04db0352cecf\") " pod="calico-system/calico-typha-75fd9d79c9-m8klp" May 13 23:54:22.961190 containerd[1526]: time="2025-05-13T23:54:22.961119645Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-75fd9d79c9-m8klp,Uid:7450de43-f9ed-46d3-9696-04db0352cecf,Namespace:calico-system,Attempt:0,}" May 13 23:54:23.029012 systemd[1]: Created slice kubepods-besteffort-poda9f2a2fb_780c_4d5d_9593_2723546f673c.slice - libcontainer container kubepods-besteffort-poda9f2a2fb_780c_4d5d_9593_2723546f673c.slice. 
May 13 23:54:23.135872 kubelet[2701]: I0513 23:54:23.135831 2701 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a9f2a2fb-780c-4d5d-9593-2723546f673c-lib-modules\") pod \"calico-node-jhxr7\" (UID: \"a9f2a2fb-780c-4d5d-9593-2723546f673c\") " pod="calico-system/calico-node-jhxr7" May 13 23:54:23.135872 kubelet[2701]: I0513 23:54:23.135867 2701 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/a9f2a2fb-780c-4d5d-9593-2723546f673c-cni-net-dir\") pod \"calico-node-jhxr7\" (UID: \"a9f2a2fb-780c-4d5d-9593-2723546f673c\") " pod="calico-system/calico-node-jhxr7" May 13 23:54:23.135872 kubelet[2701]: I0513 23:54:23.135904 2701 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/a9f2a2fb-780c-4d5d-9593-2723546f673c-var-run-calico\") pod \"calico-node-jhxr7\" (UID: \"a9f2a2fb-780c-4d5d-9593-2723546f673c\") " pod="calico-system/calico-node-jhxr7" May 13 23:54:23.136488 kubelet[2701]: I0513 23:54:23.135923 2701 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/a9f2a2fb-780c-4d5d-9593-2723546f673c-node-certs\") pod \"calico-node-jhxr7\" (UID: \"a9f2a2fb-780c-4d5d-9593-2723546f673c\") " pod="calico-system/calico-node-jhxr7" May 13 23:54:23.136488 kubelet[2701]: I0513 23:54:23.135938 2701 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/a9f2a2fb-780c-4d5d-9593-2723546f673c-flexvol-driver-host\") pod \"calico-node-jhxr7\" (UID: \"a9f2a2fb-780c-4d5d-9593-2723546f673c\") " pod="calico-system/calico-node-jhxr7" May 13 23:54:23.136488 kubelet[2701]: I0513 23:54:23.135955 2701 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/a9f2a2fb-780c-4d5d-9593-2723546f673c-cni-bin-dir\") pod \"calico-node-jhxr7\" (UID: \"a9f2a2fb-780c-4d5d-9593-2723546f673c\") " pod="calico-system/calico-node-jhxr7" May 13 23:54:23.136488 kubelet[2701]: I0513 23:54:23.135969 2701 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb5vk\" (UniqueName: \"kubernetes.io/projected/a9f2a2fb-780c-4d5d-9593-2723546f673c-kube-api-access-hb5vk\") pod \"calico-node-jhxr7\" (UID: \"a9f2a2fb-780c-4d5d-9593-2723546f673c\") " pod="calico-system/calico-node-jhxr7" May 13 23:54:23.136488 kubelet[2701]: I0513 23:54:23.135983 2701 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/a9f2a2fb-780c-4d5d-9593-2723546f673c-policysync\") pod \"calico-node-jhxr7\" (UID: \"a9f2a2fb-780c-4d5d-9593-2723546f673c\") " pod="calico-system/calico-node-jhxr7" May 13 23:54:23.136634 kubelet[2701]: I0513 23:54:23.135997 2701 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9f2a2fb-780c-4d5d-9593-2723546f673c-tigera-ca-bundle\") pod \"calico-node-jhxr7\" (UID: \"a9f2a2fb-780c-4d5d-9593-2723546f673c\") " pod="calico-system/calico-node-jhxr7" May 13 23:54:23.136634 kubelet[2701]: I0513 23:54:23.136014 2701 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/a9f2a2fb-780c-4d5d-9593-2723546f673c-var-lib-calico\") pod \"calico-node-jhxr7\" (UID: \"a9f2a2fb-780c-4d5d-9593-2723546f673c\") " pod="calico-system/calico-node-jhxr7" May 13 23:54:23.136634 kubelet[2701]: I0513 23:54:23.136050 2701 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/a9f2a2fb-780c-4d5d-9593-2723546f673c-xtables-lock\") pod \"calico-node-jhxr7\" (UID: \"a9f2a2fb-780c-4d5d-9593-2723546f673c\") " pod="calico-system/calico-node-jhxr7" May 13 23:54:23.136634 kubelet[2701]: I0513 23:54:23.136068 2701 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/a9f2a2fb-780c-4d5d-9593-2723546f673c-cni-log-dir\") pod \"calico-node-jhxr7\" (UID: \"a9f2a2fb-780c-4d5d-9593-2723546f673c\") " pod="calico-system/calico-node-jhxr7" May 13 23:54:23.238822 kubelet[2701]: E0513 23:54:23.238681 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:23.238822 kubelet[2701]: W0513 23:54:23.238714 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:23.238822 kubelet[2701]: E0513 23:54:23.238741 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:23.241394 kubelet[2701]: E0513 23:54:23.241191 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:23.241394 kubelet[2701]: W0513 23:54:23.241296 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:23.241394 kubelet[2701]: E0513 23:54:23.241321 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:23.337241 kubelet[2701]: E0513 23:54:23.337206 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:23.337241 kubelet[2701]: W0513 23:54:23.337225 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:23.337241 kubelet[2701]: E0513 23:54:23.337242 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:23.438360 kubelet[2701]: E0513 23:54:23.438327 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:23.438360 kubelet[2701]: W0513 23:54:23.438349 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:23.438541 kubelet[2701]: E0513 23:54:23.438370 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:23.539829 kubelet[2701]: E0513 23:54:23.539652 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:23.539829 kubelet[2701]: W0513 23:54:23.539671 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:23.539829 kubelet[2701]: E0513 23:54:23.539688 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:23.622524 containerd[1526]: time="2025-05-13T23:54:23.622417440Z" level=info msg="connecting to shim 5350366b5fa38248a60b4524c9c91be6ce188a1e892123d99a0edea8eb51057b" address="unix:///run/containerd/s/53ccd542a54282d2feaf7ef4d03f665d8fb04eb3ab95910a8d9669ded6215745" namespace=k8s.io protocol=ttrpc version=3 May 13 23:54:23.640699 kubelet[2701]: E0513 23:54:23.640613 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:23.640699 kubelet[2701]: W0513 23:54:23.640642 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:23.640699 kubelet[2701]: E0513 23:54:23.640663 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:23.657214 systemd[1]: Started cri-containerd-5350366b5fa38248a60b4524c9c91be6ce188a1e892123d99a0edea8eb51057b.scope - libcontainer container 5350366b5fa38248a60b4524c9c91be6ce188a1e892123d99a0edea8eb51057b. 
May 13 23:54:23.692940 kubelet[2701]: E0513 23:54:23.692859 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:23.692940 kubelet[2701]: W0513 23:54:23.692924 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:23.692940 kubelet[2701]: E0513 23:54:23.692950 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:23.744164 containerd[1526]: time="2025-05-13T23:54:23.744092048Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-75fd9d79c9-m8klp,Uid:7450de43-f9ed-46d3-9696-04db0352cecf,Namespace:calico-system,Attempt:0,} returns sandbox id \"5350366b5fa38248a60b4524c9c91be6ce188a1e892123d99a0edea8eb51057b\"" May 13 23:54:23.745626 containerd[1526]: time="2025-05-13T23:54:23.745573415Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.3\"" May 13 23:54:23.933302 containerd[1526]: time="2025-05-13T23:54:23.933243482Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-jhxr7,Uid:a9f2a2fb-780c-4d5d-9593-2723546f673c,Namespace:calico-system,Attempt:0,}" May 13 23:54:24.133422 kubelet[2701]: E0513 23:54:24.133108 2701 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ftfgb" podUID="a3c9bb24-adc4-4f0b-8af7-8850a622c673" May 13 23:54:24.141401 kubelet[2701]: E0513 23:54:24.141340 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.141401 kubelet[2701]: W0513 
23:54:24.141362 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.141401 kubelet[2701]: E0513 23:54:24.141380 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:24.142082 kubelet[2701]: E0513 23:54:24.141578 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.142082 kubelet[2701]: W0513 23:54:24.141586 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.142082 kubelet[2701]: E0513 23:54:24.141594 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:24.142082 kubelet[2701]: E0513 23:54:24.141919 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.142082 kubelet[2701]: W0513 23:54:24.141926 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.142082 kubelet[2701]: E0513 23:54:24.141934 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:24.142260 kubelet[2701]: E0513 23:54:24.142115 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.142260 kubelet[2701]: W0513 23:54:24.142122 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.142260 kubelet[2701]: E0513 23:54:24.142130 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:24.142354 kubelet[2701]: E0513 23:54:24.142345 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.142388 kubelet[2701]: W0513 23:54:24.142355 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.142388 kubelet[2701]: E0513 23:54:24.142363 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:24.142591 kubelet[2701]: E0513 23:54:24.142560 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.142591 kubelet[2701]: W0513 23:54:24.142573 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.142591 kubelet[2701]: E0513 23:54:24.142581 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:24.142778 kubelet[2701]: E0513 23:54:24.142748 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.142778 kubelet[2701]: W0513 23:54:24.142766 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.142778 kubelet[2701]: E0513 23:54:24.142774 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:24.142987 kubelet[2701]: E0513 23:54:24.142963 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.142987 kubelet[2701]: W0513 23:54:24.142976 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.142987 kubelet[2701]: E0513 23:54:24.142984 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:24.143170 kubelet[2701]: E0513 23:54:24.143147 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.143170 kubelet[2701]: W0513 23:54:24.143158 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.143170 kubelet[2701]: E0513 23:54:24.143166 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:24.143368 kubelet[2701]: E0513 23:54:24.143346 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.143368 kubelet[2701]: W0513 23:54:24.143357 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.143368 kubelet[2701]: E0513 23:54:24.143365 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:24.143591 kubelet[2701]: E0513 23:54:24.143573 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.143591 kubelet[2701]: W0513 23:54:24.143587 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.143648 kubelet[2701]: E0513 23:54:24.143596 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:24.143824 kubelet[2701]: E0513 23:54:24.143807 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.143824 kubelet[2701]: W0513 23:54:24.143818 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.143926 kubelet[2701]: E0513 23:54:24.143826 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:24.144039 kubelet[2701]: E0513 23:54:24.144023 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.144039 kubelet[2701]: W0513 23:54:24.144033 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.144039 kubelet[2701]: E0513 23:54:24.144041 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:24.144208 kubelet[2701]: E0513 23:54:24.144193 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.144244 kubelet[2701]: W0513 23:54:24.144227 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.144244 kubelet[2701]: E0513 23:54:24.144238 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:24.144464 kubelet[2701]: E0513 23:54:24.144449 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.144464 kubelet[2701]: W0513 23:54:24.144459 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.144464 kubelet[2701]: E0513 23:54:24.144467 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:24.144641 kubelet[2701]: E0513 23:54:24.144625 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.144641 kubelet[2701]: W0513 23:54:24.144636 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.144714 kubelet[2701]: E0513 23:54:24.144644 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:24.144865 kubelet[2701]: E0513 23:54:24.144848 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.144865 kubelet[2701]: W0513 23:54:24.144859 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.144865 kubelet[2701]: E0513 23:54:24.144867 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:24.145068 kubelet[2701]: E0513 23:54:24.145052 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.145068 kubelet[2701]: W0513 23:54:24.145063 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.145142 kubelet[2701]: E0513 23:54:24.145070 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:24.145254 kubelet[2701]: E0513 23:54:24.145237 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.145254 kubelet[2701]: W0513 23:54:24.145248 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.145335 kubelet[2701]: E0513 23:54:24.145255 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:24.145425 kubelet[2701]: E0513 23:54:24.145409 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.145425 kubelet[2701]: W0513 23:54:24.145419 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.145425 kubelet[2701]: E0513 23:54:24.145426 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:24.145676 kubelet[2701]: E0513 23:54:24.145660 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.145676 kubelet[2701]: W0513 23:54:24.145671 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.145737 kubelet[2701]: E0513 23:54:24.145679 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:24.145737 kubelet[2701]: I0513 23:54:24.145705 2701 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a3c9bb24-adc4-4f0b-8af7-8850a622c673-registration-dir\") pod \"csi-node-driver-ftfgb\" (UID: \"a3c9bb24-adc4-4f0b-8af7-8850a622c673\") " pod="calico-system/csi-node-driver-ftfgb" May 13 23:54:24.145952 kubelet[2701]: E0513 23:54:24.145937 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.145952 kubelet[2701]: W0513 23:54:24.145948 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.146020 kubelet[2701]: E0513 23:54:24.145961 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:24.146020 kubelet[2701]: I0513 23:54:24.145976 2701 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a3c9bb24-adc4-4f0b-8af7-8850a622c673-kubelet-dir\") pod \"csi-node-driver-ftfgb\" (UID: \"a3c9bb24-adc4-4f0b-8af7-8850a622c673\") " pod="calico-system/csi-node-driver-ftfgb" May 13 23:54:24.146207 kubelet[2701]: E0513 23:54:24.146178 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.146207 kubelet[2701]: W0513 23:54:24.146203 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.146275 kubelet[2701]: E0513 23:54:24.146220 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:24.146404 kubelet[2701]: E0513 23:54:24.146388 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.146404 kubelet[2701]: W0513 23:54:24.146400 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.146479 kubelet[2701]: E0513 23:54:24.146413 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:24.146760 kubelet[2701]: E0513 23:54:24.146707 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.146792 kubelet[2701]: W0513 23:54:24.146743 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.146826 kubelet[2701]: E0513 23:54:24.146794 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:24.146863 kubelet[2701]: I0513 23:54:24.146829 2701 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a3c9bb24-adc4-4f0b-8af7-8850a622c673-socket-dir\") pod \"csi-node-driver-ftfgb\" (UID: \"a3c9bb24-adc4-4f0b-8af7-8850a622c673\") " pod="calico-system/csi-node-driver-ftfgb" May 13 23:54:24.147111 kubelet[2701]: E0513 23:54:24.147085 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.147111 kubelet[2701]: W0513 23:54:24.147100 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.147191 kubelet[2701]: E0513 23:54:24.147117 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:24.147191 kubelet[2701]: I0513 23:54:24.147133 2701 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8slc\" (UniqueName: \"kubernetes.io/projected/a3c9bb24-adc4-4f0b-8af7-8850a622c673-kube-api-access-p8slc\") pod \"csi-node-driver-ftfgb\" (UID: \"a3c9bb24-adc4-4f0b-8af7-8850a622c673\") " pod="calico-system/csi-node-driver-ftfgb" May 13 23:54:24.147368 kubelet[2701]: E0513 23:54:24.147343 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.147368 kubelet[2701]: W0513 23:54:24.147356 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.147438 kubelet[2701]: E0513 23:54:24.147369 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:24.147564 kubelet[2701]: E0513 23:54:24.147545 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.147564 kubelet[2701]: W0513 23:54:24.147559 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.147641 kubelet[2701]: E0513 23:54:24.147573 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:24.147771 kubelet[2701]: E0513 23:54:24.147738 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.147771 kubelet[2701]: W0513 23:54:24.147757 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.147771 kubelet[2701]: E0513 23:54:24.147770 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:24.148094 kubelet[2701]: E0513 23:54:24.148055 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.148094 kubelet[2701]: W0513 23:54:24.148088 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.148249 kubelet[2701]: E0513 23:54:24.148129 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:24.148499 kubelet[2701]: E0513 23:54:24.148475 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.148499 kubelet[2701]: W0513 23:54:24.148488 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.148551 kubelet[2701]: E0513 23:54:24.148503 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:24.148551 kubelet[2701]: I0513 23:54:24.148540 2701 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/a3c9bb24-adc4-4f0b-8af7-8850a622c673-varrun\") pod \"csi-node-driver-ftfgb\" (UID: \"a3c9bb24-adc4-4f0b-8af7-8850a622c673\") " pod="calico-system/csi-node-driver-ftfgb" May 13 23:54:24.148912 kubelet[2701]: E0513 23:54:24.148871 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.148912 kubelet[2701]: W0513 23:54:24.148902 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.148982 kubelet[2701]: E0513 23:54:24.148919 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:24.149170 kubelet[2701]: E0513 23:54:24.149143 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.149170 kubelet[2701]: W0513 23:54:24.149161 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.149243 kubelet[2701]: E0513 23:54:24.149181 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:24.149515 kubelet[2701]: E0513 23:54:24.149470 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.149515 kubelet[2701]: W0513 23:54:24.149489 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.149515 kubelet[2701]: E0513 23:54:24.149503 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:24.149730 kubelet[2701]: E0513 23:54:24.149709 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.149730 kubelet[2701]: W0513 23:54:24.149719 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.149730 kubelet[2701]: E0513 23:54:24.149728 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:24.250309 kubelet[2701]: E0513 23:54:24.250122 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.250309 kubelet[2701]: W0513 23:54:24.250154 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.250309 kubelet[2701]: E0513 23:54:24.250177 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:24.250706 kubelet[2701]: E0513 23:54:24.250492 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.250706 kubelet[2701]: W0513 23:54:24.250535 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.250706 kubelet[2701]: E0513 23:54:24.250574 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:24.251212 kubelet[2701]: E0513 23:54:24.251183 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.251212 kubelet[2701]: W0513 23:54:24.251203 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.251212 kubelet[2701]: E0513 23:54:24.251224 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:24.251711 kubelet[2701]: E0513 23:54:24.251519 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.251711 kubelet[2701]: W0513 23:54:24.251532 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.251711 kubelet[2701]: E0513 23:54:24.251560 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:24.251954 kubelet[2701]: E0513 23:54:24.251745 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.251954 kubelet[2701]: W0513 23:54:24.251768 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.251954 kubelet[2701]: E0513 23:54:24.251832 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:24.252040 kubelet[2701]: E0513 23:54:24.252016 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.252040 kubelet[2701]: W0513 23:54:24.252028 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.252117 kubelet[2701]: E0513 23:54:24.252093 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:24.252253 kubelet[2701]: E0513 23:54:24.252236 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.252253 kubelet[2701]: W0513 23:54:24.252249 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.252340 kubelet[2701]: E0513 23:54:24.252266 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:24.252548 kubelet[2701]: E0513 23:54:24.252531 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.252548 kubelet[2701]: W0513 23:54:24.252543 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.252647 kubelet[2701]: E0513 23:54:24.252557 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:24.252992 kubelet[2701]: E0513 23:54:24.252973 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.252992 kubelet[2701]: W0513 23:54:24.252990 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.253089 kubelet[2701]: E0513 23:54:24.253065 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:24.253358 kubelet[2701]: E0513 23:54:24.253340 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.253358 kubelet[2701]: W0513 23:54:24.253356 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.253433 kubelet[2701]: E0513 23:54:24.253391 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:24.253728 kubelet[2701]: E0513 23:54:24.253708 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.253728 kubelet[2701]: W0513 23:54:24.253724 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.253829 kubelet[2701]: E0513 23:54:24.253786 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:24.254141 kubelet[2701]: E0513 23:54:24.254120 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.254141 kubelet[2701]: W0513 23:54:24.254135 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.254229 kubelet[2701]: E0513 23:54:24.254187 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:24.254431 kubelet[2701]: E0513 23:54:24.254413 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.254431 kubelet[2701]: W0513 23:54:24.254425 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.254513 kubelet[2701]: E0513 23:54:24.254458 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:24.254712 kubelet[2701]: E0513 23:54:24.254694 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.254712 kubelet[2701]: W0513 23:54:24.254705 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.254802 kubelet[2701]: E0513 23:54:24.254741 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:24.255017 kubelet[2701]: E0513 23:54:24.254997 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.255017 kubelet[2701]: W0513 23:54:24.255015 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.255106 kubelet[2701]: E0513 23:54:24.255063 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:24.255287 kubelet[2701]: E0513 23:54:24.255269 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.255287 kubelet[2701]: W0513 23:54:24.255283 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.255408 kubelet[2701]: E0513 23:54:24.255381 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:24.255707 kubelet[2701]: E0513 23:54:24.255672 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.255707 kubelet[2701]: W0513 23:54:24.255688 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.255707 kubelet[2701]: E0513 23:54:24.255711 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:24.255978 kubelet[2701]: E0513 23:54:24.255953 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.255978 kubelet[2701]: W0513 23:54:24.255965 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.256039 kubelet[2701]: E0513 23:54:24.255997 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:24.256288 kubelet[2701]: E0513 23:54:24.256251 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.256341 kubelet[2701]: W0513 23:54:24.256284 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.256380 kubelet[2701]: E0513 23:54:24.256351 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:24.256744 kubelet[2701]: E0513 23:54:24.256723 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.256744 kubelet[2701]: W0513 23:54:24.256738 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.256849 kubelet[2701]: E0513 23:54:24.256810 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:24.257145 kubelet[2701]: E0513 23:54:24.257123 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.257145 kubelet[2701]: W0513 23:54:24.257140 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.257224 kubelet[2701]: E0513 23:54:24.257180 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:24.257448 kubelet[2701]: E0513 23:54:24.257432 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.257448 kubelet[2701]: W0513 23:54:24.257445 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.257504 kubelet[2701]: E0513 23:54:24.257479 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:24.257677 kubelet[2701]: E0513 23:54:24.257663 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.257677 kubelet[2701]: W0513 23:54:24.257673 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.257746 kubelet[2701]: E0513 23:54:24.257725 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:24.258034 kubelet[2701]: E0513 23:54:24.258008 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.258034 kubelet[2701]: W0513 23:54:24.258024 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.258034 kubelet[2701]: E0513 23:54:24.258040 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:24.258332 kubelet[2701]: E0513 23:54:24.258318 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.258332 kubelet[2701]: W0513 23:54:24.258329 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.258387 kubelet[2701]: E0513 23:54:24.258337 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:24.354208 kubelet[2701]: E0513 23:54:24.354143 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.354208 kubelet[2701]: W0513 23:54:24.354172 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.354208 kubelet[2701]: E0513 23:54:24.354196 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:24.454949 kubelet[2701]: E0513 23:54:24.454908 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.454949 kubelet[2701]: W0513 23:54:24.454940 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.455136 kubelet[2701]: E0513 23:54:24.454965 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:24.556839 kubelet[2701]: E0513 23:54:24.556663 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.556839 kubelet[2701]: W0513 23:54:24.556696 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.556839 kubelet[2701]: E0513 23:54:24.556721 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:24.658447 kubelet[2701]: E0513 23:54:24.658376 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.658447 kubelet[2701]: W0513 23:54:24.658406 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.658447 kubelet[2701]: E0513 23:54:24.658427 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:24.760154 kubelet[2701]: E0513 23:54:24.759823 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.760154 kubelet[2701]: W0513 23:54:24.759856 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.760154 kubelet[2701]: E0513 23:54:24.759914 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:24.802458 kubelet[2701]: E0513 23:54:24.802414 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:24.802458 kubelet[2701]: W0513 23:54:24.802445 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:24.802458 kubelet[2701]: E0513 23:54:24.802465 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:25.642146 kubelet[2701]: E0513 23:54:25.641208 2701 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ftfgb" podUID="a3c9bb24-adc4-4f0b-8af7-8850a622c673" May 13 23:54:26.065410 containerd[1526]: time="2025-05-13T23:54:26.065234901Z" level=info msg="connecting to shim 24817bf0eb3ce6f6d46d17daa32d42afa2692ecf7592fe4903e221939b19c260" address="unix:///run/containerd/s/da7c54cdaff18fe6a27df05ce558463ba2d9ffcfe65afbfaec41903495f0b3a4" namespace=k8s.io protocol=ttrpc version=3 May 13 23:54:26.097055 systemd[1]: Started cri-containerd-24817bf0eb3ce6f6d46d17daa32d42afa2692ecf7592fe4903e221939b19c260.scope - libcontainer container 24817bf0eb3ce6f6d46d17daa32d42afa2692ecf7592fe4903e221939b19c260. May 13 23:54:26.739693 containerd[1526]: time="2025-05-13T23:54:26.739634448Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-jhxr7,Uid:a9f2a2fb-780c-4d5d-9593-2723546f673c,Namespace:calico-system,Attempt:0,} returns sandbox id \"24817bf0eb3ce6f6d46d17daa32d42afa2692ecf7592fe4903e221939b19c260\"" May 13 23:54:27.637506 kubelet[2701]: E0513 23:54:27.637435 2701 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ftfgb" podUID="a3c9bb24-adc4-4f0b-8af7-8850a622c673" May 13 23:54:29.637224 kubelet[2701]: E0513 23:54:29.637172 2701 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" 
pod="calico-system/csi-node-driver-ftfgb" podUID="a3c9bb24-adc4-4f0b-8af7-8850a622c673" May 13 23:54:31.174615 containerd[1526]: time="2025-05-13T23:54:31.174517904Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:54:31.637475 kubelet[2701]: E0513 23:54:31.637401 2701 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ftfgb" podUID="a3c9bb24-adc4-4f0b-8af7-8850a622c673" May 13 23:54:32.076912 containerd[1526]: time="2025-05-13T23:54:32.076675698Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.3: active requests=0, bytes read=30426870" May 13 23:54:32.121484 containerd[1526]: time="2025-05-13T23:54:32.121390950Z" level=info msg="ImageCreate event name:\"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:54:32.196686 containerd[1526]: time="2025-05-13T23:54:32.196602437Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f5516aa6a78f00931d2625f3012dcf2c69d141ce41483b8d59c6ec6330a18620\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:54:32.197434 containerd[1526]: time="2025-05-13T23:54:32.197387580Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.3\" with image id \"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f5516aa6a78f00931d2625f3012dcf2c69d141ce41483b8d59c6ec6330a18620\", size \"31919484\" in 8.451775412s" May 13 23:54:32.197508 containerd[1526]: time="2025-05-13T23:54:32.197438587Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/typha:v3.29.3\" returns image reference \"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\"" May 13 23:54:32.198548 containerd[1526]: time="2025-05-13T23:54:32.198480870Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\"" May 13 23:54:32.206837 containerd[1526]: time="2025-05-13T23:54:32.206577691Z" level=info msg="CreateContainer within sandbox \"5350366b5fa38248a60b4524c9c91be6ce188a1e892123d99a0edea8eb51057b\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" May 13 23:54:32.677482 containerd[1526]: time="2025-05-13T23:54:32.677407209Z" level=info msg="Container 83b683066caac9ec7df572a1d3865338227a2d160dcbe913152cbdc836e62693: CDI devices from CRI Config.CDIDevices: []" May 13 23:54:33.383647 containerd[1526]: time="2025-05-13T23:54:33.383582724Z" level=info msg="CreateContainer within sandbox \"5350366b5fa38248a60b4524c9c91be6ce188a1e892123d99a0edea8eb51057b\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"83b683066caac9ec7df572a1d3865338227a2d160dcbe913152cbdc836e62693\"" May 13 23:54:33.384188 containerd[1526]: time="2025-05-13T23:54:33.383984658Z" level=info msg="StartContainer for \"83b683066caac9ec7df572a1d3865338227a2d160dcbe913152cbdc836e62693\"" May 13 23:54:33.385236 containerd[1526]: time="2025-05-13T23:54:33.385207734Z" level=info msg="connecting to shim 83b683066caac9ec7df572a1d3865338227a2d160dcbe913152cbdc836e62693" address="unix:///run/containerd/s/53ccd542a54282d2feaf7ef4d03f665d8fb04eb3ab95910a8d9669ded6215745" protocol=ttrpc version=3 May 13 23:54:33.408053 systemd[1]: Started cri-containerd-83b683066caac9ec7df572a1d3865338227a2d160dcbe913152cbdc836e62693.scope - libcontainer container 83b683066caac9ec7df572a1d3865338227a2d160dcbe913152cbdc836e62693. 
May 13 23:54:33.604640 containerd[1526]: time="2025-05-13T23:54:33.604594216Z" level=info msg="StartContainer for \"83b683066caac9ec7df572a1d3865338227a2d160dcbe913152cbdc836e62693\" returns successfully" May 13 23:54:33.638118 kubelet[2701]: E0513 23:54:33.637956 2701 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ftfgb" podUID="a3c9bb24-adc4-4f0b-8af7-8850a622c673" May 13 23:54:33.811971 kubelet[2701]: E0513 23:54:33.811922 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:33.811971 kubelet[2701]: W0513 23:54:33.811952 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:33.811971 kubelet[2701]: E0513 23:54:33.811975 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:33.812309 kubelet[2701]: E0513 23:54:33.812197 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:33.812309 kubelet[2701]: W0513 23:54:33.812206 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:33.812309 kubelet[2701]: E0513 23:54:33.812215 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:33.812451 kubelet[2701]: E0513 23:54:33.812431 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:33.812451 kubelet[2701]: W0513 23:54:33.812442 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:33.812451 kubelet[2701]: E0513 23:54:33.812450 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:33.812678 kubelet[2701]: E0513 23:54:33.812647 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:33.812678 kubelet[2701]: W0513 23:54:33.812660 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:33.812678 kubelet[2701]: E0513 23:54:33.812667 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:33.812903 kubelet[2701]: E0513 23:54:33.812870 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:33.812903 kubelet[2701]: W0513 23:54:33.812895 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:33.812903 kubelet[2701]: E0513 23:54:33.812903 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:33.813118 kubelet[2701]: E0513 23:54:33.813098 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:33.813118 kubelet[2701]: W0513 23:54:33.813110 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:33.813118 kubelet[2701]: E0513 23:54:33.813118 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:33.813321 kubelet[2701]: E0513 23:54:33.813303 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:33.813321 kubelet[2701]: W0513 23:54:33.813315 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:33.813321 kubelet[2701]: E0513 23:54:33.813323 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:33.813522 kubelet[2701]: E0513 23:54:33.813504 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:33.813522 kubelet[2701]: W0513 23:54:33.813515 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:33.813522 kubelet[2701]: E0513 23:54:33.813523 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:33.813763 kubelet[2701]: E0513 23:54:33.813717 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:33.813763 kubelet[2701]: W0513 23:54:33.813737 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:33.813763 kubelet[2701]: E0513 23:54:33.813744 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:33.813966 kubelet[2701]: E0513 23:54:33.813953 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:33.813966 kubelet[2701]: W0513 23:54:33.813962 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:33.814014 kubelet[2701]: E0513 23:54:33.813972 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:33.814188 kubelet[2701]: E0513 23:54:33.814167 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:33.814188 kubelet[2701]: W0513 23:54:33.814179 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:33.814188 kubelet[2701]: E0513 23:54:33.814187 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:33.814424 kubelet[2701]: E0513 23:54:33.814405 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:33.814424 kubelet[2701]: W0513 23:54:33.814416 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:33.814424 kubelet[2701]: E0513 23:54:33.814424 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:33.814633 kubelet[2701]: E0513 23:54:33.814614 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:33.814633 kubelet[2701]: W0513 23:54:33.814625 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:33.814633 kubelet[2701]: E0513 23:54:33.814633 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:33.814847 kubelet[2701]: E0513 23:54:33.814826 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:33.814847 kubelet[2701]: W0513 23:54:33.814839 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:33.814929 kubelet[2701]: E0513 23:54:33.814850 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:33.815178 kubelet[2701]: E0513 23:54:33.815153 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:33.815178 kubelet[2701]: W0513 23:54:33.815167 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:33.815178 kubelet[2701]: E0513 23:54:33.815175 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:33.826680 kubelet[2701]: E0513 23:54:33.826645 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:33.826680 kubelet[2701]: W0513 23:54:33.826668 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:33.826810 kubelet[2701]: E0513 23:54:33.826690 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:33.826986 kubelet[2701]: E0513 23:54:33.826959 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:33.826986 kubelet[2701]: W0513 23:54:33.826979 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:33.827057 kubelet[2701]: E0513 23:54:33.826998 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:33.827314 kubelet[2701]: E0513 23:54:33.827283 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:33.827314 kubelet[2701]: W0513 23:54:33.827305 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:33.827369 kubelet[2701]: E0513 23:54:33.827326 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:33.827549 kubelet[2701]: E0513 23:54:33.827527 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:33.827549 kubelet[2701]: W0513 23:54:33.827539 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:33.827609 kubelet[2701]: E0513 23:54:33.827553 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:33.827767 kubelet[2701]: E0513 23:54:33.827745 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:33.827767 kubelet[2701]: W0513 23:54:33.827758 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:33.827833 kubelet[2701]: E0513 23:54:33.827771 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:33.828033 kubelet[2701]: E0513 23:54:33.828004 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:33.828033 kubelet[2701]: W0513 23:54:33.828024 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:33.828100 kubelet[2701]: E0513 23:54:33.828042 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:33.828277 kubelet[2701]: E0513 23:54:33.828251 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:33.828277 kubelet[2701]: W0513 23:54:33.828265 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:33.828334 kubelet[2701]: E0513 23:54:33.828279 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:33.828546 kubelet[2701]: E0513 23:54:33.828523 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:33.828546 kubelet[2701]: W0513 23:54:33.828538 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:33.828606 kubelet[2701]: E0513 23:54:33.828554 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:33.828916 kubelet[2701]: E0513 23:54:33.828869 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:33.828916 kubelet[2701]: W0513 23:54:33.828912 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:33.828978 kubelet[2701]: E0513 23:54:33.828931 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:33.829183 kubelet[2701]: E0513 23:54:33.829167 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:33.829183 kubelet[2701]: W0513 23:54:33.829178 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:33.829238 kubelet[2701]: E0513 23:54:33.829192 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:33.829399 kubelet[2701]: E0513 23:54:33.829385 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:33.829399 kubelet[2701]: W0513 23:54:33.829395 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:33.829451 kubelet[2701]: E0513 23:54:33.829407 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:33.829606 kubelet[2701]: E0513 23:54:33.829592 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:33.829606 kubelet[2701]: W0513 23:54:33.829602 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:33.829657 kubelet[2701]: E0513 23:54:33.829615 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:33.829949 kubelet[2701]: E0513 23:54:33.829928 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:33.829949 kubelet[2701]: W0513 23:54:33.829941 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:33.830013 kubelet[2701]: E0513 23:54:33.829955 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:33.830177 kubelet[2701]: E0513 23:54:33.830147 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:33.830177 kubelet[2701]: W0513 23:54:33.830157 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:33.830177 kubelet[2701]: E0513 23:54:33.830168 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:33.830356 kubelet[2701]: E0513 23:54:33.830341 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:33.830356 kubelet[2701]: W0513 23:54:33.830351 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:33.830429 kubelet[2701]: E0513 23:54:33.830365 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:33.830608 kubelet[2701]: E0513 23:54:33.830587 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:33.830649 kubelet[2701]: W0513 23:54:33.830607 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:33.830649 kubelet[2701]: E0513 23:54:33.830633 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:33.830925 kubelet[2701]: E0513 23:54:33.830912 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:33.830925 kubelet[2701]: W0513 23:54:33.830922 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:33.830977 kubelet[2701]: E0513 23:54:33.830931 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:33.831579 kubelet[2701]: E0513 23:54:33.831560 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:33.831579 kubelet[2701]: W0513 23:54:33.831574 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:33.831653 kubelet[2701]: E0513 23:54:33.831585 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:34.823521 kubelet[2701]: E0513 23:54:34.823484 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:34.823521 kubelet[2701]: W0513 23:54:34.823509 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:34.823521 kubelet[2701]: E0513 23:54:34.823529 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:34.824134 kubelet[2701]: E0513 23:54:34.823731 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:34.824134 kubelet[2701]: W0513 23:54:34.823741 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:34.824134 kubelet[2701]: E0513 23:54:34.823751 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:34.824134 kubelet[2701]: E0513 23:54:34.823956 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:34.824134 kubelet[2701]: W0513 23:54:34.823965 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:34.824134 kubelet[2701]: E0513 23:54:34.823975 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:34.824283 kubelet[2701]: E0513 23:54:34.824167 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:34.824283 kubelet[2701]: W0513 23:54:34.824177 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:34.824283 kubelet[2701]: E0513 23:54:34.824186 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:34.824380 kubelet[2701]: E0513 23:54:34.824366 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:34.824380 kubelet[2701]: W0513 23:54:34.824377 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:34.824430 kubelet[2701]: E0513 23:54:34.824386 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:34.824576 kubelet[2701]: E0513 23:54:34.824556 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:34.824576 kubelet[2701]: W0513 23:54:34.824568 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:34.824629 kubelet[2701]: E0513 23:54:34.824577 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:34.824786 kubelet[2701]: E0513 23:54:34.824767 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:34.824786 kubelet[2701]: W0513 23:54:34.824780 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:34.824834 kubelet[2701]: E0513 23:54:34.824790 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:34.824995 kubelet[2701]: E0513 23:54:34.824976 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:34.824995 kubelet[2701]: W0513 23:54:34.824986 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:34.824995 kubelet[2701]: E0513 23:54:34.824995 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:34.825164 kubelet[2701]: E0513 23:54:34.825153 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:34.825164 kubelet[2701]: W0513 23:54:34.825161 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:34.825214 kubelet[2701]: E0513 23:54:34.825168 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:34.825333 kubelet[2701]: E0513 23:54:34.825321 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:34.825365 kubelet[2701]: W0513 23:54:34.825332 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:34.825365 kubelet[2701]: E0513 23:54:34.825340 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:34.825504 kubelet[2701]: E0513 23:54:34.825493 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:34.825504 kubelet[2701]: W0513 23:54:34.825502 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:34.825549 kubelet[2701]: E0513 23:54:34.825509 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:34.825686 kubelet[2701]: E0513 23:54:34.825669 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:34.825686 kubelet[2701]: W0513 23:54:34.825679 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:34.825755 kubelet[2701]: E0513 23:54:34.825686 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:34.825857 kubelet[2701]: E0513 23:54:34.825846 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:34.825857 kubelet[2701]: W0513 23:54:34.825854 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:34.825929 kubelet[2701]: E0513 23:54:34.825862 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:34.826068 kubelet[2701]: E0513 23:54:34.826056 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:34.826068 kubelet[2701]: W0513 23:54:34.826067 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:34.826126 kubelet[2701]: E0513 23:54:34.826074 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:34.826246 kubelet[2701]: E0513 23:54:34.826235 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:34.826246 kubelet[2701]: W0513 23:54:34.826244 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:34.826304 kubelet[2701]: E0513 23:54:34.826251 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:34.834698 kubelet[2701]: E0513 23:54:34.834646 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:34.834698 kubelet[2701]: W0513 23:54:34.834663 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:34.834698 kubelet[2701]: E0513 23:54:34.834675 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:34.835036 kubelet[2701]: E0513 23:54:34.835006 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:34.835036 kubelet[2701]: W0513 23:54:34.835015 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:34.835036 kubelet[2701]: E0513 23:54:34.835028 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:34.835275 kubelet[2701]: E0513 23:54:34.835241 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:34.835275 kubelet[2701]: W0513 23:54:34.835259 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:34.835349 kubelet[2701]: E0513 23:54:34.835276 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:54:34.835528 kubelet[2701]: E0513 23:54:34.835500 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:34.835528 kubelet[2701]: W0513 23:54:34.835517 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:34.835579 kubelet[2701]: E0513 23:54:34.835532 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:34.851578 kubelet[2701]: I0513 23:54:34.850762 2701 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-75fd9d79c9-m8klp" podStartSLOduration=5.397661076 podStartE2EDuration="13.850748185s" podCreationTimestamp="2025-05-13 23:54:21 +0000 UTC" firstStartedPulling="2025-05-13 23:54:23.745279424 +0000 UTC m=+34.205825630" lastFinishedPulling="2025-05-13 23:54:32.198366533 +0000 UTC m=+42.658912739" observedRunningTime="2025-05-13 23:54:33.896770093 +0000 UTC m=+44.357316299" watchObservedRunningTime="2025-05-13 23:54:34.850748185 +0000 UTC m=+45.311294391" May 13 23:54:35.637381 kubelet[2701]: E0513 23:54:35.637335 2701 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ftfgb" podUID="a3c9bb24-adc4-4f0b-8af7-8850a622c673" May 13 23:54:35.833904 kubelet[2701]: E0513 23:54:35.833845 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:54:35.833904 kubelet[2701]: W0513 23:54:35.833871 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:54:35.833904 kubelet[2701]: E0513 23:54:35.833917 2701 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:54:37.637408 kubelet[2701]: E0513 23:54:37.637267 2701 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ftfgb" podUID="a3c9bb24-adc4-4f0b-8af7-8850a622c673" May 13 23:54:39.183417 containerd[1526]: time="2025-05-13T23:54:39.183228841Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:54:39.283706 containerd[1526]: time="2025-05-13T23:54:39.283575924Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3: active requests=0, bytes read=5366937" May 13 23:54:39.376277 containerd[1526]: time="2025-05-13T23:54:39.376209794Z" level=info msg="ImageCreate event name:\"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:54:39.494764 containerd[1526]: time="2025-05-13T23:54:39.494552755Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:eeaa2bb4f9b1aa61adde43ce6dea95eee89291f96963548e108d9a2dfbc5edd1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:54:39.495507 containerd[1526]: time="2025-05-13T23:54:39.495445009Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" with image id \"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:eeaa2bb4f9b1aa61adde43ce6dea95eee89291f96963548e108d9a2dfbc5edd1\", size \"6859519\" in 7.296930013s" May 13 23:54:39.495507 containerd[1526]: time="2025-05-13T23:54:39.495498931Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" returns image reference \"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\"" May 13 23:54:39.497833 containerd[1526]: time="2025-05-13T23:54:39.497640856Z" level=info msg="CreateContainer within sandbox \"24817bf0eb3ce6f6d46d17daa32d42afa2692ecf7592fe4903e221939b19c260\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" May 13 23:54:39.637268 kubelet[2701]: E0513 23:54:39.637201 2701 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ftfgb" podUID="a3c9bb24-adc4-4f0b-8af7-8850a622c673" May 13 23:54:39.973059 containerd[1526]: time="2025-05-13T23:54:39.973002997Z" level=info msg="Container 418185eb89aad981e755dc9c21f5741e92a2ae842614d3c3ebdb84a64008a8fe: CDI devices from CRI Config.CDIDevices: []" May 13 23:54:40.297739 containerd[1526]: time="2025-05-13T23:54:40.297505179Z" level=info msg="CreateContainer within sandbox \"24817bf0eb3ce6f6d46d17daa32d42afa2692ecf7592fe4903e221939b19c260\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"418185eb89aad981e755dc9c21f5741e92a2ae842614d3c3ebdb84a64008a8fe\"" May 13 23:54:40.298707 containerd[1526]: time="2025-05-13T23:54:40.298605958Z" level=info msg="StartContainer for \"418185eb89aad981e755dc9c21f5741e92a2ae842614d3c3ebdb84a64008a8fe\"" May 13 23:54:40.300787 containerd[1526]: time="2025-05-13T23:54:40.300737322Z" level=info msg="connecting to shim 418185eb89aad981e755dc9c21f5741e92a2ae842614d3c3ebdb84a64008a8fe" address="unix:///run/containerd/s/da7c54cdaff18fe6a27df05ce558463ba2d9ffcfe65afbfaec41903495f0b3a4" protocol=ttrpc version=3 May 13 23:54:40.324102 systemd[1]: Started cri-containerd-418185eb89aad981e755dc9c21f5741e92a2ae842614d3c3ebdb84a64008a8fe.scope - libcontainer container 
418185eb89aad981e755dc9c21f5741e92a2ae842614d3c3ebdb84a64008a8fe. May 13 23:54:40.392981 systemd[1]: cri-containerd-418185eb89aad981e755dc9c21f5741e92a2ae842614d3c3ebdb84a64008a8fe.scope: Deactivated successfully. May 13 23:54:40.393400 systemd[1]: cri-containerd-418185eb89aad981e755dc9c21f5741e92a2ae842614d3c3ebdb84a64008a8fe.scope: Consumed 54ms CPU time, 8.5M memory peak, 5.7M written to disk. May 13 23:54:40.394491 containerd[1526]: time="2025-05-13T23:54:40.394448032Z" level=info msg="TaskExit event in podsandbox handler container_id:\"418185eb89aad981e755dc9c21f5741e92a2ae842614d3c3ebdb84a64008a8fe\" id:\"418185eb89aad981e755dc9c21f5741e92a2ae842614d3c3ebdb84a64008a8fe\" pid:3449 exited_at:{seconds:1747180480 nanos:393918767}" May 13 23:54:41.040505 containerd[1526]: time="2025-05-13T23:54:41.040442047Z" level=info msg="received exit event container_id:\"418185eb89aad981e755dc9c21f5741e92a2ae842614d3c3ebdb84a64008a8fe\" id:\"418185eb89aad981e755dc9c21f5741e92a2ae842614d3c3ebdb84a64008a8fe\" pid:3449 exited_at:{seconds:1747180480 nanos:393918767}" May 13 23:54:41.042983 containerd[1526]: time="2025-05-13T23:54:41.042472559Z" level=info msg="StartContainer for \"418185eb89aad981e755dc9c21f5741e92a2ae842614d3c3ebdb84a64008a8fe\" returns successfully" May 13 23:54:41.063861 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-418185eb89aad981e755dc9c21f5741e92a2ae842614d3c3ebdb84a64008a8fe-rootfs.mount: Deactivated successfully. 
May 13 23:54:41.638868 kubelet[2701]: E0513 23:54:41.638354 2701 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ftfgb" podUID="a3c9bb24-adc4-4f0b-8af7-8850a622c673" May 13 23:54:42.057589 containerd[1526]: time="2025-05-13T23:54:42.057425838Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.3\"" May 13 23:54:43.637602 kubelet[2701]: E0513 23:54:43.637533 2701 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ftfgb" podUID="a3c9bb24-adc4-4f0b-8af7-8850a622c673" May 13 23:54:43.653241 systemd[1]: Started sshd@7-10.0.0.42:22-10.0.0.1:55690.service - OpenSSH per-connection server daemon (10.0.0.1:55690). May 13 23:54:43.832800 sshd[3490]: Accepted publickey for core from 10.0.0.1 port 55690 ssh2: RSA SHA256:UIO2GBLwcS3ioFABoZ1D2izRTMNLszi+/rE/G21mFYQ May 13 23:54:43.834803 sshd-session[3490]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:54:43.839597 systemd-logind[1504]: New session 8 of user core. May 13 23:54:43.849023 systemd[1]: Started session-8.scope - Session 8 of User core. May 13 23:54:44.158473 sshd[3492]: Connection closed by 10.0.0.1 port 55690 May 13 23:54:44.158834 sshd-session[3490]: pam_unix(sshd:session): session closed for user core May 13 23:54:44.164769 systemd[1]: sshd@7-10.0.0.42:22-10.0.0.1:55690.service: Deactivated successfully. May 13 23:54:44.168062 systemd[1]: session-8.scope: Deactivated successfully. May 13 23:54:44.168976 systemd-logind[1504]: Session 8 logged out. Waiting for processes to exit. May 13 23:54:44.170131 systemd-logind[1504]: Removed session 8. 
May 13 23:54:45.638042 kubelet[2701]: E0513 23:54:45.637983 2701 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ftfgb" podUID="a3c9bb24-adc4-4f0b-8af7-8850a622c673" May 13 23:54:47.637842 kubelet[2701]: E0513 23:54:47.637768 2701 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ftfgb" podUID="a3c9bb24-adc4-4f0b-8af7-8850a622c673" May 13 23:54:49.171546 systemd[1]: Started sshd@8-10.0.0.42:22-10.0.0.1:36280.service - OpenSSH per-connection server daemon (10.0.0.1:36280). May 13 23:54:49.420806 sshd[3514]: Accepted publickey for core from 10.0.0.1 port 36280 ssh2: RSA SHA256:UIO2GBLwcS3ioFABoZ1D2izRTMNLszi+/rE/G21mFYQ May 13 23:54:49.422527 sshd-session[3514]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:54:49.427281 systemd-logind[1504]: New session 9 of user core. May 13 23:54:49.433089 systemd[1]: Started session-9.scope - Session 9 of User core. May 13 23:54:49.551700 sshd[3516]: Connection closed by 10.0.0.1 port 36280 May 13 23:54:49.552161 sshd-session[3514]: pam_unix(sshd:session): session closed for user core May 13 23:54:49.556947 systemd[1]: sshd@8-10.0.0.42:22-10.0.0.1:36280.service: Deactivated successfully. May 13 23:54:49.559332 systemd[1]: session-9.scope: Deactivated successfully. May 13 23:54:49.560336 systemd-logind[1504]: Session 9 logged out. Waiting for processes to exit. May 13 23:54:49.561782 systemd-logind[1504]: Removed session 9. 
May 13 23:54:49.637845 kubelet[2701]: E0513 23:54:49.637766 2701 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ftfgb" podUID="a3c9bb24-adc4-4f0b-8af7-8850a622c673" May 13 23:54:49.873681 containerd[1526]: time="2025-05-13T23:54:49.873603557Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:54:49.938000 containerd[1526]: time="2025-05-13T23:54:49.937906916Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.3: active requests=0, bytes read=97793683" May 13 23:54:49.966667 containerd[1526]: time="2025-05-13T23:54:49.966569909Z" level=info msg="ImageCreate event name:\"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:54:49.977345 containerd[1526]: time="2025-05-13T23:54:49.977278559Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:4505ec8f976470994b6a94295a4dabac0cb98375db050e959a22603e00ada90b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:54:49.978789 containerd[1526]: time="2025-05-13T23:54:49.978727813Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.3\" with image id \"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:4505ec8f976470994b6a94295a4dabac0cb98375db050e959a22603e00ada90b\", size \"99286305\" in 7.921253263s" May 13 23:54:49.978789 containerd[1526]: time="2025-05-13T23:54:49.978775133Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.3\" returns image reference 
\"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\"" May 13 23:54:49.992750 containerd[1526]: time="2025-05-13T23:54:49.992667834Z" level=info msg="CreateContainer within sandbox \"24817bf0eb3ce6f6d46d17daa32d42afa2692ecf7592fe4903e221939b19c260\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" May 13 23:54:50.065608 containerd[1526]: time="2025-05-13T23:54:50.064202776Z" level=info msg="Container d0d2ebe9f801325ee941c5fa5842cc97758afcaba19d9b88f1f72802e1ffbc4d: CDI devices from CRI Config.CDIDevices: []" May 13 23:54:50.191418 containerd[1526]: time="2025-05-13T23:54:50.191277247Z" level=info msg="CreateContainer within sandbox \"24817bf0eb3ce6f6d46d17daa32d42afa2692ecf7592fe4903e221939b19c260\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"d0d2ebe9f801325ee941c5fa5842cc97758afcaba19d9b88f1f72802e1ffbc4d\"" May 13 23:54:50.191731 containerd[1526]: time="2025-05-13T23:54:50.191705098Z" level=info msg="StartContainer for \"d0d2ebe9f801325ee941c5fa5842cc97758afcaba19d9b88f1f72802e1ffbc4d\"" May 13 23:54:50.193172 containerd[1526]: time="2025-05-13T23:54:50.193145435Z" level=info msg="connecting to shim d0d2ebe9f801325ee941c5fa5842cc97758afcaba19d9b88f1f72802e1ffbc4d" address="unix:///run/containerd/s/da7c54cdaff18fe6a27df05ce558463ba2d9ffcfe65afbfaec41903495f0b3a4" protocol=ttrpc version=3 May 13 23:54:50.216018 systemd[1]: Started cri-containerd-d0d2ebe9f801325ee941c5fa5842cc97758afcaba19d9b88f1f72802e1ffbc4d.scope - libcontainer container d0d2ebe9f801325ee941c5fa5842cc97758afcaba19d9b88f1f72802e1ffbc4d. 
May 13 23:54:51.637331 kubelet[2701]: E0513 23:54:51.637245 2701 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ftfgb" podUID="a3c9bb24-adc4-4f0b-8af7-8850a622c673" May 13 23:54:52.385392 containerd[1526]: time="2025-05-13T23:54:52.385342453Z" level=info msg="StartContainer for \"d0d2ebe9f801325ee941c5fa5842cc97758afcaba19d9b88f1f72802e1ffbc4d\" returns successfully" May 13 23:54:53.639617 kubelet[2701]: E0513 23:54:53.639563 2701 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ftfgb" podUID="a3c9bb24-adc4-4f0b-8af7-8850a622c673" May 13 23:54:54.570855 systemd[1]: Started sshd@9-10.0.0.42:22-10.0.0.1:36290.service - OpenSSH per-connection server daemon (10.0.0.1:36290). May 13 23:54:54.715570 sshd[3568]: Accepted publickey for core from 10.0.0.1 port 36290 ssh2: RSA SHA256:UIO2GBLwcS3ioFABoZ1D2izRTMNLszi+/rE/G21mFYQ May 13 23:54:54.717742 sshd-session[3568]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:54:54.722924 systemd-logind[1504]: New session 10 of user core. May 13 23:54:54.738092 systemd[1]: Started session-10.scope - Session 10 of User core. 
May 13 23:54:54.873989 containerd[1526]: time="2025-05-13T23:54:54.873850975Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 13 23:54:54.879030 systemd[1]: cri-containerd-d0d2ebe9f801325ee941c5fa5842cc97758afcaba19d9b88f1f72802e1ffbc4d.scope: Deactivated successfully. May 13 23:54:54.880300 systemd[1]: cri-containerd-d0d2ebe9f801325ee941c5fa5842cc97758afcaba19d9b88f1f72802e1ffbc4d.scope: Consumed 658ms CPU time, 160.3M memory peak, 8K read from disk, 154M written to disk. May 13 23:54:54.880723 containerd[1526]: time="2025-05-13T23:54:54.880684075Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d0d2ebe9f801325ee941c5fa5842cc97758afcaba19d9b88f1f72802e1ffbc4d\" id:\"d0d2ebe9f801325ee941c5fa5842cc97758afcaba19d9b88f1f72802e1ffbc4d\" pid:3547 exited_at:{seconds:1747180494 nanos:879723908}" May 13 23:54:54.881009 containerd[1526]: time="2025-05-13T23:54:54.880864767Z" level=info msg="received exit event container_id:\"d0d2ebe9f801325ee941c5fa5842cc97758afcaba19d9b88f1f72802e1ffbc4d\" id:\"d0d2ebe9f801325ee941c5fa5842cc97758afcaba19d9b88f1f72802e1ffbc4d\" pid:3547 exited_at:{seconds:1747180494 nanos:879723908}" May 13 23:54:54.896105 kubelet[2701]: I0513 23:54:54.895994 2701 kubelet_node_status.go:488] "Fast updating node status as it just became ready" May 13 23:54:54.912860 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d0d2ebe9f801325ee941c5fa5842cc97758afcaba19d9b88f1f72802e1ffbc4d-rootfs.mount: Deactivated successfully. May 13 23:54:54.937723 sshd[3570]: Connection closed by 10.0.0.1 port 36290 May 13 23:54:54.938467 sshd-session[3568]: pam_unix(sshd:session): session closed for user core May 13 23:54:54.943341 systemd[1]: sshd@9-10.0.0.42:22-10.0.0.1:36290.service: Deactivated successfully. 
May 13 23:54:54.945811 systemd[1]: session-10.scope: Deactivated successfully. May 13 23:54:54.946639 systemd-logind[1504]: Session 10 logged out. Waiting for processes to exit. May 13 23:54:54.947731 systemd-logind[1504]: Removed session 10. May 13 23:54:55.032607 systemd[1]: Created slice kubepods-besteffort-pod0399bbfa_118a_4448_9eb9_556767adecf1.slice - libcontainer container kubepods-besteffort-pod0399bbfa_118a_4448_9eb9_556767adecf1.slice. May 13 23:54:55.038916 systemd[1]: Created slice kubepods-burstable-poda631a5bc_b8f6_438b_ba2f_de7a312a2873.slice - libcontainer container kubepods-burstable-poda631a5bc_b8f6_438b_ba2f_de7a312a2873.slice. May 13 23:54:55.045983 systemd[1]: Created slice kubepods-burstable-pod10cdb901_f5c0_4ff4_a7d1_3bf3da3de8c0.slice - libcontainer container kubepods-burstable-pod10cdb901_f5c0_4ff4_a7d1_3bf3da3de8c0.slice. May 13 23:54:55.050352 systemd[1]: Created slice kubepods-besteffort-pod07e6469d_9168_4ad4_80fe_12eb59493ccb.slice - libcontainer container kubepods-besteffort-pod07e6469d_9168_4ad4_80fe_12eb59493ccb.slice. May 13 23:54:55.059463 systemd[1]: Created slice kubepods-besteffort-podfa0beed6_eb05_4283_a264_160950532e24.slice - libcontainer container kubepods-besteffort-podfa0beed6_eb05_4283_a264_160950532e24.slice. 
May 13 23:54:55.216539 kubelet[2701]: I0513 23:54:55.216333 2701 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q75x9\" (UniqueName: \"kubernetes.io/projected/fa0beed6-eb05-4283-a264-160950532e24-kube-api-access-q75x9\") pod \"calico-apiserver-d986995dd-mdnct\" (UID: \"fa0beed6-eb05-4283-a264-160950532e24\") " pod="calico-apiserver/calico-apiserver-d986995dd-mdnct" May 13 23:54:55.216539 kubelet[2701]: I0513 23:54:55.216398 2701 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7jkm\" (UniqueName: \"kubernetes.io/projected/10cdb901-f5c0-4ff4-a7d1-3bf3da3de8c0-kube-api-access-r7jkm\") pod \"coredns-6f6b679f8f-gc8hd\" (UID: \"10cdb901-f5c0-4ff4-a7d1-3bf3da3de8c0\") " pod="kube-system/coredns-6f6b679f8f-gc8hd" May 13 23:54:55.216539 kubelet[2701]: I0513 23:54:55.216417 2701 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8p7z4\" (UniqueName: \"kubernetes.io/projected/0399bbfa-118a-4448-9eb9-556767adecf1-kube-api-access-8p7z4\") pod \"calico-kube-controllers-59c8f969f4-mhpbc\" (UID: \"0399bbfa-118a-4448-9eb9-556767adecf1\") " pod="calico-system/calico-kube-controllers-59c8f969f4-mhpbc" May 13 23:54:55.216539 kubelet[2701]: I0513 23:54:55.216439 2701 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/07e6469d-9168-4ad4-80fe-12eb59493ccb-calico-apiserver-certs\") pod \"calico-apiserver-d986995dd-qnwxn\" (UID: \"07e6469d-9168-4ad4-80fe-12eb59493ccb\") " pod="calico-apiserver/calico-apiserver-d986995dd-qnwxn" May 13 23:54:55.216844 kubelet[2701]: I0513 23:54:55.216578 2701 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10cdb901-f5c0-4ff4-a7d1-3bf3da3de8c0-config-volume\") 
pod \"coredns-6f6b679f8f-gc8hd\" (UID: \"10cdb901-f5c0-4ff4-a7d1-3bf3da3de8c0\") " pod="kube-system/coredns-6f6b679f8f-gc8hd" May 13 23:54:55.216844 kubelet[2701]: I0513 23:54:55.216637 2701 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0399bbfa-118a-4448-9eb9-556767adecf1-tigera-ca-bundle\") pod \"calico-kube-controllers-59c8f969f4-mhpbc\" (UID: \"0399bbfa-118a-4448-9eb9-556767adecf1\") " pod="calico-system/calico-kube-controllers-59c8f969f4-mhpbc" May 13 23:54:55.216844 kubelet[2701]: I0513 23:54:55.216675 2701 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvhpg\" (UniqueName: \"kubernetes.io/projected/07e6469d-9168-4ad4-80fe-12eb59493ccb-kube-api-access-bvhpg\") pod \"calico-apiserver-d986995dd-qnwxn\" (UID: \"07e6469d-9168-4ad4-80fe-12eb59493ccb\") " pod="calico-apiserver/calico-apiserver-d986995dd-qnwxn" May 13 23:54:55.216844 kubelet[2701]: I0513 23:54:55.216694 2701 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/fa0beed6-eb05-4283-a264-160950532e24-calico-apiserver-certs\") pod \"calico-apiserver-d986995dd-mdnct\" (UID: \"fa0beed6-eb05-4283-a264-160950532e24\") " pod="calico-apiserver/calico-apiserver-d986995dd-mdnct" May 13 23:54:55.216844 kubelet[2701]: I0513 23:54:55.216712 2701 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a631a5bc-b8f6-438b-ba2f-de7a312a2873-config-volume\") pod \"coredns-6f6b679f8f-gv55w\" (UID: \"a631a5bc-b8f6-438b-ba2f-de7a312a2873\") " pod="kube-system/coredns-6f6b679f8f-gv55w" May 13 23:54:55.217006 kubelet[2701]: I0513 23:54:55.216726 2701 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-gzq2z\" (UniqueName: \"kubernetes.io/projected/a631a5bc-b8f6-438b-ba2f-de7a312a2873-kube-api-access-gzq2z\") pod \"coredns-6f6b679f8f-gv55w\" (UID: \"a631a5bc-b8f6-438b-ba2f-de7a312a2873\") " pod="kube-system/coredns-6f6b679f8f-gv55w" May 13 23:54:55.374957 containerd[1526]: time="2025-05-13T23:54:55.374894865Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-d986995dd-mdnct,Uid:fa0beed6-eb05-4283-a264-160950532e24,Namespace:calico-apiserver,Attempt:0,}" May 13 23:54:55.375816 containerd[1526]: time="2025-05-13T23:54:55.375752838Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-d986995dd-qnwxn,Uid:07e6469d-9168-4ad4-80fe-12eb59493ccb,Namespace:calico-apiserver,Attempt:0,}" May 13 23:54:55.409197 containerd[1526]: time="2025-05-13T23:54:55.409016407Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.3\"" May 13 23:54:55.590298 containerd[1526]: time="2025-05-13T23:54:55.590120716Z" level=error msg="Failed to destroy network for sandbox \"c7030dba483e49fe8354e7d18b40c1ae4ecc0f7a44b39aa1a53d765d92f40a0d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:54:55.614202 containerd[1526]: time="2025-05-13T23:54:55.614025530Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-d986995dd-mdnct,Uid:fa0beed6-eb05-4283-a264-160950532e24,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c7030dba483e49fe8354e7d18b40c1ae4ecc0f7a44b39aa1a53d765d92f40a0d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:54:55.615702 kubelet[2701]: E0513 23:54:55.615537 2701 log.go:32] "RunPodSandbox from runtime service 
failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c7030dba483e49fe8354e7d18b40c1ae4ecc0f7a44b39aa1a53d765d92f40a0d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:54:55.615702 kubelet[2701]: E0513 23:54:55.615618 2701 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c7030dba483e49fe8354e7d18b40c1ae4ecc0f7a44b39aa1a53d765d92f40a0d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-d986995dd-mdnct" May 13 23:54:55.615702 kubelet[2701]: E0513 23:54:55.615638 2701 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c7030dba483e49fe8354e7d18b40c1ae4ecc0f7a44b39aa1a53d765d92f40a0d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-d986995dd-mdnct" May 13 23:54:55.615901 kubelet[2701]: E0513 23:54:55.615672 2701 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-d986995dd-mdnct_calico-apiserver(fa0beed6-eb05-4283-a264-160950532e24)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-d986995dd-mdnct_calico-apiserver(fa0beed6-eb05-4283-a264-160950532e24)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c7030dba483e49fe8354e7d18b40c1ae4ecc0f7a44b39aa1a53d765d92f40a0d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that 
the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-d986995dd-mdnct" podUID="fa0beed6-eb05-4283-a264-160950532e24" May 13 23:54:55.636538 containerd[1526]: time="2025-05-13T23:54:55.636466706Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-59c8f969f4-mhpbc,Uid:0399bbfa-118a-4448-9eb9-556767adecf1,Namespace:calico-system,Attempt:0,}" May 13 23:54:55.643402 containerd[1526]: time="2025-05-13T23:54:55.643357792Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-gv55w,Uid:a631a5bc-b8f6-438b-ba2f-de7a312a2873,Namespace:kube-system,Attempt:0,}" May 13 23:54:55.646681 systemd[1]: Created slice kubepods-besteffort-poda3c9bb24_adc4_4f0b_8af7_8850a622c673.slice - libcontainer container kubepods-besteffort-poda3c9bb24_adc4_4f0b_8af7_8850a622c673.slice. May 13 23:54:55.649264 containerd[1526]: time="2025-05-13T23:54:55.649228138Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-gc8hd,Uid:10cdb901-f5c0-4ff4-a7d1-3bf3da3de8c0,Namespace:kube-system,Attempt:0,}" May 13 23:54:55.650097 containerd[1526]: time="2025-05-13T23:54:55.649410313Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ftfgb,Uid:a3c9bb24-adc4-4f0b-8af7-8850a622c673,Namespace:calico-system,Attempt:0,}" May 13 23:54:55.656086 containerd[1526]: time="2025-05-13T23:54:55.656023724Z" level=error msg="Failed to destroy network for sandbox \"7f1315146a0c8d6feda8d0a372eb327ad04ab5ccf3c3497746a9d0f098214ce8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:54:55.940912 containerd[1526]: time="2025-05-13T23:54:55.940816595Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-d986995dd-qnwxn,Uid:07e6469d-9168-4ad4-80fe-12eb59493ccb,Namespace:calico-apiserver,Attempt:0,} 
failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7f1315146a0c8d6feda8d0a372eb327ad04ab5ccf3c3497746a9d0f098214ce8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:54:55.941451 kubelet[2701]: E0513 23:54:55.941124 2701 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7f1315146a0c8d6feda8d0a372eb327ad04ab5ccf3c3497746a9d0f098214ce8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:54:55.941451 kubelet[2701]: E0513 23:54:55.941192 2701 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7f1315146a0c8d6feda8d0a372eb327ad04ab5ccf3c3497746a9d0f098214ce8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-d986995dd-qnwxn" May 13 23:54:55.941451 kubelet[2701]: E0513 23:54:55.941212 2701 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7f1315146a0c8d6feda8d0a372eb327ad04ab5ccf3c3497746a9d0f098214ce8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-d986995dd-qnwxn" May 13 23:54:55.941866 kubelet[2701]: E0513 23:54:55.941255 2701 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"calico-apiserver-d986995dd-qnwxn_calico-apiserver(07e6469d-9168-4ad4-80fe-12eb59493ccb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-d986995dd-qnwxn_calico-apiserver(07e6469d-9168-4ad4-80fe-12eb59493ccb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7f1315146a0c8d6feda8d0a372eb327ad04ab5ccf3c3497746a9d0f098214ce8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-d986995dd-qnwxn" podUID="07e6469d-9168-4ad4-80fe-12eb59493ccb" May 13 23:54:56.005950 containerd[1526]: time="2025-05-13T23:54:56.005869613Z" level=error msg="Failed to destroy network for sandbox \"4dbb6f33dc3fe013aa89bc85e3bb59232f68651f80dad928aabe1d58e9bc28c3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:54:56.008361 systemd[1]: run-netns-cni\x2da013a85b\x2dd320\x2df5a9\x2d97db\x2dc59da2609e32.mount: Deactivated successfully. May 13 23:54:56.124696 containerd[1526]: time="2025-05-13T23:54:56.124628339Z" level=error msg="Failed to destroy network for sandbox \"bbea7986398ac603fef8d4e6df0ef50c4bcca3ae7393ddc6caa9bb42d4c26fe9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:54:56.127737 systemd[1]: run-netns-cni\x2d7c2e40ec\x2de880\x2db394\x2df8c2\x2da17c6cf26fad.mount: Deactivated successfully. 
May 13 23:54:56.168084 containerd[1526]: time="2025-05-13T23:54:56.168013263Z" level=error msg="Failed to destroy network for sandbox \"93890d04c15dae9e3dfa856524d4b191032a29cf256c934d38c108fe021e80e4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:54:56.170384 systemd[1]: run-netns-cni\x2d00a2f65d\x2da7b4\x2d75a1\x2d7cc5\x2d6d8ab6b11ca0.mount: Deactivated successfully. May 13 23:54:56.442813 containerd[1526]: time="2025-05-13T23:54:56.442742052Z" level=error msg="Failed to destroy network for sandbox \"b921edde3ab3eddc709c87952e330337b45f30fa5bfe47da2144fe528b869f34\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:54:56.444776 containerd[1526]: time="2025-05-13T23:54:56.444735052Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-59c8f969f4-mhpbc,Uid:0399bbfa-118a-4448-9eb9-556767adecf1,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4dbb6f33dc3fe013aa89bc85e3bb59232f68651f80dad928aabe1d58e9bc28c3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:54:56.445180 kubelet[2701]: E0513 23:54:56.445047 2701 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4dbb6f33dc3fe013aa89bc85e3bb59232f68651f80dad928aabe1d58e9bc28c3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:54:56.445180 kubelet[2701]: E0513 
23:54:56.445132 2701 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4dbb6f33dc3fe013aa89bc85e3bb59232f68651f80dad928aabe1d58e9bc28c3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-59c8f969f4-mhpbc" May 13 23:54:56.445180 kubelet[2701]: E0513 23:54:56.445162 2701 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4dbb6f33dc3fe013aa89bc85e3bb59232f68651f80dad928aabe1d58e9bc28c3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-59c8f969f4-mhpbc" May 13 23:54:56.445328 kubelet[2701]: E0513 23:54:56.445216 2701 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-59c8f969f4-mhpbc_calico-system(0399bbfa-118a-4448-9eb9-556767adecf1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-59c8f969f4-mhpbc_calico-system(0399bbfa-118a-4448-9eb9-556767adecf1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4dbb6f33dc3fe013aa89bc85e3bb59232f68651f80dad928aabe1d58e9bc28c3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-59c8f969f4-mhpbc" podUID="0399bbfa-118a-4448-9eb9-556767adecf1" May 13 23:54:56.445311 systemd[1]: run-netns-cni\x2da7095f44\x2d8a54\x2de9bf\x2d0ffc\x2d5e1c385fef81.mount: Deactivated successfully. 
May 13 23:54:56.491300 containerd[1526]: time="2025-05-13T23:54:56.491077330Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-gv55w,Uid:a631a5bc-b8f6-438b-ba2f-de7a312a2873,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"bbea7986398ac603fef8d4e6df0ef50c4bcca3ae7393ddc6caa9bb42d4c26fe9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:54:56.491556 kubelet[2701]: E0513 23:54:56.491426 2701 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bbea7986398ac603fef8d4e6df0ef50c4bcca3ae7393ddc6caa9bb42d4c26fe9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:54:56.491556 kubelet[2701]: E0513 23:54:56.491514 2701 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bbea7986398ac603fef8d4e6df0ef50c4bcca3ae7393ddc6caa9bb42d4c26fe9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-gv55w" May 13 23:54:56.491556 kubelet[2701]: E0513 23:54:56.491535 2701 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bbea7986398ac603fef8d4e6df0ef50c4bcca3ae7393ddc6caa9bb42d4c26fe9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="kube-system/coredns-6f6b679f8f-gv55w" May 13 23:54:56.491728 kubelet[2701]: E0513 23:54:56.491591 2701 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-gv55w_kube-system(a631a5bc-b8f6-438b-ba2f-de7a312a2873)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-gv55w_kube-system(a631a5bc-b8f6-438b-ba2f-de7a312a2873)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bbea7986398ac603fef8d4e6df0ef50c4bcca3ae7393ddc6caa9bb42d4c26fe9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-gv55w" podUID="a631a5bc-b8f6-438b-ba2f-de7a312a2873" May 13 23:54:56.509110 containerd[1526]: time="2025-05-13T23:54:56.509044074Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-gc8hd,Uid:10cdb901-f5c0-4ff4-a7d1-3bf3da3de8c0,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"93890d04c15dae9e3dfa856524d4b191032a29cf256c934d38c108fe021e80e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:54:56.509387 kubelet[2701]: E0513 23:54:56.509339 2701 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"93890d04c15dae9e3dfa856524d4b191032a29cf256c934d38c108fe021e80e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:54:56.509488 kubelet[2701]: E0513 23:54:56.509426 2701 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc 
error: code = Unknown desc = failed to setup network for sandbox \"93890d04c15dae9e3dfa856524d4b191032a29cf256c934d38c108fe021e80e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-gc8hd" May 13 23:54:56.509488 kubelet[2701]: E0513 23:54:56.509461 2701 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"93890d04c15dae9e3dfa856524d4b191032a29cf256c934d38c108fe021e80e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-gc8hd" May 13 23:54:56.509566 kubelet[2701]: E0513 23:54:56.509535 2701 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-gc8hd_kube-system(10cdb901-f5c0-4ff4-a7d1-3bf3da3de8c0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-gc8hd_kube-system(10cdb901-f5c0-4ff4-a7d1-3bf3da3de8c0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"93890d04c15dae9e3dfa856524d4b191032a29cf256c934d38c108fe021e80e4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-gc8hd" podUID="10cdb901-f5c0-4ff4-a7d1-3bf3da3de8c0" May 13 23:54:56.542734 containerd[1526]: time="2025-05-13T23:54:56.542644521Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ftfgb,Uid:a3c9bb24-adc4-4f0b-8af7-8850a622c673,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"b921edde3ab3eddc709c87952e330337b45f30fa5bfe47da2144fe528b869f34\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:54:56.542981 kubelet[2701]: E0513 23:54:56.542931 2701 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b921edde3ab3eddc709c87952e330337b45f30fa5bfe47da2144fe528b869f34\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:54:56.543046 kubelet[2701]: E0513 23:54:56.542984 2701 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b921edde3ab3eddc709c87952e330337b45f30fa5bfe47da2144fe528b869f34\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-ftfgb" May 13 23:54:56.543046 kubelet[2701]: E0513 23:54:56.543006 2701 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b921edde3ab3eddc709c87952e330337b45f30fa5bfe47da2144fe528b869f34\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-ftfgb" May 13 23:54:56.543092 kubelet[2701]: E0513 23:54:56.543046 2701 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-ftfgb_calico-system(a3c9bb24-adc4-4f0b-8af7-8850a622c673)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"csi-node-driver-ftfgb_calico-system(a3c9bb24-adc4-4f0b-8af7-8850a622c673)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b921edde3ab3eddc709c87952e330337b45f30fa5bfe47da2144fe528b869f34\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-ftfgb" podUID="a3c9bb24-adc4-4f0b-8af7-8850a622c673" May 13 23:54:59.953299 systemd[1]: Started sshd@10-10.0.0.42:22-10.0.0.1:59732.service - OpenSSH per-connection server daemon (10.0.0.1:59732). May 13 23:55:00.038349 sshd[3828]: Accepted publickey for core from 10.0.0.1 port 59732 ssh2: RSA SHA256:UIO2GBLwcS3ioFABoZ1D2izRTMNLszi+/rE/G21mFYQ May 13 23:55:00.040211 sshd-session[3828]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:55:00.045867 systemd-logind[1504]: New session 11 of user core. May 13 23:55:00.051084 systemd[1]: Started session-11.scope - Session 11 of User core. May 13 23:55:00.193934 sshd[3830]: Connection closed by 10.0.0.1 port 59732 May 13 23:55:00.193654 sshd-session[3828]: pam_unix(sshd:session): session closed for user core May 13 23:55:00.198333 systemd[1]: sshd@10-10.0.0.42:22-10.0.0.1:59732.service: Deactivated successfully. May 13 23:55:00.202282 systemd[1]: session-11.scope: Deactivated successfully. May 13 23:55:00.203298 systemd-logind[1504]: Session 11 logged out. Waiting for processes to exit. May 13 23:55:00.204375 systemd-logind[1504]: Removed session 11. May 13 23:55:01.817224 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3336929269.mount: Deactivated successfully. 
May 13 23:55:03.942329 containerd[1526]: time="2025-05-13T23:55:03.942247861Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:55:03.999812 containerd[1526]: time="2025-05-13T23:55:03.999683778Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.3: active requests=0, bytes read=144068748" May 13 23:55:04.082652 containerd[1526]: time="2025-05-13T23:55:04.082549319Z" level=info msg="ImageCreate event name:\"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:55:04.121832 containerd[1526]: time="2025-05-13T23:55:04.121735641Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:750e267b4f8217e0ca9e4107228370190d1a2499b72112ad04370ab9b4553916\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:55:04.122520 containerd[1526]: time="2025-05-13T23:55:04.122453057Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.3\" with image id \"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:750e267b4f8217e0ca9e4107228370190d1a2499b72112ad04370ab9b4553916\", size \"144068610\" in 8.713385293s" May 13 23:55:04.122520 containerd[1526]: time="2025-05-13T23:55:04.122514784Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.3\" returns image reference \"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\"" May 13 23:55:04.131426 containerd[1526]: time="2025-05-13T23:55:04.131370967Z" level=info msg="CreateContainer within sandbox \"24817bf0eb3ce6f6d46d17daa32d42afa2692ecf7592fe4903e221939b19c260\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" May 13 23:55:04.505546 containerd[1526]: time="2025-05-13T23:55:04.505491416Z" level=info msg="Container 
99f391a4fc14509d5f1af5a93387ab1291ed6d8edc5115a375e20fb6c0be0d17: CDI devices from CRI Config.CDIDevices: []" May 13 23:55:04.784500 containerd[1526]: time="2025-05-13T23:55:04.784293383Z" level=info msg="CreateContainer within sandbox \"24817bf0eb3ce6f6d46d17daa32d42afa2692ecf7592fe4903e221939b19c260\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"99f391a4fc14509d5f1af5a93387ab1291ed6d8edc5115a375e20fb6c0be0d17\"" May 13 23:55:04.785284 containerd[1526]: time="2025-05-13T23:55:04.785226687Z" level=info msg="StartContainer for \"99f391a4fc14509d5f1af5a93387ab1291ed6d8edc5115a375e20fb6c0be0d17\"" May 13 23:55:04.807930 containerd[1526]: time="2025-05-13T23:55:04.807856966Z" level=info msg="connecting to shim 99f391a4fc14509d5f1af5a93387ab1291ed6d8edc5115a375e20fb6c0be0d17" address="unix:///run/containerd/s/da7c54cdaff18fe6a27df05ce558463ba2d9ffcfe65afbfaec41903495f0b3a4" protocol=ttrpc version=3 May 13 23:55:04.830032 systemd[1]: Started cri-containerd-99f391a4fc14509d5f1af5a93387ab1291ed6d8edc5115a375e20fb6c0be0d17.scope - libcontainer container 99f391a4fc14509d5f1af5a93387ab1291ed6d8edc5115a375e20fb6c0be0d17. May 13 23:55:05.018553 containerd[1526]: time="2025-05-13T23:55:05.018506889Z" level=info msg="StartContainer for \"99f391a4fc14509d5f1af5a93387ab1291ed6d8edc5115a375e20fb6c0be0d17\" returns successfully" May 13 23:55:05.084070 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. May 13 23:55:05.084799 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld. All Rights Reserved. May 13 23:55:05.206627 systemd[1]: Started sshd@11-10.0.0.42:22-10.0.0.1:59738.service - OpenSSH per-connection server daemon (10.0.0.1:59738). 
May 13 23:55:05.262285 sshd[3890]: Accepted publickey for core from 10.0.0.1 port 59738 ssh2: RSA SHA256:UIO2GBLwcS3ioFABoZ1D2izRTMNLszi+/rE/G21mFYQ May 13 23:55:05.264274 sshd-session[3890]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:55:05.268949 systemd-logind[1504]: New session 12 of user core. May 13 23:55:05.280078 systemd[1]: Started session-12.scope - Session 12 of User core. May 13 23:55:05.503648 sshd[3892]: Connection closed by 10.0.0.1 port 59738 May 13 23:55:05.504365 sshd-session[3890]: pam_unix(sshd:session): session closed for user core May 13 23:55:05.509093 systemd[1]: sshd@11-10.0.0.42:22-10.0.0.1:59738.service: Deactivated successfully. May 13 23:55:05.511928 systemd[1]: session-12.scope: Deactivated successfully. May 13 23:55:05.512851 systemd-logind[1504]: Session 12 logged out. Waiting for processes to exit. May 13 23:55:05.514074 systemd-logind[1504]: Removed session 12. May 13 23:55:05.581928 kubelet[2701]: I0513 23:55:05.581676 2701 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-jhxr7" podStartSLOduration=6.198820802 podStartE2EDuration="43.581660473s" podCreationTimestamp="2025-05-13 23:54:22 +0000 UTC" firstStartedPulling="2025-05-13 23:54:26.740757528 +0000 UTC m=+37.201303734" lastFinishedPulling="2025-05-13 23:55:04.123597199 +0000 UTC m=+74.584143405" observedRunningTime="2025-05-13 23:55:05.580126605 +0000 UTC m=+76.040672821" watchObservedRunningTime="2025-05-13 23:55:05.581660473 +0000 UTC m=+76.042206680" May 13 23:55:05.633836 containerd[1526]: time="2025-05-13T23:55:05.633776594Z" level=info msg="TaskExit event in podsandbox handler container_id:\"99f391a4fc14509d5f1af5a93387ab1291ed6d8edc5115a375e20fb6c0be0d17\" id:\"660c3bdf062cb37b5385401015f0643d1619b5cb5248c22accf01dd97d7e7590\" pid:3916 exit_status:1 exited_at:{seconds:1747180505 nanos:633243697}" May 13 23:55:06.529558 containerd[1526]: time="2025-05-13T23:55:06.529500991Z" 
level=info msg="TaskExit event in podsandbox handler container_id:\"99f391a4fc14509d5f1af5a93387ab1291ed6d8edc5115a375e20fb6c0be0d17\" id:\"d8c6dde2f7770367c2d3c481ba1895bfd13eb987e50610337e4dd1ff0e101483\" pid:3961 exit_status:1 exited_at:{seconds:1747180506 nanos:529135491}" May 13 23:55:06.638314 containerd[1526]: time="2025-05-13T23:55:06.638244381Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-gv55w,Uid:a631a5bc-b8f6-438b-ba2f-de7a312a2873,Namespace:kube-system,Attempt:0,}" May 13 23:55:07.115980 systemd-networkd[1435]: cali61495366864: Link UP May 13 23:55:07.116276 systemd-networkd[1435]: cali61495366864: Gained carrier May 13 23:55:07.618651 containerd[1526]: 2025-05-13 23:55:06.810 [INFO][3974] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 13 23:55:07.618651 containerd[1526]: 2025-05-13 23:55:06.835 [INFO][3974] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--6f6b679f8f--gv55w-eth0 coredns-6f6b679f8f- kube-system a631a5bc-b8f6-438b-ba2f-de7a312a2873 838 0 2025-05-13 23:53:55 +0000 UTC map[k8s-app:kube-dns pod-template-hash:6f6b679f8f projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-6f6b679f8f-gv55w eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali61495366864 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="d2338d40b0b5b7e89e82e4655e310664259cdd082efcc91d7babb97ace0be19a" Namespace="kube-system" Pod="coredns-6f6b679f8f-gv55w" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--gv55w-" May 13 23:55:07.618651 containerd[1526]: 2025-05-13 23:55:06.835 [INFO][3974] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="d2338d40b0b5b7e89e82e4655e310664259cdd082efcc91d7babb97ace0be19a" Namespace="kube-system" Pod="coredns-6f6b679f8f-gv55w" 
WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--gv55w-eth0" May 13 23:55:07.618651 containerd[1526]: 2025-05-13 23:55:06.898 [INFO][3987] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d2338d40b0b5b7e89e82e4655e310664259cdd082efcc91d7babb97ace0be19a" HandleID="k8s-pod-network.d2338d40b0b5b7e89e82e4655e310664259cdd082efcc91d7babb97ace0be19a" Workload="localhost-k8s-coredns--6f6b679f8f--gv55w-eth0" May 13 23:55:07.619475 containerd[1526]: 2025-05-13 23:55:06.909 [INFO][3987] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d2338d40b0b5b7e89e82e4655e310664259cdd082efcc91d7babb97ace0be19a" HandleID="k8s-pod-network.d2338d40b0b5b7e89e82e4655e310664259cdd082efcc91d7babb97ace0be19a" Workload="localhost-k8s-coredns--6f6b679f8f--gv55w-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000301400), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-6f6b679f8f-gv55w", "timestamp":"2025-05-13 23:55:06.898313329 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 23:55:07.619475 containerd[1526]: 2025-05-13 23:55:06.909 [INFO][3987] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 23:55:07.619475 containerd[1526]: 2025-05-13 23:55:06.909 [INFO][3987] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 13 23:55:07.619475 containerd[1526]: 2025-05-13 23:55:06.909 [INFO][3987] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 13 23:55:07.619475 containerd[1526]: 2025-05-13 23:55:06.911 [INFO][3987] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.d2338d40b0b5b7e89e82e4655e310664259cdd082efcc91d7babb97ace0be19a" host="localhost" May 13 23:55:07.619475 containerd[1526]: 2025-05-13 23:55:06.916 [INFO][3987] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" May 13 23:55:07.619475 containerd[1526]: 2025-05-13 23:55:06.920 [INFO][3987] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" May 13 23:55:07.619475 containerd[1526]: 2025-05-13 23:55:06.922 [INFO][3987] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 13 23:55:07.619475 containerd[1526]: 2025-05-13 23:55:06.924 [INFO][3987] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 13 23:55:07.619475 containerd[1526]: 2025-05-13 23:55:06.924 [INFO][3987] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.d2338d40b0b5b7e89e82e4655e310664259cdd082efcc91d7babb97ace0be19a" host="localhost" May 13 23:55:07.619710 containerd[1526]: 2025-05-13 23:55:06.926 [INFO][3987] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.d2338d40b0b5b7e89e82e4655e310664259cdd082efcc91d7babb97ace0be19a May 13 23:55:07.619710 containerd[1526]: 2025-05-13 23:55:06.970 [INFO][3987] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.d2338d40b0b5b7e89e82e4655e310664259cdd082efcc91d7babb97ace0be19a" host="localhost" May 13 23:55:07.619710 containerd[1526]: 2025-05-13 23:55:07.103 [INFO][3987] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 
handle="k8s-pod-network.d2338d40b0b5b7e89e82e4655e310664259cdd082efcc91d7babb97ace0be19a" host="localhost" May 13 23:55:07.619710 containerd[1526]: 2025-05-13 23:55:07.103 [INFO][3987] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.d2338d40b0b5b7e89e82e4655e310664259cdd082efcc91d7babb97ace0be19a" host="localhost" May 13 23:55:07.619710 containerd[1526]: 2025-05-13 23:55:07.103 [INFO][3987] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 13 23:55:07.619710 containerd[1526]: 2025-05-13 23:55:07.103 [INFO][3987] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="d2338d40b0b5b7e89e82e4655e310664259cdd082efcc91d7babb97ace0be19a" HandleID="k8s-pod-network.d2338d40b0b5b7e89e82e4655e310664259cdd082efcc91d7babb97ace0be19a" Workload="localhost-k8s-coredns--6f6b679f8f--gv55w-eth0" May 13 23:55:07.619843 containerd[1526]: 2025-05-13 23:55:07.106 [INFO][3974] cni-plugin/k8s.go 386: Populated endpoint ContainerID="d2338d40b0b5b7e89e82e4655e310664259cdd082efcc91d7babb97ace0be19a" Namespace="kube-system" Pod="coredns-6f6b679f8f-gv55w" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--gv55w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--6f6b679f8f--gv55w-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"a631a5bc-b8f6-438b-ba2f-de7a312a2873", ResourceVersion:"838", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 53, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-6f6b679f8f-gv55w", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali61495366864", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:55:07.619921 containerd[1526]: 2025-05-13 23:55:07.107 [INFO][3974] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.129/32] ContainerID="d2338d40b0b5b7e89e82e4655e310664259cdd082efcc91d7babb97ace0be19a" Namespace="kube-system" Pod="coredns-6f6b679f8f-gv55w" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--gv55w-eth0" May 13 23:55:07.619921 containerd[1526]: 2025-05-13 23:55:07.107 [INFO][3974] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali61495366864 ContainerID="d2338d40b0b5b7e89e82e4655e310664259cdd082efcc91d7babb97ace0be19a" Namespace="kube-system" Pod="coredns-6f6b679f8f-gv55w" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--gv55w-eth0" May 13 23:55:07.619921 containerd[1526]: 2025-05-13 23:55:07.185 [INFO][3974] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d2338d40b0b5b7e89e82e4655e310664259cdd082efcc91d7babb97ace0be19a" Namespace="kube-system" Pod="coredns-6f6b679f8f-gv55w" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--gv55w-eth0" May 13 
23:55:07.619995 containerd[1526]: 2025-05-13 23:55:07.185 [INFO][3974] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="d2338d40b0b5b7e89e82e4655e310664259cdd082efcc91d7babb97ace0be19a" Namespace="kube-system" Pod="coredns-6f6b679f8f-gv55w" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--gv55w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--6f6b679f8f--gv55w-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"a631a5bc-b8f6-438b-ba2f-de7a312a2873", ResourceVersion:"838", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 53, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"d2338d40b0b5b7e89e82e4655e310664259cdd082efcc91d7babb97ace0be19a", Pod:"coredns-6f6b679f8f-gv55w", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali61495366864", MAC:"9e:93:b2:11:84:1a", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, 
v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:55:07.619995 containerd[1526]: 2025-05-13 23:55:07.615 [INFO][3974] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="d2338d40b0b5b7e89e82e4655e310664259cdd082efcc91d7babb97ace0be19a" Namespace="kube-system" Pod="coredns-6f6b679f8f-gv55w" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--gv55w-eth0" May 13 23:55:07.638789 containerd[1526]: time="2025-05-13T23:55:07.638693715Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-59c8f969f4-mhpbc,Uid:0399bbfa-118a-4448-9eb9-556767adecf1,Namespace:calico-system,Attempt:0,}" May 13 23:55:07.948917 kernel: bpftool[4117]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set May 13 23:55:08.638234 containerd[1526]: time="2025-05-13T23:55:08.638170115Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-d986995dd-mdnct,Uid:fa0beed6-eb05-4283-a264-160950532e24,Namespace:calico-apiserver,Attempt:0,}" May 13 23:55:09.011119 systemd-networkd[1435]: cali61495366864: Gained IPv6LL May 13 23:55:09.116046 systemd-networkd[1435]: vxlan.calico: Link UP May 13 23:55:09.116052 systemd-networkd[1435]: vxlan.calico: Gained carrier May 13 23:55:09.638084 containerd[1526]: time="2025-05-13T23:55:09.638026734Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ftfgb,Uid:a3c9bb24-adc4-4f0b-8af7-8850a622c673,Namespace:calico-system,Attempt:0,}" May 13 23:55:10.227167 systemd-networkd[1435]: vxlan.calico: Gained IPv6LL May 13 23:55:10.519962 systemd[1]: Started sshd@12-10.0.0.42:22-10.0.0.1:48446.service - OpenSSH per-connection server daemon (10.0.0.1:48446). 
May 13 23:55:10.614437 sshd[4239]: Accepted publickey for core from 10.0.0.1 port 48446 ssh2: RSA SHA256:UIO2GBLwcS3ioFABoZ1D2izRTMNLszi+/rE/G21mFYQ May 13 23:55:10.616711 sshd-session[4239]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:55:10.621724 systemd-logind[1504]: New session 13 of user core. May 13 23:55:10.634423 systemd[1]: Started session-13.scope - Session 13 of User core. May 13 23:55:10.639918 containerd[1526]: time="2025-05-13T23:55:10.639613910Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-d986995dd-qnwxn,Uid:07e6469d-9168-4ad4-80fe-12eb59493ccb,Namespace:calico-apiserver,Attempt:0,}" May 13 23:55:10.640360 systemd-networkd[1435]: cali000b0ea82c0: Link UP May 13 23:55:10.640678 systemd-networkd[1435]: cali000b0ea82c0: Gained carrier May 13 23:55:10.689499 containerd[1526]: 2025-05-13 23:55:10.467 [INFO][4216] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--59c8f969f4--mhpbc-eth0 calico-kube-controllers-59c8f969f4- calico-system 0399bbfa-118a-4448-9eb9-556767adecf1 837 0 2025-05-13 23:54:24 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:59c8f969f4 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-59c8f969f4-mhpbc eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali000b0ea82c0 [] []}} ContainerID="7d26ead07403d483bbaa879a35c8702177d7dcf2a4d646378ee62a2168328287" Namespace="calico-system" Pod="calico-kube-controllers-59c8f969f4-mhpbc" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--59c8f969f4--mhpbc-" May 13 23:55:10.689499 containerd[1526]: 2025-05-13 23:55:10.467 [INFO][4216] cni-plugin/k8s.go 77: Extracted identifiers for 
CmdAddK8s ContainerID="7d26ead07403d483bbaa879a35c8702177d7dcf2a4d646378ee62a2168328287" Namespace="calico-system" Pod="calico-kube-controllers-59c8f969f4-mhpbc" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--59c8f969f4--mhpbc-eth0" May 13 23:55:10.689499 containerd[1526]: 2025-05-13 23:55:10.551 [INFO][4234] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7d26ead07403d483bbaa879a35c8702177d7dcf2a4d646378ee62a2168328287" HandleID="k8s-pod-network.7d26ead07403d483bbaa879a35c8702177d7dcf2a4d646378ee62a2168328287" Workload="localhost-k8s-calico--kube--controllers--59c8f969f4--mhpbc-eth0" May 13 23:55:10.689499 containerd[1526]: 2025-05-13 23:55:10.559 [INFO][4234] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7d26ead07403d483bbaa879a35c8702177d7dcf2a4d646378ee62a2168328287" HandleID="k8s-pod-network.7d26ead07403d483bbaa879a35c8702177d7dcf2a4d646378ee62a2168328287" Workload="localhost-k8s-calico--kube--controllers--59c8f969f4--mhpbc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00070c880), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-59c8f969f4-mhpbc", "timestamp":"2025-05-13 23:55:10.551048148 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 23:55:10.689499 containerd[1526]: 2025-05-13 23:55:10.559 [INFO][4234] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 23:55:10.689499 containerd[1526]: 2025-05-13 23:55:10.560 [INFO][4234] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 13 23:55:10.689499 containerd[1526]: 2025-05-13 23:55:10.560 [INFO][4234] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 13 23:55:10.689499 containerd[1526]: 2025-05-13 23:55:10.562 [INFO][4234] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.7d26ead07403d483bbaa879a35c8702177d7dcf2a4d646378ee62a2168328287" host="localhost" May 13 23:55:10.689499 containerd[1526]: 2025-05-13 23:55:10.569 [INFO][4234] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" May 13 23:55:10.689499 containerd[1526]: 2025-05-13 23:55:10.574 [INFO][4234] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" May 13 23:55:10.689499 containerd[1526]: 2025-05-13 23:55:10.577 [INFO][4234] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 13 23:55:10.689499 containerd[1526]: 2025-05-13 23:55:10.579 [INFO][4234] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 13 23:55:10.689499 containerd[1526]: 2025-05-13 23:55:10.579 [INFO][4234] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.7d26ead07403d483bbaa879a35c8702177d7dcf2a4d646378ee62a2168328287" host="localhost" May 13 23:55:10.689499 containerd[1526]: 2025-05-13 23:55:10.582 [INFO][4234] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.7d26ead07403d483bbaa879a35c8702177d7dcf2a4d646378ee62a2168328287 May 13 23:55:10.689499 containerd[1526]: 2025-05-13 23:55:10.601 [INFO][4234] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.7d26ead07403d483bbaa879a35c8702177d7dcf2a4d646378ee62a2168328287" host="localhost" May 13 23:55:10.689499 containerd[1526]: 2025-05-13 23:55:10.634 [INFO][4234] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 
handle="k8s-pod-network.7d26ead07403d483bbaa879a35c8702177d7dcf2a4d646378ee62a2168328287" host="localhost" May 13 23:55:10.689499 containerd[1526]: 2025-05-13 23:55:10.634 [INFO][4234] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.7d26ead07403d483bbaa879a35c8702177d7dcf2a4d646378ee62a2168328287" host="localhost" May 13 23:55:10.689499 containerd[1526]: 2025-05-13 23:55:10.634 [INFO][4234] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 13 23:55:10.689499 containerd[1526]: 2025-05-13 23:55:10.634 [INFO][4234] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="7d26ead07403d483bbaa879a35c8702177d7dcf2a4d646378ee62a2168328287" HandleID="k8s-pod-network.7d26ead07403d483bbaa879a35c8702177d7dcf2a4d646378ee62a2168328287" Workload="localhost-k8s-calico--kube--controllers--59c8f969f4--mhpbc-eth0" May 13 23:55:10.690133 containerd[1526]: 2025-05-13 23:55:10.637 [INFO][4216] cni-plugin/k8s.go 386: Populated endpoint ContainerID="7d26ead07403d483bbaa879a35c8702177d7dcf2a4d646378ee62a2168328287" Namespace="calico-system" Pod="calico-kube-controllers-59c8f969f4-mhpbc" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--59c8f969f4--mhpbc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--59c8f969f4--mhpbc-eth0", GenerateName:"calico-kube-controllers-59c8f969f4-", Namespace:"calico-system", SelfLink:"", UID:"0399bbfa-118a-4448-9eb9-556767adecf1", ResourceVersion:"837", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 54, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"59c8f969f4", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-59c8f969f4-mhpbc", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali000b0ea82c0", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:55:10.690133 containerd[1526]: 2025-05-13 23:55:10.637 [INFO][4216] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.130/32] ContainerID="7d26ead07403d483bbaa879a35c8702177d7dcf2a4d646378ee62a2168328287" Namespace="calico-system" Pod="calico-kube-controllers-59c8f969f4-mhpbc" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--59c8f969f4--mhpbc-eth0" May 13 23:55:10.690133 containerd[1526]: 2025-05-13 23:55:10.637 [INFO][4216] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali000b0ea82c0 ContainerID="7d26ead07403d483bbaa879a35c8702177d7dcf2a4d646378ee62a2168328287" Namespace="calico-system" Pod="calico-kube-controllers-59c8f969f4-mhpbc" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--59c8f969f4--mhpbc-eth0" May 13 23:55:10.690133 containerd[1526]: 2025-05-13 23:55:10.641 [INFO][4216] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7d26ead07403d483bbaa879a35c8702177d7dcf2a4d646378ee62a2168328287" Namespace="calico-system" Pod="calico-kube-controllers-59c8f969f4-mhpbc" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--59c8f969f4--mhpbc-eth0" May 13 23:55:10.690133 containerd[1526]: 2025-05-13 23:55:10.641 [INFO][4216] 
cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="7d26ead07403d483bbaa879a35c8702177d7dcf2a4d646378ee62a2168328287" Namespace="calico-system" Pod="calico-kube-controllers-59c8f969f4-mhpbc" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--59c8f969f4--mhpbc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--59c8f969f4--mhpbc-eth0", GenerateName:"calico-kube-controllers-59c8f969f4-", Namespace:"calico-system", SelfLink:"", UID:"0399bbfa-118a-4448-9eb9-556767adecf1", ResourceVersion:"837", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 54, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"59c8f969f4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"7d26ead07403d483bbaa879a35c8702177d7dcf2a4d646378ee62a2168328287", Pod:"calico-kube-controllers-59c8f969f4-mhpbc", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali000b0ea82c0", MAC:"2e:d7:9d:d7:3c:ec", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:55:10.690133 containerd[1526]: 2025-05-13 23:55:10.680 [INFO][4216] cni-plugin/k8s.go 500: Wrote updated 
endpoint to datastore ContainerID="7d26ead07403d483bbaa879a35c8702177d7dcf2a4d646378ee62a2168328287" Namespace="calico-system" Pod="calico-kube-controllers-59c8f969f4-mhpbc" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--59c8f969f4--mhpbc-eth0" May 13 23:55:10.810098 sshd[4244]: Connection closed by 10.0.0.1 port 48446 May 13 23:55:10.810441 sshd-session[4239]: pam_unix(sshd:session): session closed for user core May 13 23:55:10.815240 systemd[1]: sshd@12-10.0.0.42:22-10.0.0.1:48446.service: Deactivated successfully. May 13 23:55:10.818083 systemd[1]: session-13.scope: Deactivated successfully. May 13 23:55:10.818951 systemd-logind[1504]: Session 13 logged out. Waiting for processes to exit. May 13 23:55:10.826471 systemd-logind[1504]: Removed session 13. May 13 23:55:10.872633 systemd-networkd[1435]: calif2b2958789f: Link UP May 13 23:55:10.872839 systemd-networkd[1435]: calif2b2958789f: Gained carrier May 13 23:55:10.874439 containerd[1526]: time="2025-05-13T23:55:10.874359152Z" level=info msg="connecting to shim d2338d40b0b5b7e89e82e4655e310664259cdd082efcc91d7babb97ace0be19a" address="unix:///run/containerd/s/8212a22719b62e702e1535ec19872e44f7bd69089be6496fa92c04ccbb9fd16a" namespace=k8s.io protocol=ttrpc version=3 May 13 23:55:10.994333 systemd[1]: Started cri-containerd-d2338d40b0b5b7e89e82e4655e310664259cdd082efcc91d7babb97ace0be19a.scope - libcontainer container d2338d40b0b5b7e89e82e4655e310664259cdd082efcc91d7babb97ace0be19a. 
May 13 23:55:11.010095 systemd-resolved[1352]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 13 23:55:11.034809 containerd[1526]: 2025-05-13 23:55:10.722 [INFO][4258] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--ftfgb-eth0 csi-node-driver- calico-system a3c9bb24-adc4-4f0b-8af7-8850a622c673 661 0 2025-05-13 23:54:23 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:5bcd8f69 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-ftfgb eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calif2b2958789f [] []}} ContainerID="69e449dfbfc4a952df985961d18069e76d4da45ea2a20c4b5c42047c4e179cb5" Namespace="calico-system" Pod="csi-node-driver-ftfgb" WorkloadEndpoint="localhost-k8s-csi--node--driver--ftfgb-" May 13 23:55:11.034809 containerd[1526]: 2025-05-13 23:55:10.725 [INFO][4258] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="69e449dfbfc4a952df985961d18069e76d4da45ea2a20c4b5c42047c4e179cb5" Namespace="calico-system" Pod="csi-node-driver-ftfgb" WorkloadEndpoint="localhost-k8s-csi--node--driver--ftfgb-eth0" May 13 23:55:11.034809 containerd[1526]: 2025-05-13 23:55:10.765 [INFO][4304] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="69e449dfbfc4a952df985961d18069e76d4da45ea2a20c4b5c42047c4e179cb5" HandleID="k8s-pod-network.69e449dfbfc4a952df985961d18069e76d4da45ea2a20c4b5c42047c4e179cb5" Workload="localhost-k8s-csi--node--driver--ftfgb-eth0" May 13 23:55:11.034809 containerd[1526]: 2025-05-13 23:55:10.776 [INFO][4304] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="69e449dfbfc4a952df985961d18069e76d4da45ea2a20c4b5c42047c4e179cb5" HandleID="k8s-pod-network.69e449dfbfc4a952df985961d18069e76d4da45ea2a20c4b5c42047c4e179cb5" Workload="localhost-k8s-csi--node--driver--ftfgb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00034f700), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-ftfgb", "timestamp":"2025-05-13 23:55:10.765964438 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 23:55:11.034809 containerd[1526]: 2025-05-13 23:55:10.776 [INFO][4304] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 23:55:11.034809 containerd[1526]: 2025-05-13 23:55:10.776 [INFO][4304] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 13 23:55:11.034809 containerd[1526]: 2025-05-13 23:55:10.776 [INFO][4304] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 13 23:55:11.034809 containerd[1526]: 2025-05-13 23:55:10.798 [INFO][4304] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.69e449dfbfc4a952df985961d18069e76d4da45ea2a20c4b5c42047c4e179cb5" host="localhost" May 13 23:55:11.034809 containerd[1526]: 2025-05-13 23:55:10.803 [INFO][4304] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" May 13 23:55:11.034809 containerd[1526]: 2025-05-13 23:55:10.808 [INFO][4304] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" May 13 23:55:11.034809 containerd[1526]: 2025-05-13 23:55:10.825 [INFO][4304] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 13 23:55:11.034809 containerd[1526]: 2025-05-13 23:55:10.829 [INFO][4304] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 
host="localhost" May 13 23:55:11.034809 containerd[1526]: 2025-05-13 23:55:10.830 [INFO][4304] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.69e449dfbfc4a952df985961d18069e76d4da45ea2a20c4b5c42047c4e179cb5" host="localhost" May 13 23:55:11.034809 containerd[1526]: 2025-05-13 23:55:10.831 [INFO][4304] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.69e449dfbfc4a952df985961d18069e76d4da45ea2a20c4b5c42047c4e179cb5 May 13 23:55:11.034809 containerd[1526]: 2025-05-13 23:55:10.841 [INFO][4304] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.69e449dfbfc4a952df985961d18069e76d4da45ea2a20c4b5c42047c4e179cb5" host="localhost" May 13 23:55:11.034809 containerd[1526]: 2025-05-13 23:55:10.864 [INFO][4304] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.69e449dfbfc4a952df985961d18069e76d4da45ea2a20c4b5c42047c4e179cb5" host="localhost" May 13 23:55:11.034809 containerd[1526]: 2025-05-13 23:55:10.864 [INFO][4304] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.69e449dfbfc4a952df985961d18069e76d4da45ea2a20c4b5c42047c4e179cb5" host="localhost" May 13 23:55:11.034809 containerd[1526]: 2025-05-13 23:55:10.864 [INFO][4304] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 13 23:55:11.034809 containerd[1526]: 2025-05-13 23:55:10.864 [INFO][4304] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="69e449dfbfc4a952df985961d18069e76d4da45ea2a20c4b5c42047c4e179cb5" HandleID="k8s-pod-network.69e449dfbfc4a952df985961d18069e76d4da45ea2a20c4b5c42047c4e179cb5" Workload="localhost-k8s-csi--node--driver--ftfgb-eth0" May 13 23:55:11.036546 containerd[1526]: 2025-05-13 23:55:10.869 [INFO][4258] cni-plugin/k8s.go 386: Populated endpoint ContainerID="69e449dfbfc4a952df985961d18069e76d4da45ea2a20c4b5c42047c4e179cb5" Namespace="calico-system" Pod="csi-node-driver-ftfgb" WorkloadEndpoint="localhost-k8s-csi--node--driver--ftfgb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--ftfgb-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"a3c9bb24-adc4-4f0b-8af7-8850a622c673", ResourceVersion:"661", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 54, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"5bcd8f69", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-ftfgb", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.csi-node-driver"}, InterfaceName:"calif2b2958789f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:55:11.036546 containerd[1526]: 2025-05-13 23:55:10.869 [INFO][4258] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.131/32] ContainerID="69e449dfbfc4a952df985961d18069e76d4da45ea2a20c4b5c42047c4e179cb5" Namespace="calico-system" Pod="csi-node-driver-ftfgb" WorkloadEndpoint="localhost-k8s-csi--node--driver--ftfgb-eth0" May 13 23:55:11.036546 containerd[1526]: 2025-05-13 23:55:10.869 [INFO][4258] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif2b2958789f ContainerID="69e449dfbfc4a952df985961d18069e76d4da45ea2a20c4b5c42047c4e179cb5" Namespace="calico-system" Pod="csi-node-driver-ftfgb" WorkloadEndpoint="localhost-k8s-csi--node--driver--ftfgb-eth0" May 13 23:55:11.036546 containerd[1526]: 2025-05-13 23:55:10.872 [INFO][4258] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="69e449dfbfc4a952df985961d18069e76d4da45ea2a20c4b5c42047c4e179cb5" Namespace="calico-system" Pod="csi-node-driver-ftfgb" WorkloadEndpoint="localhost-k8s-csi--node--driver--ftfgb-eth0" May 13 23:55:11.036546 containerd[1526]: 2025-05-13 23:55:10.872 [INFO][4258] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="69e449dfbfc4a952df985961d18069e76d4da45ea2a20c4b5c42047c4e179cb5" Namespace="calico-system" Pod="csi-node-driver-ftfgb" WorkloadEndpoint="localhost-k8s-csi--node--driver--ftfgb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--ftfgb-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"a3c9bb24-adc4-4f0b-8af7-8850a622c673", ResourceVersion:"661", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 54, 23, 0, time.Local), DeletionTimestamp:, 
DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"5bcd8f69", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"69e449dfbfc4a952df985961d18069e76d4da45ea2a20c4b5c42047c4e179cb5", Pod:"csi-node-driver-ftfgb", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calif2b2958789f", MAC:"0a:8f:f6:54:67:ef", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:55:11.036546 containerd[1526]: 2025-05-13 23:55:11.028 [INFO][4258] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="69e449dfbfc4a952df985961d18069e76d4da45ea2a20c4b5c42047c4e179cb5" Namespace="calico-system" Pod="csi-node-driver-ftfgb" WorkloadEndpoint="localhost-k8s-csi--node--driver--ftfgb-eth0" May 13 23:55:11.093020 containerd[1526]: time="2025-05-13T23:55:11.092833866Z" level=info msg="connecting to shim 7d26ead07403d483bbaa879a35c8702177d7dcf2a4d646378ee62a2168328287" address="unix:///run/containerd/s/da19fa3e56e2ff84d09e88ceacdac8528a4b7e82a99b8b71c3f13c3ac73a36af" namespace=k8s.io protocol=ttrpc version=3 May 13 23:55:11.112199 systemd-networkd[1435]: cali9da5eb2ec40: Link UP May 13 23:55:11.113170 systemd-networkd[1435]: cali9da5eb2ec40: Gained carrier May 13 23:55:11.340328 containerd[1526]: 2025-05-13 23:55:10.716 [INFO][4246] cni-plugin/plugin.go 340: 
Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--d986995dd--mdnct-eth0 calico-apiserver-d986995dd- calico-apiserver fa0beed6-eb05-4283-a264-160950532e24 843 0 2025-05-13 23:54:20 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:d986995dd projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-d986995dd-mdnct eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali9da5eb2ec40 [] []}} ContainerID="cf2b220232b02f3d45cb9836eb4e889ce804684e56632a5a3887efd8084ebe4d" Namespace="calico-apiserver" Pod="calico-apiserver-d986995dd-mdnct" WorkloadEndpoint="localhost-k8s-calico--apiserver--d986995dd--mdnct-" May 13 23:55:11.340328 containerd[1526]: 2025-05-13 23:55:10.717 [INFO][4246] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="cf2b220232b02f3d45cb9836eb4e889ce804684e56632a5a3887efd8084ebe4d" Namespace="calico-apiserver" Pod="calico-apiserver-d986995dd-mdnct" WorkloadEndpoint="localhost-k8s-calico--apiserver--d986995dd--mdnct-eth0" May 13 23:55:11.340328 containerd[1526]: 2025-05-13 23:55:10.769 [INFO][4298] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cf2b220232b02f3d45cb9836eb4e889ce804684e56632a5a3887efd8084ebe4d" HandleID="k8s-pod-network.cf2b220232b02f3d45cb9836eb4e889ce804684e56632a5a3887efd8084ebe4d" Workload="localhost-k8s-calico--apiserver--d986995dd--mdnct-eth0" May 13 23:55:11.340328 containerd[1526]: 2025-05-13 23:55:10.800 [INFO][4298] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="cf2b220232b02f3d45cb9836eb4e889ce804684e56632a5a3887efd8084ebe4d" HandleID="k8s-pod-network.cf2b220232b02f3d45cb9836eb4e889ce804684e56632a5a3887efd8084ebe4d" Workload="localhost-k8s-calico--apiserver--d986995dd--mdnct-eth0" 
assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000314b40), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-d986995dd-mdnct", "timestamp":"2025-05-13 23:55:10.769000501 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 23:55:11.340328 containerd[1526]: 2025-05-13 23:55:10.800 [INFO][4298] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 23:55:11.340328 containerd[1526]: 2025-05-13 23:55:10.865 [INFO][4298] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 13 23:55:11.340328 containerd[1526]: 2025-05-13 23:55:10.865 [INFO][4298] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 13 23:55:11.340328 containerd[1526]: 2025-05-13 23:55:11.025 [INFO][4298] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.cf2b220232b02f3d45cb9836eb4e889ce804684e56632a5a3887efd8084ebe4d" host="localhost" May 13 23:55:11.340328 containerd[1526]: 2025-05-13 23:55:11.033 [INFO][4298] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" May 13 23:55:11.340328 containerd[1526]: 2025-05-13 23:55:11.043 [INFO][4298] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" May 13 23:55:11.340328 containerd[1526]: 2025-05-13 23:55:11.046 [INFO][4298] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 13 23:55:11.340328 containerd[1526]: 2025-05-13 23:55:11.051 [INFO][4298] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 13 23:55:11.340328 containerd[1526]: 2025-05-13 23:55:11.051 [INFO][4298] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 
handle="k8s-pod-network.cf2b220232b02f3d45cb9836eb4e889ce804684e56632a5a3887efd8084ebe4d" host="localhost" May 13 23:55:11.340328 containerd[1526]: 2025-05-13 23:55:11.054 [INFO][4298] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.cf2b220232b02f3d45cb9836eb4e889ce804684e56632a5a3887efd8084ebe4d May 13 23:55:11.340328 containerd[1526]: 2025-05-13 23:55:11.078 [INFO][4298] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.cf2b220232b02f3d45cb9836eb4e889ce804684e56632a5a3887efd8084ebe4d" host="localhost" May 13 23:55:11.340328 containerd[1526]: 2025-05-13 23:55:11.106 [INFO][4298] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.cf2b220232b02f3d45cb9836eb4e889ce804684e56632a5a3887efd8084ebe4d" host="localhost" May 13 23:55:11.340328 containerd[1526]: 2025-05-13 23:55:11.106 [INFO][4298] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.cf2b220232b02f3d45cb9836eb4e889ce804684e56632a5a3887efd8084ebe4d" host="localhost" May 13 23:55:11.340328 containerd[1526]: 2025-05-13 23:55:11.106 [INFO][4298] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 13 23:55:11.340328 containerd[1526]: 2025-05-13 23:55:11.106 [INFO][4298] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="cf2b220232b02f3d45cb9836eb4e889ce804684e56632a5a3887efd8084ebe4d" HandleID="k8s-pod-network.cf2b220232b02f3d45cb9836eb4e889ce804684e56632a5a3887efd8084ebe4d" Workload="localhost-k8s-calico--apiserver--d986995dd--mdnct-eth0" May 13 23:55:11.341123 containerd[1526]: 2025-05-13 23:55:11.108 [INFO][4246] cni-plugin/k8s.go 386: Populated endpoint ContainerID="cf2b220232b02f3d45cb9836eb4e889ce804684e56632a5a3887efd8084ebe4d" Namespace="calico-apiserver" Pod="calico-apiserver-d986995dd-mdnct" WorkloadEndpoint="localhost-k8s-calico--apiserver--d986995dd--mdnct-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--d986995dd--mdnct-eth0", GenerateName:"calico-apiserver-d986995dd-", Namespace:"calico-apiserver", SelfLink:"", UID:"fa0beed6-eb05-4283-a264-160950532e24", ResourceVersion:"843", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 54, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"d986995dd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-d986995dd-mdnct", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali9da5eb2ec40", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:55:11.341123 containerd[1526]: 2025-05-13 23:55:11.108 [INFO][4246] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.132/32] ContainerID="cf2b220232b02f3d45cb9836eb4e889ce804684e56632a5a3887efd8084ebe4d" Namespace="calico-apiserver" Pod="calico-apiserver-d986995dd-mdnct" WorkloadEndpoint="localhost-k8s-calico--apiserver--d986995dd--mdnct-eth0" May 13 23:55:11.341123 containerd[1526]: 2025-05-13 23:55:11.108 [INFO][4246] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9da5eb2ec40 ContainerID="cf2b220232b02f3d45cb9836eb4e889ce804684e56632a5a3887efd8084ebe4d" Namespace="calico-apiserver" Pod="calico-apiserver-d986995dd-mdnct" WorkloadEndpoint="localhost-k8s-calico--apiserver--d986995dd--mdnct-eth0" May 13 23:55:11.341123 containerd[1526]: 2025-05-13 23:55:11.112 [INFO][4246] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="cf2b220232b02f3d45cb9836eb4e889ce804684e56632a5a3887efd8084ebe4d" Namespace="calico-apiserver" Pod="calico-apiserver-d986995dd-mdnct" WorkloadEndpoint="localhost-k8s-calico--apiserver--d986995dd--mdnct-eth0" May 13 23:55:11.341123 containerd[1526]: 2025-05-13 23:55:11.112 [INFO][4246] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="cf2b220232b02f3d45cb9836eb4e889ce804684e56632a5a3887efd8084ebe4d" Namespace="calico-apiserver" Pod="calico-apiserver-d986995dd-mdnct" WorkloadEndpoint="localhost-k8s-calico--apiserver--d986995dd--mdnct-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--d986995dd--mdnct-eth0", GenerateName:"calico-apiserver-d986995dd-", Namespace:"calico-apiserver", SelfLink:"", 
UID:"fa0beed6-eb05-4283-a264-160950532e24", ResourceVersion:"843", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 54, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"d986995dd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"cf2b220232b02f3d45cb9836eb4e889ce804684e56632a5a3887efd8084ebe4d", Pod:"calico-apiserver-d986995dd-mdnct", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali9da5eb2ec40", MAC:"e6:70:ca:e6:a6:0b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:55:11.341123 containerd[1526]: 2025-05-13 23:55:11.337 [INFO][4246] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="cf2b220232b02f3d45cb9836eb4e889ce804684e56632a5a3887efd8084ebe4d" Namespace="calico-apiserver" Pod="calico-apiserver-d986995dd-mdnct" WorkloadEndpoint="localhost-k8s-calico--apiserver--d986995dd--mdnct-eth0" May 13 23:55:11.347086 systemd[1]: Started cri-containerd-7d26ead07403d483bbaa879a35c8702177d7dcf2a4d646378ee62a2168328287.scope - libcontainer container 7d26ead07403d483bbaa879a35c8702177d7dcf2a4d646378ee62a2168328287. 
May 13 23:55:11.360299 systemd-resolved[1352]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 13 23:55:11.638004 containerd[1526]: time="2025-05-13T23:55:11.637939319Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-gc8hd,Uid:10cdb901-f5c0-4ff4-a7d1-3bf3da3de8c0,Namespace:kube-system,Attempt:0,}" May 13 23:55:11.891145 systemd-networkd[1435]: calif2b2958789f: Gained IPv6LL May 13 23:55:11.975131 containerd[1526]: time="2025-05-13T23:55:11.975075407Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-gv55w,Uid:a631a5bc-b8f6-438b-ba2f-de7a312a2873,Namespace:kube-system,Attempt:0,} returns sandbox id \"d2338d40b0b5b7e89e82e4655e310664259cdd082efcc91d7babb97ace0be19a\"" May 13 23:55:11.977859 containerd[1526]: time="2025-05-13T23:55:11.977820509Z" level=info msg="CreateContainer within sandbox \"d2338d40b0b5b7e89e82e4655e310664259cdd082efcc91d7babb97ace0be19a\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 13 23:55:12.102137 containerd[1526]: time="2025-05-13T23:55:12.102044161Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-59c8f969f4-mhpbc,Uid:0399bbfa-118a-4448-9eb9-556767adecf1,Namespace:calico-system,Attempt:0,} returns sandbox id \"7d26ead07403d483bbaa879a35c8702177d7dcf2a4d646378ee62a2168328287\"" May 13 23:55:12.104334 containerd[1526]: time="2025-05-13T23:55:12.104289750Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\"" May 13 23:55:12.275286 systemd-networkd[1435]: cali000b0ea82c0: Gained IPv6LL May 13 23:55:12.324945 systemd-networkd[1435]: cali728f8d6a48c: Link UP May 13 23:55:12.325162 systemd-networkd[1435]: cali728f8d6a48c: Gained carrier May 13 23:55:12.351845 containerd[1526]: 2025-05-13 23:55:11.033 [INFO][4327] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{localhost-k8s-calico--apiserver--d986995dd--qnwxn-eth0 calico-apiserver-d986995dd- calico-apiserver 07e6469d-9168-4ad4-80fe-12eb59493ccb 841 0 2025-05-13 23:54:20 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:d986995dd projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-d986995dd-qnwxn eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali728f8d6a48c [] []}} ContainerID="d23529b196d6167103abcff5feb77249f37b28676479fd435e24de78c2d2d7ed" Namespace="calico-apiserver" Pod="calico-apiserver-d986995dd-qnwxn" WorkloadEndpoint="localhost-k8s-calico--apiserver--d986995dd--qnwxn-" May 13 23:55:12.351845 containerd[1526]: 2025-05-13 23:55:11.033 [INFO][4327] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="d23529b196d6167103abcff5feb77249f37b28676479fd435e24de78c2d2d7ed" Namespace="calico-apiserver" Pod="calico-apiserver-d986995dd-qnwxn" WorkloadEndpoint="localhost-k8s-calico--apiserver--d986995dd--qnwxn-eth0" May 13 23:55:12.351845 containerd[1526]: 2025-05-13 23:55:11.085 [INFO][4386] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d23529b196d6167103abcff5feb77249f37b28676479fd435e24de78c2d2d7ed" HandleID="k8s-pod-network.d23529b196d6167103abcff5feb77249f37b28676479fd435e24de78c2d2d7ed" Workload="localhost-k8s-calico--apiserver--d986995dd--qnwxn-eth0" May 13 23:55:12.351845 containerd[1526]: 2025-05-13 23:55:11.856 [INFO][4386] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d23529b196d6167103abcff5feb77249f37b28676479fd435e24de78c2d2d7ed" HandleID="k8s-pod-network.d23529b196d6167103abcff5feb77249f37b28676479fd435e24de78c2d2d7ed" Workload="localhost-k8s-calico--apiserver--d986995dd--qnwxn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002dd8a0), 
Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-d986995dd-qnwxn", "timestamp":"2025-05-13 23:55:11.084918802 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 23:55:12.351845 containerd[1526]: 2025-05-13 23:55:11.857 [INFO][4386] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 23:55:12.351845 containerd[1526]: 2025-05-13 23:55:11.857 [INFO][4386] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 13 23:55:12.351845 containerd[1526]: 2025-05-13 23:55:11.857 [INFO][4386] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 13 23:55:12.351845 containerd[1526]: 2025-05-13 23:55:11.915 [INFO][4386] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.d23529b196d6167103abcff5feb77249f37b28676479fd435e24de78c2d2d7ed" host="localhost" May 13 23:55:12.351845 containerd[1526]: 2025-05-13 23:55:12.066 [INFO][4386] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" May 13 23:55:12.351845 containerd[1526]: 2025-05-13 23:55:12.070 [INFO][4386] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" May 13 23:55:12.351845 containerd[1526]: 2025-05-13 23:55:12.072 [INFO][4386] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 13 23:55:12.351845 containerd[1526]: 2025-05-13 23:55:12.074 [INFO][4386] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 13 23:55:12.351845 containerd[1526]: 2025-05-13 23:55:12.075 [INFO][4386] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.d23529b196d6167103abcff5feb77249f37b28676479fd435e24de78c2d2d7ed" host="localhost" May 13 
23:55:12.351845 containerd[1526]: 2025-05-13 23:55:12.076 [INFO][4386] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.d23529b196d6167103abcff5feb77249f37b28676479fd435e24de78c2d2d7ed May 13 23:55:12.351845 containerd[1526]: 2025-05-13 23:55:12.095 [INFO][4386] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.d23529b196d6167103abcff5feb77249f37b28676479fd435e24de78c2d2d7ed" host="localhost" May 13 23:55:12.351845 containerd[1526]: 2025-05-13 23:55:12.319 [INFO][4386] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.d23529b196d6167103abcff5feb77249f37b28676479fd435e24de78c2d2d7ed" host="localhost" May 13 23:55:12.351845 containerd[1526]: 2025-05-13 23:55:12.319 [INFO][4386] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.d23529b196d6167103abcff5feb77249f37b28676479fd435e24de78c2d2d7ed" host="localhost" May 13 23:55:12.351845 containerd[1526]: 2025-05-13 23:55:12.319 [INFO][4386] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 13 23:55:12.351845 containerd[1526]: 2025-05-13 23:55:12.319 [INFO][4386] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="d23529b196d6167103abcff5feb77249f37b28676479fd435e24de78c2d2d7ed" HandleID="k8s-pod-network.d23529b196d6167103abcff5feb77249f37b28676479fd435e24de78c2d2d7ed" Workload="localhost-k8s-calico--apiserver--d986995dd--qnwxn-eth0" May 13 23:55:12.352579 containerd[1526]: 2025-05-13 23:55:12.322 [INFO][4327] cni-plugin/k8s.go 386: Populated endpoint ContainerID="d23529b196d6167103abcff5feb77249f37b28676479fd435e24de78c2d2d7ed" Namespace="calico-apiserver" Pod="calico-apiserver-d986995dd-qnwxn" WorkloadEndpoint="localhost-k8s-calico--apiserver--d986995dd--qnwxn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--d986995dd--qnwxn-eth0", GenerateName:"calico-apiserver-d986995dd-", Namespace:"calico-apiserver", SelfLink:"", UID:"07e6469d-9168-4ad4-80fe-12eb59493ccb", ResourceVersion:"841", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 54, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"d986995dd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-d986995dd-qnwxn", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali728f8d6a48c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:55:12.352579 containerd[1526]: 2025-05-13 23:55:12.322 [INFO][4327] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.133/32] ContainerID="d23529b196d6167103abcff5feb77249f37b28676479fd435e24de78c2d2d7ed" Namespace="calico-apiserver" Pod="calico-apiserver-d986995dd-qnwxn" WorkloadEndpoint="localhost-k8s-calico--apiserver--d986995dd--qnwxn-eth0" May 13 23:55:12.352579 containerd[1526]: 2025-05-13 23:55:12.322 [INFO][4327] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali728f8d6a48c ContainerID="d23529b196d6167103abcff5feb77249f37b28676479fd435e24de78c2d2d7ed" Namespace="calico-apiserver" Pod="calico-apiserver-d986995dd-qnwxn" WorkloadEndpoint="localhost-k8s-calico--apiserver--d986995dd--qnwxn-eth0" May 13 23:55:12.352579 containerd[1526]: 2025-05-13 23:55:12.325 [INFO][4327] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d23529b196d6167103abcff5feb77249f37b28676479fd435e24de78c2d2d7ed" Namespace="calico-apiserver" Pod="calico-apiserver-d986995dd-qnwxn" WorkloadEndpoint="localhost-k8s-calico--apiserver--d986995dd--qnwxn-eth0" May 13 23:55:12.352579 containerd[1526]: 2025-05-13 23:55:12.325 [INFO][4327] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="d23529b196d6167103abcff5feb77249f37b28676479fd435e24de78c2d2d7ed" Namespace="calico-apiserver" Pod="calico-apiserver-d986995dd-qnwxn" WorkloadEndpoint="localhost-k8s-calico--apiserver--d986995dd--qnwxn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--d986995dd--qnwxn-eth0", GenerateName:"calico-apiserver-d986995dd-", Namespace:"calico-apiserver", SelfLink:"", 
UID:"07e6469d-9168-4ad4-80fe-12eb59493ccb", ResourceVersion:"841", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 54, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"d986995dd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"d23529b196d6167103abcff5feb77249f37b28676479fd435e24de78c2d2d7ed", Pod:"calico-apiserver-d986995dd-qnwxn", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali728f8d6a48c", MAC:"02:b6:2f:a8:4c:0e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:55:12.352579 containerd[1526]: 2025-05-13 23:55:12.348 [INFO][4327] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="d23529b196d6167103abcff5feb77249f37b28676479fd435e24de78c2d2d7ed" Namespace="calico-apiserver" Pod="calico-apiserver-d986995dd-qnwxn" WorkloadEndpoint="localhost-k8s-calico--apiserver--d986995dd--qnwxn-eth0" May 13 23:55:12.851193 systemd-networkd[1435]: cali9da5eb2ec40: Gained IPv6LL May 13 23:55:13.202220 systemd-networkd[1435]: cali6ab19f255aa: Link UP May 13 23:55:13.203099 systemd-networkd[1435]: cali6ab19f255aa: Gained carrier May 13 23:55:13.232773 containerd[1526]: 2025-05-13 23:55:12.789 [INFO][4478] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint 
projectcalico.org/v3} {localhost-k8s-coredns--6f6b679f8f--gc8hd-eth0 coredns-6f6b679f8f- kube-system 10cdb901-f5c0-4ff4-a7d1-3bf3da3de8c0 834 0 2025-05-13 23:53:55 +0000 UTC map[k8s-app:kube-dns pod-template-hash:6f6b679f8f projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-6f6b679f8f-gc8hd eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali6ab19f255aa [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="faf258ac469eb34fe091c32c60222b96ffb3cfd88acf1827c5698cbb9dad0237" Namespace="kube-system" Pod="coredns-6f6b679f8f-gc8hd" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--gc8hd-" May 13 23:55:13.232773 containerd[1526]: 2025-05-13 23:55:12.789 [INFO][4478] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="faf258ac469eb34fe091c32c60222b96ffb3cfd88acf1827c5698cbb9dad0237" Namespace="kube-system" Pod="coredns-6f6b679f8f-gc8hd" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--gc8hd-eth0" May 13 23:55:13.232773 containerd[1526]: 2025-05-13 23:55:12.837 [INFO][4492] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="faf258ac469eb34fe091c32c60222b96ffb3cfd88acf1827c5698cbb9dad0237" HandleID="k8s-pod-network.faf258ac469eb34fe091c32c60222b96ffb3cfd88acf1827c5698cbb9dad0237" Workload="localhost-k8s-coredns--6f6b679f8f--gc8hd-eth0" May 13 23:55:13.232773 containerd[1526]: 2025-05-13 23:55:12.847 [INFO][4492] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="faf258ac469eb34fe091c32c60222b96ffb3cfd88acf1827c5698cbb9dad0237" HandleID="k8s-pod-network.faf258ac469eb34fe091c32c60222b96ffb3cfd88acf1827c5698cbb9dad0237" Workload="localhost-k8s-coredns--6f6b679f8f--gc8hd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000297ca0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-6f6b679f8f-gc8hd", 
"timestamp":"2025-05-13 23:55:12.837489222 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 23:55:13.232773 containerd[1526]: 2025-05-13 23:55:12.847 [INFO][4492] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 23:55:13.232773 containerd[1526]: 2025-05-13 23:55:12.847 [INFO][4492] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 13 23:55:13.232773 containerd[1526]: 2025-05-13 23:55:12.847 [INFO][4492] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 13 23:55:13.232773 containerd[1526]: 2025-05-13 23:55:12.850 [INFO][4492] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.faf258ac469eb34fe091c32c60222b96ffb3cfd88acf1827c5698cbb9dad0237" host="localhost" May 13 23:55:13.232773 containerd[1526]: 2025-05-13 23:55:12.854 [INFO][4492] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" May 13 23:55:13.232773 containerd[1526]: 2025-05-13 23:55:12.859 [INFO][4492] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" May 13 23:55:13.232773 containerd[1526]: 2025-05-13 23:55:12.860 [INFO][4492] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 13 23:55:13.232773 containerd[1526]: 2025-05-13 23:55:12.985 [INFO][4492] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 13 23:55:13.232773 containerd[1526]: 2025-05-13 23:55:12.985 [INFO][4492] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.faf258ac469eb34fe091c32c60222b96ffb3cfd88acf1827c5698cbb9dad0237" host="localhost" May 13 23:55:13.232773 containerd[1526]: 2025-05-13 23:55:12.989 [INFO][4492] ipam/ipam.go 1685: Creating new handle: 
k8s-pod-network.faf258ac469eb34fe091c32c60222b96ffb3cfd88acf1827c5698cbb9dad0237 May 13 23:55:13.232773 containerd[1526]: 2025-05-13 23:55:13.055 [INFO][4492] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.faf258ac469eb34fe091c32c60222b96ffb3cfd88acf1827c5698cbb9dad0237" host="localhost" May 13 23:55:13.232773 containerd[1526]: 2025-05-13 23:55:13.196 [INFO][4492] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.faf258ac469eb34fe091c32c60222b96ffb3cfd88acf1827c5698cbb9dad0237" host="localhost" May 13 23:55:13.232773 containerd[1526]: 2025-05-13 23:55:13.196 [INFO][4492] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.faf258ac469eb34fe091c32c60222b96ffb3cfd88acf1827c5698cbb9dad0237" host="localhost" May 13 23:55:13.232773 containerd[1526]: 2025-05-13 23:55:13.196 [INFO][4492] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 13 23:55:13.232773 containerd[1526]: 2025-05-13 23:55:13.196 [INFO][4492] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="faf258ac469eb34fe091c32c60222b96ffb3cfd88acf1827c5698cbb9dad0237" HandleID="k8s-pod-network.faf258ac469eb34fe091c32c60222b96ffb3cfd88acf1827c5698cbb9dad0237" Workload="localhost-k8s-coredns--6f6b679f8f--gc8hd-eth0" May 13 23:55:13.233991 containerd[1526]: 2025-05-13 23:55:13.199 [INFO][4478] cni-plugin/k8s.go 386: Populated endpoint ContainerID="faf258ac469eb34fe091c32c60222b96ffb3cfd88acf1827c5698cbb9dad0237" Namespace="kube-system" Pod="coredns-6f6b679f8f-gc8hd" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--gc8hd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--6f6b679f8f--gc8hd-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"10cdb901-f5c0-4ff4-a7d1-3bf3da3de8c0", ResourceVersion:"834", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 53, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-6f6b679f8f-gc8hd", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali6ab19f255aa", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:55:13.233991 containerd[1526]: 2025-05-13 23:55:13.199 [INFO][4478] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.134/32] ContainerID="faf258ac469eb34fe091c32c60222b96ffb3cfd88acf1827c5698cbb9dad0237" Namespace="kube-system" Pod="coredns-6f6b679f8f-gc8hd" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--gc8hd-eth0" May 13 23:55:13.233991 containerd[1526]: 2025-05-13 23:55:13.199 [INFO][4478] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6ab19f255aa ContainerID="faf258ac469eb34fe091c32c60222b96ffb3cfd88acf1827c5698cbb9dad0237" Namespace="kube-system" Pod="coredns-6f6b679f8f-gc8hd" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--gc8hd-eth0" May 13 23:55:13.233991 containerd[1526]: 2025-05-13 23:55:13.202 [INFO][4478] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="faf258ac469eb34fe091c32c60222b96ffb3cfd88acf1827c5698cbb9dad0237" Namespace="kube-system" Pod="coredns-6f6b679f8f-gc8hd" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--gc8hd-eth0" May 13 23:55:13.233991 containerd[1526]: 2025-05-13 23:55:13.203 [INFO][4478] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="faf258ac469eb34fe091c32c60222b96ffb3cfd88acf1827c5698cbb9dad0237" Namespace="kube-system" Pod="coredns-6f6b679f8f-gc8hd" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--gc8hd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--6f6b679f8f--gc8hd-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"10cdb901-f5c0-4ff4-a7d1-3bf3da3de8c0", ResourceVersion:"834", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 53, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"faf258ac469eb34fe091c32c60222b96ffb3cfd88acf1827c5698cbb9dad0237", Pod:"coredns-6f6b679f8f-gc8hd", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali6ab19f255aa", MAC:"5a:1d:2f:ab:02:22", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:55:13.233991 containerd[1526]: 2025-05-13 23:55:13.229 [INFO][4478] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="faf258ac469eb34fe091c32c60222b96ffb3cfd88acf1827c5698cbb9dad0237" Namespace="kube-system" 
Pod="coredns-6f6b679f8f-gc8hd" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--gc8hd-eth0"
May 13 23:55:13.614683 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3382146096.mount: Deactivated successfully.
May 13 23:55:13.618131 containerd[1526]: time="2025-05-13T23:55:13.618073472Z" level=info msg="Container b78d9fd3b3567b018d6cbf9929849cee856fed9a76f4e8366852b0dff24febec: CDI devices from CRI Config.CDIDevices: []"
May 13 23:55:13.621303 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3625265861.mount: Deactivated successfully.
May 13 23:55:14.131050 systemd-networkd[1435]: cali728f8d6a48c: Gained IPv6LL
May 13 23:55:14.248665 containerd[1526]: time="2025-05-13T23:55:14.248614368Z" level=info msg="connecting to shim 69e449dfbfc4a952df985961d18069e76d4da45ea2a20c4b5c42047c4e179cb5" address="unix:///run/containerd/s/2cb69cb4889fe22358d647a0087a5b2b30e21f6f0bf656754c6732a501d1e9f0" namespace=k8s.io protocol=ttrpc version=3
May 13 23:55:14.282085 systemd[1]: Started cri-containerd-69e449dfbfc4a952df985961d18069e76d4da45ea2a20c4b5c42047c4e179cb5.scope - libcontainer container 69e449dfbfc4a952df985961d18069e76d4da45ea2a20c4b5c42047c4e179cb5.
May 13 23:55:14.297507 systemd-resolved[1352]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address
May 13 23:55:14.643039 systemd-networkd[1435]: cali6ab19f255aa: Gained IPv6LL
May 13 23:55:14.727137 containerd[1526]: time="2025-05-13T23:55:14.727053163Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ftfgb,Uid:a3c9bb24-adc4-4f0b-8af7-8850a622c673,Namespace:calico-system,Attempt:0,} returns sandbox id \"69e449dfbfc4a952df985961d18069e76d4da45ea2a20c4b5c42047c4e179cb5\""
May 13 23:55:14.961100 containerd[1526]: time="2025-05-13T23:55:14.960927956Z" level=info msg="CreateContainer within sandbox \"d2338d40b0b5b7e89e82e4655e310664259cdd082efcc91d7babb97ace0be19a\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"b78d9fd3b3567b018d6cbf9929849cee856fed9a76f4e8366852b0dff24febec\""
May 13 23:55:14.961969 containerd[1526]: time="2025-05-13T23:55:14.961907184Z" level=info msg="StartContainer for \"b78d9fd3b3567b018d6cbf9929849cee856fed9a76f4e8366852b0dff24febec\""
May 13 23:55:14.963184 containerd[1526]: time="2025-05-13T23:55:14.963128419Z" level=info msg="connecting to shim b78d9fd3b3567b018d6cbf9929849cee856fed9a76f4e8366852b0dff24febec" address="unix:///run/containerd/s/8212a22719b62e702e1535ec19872e44f7bd69089be6496fa92c04ccbb9fd16a" protocol=ttrpc version=3
May 13 23:55:14.989034 systemd[1]: Started cri-containerd-b78d9fd3b3567b018d6cbf9929849cee856fed9a76f4e8366852b0dff24febec.scope - libcontainer container b78d9fd3b3567b018d6cbf9929849cee856fed9a76f4e8366852b0dff24febec.
May 13 23:55:15.063733 containerd[1526]: time="2025-05-13T23:55:15.063656267Z" level=info msg="connecting to shim cf2b220232b02f3d45cb9836eb4e889ce804684e56632a5a3887efd8084ebe4d" address="unix:///run/containerd/s/865f371f424ff536f3686ad556e174141cddc170e85068ac3a3d13416b17fee0" namespace=k8s.io protocol=ttrpc version=3
May 13 23:55:15.088788 systemd[1]: Started cri-containerd-cf2b220232b02f3d45cb9836eb4e889ce804684e56632a5a3887efd8084ebe4d.scope - libcontainer container cf2b220232b02f3d45cb9836eb4e889ce804684e56632a5a3887efd8084ebe4d.
May 13 23:55:15.104080 systemd-resolved[1352]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address
May 13 23:55:15.205700 containerd[1526]: time="2025-05-13T23:55:15.205642487Z" level=info msg="StartContainer for \"b78d9fd3b3567b018d6cbf9929849cee856fed9a76f4e8366852b0dff24febec\" returns successfully"
May 13 23:55:15.398634 containerd[1526]: time="2025-05-13T23:55:15.398507502Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-d986995dd-mdnct,Uid:fa0beed6-eb05-4283-a264-160950532e24,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"cf2b220232b02f3d45cb9836eb4e889ce804684e56632a5a3887efd8084ebe4d\""
May 13 23:55:15.466140 containerd[1526]: time="2025-05-13T23:55:15.465169498Z" level=info msg="connecting to shim d23529b196d6167103abcff5feb77249f37b28676479fd435e24de78c2d2d7ed" address="unix:///run/containerd/s/4cc881ca4028f65e426b4ec7013999ece8840d015a8256563388d1bcd4204cf5" namespace=k8s.io protocol=ttrpc version=3
May 13 23:55:15.573800 systemd[1]: Started cri-containerd-d23529b196d6167103abcff5feb77249f37b28676479fd435e24de78c2d2d7ed.scope - libcontainer container d23529b196d6167103abcff5feb77249f37b28676479fd435e24de78c2d2d7ed.
May 13 23:55:15.633624 systemd-resolved[1352]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address
May 13 23:55:15.731225 kubelet[2701]: I0513 23:55:15.715065 2701 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-gv55w" podStartSLOduration=80.715041416 podStartE2EDuration="1m20.715041416s" podCreationTimestamp="2025-05-13 23:53:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 23:55:15.701253574 +0000 UTC m=+86.161799790" watchObservedRunningTime="2025-05-13 23:55:15.715041416 +0000 UTC m=+86.175587622"
May 13 23:55:15.833193 systemd[1]: Started sshd@13-10.0.0.42:22-10.0.0.1:48450.service - OpenSSH per-connection server daemon (10.0.0.1:48450).
May 13 23:55:16.103822 sshd[4690]: Accepted publickey for core from 10.0.0.1 port 48450 ssh2: RSA SHA256:UIO2GBLwcS3ioFABoZ1D2izRTMNLszi+/rE/G21mFYQ
May 13 23:55:16.106981 sshd-session[4690]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:55:16.121520 systemd-logind[1504]: New session 14 of user core.
May 13 23:55:16.139545 systemd[1]: Started session-14.scope - Session 14 of User core.
May 13 23:55:16.271078 containerd[1526]: time="2025-05-13T23:55:16.270387793Z" level=info msg="connecting to shim faf258ac469eb34fe091c32c60222b96ffb3cfd88acf1827c5698cbb9dad0237" address="unix:///run/containerd/s/427d4c468cf13c401cf0a0afe096feea7bfc4ccdaa1866fda530f9c4562ee5af" namespace=k8s.io protocol=ttrpc version=3
May 13 23:55:16.335793 containerd[1526]: time="2025-05-13T23:55:16.335644837Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-d986995dd-qnwxn,Uid:07e6469d-9168-4ad4-80fe-12eb59493ccb,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"d23529b196d6167103abcff5feb77249f37b28676479fd435e24de78c2d2d7ed\""
May 13 23:55:16.395259 systemd[1]: Started cri-containerd-faf258ac469eb34fe091c32c60222b96ffb3cfd88acf1827c5698cbb9dad0237.scope - libcontainer container faf258ac469eb34fe091c32c60222b96ffb3cfd88acf1827c5698cbb9dad0237.
May 13 23:55:16.442369 systemd-resolved[1352]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address
May 13 23:55:16.635671 sshd[4693]: Connection closed by 10.0.0.1 port 48450
May 13 23:55:16.636731 sshd-session[4690]: pam_unix(sshd:session): session closed for user core
May 13 23:55:16.651347 systemd[1]: sshd@13-10.0.0.42:22-10.0.0.1:48450.service: Deactivated successfully.
May 13 23:55:16.654293 systemd[1]: session-14.scope: Deactivated successfully.
May 13 23:55:16.655531 systemd-logind[1504]: Session 14 logged out. Waiting for processes to exit.
May 13 23:55:16.660326 systemd[1]: Started sshd@14-10.0.0.42:22-10.0.0.1:48462.service - OpenSSH per-connection server daemon (10.0.0.1:48462).
May 13 23:55:16.661056 systemd-logind[1504]: Removed session 14.
May 13 23:55:16.668007 containerd[1526]: time="2025-05-13T23:55:16.667956656Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-gc8hd,Uid:10cdb901-f5c0-4ff4-a7d1-3bf3da3de8c0,Namespace:kube-system,Attempt:0,} returns sandbox id \"faf258ac469eb34fe091c32c60222b96ffb3cfd88acf1827c5698cbb9dad0237\""
May 13 23:55:16.672349 containerd[1526]: time="2025-05-13T23:55:16.671834224Z" level=info msg="CreateContainer within sandbox \"faf258ac469eb34fe091c32c60222b96ffb3cfd88acf1827c5698cbb9dad0237\" for container &ContainerMetadata{Name:coredns,Attempt:0,}"
May 13 23:55:16.715809 sshd[4763]: Accepted publickey for core from 10.0.0.1 port 48462 ssh2: RSA SHA256:UIO2GBLwcS3ioFABoZ1D2izRTMNLszi+/rE/G21mFYQ
May 13 23:55:16.718235 sshd-session[4763]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:55:16.725651 systemd-logind[1504]: New session 15 of user core.
May 13 23:55:16.729094 systemd[1]: Started session-15.scope - Session 15 of User core.
May 13 23:55:17.318037 sshd[4768]: Connection closed by 10.0.0.1 port 48462
May 13 23:55:17.318894 sshd-session[4763]: pam_unix(sshd:session): session closed for user core
May 13 23:55:17.334949 systemd[1]: sshd@14-10.0.0.42:22-10.0.0.1:48462.service: Deactivated successfully.
May 13 23:55:17.338071 systemd[1]: session-15.scope: Deactivated successfully.
May 13 23:55:17.341063 systemd-logind[1504]: Session 15 logged out. Waiting for processes to exit.
May 13 23:55:17.342726 systemd[1]: Started sshd@15-10.0.0.42:22-10.0.0.1:48476.service - OpenSSH per-connection server daemon (10.0.0.1:48476).
May 13 23:55:17.344436 systemd-logind[1504]: Removed session 15.
May 13 23:55:17.528570 sshd[4778]: Accepted publickey for core from 10.0.0.1 port 48476 ssh2: RSA SHA256:UIO2GBLwcS3ioFABoZ1D2izRTMNLszi+/rE/G21mFYQ
May 13 23:55:17.529370 sshd-session[4778]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:55:17.549678 systemd-logind[1504]: New session 16 of user core.
May 13 23:55:17.557221 systemd[1]: Started session-16.scope - Session 16 of User core.
May 13 23:55:17.669017 containerd[1526]: time="2025-05-13T23:55:17.668819557Z" level=info msg="Container 763c0f75ea2f53c0e8e65aa5092ea1b2b1953893e950cfda70097862c53e15bc: CDI devices from CRI Config.CDIDevices: []"
May 13 23:55:17.964511 sshd[4781]: Connection closed by 10.0.0.1 port 48476
May 13 23:55:17.965587 sshd-session[4778]: pam_unix(sshd:session): session closed for user core
May 13 23:55:17.970778 systemd[1]: sshd@15-10.0.0.42:22-10.0.0.1:48476.service: Deactivated successfully.
May 13 23:55:17.972956 systemd[1]: session-16.scope: Deactivated successfully.
May 13 23:55:17.973732 systemd-logind[1504]: Session 16 logged out. Waiting for processes to exit.
May 13 23:55:17.974706 systemd-logind[1504]: Removed session 16.
May 13 23:55:18.221147 containerd[1526]: time="2025-05-13T23:55:18.220968486Z" level=info msg="CreateContainer within sandbox \"faf258ac469eb34fe091c32c60222b96ffb3cfd88acf1827c5698cbb9dad0237\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"763c0f75ea2f53c0e8e65aa5092ea1b2b1953893e950cfda70097862c53e15bc\"" May 13 23:55:18.222963 containerd[1526]: time="2025-05-13T23:55:18.221974514Z" level=info msg="StartContainer for \"763c0f75ea2f53c0e8e65aa5092ea1b2b1953893e950cfda70097862c53e15bc\"" May 13 23:55:18.223519 containerd[1526]: time="2025-05-13T23:55:18.223181222Z" level=info msg="connecting to shim 763c0f75ea2f53c0e8e65aa5092ea1b2b1953893e950cfda70097862c53e15bc" address="unix:///run/containerd/s/427d4c468cf13c401cf0a0afe096feea7bfc4ccdaa1866fda530f9c4562ee5af" protocol=ttrpc version=3 May 13 23:55:18.265083 systemd[1]: Started cri-containerd-763c0f75ea2f53c0e8e65aa5092ea1b2b1953893e950cfda70097862c53e15bc.scope - libcontainer container 763c0f75ea2f53c0e8e65aa5092ea1b2b1953893e950cfda70097862c53e15bc. 
May 13 23:55:18.724895 containerd[1526]: time="2025-05-13T23:55:18.724833511Z" level=info msg="StartContainer for \"763c0f75ea2f53c0e8e65aa5092ea1b2b1953893e950cfda70097862c53e15bc\" returns successfully" May 13 23:55:19.787058 kubelet[2701]: I0513 23:55:19.785925 2701 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-gc8hd" podStartSLOduration=84.785906389 podStartE2EDuration="1m24.785906389s" podCreationTimestamp="2025-05-13 23:53:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 23:55:19.785740105 +0000 UTC m=+90.246286311" watchObservedRunningTime="2025-05-13 23:55:19.785906389 +0000 UTC m=+90.246452595" May 13 23:55:21.970740 containerd[1526]: time="2025-05-13T23:55:21.970592843Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:55:22.087759 containerd[1526]: time="2025-05-13T23:55:22.087645431Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.3: active requests=0, bytes read=34789138" May 13 23:55:22.191034 containerd[1526]: time="2025-05-13T23:55:22.190951903Z" level=info msg="ImageCreate event name:\"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:55:22.248232 containerd[1526]: time="2025-05-13T23:55:22.248026496Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:feaab0197035d474845e0f8137a99a78cab274f0a3cac4d5485cf9b1bdf9ffa9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:55:22.248943 containerd[1526]: time="2025-05-13T23:55:22.248904562Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" with image id \"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:feaab0197035d474845e0f8137a99a78cab274f0a3cac4d5485cf9b1bdf9ffa9\", size \"36281728\" in 10.144547004s" May 13 23:55:22.248943 containerd[1526]: time="2025-05-13T23:55:22.248940140Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" returns image reference \"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\"" May 13 23:55:22.250576 containerd[1526]: time="2025-05-13T23:55:22.249980072Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.3\"" May 13 23:55:22.257482 containerd[1526]: time="2025-05-13T23:55:22.257140378Z" level=info msg="CreateContainer within sandbox \"7d26ead07403d483bbaa879a35c8702177d7dcf2a4d646378ee62a2168328287\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" May 13 23:55:22.593163 containerd[1526]: time="2025-05-13T23:55:22.592935407Z" level=info msg="Container 30aa9d15ee14b4f5469527eb8ecd04553fa1460374f7c7aa6fd72e02dd67754b: CDI devices from CRI Config.CDIDevices: []" May 13 23:55:22.925076 containerd[1526]: time="2025-05-13T23:55:22.925019846Z" level=info msg="CreateContainer within sandbox \"7d26ead07403d483bbaa879a35c8702177d7dcf2a4d646378ee62a2168328287\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"30aa9d15ee14b4f5469527eb8ecd04553fa1460374f7c7aa6fd72e02dd67754b\"" May 13 23:55:22.946416 containerd[1526]: time="2025-05-13T23:55:22.925563872Z" level=info msg="StartContainer for \"30aa9d15ee14b4f5469527eb8ecd04553fa1460374f7c7aa6fd72e02dd67754b\"" May 13 23:55:22.947752 containerd[1526]: time="2025-05-13T23:55:22.947708576Z" level=info msg="connecting to shim 30aa9d15ee14b4f5469527eb8ecd04553fa1460374f7c7aa6fd72e02dd67754b" address="unix:///run/containerd/s/da19fa3e56e2ff84d09e88ceacdac8528a4b7e82a99b8b71c3f13c3ac73a36af" protocol=ttrpc version=3 May 13 23:55:22.971070 systemd[1]: Started cri-containerd-30aa9d15ee14b4f5469527eb8ecd04553fa1460374f7c7aa6fd72e02dd67754b.scope - libcontainer container 30aa9d15ee14b4f5469527eb8ecd04553fa1460374f7c7aa6fd72e02dd67754b. May 13 23:55:22.979775 systemd[1]: Started sshd@16-10.0.0.42:22-10.0.0.1:45682.service - OpenSSH per-connection server daemon (10.0.0.1:45682). May 13 23:55:23.100482 sshd[4867]: Accepted publickey for core from 10.0.0.1 port 45682 ssh2: RSA SHA256:UIO2GBLwcS3ioFABoZ1D2izRTMNLszi+/rE/G21mFYQ May 13 23:55:23.102258 sshd-session[4867]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:55:23.106914 systemd-logind[1504]: New session 17 of user core. May 13 23:55:23.122020 systemd[1]: Started session-17.scope - Session 17 of User core. May 13 23:55:23.409733 sshd[4882]: Connection closed by 10.0.0.1 port 45682 May 13 23:55:23.410206 sshd-session[4867]: pam_unix(sshd:session): session closed for user core May 13 23:55:23.413432 systemd[1]: sshd@16-10.0.0.42:22-10.0.0.1:45682.service: Deactivated successfully. May 13 23:55:23.416073 systemd[1]: session-17.scope: Deactivated successfully. May 13 23:55:23.417997 systemd-logind[1504]: Session 17 logged out. Waiting for processes to exit. May 13 23:55:23.419111 systemd-logind[1504]: Removed session 17.
May 13 23:55:24.042757 containerd[1526]: time="2025-05-13T23:55:24.042452287Z" level=info msg="TaskExit event in podsandbox handler container_id:\"99f391a4fc14509d5f1af5a93387ab1291ed6d8edc5115a375e20fb6c0be0d17\" id:\"d20b5afd3b9d470f0f6c84859a3da7212f44e2f8c78c8e526f75e048d380ede2\" pid:4908 exited_at:{seconds:1747180524 nanos:41961431}" May 13 23:55:24.247941 containerd[1526]: time="2025-05-13T23:55:24.247534324Z" level=info msg="StartContainer for \"30aa9d15ee14b4f5469527eb8ecd04553fa1460374f7c7aa6fd72e02dd67754b\" returns successfully" May 13 23:55:25.301214 containerd[1526]: time="2025-05-13T23:55:25.301166421Z" level=info msg="TaskExit event in podsandbox handler container_id:\"30aa9d15ee14b4f5469527eb8ecd04553fa1460374f7c7aa6fd72e02dd67754b\" id:\"5ca20e194456e5be4f68ad396a0ed649b2e3cd53afe748986e2209cdeef4ad64\" pid:4933 exited_at:{seconds:1747180525 nanos:300902524}" May 13 23:55:25.509975 kubelet[2701]: I0513 23:55:25.509831 2701 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-59c8f969f4-mhpbc" podStartSLOduration=51.363930893 podStartE2EDuration="1m1.509808979s" podCreationTimestamp="2025-05-13 23:54:24 +0000 UTC" firstStartedPulling="2025-05-13 23:55:12.103933167 +0000 UTC m=+82.564479373" lastFinishedPulling="2025-05-13 23:55:22.249811253 +0000 UTC m=+92.710357459" observedRunningTime="2025-05-13 23:55:25.508708514 +0000 UTC m=+95.969254730" watchObservedRunningTime="2025-05-13 23:55:25.509808979 +0000 UTC m=+95.970355185" May 13 23:55:25.691498 containerd[1526]: time="2025-05-13T23:55:25.691436176Z" level=info msg="TaskExit event in podsandbox handler container_id:\"30aa9d15ee14b4f5469527eb8ecd04553fa1460374f7c7aa6fd72e02dd67754b\" id:\"b2d869780a7d6194e9aad730ceb69ee2210ce54204fe8eb661d6cab9d7f0fb2e\" pid:4956 exited_at:{seconds:1747180525 nanos:691240677}" May 13 23:55:28.165917 containerd[1526]: time="2025-05-13T23:55:28.165783372Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:55:28.267443 containerd[1526]: time="2025-05-13T23:55:28.267328621Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.3: active requests=0, bytes read=7912898" May 13 23:55:28.363034 containerd[1526]: time="2025-05-13T23:55:28.362948869Z" level=info msg="ImageCreate event name:\"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:55:28.429991 systemd[1]: Started sshd@17-10.0.0.42:22-10.0.0.1:55910.service - OpenSSH per-connection server daemon (10.0.0.1:55910). May 13 23:55:28.463066 containerd[1526]: time="2025-05-13T23:55:28.462170466Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:72455a36febc7c56ec8881007f4805caed5764026a0694e4f86a2503209b2d31\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:55:28.463066 containerd[1526]: time="2025-05-13T23:55:28.462890173Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.3\" with image id \"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:72455a36febc7c56ec8881007f4805caed5764026a0694e4f86a2503209b2d31\", size \"9405520\" in 6.212829179s" May 13 23:55:28.463066 containerd[1526]: time="2025-05-13T23:55:28.462943013Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.3\" returns image reference \"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\"" May 13 23:55:28.465948 containerd[1526]: time="2025-05-13T23:55:28.464477456Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\"" May 13 23:55:28.467325 containerd[1526]: time="2025-05-13T23:55:28.467297514Z" level=info msg="CreateContainer within sandbox \"69e449dfbfc4a952df985961d18069e76d4da45ea2a20c4b5c42047c4e179cb5\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" May 13 23:55:28.636526 sshd[4977]: Accepted publickey for core from 10.0.0.1 port 55910 ssh2: RSA SHA256:UIO2GBLwcS3ioFABoZ1D2izRTMNLszi+/rE/G21mFYQ May 13 23:55:28.638988 sshd-session[4977]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:55:28.645147 systemd-logind[1504]: New session 18 of user core. May 13 23:55:28.657071 systemd[1]: Started session-18.scope - Session 18 of User core. May 13 23:55:29.099512 sshd[4979]: Connection closed by 10.0.0.1 port 55910 May 13 23:55:29.100253 sshd-session[4977]: pam_unix(sshd:session): session closed for user core May 13 23:55:29.105302 systemd[1]: sshd@17-10.0.0.42:22-10.0.0.1:55910.service: Deactivated successfully. May 13 23:55:29.108304 systemd[1]: session-18.scope: Deactivated successfully. May 13 23:55:29.109315 systemd-logind[1504]: Session 18 logged out. Waiting for processes to exit. May 13 23:55:29.110837 systemd-logind[1504]: Removed session 18.
May 13 23:55:29.876586 containerd[1526]: time="2025-05-13T23:55:29.876519900Z" level=info msg="Container d5c46c66ba5782a9025e96c6df5bca28162bb1233730f8d528206e4b0b241d8d: CDI devices from CRI Config.CDIDevices: []" May 13 23:55:30.618262 containerd[1526]: time="2025-05-13T23:55:30.618213319Z" level=info msg="CreateContainer within sandbox \"69e449dfbfc4a952df985961d18069e76d4da45ea2a20c4b5c42047c4e179cb5\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"d5c46c66ba5782a9025e96c6df5bca28162bb1233730f8d528206e4b0b241d8d\"" May 13 23:55:30.620905 containerd[1526]: time="2025-05-13T23:55:30.618761003Z" level=info msg="StartContainer for \"d5c46c66ba5782a9025e96c6df5bca28162bb1233730f8d528206e4b0b241d8d\"" May 13 23:55:30.620905 containerd[1526]: time="2025-05-13T23:55:30.620505001Z" level=info msg="connecting to shim d5c46c66ba5782a9025e96c6df5bca28162bb1233730f8d528206e4b0b241d8d" address="unix:///run/containerd/s/2cb69cb4889fe22358d647a0087a5b2b30e21f6f0bf656754c6732a501d1e9f0" protocol=ttrpc version=3 May 13 23:55:30.639092 systemd[1]: Started cri-containerd-d5c46c66ba5782a9025e96c6df5bca28162bb1233730f8d528206e4b0b241d8d.scope - libcontainer container d5c46c66ba5782a9025e96c6df5bca28162bb1233730f8d528206e4b0b241d8d. May 13 23:55:30.897796 containerd[1526]: time="2025-05-13T23:55:30.897738641Z" level=info msg="StartContainer for \"d5c46c66ba5782a9025e96c6df5bca28162bb1233730f8d528206e4b0b241d8d\" returns successfully" May 13 23:55:34.116583 systemd[1]: Started sshd@18-10.0.0.42:22-10.0.0.1:55920.service - OpenSSH per-connection server daemon (10.0.0.1:55920). May 13 23:55:34.269906 sshd[5038]: Accepted publickey for core from 10.0.0.1 port 55920 ssh2: RSA SHA256:UIO2GBLwcS3ioFABoZ1D2izRTMNLszi+/rE/G21mFYQ May 13 23:55:34.271672 sshd-session[5038]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:55:34.276367 systemd-logind[1504]: New session 19 of user core. 
May 13 23:55:34.285101 systemd[1]: Started session-19.scope - Session 19 of User core. May 13 23:55:34.638505 sshd[5044]: Connection closed by 10.0.0.1 port 55920 May 13 23:55:34.640771 sshd-session[5038]: pam_unix(sshd:session): session closed for user core May 13 23:55:34.644469 systemd[1]: sshd@18-10.0.0.42:22-10.0.0.1:55920.service: Deactivated successfully. May 13 23:55:34.648460 systemd[1]: session-19.scope: Deactivated successfully. May 13 23:55:34.651005 systemd-logind[1504]: Session 19 logged out. Waiting for processes to exit. May 13 23:55:34.652601 systemd-logind[1504]: Removed session 19. May 13 23:55:35.418133 containerd[1526]: time="2025-05-13T23:55:35.418039063Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:55:35.436725 containerd[1526]: time="2025-05-13T23:55:35.436599428Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=43021437" May 13 23:55:35.450894 containerd[1526]: time="2025-05-13T23:55:35.450757707Z" level=info msg="ImageCreate event name:\"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:55:35.463638 containerd[1526]: time="2025-05-13T23:55:35.463574256Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:55:35.464185 containerd[1526]: time="2025-05-13T23:55:35.464149781Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"44514075\" in 6.999624944s" May 13 23:55:35.464185 containerd[1526]: time="2025-05-13T23:55:35.464183114Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\"" May 13 23:55:35.465268 containerd[1526]: time="2025-05-13T23:55:35.465204429Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\"" May 13 23:55:35.466548 containerd[1526]: time="2025-05-13T23:55:35.466517954Z" level=info msg="CreateContainer within sandbox \"cf2b220232b02f3d45cb9836eb4e889ce804684e56632a5a3887efd8084ebe4d\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 13 23:55:35.620648 containerd[1526]: time="2025-05-13T23:55:35.620586853Z" level=info msg="Container cc99a96780544c4921167c1f42d56796c3b08c27f9d81ae98d1e8f6e00f5a2a7: CDI devices from CRI Config.CDIDevices: []" May 13 23:55:35.789270 containerd[1526]: time="2025-05-13T23:55:35.789131779Z" level=info msg="CreateContainer within sandbox \"cf2b220232b02f3d45cb9836eb4e889ce804684e56632a5a3887efd8084ebe4d\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"cc99a96780544c4921167c1f42d56796c3b08c27f9d81ae98d1e8f6e00f5a2a7\"" May 13 23:55:35.790872 containerd[1526]: time="2025-05-13T23:55:35.789789679Z" level=info msg="StartContainer for \"cc99a96780544c4921167c1f42d56796c3b08c27f9d81ae98d1e8f6e00f5a2a7\"" May 13 23:55:35.791133 containerd[1526]: time="2025-05-13T23:55:35.791098767Z" level=info msg="connecting to shim cc99a96780544c4921167c1f42d56796c3b08c27f9d81ae98d1e8f6e00f5a2a7" address="unix:///run/containerd/s/865f371f424ff536f3686ad556e174141cddc170e85068ac3a3d13416b17fee0" protocol=ttrpc version=3 May 13 23:55:35.813023 systemd[1]: Started cri-containerd-cc99a96780544c4921167c1f42d56796c3b08c27f9d81ae98d1e8f6e00f5a2a7.scope - libcontainer container cc99a96780544c4921167c1f42d56796c3b08c27f9d81ae98d1e8f6e00f5a2a7.
May 13 23:55:36.060501 containerd[1526]: time="2025-05-13T23:55:36.060378702Z" level=info msg="StartContainer for \"cc99a96780544c4921167c1f42d56796c3b08c27f9d81ae98d1e8f6e00f5a2a7\" returns successfully" May 13 23:55:36.373995 kubelet[2701]: I0513 23:55:36.373906 2701 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-d986995dd-mdnct" podStartSLOduration=56.311023654 podStartE2EDuration="1m16.37386784s" podCreationTimestamp="2025-05-13 23:54:20 +0000 UTC" firstStartedPulling="2025-05-13 23:55:15.402067772 +0000 UTC m=+85.862613978" lastFinishedPulling="2025-05-13 23:55:35.464911958 +0000 UTC m=+105.925458164" observedRunningTime="2025-05-13 23:55:36.373389198 +0000 UTC m=+106.833935414" watchObservedRunningTime="2025-05-13 23:55:36.37386784 +0000 UTC m=+106.834414046" May 13 23:55:36.820324 containerd[1526]: time="2025-05-13T23:55:36.820150020Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:55:36.830351 containerd[1526]: time="2025-05-13T23:55:36.830247002Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=77" May 13 23:55:36.832625 containerd[1526]: time="2025-05-13T23:55:36.832565783Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"44514075\" in 1.367308233s" May 13 23:55:36.832625 containerd[1526]: time="2025-05-13T23:55:36.832615576Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\"" May 13 23:55:36.833938 containerd[1526]: time="2025-05-13T23:55:36.833903885Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\"" May 13 23:55:36.835465 containerd[1526]: time="2025-05-13T23:55:36.835401738Z" level=info msg="CreateContainer within sandbox \"d23529b196d6167103abcff5feb77249f37b28676479fd435e24de78c2d2d7ed\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 13 23:55:36.989974 containerd[1526]: time="2025-05-13T23:55:36.989708743Z" level=info msg="Container 84d8b4a459f6fca7b25c6ab4d679148c2766fe5da5885d0271048555bfd9e91a: CDI devices from CRI Config.CDIDevices: []" May 13 23:55:37.118044 containerd[1526]: time="2025-05-13T23:55:37.117836772Z" level=info msg="CreateContainer within sandbox \"d23529b196d6167103abcff5feb77249f37b28676479fd435e24de78c2d2d7ed\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"84d8b4a459f6fca7b25c6ab4d679148c2766fe5da5885d0271048555bfd9e91a\"" May 13 23:55:37.119070 containerd[1526]: time="2025-05-13T23:55:37.118959900Z" level=info msg="StartContainer for \"84d8b4a459f6fca7b25c6ab4d679148c2766fe5da5885d0271048555bfd9e91a\"" May 13 23:55:37.120237 containerd[1526]: time="2025-05-13T23:55:37.120198744Z" level=info msg="connecting to shim 84d8b4a459f6fca7b25c6ab4d679148c2766fe5da5885d0271048555bfd9e91a" address="unix:///run/containerd/s/4cc881ca4028f65e426b4ec7013999ece8840d015a8256563388d1bcd4204cf5" protocol=ttrpc version=3 May 13 23:55:37.153150 systemd[1]: Started cri-containerd-84d8b4a459f6fca7b25c6ab4d679148c2766fe5da5885d0271048555bfd9e91a.scope - libcontainer container 84d8b4a459f6fca7b25c6ab4d679148c2766fe5da5885d0271048555bfd9e91a.
May 13 23:55:37.268850 containerd[1526]: time="2025-05-13T23:55:37.268775796Z" level=info msg="StartContainer for \"84d8b4a459f6fca7b25c6ab4d679148c2766fe5da5885d0271048555bfd9e91a\" returns successfully" May 13 23:55:37.547913 kubelet[2701]: I0513 23:55:37.547461 2701 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-d986995dd-qnwxn" podStartSLOduration=57.051667834 podStartE2EDuration="1m17.54744544s" podCreationTimestamp="2025-05-13 23:54:20 +0000 UTC" firstStartedPulling="2025-05-13 23:55:16.337897207 +0000 UTC m=+86.798443423" lastFinishedPulling="2025-05-13 23:55:36.833674823 +0000 UTC m=+107.294221029" observedRunningTime="2025-05-13 23:55:37.546907707 +0000 UTC m=+108.007453913" watchObservedRunningTime="2025-05-13 23:55:37.54744544 +0000 UTC m=+108.007991646" May 13 23:55:38.280697 kubelet[2701]: I0513 23:55:38.280246 2701 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 13 23:55:39.283843 kubelet[2701]: I0513 23:55:39.283792 2701 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 13 23:55:39.667030 systemd[1]: Started sshd@19-10.0.0.42:22-10.0.0.1:53832.service - OpenSSH per-connection server daemon (10.0.0.1:53832). May 13 23:55:39.732388 sshd[5134]: Accepted publickey for core from 10.0.0.1 port 53832 ssh2: RSA SHA256:UIO2GBLwcS3ioFABoZ1D2izRTMNLszi+/rE/G21mFYQ May 13 23:55:39.734419 sshd-session[5134]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:55:39.739717 systemd-logind[1504]: New session 20 of user core. May 13 23:55:39.747079 systemd[1]: Started session-20.scope - Session 20 of User core. May 13 23:55:39.920239 sshd[5136]: Connection closed by 10.0.0.1 port 53832 May 13 23:55:39.921133 sshd-session[5134]: pam_unix(sshd:session): session closed for user core May 13 23:55:39.926319 systemd[1]: sshd@19-10.0.0.42:22-10.0.0.1:53832.service: Deactivated successfully. 
May 13 23:55:39.929241 systemd[1]: session-20.scope: Deactivated successfully. May 13 23:55:39.931462 systemd-logind[1504]: Session 20 logged out. Waiting for processes to exit. May 13 23:55:39.933368 systemd-logind[1504]: Removed session 20. May 13 23:55:42.803089 containerd[1526]: time="2025-05-13T23:55:42.803007170Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:55:42.921251 containerd[1526]: time="2025-05-13T23:55:42.921103898Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3: active requests=0, bytes read=13991773" May 13 23:55:43.138313 containerd[1526]: time="2025-05-13T23:55:43.138232348Z" level=info msg="ImageCreate event name:\"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:55:43.279035 containerd[1526]: time="2025-05-13T23:55:43.278954373Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:3f15090a9bb45773d1fd019455ec3d3f3746f3287c35d8013e497b38d8237324\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:55:43.279770 containerd[1526]: time="2025-05-13T23:55:43.279714436Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" with image id \"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:3f15090a9bb45773d1fd019455ec3d3f3746f3287c35d8013e497b38d8237324\", size \"15484347\" in 6.445775435s" May 13 23:55:43.279854 containerd[1526]: time="2025-05-13T23:55:43.279767766Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" returns image reference \"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\"" May 13 23:55:43.285179 containerd[1526]: time="2025-05-13T23:55:43.285134199Z" level=info msg="CreateContainer within sandbox \"69e449dfbfc4a952df985961d18069e76d4da45ea2a20c4b5c42047c4e179cb5\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" May 13 23:55:43.560007 containerd[1526]: time="2025-05-13T23:55:43.559782022Z" level=info msg="Container f182cc6a49408c378bbf6d3cf56ad367eada720d2fbdadd3c9325da497f2b3b2: CDI devices from CRI Config.CDIDevices: []" May 13 23:55:43.951603 containerd[1526]: time="2025-05-13T23:55:43.951507697Z" level=info msg="CreateContainer within sandbox \"69e449dfbfc4a952df985961d18069e76d4da45ea2a20c4b5c42047c4e179cb5\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"f182cc6a49408c378bbf6d3cf56ad367eada720d2fbdadd3c9325da497f2b3b2\"" May 13 23:55:43.952299 containerd[1526]: time="2025-05-13T23:55:43.952269202Z" level=info msg="StartContainer for \"f182cc6a49408c378bbf6d3cf56ad367eada720d2fbdadd3c9325da497f2b3b2\"" May 13 23:55:43.954112 containerd[1526]: time="2025-05-13T23:55:43.954081477Z" level=info msg="connecting to shim f182cc6a49408c378bbf6d3cf56ad367eada720d2fbdadd3c9325da497f2b3b2" address="unix:///run/containerd/s/2cb69cb4889fe22358d647a0087a5b2b30e21f6f0bf656754c6732a501d1e9f0" protocol=ttrpc version=3 May 13 23:55:44.006058 systemd[1]: Started cri-containerd-f182cc6a49408c378bbf6d3cf56ad367eada720d2fbdadd3c9325da497f2b3b2.scope - libcontainer container f182cc6a49408c378bbf6d3cf56ad367eada720d2fbdadd3c9325da497f2b3b2.
May 13 23:55:44.616286 containerd[1526]: time="2025-05-13T23:55:44.616228453Z" level=info msg="StartContainer for \"f182cc6a49408c378bbf6d3cf56ad367eada720d2fbdadd3c9325da497f2b3b2\" returns successfully" May 13 23:55:44.733817 kubelet[2701]: I0513 23:55:44.733770 2701 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 May 13 23:55:44.733817 kubelet[2701]: I0513 23:55:44.733807 2701 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock May 13 23:55:44.933422 systemd[1]: Started sshd@20-10.0.0.42:22-10.0.0.1:53836.service - OpenSSH per-connection server daemon (10.0.0.1:53836). May 13 23:55:45.005613 sshd[5194]: Accepted publickey for core from 10.0.0.1 port 53836 ssh2: RSA SHA256:UIO2GBLwcS3ioFABoZ1D2izRTMNLszi+/rE/G21mFYQ May 13 23:55:45.007488 sshd-session[5194]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:55:45.012349 systemd-logind[1504]: New session 21 of user core. May 13 23:55:45.022082 systemd[1]: Started session-21.scope - Session 21 of User core. May 13 23:55:45.209178 sshd[5196]: Connection closed by 10.0.0.1 port 53836 May 13 23:55:45.209546 sshd-session[5194]: pam_unix(sshd:session): session closed for user core May 13 23:55:45.215296 systemd[1]: sshd@20-10.0.0.42:22-10.0.0.1:53836.service: Deactivated successfully. May 13 23:55:45.219691 systemd[1]: session-21.scope: Deactivated successfully. May 13 23:55:45.220821 systemd-logind[1504]: Session 21 logged out. Waiting for processes to exit. May 13 23:55:45.222103 systemd-logind[1504]: Removed session 21. May 13 23:55:50.227020 systemd[1]: Started sshd@21-10.0.0.42:22-10.0.0.1:53748.service - OpenSSH per-connection server daemon (10.0.0.1:53748). 
May 13 23:55:50.304088 sshd[5218]: Accepted publickey for core from 10.0.0.1 port 53748 ssh2: RSA SHA256:UIO2GBLwcS3ioFABoZ1D2izRTMNLszi+/rE/G21mFYQ May 13 23:55:50.306173 sshd-session[5218]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:55:50.311127 systemd-logind[1504]: New session 22 of user core. May 13 23:55:50.322072 systemd[1]: Started session-22.scope - Session 22 of User core. May 13 23:55:50.535738 sshd[5220]: Connection closed by 10.0.0.1 port 53748 May 13 23:55:50.536075 sshd-session[5218]: pam_unix(sshd:session): session closed for user core May 13 23:55:50.540681 systemd[1]: sshd@21-10.0.0.42:22-10.0.0.1:53748.service: Deactivated successfully. May 13 23:55:50.543424 systemd[1]: session-22.scope: Deactivated successfully. May 13 23:55:50.544326 systemd-logind[1504]: Session 22 logged out. Waiting for processes to exit. May 13 23:55:50.545561 systemd-logind[1504]: Removed session 22. May 13 23:55:54.026048 containerd[1526]: time="2025-05-13T23:55:54.025860961Z" level=info msg="TaskExit event in podsandbox handler container_id:\"99f391a4fc14509d5f1af5a93387ab1291ed6d8edc5115a375e20fb6c0be0d17\" id:\"4664afe173f73e94b9a7ea7a7d1e7f9d456b0e983ed706666fb0ccbb7a64f13f\" pid:5245 exited_at:{seconds:1747180554 nanos:25478401}" May 13 23:55:55.550074 systemd[1]: Started sshd@22-10.0.0.42:22-10.0.0.1:53758.service - OpenSSH per-connection server daemon (10.0.0.1:53758). May 13 23:55:55.602912 sshd[5259]: Accepted publickey for core from 10.0.0.1 port 53758 ssh2: RSA SHA256:UIO2GBLwcS3ioFABoZ1D2izRTMNLszi+/rE/G21mFYQ May 13 23:55:55.605088 sshd-session[5259]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:55:55.610274 systemd-logind[1504]: New session 23 of user core. May 13 23:55:55.620006 systemd[1]: Started session-23.scope - Session 23 of User core. 
May 13 23:55:55.688548 containerd[1526]: time="2025-05-13T23:55:55.688492011Z" level=info msg="TaskExit event in podsandbox handler container_id:\"30aa9d15ee14b4f5469527eb8ecd04553fa1460374f7c7aa6fd72e02dd67754b\" id:\"6f9cbe581ff0cb7350a50036b9f2c5736463bd0a1460e2f55fc815dc3b2dade6\" pid:5287 exited_at:{seconds:1747180555 nanos:686645031}" May 13 23:55:55.690032 containerd[1526]: time="2025-05-13T23:55:55.689989982Z" level=info msg="TaskExit event in podsandbox handler container_id:\"30aa9d15ee14b4f5469527eb8ecd04553fa1460374f7c7aa6fd72e02dd67754b\" id:\"0941203c6ff49cf03dcb10eb61e5ad3f86717770e8f71935a139ef072070b3d5\" pid:5285 exited_at:{seconds:1747180555 nanos:689834850}" May 13 23:55:55.781555 sshd[5261]: Connection closed by 10.0.0.1 port 53758 May 13 23:55:55.782000 sshd-session[5259]: pam_unix(sshd:session): session closed for user core May 13 23:55:55.797329 systemd[1]: sshd@22-10.0.0.42:22-10.0.0.1:53758.service: Deactivated successfully. May 13 23:55:55.799566 systemd[1]: session-23.scope: Deactivated successfully. May 13 23:55:55.800337 systemd-logind[1504]: Session 23 logged out. Waiting for processes to exit. May 13 23:55:55.802934 systemd[1]: Started sshd@23-10.0.0.42:22-10.0.0.1:53760.service - OpenSSH per-connection server daemon (10.0.0.1:53760). May 13 23:55:55.803739 systemd-logind[1504]: Removed session 23. May 13 23:55:55.867295 sshd[5320]: Accepted publickey for core from 10.0.0.1 port 53760 ssh2: RSA SHA256:UIO2GBLwcS3ioFABoZ1D2izRTMNLszi+/rE/G21mFYQ May 13 23:55:55.868874 sshd-session[5320]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:55:55.873532 systemd-logind[1504]: New session 24 of user core. May 13 23:55:55.886078 systemd[1]: Started session-24.scope - Session 24 of User core. 
May 13 23:55:56.726279 sshd[5323]: Connection closed by 10.0.0.1 port 53760 May 13 23:55:56.726731 sshd-session[5320]: pam_unix(sshd:session): session closed for user core May 13 23:55:56.742558 systemd[1]: sshd@23-10.0.0.42:22-10.0.0.1:53760.service: Deactivated successfully. May 13 23:55:56.745035 systemd[1]: session-24.scope: Deactivated successfully. May 13 23:55:56.746791 systemd-logind[1504]: Session 24 logged out. Waiting for processes to exit. May 13 23:55:56.748389 systemd[1]: Started sshd@24-10.0.0.42:22-10.0.0.1:53762.service - OpenSSH per-connection server daemon (10.0.0.1:53762). May 13 23:55:56.749428 systemd-logind[1504]: Removed session 24. May 13 23:55:56.806339 sshd[5333]: Accepted publickey for core from 10.0.0.1 port 53762 ssh2: RSA SHA256:UIO2GBLwcS3ioFABoZ1D2izRTMNLszi+/rE/G21mFYQ May 13 23:55:56.807990 sshd-session[5333]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:55:56.812525 systemd-logind[1504]: New session 25 of user core. May 13 23:55:56.822051 systemd[1]: Started session-25.scope - Session 25 of User core. May 13 23:56:00.720924 sshd[5336]: Connection closed by 10.0.0.1 port 53762 May 13 23:56:00.721942 sshd-session[5333]: pam_unix(sshd:session): session closed for user core May 13 23:56:00.734266 systemd[1]: sshd@24-10.0.0.42:22-10.0.0.1:53762.service: Deactivated successfully. May 13 23:56:00.736645 systemd[1]: session-25.scope: Deactivated successfully. May 13 23:56:00.736904 systemd[1]: session-25.scope: Consumed 628ms CPU time, 69.4M memory peak. May 13 23:56:00.738252 systemd-logind[1504]: Session 25 logged out. Waiting for processes to exit. May 13 23:56:00.739744 systemd[1]: Started sshd@25-10.0.0.42:22-10.0.0.1:41372.service - OpenSSH per-connection server daemon (10.0.0.1:41372). May 13 23:56:00.741674 systemd-logind[1504]: Removed session 25. 
May 13 23:56:00.816678 sshd[5376]: Accepted publickey for core from 10.0.0.1 port 41372 ssh2: RSA SHA256:UIO2GBLwcS3ioFABoZ1D2izRTMNLszi+/rE/G21mFYQ
May 13 23:56:00.818842 sshd-session[5376]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:56:00.824654 systemd-logind[1504]: New session 26 of user core.
May 13 23:56:00.832116 systemd[1]: Started session-26.scope - Session 26 of User core.
May 13 23:56:02.289036 sshd[5380]: Connection closed by 10.0.0.1 port 41372
May 13 23:56:02.289474 sshd-session[5376]: pam_unix(sshd:session): session closed for user core
May 13 23:56:02.302393 systemd[1]: sshd@25-10.0.0.42:22-10.0.0.1:41372.service: Deactivated successfully.
May 13 23:56:02.305782 systemd[1]: session-26.scope: Deactivated successfully.
May 13 23:56:02.308530 systemd-logind[1504]: Session 26 logged out. Waiting for processes to exit.
May 13 23:56:02.310369 systemd[1]: Started sshd@26-10.0.0.42:22-10.0.0.1:41386.service - OpenSSH per-connection server daemon (10.0.0.1:41386).
May 13 23:56:02.311443 systemd-logind[1504]: Removed session 26.
May 13 23:56:02.365869 sshd[5390]: Accepted publickey for core from 10.0.0.1 port 41386 ssh2: RSA SHA256:UIO2GBLwcS3ioFABoZ1D2izRTMNLszi+/rE/G21mFYQ
May 13 23:56:02.368040 sshd-session[5390]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:56:02.373566 systemd-logind[1504]: New session 27 of user core.
May 13 23:56:02.392185 systemd[1]: Started session-27.scope - Session 27 of User core.
May 13 23:56:02.509980 sshd[5394]: Connection closed by 10.0.0.1 port 41386
May 13 23:56:02.510402 sshd-session[5390]: pam_unix(sshd:session): session closed for user core
May 13 23:56:02.516107 systemd[1]: sshd@26-10.0.0.42:22-10.0.0.1:41386.service: Deactivated successfully.
May 13 23:56:02.519572 systemd[1]: session-27.scope: Deactivated successfully.
May 13 23:56:02.522292 systemd-logind[1504]: Session 27 logged out. Waiting for processes to exit.
May 13 23:56:02.524569 systemd-logind[1504]: Removed session 27.
May 13 23:56:07.529055 systemd[1]: Started sshd@27-10.0.0.42:22-10.0.0.1:41390.service - OpenSSH per-connection server daemon (10.0.0.1:41390).
May 13 23:56:07.579788 sshd[5409]: Accepted publickey for core from 10.0.0.1 port 41390 ssh2: RSA SHA256:UIO2GBLwcS3ioFABoZ1D2izRTMNLszi+/rE/G21mFYQ
May 13 23:56:07.582241 sshd-session[5409]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:56:07.589130 systemd-logind[1504]: New session 28 of user core.
May 13 23:56:07.603190 systemd[1]: Started session-28.scope - Session 28 of User core.
May 13 23:56:07.742032 sshd[5411]: Connection closed by 10.0.0.1 port 41390
May 13 23:56:07.742559 sshd-session[5409]: pam_unix(sshd:session): session closed for user core
May 13 23:56:07.748601 systemd[1]: sshd@27-10.0.0.42:22-10.0.0.1:41390.service: Deactivated successfully.
May 13 23:56:07.751925 systemd[1]: session-28.scope: Deactivated successfully.
May 13 23:56:07.754576 systemd-logind[1504]: Session 28 logged out. Waiting for processes to exit.
May 13 23:56:07.756906 systemd-logind[1504]: Removed session 28.
May 13 23:56:12.761783 systemd[1]: Started sshd@28-10.0.0.42:22-10.0.0.1:60936.service - OpenSSH per-connection server daemon (10.0.0.1:60936).
May 13 23:56:12.816970 sshd[5424]: Accepted publickey for core from 10.0.0.1 port 60936 ssh2: RSA SHA256:UIO2GBLwcS3ioFABoZ1D2izRTMNLszi+/rE/G21mFYQ
May 13 23:56:12.818808 sshd-session[5424]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:56:12.829479 systemd-logind[1504]: New session 29 of user core.
May 13 23:56:12.837089 systemd[1]: Started session-29.scope - Session 29 of User core.
May 13 23:56:12.949537 sshd[5426]: Connection closed by 10.0.0.1 port 60936
May 13 23:56:12.949928 sshd-session[5424]: pam_unix(sshd:session): session closed for user core
May 13 23:56:12.954776 systemd[1]: sshd@28-10.0.0.42:22-10.0.0.1:60936.service: Deactivated successfully.
May 13 23:56:12.958254 systemd[1]: session-29.scope: Deactivated successfully.
May 13 23:56:12.959310 systemd-logind[1504]: Session 29 logged out. Waiting for processes to exit.
May 13 23:56:12.960416 systemd-logind[1504]: Removed session 29.
May 13 23:56:17.967314 systemd[1]: Started sshd@29-10.0.0.42:22-10.0.0.1:60938.service - OpenSSH per-connection server daemon (10.0.0.1:60938).
May 13 23:56:18.008384 sshd[5440]: Accepted publickey for core from 10.0.0.1 port 60938 ssh2: RSA SHA256:UIO2GBLwcS3ioFABoZ1D2izRTMNLszi+/rE/G21mFYQ
May 13 23:56:18.009967 sshd-session[5440]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:56:18.014331 systemd-logind[1504]: New session 30 of user core.
May 13 23:56:18.024035 systemd[1]: Started session-30.scope - Session 30 of User core.
May 13 23:56:18.151688 sshd[5442]: Connection closed by 10.0.0.1 port 60938
May 13 23:56:18.152020 sshd-session[5440]: pam_unix(sshd:session): session closed for user core
May 13 23:56:18.156431 systemd[1]: sshd@29-10.0.0.42:22-10.0.0.1:60938.service: Deactivated successfully.
May 13 23:56:18.159013 systemd[1]: session-30.scope: Deactivated successfully.
May 13 23:56:18.159842 systemd-logind[1504]: Session 30 logged out. Waiting for processes to exit.
May 13 23:56:18.160764 systemd-logind[1504]: Removed session 30.
May 13 23:56:23.168395 systemd[1]: Started sshd@30-10.0.0.42:22-10.0.0.1:36538.service - OpenSSH per-connection server daemon (10.0.0.1:36538).
May 13 23:56:23.226094 sshd[5456]: Accepted publickey for core from 10.0.0.1 port 36538 ssh2: RSA SHA256:UIO2GBLwcS3ioFABoZ1D2izRTMNLszi+/rE/G21mFYQ
May 13 23:56:23.228076 sshd-session[5456]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:56:23.233933 systemd-logind[1504]: New session 31 of user core.
May 13 23:56:23.244037 systemd[1]: Started session-31.scope - Session 31 of User core.
May 13 23:56:23.406934 sshd[5458]: Connection closed by 10.0.0.1 port 36538
May 13 23:56:23.407321 sshd-session[5456]: pam_unix(sshd:session): session closed for user core
May 13 23:56:23.411691 systemd[1]: sshd@30-10.0.0.42:22-10.0.0.1:36538.service: Deactivated successfully.
May 13 23:56:23.414476 systemd[1]: session-31.scope: Deactivated successfully.
May 13 23:56:23.415217 systemd-logind[1504]: Session 31 logged out. Waiting for processes to exit.
May 13 23:56:23.416716 systemd-logind[1504]: Removed session 31.
May 13 23:56:24.014679 containerd[1526]: time="2025-05-13T23:56:24.014575449Z" level=info msg="TaskExit event in podsandbox handler container_id:\"99f391a4fc14509d5f1af5a93387ab1291ed6d8edc5115a375e20fb6c0be0d17\" id:\"1949b2a54b53a6c4f2e829dd9b806f4b2db758a1c7cb6011979686b014fc9e38\" pid:5481 exited_at:{seconds:1747180584 nanos:14206478}"
May 13 23:56:25.686933 containerd[1526]: time="2025-05-13T23:56:25.686870436Z" level=info msg="TaskExit event in podsandbox handler container_id:\"30aa9d15ee14b4f5469527eb8ecd04553fa1460374f7c7aa6fd72e02dd67754b\" id:\"6ff36e8be9bc9b0789cb1da9f9b1237513c8d94f8c74b9114e412774570c4a7a\" pid:5508 exited_at:{seconds:1747180585 nanos:686611998}"
May 13 23:56:28.429213 systemd[1]: Started sshd@31-10.0.0.42:22-10.0.0.1:54442.service - OpenSSH per-connection server daemon (10.0.0.1:54442).
May 13 23:56:28.483333 sshd[5521]: Accepted publickey for core from 10.0.0.1 port 54442 ssh2: RSA SHA256:UIO2GBLwcS3ioFABoZ1D2izRTMNLszi+/rE/G21mFYQ
May 13 23:56:28.485292 sshd-session[5521]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:56:28.490380 systemd-logind[1504]: New session 32 of user core.
May 13 23:56:28.500018 systemd[1]: Started session-32.scope - Session 32 of User core.
May 13 23:56:28.619862 sshd[5523]: Connection closed by 10.0.0.1 port 54442
May 13 23:56:28.620295 sshd-session[5521]: pam_unix(sshd:session): session closed for user core
May 13 23:56:28.625080 systemd[1]: sshd@31-10.0.0.42:22-10.0.0.1:54442.service: Deactivated successfully.
May 13 23:56:28.628146 systemd[1]: session-32.scope: Deactivated successfully.
May 13 23:56:28.629041 systemd-logind[1504]: Session 32 logged out. Waiting for processes to exit.
May 13 23:56:28.629991 systemd-logind[1504]: Removed session 32.
May 13 23:56:33.634023 systemd[1]: Started sshd@32-10.0.0.42:22-10.0.0.1:54448.service - OpenSSH per-connection server daemon (10.0.0.1:54448).
May 13 23:56:34.146681 sshd[5545]: Accepted publickey for core from 10.0.0.1 port 54448 ssh2: RSA SHA256:UIO2GBLwcS3ioFABoZ1D2izRTMNLszi+/rE/G21mFYQ
May 13 23:56:34.183272 sshd-session[5545]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:56:34.188809 systemd-logind[1504]: New session 33 of user core.
May 13 23:56:34.197008 systemd[1]: Started session-33.scope - Session 33 of User core.
May 13 23:56:34.377816 sshd[5547]: Connection closed by 10.0.0.1 port 54448
May 13 23:56:34.378245 sshd-session[5545]: pam_unix(sshd:session): session closed for user core
May 13 23:56:34.382950 systemd[1]: sshd@32-10.0.0.42:22-10.0.0.1:54448.service: Deactivated successfully.
May 13 23:56:34.385333 systemd[1]: session-33.scope: Deactivated successfully.
May 13 23:56:34.386082 systemd-logind[1504]: Session 33 logged out. Waiting for processes to exit.
May 13 23:56:34.387030 systemd-logind[1504]: Removed session 33.
May 13 23:56:39.392061 systemd[1]: Started sshd@33-10.0.0.42:22-10.0.0.1:40180.service - OpenSSH per-connection server daemon (10.0.0.1:40180).
May 13 23:56:39.443301 sshd[5566]: Accepted publickey for core from 10.0.0.1 port 40180 ssh2: RSA SHA256:UIO2GBLwcS3ioFABoZ1D2izRTMNLszi+/rE/G21mFYQ
May 13 23:56:39.445102 sshd-session[5566]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:56:39.450339 systemd-logind[1504]: New session 34 of user core.
May 13 23:56:39.466146 systemd[1]: Started session-34.scope - Session 34 of User core.
May 13 23:56:39.605790 sshd[5568]: Connection closed by 10.0.0.1 port 40180
May 13 23:56:39.606211 sshd-session[5566]: pam_unix(sshd:session): session closed for user core
May 13 23:56:39.610825 systemd[1]: sshd@33-10.0.0.42:22-10.0.0.1:40180.service: Deactivated successfully.
May 13 23:56:39.613250 systemd[1]: session-34.scope: Deactivated successfully.
May 13 23:56:39.614175 systemd-logind[1504]: Session 34 logged out. Waiting for processes to exit.
May 13 23:56:39.615128 systemd-logind[1504]: Removed session 34.
May 13 23:56:44.621264 systemd[1]: Started sshd@34-10.0.0.42:22-10.0.0.1:40196.service - OpenSSH per-connection server daemon (10.0.0.1:40196).
May 13 23:56:44.675908 sshd[5592]: Accepted publickey for core from 10.0.0.1 port 40196 ssh2: RSA SHA256:UIO2GBLwcS3ioFABoZ1D2izRTMNLszi+/rE/G21mFYQ
May 13 23:56:44.677767 sshd-session[5592]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:56:44.682794 systemd-logind[1504]: New session 35 of user core.
May 13 23:56:44.695022 systemd[1]: Started session-35.scope - Session 35 of User core.
May 13 23:56:44.812867 sshd[5594]: Connection closed by 10.0.0.1 port 40196
May 13 23:56:44.813254 sshd-session[5592]: pam_unix(sshd:session): session closed for user core
May 13 23:56:44.817466 systemd[1]: sshd@34-10.0.0.42:22-10.0.0.1:40196.service: Deactivated successfully.
May 13 23:56:44.819804 systemd[1]: session-35.scope: Deactivated successfully.
May 13 23:56:44.820528 systemd-logind[1504]: Session 35 logged out. Waiting for processes to exit.
May 13 23:56:44.821502 systemd-logind[1504]: Removed session 35.