May 13 23:58:05.058189 kernel: Linux version 6.6.89-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Tue May 13 22:08:35 -00 2025 May 13 23:58:05.058223 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlyprintk=ttyS0,115200 flatcar.first_boot=detected flatcar.oem.id=azure flatcar.autologin verity.usrhash=8b3c5774a4242053287d41edc0d029958b7c22c131f7dd36b16a68182354e130 May 13 23:58:05.058239 kernel: BIOS-provided physical RAM map: May 13 23:58:05.058250 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable May 13 23:58:05.058260 kernel: BIOS-e820: [mem 0x00000000000c0000-0x00000000000fffff] reserved May 13 23:58:05.058271 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000003ff40fff] usable May 13 23:58:05.058284 kernel: BIOS-e820: [mem 0x000000003ff41000-0x000000003ffc8fff] reserved May 13 23:58:05.058295 kernel: BIOS-e820: [mem 0x000000003ffc9000-0x000000003fffafff] ACPI data May 13 23:58:05.058309 kernel: BIOS-e820: [mem 0x000000003fffb000-0x000000003fffefff] ACPI NVS May 13 23:58:05.058320 kernel: BIOS-e820: [mem 0x000000003ffff000-0x000000003fffffff] usable May 13 23:58:05.058331 kernel: BIOS-e820: [mem 0x0000000100000000-0x00000002bfffffff] usable May 13 23:58:05.058343 kernel: printk: bootconsole [earlyser0] enabled May 13 23:58:05.058354 kernel: NX (Execute Disable) protection: active May 13 23:58:05.058365 kernel: APIC: Static calls initialized May 13 23:58:05.058382 kernel: efi: EFI v2.7 by Microsoft May 13 23:58:05.058395 kernel: efi: ACPI=0x3fffa000 ACPI 2.0=0x3fffa014 SMBIOS=0x3ff85000 SMBIOS 3.0=0x3ff83000 MEMATTR=0x3f5c1a98 RNG=0x3ffd1018 May 13 23:58:05.058407 kernel: random: crng init done May 13 23:58:05.058419 kernel: 
secureboot: Secure boot disabled May 13 23:58:05.058431 kernel: SMBIOS 3.1.0 present. May 13 23:58:05.058444 kernel: DMI: Microsoft Corporation Virtual Machine/Virtual Machine, BIOS Hyper-V UEFI Release v4.1 03/08/2024 May 13 23:58:05.058456 kernel: Hypervisor detected: Microsoft Hyper-V May 13 23:58:05.058469 kernel: Hyper-V: privilege flags low 0x2e7f, high 0x3b8030, hints 0x64e24, misc 0xbed7b2 May 13 23:58:05.058481 kernel: Hyper-V: Host Build 10.0.20348.1827-1-0 May 13 23:58:05.058493 kernel: Hyper-V: Nested features: 0x1e0101 May 13 23:58:05.058505 kernel: Hyper-V: LAPIC Timer Frequency: 0x30d40 May 13 23:58:05.058519 kernel: Hyper-V: Using hypercall for remote TLB flush May 13 23:58:05.058532 kernel: clocksource: hyperv_clocksource_tsc_page: mask: 0xffffffffffffffff max_cycles: 0x24e6a1710, max_idle_ns: 440795202120 ns May 13 23:58:05.058545 kernel: clocksource: hyperv_clocksource_msr: mask: 0xffffffffffffffff max_cycles: 0x24e6a1710, max_idle_ns: 440795202120 ns May 13 23:58:05.058558 kernel: tsc: Marking TSC unstable due to running on Hyper-V May 13 23:58:05.058571 kernel: tsc: Detected 2593.904 MHz processor May 13 23:58:05.058583 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved May 13 23:58:05.058596 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable May 13 23:58:05.058609 kernel: last_pfn = 0x2c0000 max_arch_pfn = 0x400000000 May 13 23:58:05.058621 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs May 13 23:58:05.058637 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT May 13 23:58:05.058650 kernel: e820: update [mem 0x40000000-0xffffffff] usable ==> reserved May 13 23:58:05.058662 kernel: last_pfn = 0x40000 max_arch_pfn = 0x400000000 May 13 23:58:05.058675 kernel: Using GB pages for direct mapping May 13 23:58:05.058687 kernel: ACPI: Early table checksum verification disabled May 13 23:58:05.058700 kernel: ACPI: RSDP 0x000000003FFFA014 000024 (v02 VRTUAL) May 13 
23:58:05.058718 kernel: ACPI: XSDT 0x000000003FFF90E8 000064 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001) May 13 23:58:05.058734 kernel: ACPI: FACP 0x000000003FFF8000 000114 (v06 VRTUAL MICROSFT 00000001 MSFT 00000001) May 13 23:58:05.058747 kernel: ACPI: DSDT 0x000000003FFD6000 01E184 (v02 MSFTVM DSDT01 00000001 MSFT 05000000) May 13 23:58:05.058760 kernel: ACPI: FACS 0x000000003FFFE000 000040 May 13 23:58:05.058773 kernel: ACPI: OEM0 0x000000003FFF7000 000064 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001) May 13 23:58:05.058787 kernel: ACPI: SPCR 0x000000003FFF6000 000050 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001) May 13 23:58:05.058801 kernel: ACPI: WAET 0x000000003FFF5000 000028 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001) May 13 23:58:05.058815 kernel: ACPI: APIC 0x000000003FFD5000 000058 (v04 VRTUAL MICROSFT 00000001 MSFT 00000001) May 13 23:58:05.058830 kernel: ACPI: SRAT 0x000000003FFD4000 0002D0 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001) May 13 23:58:05.058844 kernel: ACPI: BGRT 0x000000003FFD3000 000038 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001) May 13 23:58:05.058857 kernel: ACPI: FPDT 0x000000003FFD2000 000034 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001) May 13 23:58:05.058871 kernel: ACPI: Reserving FACP table memory at [mem 0x3fff8000-0x3fff8113] May 13 23:58:05.058885 kernel: ACPI: Reserving DSDT table memory at [mem 0x3ffd6000-0x3fff4183] May 13 23:58:05.058898 kernel: ACPI: Reserving FACS table memory at [mem 0x3fffe000-0x3fffe03f] May 13 23:58:05.058911 kernel: ACPI: Reserving OEM0 table memory at [mem 0x3fff7000-0x3fff7063] May 13 23:58:05.058924 kernel: ACPI: Reserving SPCR table memory at [mem 0x3fff6000-0x3fff604f] May 13 23:58:05.058940 kernel: ACPI: Reserving WAET table memory at [mem 0x3fff5000-0x3fff5027] May 13 23:58:05.058953 kernel: ACPI: Reserving APIC table memory at [mem 0x3ffd5000-0x3ffd5057] May 13 23:58:05.058967 kernel: ACPI: Reserving SRAT table memory at [mem 0x3ffd4000-0x3ffd42cf] May 13 23:58:05.058980 kernel: ACPI: 
Reserving BGRT table memory at [mem 0x3ffd3000-0x3ffd3037] May 13 23:58:05.058993 kernel: ACPI: Reserving FPDT table memory at [mem 0x3ffd2000-0x3ffd2033] May 13 23:58:05.059005 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0 May 13 23:58:05.059018 kernel: SRAT: PXM 0 -> APIC 0x01 -> Node 0 May 13 23:58:05.059051 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x3fffffff] hotplug May 13 23:58:05.059065 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x2bfffffff] hotplug May 13 23:58:05.059081 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x2c0000000-0xfdfffffff] hotplug May 13 23:58:05.059095 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1000000000-0xffffffffff] hotplug May 13 23:58:05.059109 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x10000000000-0x1ffffffffff] hotplug May 13 23:58:05.059122 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x20000000000-0x3ffffffffff] hotplug May 13 23:58:05.059135 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000000-0x7ffffffffff] hotplug May 13 23:58:05.059148 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000000-0xfffffffffff] hotplug May 13 23:58:05.059161 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000000-0x1fffffffffff] hotplug May 13 23:58:05.059174 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x200000000000-0x3fffffffffff] hotplug May 13 23:58:05.059188 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x400000000000-0x7fffffffffff] hotplug May 13 23:58:05.059204 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x800000000000-0xffffffffffff] hotplug May 13 23:58:05.059218 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1000000000000-0x1ffffffffffff] hotplug May 13 23:58:05.059231 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x2000000000000-0x3ffffffffffff] hotplug May 13 23:58:05.059245 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x4000000000000-0x7ffffffffffff] hotplug May 13 23:58:05.059258 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x8000000000000-0xfffffffffffff] hotplug May 13 23:58:05.059271 kernel: NUMA: Node 0 [mem 0x00000000-0x3fffffff] + [mem 0x100000000-0x2bfffffff] -> [mem 0x00000000-0x2bfffffff] 
May 13 23:58:05.059285 kernel: NODE_DATA(0) allocated [mem 0x2bfffa000-0x2bfffffff] May 13 23:58:05.059298 kernel: Zone ranges: May 13 23:58:05.059312 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] May 13 23:58:05.059328 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff] May 13 23:58:05.059341 kernel: Normal [mem 0x0000000100000000-0x00000002bfffffff] May 13 23:58:05.059354 kernel: Movable zone start for each node May 13 23:58:05.059367 kernel: Early memory node ranges May 13 23:58:05.059381 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff] May 13 23:58:05.059394 kernel: node 0: [mem 0x0000000000100000-0x000000003ff40fff] May 13 23:58:05.059407 kernel: node 0: [mem 0x000000003ffff000-0x000000003fffffff] May 13 23:58:05.059421 kernel: node 0: [mem 0x0000000100000000-0x00000002bfffffff] May 13 23:58:05.059434 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x00000002bfffffff] May 13 23:58:05.059450 kernel: On node 0, zone DMA: 1 pages in unavailable ranges May 13 23:58:05.059463 kernel: On node 0, zone DMA: 96 pages in unavailable ranges May 13 23:58:05.059477 kernel: On node 0, zone DMA32: 190 pages in unavailable ranges May 13 23:58:05.059490 kernel: ACPI: PM-Timer IO Port: 0x408 May 13 23:58:05.059503 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] dfl dfl lint[0x1]) May 13 23:58:05.059517 kernel: IOAPIC[0]: apic_id 2, version 17, address 0xfec00000, GSI 0-23 May 13 23:58:05.059530 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) May 13 23:58:05.059544 kernel: ACPI: Using ACPI (MADT) for SMP configuration information May 13 23:58:05.059557 kernel: ACPI: SPCR: console: uart,io,0x3f8,115200 May 13 23:58:05.059573 kernel: smpboot: Allowing 2 CPUs, 0 hotplug CPUs May 13 23:58:05.059586 kernel: [mem 0x40000000-0xffffffff] available for PCI devices May 13 23:58:05.059600 kernel: Booting paravirtualized kernel on Hyper-V May 13 23:58:05.059614 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, 
max_idle_ns: 1910969940391419 ns May 13 23:58:05.059628 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1 May 13 23:58:05.059642 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u1048576 May 13 23:58:05.059655 kernel: pcpu-alloc: s197032 r8192 d32344 u1048576 alloc=1*2097152 May 13 23:58:05.059668 kernel: pcpu-alloc: [0] 0 1 May 13 23:58:05.059681 kernel: Hyper-V: PV spinlocks enabled May 13 23:58:05.059698 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear) May 13 23:58:05.059713 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlyprintk=ttyS0,115200 flatcar.first_boot=detected flatcar.oem.id=azure flatcar.autologin verity.usrhash=8b3c5774a4242053287d41edc0d029958b7c22c131f7dd36b16a68182354e130 May 13 23:58:05.059727 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. May 13 23:58:05.059741 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear) May 13 23:58:05.059755 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) May 13 23:58:05.059768 kernel: Fallback order for Node 0: 0 May 13 23:58:05.059781 kernel: Built 1 zonelists, mobility grouping on. Total pages: 2062618 May 13 23:58:05.059795 kernel: Policy zone: Normal May 13 23:58:05.059821 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off May 13 23:58:05.059836 kernel: software IO TLB: area num 2. 
May 13 23:58:05.059853 kernel: Memory: 8072992K/8387460K available (14336K kernel code, 2296K rwdata, 25068K rodata, 43604K init, 1468K bss, 314212K reserved, 0K cma-reserved) May 13 23:58:05.059867 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 May 13 23:58:05.059881 kernel: ftrace: allocating 37993 entries in 149 pages May 13 23:58:05.059896 kernel: ftrace: allocated 149 pages with 4 groups May 13 23:58:05.059910 kernel: Dynamic Preempt: voluntary May 13 23:58:05.059924 kernel: rcu: Preemptible hierarchical RCU implementation. May 13 23:58:05.059939 kernel: rcu: RCU event tracing is enabled. May 13 23:58:05.059954 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. May 13 23:58:05.059972 kernel: Trampoline variant of Tasks RCU enabled. May 13 23:58:05.059987 kernel: Rude variant of Tasks RCU enabled. May 13 23:58:05.060003 kernel: Tracing variant of Tasks RCU enabled. May 13 23:58:05.060018 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. May 13 23:58:05.064282 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 May 13 23:58:05.064303 kernel: Using NULL legacy PIC May 13 23:58:05.064322 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 0 May 13 23:58:05.064336 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. 
May 13 23:58:05.064350 kernel: Console: colour dummy device 80x25 May 13 23:58:05.064364 kernel: printk: console [tty1] enabled May 13 23:58:05.064378 kernel: printk: console [ttyS0] enabled May 13 23:58:05.064392 kernel: printk: bootconsole [earlyser0] disabled May 13 23:58:05.064406 kernel: ACPI: Core revision 20230628 May 13 23:58:05.064420 kernel: Failed to register legacy timer interrupt May 13 23:58:05.064434 kernel: APIC: Switch to symmetric I/O mode setup May 13 23:58:05.064448 kernel: Hyper-V: enabling crash_kexec_post_notifiers May 13 23:58:05.064465 kernel: Hyper-V: Using IPI hypercalls May 13 23:58:05.064479 kernel: APIC: send_IPI() replaced with hv_send_ipi() May 13 23:58:05.064493 kernel: APIC: send_IPI_mask() replaced with hv_send_ipi_mask() May 13 23:58:05.064507 kernel: APIC: send_IPI_mask_allbutself() replaced with hv_send_ipi_mask_allbutself() May 13 23:58:05.064521 kernel: APIC: send_IPI_allbutself() replaced with hv_send_ipi_allbutself() May 13 23:58:05.064535 kernel: APIC: send_IPI_all() replaced with hv_send_ipi_all() May 13 23:58:05.064549 kernel: APIC: send_IPI_self() replaced with hv_send_ipi_self() May 13 23:58:05.064564 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 5187.80 BogoMIPS (lpj=2593904) May 13 23:58:05.064581 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8 May 13 23:58:05.064595 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4 May 13 23:58:05.064609 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization May 13 23:58:05.064622 kernel: Spectre V2 : Mitigation: Retpolines May 13 23:58:05.064636 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT May 13 23:58:05.064650 kernel: RETBleed: WARNING: Spectre v2 mitigation leaves CPU vulnerable to RETBleed attacks, data leaks possible! 
May 13 23:58:05.064664 kernel: RETBleed: Vulnerable May 13 23:58:05.064677 kernel: Speculative Store Bypass: Vulnerable May 13 23:58:05.064691 kernel: TAA: Vulnerable: Clear CPU buffers attempted, no microcode May 13 23:58:05.064705 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode May 13 23:58:05.064718 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' May 13 23:58:05.064734 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' May 13 23:58:05.064748 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' May 13 23:58:05.064762 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask' May 13 23:58:05.064776 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256' May 13 23:58:05.064789 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256' May 13 23:58:05.064803 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 May 13 23:58:05.064816 kernel: x86/fpu: xstate_offset[5]: 832, xstate_sizes[5]: 64 May 13 23:58:05.064830 kernel: x86/fpu: xstate_offset[6]: 896, xstate_sizes[6]: 512 May 13 23:58:05.064843 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 1024 May 13 23:58:05.064857 kernel: x86/fpu: Enabled xstate features 0xe7, context size is 2432 bytes, using 'compacted' format. May 13 23:58:05.064871 kernel: Freeing SMP alternatives memory: 32K May 13 23:58:05.064887 kernel: pid_max: default: 32768 minimum: 301 May 13 23:58:05.064900 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity May 13 23:58:05.064914 kernel: landlock: Up and running. May 13 23:58:05.064928 kernel: SELinux: Initializing. 
May 13 23:58:05.064942 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear) May 13 23:58:05.064956 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear) May 13 23:58:05.064970 kernel: smpboot: CPU0: Intel(R) Xeon(R) Platinum 8272CL CPU @ 2.60GHz (family: 0x6, model: 0x55, stepping: 0x7) May 13 23:58:05.064983 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. May 13 23:58:05.064998 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. May 13 23:58:05.065012 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. May 13 23:58:05.065040 kernel: Performance Events: unsupported p6 CPU model 85 no PMU driver, software events only. May 13 23:58:05.065054 kernel: signal: max sigframe size: 3632 May 13 23:58:05.065068 kernel: rcu: Hierarchical SRCU implementation. May 13 23:58:05.065082 kernel: rcu: Max phase no-delay instances is 400. May 13 23:58:05.065096 kernel: NMI watchdog: Perf NMI watchdog permanently disabled May 13 23:58:05.065110 kernel: smp: Bringing up secondary CPUs ... May 13 23:58:05.065124 kernel: smpboot: x86: Booting SMP configuration: May 13 23:58:05.065138 kernel: .... node #0, CPUs: #1 May 13 23:58:05.065152 kernel: TAA CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/tsx_async_abort.html for more details. May 13 23:58:05.065170 kernel: MMIO Stale Data CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/processor_mmio_stale_data.html for more details. 
May 13 23:58:05.065184 kernel: smp: Brought up 1 node, 2 CPUs May 13 23:58:05.065198 kernel: smpboot: Max logical packages: 1 May 13 23:58:05.065212 kernel: smpboot: Total of 2 processors activated (10375.61 BogoMIPS) May 13 23:58:05.065226 kernel: devtmpfs: initialized May 13 23:58:05.065239 kernel: x86/mm: Memory block size: 128MB May 13 23:58:05.065254 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x3fffb000-0x3fffefff] (16384 bytes) May 13 23:58:05.065268 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns May 13 23:58:05.065282 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) May 13 23:58:05.065299 kernel: pinctrl core: initialized pinctrl subsystem May 13 23:58:05.065313 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family May 13 23:58:05.065327 kernel: audit: initializing netlink subsys (disabled) May 13 23:58:05.065341 kernel: audit: type=2000 audit(1747180683.027:1): state=initialized audit_enabled=0 res=1 May 13 23:58:05.065355 kernel: thermal_sys: Registered thermal governor 'step_wise' May 13 23:58:05.065369 kernel: thermal_sys: Registered thermal governor 'user_space' May 13 23:58:05.065383 kernel: cpuidle: using governor menu May 13 23:58:05.065396 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 May 13 23:58:05.065410 kernel: dca service started, version 1.12.1 May 13 23:58:05.065427 kernel: e820: reserve RAM buffer [mem 0x3ff41000-0x3fffffff] May 13 23:58:05.065440 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
May 13 23:58:05.065454 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages May 13 23:58:05.065468 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page May 13 23:58:05.065482 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages May 13 23:58:05.065496 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page May 13 23:58:05.065510 kernel: ACPI: Added _OSI(Module Device) May 13 23:58:05.065523 kernel: ACPI: Added _OSI(Processor Device) May 13 23:58:05.065538 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) May 13 23:58:05.065550 kernel: ACPI: Added _OSI(Processor Aggregator Device) May 13 23:58:05.065563 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded May 13 23:58:05.065577 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC May 13 23:58:05.065590 kernel: ACPI: Interpreter enabled May 13 23:58:05.065603 kernel: ACPI: PM: (supports S0 S5) May 13 23:58:05.065617 kernel: ACPI: Using IOAPIC for interrupt routing May 13 23:58:05.065631 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug May 13 23:58:05.065645 kernel: PCI: Ignoring E820 reservations for host bridge windows May 13 23:58:05.065661 kernel: ACPI: Enabled 1 GPEs in block 00 to 0F May 13 23:58:05.065675 kernel: iommu: Default domain type: Translated May 13 23:58:05.065688 kernel: iommu: DMA domain TLB invalidation policy: lazy mode May 13 23:58:05.065702 kernel: efivars: Registered efivars operations May 13 23:58:05.065715 kernel: PCI: Using ACPI for IRQ routing May 13 23:58:05.065729 kernel: PCI: System does not support PCI May 13 23:58:05.065742 kernel: vgaarb: loaded May 13 23:58:05.065756 kernel: clocksource: Switched to clocksource hyperv_clocksource_tsc_page May 13 23:58:05.065769 kernel: VFS: Disk quotas dquot_6.6.0 May 13 23:58:05.065782 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) May 13 23:58:05.065799 kernel: pnp: PnP ACPI init May 13 23:58:05.065812 
kernel: pnp: PnP ACPI: found 3 devices May 13 23:58:05.065826 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns May 13 23:58:05.065840 kernel: NET: Registered PF_INET protocol family May 13 23:58:05.065853 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear) May 13 23:58:05.065867 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear) May 13 23:58:05.065881 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) May 13 23:58:05.065894 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear) May 13 23:58:05.065911 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear) May 13 23:58:05.065924 kernel: TCP: Hash tables configured (established 65536 bind 65536) May 13 23:58:05.065938 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear) May 13 23:58:05.065952 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear) May 13 23:58:05.065966 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family May 13 23:58:05.065980 kernel: NET: Registered PF_XDP protocol family May 13 23:58:05.065994 kernel: PCI: CLS 0 bytes, default 64 May 13 23:58:05.066007 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) May 13 23:58:05.066021 kernel: software IO TLB: mapped [mem 0x000000003b5c1000-0x000000003f5c1000] (64MB) May 13 23:58:05.066046 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer May 13 23:58:05.066060 kernel: Initialise system trusted keyrings May 13 23:58:05.066073 kernel: workingset: timestamp_bits=39 max_order=21 bucket_order=0 May 13 23:58:05.066087 kernel: Key type asymmetric registered May 13 23:58:05.066100 kernel: Asymmetric key parser 'x509' registered May 13 23:58:05.066114 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251) May 13 23:58:05.066127 kernel: io scheduler mq-deadline 
registered May 13 23:58:05.066141 kernel: io scheduler kyber registered May 13 23:58:05.066155 kernel: io scheduler bfq registered May 13 23:58:05.066171 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 May 13 23:58:05.066184 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled May 13 23:58:05.066198 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A May 13 23:58:05.066212 kernel: 00:01: ttyS1 at I/O 0x2f8 (irq = 3, base_baud = 115200) is a 16550A May 13 23:58:05.066224 kernel: i8042: PNP: No PS/2 controller found. May 13 23:58:05.066408 kernel: rtc_cmos 00:02: registered as rtc0 May 13 23:58:05.066527 kernel: rtc_cmos 00:02: setting system clock to 2025-05-13T23:58:04 UTC (1747180684) May 13 23:58:05.066636 kernel: rtc_cmos 00:02: alarms up to one month, 114 bytes nvram May 13 23:58:05.066657 kernel: intel_pstate: CPU model not supported May 13 23:58:05.066672 kernel: efifb: probing for efifb May 13 23:58:05.066686 kernel: efifb: framebuffer at 0x40000000, using 3072k, total 3072k May 13 23:58:05.066700 kernel: efifb: mode is 1024x768x32, linelength=4096, pages=1 May 13 23:58:05.066713 kernel: efifb: scrolling: redraw May 13 23:58:05.066727 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 May 13 23:58:05.066741 kernel: Console: switching to colour frame buffer device 128x48 May 13 23:58:05.066755 kernel: fb0: EFI VGA frame buffer device May 13 23:58:05.066769 kernel: pstore: Using crash dump compression: deflate May 13 23:58:05.066785 kernel: pstore: Registered efi_pstore as persistent store backend May 13 23:58:05.066799 kernel: NET: Registered PF_INET6 protocol family May 13 23:58:05.066813 kernel: Segment Routing with IPv6 May 13 23:58:05.066826 kernel: In-situ OAM (IOAM) with IPv6 May 13 23:58:05.066840 kernel: NET: Registered PF_PACKET protocol family May 13 23:58:05.066855 kernel: Key type dns_resolver registered May 13 23:58:05.066869 kernel: IPI shorthand broadcast: enabled May 13 23:58:05.066883 kernel: 
sched_clock: Marking stable (826156600, 44016600)->(1060718800, -190545600) May 13 23:58:05.066897 kernel: registered taskstats version 1 May 13 23:58:05.066914 kernel: Loading compiled-in X.509 certificates May 13 23:58:05.066928 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.89-flatcar: 166efda032ca4d6e9037c569aca9b53585ee6f94' May 13 23:58:05.066942 kernel: Key type .fscrypt registered May 13 23:58:05.066956 kernel: Key type fscrypt-provisioning registered May 13 23:58:05.066970 kernel: ima: No TPM chip found, activating TPM-bypass! May 13 23:58:05.066983 kernel: ima: Allocated hash algorithm: sha1 May 13 23:58:05.066997 kernel: ima: No architecture policies found May 13 23:58:05.067010 kernel: clk: Disabling unused clocks May 13 23:58:05.067024 kernel: Freeing unused kernel image (initmem) memory: 43604K May 13 23:58:05.067055 kernel: Write protecting the kernel read-only data: 40960k May 13 23:58:05.067069 kernel: Freeing unused kernel image (rodata/data gap) memory: 1556K May 13 23:58:05.067082 kernel: Run /init as init process May 13 23:58:05.067096 kernel: with arguments: May 13 23:58:05.067108 kernel: /init May 13 23:58:05.067121 kernel: with environment: May 13 23:58:05.067134 kernel: HOME=/ May 13 23:58:05.067146 kernel: TERM=linux May 13 23:58:05.067159 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a May 13 23:58:05.067177 systemd[1]: Successfully made /usr/ read-only. May 13 23:58:05.067194 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 13 23:58:05.067209 systemd[1]: Detected virtualization microsoft. May 13 23:58:05.067222 systemd[1]: Detected architecture x86-64. May 13 23:58:05.067236 systemd[1]: Running in initrd. 
May 13 23:58:05.067249 systemd[1]: No hostname configured, using default hostname. May 13 23:58:05.067263 systemd[1]: Hostname set to . May 13 23:58:05.067280 systemd[1]: Initializing machine ID from random generator. May 13 23:58:05.067294 systemd[1]: Queued start job for default target initrd.target. May 13 23:58:05.067308 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 13 23:58:05.067322 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 13 23:58:05.067337 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... May 13 23:58:05.067350 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... May 13 23:58:05.067364 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... May 13 23:58:05.067382 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... May 13 23:58:05.067397 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... May 13 23:58:05.067411 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... May 13 23:58:05.067425 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 13 23:58:05.067439 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. May 13 23:58:05.067454 systemd[1]: Reached target paths.target - Path Units. May 13 23:58:05.067467 systemd[1]: Reached target slices.target - Slice Units. May 13 23:58:05.067481 systemd[1]: Reached target swap.target - Swaps. May 13 23:58:05.067498 systemd[1]: Reached target timers.target - Timer Units. May 13 23:58:05.067511 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. 
May 13 23:58:05.067525 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 13 23:58:05.067540 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). May 13 23:58:05.067554 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. May 13 23:58:05.067568 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. May 13 23:58:05.067582 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. May 13 23:58:05.067596 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. May 13 23:58:05.067609 systemd[1]: Reached target sockets.target - Socket Units. May 13 23:58:05.067626 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... May 13 23:58:05.067640 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 13 23:58:05.067654 systemd[1]: Finished network-cleanup.service - Network Cleanup. May 13 23:58:05.067668 systemd[1]: Starting systemd-fsck-usr.service... May 13 23:58:05.067681 systemd[1]: Starting systemd-journald.service - Journal Service... May 13 23:58:05.067695 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... May 13 23:58:05.067709 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 13 23:58:05.067747 systemd-journald[177]: Collecting audit messages is disabled. May 13 23:58:05.067781 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. May 13 23:58:05.067795 systemd-journald[177]: Journal started May 13 23:58:05.067827 systemd-journald[177]: Runtime Journal (/run/log/journal/ea73c083a26741859cdbc99875d5c91f) is 8M, max 158.7M, 150.7M free. May 13 23:58:05.074041 systemd[1]: Started systemd-journald.service - Journal Service. May 13 23:58:05.077314 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 13 23:58:05.083484 systemd[1]: Finished systemd-fsck-usr.service. 
May 13 23:58:05.085351 systemd-modules-load[179]: Inserted module 'overlay' May 13 23:58:05.093201 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... May 13 23:58:05.104291 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 13 23:58:05.111527 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 13 23:58:05.133188 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. May 13 23:58:05.133256 kernel: Bridge firewalling registered May 13 23:58:05.133308 systemd-modules-load[179]: Inserted module 'br_netfilter' May 13 23:58:05.138523 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 13 23:58:05.145349 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 13 23:58:05.154908 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... May 13 23:58:05.159476 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... May 13 23:58:05.171174 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 13 23:58:05.174879 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 13 23:58:05.182675 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 13 23:58:05.187136 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 13 23:58:05.205447 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 13 23:58:05.209163 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 13 23:58:05.219791 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... 
May 13 23:58:05.244992 systemd-resolved[204]: Positive Trust Anchors: May 13 23:58:05.245006 systemd-resolved[204]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 13 23:58:05.245078 systemd-resolved[204]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 13 23:58:05.250921 systemd-resolved[204]: Defaulting to hostname 'linux'. May 13 23:58:05.252903 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 13 23:58:05.278778 dracut-cmdline[215]: dracut-dracut-053 May 13 23:58:05.278778 dracut-cmdline[215]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlyprintk=ttyS0,115200 flatcar.first_boot=detected flatcar.oem.id=azure flatcar.autologin verity.usrhash=8b3c5774a4242053287d41edc0d029958b7c22c131f7dd36b16a68182354e130 May 13 23:58:05.270492 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 13 23:58:05.345053 kernel: SCSI subsystem initialized May 13 23:58:05.355051 kernel: Loading iSCSI transport class v2.0-870. 
May 13 23:58:05.367057 kernel: iscsi: registered transport (tcp) May 13 23:58:05.387443 kernel: iscsi: registered transport (qla4xxx) May 13 23:58:05.387547 kernel: QLogic iSCSI HBA Driver May 13 23:58:05.422983 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. May 13 23:58:05.427155 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... May 13 23:58:05.460957 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. May 13 23:58:05.461070 kernel: device-mapper: uevent: version 1.0.3 May 13 23:58:05.464245 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com May 13 23:58:05.504059 kernel: raid6: avx512x4 gen() 18474 MB/s May 13 23:58:05.523043 kernel: raid6: avx512x2 gen() 18467 MB/s May 13 23:58:05.542037 kernel: raid6: avx512x1 gen() 18583 MB/s May 13 23:58:05.561041 kernel: raid6: avx2x4 gen() 18506 MB/s May 13 23:58:05.580041 kernel: raid6: avx2x2 gen() 18459 MB/s May 13 23:58:05.599806 kernel: raid6: avx2x1 gen() 13962 MB/s May 13 23:58:05.599845 kernel: raid6: using algorithm avx512x1 gen() 18583 MB/s May 13 23:58:05.620803 kernel: raid6: .... xor() 26822 MB/s, rmw enabled May 13 23:58:05.620840 kernel: raid6: using avx512x2 recovery algorithm May 13 23:58:05.644054 kernel: xor: automatically using best checksumming function avx May 13 23:58:05.788057 kernel: Btrfs loaded, zoned=no, fsverity=no May 13 23:58:05.797262 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. May 13 23:58:05.803219 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 13 23:58:05.822770 systemd-udevd[397]: Using default interface naming scheme 'v255'. May 13 23:58:05.827824 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 13 23:58:05.838757 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... 
May 13 23:58:05.859918 dracut-pre-trigger[408]: rd.md=0: removing MD RAID activation May 13 23:58:05.887632 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. May 13 23:58:05.893248 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 13 23:58:05.942117 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 13 23:58:05.952161 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... May 13 23:58:05.984962 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. May 13 23:58:05.992945 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. May 13 23:58:05.999616 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 13 23:58:06.005731 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 13 23:58:06.014200 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... May 13 23:58:06.045060 kernel: cryptd: max_cpu_qlen set to 1000 May 13 23:58:06.049304 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. May 13 23:58:06.060312 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. May 13 23:58:06.060521 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 13 23:58:06.078680 kernel: AVX2 version of gcm_enc/dec engaged. May 13 23:58:06.078721 kernel: AES CTR mode by8 optimization enabled May 13 23:58:06.075689 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... May 13 23:58:06.082113 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 13 23:58:06.082433 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 13 23:58:06.088066 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... May 13 23:58:06.094730 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
May 13 23:58:06.121332 kernel: hv_vmbus: Vmbus version:5.2 May 13 23:58:06.121363 kernel: hv_vmbus: registering driver hyperv_keyboard May 13 23:58:06.123524 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 13 23:58:06.123628 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 13 23:58:06.135393 kernel: hv_vmbus: registering driver hv_storvsc May 13 23:58:06.132365 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. May 13 23:58:06.147285 kernel: scsi host1: storvsc_host_t May 13 23:58:06.147345 kernel: scsi host0: storvsc_host_t May 13 23:58:06.145307 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 13 23:58:06.156044 kernel: scsi 0:0:0:0: Direct-Access Msft Virtual Disk 1.0 PQ: 0 ANSI: 5 May 13 23:58:06.160051 kernel: hv_vmbus: registering driver hv_netvsc May 13 23:58:06.166963 kernel: scsi 0:0:0:2: CD-ROM Msft Virtual DVD-ROM 1.0 PQ: 0 ANSI: 0 May 13 23:58:06.173042 kernel: hid: raw HID events driver (C) Jiri Kosina May 13 23:58:06.177722 kernel: pps_core: LinuxPPS API ver. 1 registered May 13 23:58:06.177748 kernel: hv_vmbus: registering driver hid_hyperv May 13 23:58:06.177761 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti May 13 23:58:06.196775 kernel: input: AT Translated Set 2 keyboard as /devices/LNXSYSTM:00/LNXSYBUS:00/ACPI0004:00/VMBUS:00/d34b2567-b9b6-42b9-8778-0a4ec0b955bf/serio0/input/input1 May 13 23:58:06.196835 kernel: input: Microsoft Vmbus HID-compliant Mouse as /devices/0006:045E:0621.0001/input/input0 May 13 23:58:06.206887 kernel: hid-hyperv 0006:045E:0621.0001: input: VIRTUAL HID v0.01 Mouse [Microsoft Vmbus HID-compliant Mouse] on May 13 23:58:06.207190 kernel: PTP clock support registered May 13 23:58:06.218746 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
May 13 23:58:06.225148 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... May 13 23:58:06.247882 kernel: hv_utils: Registering HyperV Utility Driver May 13 23:58:06.247934 kernel: hv_vmbus: registering driver hv_utils May 13 23:58:06.252319 kernel: sr 0:0:0:2: [sr0] scsi-1 drive May 13 23:58:06.252648 kernel: hv_utils: Heartbeat IC version 3.0 May 13 23:58:06.252681 kernel: hv_utils: Shutdown IC version 3.2 May 13 23:58:06.257065 kernel: hv_utils: TimeSync IC version 4.0 May 13 23:58:06.257097 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 May 13 23:58:06.722781 systemd-resolved[204]: Clock change detected. Flushing caches. May 13 23:58:06.729674 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 13 23:58:06.736240 kernel: sr 0:0:0:2: Attached scsi CD-ROM sr0 May 13 23:58:06.749597 kernel: sd 0:0:0:0: [sda] 63737856 512-byte logical blocks: (32.6 GB/30.4 GiB) May 13 23:58:06.749950 kernel: sd 0:0:0:0: [sda] 4096-byte physical blocks May 13 23:58:06.755769 kernel: sd 0:0:0:0: [sda] Write Protect is off May 13 23:58:06.756034 kernel: sd 0:0:0:0: [sda] Mode Sense: 0f 00 10 00 May 13 23:58:06.756197 kernel: sd 0:0:0:0: [sda] Write cache: disabled, read cache: enabled, supports DPO and FUA May 13 23:58:06.763060 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 May 13 23:58:06.763109 kernel: sd 0:0:0:0: [sda] Attached SCSI disk May 13 23:58:06.817961 kernel: hv_netvsc 7ced8d2c-77b2-7ced-8d2c-77b27ced8d2c eth0: VF slot 1 added May 13 23:58:06.828439 kernel: hv_vmbus: registering driver hv_pci May 13 23:58:06.828492 kernel: hv_pci b0273977-8e9b-4593-9d42-e5b53f4f3ba0: PCI VMBus probing: Using version 0x10004 May 13 23:58:06.835255 kernel: hv_pci b0273977-8e9b-4593-9d42-e5b53f4f3ba0: PCI host bridge to bus 8e9b:00 May 13 23:58:06.835547 kernel: pci_bus 8e9b:00: root bus resource [mem 0xfe0000000-0xfe00fffff window] May 13 23:58:06.838309 kernel: pci_bus 8e9b:00: No busn resource found for root bus, will use [bus 00-ff] May 13 23:58:06.843662 kernel: pci 8e9b:00:02.0: [15b3:1016] type 00 class 0x020000 May 13 23:58:06.847327 kernel: pci 8e9b:00:02.0: reg 0x10: [mem 0xfe0000000-0xfe00fffff 64bit pref] May 13 23:58:06.850439 kernel: pci 8e9b:00:02.0: enabling Extended Tags May 13 23:58:06.860473 kernel: pci 8e9b:00:02.0: 0.000 Gb/s available PCIe bandwidth, limited by Unknown x0 link at 8e9b:00:02.0 (capable of 63.008 Gb/s with 8.0 GT/s PCIe x8 link) May 13 23:58:06.867142 kernel: pci_bus 8e9b:00: busn_res: [bus 00-ff] end is updated to 00 May 13 23:58:06.867462 kernel: pci 8e9b:00:02.0: BAR 0: assigned [mem 0xfe0000000-0xfe00fffff 64bit pref] May 13 23:58:07.029428 kernel: mlx5_core 8e9b:00:02.0: enabling device (0000 -> 0002) May 13 23:58:07.033324 kernel: mlx5_core 8e9b:00:02.0: firmware version: 14.30.5000 May 13 23:58:07.247263 kernel: hv_netvsc 7ced8d2c-77b2-7ced-8d2c-77b27ced8d2c eth0: VF registering: eth1 May 13 23:58:07.247632 kernel: mlx5_core 8e9b:00:02.0 eth1: joined to eth0 May 13 23:58:07.254350 kernel: mlx5_core 8e9b:00:02.0: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0 basic) May 13 23:58:07.262314 kernel: mlx5_core 8e9b:00:02.0 enP36507s1: renamed from eth1 May 13 23:58:07.838325 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 scanned by (udev-worker) (445) May 13 23:58:07.859798 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM. May 13 23:58:07.889635 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_Disk EFI-SYSTEM. May 13 23:58:07.911185 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_Disk ROOT. May 13 23:58:07.950331 kernel: BTRFS: device fsid d2fbd39e-42cb-4ccb-87ec-99f56cfe77f8 devid 1 transid 39 /dev/sda3 scanned by (udev-worker) (446) May 13 23:58:07.967908 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_Disk USR-A. 
May 13 23:58:07.974031 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_Disk USR-A. May 13 23:58:07.985430 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... May 13 23:58:08.005330 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 May 13 23:58:08.012346 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 May 13 23:58:09.020867 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 May 13 23:58:09.022044 disk-uuid[603]: The operation has completed successfully. May 13 23:58:09.100986 systemd[1]: disk-uuid.service: Deactivated successfully. May 13 23:58:09.101095 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. May 13 23:58:09.140510 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... May 13 23:58:09.161019 sh[689]: Success May 13 23:58:09.190816 kernel: device-mapper: verity: sha256 using implementation "sha256-avx2" May 13 23:58:09.575457 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. May 13 23:58:09.584391 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... May 13 23:58:09.600809 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. May 13 23:58:09.620093 kernel: BTRFS info (device dm-0): first mount of filesystem d2fbd39e-42cb-4ccb-87ec-99f56cfe77f8 May 13 23:58:09.620175 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm May 13 23:58:09.623651 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead May 13 23:58:09.626365 kernel: BTRFS info (device dm-0): disabling log replay at mount time May 13 23:58:09.628852 kernel: BTRFS info (device dm-0): using free space tree May 13 23:58:10.253802 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. 
May 13 23:58:10.257026 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. May 13 23:58:10.260465 systemd[1]: Starting ignition-setup.service - Ignition (setup)... May 13 23:58:10.269412 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... May 13 23:58:10.296817 kernel: BTRFS info (device sda6): first mount of filesystem c0e200fb-7321-4d2d-86ff-b28bdae5fafc May 13 23:58:10.296891 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm May 13 23:58:10.299050 kernel: BTRFS info (device sda6): using free space tree May 13 23:58:10.338324 kernel: BTRFS info (device sda6): auto enabling async discard May 13 23:58:10.345396 kernel: BTRFS info (device sda6): last unmount of filesystem c0e200fb-7321-4d2d-86ff-b28bdae5fafc May 13 23:58:10.350413 systemd[1]: Finished ignition-setup.service - Ignition (setup). May 13 23:58:10.356513 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... May 13 23:58:10.366749 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 13 23:58:10.376250 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 13 23:58:10.402991 systemd-networkd[870]: lo: Link UP May 13 23:58:10.403002 systemd-networkd[870]: lo: Gained carrier May 13 23:58:10.405344 systemd-networkd[870]: Enumeration completed May 13 23:58:10.405632 systemd[1]: Started systemd-networkd.service - Network Configuration. May 13 23:58:10.409039 systemd-networkd[870]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 13 23:58:10.409048 systemd-networkd[870]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 13 23:58:10.413540 systemd[1]: Reached target network.target - Network. 
May 13 23:58:10.473332 kernel: mlx5_core 8e9b:00:02.0 enP36507s1: Link up May 13 23:58:10.504337 kernel: hv_netvsc 7ced8d2c-77b2-7ced-8d2c-77b27ced8d2c eth0: Data path switched to VF: enP36507s1 May 13 23:58:10.505375 systemd-networkd[870]: enP36507s1: Link UP May 13 23:58:10.505540 systemd-networkd[870]: eth0: Link UP May 13 23:58:10.505739 systemd-networkd[870]: eth0: Gained carrier May 13 23:58:10.505754 systemd-networkd[870]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 13 23:58:10.517372 systemd-networkd[870]: enP36507s1: Gained carrier May 13 23:58:10.547363 systemd-networkd[870]: eth0: DHCPv4 address 10.200.8.5/24, gateway 10.200.8.1 acquired from 168.63.129.16 May 13 23:58:11.732543 systemd-networkd[870]: eth0: Gained IPv6LL May 13 23:58:11.804632 ignition[865]: Ignition 2.20.0 May 13 23:58:11.804647 ignition[865]: Stage: fetch-offline May 13 23:58:11.804688 ignition[865]: no configs at "/usr/lib/ignition/base.d" May 13 23:58:11.804699 ignition[865]: no config dir at "/usr/lib/ignition/base.platform.d/azure" May 13 23:58:11.804808 ignition[865]: parsed url from cmdline: "" May 13 23:58:11.804815 ignition[865]: no config URL provided May 13 23:58:11.813530 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). May 13 23:58:11.804822 ignition[865]: reading system config file "/usr/lib/ignition/user.ign" May 13 23:58:11.804832 ignition[865]: no config at "/usr/lib/ignition/user.ign" May 13 23:58:11.804840 ignition[865]: failed to fetch config: resource requires networking May 13 23:58:11.805053 ignition[865]: Ignition finished successfully May 13 23:58:11.830852 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
May 13 23:58:11.851360 ignition[881]: Ignition 2.20.0 May 13 23:58:11.851371 ignition[881]: Stage: fetch May 13 23:58:11.851565 ignition[881]: no configs at "/usr/lib/ignition/base.d" May 13 23:58:11.851577 ignition[881]: no config dir at "/usr/lib/ignition/base.platform.d/azure" May 13 23:58:11.851664 ignition[881]: parsed url from cmdline: "" May 13 23:58:11.851667 ignition[881]: no config URL provided May 13 23:58:11.851671 ignition[881]: reading system config file "/usr/lib/ignition/user.ign" May 13 23:58:11.851678 ignition[881]: no config at "/usr/lib/ignition/user.ign" May 13 23:58:11.851701 ignition[881]: GET http://169.254.169.254/metadata/instance/compute/userData?api-version=2021-01-01&format=text: attempt #1 May 13 23:58:11.933488 ignition[881]: GET result: OK May 13 23:58:11.933595 ignition[881]: config has been read from IMDS userdata May 13 23:58:11.933633 ignition[881]: parsing config with SHA512: b364616eb76bac31a2b3ab04c5f86c561494a795dbc6d84ceed67014191f498c5942a4eec27fa33d410afd6e274ffd6889627d7e1f28444e62899ef87819ee8e May 13 23:58:11.938688 unknown[881]: fetched base config from "system" May 13 23:58:11.938702 unknown[881]: fetched base config from "system" May 13 23:58:11.939162 ignition[881]: fetch: fetch complete May 13 23:58:11.938711 unknown[881]: fetched user config from "azure" May 13 23:58:11.939168 ignition[881]: fetch: fetch passed May 13 23:58:11.940749 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). May 13 23:58:11.939219 ignition[881]: Ignition finished successfully May 13 23:58:11.946504 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... May 13 23:58:11.971069 ignition[887]: Ignition 2.20.0 May 13 23:58:11.971083 ignition[887]: Stage: kargs May 13 23:58:11.971285 ignition[887]: no configs at "/usr/lib/ignition/base.d" May 13 23:58:11.974923 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). 
May 13 23:58:11.971311 ignition[887]: no config dir at "/usr/lib/ignition/base.platform.d/azure" May 13 23:58:11.980454 systemd[1]: Starting ignition-disks.service - Ignition (disks)... May 13 23:58:11.972200 ignition[887]: kargs: kargs passed May 13 23:58:11.972242 ignition[887]: Ignition finished successfully May 13 23:58:12.003447 ignition[893]: Ignition 2.20.0 May 13 23:58:12.003458 ignition[893]: Stage: disks May 13 23:58:12.005365 systemd[1]: Finished ignition-disks.service - Ignition (disks). May 13 23:58:12.003662 ignition[893]: no configs at "/usr/lib/ignition/base.d" May 13 23:58:12.009790 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. May 13 23:58:12.003674 ignition[893]: no config dir at "/usr/lib/ignition/base.platform.d/azure" May 13 23:58:12.014150 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. May 13 23:58:12.004529 ignition[893]: disks: disks passed May 13 23:58:12.017266 systemd[1]: Reached target local-fs.target - Local File Systems. May 13 23:58:12.004569 ignition[893]: Ignition finished successfully May 13 23:58:12.023557 systemd[1]: Reached target sysinit.target - System Initialization. May 13 23:58:12.026741 systemd[1]: Reached target basic.target - Basic System. May 13 23:58:12.039431 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... May 13 23:58:12.153326 systemd-fsck[901]: ROOT: clean, 14/7326000 files, 477710/7359488 blocks May 13 23:58:12.158092 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. May 13 23:58:12.164794 systemd[1]: Mounting sysroot.mount - /sysroot... May 13 23:58:12.260496 kernel: EXT4-fs (sda9): mounted filesystem c413e98b-da35-46b1-9852-45706e1b1f52 r/w with ordered data mode. Quota mode: none. May 13 23:58:12.261249 systemd[1]: Mounted sysroot.mount - /sysroot. May 13 23:58:12.264284 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. 
May 13 23:58:12.322913 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 13 23:58:12.336391 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... May 13 23:58:12.343460 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... May 13 23:58:12.346343 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). May 13 23:58:12.346386 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. May 13 23:58:12.365330 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 scanned by mount (912) May 13 23:58:12.367964 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. May 13 23:58:12.376575 kernel: BTRFS info (device sda6): first mount of filesystem c0e200fb-7321-4d2d-86ff-b28bdae5fafc May 13 23:58:12.376607 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm May 13 23:58:12.376626 kernel: BTRFS info (device sda6): using free space tree May 13 23:58:12.380469 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... May 13 23:58:12.387561 kernel: BTRFS info (device sda6): auto enabling async discard May 13 23:58:12.390204 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
May 13 23:58:12.500428 systemd-networkd[870]: enP36507s1: Gained IPv6LL May 13 23:58:13.707237 coreos-metadata[914]: May 13 23:58:13.707 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1 May 13 23:58:13.713570 coreos-metadata[914]: May 13 23:58:13.713 INFO Fetch successful May 13 23:58:13.716147 coreos-metadata[914]: May 13 23:58:13.714 INFO Fetching http://169.254.169.254/metadata/instance/compute/name?api-version=2017-08-01&format=text: Attempt #1 May 13 23:58:13.731153 coreos-metadata[914]: May 13 23:58:13.731 INFO Fetch successful May 13 23:58:13.733702 coreos-metadata[914]: May 13 23:58:13.731 INFO wrote hostname ci-4284.0.0-n-c527831f7b to /sysroot/etc/hostname May 13 23:58:13.733476 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. May 13 23:58:16.048887 initrd-setup-root[942]: cut: /sysroot/etc/passwd: No such file or directory May 13 23:58:17.334255 initrd-setup-root[949]: cut: /sysroot/etc/group: No such file or directory May 13 23:58:18.544648 initrd-setup-root[956]: cut: /sysroot/etc/shadow: No such file or directory May 13 23:58:18.578443 initrd-setup-root[963]: cut: /sysroot/etc/gshadow: No such file or directory May 13 23:58:19.865919 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. May 13 23:58:19.872994 systemd[1]: Starting ignition-mount.service - Ignition (mount)... May 13 23:58:19.879001 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... May 13 23:58:19.897219 systemd[1]: sysroot-oem.mount: Deactivated successfully. May 13 23:58:19.903027 kernel: BTRFS info (device sda6): last unmount of filesystem c0e200fb-7321-4d2d-86ff-b28bdae5fafc May 13 23:58:19.920835 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
May 13 23:58:19.929691 ignition[1031]: INFO : Ignition 2.20.0 May 13 23:58:19.929691 ignition[1031]: INFO : Stage: mount May 13 23:58:19.936136 ignition[1031]: INFO : no configs at "/usr/lib/ignition/base.d" May 13 23:58:19.936136 ignition[1031]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" May 13 23:58:19.936136 ignition[1031]: INFO : mount: mount passed May 13 23:58:19.936136 ignition[1031]: INFO : Ignition finished successfully May 13 23:58:19.931577 systemd[1]: Finished ignition-mount.service - Ignition (mount). May 13 23:58:19.937415 systemd[1]: Starting ignition-files.service - Ignition (files)... May 13 23:58:19.960687 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 13 23:58:19.986922 kernel: BTRFS: device label OEM devid 1 transid 17 /dev/sda6 scanned by mount (1044) May 13 23:58:19.986978 kernel: BTRFS info (device sda6): first mount of filesystem c0e200fb-7321-4d2d-86ff-b28bdae5fafc May 13 23:58:19.990005 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm May 13 23:58:19.992428 kernel: BTRFS info (device sda6): using free space tree May 13 23:58:19.997315 kernel: BTRFS info (device sda6): auto enabling async discard May 13 23:58:19.999414 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
May 13 23:58:20.027944 ignition[1061]: INFO : Ignition 2.20.0 May 13 23:58:20.027944 ignition[1061]: INFO : Stage: files May 13 23:58:20.032053 ignition[1061]: INFO : no configs at "/usr/lib/ignition/base.d" May 13 23:58:20.032053 ignition[1061]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" May 13 23:58:20.032053 ignition[1061]: DEBUG : files: compiled without relabeling support, skipping May 13 23:58:20.055714 ignition[1061]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" May 13 23:58:20.055714 ignition[1061]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" May 13 23:58:20.216406 ignition[1061]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" May 13 23:58:20.220406 ignition[1061]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" May 13 23:58:20.224015 unknown[1061]: wrote ssh authorized keys file for user: core May 13 23:58:20.226658 ignition[1061]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" May 13 23:58:20.310973 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" May 13 23:58:20.316071 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 May 13 23:58:20.607709 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK May 13 23:58:20.829085 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" May 13 23:58:20.834245 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" May 13 23:58:20.834245 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" 
May 13 23:58:20.834245 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" May 13 23:58:20.847524 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" May 13 23:58:20.847524 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" May 13 23:58:20.856760 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" May 13 23:58:20.856760 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" May 13 23:58:20.856760 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" May 13 23:58:20.856760 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" May 13 23:58:20.856760 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" May 13 23:58:20.856760 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" May 13 23:58:20.856760 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" May 13 23:58:20.856760 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" May 13 23:58:20.856760 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.31.0-x86-64.raw: attempt #1 May 13 23:58:21.407877 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK May 13 23:58:21.706252 ignition[1061]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" May 13 23:58:21.706252 ignition[1061]: INFO : files: op(b): [started] processing unit "prepare-helm.service" May 13 23:58:21.716145 ignition[1061]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 13 23:58:21.720842 ignition[1061]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 13 23:58:21.720842 ignition[1061]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" May 13 23:58:21.730818 ignition[1061]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" May 13 23:58:21.730818 ignition[1061]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" May 13 23:58:21.730818 ignition[1061]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" May 13 23:58:21.730818 ignition[1061]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" May 13 23:58:21.730818 ignition[1061]: INFO : files: files passed May 13 23:58:21.730818 ignition[1061]: INFO : Ignition finished successfully May 13 23:58:21.722371 systemd[1]: Finished ignition-files.service - Ignition (files). May 13 23:58:21.731425 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... May 13 23:58:21.743908 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... 
May 13 23:58:21.751745 systemd[1]: ignition-quench.service: Deactivated successfully. May 13 23:58:21.777000 initrd-setup-root-after-ignition[1090]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 13 23:58:21.777000 initrd-setup-root-after-ignition[1090]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory May 13 23:58:21.751833 systemd[1]: Finished ignition-quench.service - Ignition (record completion). May 13 23:58:21.795141 initrd-setup-root-after-ignition[1094]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 13 23:58:21.766709 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. May 13 23:58:21.773804 systemd[1]: Reached target ignition-complete.target - Ignition Complete. May 13 23:58:21.783405 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... May 13 23:58:21.840996 systemd[1]: initrd-parse-etc.service: Deactivated successfully. May 13 23:58:21.841109 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. May 13 23:58:21.850830 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. May 13 23:58:21.856049 systemd[1]: Reached target initrd.target - Initrd Default Target. May 13 23:58:21.858661 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. May 13 23:58:21.861419 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... May 13 23:58:21.882410 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 13 23:58:21.887423 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... May 13 23:58:21.909935 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. May 13 23:58:21.917859 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. 
May 13 23:58:21.924660 systemd[1]: Stopped target timers.target - Timer Units.
May 13 23:58:21.927045 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
May 13 23:58:21.927170 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
May 13 23:58:21.932783 systemd[1]: Stopped target initrd.target - Initrd Default Target.
May 13 23:58:21.942418 systemd[1]: Stopped target basic.target - Basic System.
May 13 23:58:21.944761 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
May 13 23:58:21.949948 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
May 13 23:58:21.955634 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
May 13 23:58:21.963934 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
May 13 23:58:21.965075 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
May 13 23:58:21.965500 systemd[1]: Stopped target sysinit.target - System Initialization.
May 13 23:58:21.965900 systemd[1]: Stopped target local-fs.target - Local File Systems.
May 13 23:58:21.966317 systemd[1]: Stopped target swap.target - Swaps.
May 13 23:58:21.966700 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
May 13 23:58:21.966845 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
May 13 23:58:21.967596 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
May 13 23:58:21.968011 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 13 23:58:21.968382 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
May 13 23:58:21.984121 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 13 23:58:21.989555 systemd[1]: dracut-initqueue.service: Deactivated successfully.
May 13 23:58:21.989712 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
May 13 23:58:21.998539 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
May 13 23:58:21.998694 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
May 13 23:58:22.062487 ignition[1115]: INFO : Ignition 2.20.0
May 13 23:58:22.062487 ignition[1115]: INFO : Stage: umount
May 13 23:58:22.062487 ignition[1115]: INFO : no configs at "/usr/lib/ignition/base.d"
May 13 23:58:22.062487 ignition[1115]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
May 13 23:58:22.062487 ignition[1115]: INFO : umount: umount passed
May 13 23:58:22.062487 ignition[1115]: INFO : Ignition finished successfully
May 13 23:58:22.003897 systemd[1]: ignition-files.service: Deactivated successfully.
May 13 23:58:22.004052 systemd[1]: Stopped ignition-files.service - Ignition (files).
May 13 23:58:22.011322 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
May 13 23:58:22.011447 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
May 13 23:58:22.021498 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
May 13 23:58:22.023694 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
May 13 23:58:22.023867 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
May 13 23:58:22.037529 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
May 13 23:58:22.045445 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
May 13 23:58:22.045662 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
May 13 23:58:22.049611 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
May 13 23:58:22.049784 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
May 13 23:58:22.064165 systemd[1]: ignition-mount.service: Deactivated successfully.
May 13 23:58:22.064258 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
May 13 23:58:22.070033 systemd[1]: ignition-disks.service: Deactivated successfully.
May 13 23:58:22.071037 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
May 13 23:58:22.082368 systemd[1]: ignition-kargs.service: Deactivated successfully.
May 13 23:58:22.082450 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
May 13 23:58:22.085213 systemd[1]: ignition-fetch.service: Deactivated successfully.
May 13 23:58:22.085278 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
May 13 23:58:22.090066 systemd[1]: Stopped target network.target - Network.
May 13 23:58:22.092207 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
May 13 23:58:22.092314 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
May 13 23:58:22.097769 systemd[1]: Stopped target paths.target - Path Units.
May 13 23:58:22.110264 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
May 13 23:58:22.113158 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 13 23:58:22.123575 systemd[1]: Stopped target slices.target - Slice Units.
May 13 23:58:22.130830 systemd[1]: Stopped target sockets.target - Socket Units.
May 13 23:58:22.133621 systemd[1]: iscsid.socket: Deactivated successfully.
May 13 23:58:22.133678 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
May 13 23:58:22.141430 systemd[1]: iscsiuio.socket: Deactivated successfully.
May 13 23:58:22.141474 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
May 13 23:58:22.148741 systemd[1]: ignition-setup.service: Deactivated successfully.
May 13 23:58:22.148805 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
May 13 23:58:22.153481 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
May 13 23:58:22.153540 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
May 13 23:58:22.158544 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
May 13 23:58:22.165785 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
May 13 23:58:22.174325 systemd[1]: sysroot-boot.mount: Deactivated successfully.
May 13 23:58:22.175101 systemd[1]: systemd-resolved.service: Deactivated successfully.
May 13 23:58:22.175208 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
May 13 23:58:22.196918 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
May 13 23:58:22.197139 systemd[1]: systemd-networkd.service: Deactivated successfully.
May 13 23:58:22.197248 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
May 13 23:58:22.201970 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
May 13 23:58:22.202229 systemd[1]: initrd-cleanup.service: Deactivated successfully.
May 13 23:58:22.202328 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
May 13 23:58:22.206684 systemd[1]: sysroot-boot.service: Deactivated successfully.
May 13 23:58:22.206767 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
May 13 23:58:22.214127 systemd[1]: systemd-networkd.socket: Deactivated successfully.
May 13 23:58:22.214193 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
May 13 23:58:22.218600 systemd[1]: initrd-setup-root.service: Deactivated successfully.
May 13 23:58:22.218654 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
May 13 23:58:22.226385 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
May 13 23:58:22.236020 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
May 13 23:58:22.236086 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
May 13 23:58:22.244022 systemd[1]: systemd-sysctl.service: Deactivated successfully.
May 13 23:58:22.244084 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
May 13 23:58:22.249149 systemd[1]: systemd-modules-load.service: Deactivated successfully.
May 13 23:58:22.249206 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
May 13 23:58:22.254479 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
May 13 23:58:22.254525 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
May 13 23:58:22.260485 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
May 13 23:58:22.317037 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
May 13 23:58:22.317145 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
May 13 23:58:22.317580 systemd[1]: systemd-udevd.service: Deactivated successfully.
May 13 23:58:22.317722 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
May 13 23:58:22.327139 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
May 13 23:58:22.327214 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
May 13 23:58:22.331883 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
May 13 23:58:22.331927 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
May 13 23:58:22.337640 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
May 13 23:58:22.337701 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
May 13 23:58:22.345269 systemd[1]: dracut-cmdline.service: Deactivated successfully.
May 13 23:58:22.346519 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
May 13 23:58:22.363406 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
May 13 23:58:22.363483 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
May 13 23:58:22.373058 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
May 13 23:58:22.377081 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
May 13 23:58:22.378647 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
May 13 23:58:22.384376 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
May 13 23:58:22.384423 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
May 13 23:58:22.398337 kernel: hv_netvsc 7ced8d2c-77b2-7ced-8d2c-77b27ced8d2c eth0: Data path switched from VF: enP36507s1
May 13 23:58:22.399273 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
May 13 23:58:22.399700 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
May 13 23:58:22.400179 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
May 13 23:58:22.400292 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
May 13 23:58:22.421422 systemd[1]: network-cleanup.service: Deactivated successfully.
May 13 23:58:22.421538 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
May 13 23:58:22.427037 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
May 13 23:58:22.434408 systemd[1]: Starting initrd-switch-root.service - Switch Root...
May 13 23:58:22.456640 systemd[1]: Switching root.
May 13 23:58:22.507445 systemd-journald[177]: Journal stopped
May 13 23:58:38.199393 systemd-journald[177]: Received SIGTERM from PID 1 (systemd).
May 13 23:58:38.199439 kernel: SELinux: policy capability network_peer_controls=1
May 13 23:58:38.199460 kernel: SELinux: policy capability open_perms=1
May 13 23:58:38.199472 kernel: SELinux: policy capability extended_socket_class=1
May 13 23:58:38.199484 kernel: SELinux: policy capability always_check_network=0
May 13 23:58:38.199496 kernel: SELinux: policy capability cgroup_seclabel=1
May 13 23:58:38.199511 kernel: SELinux: policy capability nnp_nosuid_transition=1
May 13 23:58:38.199524 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
May 13 23:58:38.199540 kernel: SELinux: policy capability ioctl_skip_cloexec=0
May 13 23:58:38.199553 kernel: audit: type=1403 audit(1747180704.319:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
May 13 23:58:38.199567 systemd[1]: Successfully loaded SELinux policy in 663.176ms.
May 13 23:58:38.199582 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 10.258ms.
May 13 23:58:38.199598 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
May 13 23:58:38.199612 systemd[1]: Detected virtualization microsoft.
May 13 23:58:38.199630 systemd[1]: Detected architecture x86-64.
May 13 23:58:38.199645 systemd[1]: Detected first boot.
May 13 23:58:38.199661 systemd[1]: Hostname set to .
May 13 23:58:38.199675 systemd[1]: Initializing machine ID from random generator.
May 13 23:58:38.199691 zram_generator::config[1160]: No configuration found.
May 13 23:58:38.199710 kernel: Guest personality initialized and is inactive
May 13 23:58:38.199725 kernel: VMCI host device registered (name=vmci, major=10, minor=124)
May 13 23:58:38.199738 kernel: Initialized host personality
May 13 23:58:38.199752 kernel: NET: Registered PF_VSOCK protocol family
May 13 23:58:38.199766 systemd[1]: Populated /etc with preset unit settings.
May 13 23:58:38.199783 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
May 13 23:58:38.199796 systemd[1]: initrd-switch-root.service: Deactivated successfully.
May 13 23:58:38.199810 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
May 13 23:58:38.199825 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
May 13 23:58:38.199843 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
May 13 23:58:38.199859 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
May 13 23:58:38.199874 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
May 13 23:58:38.199889 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
May 13 23:58:38.199902 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
May 13 23:58:38.199918 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
May 13 23:58:38.199933 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
May 13 23:58:38.199951 systemd[1]: Created slice user.slice - User and Session Slice.
May 13 23:58:38.199967 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 13 23:58:38.199984 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 13 23:58:38.200000 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
May 13 23:58:38.200015 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
May 13 23:58:38.200035 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
May 13 23:58:38.200051 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
May 13 23:58:38.200066 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
May 13 23:58:38.200084 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 13 23:58:38.200099 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
May 13 23:58:38.200114 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
May 13 23:58:38.200129 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
May 13 23:58:38.200144 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
May 13 23:58:38.200158 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
May 13 23:58:38.200173 systemd[1]: Reached target remote-fs.target - Remote File Systems.
May 13 23:58:38.200192 systemd[1]: Reached target slices.target - Slice Units.
May 13 23:58:38.200206 systemd[1]: Reached target swap.target - Swaps.
May 13 23:58:38.200220 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
May 13 23:58:38.200235 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
May 13 23:58:38.200250 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
May 13 23:58:38.200269 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
May 13 23:58:38.200291 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
May 13 23:58:38.200322 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
May 13 23:58:38.200337 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
May 13 23:58:38.200352 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
May 13 23:58:38.200368 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
May 13 23:58:38.200383 systemd[1]: Mounting media.mount - External Media Directory...
May 13 23:58:38.200394 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 13 23:58:38.200409 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
May 13 23:58:38.200423 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
May 13 23:58:38.200436 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
May 13 23:58:38.200447 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
May 13 23:58:38.200460 systemd[1]: Reached target machines.target - Containers.
May 13 23:58:38.200472 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
May 13 23:58:38.200484 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 13 23:58:38.200497 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
May 13 23:58:38.200510 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
May 13 23:58:38.200524 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
May 13 23:58:38.200537 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
May 13 23:58:38.200547 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
May 13 23:58:38.200557 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
May 13 23:58:38.200567 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
May 13 23:58:38.200578 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
May 13 23:58:38.200588 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
May 13 23:58:38.200597 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
May 13 23:58:38.200610 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
May 13 23:58:38.200620 systemd[1]: Stopped systemd-fsck-usr.service.
May 13 23:58:38.200631 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 13 23:58:38.200641 systemd[1]: Starting systemd-journald.service - Journal Service...
May 13 23:58:38.200654 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
May 13 23:58:38.200665 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
May 13 23:58:38.200677 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
May 13 23:58:38.200688 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
May 13 23:58:38.200703 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
May 13 23:58:38.200713 systemd[1]: verity-setup.service: Deactivated successfully.
May 13 23:58:38.200726 systemd[1]: Stopped verity-setup.service.
May 13 23:58:38.200736 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 13 23:58:38.200749 kernel: loop: module loaded
May 13 23:58:38.200759 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
May 13 23:58:38.200769 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
May 13 23:58:38.200782 systemd[1]: Mounted media.mount - External Media Directory.
May 13 23:58:38.200794 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
May 13 23:58:38.200807 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
May 13 23:58:38.200817 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
May 13 23:58:38.200830 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
May 13 23:58:38.200840 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
May 13 23:58:38.200854 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
May 13 23:58:38.200864 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
May 13 23:58:38.200877 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
May 13 23:58:38.200917 systemd-journald[1243]: Collecting audit messages is disabled.
May 13 23:58:38.200943 systemd[1]: modprobe@loop.service: Deactivated successfully.
May 13 23:58:38.200955 kernel: fuse: init (API version 7.39)
May 13 23:58:38.200966 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
May 13 23:58:38.200981 systemd[1]: modprobe@configfs.service: Deactivated successfully.
May 13 23:58:38.200994 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
May 13 23:58:38.201005 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
May 13 23:58:38.201017 systemd-journald[1243]: Journal started
May 13 23:58:38.201039 systemd-journald[1243]: Runtime Journal (/run/log/journal/5d93007f6b434ed5b1dfbcbdcfb2d672) is 8M, max 158.7M, 150.7M free.
May 13 23:58:37.486877 systemd[1]: Queued start job for default target multi-user.target.
May 13 23:58:37.498167 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
May 13 23:58:37.498578 systemd[1]: systemd-journald.service: Deactivated successfully.
May 13 23:58:38.209655 systemd[1]: Started systemd-journald.service - Journal Service.
May 13 23:58:38.209701 kernel: ACPI: bus type drm_connector registered
May 13 23:58:38.213799 systemd[1]: modprobe@drm.service: Deactivated successfully.
May 13 23:58:38.214049 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
May 13 23:58:38.217006 systemd[1]: modprobe@fuse.service: Deactivated successfully.
May 13 23:58:38.217247 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
May 13 23:58:38.220716 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
May 13 23:58:38.224171 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
May 13 23:58:38.227915 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
May 13 23:58:38.231872 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
May 13 23:58:38.252322 systemd[1]: Reached target network-pre.target - Preparation for Network.
May 13 23:58:38.259408 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
May 13 23:58:38.267495 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
May 13 23:58:38.270405 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
May 13 23:58:38.270452 systemd[1]: Reached target local-fs.target - Local File Systems.
May 13 23:58:38.274494 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
May 13 23:58:38.286551 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
May 13 23:58:38.294412 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
May 13 23:58:38.297452 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 13 23:58:38.299284 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
May 13 23:58:38.303541 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
May 13 23:58:38.307810 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
May 13 23:58:38.309447 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
May 13 23:58:38.312408 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
May 13 23:58:38.313782 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
May 13 23:58:38.318784 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
May 13 23:58:38.341059 systemd-journald[1243]: Time spent on flushing to /var/log/journal/5d93007f6b434ed5b1dfbcbdcfb2d672 is 22.931ms for 967 entries.
May 13 23:58:38.341059 systemd-journald[1243]: System Journal (/var/log/journal/5d93007f6b434ed5b1dfbcbdcfb2d672) is 8M, max 2.6G, 2.6G free.
May 13 23:58:38.387313 systemd-journald[1243]: Received client request to flush runtime journal.
May 13 23:58:38.332523 systemd[1]: Starting systemd-sysusers.service - Create System Users...
May 13 23:58:38.337685 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
May 13 23:58:38.346266 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
May 13 23:58:38.349497 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
May 13 23:58:38.353290 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
May 13 23:58:38.360212 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
May 13 23:58:38.375278 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
May 13 23:58:38.389048 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
May 13 23:58:38.395289 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
May 13 23:58:38.401971 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
May 13 23:58:38.412318 kernel: loop0: detected capacity change from 0 to 109808
May 13 23:58:38.414894 udevadm[1313]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in.
May 13 23:58:38.426460 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
May 13 23:58:38.485436 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
May 13 23:58:38.500316 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
May 13 23:58:39.024842 systemd[1]: Finished systemd-sysusers.service - Create System Users.
May 13 23:58:39.028926 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
May 13 23:58:39.059493 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
May 13 23:58:39.084327 kernel: loop1: detected capacity change from 0 to 205544
May 13 23:58:39.118319 kernel: loop2: detected capacity change from 0 to 28424
May 13 23:58:39.182037 systemd-tmpfiles[1321]: ACLs are not supported, ignoring.
May 13 23:58:39.182060 systemd-tmpfiles[1321]: ACLs are not supported, ignoring.
May 13 23:58:39.186374 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
May 13 23:58:39.273323 kernel: loop3: detected capacity change from 0 to 151640
May 13 23:58:39.800513 kernel: loop4: detected capacity change from 0 to 109808
May 13 23:58:39.812320 kernel: loop5: detected capacity change from 0 to 205544
May 13 23:58:39.891440 kernel: loop6: detected capacity change from 0 to 28424
May 13 23:58:39.900326 kernel: loop7: detected capacity change from 0 to 151640
May 13 23:58:39.914322 (sd-merge)[1327]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-azure'.
May 13 23:58:39.914940 (sd-merge)[1327]: Merged extensions into '/usr'.
May 13 23:58:39.921381 systemd[1]: Reload requested from client PID 1302 ('systemd-sysext') (unit systemd-sysext.service)...
May 13 23:58:39.921670 systemd[1]: Reloading...
May 13 23:58:39.986472 zram_generator::config[1351]: No configuration found.
May 13 23:58:40.133230 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 13 23:58:40.215563 systemd[1]: Reloading finished in 293 ms.
May 13 23:58:40.237475 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
May 13 23:58:40.241436 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
May 13 23:58:40.252430 systemd[1]: Starting ensure-sysext.service...
May 13 23:58:40.255991 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
May 13 23:58:40.263105 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
May 13 23:58:40.292414 systemd[1]: Reload requested from client PID 1414 ('systemctl') (unit ensure-sysext.service)...
May 13 23:58:40.292434 systemd[1]: Reloading...
May 13 23:58:40.304993 systemd-tmpfiles[1415]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
May 13 23:58:40.307548 systemd-tmpfiles[1415]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
May 13 23:58:40.309753 systemd-tmpfiles[1415]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
May 13 23:58:40.310150 systemd-tmpfiles[1415]: ACLs are not supported, ignoring.
May 13 23:58:40.310239 systemd-tmpfiles[1415]: ACLs are not supported, ignoring.
May 13 23:58:40.319848 systemd-tmpfiles[1415]: Detected autofs mount point /boot during canonicalization of boot.
May 13 23:58:40.319866 systemd-tmpfiles[1415]: Skipping /boot
May 13 23:58:40.326849 systemd-udevd[1416]: Using default interface naming scheme 'v255'.
May 13 23:58:40.339988 systemd-tmpfiles[1415]: Detected autofs mount point /boot during canonicalization of boot.
May 13 23:58:40.340002 systemd-tmpfiles[1415]: Skipping /boot
May 13 23:58:40.400361 zram_generator::config[1443]: No configuration found.
May 13 23:58:40.661501 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 13 23:58:40.707330 kernel: mousedev: PS/2 mouse device common for all mice
May 13 23:58:40.716407 kernel: hv_vmbus: registering driver hyperv_fb
May 13 23:58:40.738364 kernel: hyperv_fb: Synthvid Version major 3, minor 5
May 13 23:58:40.746099 kernel: hyperv_fb: Screen resolution: 1024x768, Color depth: 32, Frame buffer size: 8388608
May 13 23:58:40.746176 kernel: hv_vmbus: registering driver hv_balloon
May 13 23:58:40.751786 kernel: Console: switching to colour dummy device 80x25
May 13 23:58:40.759061 kernel: Console: switching to colour frame buffer device 128x48
May 13 23:58:40.780358 kernel: hv_balloon: Using Dynamic Memory protocol version 2.0
May 13 23:58:40.831607 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
May 13 23:58:40.835760 systemd[1]: Reloading finished in 542 ms.
May 13 23:58:40.897337 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
May 13 23:58:40.901363 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
May 13 23:58:41.018699 systemd[1]: Starting audit-rules.service - Load Audit Rules...
May 13 23:58:41.028763 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
May 13 23:58:41.035755 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
May 13 23:58:41.043926 systemd[1]: Starting systemd-networkd.service - Network Configuration...
May 13 23:58:41.062119 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 39 scanned by (udev-worker) (1485)
May 13 23:58:41.061192 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
May 13 23:58:41.070602 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
May 13 23:58:41.084124 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 13 23:58:41.138048 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 13 23:58:41.138478 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 13 23:58:41.142709 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
May 13 23:58:41.157982 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
May 13 23:58:41.168065 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
May 13 23:58:41.171341 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 13 23:58:41.171523 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 13 23:58:41.171666 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 13 23:58:41.182424 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
May 13 23:58:41.188089 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
May 13 23:58:41.188385 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
May 13 23:58:41.191939 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
May 13 23:58:41.192613 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
May 13 23:58:41.196944 systemd[1]: modprobe@loop.service: Deactivated successfully.
May 13 23:58:41.197184 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
May 13 23:58:41.240666 augenrules[1621]: No rules
May 13 23:58:41.241426 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
May 13 23:58:41.244742 systemd[1]: audit-rules.service: Deactivated successfully.
May 13 23:58:41.245384 systemd[1]: Finished audit-rules.service - Load Audit Rules.
May 13 23:58:41.250077 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 13 23:58:41.251174 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 13 23:58:41.255053 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
May 13 23:58:41.266415 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
May 13 23:58:41.274020 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
May 13 23:58:41.283663 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
May 13 23:58:41.289737 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 13 23:58:41.290221 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 13 23:58:41.290491 systemd[1]: Reached target time-set.target - System Time Set.
May 13 23:58:41.302138 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
May 13 23:58:41.306519 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 13 23:58:41.311762 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
May 13 23:58:41.313386 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
May 13 23:58:41.318102 systemd[1]: modprobe@drm.service: Deactivated successfully.
May 13 23:58:41.319124 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
May 13 23:58:41.323655 kernel: kvm_intel: Using Hyper-V Enlightened VMCS
May 13 23:58:41.326241 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
May 13 23:58:41.326585 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
May 13 23:58:41.334332 systemd[1]: modprobe@loop.service: Deactivated successfully.
May 13 23:58:41.335537 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
May 13 23:58:41.363427 systemd[1]: Finished ensure-sysext.service.
May 13 23:58:41.437324 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM.
May 13 23:58:41.455748 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
May 13 23:58:41.456867 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
May 13 23:58:41.457209 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
May 13 23:58:41.457661 systemd[1]: Started systemd-userdbd.service - User Database Manager.
May 13 23:58:41.470635 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
May 13 23:58:41.477735 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
May 13 23:58:41.480765 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
May 13 23:58:41.482375 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
May 13 23:58:41.489612 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
May 13 23:58:41.494762 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 13 23:58:41.562562 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
May 13 23:58:41.569316 lvm[1663]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
May 13 23:58:41.608193 systemd-networkd[1559]: lo: Link UP
May 13 23:58:41.608204 systemd-networkd[1559]: lo: Gained carrier
May 13 23:58:41.608353 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
May 13 23:58:41.610424 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
May 13 23:58:41.615459 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
May 13 23:58:41.622543 systemd-networkd[1559]: Enumeration completed
May 13 23:58:41.625105 systemd[1]: Started systemd-networkd.service - Network Configuration.
May 13 23:58:41.625602 systemd-networkd[1559]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
May 13 23:58:41.625610 systemd-networkd[1559]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
May 13 23:58:41.629867 lvm[1674]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
May 13 23:58:41.630160 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
May 13 23:58:41.638648 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
May 13 23:58:41.663411 systemd-resolved[1560]: Positive Trust Anchors:
May 13 23:58:41.663425 systemd-resolved[1560]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
May 13 23:58:41.663484 systemd-resolved[1560]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
May 13 23:58:41.678464 systemd-resolved[1560]: Using system hostname 'ci-4284.0.0-n-c527831f7b'.
May 13 23:58:41.681065 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
May 13 23:58:41.692320 kernel: mlx5_core 8e9b:00:02.0 enP36507s1: Link up
May 13 23:58:41.716334 kernel: hv_netvsc 7ced8d2c-77b2-7ced-8d2c-77b27ced8d2c eth0: Data path switched to VF: enP36507s1
May 13 23:58:41.718840 systemd-networkd[1559]: enP36507s1: Link UP
May 13 23:58:41.719013 systemd-networkd[1559]: eth0: Link UP
May 13 23:58:41.719019 systemd-networkd[1559]: eth0: Gained carrier
May 13 23:58:41.719041 systemd-networkd[1559]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
May 13 23:58:41.719451 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
May 13 23:58:41.720592 systemd[1]: Reached target network.target - Network.
May 13 23:58:41.720910 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
May 13 23:58:41.723360 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
May 13 23:58:41.737248 systemd-networkd[1559]: enP36507s1: Gained carrier
May 13 23:58:41.780366 systemd-networkd[1559]: eth0: DHCPv4 address 10.200.8.5/24, gateway 10.200.8.1 acquired from 168.63.129.16
May 13 23:58:41.787017 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
May 13 23:58:41.788486 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
May 13 23:58:41.892104 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
May 13 23:58:42.964527 systemd-networkd[1559]: eth0: Gained IPv6LL
May 13 23:58:42.967669 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
May 13 23:58:42.971860 systemd[1]: Reached target network-online.target - Network is Online.
May 13 23:58:43.156602 systemd-networkd[1559]: enP36507s1: Gained IPv6LL
May 13 23:58:48.018687 ldconfig[1297]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
May 13 23:58:48.028846 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
May 13 23:58:48.033834 systemd[1]: Starting systemd-update-done.service - Update is Completed...
May 13 23:58:48.084743 systemd[1]: Finished systemd-update-done.service - Update is Completed.
May 13 23:58:48.087879 systemd[1]: Reached target sysinit.target - System Initialization.
May 13 23:58:48.090764 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
May 13 23:58:48.093851 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
May 13 23:58:48.097047 systemd[1]: Started logrotate.timer - Daily rotation of log files.
May 13 23:58:48.099778 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
May 13 23:58:48.102801 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
May 13 23:58:48.105869 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
May 13 23:58:48.105913 systemd[1]: Reached target paths.target - Path Units.
May 13 23:58:48.108201 systemd[1]: Reached target timers.target - Timer Units.
May 13 23:58:48.111125 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
May 13 23:58:48.115462 systemd[1]: Starting docker.socket - Docker Socket for the API...
May 13 23:58:48.120660 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
May 13 23:58:48.124498 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
May 13 23:58:48.127498 systemd[1]: Reached target ssh-access.target - SSH Access Available.
May 13 23:58:48.132129 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
May 13 23:58:48.135279 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
May 13 23:58:48.139029 systemd[1]: Listening on docker.socket - Docker Socket for the API.
May 13 23:58:48.141711 systemd[1]: Reached target sockets.target - Socket Units.
May 13 23:58:48.144150 systemd[1]: Reached target basic.target - Basic System.
May 13 23:58:48.146492 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
May 13 23:58:48.146533 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
May 13 23:58:48.148785 systemd[1]: Starting chronyd.service - NTP client/server...
May 13 23:58:48.154392 systemd[1]: Starting containerd.service - containerd container runtime...
May 13 23:58:48.159648 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
May 13 23:58:48.171458 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
May 13 23:58:48.179388 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
May 13 23:58:48.188559 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
May 13 23:58:48.190981 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
May 13 23:58:48.191039 systemd[1]: hv_fcopy_daemon.service - Hyper-V FCOPY daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/vmbus/hv_fcopy).
May 13 23:58:48.194505 systemd[1]: Started hv_kvp_daemon.service - Hyper-V KVP daemon.
May 13 23:58:48.197208 systemd[1]: hv_vss_daemon.service - Hyper-V VSS daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/vmbus/hv_vss).
May 13 23:58:48.202447 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 23:58:48.206538 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
May 13 23:58:48.216273 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
May 13 23:58:48.217241 jq[1696]: false
May 13 23:58:48.223407 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
May 13 23:58:48.229023 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
May 13 23:58:48.234006 KVP[1698]: KVP starting; pid is:1698
May 13 23:58:48.241432 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
May 13 23:58:48.243908 kernel: hv_utils: KVP IC version 4.0
May 13 23:58:48.244539 KVP[1698]: KVP LIC Version: 3.1
May 13 23:58:48.250018 (chronyd)[1689]: chronyd.service: Referenced but unset environment variable evaluates to an empty string: OPTIONS
May 13 23:58:48.256445 systemd[1]: Starting systemd-logind.service - User Login Management...
May 13 23:58:48.264133 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
May 13 23:58:48.264804 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
May 13 23:58:48.268649 systemd[1]: Starting update-engine.service - Update Engine...
May 13 23:58:48.274485 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
May 13 23:58:48.277741 chronyd[1710]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +ASYNCDNS +NTS +SECHASH +IPV6 -DEBUG)
May 13 23:58:48.283377 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
May 13 23:58:48.283668 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
May 13 23:58:48.297568 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
May 13 23:58:48.297823 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
May 13 23:58:48.315359 chronyd[1710]: Timezone right/UTC failed leap second check, ignoring
May 13 23:58:48.315556 chronyd[1710]: Loaded seccomp filter (level 2)
May 13 23:58:48.317212 systemd[1]: Started chronyd.service - NTP client/server.
May 13 23:58:48.324662 jq[1712]: true
May 13 23:58:48.328517 extend-filesystems[1697]: Found loop4
May 13 23:58:48.328517 extend-filesystems[1697]: Found loop5
May 13 23:58:48.328517 extend-filesystems[1697]: Found loop6
May 13 23:58:48.328517 extend-filesystems[1697]: Found loop7
May 13 23:58:48.328517 extend-filesystems[1697]: Found sda
May 13 23:58:48.328517 extend-filesystems[1697]: Found sda1
May 13 23:58:48.328517 extend-filesystems[1697]: Found sda2
May 13 23:58:48.328517 extend-filesystems[1697]: Found sda3
May 13 23:58:48.328517 extend-filesystems[1697]: Found usr
May 13 23:58:48.328517 extend-filesystems[1697]: Found sda4
May 13 23:58:48.328517 extend-filesystems[1697]: Found sda6
May 13 23:58:48.328517 extend-filesystems[1697]: Found sda7
May 13 23:58:48.328517 extend-filesystems[1697]: Found sda9
May 13 23:58:48.328517 extend-filesystems[1697]: Checking size of /dev/sda9
May 13 23:58:48.412396 extend-filesystems[1697]: Old size kept for /dev/sda9
May 13 23:58:48.412396 extend-filesystems[1697]: Found sr0
May 13 23:58:48.334956 systemd[1]: motdgen.service: Deactivated successfully.
May 13 23:58:48.435471 update_engine[1707]: I20250513 23:58:48.422379 1707 main.cc:92] Flatcar Update Engine starting
May 13 23:58:48.435772 jq[1729]: true
May 13 23:58:48.337863 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
May 13 23:58:48.377512 systemd[1]: extend-filesystems.service: Deactivated successfully.
May 13 23:58:48.377806 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
May 13 23:58:48.389738 (ntainerd)[1735]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
May 13 23:58:48.441496 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
May 13 23:58:48.449880 dbus-daemon[1692]: [system] SELinux support is enabled
May 13 23:58:48.450394 systemd[1]: Started dbus.service - D-Bus System Message Bus.
May 13 23:58:48.461552 tar[1720]: linux-amd64/helm
May 13 23:58:48.462688 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
May 13 23:58:48.462732 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
May 13 23:58:48.467647 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
May 13 23:58:48.467674 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
May 13 23:58:48.487480 systemd[1]: Started update-engine.service - Update Engine.
May 13 23:58:48.490553 update_engine[1707]: I20250513 23:58:48.490494 1707 update_check_scheduler.cc:74] Next update check in 10m9s
May 13 23:58:48.494522 systemd[1]: Started locksmithd.service - Cluster reboot manager.
May 13 23:58:48.584031 sshd_keygen[1730]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
May 13 23:58:48.641827 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 39 scanned by (udev-worker) (1776)
May 13 23:58:48.669380 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
May 13 23:58:48.698467 systemd-logind[1706]: New seat seat0.
May 13 23:58:48.707654 systemd[1]: Starting issuegen.service - Generate /run/issue...
May 13 23:58:48.710687 systemd-logind[1706]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
May 13 23:58:48.713430 systemd[1]: Starting waagent.service - Microsoft Azure Linux Agent...
May 13 23:58:48.720424 systemd[1]: Started systemd-logind.service - User Login Management.
May 13 23:58:48.791374 bash[1774]: Updated "/home/core/.ssh/authorized_keys"
May 13 23:58:48.793729 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
May 13 23:58:48.805734 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
May 13 23:58:48.811716 systemd[1]: issuegen.service: Deactivated successfully.
May 13 23:58:48.813161 systemd[1]: Finished issuegen.service - Generate /run/issue.
May 13 23:58:48.822880 coreos-metadata[1691]: May 13 23:58:48.822 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1
May 13 23:58:48.837700 coreos-metadata[1691]: May 13 23:58:48.835 INFO Fetch successful
May 13 23:58:48.837700 coreos-metadata[1691]: May 13 23:58:48.837 INFO Fetching http://168.63.129.16/machine/?comp=goalstate: Attempt #1
May 13 23:58:48.842610 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
May 13 23:58:48.847624 coreos-metadata[1691]: May 13 23:58:48.847 INFO Fetch successful
May 13 23:58:48.848138 coreos-metadata[1691]: May 13 23:58:48.848 INFO Fetching http://168.63.129.16/machine/6c7b7f2e-f326-4d4c-9c56-269c0800e589/222925a7%2Ddaea%2D4d4a%2D9dc4%2Df107f95180a2.%5Fci%2D4284.0.0%2Dn%2Dc527831f7b?comp=config&type=sharedConfig&incarnation=1: Attempt #1
May 13 23:58:48.850392 coreos-metadata[1691]: May 13 23:58:48.850 INFO Fetch successful
May 13 23:58:48.851340 coreos-metadata[1691]: May 13 23:58:48.851 INFO Fetching http://169.254.169.254/metadata/instance/compute/vmSize?api-version=2017-08-01&format=text: Attempt #1
May 13 23:58:48.869217 systemd[1]: Started waagent.service - Microsoft Azure Linux Agent.
May 13 23:58:48.871938 coreos-metadata[1691]: May 13 23:58:48.871 INFO Fetch successful
May 13 23:58:48.926582 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
May 13 23:58:48.930070 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
May 13 23:58:48.941119 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
May 13 23:58:48.948493 systemd[1]: Started getty@tty1.service - Getty on tty1.
May 13 23:58:48.957235 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
May 13 23:58:48.962731 systemd[1]: Reached target getty.target - Login Prompts.
May 13 23:58:48.977747 locksmithd[1763]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
May 13 23:58:49.277416 tar[1720]: linux-amd64/LICENSE
May 13 23:58:49.278347 tar[1720]: linux-amd64/README.md
May 13 23:58:49.298257 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
May 13 23:58:49.692959 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 23:58:49.702640 (kubelet)[1873]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 13 23:58:50.171462 kubelet[1873]: E0513 23:58:50.171348 1873 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 13 23:58:50.173753 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 13 23:58:50.173946 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 13 23:58:50.174391 systemd[1]: kubelet.service: Consumed 878ms CPU time, 237.4M memory peak.
May 13 23:58:50.628123 containerd[1735]: time="2025-05-13T23:58:50Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
May 13 23:58:50.629382 containerd[1735]: time="2025-05-13T23:58:50.629336500Z" level=info msg="starting containerd" revision=88aa2f531d6c2922003cc7929e51daf1c14caa0a version=v2.0.1
May 13 23:58:50.639623 containerd[1735]: time="2025-05-13T23:58:50.639585700Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="5.5µs"
May 13 23:58:50.639623 containerd[1735]: time="2025-05-13T23:58:50.639611300Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
May 13 23:58:50.639749 containerd[1735]: time="2025-05-13T23:58:50.639633500Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
May 13 23:58:50.639819 containerd[1735]: time="2025-05-13T23:58:50.639789100Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
May 13 23:58:50.639860 containerd[1735]: time="2025-05-13T23:58:50.639815700Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
May 13 23:58:50.639860 containerd[1735]: time="2025-05-13T23:58:50.639847500Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
May 13 23:58:50.639939 containerd[1735]: time="2025-05-13T23:58:50.639917000Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
May 13 23:58:50.639939 containerd[1735]: time="2025-05-13T23:58:50.639933100Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
May 13 23:58:50.640179 containerd[1735]: time="2025-05-13T23:58:50.640147300Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
May 13 23:58:50.640179 containerd[1735]: time="2025-05-13T23:58:50.640168300Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
May 13 23:58:50.640262 containerd[1735]: time="2025-05-13T23:58:50.640183400Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
May 13 23:58:50.640262 containerd[1735]: time="2025-05-13T23:58:50.640194600Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
May 13 23:58:50.640366 containerd[1735]: time="2025-05-13T23:58:50.640321200Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
May 13 23:58:50.640576 containerd[1735]: time="2025-05-13T23:58:50.640543100Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
May 13 23:58:50.640637 containerd[1735]: time="2025-05-13T23:58:50.640581800Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
May 13 23:58:50.640637 containerd[1735]: time="2025-05-13T23:58:50.640608600Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
May 13 23:58:50.640715 containerd[1735]: time="2025-05-13T23:58:50.640647400Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
May 13 23:58:50.640957 containerd[1735]: time="2025-05-13T23:58:50.640928300Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
May 13 23:58:50.641030 containerd[1735]: time="2025-05-13T23:58:50.641008000Z" level=info msg="metadata content store policy set" policy=shared
May 13 23:58:50.654967 containerd[1735]: time="2025-05-13T23:58:50.654930400Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
May 13 23:58:50.655050 containerd[1735]: time="2025-05-13T23:58:50.654991800Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
May 13 23:58:50.655050 containerd[1735]: time="2025-05-13T23:58:50.655016000Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
May 13 23:58:50.655050 containerd[1735]: time="2025-05-13T23:58:50.655034700Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
May 13 23:58:50.655160 containerd[1735]: time="2025-05-13T23:58:50.655049800Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
May 13 23:58:50.655160 containerd[1735]: time="2025-05-13T23:58:50.655064900Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
May 13 23:58:50.655160 containerd[1735]: time="2025-05-13T23:58:50.655081800Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
May 13 23:58:50.655160 containerd[1735]: time="2025-05-13T23:58:50.655098900Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
May 13 23:58:50.655160 containerd[1735]: time="2025-05-13T23:58:50.655113900Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
May 13 23:58:50.655160 containerd[1735]: time="2025-05-13T23:58:50.655129200Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
May 13 23:58:50.655160 containerd[1735]: time="2025-05-13T23:58:50.655143200Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
May 13 23:58:50.655160 containerd[1735]: time="2025-05-13T23:58:50.655159000Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
May 13 23:58:50.655501 containerd[1735]: time="2025-05-13T23:58:50.655333100Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
May 13 23:58:50.655501 containerd[1735]: time="2025-05-13T23:58:50.655362900Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
May 13 23:58:50.655501 containerd[1735]: time="2025-05-13T23:58:50.655379500Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
May 13 23:58:50.655501 containerd[1735]: time="2025-05-13T23:58:50.655394400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
May 13 23:58:50.655501 containerd[1735]: time="2025-05-13T23:58:50.655410200Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
May 13 23:58:50.655501 containerd[1735]: time="2025-05-13T23:58:50.655424400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
May 13 23:58:50.655501 containerd[1735]: time="2025-05-13T23:58:50.655440800Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
May 13 23:58:50.655501 containerd[1735]: time="2025-05-13T23:58:50.655463100Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
May 13 23:58:50.655501 containerd[1735]: time="2025-05-13T23:58:50.655482600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
May 13 23:58:50.655501 containerd[1735]: time="2025-05-13T23:58:50.655499800Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
May 13 23:58:50.655885 containerd[1735]: time="2025-05-13T23:58:50.655514100Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
May 13 23:58:50.655885 containerd[1735]: time="2025-05-13T23:58:50.655584000Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
May 13 23:58:50.655885 containerd[1735]: time="2025-05-13T23:58:50.655600700Z" level=info msg="Start snapshots syncer"
May 13 23:58:50.655885 containerd[1735]: time="2025-05-13T23:58:50.655631500Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
May 13 23:58:50.656018 containerd[1735]: time="2025-05-13T23:58:50.655945500Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMSco
reAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" May 13 23:58:50.656179 containerd[1735]: time="2025-05-13T23:58:50.656017200Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 May 13 23:58:50.656179 containerd[1735]: time="2025-05-13T23:58:50.656110700Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 May 13 23:58:50.656268 containerd[1735]: time="2025-05-13T23:58:50.656244800Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 May 13 23:58:50.656322 containerd[1735]: time="2025-05-13T23:58:50.656275200Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 May 13 23:58:50.656322 containerd[1735]: time="2025-05-13T23:58:50.656291600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 May 13 23:58:50.656399 containerd[1735]: time="2025-05-13T23:58:50.656341400Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 May 13 23:58:50.656399 containerd[1735]: time="2025-05-13T23:58:50.656367800Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 May 13 
23:58:50.656399 containerd[1735]: time="2025-05-13T23:58:50.656385000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 May 13 23:58:50.656493 containerd[1735]: time="2025-05-13T23:58:50.656400400Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 May 13 23:58:50.656493 containerd[1735]: time="2025-05-13T23:58:50.656434900Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 May 13 23:58:50.656493 containerd[1735]: time="2025-05-13T23:58:50.656463300Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 May 13 23:58:50.656493 containerd[1735]: time="2025-05-13T23:58:50.656479400Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 May 13 23:58:50.656625 containerd[1735]: time="2025-05-13T23:58:50.656528400Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 13 23:58:50.656625 containerd[1735]: time="2025-05-13T23:58:50.656547900Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 13 23:58:50.656625 containerd[1735]: time="2025-05-13T23:58:50.656561200Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 13 23:58:50.656625 containerd[1735]: time="2025-05-13T23:58:50.656574300Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 13 23:58:50.656625 containerd[1735]: time="2025-05-13T23:58:50.656585900Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 May 13 23:58:50.656625 containerd[1735]: time="2025-05-13T23:58:50.656604200Z" 
level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 May 13 23:58:50.656625 containerd[1735]: time="2025-05-13T23:58:50.656620100Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 May 13 23:58:50.656850 containerd[1735]: time="2025-05-13T23:58:50.656640400Z" level=info msg="runtime interface created" May 13 23:58:50.656850 containerd[1735]: time="2025-05-13T23:58:50.656648300Z" level=info msg="created NRI interface" May 13 23:58:50.656850 containerd[1735]: time="2025-05-13T23:58:50.656670300Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 May 13 23:58:50.656850 containerd[1735]: time="2025-05-13T23:58:50.656687800Z" level=info msg="Connect containerd service" May 13 23:58:50.656850 containerd[1735]: time="2025-05-13T23:58:50.656720100Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" May 13 23:58:50.657463 containerd[1735]: time="2025-05-13T23:58:50.657433400Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 13 23:58:52.305883 containerd[1735]: time="2025-05-13T23:58:52.305323600Z" level=info msg="Start subscribing containerd event" May 13 23:58:52.305883 containerd[1735]: time="2025-05-13T23:58:52.305393400Z" level=info msg="Start recovering state" May 13 23:58:52.305883 containerd[1735]: time="2025-05-13T23:58:52.305564500Z" level=info msg=serving... 
address=/run/containerd/containerd.sock.ttrpc May 13 23:58:52.305883 containerd[1735]: time="2025-05-13T23:58:52.305574600Z" level=info msg="Start event monitor" May 13 23:58:52.305883 containerd[1735]: time="2025-05-13T23:58:52.305624200Z" level=info msg="Start cni network conf syncer for default" May 13 23:58:52.305883 containerd[1735]: time="2025-05-13T23:58:52.305641500Z" level=info msg="Start streaming server" May 13 23:58:52.305883 containerd[1735]: time="2025-05-13T23:58:52.305657300Z" level=info msg="Registered namespace \"k8s.io\" with NRI" May 13 23:58:52.305883 containerd[1735]: time="2025-05-13T23:58:52.305667500Z" level=info msg="runtime interface starting up..." May 13 23:58:52.305883 containerd[1735]: time="2025-05-13T23:58:52.305680500Z" level=info msg="starting plugins..." May 13 23:58:52.305883 containerd[1735]: time="2025-05-13T23:58:52.305699900Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" May 13 23:58:52.305883 containerd[1735]: time="2025-05-13T23:58:52.305631200Z" level=info msg=serving... address=/run/containerd/containerd.sock May 13 23:58:52.305883 containerd[1735]: time="2025-05-13T23:58:52.305832500Z" level=info msg="containerd successfully booted in 1.678320s" May 13 23:58:52.306426 systemd[1]: Started containerd.service - containerd container runtime. May 13 23:58:52.310812 systemd[1]: Reached target multi-user.target - Multi-User System. May 13 23:58:52.316095 systemd[1]: Startup finished in 592ms (firmware) + 49.661s (loader) + 967ms (kernel) + 18.498s (initrd) + 28.657s (userspace) = 1min 38.378s. 
May 13 23:58:52.880564 waagent[1850]: 2025-05-13T23:58:52.880472Z INFO Daemon Daemon Azure Linux Agent Version: 2.12.0.4
May 13 23:58:52.883648 waagent[1850]: 2025-05-13T23:58:52.883593Z INFO Daemon Daemon OS: flatcar 4284.0.0
May 13 23:58:52.885924 waagent[1850]: 2025-05-13T23:58:52.885876Z INFO Daemon Daemon Python: 3.11.11
May 13 23:58:52.888252 waagent[1850]: 2025-05-13T23:58:52.888193Z INFO Daemon Daemon Run daemon
May 13 23:58:52.890323 waagent[1850]: 2025-05-13T23:58:52.890205Z INFO Daemon Daemon No RDMA handler exists for distro='Flatcar Container Linux by Kinvolk' version='4284.0.0'
May 13 23:58:52.896442 waagent[1850]: 2025-05-13T23:58:52.894471Z INFO Daemon Daemon Using waagent for provisioning
May 13 23:58:52.897172 waagent[1850]: 2025-05-13T23:58:52.897123Z INFO Daemon Daemon Activate resource disk
May 13 23:58:52.900580 waagent[1850]: 2025-05-13T23:58:52.899537Z INFO Daemon Daemon Searching gen1 prefix 00000000-0001 or gen2 f8b3781a-1e82-4818-a1c3-63d806ec15bb
May 13 23:58:52.904526 login[1858]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0)
May 13 23:58:52.907162 login[1859]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0)
May 13 23:58:52.912776 waagent[1850]: 2025-05-13T23:58:52.911944Z INFO Daemon Daemon Found device: None
May 13 23:58:52.913244 waagent[1850]: 2025-05-13T23:58:52.913210Z ERROR Daemon Daemon Failed to mount resource disk [ResourceDiskError] unable to detect disk topology
May 13 23:58:52.914166 waagent[1850]: 2025-05-13T23:58:52.914137Z ERROR Daemon Daemon Event: name=WALinuxAgent, op=ActivateResourceDisk, message=[ResourceDiskError] unable to detect disk topology, duration=0
May 13 23:58:52.915510 waagent[1850]: 2025-05-13T23:58:52.915476Z INFO Daemon Daemon Clean protocol and wireserver endpoint
May 13 23:58:52.921030 waagent[1850]: 2025-05-13T23:58:52.920996Z INFO Daemon Daemon Running default provisioning handler
May 13 23:58:52.942205 waagent[1850]: 2025-05-13T23:58:52.942149Z INFO Daemon Daemon Unable to get cloud-init enabled status from systemctl: Command '['systemctl', 'is-enabled', 'cloud-init-local.service']' returned non-zero exit status 4.
May 13 23:58:52.943917 waagent[1850]: 2025-05-13T23:58:52.943878Z INFO Daemon Daemon Unable to get cloud-init enabled status from service: [Errno 2] No such file or directory: 'service'
May 13 23:58:52.945178 waagent[1850]: 2025-05-13T23:58:52.945144Z INFO Daemon Daemon cloud-init is enabled: False
May 13 23:58:52.946008 waagent[1850]: 2025-05-13T23:58:52.945979Z INFO Daemon Daemon Copying ovf-env.xml
May 13 23:58:52.958121 systemd-logind[1706]: New session 1 of user core.
May 13 23:58:52.959843 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
May 13 23:58:52.961323 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
May 13 23:58:52.965307 systemd-logind[1706]: New session 2 of user core.
May 13 23:58:52.980385 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
May 13 23:58:52.982922 systemd[1]: Starting user@500.service - User Manager for UID 500...
May 13 23:58:53.034578 (systemd)[1911]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
May 13 23:58:53.037802 systemd-logind[1706]: New session c1 of user core.
May 13 23:58:53.107359 waagent[1850]: 2025-05-13T23:58:53.107253Z INFO Daemon Daemon Successfully mounted dvd
May 13 23:58:53.120882 systemd[1]: mnt-cdrom-secure.mount: Deactivated successfully.
May 13 23:58:53.128763 waagent[1850]: 2025-05-13T23:58:53.121271Z INFO Daemon Daemon Detect protocol endpoint
May 13 23:58:53.128763 waagent[1850]: 2025-05-13T23:58:53.122571Z INFO Daemon Daemon Clean protocol and wireserver endpoint
May 13 23:58:53.128763 waagent[1850]: 2025-05-13T23:58:53.123429Z INFO Daemon Daemon WireServer endpoint is not found. Rerun dhcp handler
May 13 23:58:53.128763 waagent[1850]: 2025-05-13T23:58:53.123826Z INFO Daemon Daemon Test for route to 168.63.129.16
May 13 23:58:53.128763 waagent[1850]: 2025-05-13T23:58:53.124370Z INFO Daemon Daemon Route to 168.63.129.16 exists
May 13 23:58:53.128763 waagent[1850]: 2025-05-13T23:58:53.124677Z INFO Daemon Daemon Wire server endpoint:168.63.129.16
May 13 23:58:53.138517 waagent[1850]: 2025-05-13T23:58:53.136538Z INFO Daemon Daemon Fabric preferred wire protocol version:2015-04-05
May 13 23:58:53.138517 waagent[1850]: 2025-05-13T23:58:53.137345Z INFO Daemon Daemon Wire protocol version:2012-11-30
May 13 23:58:53.138517 waagent[1850]: 2025-05-13T23:58:53.138034Z INFO Daemon Daemon Server preferred version:2015-04-05
May 13 23:58:53.315434 waagent[1850]: 2025-05-13T23:58:53.315345Z INFO Daemon Daemon Initializing goal state during protocol detection
May 13 23:58:53.321202 waagent[1850]: 2025-05-13T23:58:53.316713Z INFO Daemon Daemon Forcing an update of the goal state.
May 13 23:58:53.323384 waagent[1850]: 2025-05-13T23:58:53.323333Z INFO Daemon Fetched a new incarnation for the WireServer goal state [incarnation 1]
May 13 23:58:53.366849 waagent[1850]: 2025-05-13T23:58:53.366783Z INFO Daemon Daemon HostGAPlugin version: 1.0.8.164
May 13 23:58:53.370322 waagent[1850]: 2025-05-13T23:58:53.368452Z INFO Daemon
May 13 23:58:53.370322 waagent[1850]: 2025-05-13T23:58:53.369982Z INFO Daemon Fetched new vmSettings [HostGAPlugin correlation ID: e9df0c43-67c9-4fc1-a731-bd4da38af1b5 eTag: 12946018005606401728 source: Fabric]
May 13 23:58:53.384411 waagent[1850]: 2025-05-13T23:58:53.371614Z INFO Daemon The vmSettings originated via Fabric; will ignore them.
May 13 23:58:53.384411 waagent[1850]: 2025-05-13T23:58:53.372461Z INFO Daemon
May 13 23:58:53.384411 waagent[1850]: 2025-05-13T23:58:53.373084Z INFO Daemon Fetching full goal state from the WireServer [incarnation 1]
May 13 23:58:53.384411 waagent[1850]: 2025-05-13T23:58:53.377976Z INFO Daemon Daemon Downloading artifacts profile blob
May 13 23:58:53.473989 waagent[1850]: 2025-05-13T23:58:53.473918Z INFO Daemon Downloaded certificate {'thumbprint': '8ADCCCA9FA1CDF9A34C372A48F4AE2AF4B6D02A4', 'hasPrivateKey': True}
May 13 23:58:53.482117 waagent[1850]: 2025-05-13T23:58:53.482066Z INFO Daemon Downloaded certificate {'thumbprint': '7852FEBB06C265E0E96D2F3D8E52A7B4CF0CCF29', 'hasPrivateKey': False}
May 13 23:58:53.489751 waagent[1850]: 2025-05-13T23:58:53.486848Z INFO Daemon Fetch goal state completed
May 13 23:58:53.500065 waagent[1850]: 2025-05-13T23:58:53.495753Z INFO Daemon Daemon Starting provisioning
May 13 23:58:53.500065 waagent[1850]: 2025-05-13T23:58:53.498134Z INFO Daemon Daemon Handle ovf-env.xml.
May 13 23:58:53.500407 waagent[1850]: 2025-05-13T23:58:53.500354Z INFO Daemon Daemon Set hostname [ci-4284.0.0-n-c527831f7b]
May 13 23:58:53.511658 waagent[1850]: 2025-05-13T23:58:53.511619Z INFO Daemon Daemon Publish hostname [ci-4284.0.0-n-c527831f7b]
May 13 23:58:53.514850 waagent[1850]: 2025-05-13T23:58:53.514805Z INFO Daemon Daemon Examine /proc/net/route for primary interface
May 13 23:58:53.517936 waagent[1850]: 2025-05-13T23:58:53.517896Z INFO Daemon Daemon Primary interface is [eth0]
May 13 23:58:53.531400 systemd-networkd[1559]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
May 13 23:58:53.532329 systemd-networkd[1559]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
May 13 23:58:53.532379 systemd-networkd[1559]: eth0: DHCP lease lost
May 13 23:58:53.533025 waagent[1850]: 2025-05-13T23:58:53.532980Z INFO Daemon Daemon Create user account if not exists
May 13 23:58:53.535705 waagent[1850]: 2025-05-13T23:58:53.535665Z INFO Daemon Daemon User core already exists, skip useradd
May 13 23:58:53.538490 waagent[1850]: 2025-05-13T23:58:53.538448Z INFO Daemon Daemon Configure sudoer
May 13 23:58:53.541411 waagent[1850]: 2025-05-13T23:58:53.541369Z INFO Daemon Daemon Configure sshd
May 13 23:58:53.543627 waagent[1850]: 2025-05-13T23:58:53.543567Z INFO Daemon Daemon Added a configuration snippet disabling SSH password-based authentication methods. It also configures SSH client probing to keep connections alive.
May 13 23:58:53.549283 waagent[1850]: 2025-05-13T23:58:53.549243Z INFO Daemon Daemon Deploy ssh public key.
May 13 23:58:53.574270 systemd[1911]: Queued start job for default target default.target.
May 13 23:58:53.574596 systemd-networkd[1559]: eth0: DHCPv4 address 10.200.8.5/24, gateway 10.200.8.1 acquired from 168.63.129.16
May 13 23:58:53.580422 systemd[1911]: Created slice app.slice - User Application Slice.
May 13 23:58:53.580457 systemd[1911]: Reached target paths.target - Paths.
May 13 23:58:53.580510 systemd[1911]: Reached target timers.target - Timers.
May 13 23:58:53.582420 systemd[1911]: Starting dbus.socket - D-Bus User Message Bus Socket...
May 13 23:58:53.593274 systemd[1911]: Listening on dbus.socket - D-Bus User Message Bus Socket.
May 13 23:58:53.595170 systemd[1911]: Reached target sockets.target - Sockets.
May 13 23:58:53.595467 systemd[1911]: Reached target basic.target - Basic System.
May 13 23:58:53.595520 systemd[1911]: Reached target default.target - Main User Target.
May 13 23:58:53.595551 systemd[1911]: Startup finished in 548ms.
May 13 23:58:53.595937 systemd[1]: Started user@500.service - User Manager for UID 500.
May 13 23:58:53.600479 systemd[1]: Started session-1.scope - Session 1 of User core.
May 13 23:58:53.601423 systemd[1]: Started session-2.scope - Session 2 of User core.
May 13 23:58:54.651189 waagent[1850]: 2025-05-13T23:58:54.651119Z INFO Daemon Daemon Provisioning complete
May 13 23:58:54.665188 waagent[1850]: 2025-05-13T23:58:54.665137Z INFO Daemon Daemon RDMA capabilities are not enabled, skipping
May 13 23:58:54.668319 waagent[1850]: 2025-05-13T23:58:54.668261Z INFO Daemon Daemon End of log to /dev/console. The agent will now check for updates and then will process extensions.
May 13 23:58:54.672446 waagent[1850]: 2025-05-13T23:58:54.672398Z INFO Daemon Daemon Installed Agent WALinuxAgent-2.12.0.4 is the most current agent
May 13 23:58:54.797498 waagent[1955]: 2025-05-13T23:58:54.797386Z INFO ExtHandler ExtHandler Azure Linux Agent (Goal State Agent version 2.12.0.4)
May 13 23:58:54.797900 waagent[1955]: 2025-05-13T23:58:54.797550Z INFO ExtHandler ExtHandler OS: flatcar 4284.0.0
May 13 23:58:54.797900 waagent[1955]: 2025-05-13T23:58:54.797622Z INFO ExtHandler ExtHandler Python: 3.11.11
May 13 23:58:54.797900 waagent[1955]: 2025-05-13T23:58:54.797693Z INFO ExtHandler ExtHandler CPU Arch: x86_64
May 13 23:58:57.432339 waagent[1955]: 2025-05-13T23:58:57.430718Z INFO ExtHandler ExtHandler Distro: flatcar-4284.0.0; OSUtil: FlatcarUtil; AgentService: waagent; Python: 3.11.11; Arch: x86_64; systemd: True; LISDrivers: Absent; logrotate: logrotate 3.20.1;
May 13 23:58:57.432339 waagent[1955]: 2025-05-13T23:58:57.431073Z INFO ExtHandler ExtHandler WireServer endpoint 168.63.129.16 read from file
May 13 23:58:57.432339 waagent[1955]: 2025-05-13T23:58:57.431200Z INFO ExtHandler ExtHandler Wire server endpoint:168.63.129.16
May 13 23:58:57.440024 waagent[1955]: 2025-05-13T23:58:57.439962Z INFO ExtHandler Fetched a new incarnation for the WireServer goal state [incarnation 1]
May 13 23:58:57.445958 waagent[1955]: 2025-05-13T23:58:57.445913Z INFO ExtHandler ExtHandler HostGAPlugin version: 1.0.8.164
May 13 23:58:57.446405 waagent[1955]: 2025-05-13T23:58:57.446358Z INFO ExtHandler
May 13 23:58:57.446497 waagent[1955]: 2025-05-13T23:58:57.446442Z INFO ExtHandler Fetched new vmSettings [HostGAPlugin correlation ID: 58cb2588-aa59-4612-8234-c9df9cbe2a28 eTag: 12946018005606401728 source: Fabric]
May 13 23:58:57.446786 waagent[1955]: 2025-05-13T23:58:57.446740Z INFO ExtHandler The vmSettings originated via Fabric; will ignore them.
May 13 23:58:57.447289 waagent[1955]: 2025-05-13T23:58:57.447242Z INFO ExtHandler
May 13 23:58:57.447366 waagent[1955]: 2025-05-13T23:58:57.447326Z INFO ExtHandler Fetching full goal state from the WireServer [incarnation 1]
May 13 23:58:57.450909 waagent[1955]: 2025-05-13T23:58:57.450871Z INFO ExtHandler ExtHandler Downloading artifacts profile blob
May 13 23:58:57.518733 waagent[1955]: 2025-05-13T23:58:57.518575Z INFO ExtHandler Downloaded certificate {'thumbprint': '8ADCCCA9FA1CDF9A34C372A48F4AE2AF4B6D02A4', 'hasPrivateKey': True}
May 13 23:58:57.519407 waagent[1955]: 2025-05-13T23:58:57.519359Z INFO ExtHandler Downloaded certificate {'thumbprint': '7852FEBB06C265E0E96D2F3D8E52A7B4CF0CCF29', 'hasPrivateKey': False}
May 13 23:58:57.519834 waagent[1955]: 2025-05-13T23:58:57.519790Z INFO ExtHandler Fetch goal state completed
May 13 23:58:57.537425 waagent[1955]: 2025-05-13T23:58:57.537376Z INFO ExtHandler ExtHandler OpenSSL version: OpenSSL 3.3.3 11 Feb 2025 (Library: OpenSSL 3.3.3 11 Feb 2025)
May 13 23:58:57.542051 waagent[1955]: 2025-05-13T23:58:57.541998Z INFO ExtHandler ExtHandler WALinuxAgent-2.12.0.4 running as process 1955
May 13 23:58:57.542187 waagent[1955]: 2025-05-13T23:58:57.542151Z INFO ExtHandler ExtHandler ******** AutoUpdate.Enabled is set to False, not processing the operation ********
May 13 23:58:57.542527 waagent[1955]: 2025-05-13T23:58:57.542486Z INFO ExtHandler ExtHandler ******** AutoUpdate.UpdateToLatestVersion is set to False, not processing the operation ********
May 13 23:58:57.543922 waagent[1955]: 2025-05-13T23:58:57.543877Z INFO ExtHandler ExtHandler [CGI] Cgroup monitoring is not supported on ['flatcar', '4284.0.0', '', 'Flatcar Container Linux by Kinvolk']
May 13 23:58:57.544336 waagent[1955]: 2025-05-13T23:58:57.544279Z INFO ExtHandler ExtHandler [CGI] Agent will reset the quotas in case distro: ['flatcar', '4284.0.0', '', 'Flatcar Container Linux by Kinvolk'] went from supported to unsupported
May 13 23:58:57.544492 waagent[1955]: 2025-05-13T23:58:57.544458Z INFO ExtHandler ExtHandler [CGI] Agent cgroups enabled: False
May 13 23:58:57.545092 waagent[1955]: 2025-05-13T23:58:57.545050Z INFO ExtHandler ExtHandler Starting setup for Persistent firewall rules
May 13 23:58:57.560216 waagent[1955]: 2025-05-13T23:58:57.560181Z INFO ExtHandler ExtHandler Firewalld service not running/unavailable, trying to set up waagent-network-setup.service
May 13 23:58:57.560397 waagent[1955]: 2025-05-13T23:58:57.560361Z INFO ExtHandler ExtHandler Successfully updated the Binary file /var/lib/waagent/waagent-network-setup.py for firewall setup
May 13 23:58:57.566939 waagent[1955]: 2025-05-13T23:58:57.566642Z INFO ExtHandler ExtHandler Service: waagent-network-setup.service not enabled. Adding it now
May 13 23:58:57.573652 systemd[1]: Reload requested from client PID 1980 ('systemctl') (unit waagent.service)...
May 13 23:58:57.573669 systemd[1]: Reloading...
May 13 23:58:57.655364 zram_generator::config[2015]: No configuration found.
May 13 23:58:57.790606 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 13 23:58:57.902553 systemd[1]: Reloading finished in 328 ms.
May 13 23:58:57.920190 waagent[1955]: 2025-05-13T23:58:57.919140Z INFO ExtHandler ExtHandler Successfully added and enabled the waagent-network-setup.service
May 13 23:58:57.920190 waagent[1955]: 2025-05-13T23:58:57.919334Z INFO ExtHandler ExtHandler Persistent firewall rules setup successfully
May 13 23:58:58.031401 waagent[1955]: 2025-05-13T23:58:58.031325Z INFO ExtHandler ExtHandler DROP rule is not available which implies no firewall rules are set yet. Environment thread will set it up.
May 13 23:58:58.031723 waagent[1955]: 2025-05-13T23:58:58.031680Z INFO ExtHandler ExtHandler Checking if log collection is allowed at this time [False]. All three conditions must be met: 1. configuration enabled [True], 2. cgroups v1 enabled [False] OR cgroups v2 is in use and v2 resource limiting configuration enabled [False], 3. python supported: [True]
May 13 23:58:58.032464 waagent[1955]: 2025-05-13T23:58:58.032403Z INFO ExtHandler ExtHandler Starting env monitor service.
May 13 23:58:58.032857 waagent[1955]: 2025-05-13T23:58:58.032808Z INFO ExtHandler ExtHandler Start SendTelemetryHandler service.
May 13 23:58:58.032975 waagent[1955]: 2025-05-13T23:58:58.032937Z INFO EnvHandler ExtHandler WireServer endpoint 168.63.129.16 read from file
May 13 23:58:58.033049 waagent[1955]: 2025-05-13T23:58:58.033012Z INFO MonitorHandler ExtHandler WireServer endpoint 168.63.129.16 read from file
May 13 23:58:58.033126 waagent[1955]: 2025-05-13T23:58:58.033093Z INFO MonitorHandler ExtHandler Wire server endpoint:168.63.129.16
May 13 23:58:58.033352 waagent[1955]: 2025-05-13T23:58:58.033316Z INFO EnvHandler ExtHandler Wire server endpoint:168.63.129.16
May 13 23:58:58.034142 waagent[1955]: 2025-05-13T23:58:58.034102Z INFO EnvHandler ExtHandler Configure routes
May 13 23:58:58.034232 waagent[1955]: 2025-05-13T23:58:58.034185Z INFO SendTelemetryHandler ExtHandler Successfully started the SendTelemetryHandler thread
May 13 23:58:58.034379 waagent[1955]: 2025-05-13T23:58:58.034323Z INFO EnvHandler ExtHandler Gateway:None
May 13 23:58:58.034428 waagent[1955]: 2025-05-13T23:58:58.034373Z INFO ExtHandler ExtHandler Start Extension Telemetry service.
May 13 23:58:58.034817 waagent[1955]: 2025-05-13T23:58:58.034774Z INFO MonitorHandler ExtHandler Monitor.NetworkConfigurationChanges is disabled.
May 13 23:58:58.035083 waagent[1955]: 2025-05-13T23:58:58.035023Z INFO TelemetryEventsCollector ExtHandler Extension Telemetry pipeline enabled: True
May 13 23:58:58.035387 waagent[1955]: 2025-05-13T23:58:58.035349Z INFO EnvHandler ExtHandler Routes:None
May 13 23:58:58.035548 waagent[1955]: 2025-05-13T23:58:58.035513Z INFO TelemetryEventsCollector ExtHandler Successfully started the TelemetryEventsCollector thread
May 13 23:58:58.035633 waagent[1955]: 2025-05-13T23:58:58.035557Z INFO ExtHandler ExtHandler Goal State Period: 6 sec. This indicates how often the agent checks for new goal states and reports status.
May 13 23:58:58.035847 waagent[1955]: 2025-05-13T23:58:58.035813Z INFO MonitorHandler ExtHandler Routing table from /proc/net/route:
May 13 23:58:58.035847 waagent[1955]: Iface Destination Gateway Flags RefCnt Use Metric Mask MTU Window IRTT
May 13 23:58:58.035847 waagent[1955]: eth0 00000000 0108C80A 0003 0 0 1024 00000000 0 0 0
May 13 23:58:58.035847 waagent[1955]: eth0 0008C80A 00000000 0001 0 0 1024 00FFFFFF 0 0 0
May 13 23:58:58.035847 waagent[1955]: eth0 0108C80A 00000000 0005 0 0 1024 FFFFFFFF 0 0 0
May 13 23:58:58.035847 waagent[1955]: eth0 10813FA8 0108C80A 0007 0 0 1024 FFFFFFFF 0 0 0
May 13 23:58:58.035847 waagent[1955]: eth0 FEA9FEA9 0108C80A 0007 0 0 1024 FFFFFFFF 0 0 0
May 13 23:58:58.045533 waagent[1955]: 2025-05-13T23:58:58.045451Z INFO ExtHandler ExtHandler
May 13 23:58:58.045609 waagent[1955]: 2025-05-13T23:58:58.045532Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState started [incarnation_1 channel: WireServer source: Fabric activity: 972ef650-2893-4a45-bb25-bdc0b8f309c8 correlation 81e449aa-3b07-4a5f-9db7-c7893a560f1a created: 2025-05-13T23:56:57.700401Z]
May 13 23:58:58.045915 waagent[1955]: 2025-05-13T23:58:58.045872Z INFO ExtHandler ExtHandler No extension handlers found, not processing anything.
May 13 23:58:58.046931 waagent[1955]: 2025-05-13T23:58:58.046755Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState completed [incarnation_1 1 ms]
May 13 23:58:58.065417 waagent[1955]: 2025-05-13T23:58:58.065350Z INFO MonitorHandler ExtHandler Network interfaces:
May 13 23:58:58.065417 waagent[1955]: Executing ['ip', '-a', '-o', 'link']:
May 13 23:58:58.065417 waagent[1955]: 1: lo: mtu 65536 qdisc noqueue state UNKNOWN mode DEFAULT group default qlen 1000\ link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
May 13 23:58:58.065417 waagent[1955]: 2: eth0: mtu 1500 qdisc mq state UP mode DEFAULT group default qlen 1000\ link/ether 7c:ed:8d:2c:77:b2 brd ff:ff:ff:ff:ff:ff
May 13 23:58:58.065417 waagent[1955]: 3: enP36507s1: mtu 1500 qdisc mq master eth0 state UP mode DEFAULT group default qlen 1000\ link/ether 7c:ed:8d:2c:77:b2 brd ff:ff:ff:ff:ff:ff\ altname enP36507p0s2
May 13 23:58:58.065417 waagent[1955]: Executing ['ip', '-4', '-a', '-o', 'address']:
May 13 23:58:58.065417 waagent[1955]: 1: lo inet 127.0.0.1/8 scope host lo\ valid_lft forever preferred_lft forever
May 13 23:58:58.065417 waagent[1955]: 2: eth0 inet 10.200.8.5/24 metric 1024 brd 10.200.8.255 scope global eth0\ valid_lft forever preferred_lft forever
May 13 23:58:58.065417 waagent[1955]: Executing ['ip', '-6', '-a', '-o', 'address']:
May 13 23:58:58.065417 waagent[1955]: 1: lo inet6 ::1/128 scope host noprefixroute \ valid_lft forever preferred_lft forever
May 13 23:58:58.065417 waagent[1955]: 2: eth0 inet6 fe80::7eed:8dff:fe2c:77b2/64 scope link proto kernel_ll \ valid_lft forever preferred_lft forever
May 13 23:58:58.065417 waagent[1955]: 3: enP36507s1 inet6 fe80::7eed:8dff:fe2c:77b2/64 scope link proto kernel_ll \ valid_lft forever preferred_lft forever
May 13 23:58:58.085135 waagent[1955]: 2025-05-13T23:58:58.085057Z INFO ExtHandler ExtHandler [HEARTBEAT] Agent WALinuxAgent-2.12.0.4 is running as the goal state agent [DEBUG HeartbeatCounter: 0;HeartbeatId: 89365A20-16FC-4C3D-B159-A1D376A5174F;DroppedPackets: 0;UpdateGSErrors: 0;AutoUpdate: 0;UpdateMode: SelfUpdate;]
May 13 23:58:58.107436 waagent[1955]: 2025-05-13T23:58:58.107380Z INFO EnvHandler ExtHandler Created firewall rules for the Azure Fabric:
May 13 23:58:58.107436 waagent[1955]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes)
May 13 23:58:58.107436 waagent[1955]: pkts bytes target prot opt in out source destination
May 13 23:58:58.107436 waagent[1955]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes)
May 13 23:58:58.107436 waagent[1955]: pkts bytes target prot opt in out source destination
May 13 23:58:58.107436 waagent[1955]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes)
May 13 23:58:58.107436 waagent[1955]: pkts bytes target prot opt in out source destination
May 13 23:58:58.107436 waagent[1955]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53
May 13 23:58:58.107436 waagent[1955]: 1 52 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0
May 13 23:58:58.107436 waagent[1955]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW
May 13 23:58:58.110609 waagent[1955]: 2025-05-13T23:58:58.110558Z INFO EnvHandler ExtHandler Current Firewall rules:
May 13 23:58:58.110609 waagent[1955]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes)
May 13 23:58:58.110609 waagent[1955]: pkts bytes target prot opt in out source destination
May 13 23:58:58.110609 waagent[1955]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes)
May 13 23:58:58.110609 waagent[1955]: pkts bytes target prot opt in out source destination
May 13 23:58:58.110609 waagent[1955]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes)
May 13 23:58:58.110609 waagent[1955]: pkts bytes target prot opt in out source destination
May 13 23:58:58.110609 waagent[1955]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53
May 13 23:58:58.110609 waagent[1955]: 1 52 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0
May 13 23:58:58.110609 waagent[1955]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW
May 13 23:58:58.111018 waagent[1955]: 2025-05-13T23:58:58.110846Z INFO EnvHandler ExtHandler Set block dev timeout: sda with timeout: 300
May 13 23:59:00.186145 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
May 13 23:59:00.188066 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 23:59:00.915257 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 23:59:00.928638 (kubelet)[2117]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 13 23:59:01.355948 kubelet[2117]: E0513 23:59:01.355886 2117 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 13 23:59:01.359577 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 13 23:59:01.359765 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 13 23:59:01.360142 systemd[1]: kubelet.service: Consumed 148ms CPU time, 96.1M memory peak.
May 13 23:59:11.436355 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
May 13 23:59:11.438532 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 23:59:11.559150 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 23:59:11.569643 (kubelet)[2133]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 13 23:59:12.109339 chronyd[1710]: Selected source PHC0
May 13 23:59:12.145801 kubelet[2133]: E0513 23:59:12.145742 2133 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 13 23:59:12.148134 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 13 23:59:12.148424 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 13 23:59:12.148819 systemd[1]: kubelet.service: Consumed 145ms CPU time, 97.2M memory peak.
May 13 23:59:14.745699 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
May 13 23:59:14.747417 systemd[1]: Started sshd@0-10.200.8.5:22-10.200.16.10:44936.service - OpenSSH per-connection server daemon (10.200.16.10:44936).
May 13 23:59:15.464085 sshd[2142]: Accepted publickey for core from 10.200.16.10 port 44936 ssh2: RSA SHA256:kdsm4aPxgwFO/vR4uHEnGUnhOKZ6XU57pxl25IkKi98
May 13 23:59:15.465817 sshd-session[2142]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:59:15.471945 systemd-logind[1706]: New session 3 of user core.
May 13 23:59:15.478453 systemd[1]: Started session-3.scope - Session 3 of User core.
May 13 23:59:16.012879 systemd[1]: Started sshd@1-10.200.8.5:22-10.200.16.10:44952.service - OpenSSH per-connection server daemon (10.200.16.10:44952).
May 13 23:59:16.640136 sshd[2147]: Accepted publickey for core from 10.200.16.10 port 44952 ssh2: RSA SHA256:kdsm4aPxgwFO/vR4uHEnGUnhOKZ6XU57pxl25IkKi98
May 13 23:59:16.641807 sshd-session[2147]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:59:16.646780 systemd-logind[1706]: New session 4 of user core.
May 13 23:59:16.653458 systemd[1]: Started session-4.scope - Session 4 of User core.
May 13 23:59:17.082800 sshd[2149]: Connection closed by 10.200.16.10 port 44952
May 13 23:59:17.083785 sshd-session[2147]: pam_unix(sshd:session): session closed for user core
May 13 23:59:17.086898 systemd[1]: sshd@1-10.200.8.5:22-10.200.16.10:44952.service: Deactivated successfully.
May 13 23:59:17.088833 systemd[1]: session-4.scope: Deactivated successfully.
May 13 23:59:17.090211 systemd-logind[1706]: Session 4 logged out. Waiting for processes to exit.
May 13 23:59:17.091515 systemd-logind[1706]: Removed session 4.
May 13 23:59:17.193934 systemd[1]: Started sshd@2-10.200.8.5:22-10.200.16.10:44958.service - OpenSSH per-connection server daemon (10.200.16.10:44958).
May 13 23:59:17.829220 sshd[2155]: Accepted publickey for core from 10.200.16.10 port 44958 ssh2: RSA SHA256:kdsm4aPxgwFO/vR4uHEnGUnhOKZ6XU57pxl25IkKi98
May 13 23:59:17.830811 sshd-session[2155]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:59:17.835024 systemd-logind[1706]: New session 5 of user core.
May 13 23:59:17.841462 systemd[1]: Started session-5.scope - Session 5 of User core.
May 13 23:59:18.269259 sshd[2157]: Connection closed by 10.200.16.10 port 44958
May 13 23:59:18.270190 sshd-session[2155]: pam_unix(sshd:session): session closed for user core
May 13 23:59:18.274585 systemd[1]: sshd@2-10.200.8.5:22-10.200.16.10:44958.service: Deactivated successfully.
May 13 23:59:18.276732 systemd[1]: session-5.scope: Deactivated successfully.
May 13 23:59:18.277665 systemd-logind[1706]: Session 5 logged out. Waiting for processes to exit.
May 13 23:59:18.278702 systemd-logind[1706]: Removed session 5.
May 13 23:59:18.380927 systemd[1]: Started sshd@3-10.200.8.5:22-10.200.16.10:44964.service - OpenSSH per-connection server daemon (10.200.16.10:44964).
May 13 23:59:19.011642 sshd[2163]: Accepted publickey for core from 10.200.16.10 port 44964 ssh2: RSA SHA256:kdsm4aPxgwFO/vR4uHEnGUnhOKZ6XU57pxl25IkKi98
May 13 23:59:19.013284 sshd-session[2163]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:59:19.019405 systemd-logind[1706]: New session 6 of user core.
May 13 23:59:19.028487 systemd[1]: Started session-6.scope - Session 6 of User core.
May 13 23:59:19.461426 sshd[2165]: Connection closed by 10.200.16.10 port 44964
May 13 23:59:19.462340 sshd-session[2163]: pam_unix(sshd:session): session closed for user core
May 13 23:59:19.466660 systemd[1]: sshd@3-10.200.8.5:22-10.200.16.10:44964.service: Deactivated successfully.
May 13 23:59:19.468917 systemd[1]: session-6.scope: Deactivated successfully.
May 13 23:59:19.469884 systemd-logind[1706]: Session 6 logged out. Waiting for processes to exit.
May 13 23:59:19.470864 systemd-logind[1706]: Removed session 6.
May 13 23:59:19.576088 systemd[1]: Started sshd@4-10.200.8.5:22-10.200.16.10:43930.service - OpenSSH per-connection server daemon (10.200.16.10:43930).
May 13 23:59:20.210761 sshd[2171]: Accepted publickey for core from 10.200.16.10 port 43930 ssh2: RSA SHA256:kdsm4aPxgwFO/vR4uHEnGUnhOKZ6XU57pxl25IkKi98
May 13 23:59:20.212449 sshd-session[2171]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:59:20.216848 systemd-logind[1706]: New session 7 of user core.
May 13 23:59:20.225438 systemd[1]: Started session-7.scope - Session 7 of User core.
May 13 23:59:20.615698 sudo[2174]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
May 13 23:59:20.616044 sudo[2174]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 13 23:59:20.629795 sudo[2174]: pam_unix(sudo:session): session closed for user root
May 13 23:59:20.731511 sshd[2173]: Connection closed by 10.200.16.10 port 43930
May 13 23:59:20.732721 sshd-session[2171]: pam_unix(sshd:session): session closed for user core
May 13 23:59:20.736169 systemd[1]: sshd@4-10.200.8.5:22-10.200.16.10:43930.service: Deactivated successfully.
May 13 23:59:20.738115 systemd[1]: session-7.scope: Deactivated successfully.
May 13 23:59:20.739607 systemd-logind[1706]: Session 7 logged out. Waiting for processes to exit.
May 13 23:59:20.740613 systemd-logind[1706]: Removed session 7.
May 13 23:59:20.843222 systemd[1]: Started sshd@5-10.200.8.5:22-10.200.16.10:43932.service - OpenSSH per-connection server daemon (10.200.16.10:43932).
May 13 23:59:21.478759 sshd[2180]: Accepted publickey for core from 10.200.16.10 port 43932 ssh2: RSA SHA256:kdsm4aPxgwFO/vR4uHEnGUnhOKZ6XU57pxl25IkKi98
May 13 23:59:21.480485 sshd-session[2180]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:59:21.484889 systemd-logind[1706]: New session 8 of user core.
May 13 23:59:21.495454 systemd[1]: Started session-8.scope - Session 8 of User core.
May 13 23:59:21.822090 sudo[2184]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
May 13 23:59:21.822455 sudo[2184]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 13 23:59:21.825809 sudo[2184]: pam_unix(sudo:session): session closed for user root
May 13 23:59:21.830798 sudo[2183]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
May 13 23:59:21.831134 sudo[2183]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 13 23:59:21.840380 systemd[1]: Starting audit-rules.service - Load Audit Rules...
May 13 23:59:21.875528 augenrules[2206]: No rules
May 13 23:59:21.876391 systemd[1]: audit-rules.service: Deactivated successfully.
May 13 23:59:21.876618 systemd[1]: Finished audit-rules.service - Load Audit Rules.
May 13 23:59:21.877855 sudo[2183]: pam_unix(sudo:session): session closed for user root
May 13 23:59:21.979747 sshd[2182]: Connection closed by 10.200.16.10 port 43932
May 13 23:59:21.980567 sshd-session[2180]: pam_unix(sshd:session): session closed for user core
May 13 23:59:21.984940 systemd[1]: sshd@5-10.200.8.5:22-10.200.16.10:43932.service: Deactivated successfully.
May 13 23:59:21.987278 systemd[1]: session-8.scope: Deactivated successfully.
May 13 23:59:21.988282 systemd-logind[1706]: Session 8 logged out. Waiting for processes to exit.
May 13 23:59:21.989473 systemd-logind[1706]: Removed session 8.
May 13 23:59:22.091054 systemd[1]: Started sshd@6-10.200.8.5:22-10.200.16.10:43940.service - OpenSSH per-connection server daemon (10.200.16.10:43940).
May 13 23:59:22.186399 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
May 13 23:59:22.188437 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 23:59:22.305082 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 23:59:22.312603 (kubelet)[2225]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 13 23:59:22.349775 kubelet[2225]: E0513 23:59:22.349641 2225 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 13 23:59:22.352060 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 13 23:59:22.352242 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 13 23:59:22.352633 systemd[1]: kubelet.service: Consumed 137ms CPU time, 94.2M memory peak.
May 13 23:59:22.723977 sshd[2215]: Accepted publickey for core from 10.200.16.10 port 43940 ssh2: RSA SHA256:kdsm4aPxgwFO/vR4uHEnGUnhOKZ6XU57pxl25IkKi98
May 13 23:59:22.725451 sshd-session[2215]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:59:22.729815 systemd-logind[1706]: New session 9 of user core.
May 13 23:59:22.736453 systemd[1]: Started session-9.scope - Session 9 of User core.
May 13 23:59:23.069778 sudo[2234]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
May 13 23:59:23.070211 sudo[2234]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 13 23:59:25.091400 systemd[1]: Starting docker.service - Docker Application Container Engine...
May 13 23:59:25.102684 (dockerd)[2251]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
May 13 23:59:25.573603 dockerd[2251]: time="2025-05-13T23:59:25.573537757Z" level=info msg="Starting up"
May 13 23:59:25.575696 dockerd[2251]: time="2025-05-13T23:59:25.575660371Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
May 13 23:59:25.619833 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport2564517613-merged.mount: Deactivated successfully.
May 13 23:59:25.669461 dockerd[2251]: time="2025-05-13T23:59:25.669409890Z" level=info msg="Loading containers: start."
May 13 23:59:25.820324 kernel: Initializing XFRM netlink socket
May 13 23:59:25.874915 systemd-networkd[1559]: docker0: Link UP
May 13 23:59:25.943641 dockerd[2251]: time="2025-05-13T23:59:25.943590602Z" level=info msg="Loading containers: done."
May 13 23:59:25.962733 dockerd[2251]: time="2025-05-13T23:59:25.962682328Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
May 13 23:59:25.962927 dockerd[2251]: time="2025-05-13T23:59:25.962786528Z" level=info msg="Docker daemon" commit=c710b88579fcb5e0d53f96dcae976d79323b9166 containerd-snapshotter=false storage-driver=overlay2 version=27.4.1
May 13 23:59:25.962927 dockerd[2251]: time="2025-05-13T23:59:25.962916829Z" level=info msg="Daemon has completed initialization"
May 13 23:59:26.014242 dockerd[2251]: time="2025-05-13T23:59:26.013952966Z" level=info msg="API listen on /run/docker.sock"
May 13 23:59:26.014127 systemd[1]: Started docker.service - Docker Application Container Engine.
May 13 23:59:27.409900 containerd[1735]: time="2025-05-13T23:59:27.409849988Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.8\""
May 13 23:59:28.144853 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2370791464.mount: Deactivated successfully.
May 13 23:59:28.894325 kernel: hv_balloon: Max. dynamic memory size: 8192 MB
May 13 23:59:29.740602 containerd[1735]: time="2025-05-13T23:59:29.740550123Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:59:29.743759 containerd[1735]: time="2025-05-13T23:59:29.743681903Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.8: active requests=0, bytes read=27960995"
May 13 23:59:29.747318 containerd[1735]: time="2025-05-13T23:59:29.747266895Z" level=info msg="ImageCreate event name:\"sha256:e6d208e868a9ca7f89efcb0d5bddc55a62df551cb4fb39c5099a2fe7b0e33adc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:59:29.752284 containerd[1735]: time="2025-05-13T23:59:29.752218321Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:30090db6a7d53799163ce82dae9e8ddb645fd47db93f2ec9da0cc787fd825625\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:59:29.753557 containerd[1735]: time="2025-05-13T23:59:29.753115344Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.8\" with image id \"sha256:e6d208e868a9ca7f89efcb0d5bddc55a62df551cb4fb39c5099a2fe7b0e33adc\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.8\", repo digest \"registry.k8s.io/kube-apiserver@sha256:30090db6a7d53799163ce82dae9e8ddb645fd47db93f2ec9da0cc787fd825625\", size \"27957787\" in 2.343214656s"
May 13 23:59:29.753557 containerd[1735]: time="2025-05-13T23:59:29.753158645Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.8\" returns image reference \"sha256:e6d208e868a9ca7f89efcb0d5bddc55a62df551cb4fb39c5099a2fe7b0e33adc\""
May 13 23:59:29.754804 containerd[1735]: time="2025-05-13T23:59:29.754778187Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.8\""
May 13 23:59:31.492762 containerd[1735]: time="2025-05-13T23:59:31.492706480Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:59:31.496335 containerd[1735]: time="2025-05-13T23:59:31.496259071Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.8: active requests=0, bytes read=24713784"
May 13 23:59:31.499214 containerd[1735]: time="2025-05-13T23:59:31.499156245Z" level=info msg="ImageCreate event name:\"sha256:fbda0bc3bc4bb93c8b2d8627a9aa8d945c200b51e48c88f9b837dde628fc7c8f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:59:31.506531 containerd[1735]: time="2025-05-13T23:59:31.506497232Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:29eaddc64792a689df48506e78bbc641d063ac8bb92d2e66ae2ad05977420747\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:59:31.507526 containerd[1735]: time="2025-05-13T23:59:31.507389155Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.8\" with image id \"sha256:fbda0bc3bc4bb93c8b2d8627a9aa8d945c200b51e48c88f9b837dde628fc7c8f\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.8\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:29eaddc64792a689df48506e78bbc641d063ac8bb92d2e66ae2ad05977420747\", size \"26202149\" in 1.752577568s"
May 13 23:59:31.507526 containerd[1735]: time="2025-05-13T23:59:31.507427656Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.8\" returns image reference \"sha256:fbda0bc3bc4bb93c8b2d8627a9aa8d945c200b51e48c88f9b837dde628fc7c8f\""
May 13 23:59:31.508221 containerd[1735]: time="2025-05-13T23:59:31.508128874Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.8\""
May 13 23:59:32.436823 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
May 13 23:59:32.441525 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 23:59:32.593490 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 23:59:32.603639 (kubelet)[2516]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 13 23:59:33.094210 kubelet[2516]: E0513 23:59:33.094140 2516 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 13 23:59:33.096384 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 13 23:59:33.096578 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 13 23:59:33.096993 systemd[1]: kubelet.service: Consumed 179ms CPU time, 95.5M memory peak.
May 13 23:59:33.155185 containerd[1735]: time="2025-05-13T23:59:33.155128571Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:59:33.157284 containerd[1735]: time="2025-05-13T23:59:33.157211789Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.8: active requests=0, bytes read=18780394"
May 13 23:59:33.160718 containerd[1735]: time="2025-05-13T23:59:33.160664618Z" level=info msg="ImageCreate event name:\"sha256:2a9c646db0be37003c2b50605a252f7139145411d9e4e0badd8ae07f56ce5eb8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:59:33.166648 containerd[1735]: time="2025-05-13T23:59:33.166587267Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:22994a2632e81059720480b9f6bdeb133b08d58492d0b36dfd6e9768b159b22a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:59:33.168038 containerd[1735]: time="2025-05-13T23:59:33.167449275Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.8\" with image id \"sha256:2a9c646db0be37003c2b50605a252f7139145411d9e4e0badd8ae07f56ce5eb8\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.8\", repo digest \"registry.k8s.io/kube-scheduler@sha256:22994a2632e81059720480b9f6bdeb133b08d58492d0b36dfd6e9768b159b22a\", size \"20268777\" in 1.659286999s"
May 13 23:59:33.168038 containerd[1735]: time="2025-05-13T23:59:33.167484675Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.8\" returns image reference \"sha256:2a9c646db0be37003c2b50605a252f7139145411d9e4e0badd8ae07f56ce5eb8\""
May 13 23:59:33.168205 containerd[1735]: time="2025-05-13T23:59:33.168183181Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.8\""
May 13 23:59:34.149440 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1771577362.mount: Deactivated successfully.
May 13 23:59:34.200396 update_engine[1707]: I20250513 23:59:34.200335 1707 update_attempter.cc:509] Updating boot flags...
May 13 23:59:34.290361 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 39 scanned by (udev-worker) (2546)
May 13 23:59:34.530453 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 39 scanned by (udev-worker) (2550)
May 13 23:59:34.996501 containerd[1735]: time="2025-05-13T23:59:34.996442790Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:59:35.000593 containerd[1735]: time="2025-05-13T23:59:35.000491524Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.8: active requests=0, bytes read=30354633"
May 13 23:59:35.004202 containerd[1735]: time="2025-05-13T23:59:35.004136755Z" level=info msg="ImageCreate event name:\"sha256:7d73f013cedcf301aef42272c93e4c1174dab1a8eccd96840091ef04b63480f2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:59:35.009650 containerd[1735]: time="2025-05-13T23:59:35.009587000Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:dd0c9a37670f209947b1ed880f06a2e93e1d41da78c037f52f94b13858769838\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:59:35.010523 containerd[1735]: time="2025-05-13T23:59:35.010162705Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.8\" with image id \"sha256:7d73f013cedcf301aef42272c93e4c1174dab1a8eccd96840091ef04b63480f2\", repo tag \"registry.k8s.io/kube-proxy:v1.31.8\", repo digest \"registry.k8s.io/kube-proxy@sha256:dd0c9a37670f209947b1ed880f06a2e93e1d41da78c037f52f94b13858769838\", size \"30353644\" in 1.841941524s"
May 13 23:59:35.010523 containerd[1735]: time="2025-05-13T23:59:35.010211506Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.8\" returns image reference \"sha256:7d73f013cedcf301aef42272c93e4c1174dab1a8eccd96840091ef04b63480f2\""
May 13 23:59:35.010808 containerd[1735]: time="2025-05-13T23:59:35.010785810Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\""
May 13 23:59:35.581947 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3676947240.mount: Deactivated successfully.
May 13 23:59:36.763928 containerd[1735]: time="2025-05-13T23:59:36.763875605Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:59:36.767094 containerd[1735]: time="2025-05-13T23:59:36.767023127Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=18185769"
May 13 23:59:36.770493 containerd[1735]: time="2025-05-13T23:59:36.770438251Z" level=info msg="ImageCreate event name:\"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:59:36.776802 containerd[1735]: time="2025-05-13T23:59:36.776749696Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:59:36.777686 containerd[1735]: time="2025-05-13T23:59:36.777653502Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"18182961\" in 1.766787391s"
May 13 23:59:36.777928 containerd[1735]: time="2025-05-13T23:59:36.777789503Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\""
May 13 23:59:36.778617 containerd[1735]: time="2025-05-13T23:59:36.778588909Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
May 13 23:59:37.325538 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount995957421.mount: Deactivated successfully.
May 13 23:59:37.348268 containerd[1735]: time="2025-05-13T23:59:37.348217654Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
May 13 23:59:37.351569 containerd[1735]: time="2025-05-13T23:59:37.351498877Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321146"
May 13 23:59:37.357214 containerd[1735]: time="2025-05-13T23:59:37.357159418Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
May 13 23:59:37.361833 containerd[1735]: time="2025-05-13T23:59:37.361775750Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
May 13 23:59:37.362860 containerd[1735]: time="2025-05-13T23:59:37.362368755Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 583.746446ms"
May 13 23:59:37.362860 containerd[1735]: time="2025-05-13T23:59:37.362404155Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\""
May 13 23:59:37.363100 containerd[1735]: time="2025-05-13T23:59:37.363055460Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\""
May 13 23:59:38.008924 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3319980661.mount: Deactivated successfully.
May 13 23:59:40.190580 containerd[1735]: time="2025-05-13T23:59:40.190528838Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:59:40.194413 containerd[1735]: time="2025-05-13T23:59:40.194330565Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56780021"
May 13 23:59:40.198612 containerd[1735]: time="2025-05-13T23:59:40.198552595Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:59:40.206665 containerd[1735]: time="2025-05-13T23:59:40.206522751Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 2.843432291s"
May 13 23:59:40.206665 containerd[1735]: time="2025-05-13T23:59:40.206564152Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\""
May 13 23:59:40.207369 containerd[1735]: time="2025-05-13T23:59:40.207341057Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:59:43.168329 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5.
May 13 23:59:43.170270 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 23:59:43.182031 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
May 13 23:59:43.182123 systemd[1]: kubelet.service: Failed with result 'signal'.
May 13 23:59:43.182404 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 23:59:43.186553 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 23:59:43.222071 systemd[1]: Reload requested from client PID 2775 ('systemctl') (unit session-9.scope)...
May 13 23:59:43.222086 systemd[1]: Reloading...
May 13 23:59:43.368328 zram_generator::config[2822]: No configuration found.
May 13 23:59:43.486974 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 13 23:59:43.603279 systemd[1]: Reloading finished in 380 ms.
May 13 23:59:43.663069 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 23:59:43.667262 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 23:59:43.669467 systemd[1]: kubelet.service: Deactivated successfully.
May 13 23:59:43.669713 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 23:59:43.669768 systemd[1]: kubelet.service: Consumed 118ms CPU time, 83.6M memory peak.
May 13 23:59:43.671372 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 23:59:50.591203 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 23:59:50.601835 (kubelet)[2894]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
May 13 23:59:50.639643 kubelet[2894]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 13 23:59:50.639643 kubelet[2894]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
May 13 23:59:50.639643 kubelet[2894]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 13 23:59:50.640109 kubelet[2894]: I0513 23:59:50.639738 2894 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
May 13 23:59:50.992392 kubelet[2894]: I0513 23:59:50.992033 2894 server.go:486] "Kubelet version" kubeletVersion="v1.31.0"
May 13 23:59:50.992392 kubelet[2894]: I0513 23:59:50.992070 2894 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
May 13 23:59:51.284142 kubelet[2894]: I0513 23:59:51.284010 2894 server.go:929] "Client rotation is on, will bootstrap in background"
May 13 23:59:51.312699 kubelet[2894]: E0513 23:59:51.312505 2894 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.200.8.5:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.200.8.5:6443: connect: connection refused" logger="UnhandledError"
May 13 23:59:51.312699 kubelet[2894]: I0513 23:59:51.312541 2894 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
May 13 23:59:51.319945 kubelet[2894]: I0513 23:59:51.319921 2894 server.go:1426] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
May 13 23:59:51.325665 kubelet[2894]: I0513 23:59:51.325646 2894 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
May 13 23:59:51.326693 kubelet[2894]: I0513 23:59:51.326662 2894 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
May 13 23:59:51.326887 kubelet[2894]: I0513 23:59:51.326848 2894 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
May 13 23:59:51.327076 kubelet[2894]: I0513 23:59:51.326884 2894 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4284.0.0-n-c527831f7b","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"Les
sThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} May 13 23:59:51.327215 kubelet[2894]: I0513 23:59:51.327090 2894 topology_manager.go:138] "Creating topology manager with none policy" May 13 23:59:51.327215 kubelet[2894]: I0513 23:59:51.327103 2894 container_manager_linux.go:300] "Creating device plugin manager" May 13 23:59:51.327291 kubelet[2894]: I0513 23:59:51.327229 2894 state_mem.go:36] "Initialized new in-memory state store" May 13 23:59:51.329172 kubelet[2894]: I0513 23:59:51.329143 2894 kubelet.go:408] "Attempting to sync node with API server" May 13 23:59:51.329254 kubelet[2894]: I0513 23:59:51.329183 2894 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" May 13 23:59:51.329254 kubelet[2894]: I0513 23:59:51.329224 2894 kubelet.go:314] "Adding apiserver pod source" May 13 23:59:51.329254 kubelet[2894]: I0513 23:59:51.329244 2894 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 13 23:59:51.332946 kubelet[2894]: W0513 23:59:51.332695 2894 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.200.8.5:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284.0.0-n-c527831f7b&limit=500&resourceVersion=0": dial tcp 10.200.8.5:6443: connect: connection refused May 13 23:59:51.332946 kubelet[2894]: E0513 23:59:51.332766 2894 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://10.200.8.5:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284.0.0-n-c527831f7b&limit=500&resourceVersion=0\": dial tcp 10.200.8.5:6443: connect: connection refused" logger="UnhandledError" May 13 23:59:51.334741 kubelet[2894]: W0513 23:59:51.334644 2894 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.200.8.5:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.200.8.5:6443: connect: connection refused May 13 23:59:51.334741 kubelet[2894]: E0513 23:59:51.334702 2894 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.200.8.5:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.8.5:6443: connect: connection refused" logger="UnhandledError" May 13 23:59:51.335052 kubelet[2894]: I0513 23:59:51.334923 2894 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" May 13 23:59:51.336760 kubelet[2894]: I0513 23:59:51.336735 2894 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 13 23:59:51.337637 kubelet[2894]: W0513 23:59:51.337599 2894 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
May 13 23:59:51.340321 kubelet[2894]: I0513 23:59:51.338518 2894 server.go:1269] "Started kubelet" May 13 23:59:51.340321 kubelet[2894]: I0513 23:59:51.340268 2894 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 13 23:59:51.341781 kubelet[2894]: I0513 23:59:51.341748 2894 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 May 13 23:59:51.344215 kubelet[2894]: I0513 23:59:51.344197 2894 server.go:460] "Adding debug handlers to kubelet server" May 13 23:59:51.346865 kubelet[2894]: I0513 23:59:51.346803 2894 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 13 23:59:51.347169 kubelet[2894]: I0513 23:59:51.347150 2894 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 13 23:59:51.352786 kubelet[2894]: I0513 23:59:51.352756 2894 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 13 23:59:51.354851 kubelet[2894]: I0513 23:59:51.354828 2894 volume_manager.go:289] "Starting Kubelet Volume Manager" May 13 23:59:51.354978 kubelet[2894]: E0513 23:59:51.354957 2894 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:51.355503 kubelet[2894]: I0513 23:59:51.355481 2894 desired_state_of_world_populator.go:146] "Desired state populator starts to run" May 13 23:59:51.355574 kubelet[2894]: I0513 23:59:51.355556 2894 reconciler.go:26] "Reconciler: start to sync state" May 13 23:59:51.360024 kubelet[2894]: E0513 23:59:51.357492 2894 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.200.8.5:6443/api/v1/namespaces/default/events\": dial tcp 10.200.8.5:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4284.0.0-n-c527831f7b.183f3bac3a1dec56 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4284.0.0-n-c527831f7b,UID:ci-4284.0.0-n-c527831f7b,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4284.0.0-n-c527831f7b,},FirstTimestamp:2025-05-13 23:59:51.338495062 +0000 UTC m=+0.732825348,LastTimestamp:2025-05-13 23:59:51.338495062 +0000 UTC m=+0.732825348,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4284.0.0-n-c527831f7b,}" May 13 23:59:51.360157 kubelet[2894]: W0513 23:59:51.360095 2894 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.200.8.5:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.8.5:6443: connect: connection refused May 13 23:59:51.360157 kubelet[2894]: E0513 23:59:51.360147 2894 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.200.8.5:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.200.8.5:6443: connect: connection refused" logger="UnhandledError" May 13 23:59:51.360249 kubelet[2894]: E0513 23:59:51.360223 2894 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.5:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284.0.0-n-c527831f7b?timeout=10s\": dial tcp 10.200.8.5:6443: connect: connection refused" interval="200ms" May 13 23:59:51.362656 kubelet[2894]: I0513 23:59:51.361603 2894 factory.go:221] Registration of the containerd container factory successfully May 13 23:59:51.362656 kubelet[2894]: I0513 23:59:51.361623 2894 factory.go:221] Registration of the systemd container factory successfully May 13 23:59:51.362656 kubelet[2894]: I0513 23:59:51.361702 2894 factory.go:219] Registration of the crio container 
factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 13 23:59:51.370828 kubelet[2894]: I0513 23:59:51.370720 2894 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 13 23:59:51.371769 kubelet[2894]: I0513 23:59:51.371752 2894 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" May 13 23:59:51.371853 kubelet[2894]: I0513 23:59:51.371846 2894 status_manager.go:217] "Starting to sync pod status with apiserver" May 13 23:59:51.371913 kubelet[2894]: I0513 23:59:51.371905 2894 kubelet.go:2321] "Starting kubelet main sync loop" May 13 23:59:51.372002 kubelet[2894]: E0513 23:59:51.371972 2894 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 13 23:59:51.372197 kubelet[2894]: E0513 23:59:51.372187 2894 kubelet.go:1478] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 13 23:59:51.378335 kubelet[2894]: W0513 23:59:51.378160 2894 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.200.8.5:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.8.5:6443: connect: connection refused May 13 23:59:51.378335 kubelet[2894]: E0513 23:59:51.378205 2894 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.200.8.5:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.8.5:6443: connect: connection refused" logger="UnhandledError" May 13 23:59:51.387822 kubelet[2894]: I0513 23:59:51.387772 2894 cpu_manager.go:214] "Starting CPU manager" policy="none" May 13 23:59:51.387822 kubelet[2894]: I0513 23:59:51.387786 2894 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" May 13 23:59:51.387822 kubelet[2894]: I0513 23:59:51.387803 2894 state_mem.go:36] "Initialized new in-memory state store" May 13 23:59:51.455523 kubelet[2894]: E0513 23:59:51.455288 2894 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:51.472782 kubelet[2894]: E0513 23:59:51.472725 2894 kubelet.go:2345] "Skipping pod synchronization" err="container runtime status check may not have completed yet" May 13 23:59:51.556237 kubelet[2894]: E0513 23:59:51.556078 2894 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:51.561734 kubelet[2894]: E0513 23:59:51.561681 2894 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.5:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284.0.0-n-c527831f7b?timeout=10s\": dial tcp 
10.200.8.5:6443: connect: connection refused" interval="400ms" May 13 23:59:51.656961 kubelet[2894]: E0513 23:59:51.656904 2894 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:51.673200 kubelet[2894]: E0513 23:59:51.673149 2894 kubelet.go:2345] "Skipping pod synchronization" err="container runtime status check may not have completed yet" May 13 23:59:51.757922 kubelet[2894]: E0513 23:59:51.757685 2894 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:51.858997 kubelet[2894]: E0513 23:59:51.858859 2894 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:51.959490 kubelet[2894]: E0513 23:59:51.959435 2894 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:51.963042 kubelet[2894]: E0513 23:59:51.962996 2894 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.5:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284.0.0-n-c527831f7b?timeout=10s\": dial tcp 10.200.8.5:6443: connect: connection refused" interval="800ms" May 13 23:59:52.060627 kubelet[2894]: E0513 23:59:52.060570 2894 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:52.073884 kubelet[2894]: E0513 23:59:52.073825 2894 kubelet.go:2345] "Skipping pod synchronization" err="container runtime status check may not have completed yet" May 13 23:59:53.985327 kubelet[2894]: E0513 23:59:52.161523 2894 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:53.985327 kubelet[2894]: E0513 23:59:52.262328 2894 kubelet_node_status.go:453] "Error getting the 
current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:53.985327 kubelet[2894]: E0513 23:59:52.363322 2894 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:53.985327 kubelet[2894]: E0513 23:59:52.464072 2894 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:53.985327 kubelet[2894]: W0513 23:59:52.469783 2894 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.200.8.5:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.8.5:6443: connect: connection refused May 13 23:59:53.985327 kubelet[2894]: E0513 23:59:52.469824 2894 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.200.8.5:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.200.8.5:6443: connect: connection refused" logger="UnhandledError" May 13 23:59:53.985327 kubelet[2894]: E0513 23:59:52.564789 2894 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:53.985327 kubelet[2894]: E0513 23:59:52.665611 2894 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:53.985327 kubelet[2894]: E0513 23:59:52.763562 2894 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.5:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284.0.0-n-c527831f7b?timeout=10s\": dial tcp 10.200.8.5:6443: connect: connection refused" interval="1.6s" May 13 23:59:53.985327 kubelet[2894]: E0513 23:59:52.766594 2894 kubelet_node_status.go:453] "Error getting the current node from lister" err="node 
\"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:53.986046 kubelet[2894]: W0513 23:59:52.823845 2894 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.200.8.5:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.8.5:6443: connect: connection refused May 13 23:59:53.986046 kubelet[2894]: E0513 23:59:52.823895 2894 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.200.8.5:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.8.5:6443: connect: connection refused" logger="UnhandledError" May 13 23:59:53.986046 kubelet[2894]: E0513 23:59:52.867236 2894 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:53.986046 kubelet[2894]: E0513 23:59:52.874460 2894 kubelet.go:2345] "Skipping pod synchronization" err="container runtime status check may not have completed yet" May 13 23:59:53.986046 kubelet[2894]: W0513 23:59:52.882175 2894 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.200.8.5:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.200.8.5:6443: connect: connection refused May 13 23:59:53.986046 kubelet[2894]: E0513 23:59:52.882247 2894 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.200.8.5:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.8.5:6443: connect: connection refused" logger="UnhandledError" May 13 23:59:53.986046 kubelet[2894]: W0513 23:59:52.893919 2894 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://10.200.8.5:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284.0.0-n-c527831f7b&limit=500&resourceVersion=0": dial tcp 10.200.8.5:6443: connect: connection refused May 13 23:59:53.986248 kubelet[2894]: E0513 23:59:52.893975 2894 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.200.8.5:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284.0.0-n-c527831f7b&limit=500&resourceVersion=0\": dial tcp 10.200.8.5:6443: connect: connection refused" logger="UnhandledError" May 13 23:59:53.986248 kubelet[2894]: E0513 23:59:52.967926 2894 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:53.986248 kubelet[2894]: E0513 23:59:53.068753 2894 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:53.986248 kubelet[2894]: E0513 23:59:53.169549 2894 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:53.986248 kubelet[2894]: E0513 23:59:53.270341 2894 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:53.986248 kubelet[2894]: E0513 23:59:53.370727 2894 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:53.986248 kubelet[2894]: E0513 23:59:53.410253 2894 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.200.8.5:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.200.8.5:6443: connect: connection refused" logger="UnhandledError" May 13 23:59:53.986248 
kubelet[2894]: E0513 23:59:53.471052 2894 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:53.986248 kubelet[2894]: E0513 23:59:53.572044 2894 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:53.986248 kubelet[2894]: E0513 23:59:53.672965 2894 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:53.986557 kubelet[2894]: E0513 23:59:53.773745 2894 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:53.986557 kubelet[2894]: E0513 23:59:53.874452 2894 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:53.986557 kubelet[2894]: E0513 23:59:53.975144 2894 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:54.076156 kubelet[2894]: E0513 23:59:54.076093 2894 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:54.176933 kubelet[2894]: E0513 23:59:54.176880 2894 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:54.426532 kubelet[2894]: E0513 23:59:54.277705 2894 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:54.426532 kubelet[2894]: E0513 23:59:54.356452 2894 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.200.8.5:6443/api/v1/namespaces/default/events\": dial tcp 10.200.8.5:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4284.0.0-n-c527831f7b.183f3bac3a1dec56 
default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4284.0.0-n-c527831f7b,UID:ci-4284.0.0-n-c527831f7b,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4284.0.0-n-c527831f7b,},FirstTimestamp:2025-05-13 23:59:51.338495062 +0000 UTC m=+0.732825348,LastTimestamp:2025-05-13 23:59:51.338495062 +0000 UTC m=+0.732825348,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4284.0.0-n-c527831f7b,}" May 13 23:59:54.426532 kubelet[2894]: E0513 23:59:54.364862 2894 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.5:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284.0.0-n-c527831f7b?timeout=10s\": dial tcp 10.200.8.5:6443: connect: connection refused" interval="3.2s" May 13 23:59:54.426532 kubelet[2894]: E0513 23:59:54.378448 2894 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:54.475395 kubelet[2894]: E0513 23:59:54.475332 2894 kubelet.go:2345] "Skipping pod synchronization" err="container runtime status check may not have completed yet" May 13 23:59:54.478696 kubelet[2894]: E0513 23:59:54.478608 2894 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:54.579860 kubelet[2894]: E0513 23:59:54.579768 2894 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:54.680730 kubelet[2894]: E0513 23:59:54.680593 2894 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:54.781320 kubelet[2894]: E0513 23:59:54.781249 2894 kubelet_node_status.go:453] "Error 
getting the current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:54.881257 kubelet[2894]: W0513 23:59:54.881211 2894 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.200.8.5:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.8.5:6443: connect: connection refused May 13 23:59:54.881465 kubelet[2894]: E0513 23:59:54.881267 2894 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.200.8.5:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.200.8.5:6443: connect: connection refused" logger="UnhandledError" May 13 23:59:54.881465 kubelet[2894]: E0513 23:59:54.881334 2894 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:54.883494 kubelet[2894]: I0513 23:59:54.883465 2894 policy_none.go:49] "None policy: Start" May 13 23:59:54.884356 kubelet[2894]: I0513 23:59:54.884285 2894 memory_manager.go:170] "Starting memorymanager" policy="None" May 13 23:59:54.884356 kubelet[2894]: I0513 23:59:54.884333 2894 state_mem.go:35] "Initializing new in-memory state store" May 13 23:59:54.982182 kubelet[2894]: E0513 23:59:54.982104 2894 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:55.083045 kubelet[2894]: E0513 23:59:55.082991 2894 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:55.183996 kubelet[2894]: E0513 23:59:55.183925 2894 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:55.284818 kubelet[2894]: E0513 23:59:55.284675 2894 kubelet_node_status.go:453] "Error getting the 
current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:55.385237 kubelet[2894]: E0513 23:59:55.385186 2894 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:55.486195 kubelet[2894]: E0513 23:59:55.486142 2894 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:55.492892 kubelet[2894]: W0513 23:59:55.492855 2894 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.200.8.5:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.8.5:6443: connect: connection refused May 13 23:59:55.493011 kubelet[2894]: E0513 23:59:55.492908 2894 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.200.8.5:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.8.5:6443: connect: connection refused" logger="UnhandledError" May 13 23:59:55.672840 kubelet[2894]: W0513 23:59:55.551778 2894 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.200.8.5:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284.0.0-n-c527831f7b&limit=500&resourceVersion=0": dial tcp 10.200.8.5:6443: connect: connection refused May 13 23:59:55.672840 kubelet[2894]: E0513 23:59:55.551835 2894 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.200.8.5:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284.0.0-n-c527831f7b&limit=500&resourceVersion=0\": dial tcp 10.200.8.5:6443: connect: connection refused" logger="UnhandledError" May 13 23:59:55.672840 kubelet[2894]: E0513 23:59:55.586620 2894 kubelet_node_status.go:453] "Error getting 
the current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:55.687796 kubelet[2894]: E0513 23:59:55.687748 2894 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:55.745387 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. May 13 23:59:55.760226 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. May 13 23:59:55.763448 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. May 13 23:59:55.771745 kubelet[2894]: I0513 23:59:55.771111 2894 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 13 23:59:55.771745 kubelet[2894]: I0513 23:59:55.771345 2894 eviction_manager.go:189] "Eviction manager: starting control loop" May 13 23:59:55.771745 kubelet[2894]: I0513 23:59:55.771359 2894 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 13 23:59:55.771745 kubelet[2894]: I0513 23:59:55.771638 2894 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 13 23:59:55.773531 kubelet[2894]: E0513 23:59:55.773486 2894 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4284.0.0-n-c527831f7b\" not found" May 13 23:59:55.875737 kubelet[2894]: I0513 23:59:55.875667 2894 kubelet_node_status.go:72] "Attempting to register node" node="ci-4284.0.0-n-c527831f7b" May 13 23:59:55.876145 kubelet[2894]: E0513 23:59:55.876101 2894 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.200.8.5:6443/api/v1/nodes\": dial tcp 10.200.8.5:6443: connect: connection refused" node="ci-4284.0.0-n-c527831f7b" May 13 23:59:55.941956 kubelet[2894]: W0513 23:59:55.941908 2894 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.200.8.5:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.200.8.5:6443: connect: connection refused May 13 23:59:55.942137 kubelet[2894]: E0513 23:59:55.941977 2894 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.200.8.5:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.8.5:6443: connect: connection refused" logger="UnhandledError" May 13 23:59:56.078162 kubelet[2894]: I0513 23:59:56.078127 2894 kubelet_node_status.go:72] "Attempting to register node" node="ci-4284.0.0-n-c527831f7b" May 13 23:59:56.078569 kubelet[2894]: E0513 23:59:56.078534 2894 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.200.8.5:6443/api/v1/nodes\": dial tcp 10.200.8.5:6443: connect: connection refused" node="ci-4284.0.0-n-c527831f7b" May 13 23:59:56.481544 kubelet[2894]: I0513 23:59:56.481500 2894 kubelet_node_status.go:72] "Attempting to register node" node="ci-4284.0.0-n-c527831f7b" May 13 23:59:56.482029 kubelet[2894]: E0513 23:59:56.481923 2894 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.200.8.5:6443/api/v1/nodes\": dial tcp 10.200.8.5:6443: connect: connection refused" node="ci-4284.0.0-n-c527831f7b" May 13 23:59:57.283947 kubelet[2894]: I0513 23:59:57.283909 2894 kubelet_node_status.go:72] "Attempting to register node" node="ci-4284.0.0-n-c527831f7b" May 13 23:59:57.284371 kubelet[2894]: E0513 23:59:57.284334 2894 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.200.8.5:6443/api/v1/nodes\": dial tcp 10.200.8.5:6443: connect: connection refused" node="ci-4284.0.0-n-c527831f7b" May 13 23:59:57.460933 kubelet[2894]: E0513 23:59:57.460888 2894 
certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.200.8.5:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.200.8.5:6443: connect: connection refused" logger="UnhandledError" May 13 23:59:57.566349 kubelet[2894]: E0513 23:59:57.566154 2894 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.5:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284.0.0-n-c527831f7b?timeout=10s\": dial tcp 10.200.8.5:6443: connect: connection refused" interval="6.4s" May 13 23:59:57.686451 systemd[1]: Created slice kubepods-burstable-pod7efa40d50df1dbbb58bc332a999ae4e9.slice - libcontainer container kubepods-burstable-pod7efa40d50df1dbbb58bc332a999ae4e9.slice. May 13 23:59:57.696955 systemd[1]: Created slice kubepods-burstable-podfb5ea7fb8a0d57aa8db2be89cce7501b.slice - libcontainer container kubepods-burstable-podfb5ea7fb8a0d57aa8db2be89cce7501b.slice. May 13 23:59:57.706924 systemd[1]: Created slice kubepods-burstable-pod4cc813018357797c40a351f6ace6f0d0.slice - libcontainer container kubepods-burstable-pod4cc813018357797c40a351f6ace6f0d0.slice. 
May 13 23:59:57.797486 kubelet[2894]: I0513 23:59:57.797418 2894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fb5ea7fb8a0d57aa8db2be89cce7501b-ca-certs\") pod \"kube-apiserver-ci-4284.0.0-n-c527831f7b\" (UID: \"fb5ea7fb8a0d57aa8db2be89cce7501b\") " pod="kube-system/kube-apiserver-ci-4284.0.0-n-c527831f7b" May 13 23:59:57.797486 kubelet[2894]: I0513 23:59:57.797483 2894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fb5ea7fb8a0d57aa8db2be89cce7501b-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4284.0.0-n-c527831f7b\" (UID: \"fb5ea7fb8a0d57aa8db2be89cce7501b\") " pod="kube-system/kube-apiserver-ci-4284.0.0-n-c527831f7b" May 13 23:59:57.797727 kubelet[2894]: I0513 23:59:57.797513 2894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4cc813018357797c40a351f6ace6f0d0-k8s-certs\") pod \"kube-controller-manager-ci-4284.0.0-n-c527831f7b\" (UID: \"4cc813018357797c40a351f6ace6f0d0\") " pod="kube-system/kube-controller-manager-ci-4284.0.0-n-c527831f7b" May 13 23:59:57.797727 kubelet[2894]: I0513 23:59:57.797573 2894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/4cc813018357797c40a351f6ace6f0d0-kubeconfig\") pod \"kube-controller-manager-ci-4284.0.0-n-c527831f7b\" (UID: \"4cc813018357797c40a351f6ace6f0d0\") " pod="kube-system/kube-controller-manager-ci-4284.0.0-n-c527831f7b" May 13 23:59:57.797727 kubelet[2894]: I0513 23:59:57.797600 2894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: 
\"kubernetes.io/host-path/4cc813018357797c40a351f6ace6f0d0-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4284.0.0-n-c527831f7b\" (UID: \"4cc813018357797c40a351f6ace6f0d0\") " pod="kube-system/kube-controller-manager-ci-4284.0.0-n-c527831f7b" May 13 23:59:57.797727 kubelet[2894]: I0513 23:59:57.797626 2894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7efa40d50df1dbbb58bc332a999ae4e9-kubeconfig\") pod \"kube-scheduler-ci-4284.0.0-n-c527831f7b\" (UID: \"7efa40d50df1dbbb58bc332a999ae4e9\") " pod="kube-system/kube-scheduler-ci-4284.0.0-n-c527831f7b" May 13 23:59:57.797727 kubelet[2894]: I0513 23:59:57.797649 2894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fb5ea7fb8a0d57aa8db2be89cce7501b-k8s-certs\") pod \"kube-apiserver-ci-4284.0.0-n-c527831f7b\" (UID: \"fb5ea7fb8a0d57aa8db2be89cce7501b\") " pod="kube-system/kube-apiserver-ci-4284.0.0-n-c527831f7b" May 13 23:59:57.797944 kubelet[2894]: I0513 23:59:57.797676 2894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4cc813018357797c40a351f6ace6f0d0-ca-certs\") pod \"kube-controller-manager-ci-4284.0.0-n-c527831f7b\" (UID: \"4cc813018357797c40a351f6ace6f0d0\") " pod="kube-system/kube-controller-manager-ci-4284.0.0-n-c527831f7b" May 13 23:59:57.797944 kubelet[2894]: I0513 23:59:57.797705 2894 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/4cc813018357797c40a351f6ace6f0d0-flexvolume-dir\") pod \"kube-controller-manager-ci-4284.0.0-n-c527831f7b\" (UID: \"4cc813018357797c40a351f6ace6f0d0\") " pod="kube-system/kube-controller-manager-ci-4284.0.0-n-c527831f7b" May 13 23:59:57.999330 containerd[1735]: 
time="2025-05-13T23:59:57.997644990Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4284.0.0-n-c527831f7b,Uid:7efa40d50df1dbbb58bc332a999ae4e9,Namespace:kube-system,Attempt:0,}" May 13 23:59:58.005184 containerd[1735]: time="2025-05-13T23:59:58.005151352Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4284.0.0-n-c527831f7b,Uid:fb5ea7fb8a0d57aa8db2be89cce7501b,Namespace:kube-system,Attempt:0,}" May 13 23:59:58.009904 containerd[1735]: time="2025-05-13T23:59:58.009873991Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4284.0.0-n-c527831f7b,Uid:4cc813018357797c40a351f6ace6f0d0,Namespace:kube-system,Attempt:0,}" May 13 23:59:58.090737 containerd[1735]: time="2025-05-13T23:59:58.090398857Z" level=info msg="connecting to shim 1a226a4758921d2de6a38f4acdd88732b3035a5892c77d7a0c37df2f46ff2ca4" address="unix:///run/containerd/s/a9e4fc3a5689f6bc21fc7574f1ab7691c300af89ea65d3be19e9a6e106117a00" namespace=k8s.io protocol=ttrpc version=3 May 13 23:59:58.125224 containerd[1735]: time="2025-05-13T23:59:58.124098036Z" level=info msg="connecting to shim 50113f2a2d67f71f485e75567b18c6efcaec074f51a57b6847135123e35c4bae" address="unix:///run/containerd/s/2711a76310c5ef142910d191ec8474995639adb4d8668eebeadb9376b8ea5885" namespace=k8s.io protocol=ttrpc version=3 May 13 23:59:58.133980 systemd[1]: Started cri-containerd-1a226a4758921d2de6a38f4acdd88732b3035a5892c77d7a0c37df2f46ff2ca4.scope - libcontainer container 1a226a4758921d2de6a38f4acdd88732b3035a5892c77d7a0c37df2f46ff2ca4. 
May 13 23:59:58.157515 containerd[1735]: time="2025-05-13T23:59:58.157278210Z" level=info msg="connecting to shim f4b472b3cd7c82c1834c0fe26676847baffc1785c8c2e085c6b18d888f4d84d9" address="unix:///run/containerd/s/9661ee54960aba1b6f6c5b65de39a3f89109ee8e0f7e7257f8fc47c90c9169f3" namespace=k8s.io protocol=ttrpc version=3 May 13 23:59:58.181675 systemd[1]: Started cri-containerd-50113f2a2d67f71f485e75567b18c6efcaec074f51a57b6847135123e35c4bae.scope - libcontainer container 50113f2a2d67f71f485e75567b18c6efcaec074f51a57b6847135123e35c4bae. May 13 23:59:58.198489 systemd[1]: Started cri-containerd-f4b472b3cd7c82c1834c0fe26676847baffc1785c8c2e085c6b18d888f4d84d9.scope - libcontainer container f4b472b3cd7c82c1834c0fe26676847baffc1785c8c2e085c6b18d888f4d84d9. May 13 23:59:58.256403 containerd[1735]: time="2025-05-13T23:59:58.255843726Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4284.0.0-n-c527831f7b,Uid:7efa40d50df1dbbb58bc332a999ae4e9,Namespace:kube-system,Attempt:0,} returns sandbox id \"1a226a4758921d2de6a38f4acdd88732b3035a5892c77d7a0c37df2f46ff2ca4\"" May 13 23:59:58.261336 containerd[1735]: time="2025-05-13T23:59:58.261092969Z" level=info msg="CreateContainer within sandbox \"1a226a4758921d2de6a38f4acdd88732b3035a5892c77d7a0c37df2f46ff2ca4\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" May 13 23:59:58.282096 containerd[1735]: time="2025-05-13T23:59:58.282064443Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4284.0.0-n-c527831f7b,Uid:4cc813018357797c40a351f6ace6f0d0,Namespace:kube-system,Attempt:0,} returns sandbox id \"50113f2a2d67f71f485e75567b18c6efcaec074f51a57b6847135123e35c4bae\"" May 13 23:59:58.284123 containerd[1735]: time="2025-05-13T23:59:58.284087059Z" level=info msg="CreateContainer within sandbox \"50113f2a2d67f71f485e75567b18c6efcaec074f51a57b6847135123e35c4bae\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" May 13 
23:59:58.289786 containerd[1735]: time="2025-05-13T23:59:58.289751906Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4284.0.0-n-c527831f7b,Uid:fb5ea7fb8a0d57aa8db2be89cce7501b,Namespace:kube-system,Attempt:0,} returns sandbox id \"f4b472b3cd7c82c1834c0fe26676847baffc1785c8c2e085c6b18d888f4d84d9\"" May 13 23:59:58.292420 containerd[1735]: time="2025-05-13T23:59:58.292377828Z" level=info msg="CreateContainer within sandbox \"f4b472b3cd7c82c1834c0fe26676847baffc1785c8c2e085c6b18d888f4d84d9\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" May 13 23:59:58.294921 containerd[1735]: time="2025-05-13T23:59:58.294893449Z" level=info msg="Container 1aea6e7ed88b99cbf9374fbca63504f77bb38c2f53318fa237208a25a46ac659: CDI devices from CRI Config.CDIDevices: []" May 13 23:59:58.328420 containerd[1735]: time="2025-05-13T23:59:58.328333626Z" level=info msg="CreateContainer within sandbox \"1a226a4758921d2de6a38f4acdd88732b3035a5892c77d7a0c37df2f46ff2ca4\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"1aea6e7ed88b99cbf9374fbca63504f77bb38c2f53318fa237208a25a46ac659\"" May 13 23:59:58.329101 containerd[1735]: time="2025-05-13T23:59:58.329065532Z" level=info msg="StartContainer for \"1aea6e7ed88b99cbf9374fbca63504f77bb38c2f53318fa237208a25a46ac659\"" May 13 23:59:58.330180 containerd[1735]: time="2025-05-13T23:59:58.330147241Z" level=info msg="connecting to shim 1aea6e7ed88b99cbf9374fbca63504f77bb38c2f53318fa237208a25a46ac659" address="unix:///run/containerd/s/a9e4fc3a5689f6bc21fc7574f1ab7691c300af89ea65d3be19e9a6e106117a00" protocol=ttrpc version=3 May 13 23:59:58.340112 containerd[1735]: time="2025-05-13T23:59:58.340070523Z" level=info msg="Container 48447aed89c1647b2fff9de418801163424ca3354e4010ab88a0a7855edc0aed: CDI devices from CRI Config.CDIDevices: []" May 13 23:59:58.346111 containerd[1735]: time="2025-05-13T23:59:58.345786870Z" level=info msg="Container 
17e56406e590c8d352baff6d8a0f84287ef26fe051b104cce18493a7f4f04a95: CDI devices from CRI Config.CDIDevices: []" May 13 23:59:58.356506 systemd[1]: Started cri-containerd-1aea6e7ed88b99cbf9374fbca63504f77bb38c2f53318fa237208a25a46ac659.scope - libcontainer container 1aea6e7ed88b99cbf9374fbca63504f77bb38c2f53318fa237208a25a46ac659. May 13 23:59:58.374995 containerd[1735]: time="2025-05-13T23:59:58.374948411Z" level=info msg="CreateContainer within sandbox \"f4b472b3cd7c82c1834c0fe26676847baffc1785c8c2e085c6b18d888f4d84d9\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"17e56406e590c8d352baff6d8a0f84287ef26fe051b104cce18493a7f4f04a95\"" May 13 23:59:58.375803 containerd[1735]: time="2025-05-13T23:59:58.375773818Z" level=info msg="StartContainer for \"17e56406e590c8d352baff6d8a0f84287ef26fe051b104cce18493a7f4f04a95\"" May 13 23:59:58.377718 containerd[1735]: time="2025-05-13T23:59:58.377658334Z" level=info msg="connecting to shim 17e56406e590c8d352baff6d8a0f84287ef26fe051b104cce18493a7f4f04a95" address="unix:///run/containerd/s/9661ee54960aba1b6f6c5b65de39a3f89109ee8e0f7e7257f8fc47c90c9169f3" protocol=ttrpc version=3 May 13 23:59:58.381320 containerd[1735]: time="2025-05-13T23:59:58.380171454Z" level=info msg="CreateContainer within sandbox \"50113f2a2d67f71f485e75567b18c6efcaec074f51a57b6847135123e35c4bae\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"48447aed89c1647b2fff9de418801163424ca3354e4010ab88a0a7855edc0aed\"" May 13 23:59:58.381320 containerd[1735]: time="2025-05-13T23:59:58.380998361Z" level=info msg="StartContainer for \"48447aed89c1647b2fff9de418801163424ca3354e4010ab88a0a7855edc0aed\"" May 13 23:59:58.383319 containerd[1735]: time="2025-05-13T23:59:58.382271172Z" level=info msg="connecting to shim 48447aed89c1647b2fff9de418801163424ca3354e4010ab88a0a7855edc0aed" address="unix:///run/containerd/s/2711a76310c5ef142910d191ec8474995639adb4d8668eebeadb9376b8ea5885" protocol=ttrpc version=3 
May 13 23:59:58.418485 systemd[1]: Started cri-containerd-17e56406e590c8d352baff6d8a0f84287ef26fe051b104cce18493a7f4f04a95.scope - libcontainer container 17e56406e590c8d352baff6d8a0f84287ef26fe051b104cce18493a7f4f04a95. May 13 23:59:58.423572 systemd[1]: Started cri-containerd-48447aed89c1647b2fff9de418801163424ca3354e4010ab88a0a7855edc0aed.scope - libcontainer container 48447aed89c1647b2fff9de418801163424ca3354e4010ab88a0a7855edc0aed. May 13 23:59:58.451268 containerd[1735]: time="2025-05-13T23:59:58.451214542Z" level=info msg="StartContainer for \"1aea6e7ed88b99cbf9374fbca63504f77bb38c2f53318fa237208a25a46ac659\" returns successfully" May 13 23:59:58.514384 containerd[1735]: time="2025-05-13T23:59:58.513370956Z" level=info msg="StartContainer for \"17e56406e590c8d352baff6d8a0f84287ef26fe051b104cce18493a7f4f04a95\" returns successfully" May 13 23:59:58.544788 containerd[1735]: time="2025-05-13T23:59:58.544742216Z" level=info msg="StartContainer for \"48447aed89c1647b2fff9de418801163424ca3354e4010ab88a0a7855edc0aed\" returns successfully" May 13 23:59:58.886280 kubelet[2894]: I0513 23:59:58.886166 2894 kubelet_node_status.go:72] "Attempting to register node" node="ci-4284.0.0-n-c527831f7b" May 14 00:00:00.338238 kubelet[2894]: I0514 00:00:00.338203 2894 apiserver.go:52] "Watching apiserver" May 14 00:00:00.435506 kubelet[2894]: I0514 00:00:00.434548 2894 kubelet_node_status.go:75] "Successfully registered node" node="ci-4284.0.0-n-c527831f7b" May 14 00:00:00.455861 kubelet[2894]: I0514 00:00:00.455831 2894 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" May 14 00:00:01.443399 kubelet[2894]: W0514 00:00:01.443224 2894 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 14 00:00:01.446139 kubelet[2894]: W0514 00:00:01.446005 2894 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result 
in surprising behavior; a DNS label is recommended: [must not contain dots] May 14 00:00:02.596075 systemd[1]: Started logrotate.service - Rotate and Compress System Logs. May 14 00:00:02.607552 systemd[1]: Reload requested from client PID 3164 ('systemctl') (unit session-9.scope)... May 14 00:00:02.607569 systemd[1]: Reloading... May 14 00:00:02.710332 zram_generator::config[3208]: No configuration found. May 14 00:00:02.869896 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 14 00:00:05.078887 kubelet[2894]: I0514 00:00:03.033071 2894 dynamic_cafile_content.go:174] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 14 00:00:03.002625 systemd[1]: Reloading finished in 394 ms. May 14 00:00:03.033160 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... May 14 00:00:03.054724 systemd[1]: kubelet.service: Deactivated successfully. May 14 00:00:03.054941 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 14 00:00:03.054994 systemd[1]: kubelet.service: Consumed 1.153s CPU time, 116.2M memory peak. May 14 00:00:03.057884 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 14 00:00:06.085740 systemd[1]: logrotate.service: Deactivated successfully. May 14 00:00:20.830038 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 14 00:00:20.840677 (kubelet)[3280]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 14 00:00:21.714319 kubelet[3280]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
May 14 00:00:21.714319 kubelet[3280]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. May 14 00:00:21.714319 kubelet[3280]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 14 00:00:21.714805 kubelet[3280]: I0514 00:00:21.714165 3280 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 14 00:00:21.721145 kubelet[3280]: I0514 00:00:21.721113 3280 server.go:486] "Kubelet version" kubeletVersion="v1.31.0" May 14 00:00:21.721145 kubelet[3280]: I0514 00:00:21.721134 3280 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 14 00:00:21.721434 kubelet[3280]: I0514 00:00:21.721414 3280 server.go:929] "Client rotation is on, will bootstrap in background" May 14 00:00:21.722887 kubelet[3280]: I0514 00:00:21.722656 3280 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". May 14 00:00:21.727220 kubelet[3280]: I0514 00:00:21.727068 3280 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 14 00:00:21.732785 kubelet[3280]: I0514 00:00:21.732757 3280 server.go:1426] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 14 00:00:21.742953 kubelet[3280]: I0514 00:00:21.742921 3280 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" May 14 00:00:21.744312 kubelet[3280]: I0514 00:00:21.743259 3280 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" May 14 00:00:21.744312 kubelet[3280]: I0514 00:00:21.743664 3280 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 14 00:00:21.744831 kubelet[3280]: I0514 00:00:21.743800 3280 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4284.0.0-n-c527831f7b","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","Topolog
yManagerPolicyOptions":null,"CgroupVersion":2} May 14 00:00:21.744996 kubelet[3280]: I0514 00:00:21.744845 3280 topology_manager.go:138] "Creating topology manager with none policy" May 14 00:00:21.744996 kubelet[3280]: I0514 00:00:21.744865 3280 container_manager_linux.go:300] "Creating device plugin manager" May 14 00:00:21.745173 kubelet[3280]: I0514 00:00:21.745155 3280 state_mem.go:36] "Initialized new in-memory state store" May 14 00:00:21.745314 kubelet[3280]: I0514 00:00:21.745288 3280 kubelet.go:408] "Attempting to sync node with API server" May 14 00:00:21.745552 kubelet[3280]: I0514 00:00:21.745528 3280 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" May 14 00:00:21.745623 kubelet[3280]: I0514 00:00:21.745583 3280 kubelet.go:314] "Adding apiserver pod source" May 14 00:00:21.745623 kubelet[3280]: I0514 00:00:21.745606 3280 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 14 00:00:21.750934 kubelet[3280]: I0514 00:00:21.750821 3280 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" May 14 00:00:21.751573 kubelet[3280]: I0514 00:00:21.751352 3280 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 14 00:00:21.752371 kubelet[3280]: I0514 00:00:21.751819 3280 server.go:1269] "Started kubelet" May 14 00:00:21.757320 kubelet[3280]: I0514 00:00:21.754325 3280 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 14 00:00:21.761866 kubelet[3280]: I0514 00:00:21.760501 3280 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 May 14 00:00:21.761866 kubelet[3280]: I0514 00:00:21.761635 3280 server.go:460] "Adding debug handlers to kubelet server" May 14 00:00:21.762995 kubelet[3280]: I0514 00:00:21.762922 3280 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 14 00:00:21.763181 kubelet[3280]: I0514 00:00:21.763160 3280 
server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 14 00:00:21.764014 kubelet[3280]: I0514 00:00:21.763981 3280 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 14 00:00:21.766874 kubelet[3280]: I0514 00:00:21.765742 3280 volume_manager.go:289] "Starting Kubelet Volume Manager" May 14 00:00:21.766874 kubelet[3280]: E0514 00:00:21.765967 3280 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284.0.0-n-c527831f7b\" not found" May 14 00:00:21.769316 kubelet[3280]: I0514 00:00:21.768125 3280 desired_state_of_world_populator.go:146] "Desired state populator starts to run" May 14 00:00:21.769316 kubelet[3280]: I0514 00:00:21.768272 3280 reconciler.go:26] "Reconciler: start to sync state" May 14 00:00:21.770574 kubelet[3280]: I0514 00:00:21.770353 3280 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 14 00:00:21.772477 kubelet[3280]: I0514 00:00:21.771672 3280 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" May 14 00:00:21.772477 kubelet[3280]: I0514 00:00:21.771707 3280 status_manager.go:217] "Starting to sync pod status with apiserver" May 14 00:00:21.772477 kubelet[3280]: I0514 00:00:21.771734 3280 kubelet.go:2321] "Starting kubelet main sync loop" May 14 00:00:21.772477 kubelet[3280]: E0514 00:00:21.771775 3280 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 14 00:00:21.781741 kubelet[3280]: I0514 00:00:21.781189 3280 factory.go:221] Registration of the systemd container factory successfully May 14 00:00:21.781741 kubelet[3280]: I0514 00:00:21.781389 3280 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 14 00:00:21.784040 kubelet[3280]: I0514 00:00:21.783918 3280 factory.go:221] Registration of the containerd container factory successfully May 14 00:00:21.786980 kubelet[3280]: E0514 00:00:21.786769 3280 kubelet.go:1478] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 14 00:00:21.872839 kubelet[3280]: E0514 00:00:21.872797 3280 kubelet.go:2345] "Skipping pod synchronization" err="container runtime status check may not have completed yet" May 14 00:00:21.877677 kubelet[3280]: I0514 00:00:21.877645 3280 cpu_manager.go:214] "Starting CPU manager" policy="none" May 14 00:00:21.877677 kubelet[3280]: I0514 00:00:21.877667 3280 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" May 14 00:00:21.877823 kubelet[3280]: I0514 00:00:21.877691 3280 state_mem.go:36] "Initialized new in-memory state store" May 14 00:00:21.877966 kubelet[3280]: I0514 00:00:21.877935 3280 state_mem.go:88] "Updated default CPUSet" cpuSet="" May 14 00:00:21.878106 kubelet[3280]: I0514 00:00:21.878040 3280 state_mem.go:96] "Updated CPUSet assignments" assignments={} May 14 00:00:21.878106 kubelet[3280]: I0514 00:00:21.878110 3280 policy_none.go:49] "None policy: Start" May 14 00:00:21.879240 kubelet[3280]: I0514 00:00:21.879218 3280 memory_manager.go:170] "Starting memorymanager" policy="None" May 14 00:00:21.879351 kubelet[3280]: I0514 00:00:21.879249 3280 state_mem.go:35] "Initializing new in-memory state store" May 14 00:00:21.879624 kubelet[3280]: I0514 00:00:21.879532 3280 state_mem.go:75] "Updated machine memory state" May 14 00:00:21.887598 kubelet[3280]: I0514 00:00:21.887572 3280 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 14 00:00:21.887795 kubelet[3280]: I0514 00:00:21.887767 3280 eviction_manager.go:189] "Eviction manager: starting control loop" May 14 00:00:21.887863 kubelet[3280]: I0514 00:00:21.887783 3280 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 14 00:00:21.892476 kubelet[3280]: I0514 00:00:21.892453 3280 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 14 00:00:21.892762 kubelet[3280]: 
I0514 00:00:21.892743 3280 kuberuntime_manager.go:1633] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" May 14 00:00:21.897079 containerd[1735]: time="2025-05-14T00:00:21.897038031Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." May 14 00:00:21.900227 kubelet[3280]: I0514 00:00:21.899789 3280 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" May 14 00:00:22.012894 kubelet[3280]: I0514 00:00:22.012537 3280 kubelet_node_status.go:72] "Attempting to register node" node="ci-4284.0.0-n-c527831f7b" May 14 00:00:22.028122 kubelet[3280]: I0514 00:00:22.028007 3280 kubelet_node_status.go:111] "Node was previously registered" node="ci-4284.0.0-n-c527831f7b" May 14 00:00:22.028122 kubelet[3280]: I0514 00:00:22.028095 3280 kubelet_node_status.go:75] "Successfully registered node" node="ci-4284.0.0-n-c527831f7b" May 14 00:00:22.086190 kubelet[3280]: W0514 00:00:22.084687 3280 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 14 00:00:22.086190 kubelet[3280]: W0514 00:00:22.084963 3280 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 14 00:00:22.086190 kubelet[3280]: E0514 00:00:22.085015 3280 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-scheduler-ci-4284.0.0-n-c527831f7b\" already exists" pod="kube-system/kube-scheduler-ci-4284.0.0-n-c527831f7b" May 14 00:00:22.086190 kubelet[3280]: W0514 00:00:22.086203 3280 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 14 00:00:22.086499 kubelet[3280]: E0514 00:00:22.086329 3280 kubelet.go:1915] "Failed creating a mirror pod for" err="pods 
\"kube-apiserver-ci-4284.0.0-n-c527831f7b\" already exists" pod="kube-system/kube-apiserver-ci-4284.0.0-n-c527831f7b" May 14 00:00:22.170953 kubelet[3280]: I0514 00:00:22.170903 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/4cc813018357797c40a351f6ace6f0d0-kubeconfig\") pod \"kube-controller-manager-ci-4284.0.0-n-c527831f7b\" (UID: \"4cc813018357797c40a351f6ace6f0d0\") " pod="kube-system/kube-controller-manager-ci-4284.0.0-n-c527831f7b" May 14 00:00:22.171133 kubelet[3280]: I0514 00:00:22.170950 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4cc813018357797c40a351f6ace6f0d0-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4284.0.0-n-c527831f7b\" (UID: \"4cc813018357797c40a351f6ace6f0d0\") " pod="kube-system/kube-controller-manager-ci-4284.0.0-n-c527831f7b" May 14 00:00:22.171133 kubelet[3280]: I0514 00:00:22.170989 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7efa40d50df1dbbb58bc332a999ae4e9-kubeconfig\") pod \"kube-scheduler-ci-4284.0.0-n-c527831f7b\" (UID: \"7efa40d50df1dbbb58bc332a999ae4e9\") " pod="kube-system/kube-scheduler-ci-4284.0.0-n-c527831f7b" May 14 00:00:22.171133 kubelet[3280]: I0514 00:00:22.171008 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fb5ea7fb8a0d57aa8db2be89cce7501b-ca-certs\") pod \"kube-apiserver-ci-4284.0.0-n-c527831f7b\" (UID: \"fb5ea7fb8a0d57aa8db2be89cce7501b\") " pod="kube-system/kube-apiserver-ci-4284.0.0-n-c527831f7b" May 14 00:00:22.171133 kubelet[3280]: I0514 00:00:22.171029 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fb5ea7fb8a0d57aa8db2be89cce7501b-k8s-certs\") pod \"kube-apiserver-ci-4284.0.0-n-c527831f7b\" (UID: \"fb5ea7fb8a0d57aa8db2be89cce7501b\") " pod="kube-system/kube-apiserver-ci-4284.0.0-n-c527831f7b" May 14 00:00:22.171133 kubelet[3280]: I0514 00:00:22.171048 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4cc813018357797c40a351f6ace6f0d0-ca-certs\") pod \"kube-controller-manager-ci-4284.0.0-n-c527831f7b\" (UID: \"4cc813018357797c40a351f6ace6f0d0\") " pod="kube-system/kube-controller-manager-ci-4284.0.0-n-c527831f7b" May 14 00:00:22.171317 kubelet[3280]: I0514 00:00:22.171070 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/4cc813018357797c40a351f6ace6f0d0-flexvolume-dir\") pod \"kube-controller-manager-ci-4284.0.0-n-c527831f7b\" (UID: \"4cc813018357797c40a351f6ace6f0d0\") " pod="kube-system/kube-controller-manager-ci-4284.0.0-n-c527831f7b" May 14 00:00:22.171317 kubelet[3280]: I0514 00:00:22.171089 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4cc813018357797c40a351f6ace6f0d0-k8s-certs\") pod \"kube-controller-manager-ci-4284.0.0-n-c527831f7b\" (UID: \"4cc813018357797c40a351f6ace6f0d0\") " pod="kube-system/kube-controller-manager-ci-4284.0.0-n-c527831f7b" May 14 00:00:22.171317 kubelet[3280]: I0514 00:00:22.171116 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fb5ea7fb8a0d57aa8db2be89cce7501b-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4284.0.0-n-c527831f7b\" (UID: \"fb5ea7fb8a0d57aa8db2be89cce7501b\") " pod="kube-system/kube-apiserver-ci-4284.0.0-n-c527831f7b" May 14 
00:00:22.747016 kubelet[3280]: I0514 00:00:22.746757 3280 apiserver.go:52] "Watching apiserver" May 14 00:00:22.770191 kubelet[3280]: I0514 00:00:22.768934 3280 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" May 14 00:00:22.773832 kubelet[3280]: I0514 00:00:22.773010 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/f2188759-86ff-40eb-92bf-bd5f39a8a3e3-kube-proxy\") pod \"kube-proxy-qm8jq\" (UID: \"f2188759-86ff-40eb-92bf-bd5f39a8a3e3\") " pod="kube-system/kube-proxy-qm8jq" May 14 00:00:22.773832 kubelet[3280]: I0514 00:00:22.773046 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f2188759-86ff-40eb-92bf-bd5f39a8a3e3-lib-modules\") pod \"kube-proxy-qm8jq\" (UID: \"f2188759-86ff-40eb-92bf-bd5f39a8a3e3\") " pod="kube-system/kube-proxy-qm8jq" May 14 00:00:22.773832 kubelet[3280]: I0514 00:00:22.773069 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/f2188759-86ff-40eb-92bf-bd5f39a8a3e3-xtables-lock\") pod \"kube-proxy-qm8jq\" (UID: \"f2188759-86ff-40eb-92bf-bd5f39a8a3e3\") " pod="kube-system/kube-proxy-qm8jq" May 14 00:00:22.773832 kubelet[3280]: I0514 00:00:22.773090 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxvj8\" (UniqueName: \"kubernetes.io/projected/f2188759-86ff-40eb-92bf-bd5f39a8a3e3-kube-api-access-xxvj8\") pod \"kube-proxy-qm8jq\" (UID: \"f2188759-86ff-40eb-92bf-bd5f39a8a3e3\") " pod="kube-system/kube-proxy-qm8jq" May 14 00:00:22.773183 systemd[1]: Created slice kubepods-besteffort-podf2188759_86ff_40eb_92bf_bd5f39a8a3e3.slice - libcontainer container kubepods-besteffort-podf2188759_86ff_40eb_92bf_bd5f39a8a3e3.slice. 
May 14 00:00:22.850786 kubelet[3280]: I0514 00:00:22.850715 3280 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4284.0.0-n-c527831f7b" podStartSLOduration=0.850691821 podStartE2EDuration="850.691821ms" podCreationTimestamp="2025-05-14 00:00:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-14 00:00:22.827614401 +0000 UTC m=+1.982573797" watchObservedRunningTime="2025-05-14 00:00:22.850691821 +0000 UTC m=+2.005651217" May 14 00:00:22.878596 kubelet[3280]: W0514 00:00:22.878569 3280 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 14 00:00:22.878879 kubelet[3280]: E0514 00:00:22.878857 3280 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-ci-4284.0.0-n-c527831f7b\" already exists" pod="kube-system/kube-controller-manager-ci-4284.0.0-n-c527831f7b" May 14 00:00:22.879334 kubelet[3280]: W0514 00:00:22.879315 3280 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 14 00:00:22.879500 kubelet[3280]: E0514 00:00:22.879484 3280 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-4284.0.0-n-c527831f7b\" already exists" pod="kube-system/kube-apiserver-ci-4284.0.0-n-c527831f7b" May 14 00:00:22.881328 kubelet[3280]: I0514 00:00:22.881253 3280 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4284.0.0-n-c527831f7b" podStartSLOduration=21.881239412 podStartE2EDuration="21.881239412s" podCreationTimestamp="2025-05-14 00:00:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-14 
00:00:22.851987433 +0000 UTC m=+2.006946829" watchObservedRunningTime="2025-05-14 00:00:22.881239412 +0000 UTC m=+2.036198808" May 14 00:00:22.931195 kubelet[3280]: I0514 00:00:22.931126 3280 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4284.0.0-n-c527831f7b" podStartSLOduration=21.931107087 podStartE2EDuration="21.931107087s" podCreationTimestamp="2025-05-14 00:00:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-14 00:00:22.905473143 +0000 UTC m=+2.060432539" watchObservedRunningTime="2025-05-14 00:00:22.931107087 +0000 UTC m=+2.086066483" May 14 00:00:23.084412 containerd[1735]: time="2025-05-14T00:00:23.084097345Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-qm8jq,Uid:f2188759-86ff-40eb-92bf-bd5f39a8a3e3,Namespace:kube-system,Attempt:0,}" May 14 00:00:23.150492 containerd[1735]: time="2025-05-14T00:00:23.147361648Z" level=info msg="connecting to shim 46c63b53f144142821ef9046e0659823f3bfdca5303d379507274199d8d8a5ce" address="unix:///run/containerd/s/02c439f8f294e6f475412c4099e2c13e89df8ed301713de30cbe39450e581bf4" namespace=k8s.io protocol=ttrpc version=3 May 14 00:00:23.189510 systemd[1]: Started cri-containerd-46c63b53f144142821ef9046e0659823f3bfdca5303d379507274199d8d8a5ce.scope - libcontainer container 46c63b53f144142821ef9046e0659823f3bfdca5303d379507274199d8d8a5ce. 
May 14 00:00:23.272951 containerd[1735]: time="2025-05-14T00:00:23.272906945Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-qm8jq,Uid:f2188759-86ff-40eb-92bf-bd5f39a8a3e3,Namespace:kube-system,Attempt:0,} returns sandbox id \"46c63b53f144142821ef9046e0659823f3bfdca5303d379507274199d8d8a5ce\"" May 14 00:00:23.280642 containerd[1735]: time="2025-05-14T00:00:23.280598418Z" level=info msg="CreateContainer within sandbox \"46c63b53f144142821ef9046e0659823f3bfdca5303d379507274199d8d8a5ce\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" May 14 00:00:23.314182 containerd[1735]: time="2025-05-14T00:00:23.314120338Z" level=info msg="Container 69a00084003372663bacd61568ccab2c038908187fd15c794123ff49a5d292e7: CDI devices from CRI Config.CDIDevices: []" May 14 00:00:23.321837 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2062533442.mount: Deactivated successfully. May 14 00:00:23.345962 containerd[1735]: time="2025-05-14T00:00:23.345847840Z" level=info msg="CreateContainer within sandbox \"46c63b53f144142821ef9046e0659823f3bfdca5303d379507274199d8d8a5ce\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"69a00084003372663bacd61568ccab2c038908187fd15c794123ff49a5d292e7\"" May 14 00:00:23.348476 containerd[1735]: time="2025-05-14T00:00:23.348440965Z" level=info msg="StartContainer for \"69a00084003372663bacd61568ccab2c038908187fd15c794123ff49a5d292e7\"" May 14 00:00:23.352347 containerd[1735]: time="2025-05-14T00:00:23.352290802Z" level=info msg="connecting to shim 69a00084003372663bacd61568ccab2c038908187fd15c794123ff49a5d292e7" address="unix:///run/containerd/s/02c439f8f294e6f475412c4099e2c13e89df8ed301713de30cbe39450e581bf4" protocol=ttrpc version=3 May 14 00:00:23.414523 systemd[1]: Started cri-containerd-69a00084003372663bacd61568ccab2c038908187fd15c794123ff49a5d292e7.scope - libcontainer container 69a00084003372663bacd61568ccab2c038908187fd15c794123ff49a5d292e7. 
May 14 00:00:23.490377 containerd[1735]: time="2025-05-14T00:00:23.490331517Z" level=info msg="StartContainer for \"69a00084003372663bacd61568ccab2c038908187fd15c794123ff49a5d292e7\" returns successfully" May 14 00:00:24.074136 kubelet[3280]: I0514 00:00:24.073709 3280 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-qm8jq" podStartSLOduration=3.073682177 podStartE2EDuration="3.073682177s" podCreationTimestamp="2025-05-14 00:00:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-14 00:00:23.954078637 +0000 UTC m=+3.109038033" watchObservedRunningTime="2025-05-14 00:00:24.073682177 +0000 UTC m=+3.228641673" May 14 00:00:24.088496 kubelet[3280]: W0514 00:00:24.088125 3280 reflector.go:561] object-"tigera-operator"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ci-4284.0.0-n-c527831f7b" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 'ci-4284.0.0-n-c527831f7b' and this object May 14 00:00:24.088496 kubelet[3280]: E0514 00:00:24.088189 3280 reflector.go:158] "Unhandled Error" err="object-\"tigera-operator\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ci-4284.0.0-n-c527831f7b\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"tigera-operator\": no relationship found between node 'ci-4284.0.0-n-c527831f7b' and this object" logger="UnhandledError" May 14 00:00:24.088496 kubelet[3280]: W0514 00:00:24.088274 3280 reflector.go:561] object-"tigera-operator"/"kubernetes-services-endpoint": failed to list *v1.ConfigMap: configmaps "kubernetes-services-endpoint" is forbidden: User "system:node:ci-4284.0.0-n-c527831f7b" cannot list resource "configmaps" in API group "" in the 
namespace "tigera-operator": no relationship found between node 'ci-4284.0.0-n-c527831f7b' and this object May 14 00:00:24.088496 kubelet[3280]: E0514 00:00:24.088291 3280 reflector.go:158] "Unhandled Error" err="object-\"tigera-operator\"/\"kubernetes-services-endpoint\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kubernetes-services-endpoint\" is forbidden: User \"system:node:ci-4284.0.0-n-c527831f7b\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"tigera-operator\": no relationship found between node 'ci-4284.0.0-n-c527831f7b' and this object" logger="UnhandledError" May 14 00:00:24.089604 systemd[1]: Created slice kubepods-besteffort-podaf9a0f8c_80f4_43a4_96cd_225990c18db6.slice - libcontainer container kubepods-besteffort-podaf9a0f8c_80f4_43a4_96cd_225990c18db6.slice. May 14 00:00:24.182326 kubelet[3280]: I0514 00:00:24.182027 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx5jq\" (UniqueName: \"kubernetes.io/projected/af9a0f8c-80f4-43a4-96cd-225990c18db6-kube-api-access-hx5jq\") pod \"tigera-operator-6f6897fdc5-cnpnz\" (UID: \"af9a0f8c-80f4-43a4-96cd-225990c18db6\") " pod="tigera-operator/tigera-operator-6f6897fdc5-cnpnz" May 14 00:00:24.182326 kubelet[3280]: I0514 00:00:24.182077 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/af9a0f8c-80f4-43a4-96cd-225990c18db6-var-lib-calico\") pod \"tigera-operator-6f6897fdc5-cnpnz\" (UID: \"af9a0f8c-80f4-43a4-96cd-225990c18db6\") " pod="tigera-operator/tigera-operator-6f6897fdc5-cnpnz" May 14 00:00:25.297738 kubelet[3280]: E0514 00:00:25.297687 3280 projected.go:288] Couldn't get configMap tigera-operator/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition May 14 00:00:25.297738 kubelet[3280]: E0514 00:00:25.297734 3280 projected.go:194] Error preparing 
data for projected volume kube-api-access-hx5jq for pod tigera-operator/tigera-operator-6f6897fdc5-cnpnz: failed to sync configmap cache: timed out waiting for the condition May 14 00:00:25.298249 kubelet[3280]: E0514 00:00:25.297814 3280 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/af9a0f8c-80f4-43a4-96cd-225990c18db6-kube-api-access-hx5jq podName:af9a0f8c-80f4-43a4-96cd-225990c18db6 nodeName:}" failed. No retries permitted until 2025-05-14 00:00:25.797791146 +0000 UTC m=+4.952750642 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-hx5jq" (UniqueName: "kubernetes.io/projected/af9a0f8c-80f4-43a4-96cd-225990c18db6-kube-api-access-hx5jq") pod "tigera-operator-6f6897fdc5-cnpnz" (UID: "af9a0f8c-80f4-43a4-96cd-225990c18db6") : failed to sync configmap cache: timed out waiting for the condition May 14 00:00:25.898356 containerd[1735]: time="2025-05-14T00:00:25.898281867Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6f6897fdc5-cnpnz,Uid:af9a0f8c-80f4-43a4-96cd-225990c18db6,Namespace:tigera-operator,Attempt:0,}" May 14 00:00:25.967071 containerd[1735]: time="2025-05-14T00:00:25.966488344Z" level=info msg="connecting to shim f13e7811366977f74f02ba6575d4c6e239313708b8d0853ee0a7e72a68ce187c" address="unix:///run/containerd/s/a78825e0ef86532721368e9d03391362fdca15115bf0a86018fb9f5d11beb0c5" namespace=k8s.io protocol=ttrpc version=3 May 14 00:00:25.995713 systemd[1]: Started cri-containerd-f13e7811366977f74f02ba6575d4c6e239313708b8d0853ee0a7e72a68ce187c.scope - libcontainer container f13e7811366977f74f02ba6575d4c6e239313708b8d0853ee0a7e72a68ce187c. 
May 14 00:00:26.044572 containerd[1735]: time="2025-05-14T00:00:26.044524447Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6f6897fdc5-cnpnz,Uid:af9a0f8c-80f4-43a4-96cd-225990c18db6,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"f13e7811366977f74f02ba6575d4c6e239313708b8d0853ee0a7e72a68ce187c\"" May 14 00:00:26.046321 containerd[1735]: time="2025-05-14T00:00:26.046268570Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.7\"" May 14 00:00:27.119112 sudo[2234]: pam_unix(sudo:session): session closed for user root May 14 00:00:27.219171 sshd[2233]: Connection closed by 10.200.16.10 port 43940 May 14 00:00:27.220140 sshd-session[2215]: pam_unix(sshd:session): session closed for user core May 14 00:00:27.224035 systemd[1]: sshd@6-10.200.8.5:22-10.200.16.10:43940.service: Deactivated successfully. May 14 00:00:27.227152 systemd[1]: session-9.scope: Deactivated successfully. May 14 00:00:27.227383 systemd[1]: session-9.scope: Consumed 4.499s CPU time, 223.9M memory peak. May 14 00:00:27.229611 systemd-logind[1706]: Session 9 logged out. Waiting for processes to exit. May 14 00:00:27.230673 systemd-logind[1706]: Removed session 9. May 14 00:00:27.896240 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3972140017.mount: Deactivated successfully. 
May 14 00:00:28.459019 containerd[1735]: time="2025-05-14T00:00:28.458967391Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:00:28.461645 containerd[1735]: time="2025-05-14T00:00:28.461566124Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.7: active requests=0, bytes read=22002662" May 14 00:00:28.466639 containerd[1735]: time="2025-05-14T00:00:28.466585089Z" level=info msg="ImageCreate event name:\"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:00:28.471575 containerd[1735]: time="2025-05-14T00:00:28.471520452Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:a4a44422d8f2a14e0aaea2031ccb5580f2bf68218c9db444450c1888743305e9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:00:28.472126 containerd[1735]: time="2025-05-14T00:00:28.472094159Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.7\" with image id \"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\", repo tag \"quay.io/tigera/operator:v1.36.7\", repo digest \"quay.io/tigera/operator@sha256:a4a44422d8f2a14e0aaea2031ccb5580f2bf68218c9db444450c1888743305e9\", size \"21998657\" in 2.425559386s" May 14 00:00:28.472203 containerd[1735]: time="2025-05-14T00:00:28.472131860Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.7\" returns image reference \"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\"" May 14 00:00:28.475097 containerd[1735]: time="2025-05-14T00:00:28.474388389Z" level=info msg="CreateContainer within sandbox \"f13e7811366977f74f02ba6575d4c6e239313708b8d0853ee0a7e72a68ce187c\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" May 14 00:00:28.501583 containerd[1735]: time="2025-05-14T00:00:28.501538638Z" level=info msg="Container 
2d41e85c01af7f02f3c63ebeca202675f342c53a2b04654895019e2e85ec5e34: CDI devices from CRI Config.CDIDevices: []" May 14 00:00:28.524822 containerd[1735]: time="2025-05-14T00:00:28.524779137Z" level=info msg="CreateContainer within sandbox \"f13e7811366977f74f02ba6575d4c6e239313708b8d0853ee0a7e72a68ce187c\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"2d41e85c01af7f02f3c63ebeca202675f342c53a2b04654895019e2e85ec5e34\"" May 14 00:00:28.526373 containerd[1735]: time="2025-05-14T00:00:28.525349344Z" level=info msg="StartContainer for \"2d41e85c01af7f02f3c63ebeca202675f342c53a2b04654895019e2e85ec5e34\"" May 14 00:00:28.526373 containerd[1735]: time="2025-05-14T00:00:28.526221055Z" level=info msg="connecting to shim 2d41e85c01af7f02f3c63ebeca202675f342c53a2b04654895019e2e85ec5e34" address="unix:///run/containerd/s/a78825e0ef86532721368e9d03391362fdca15115bf0a86018fb9f5d11beb0c5" protocol=ttrpc version=3 May 14 00:00:28.553481 systemd[1]: Started cri-containerd-2d41e85c01af7f02f3c63ebeca202675f342c53a2b04654895019e2e85ec5e34.scope - libcontainer container 2d41e85c01af7f02f3c63ebeca202675f342c53a2b04654895019e2e85ec5e34. 
May 14 00:00:28.583446 containerd[1735]: time="2025-05-14T00:00:28.583403791Z" level=info msg="StartContainer for \"2d41e85c01af7f02f3c63ebeca202675f342c53a2b04654895019e2e85ec5e34\" returns successfully" May 14 00:00:28.887398 kubelet[3280]: I0514 00:00:28.887108 3280 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6f6897fdc5-cnpnz" podStartSLOduration=2.459845287 podStartE2EDuration="4.887086895s" podCreationTimestamp="2025-05-14 00:00:24 +0000 UTC" firstStartedPulling="2025-05-14 00:00:26.045770663 +0000 UTC m=+5.200730059" lastFinishedPulling="2025-05-14 00:00:28.473012271 +0000 UTC m=+7.627971667" observedRunningTime="2025-05-14 00:00:28.886966494 +0000 UTC m=+8.041925990" watchObservedRunningTime="2025-05-14 00:00:28.887086895 +0000 UTC m=+8.042046291" May 14 00:00:31.600357 systemd[1]: Created slice kubepods-besteffort-podb82f7689_fc25_493b_8c14_8dbc58f28401.slice - libcontainer container kubepods-besteffort-podb82f7689_fc25_493b_8c14_8dbc58f28401.slice. 
May 14 00:00:31.633053 kubelet[3280]: I0514 00:00:31.633003 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b82f7689-fc25-493b-8c14-8dbc58f28401-tigera-ca-bundle\") pod \"calico-typha-67f754989f-vbzhk\" (UID: \"b82f7689-fc25-493b-8c14-8dbc58f28401\") " pod="calico-system/calico-typha-67f754989f-vbzhk" May 14 00:00:31.633053 kubelet[3280]: I0514 00:00:31.633056 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/b82f7689-fc25-493b-8c14-8dbc58f28401-typha-certs\") pod \"calico-typha-67f754989f-vbzhk\" (UID: \"b82f7689-fc25-493b-8c14-8dbc58f28401\") " pod="calico-system/calico-typha-67f754989f-vbzhk" May 14 00:00:31.633621 kubelet[3280]: I0514 00:00:31.633080 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dx86\" (UniqueName: \"kubernetes.io/projected/b82f7689-fc25-493b-8c14-8dbc58f28401-kube-api-access-7dx86\") pod \"calico-typha-67f754989f-vbzhk\" (UID: \"b82f7689-fc25-493b-8c14-8dbc58f28401\") " pod="calico-system/calico-typha-67f754989f-vbzhk" May 14 00:00:31.734403 kubelet[3280]: I0514 00:00:31.733341 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19f6124b-9bfb-4b82-9190-f05cc81702d8-tigera-ca-bundle\") pod \"calico-node-n5cf8\" (UID: \"19f6124b-9bfb-4b82-9190-f05cc81702d8\") " pod="calico-system/calico-node-n5cf8" May 14 00:00:31.734403 kubelet[3280]: I0514 00:00:31.733387 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/19f6124b-9bfb-4b82-9190-f05cc81702d8-node-certs\") pod \"calico-node-n5cf8\" (UID: \"19f6124b-9bfb-4b82-9190-f05cc81702d8\") " 
pod="calico-system/calico-node-n5cf8" May 14 00:00:31.734403 kubelet[3280]: I0514 00:00:31.733414 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/19f6124b-9bfb-4b82-9190-f05cc81702d8-var-lib-calico\") pod \"calico-node-n5cf8\" (UID: \"19f6124b-9bfb-4b82-9190-f05cc81702d8\") " pod="calico-system/calico-node-n5cf8" May 14 00:00:31.734403 kubelet[3280]: I0514 00:00:31.733437 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/19f6124b-9bfb-4b82-9190-f05cc81702d8-policysync\") pod \"calico-node-n5cf8\" (UID: \"19f6124b-9bfb-4b82-9190-f05cc81702d8\") " pod="calico-system/calico-node-n5cf8" May 14 00:00:31.734403 kubelet[3280]: I0514 00:00:31.733459 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/19f6124b-9bfb-4b82-9190-f05cc81702d8-cni-bin-dir\") pod \"calico-node-n5cf8\" (UID: \"19f6124b-9bfb-4b82-9190-f05cc81702d8\") " pod="calico-system/calico-node-n5cf8" May 14 00:00:31.734710 kubelet[3280]: I0514 00:00:31.733483 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/19f6124b-9bfb-4b82-9190-f05cc81702d8-xtables-lock\") pod \"calico-node-n5cf8\" (UID: \"19f6124b-9bfb-4b82-9190-f05cc81702d8\") " pod="calico-system/calico-node-n5cf8" May 14 00:00:31.734710 kubelet[3280]: I0514 00:00:31.733505 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/19f6124b-9bfb-4b82-9190-f05cc81702d8-flexvol-driver-host\") pod \"calico-node-n5cf8\" (UID: \"19f6124b-9bfb-4b82-9190-f05cc81702d8\") " pod="calico-system/calico-node-n5cf8" May 14 00:00:31.734710 
kubelet[3280]: I0514 00:00:31.733528 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/19f6124b-9bfb-4b82-9190-f05cc81702d8-var-run-calico\") pod \"calico-node-n5cf8\" (UID: \"19f6124b-9bfb-4b82-9190-f05cc81702d8\") " pod="calico-system/calico-node-n5cf8" May 14 00:00:31.734710 kubelet[3280]: I0514 00:00:31.733551 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/19f6124b-9bfb-4b82-9190-f05cc81702d8-cni-net-dir\") pod \"calico-node-n5cf8\" (UID: \"19f6124b-9bfb-4b82-9190-f05cc81702d8\") " pod="calico-system/calico-node-n5cf8" May 14 00:00:31.734710 kubelet[3280]: I0514 00:00:31.733591 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/19f6124b-9bfb-4b82-9190-f05cc81702d8-lib-modules\") pod \"calico-node-n5cf8\" (UID: \"19f6124b-9bfb-4b82-9190-f05cc81702d8\") " pod="calico-system/calico-node-n5cf8" May 14 00:00:31.734916 kubelet[3280]: I0514 00:00:31.733613 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/19f6124b-9bfb-4b82-9190-f05cc81702d8-cni-log-dir\") pod \"calico-node-n5cf8\" (UID: \"19f6124b-9bfb-4b82-9190-f05cc81702d8\") " pod="calico-system/calico-node-n5cf8" May 14 00:00:31.734916 kubelet[3280]: I0514 00:00:31.733662 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br9pw\" (UniqueName: \"kubernetes.io/projected/19f6124b-9bfb-4b82-9190-f05cc81702d8-kube-api-access-br9pw\") pod \"calico-node-n5cf8\" (UID: \"19f6124b-9bfb-4b82-9190-f05cc81702d8\") " pod="calico-system/calico-node-n5cf8" May 14 00:00:31.740342 systemd[1]: Created slice 
kubepods-besteffort-pod19f6124b_9bfb_4b82_9190_f05cc81702d8.slice - libcontainer container kubepods-besteffort-pod19f6124b_9bfb_4b82_9190_f05cc81702d8.slice. May 14 00:00:31.835594 kubelet[3280]: E0514 00:00:31.835546 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.835594 kubelet[3280]: W0514 00:00:31.835577 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.835594 kubelet[3280]: E0514 00:00:31.835628 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:31.836324 kubelet[3280]: E0514 00:00:31.836161 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.836324 kubelet[3280]: W0514 00:00:31.836176 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.836324 kubelet[3280]: E0514 00:00:31.836198 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:31.836698 kubelet[3280]: E0514 00:00:31.836416 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.836698 kubelet[3280]: W0514 00:00:31.836438 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.836698 kubelet[3280]: E0514 00:00:31.836466 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:31.837771 kubelet[3280]: E0514 00:00:31.837747 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.837771 kubelet[3280]: W0514 00:00:31.837764 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.838136 kubelet[3280]: E0514 00:00:31.837854 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:31.838497 kubelet[3280]: E0514 00:00:31.838215 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.838497 kubelet[3280]: W0514 00:00:31.838229 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.838497 kubelet[3280]: E0514 00:00:31.838260 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:31.838497 kubelet[3280]: E0514 00:00:31.838453 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.838497 kubelet[3280]: W0514 00:00:31.838463 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.839076 kubelet[3280]: E0514 00:00:31.838615 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:31.839076 kubelet[3280]: E0514 00:00:31.838695 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.839076 kubelet[3280]: W0514 00:00:31.838704 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.839076 kubelet[3280]: E0514 00:00:31.838977 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.839076 kubelet[3280]: W0514 00:00:31.838988 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.839076 kubelet[3280]: E0514 00:00:31.839001 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:31.839352 kubelet[3280]: E0514 00:00:31.839173 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.839352 kubelet[3280]: W0514 00:00:31.839182 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.839352 kubelet[3280]: E0514 00:00:31.839193 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:31.839476 kubelet[3280]: E0514 00:00:31.839441 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.839476 kubelet[3280]: W0514 00:00:31.839451 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.839476 kubelet[3280]: E0514 00:00:31.839464 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:31.840681 kubelet[3280]: E0514 00:00:31.840553 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:31.840681 kubelet[3280]: E0514 00:00:31.840565 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.840681 kubelet[3280]: W0514 00:00:31.840576 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.840681 kubelet[3280]: E0514 00:00:31.840589 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:31.842333 kubelet[3280]: E0514 00:00:31.841459 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.842333 kubelet[3280]: W0514 00:00:31.841472 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.842333 kubelet[3280]: E0514 00:00:31.841509 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:31.842333 kubelet[3280]: E0514 00:00:31.841782 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.842333 kubelet[3280]: W0514 00:00:31.841793 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.842333 kubelet[3280]: E0514 00:00:31.841820 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:31.842858 kubelet[3280]: E0514 00:00:31.842752 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.842858 kubelet[3280]: W0514 00:00:31.842767 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.842858 kubelet[3280]: E0514 00:00:31.842783 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:31.843023 kubelet[3280]: E0514 00:00:31.843011 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.843023 kubelet[3280]: W0514 00:00:31.843021 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.843105 kubelet[3280]: E0514 00:00:31.843037 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:31.843502 kubelet[3280]: E0514 00:00:31.843228 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.843502 kubelet[3280]: W0514 00:00:31.843243 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.843502 kubelet[3280]: E0514 00:00:31.843255 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:31.843502 kubelet[3280]: E0514 00:00:31.843473 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.843502 kubelet[3280]: W0514 00:00:31.843483 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.843502 kubelet[3280]: E0514 00:00:31.843496 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:31.845134 kubelet[3280]: E0514 00:00:31.845112 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.845134 kubelet[3280]: W0514 00:00:31.845131 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.845602 kubelet[3280]: E0514 00:00:31.845146 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:31.845602 kubelet[3280]: E0514 00:00:31.845371 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.845602 kubelet[3280]: W0514 00:00:31.845382 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.845602 kubelet[3280]: E0514 00:00:31.845394 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:31.846560 kubelet[3280]: E0514 00:00:31.846517 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.846560 kubelet[3280]: W0514 00:00:31.846533 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.846560 kubelet[3280]: E0514 00:00:31.846547 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:31.847897 kubelet[3280]: E0514 00:00:31.847869 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.847897 kubelet[3280]: W0514 00:00:31.847885 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.848231 kubelet[3280]: E0514 00:00:31.847899 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:31.850703 kubelet[3280]: E0514 00:00:31.849886 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.850703 kubelet[3280]: W0514 00:00:31.849902 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.850703 kubelet[3280]: E0514 00:00:31.849915 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:31.853388 kubelet[3280]: E0514 00:00:31.853359 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.853388 kubelet[3280]: W0514 00:00:31.853379 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.853567 kubelet[3280]: E0514 00:00:31.853393 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:31.853701 kubelet[3280]: E0514 00:00:31.853684 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.853701 kubelet[3280]: W0514 00:00:31.853702 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.853832 kubelet[3280]: E0514 00:00:31.853715 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:31.856559 kubelet[3280]: E0514 00:00:31.853932 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.856559 kubelet[3280]: W0514 00:00:31.853948 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.856559 kubelet[3280]: E0514 00:00:31.853973 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:31.861514 kubelet[3280]: E0514 00:00:31.861495 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.861514 kubelet[3280]: W0514 00:00:31.861511 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.861663 kubelet[3280]: E0514 00:00:31.861623 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:31.861893 kubelet[3280]: E0514 00:00:31.861874 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.861965 kubelet[3280]: W0514 00:00:31.861894 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.862011 kubelet[3280]: E0514 00:00:31.861982 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:31.862154 kubelet[3280]: E0514 00:00:31.862136 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.862154 kubelet[3280]: W0514 00:00:31.862152 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.862266 kubelet[3280]: E0514 00:00:31.862169 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:31.863381 kubelet[3280]: E0514 00:00:31.862399 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.863381 kubelet[3280]: W0514 00:00:31.862410 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.863381 kubelet[3280]: E0514 00:00:31.862423 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:31.863381 kubelet[3280]: E0514 00:00:31.862626 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.863381 kubelet[3280]: W0514 00:00:31.862636 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.863381 kubelet[3280]: E0514 00:00:31.862647 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:31.863381 kubelet[3280]: E0514 00:00:31.862820 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.863381 kubelet[3280]: W0514 00:00:31.862828 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.863381 kubelet[3280]: E0514 00:00:31.862837 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:31.863381 kubelet[3280]: E0514 00:00:31.862989 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.863802 kubelet[3280]: W0514 00:00:31.862996 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.863802 kubelet[3280]: E0514 00:00:31.863005 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:31.863802 kubelet[3280]: E0514 00:00:31.863152 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.863802 kubelet[3280]: W0514 00:00:31.863159 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.863802 kubelet[3280]: E0514 00:00:31.863168 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:31.863802 kubelet[3280]: E0514 00:00:31.863364 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.863802 kubelet[3280]: W0514 00:00:31.863375 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.863802 kubelet[3280]: E0514 00:00:31.863387 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:31.863802 kubelet[3280]: E0514 00:00:31.863580 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.863802 kubelet[3280]: W0514 00:00:31.863590 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.865623 kubelet[3280]: E0514 00:00:31.863601 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:31.865623 kubelet[3280]: E0514 00:00:31.863767 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.865623 kubelet[3280]: W0514 00:00:31.863777 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.865623 kubelet[3280]: E0514 00:00:31.863787 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:31.865623 kubelet[3280]: E0514 00:00:31.863957 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.865623 kubelet[3280]: W0514 00:00:31.863967 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.865623 kubelet[3280]: E0514 00:00:31.863977 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:31.865623 kubelet[3280]: E0514 00:00:31.864147 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.865623 kubelet[3280]: W0514 00:00:31.864156 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.865623 kubelet[3280]: E0514 00:00:31.864167 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:31.866041 kubelet[3280]: E0514 00:00:31.864364 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.866041 kubelet[3280]: W0514 00:00:31.864373 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.866041 kubelet[3280]: E0514 00:00:31.864386 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:31.866041 kubelet[3280]: E0514 00:00:31.864568 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.866041 kubelet[3280]: W0514 00:00:31.864577 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.866041 kubelet[3280]: E0514 00:00:31.864588 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:31.866041 kubelet[3280]: E0514 00:00:31.864761 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.866041 kubelet[3280]: W0514 00:00:31.864770 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.866041 kubelet[3280]: E0514 00:00:31.864780 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:31.866041 kubelet[3280]: E0514 00:00:31.864954 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.866505 kubelet[3280]: W0514 00:00:31.864963 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.866505 kubelet[3280]: E0514 00:00:31.864975 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:31.866505 kubelet[3280]: E0514 00:00:31.865408 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.866505 kubelet[3280]: W0514 00:00:31.865421 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.866505 kubelet[3280]: E0514 00:00:31.865434 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:31.866505 kubelet[3280]: E0514 00:00:31.865627 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.866505 kubelet[3280]: W0514 00:00:31.865637 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.866505 kubelet[3280]: E0514 00:00:31.865650 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:31.866505 kubelet[3280]: E0514 00:00:31.865831 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.866505 kubelet[3280]: W0514 00:00:31.865840 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.866859 kubelet[3280]: E0514 00:00:31.865850 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:31.866859 kubelet[3280]: E0514 00:00:31.866056 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.866859 kubelet[3280]: W0514 00:00:31.866066 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.866859 kubelet[3280]: E0514 00:00:31.866077 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:31.866859 kubelet[3280]: E0514 00:00:31.866482 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.868841 kubelet[3280]: W0514 00:00:31.866494 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.868841 kubelet[3280]: E0514 00:00:31.867837 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:31.868841 kubelet[3280]: E0514 00:00:31.868503 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.868841 kubelet[3280]: W0514 00:00:31.868515 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.868841 kubelet[3280]: E0514 00:00:31.868529 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:31.868841 kubelet[3280]: E0514 00:00:31.868732 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.868841 kubelet[3280]: W0514 00:00:31.868744 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.868841 kubelet[3280]: E0514 00:00:31.868758 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:31.869252 kubelet[3280]: E0514 00:00:31.868951 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.869252 kubelet[3280]: W0514 00:00:31.868962 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.869252 kubelet[3280]: E0514 00:00:31.868975 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:31.870400 kubelet[3280]: E0514 00:00:31.870378 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.870400 kubelet[3280]: W0514 00:00:31.870398 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.870624 kubelet[3280]: E0514 00:00:31.870412 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:31.871129 kubelet[3280]: E0514 00:00:31.871048 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.871129 kubelet[3280]: W0514 00:00:31.871064 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.871129 kubelet[3280]: E0514 00:00:31.871078 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:31.871835 kubelet[3280]: E0514 00:00:31.871810 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.871835 kubelet[3280]: W0514 00:00:31.871833 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.871976 kubelet[3280]: E0514 00:00:31.871857 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:31.873321 kubelet[3280]: E0514 00:00:31.872044 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.873321 kubelet[3280]: W0514 00:00:31.872058 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.873321 kubelet[3280]: E0514 00:00:31.872070 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:31.873321 kubelet[3280]: E0514 00:00:31.872415 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:31.873321 kubelet[3280]: W0514 00:00:31.872426 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:31.873321 kubelet[3280]: E0514 00:00:31.872440 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
May 14 00:00:31.873321 kubelet[3280]: E0514 00:00:31.872884 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 14 00:00:31.873321 kubelet[3280]: W0514 00:00:31.872895 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 14 00:00:31.873321 kubelet[3280]: E0514 00:00:31.872909 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
[The three kubelet messages above (driver-call.go:262, driver-call.go:149, plugins.go:691) repeat continuously with new timestamps through 00:00:32; the repeats are elided here. Distinct interleaved entries follow.]
May 14 00:00:31.883531 kubelet[3280]: E0514 00:00:31.883493 3280 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vj6mt" podUID="de27c4f6-c0d2-40b8-bd37-0674db8e9821"
May 14 00:00:31.904577 containerd[1735]: time="2025-05-14T00:00:31.904536180Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-67f754989f-vbzhk,Uid:b82f7689-fc25-493b-8c14-8dbc58f28401,Namespace:calico-system,Attempt:0,}"
May 14 00:00:31.990930 containerd[1735]: time="2025-05-14T00:00:31.990878262Z" level=info msg="connecting to shim 97c1c37e695cf47e280304bba2bebd31ef415c813a11308a6d92f15566168c64" address="unix:///run/containerd/s/10300fe148435f43a8c54d33a8921b2644bddaab7fe0c9c708d86f19ead1fabb" namespace=k8s.io protocol=ttrpc version=3
May 14 00:00:32.040761 kubelet[3280]: I0514 00:00:32.040626 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/de27c4f6-c0d2-40b8-bd37-0674db8e9821-varrun\") pod \"csi-node-driver-vj6mt\" (UID: \"de27c4f6-c0d2-40b8-bd37-0674db8e9821\") " pod="calico-system/csi-node-driver-vj6mt"
May 14 00:00:32.042861 systemd[1]: Started cri-containerd-97c1c37e695cf47e280304bba2bebd31ef415c813a11308a6d92f15566168c64.scope - libcontainer container 97c1c37e695cf47e280304bba2bebd31ef415c813a11308a6d92f15566168c64.
May 14 00:00:32.043543 kubelet[3280]: E0514 00:00:32.043272 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.043543 kubelet[3280]: W0514 00:00:32.043287 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.043543 kubelet[3280]: E0514 00:00:32.043322 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:32.045540 kubelet[3280]: E0514 00:00:32.045127 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.045540 kubelet[3280]: W0514 00:00:32.045143 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.045540 kubelet[3280]: E0514 00:00:32.045205 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:32.046195 kubelet[3280]: E0514 00:00:32.045918 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.046195 kubelet[3280]: W0514 00:00:32.045930 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.046195 kubelet[3280]: E0514 00:00:32.046075 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:32.046828 kubelet[3280]: E0514 00:00:32.046730 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.046828 kubelet[3280]: W0514 00:00:32.046759 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.046828 kubelet[3280]: E0514 00:00:32.046774 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:32.047139 kubelet[3280]: I0514 00:00:32.047005 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcqdr\" (UniqueName: \"kubernetes.io/projected/de27c4f6-c0d2-40b8-bd37-0674db8e9821-kube-api-access-vcqdr\") pod \"csi-node-driver-vj6mt\" (UID: \"de27c4f6-c0d2-40b8-bd37-0674db8e9821\") " pod="calico-system/csi-node-driver-vj6mt" May 14 00:00:32.047436 kubelet[3280]: E0514 00:00:32.047318 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.047436 kubelet[3280]: W0514 00:00:32.047334 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.047436 kubelet[3280]: E0514 00:00:32.047347 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:32.048206 kubelet[3280]: E0514 00:00:32.048155 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.048206 kubelet[3280]: W0514 00:00:32.048170 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.048206 kubelet[3280]: E0514 00:00:32.048186 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:32.048596 kubelet[3280]: I0514 00:00:32.048345 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de27c4f6-c0d2-40b8-bd37-0674db8e9821-kubelet-dir\") pod \"csi-node-driver-vj6mt\" (UID: \"de27c4f6-c0d2-40b8-bd37-0674db8e9821\") " pod="calico-system/csi-node-driver-vj6mt" May 14 00:00:32.049759 kubelet[3280]: E0514 00:00:32.049543 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.049759 kubelet[3280]: W0514 00:00:32.049686 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.049759 kubelet[3280]: E0514 00:00:32.049712 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:32.049759 kubelet[3280]: I0514 00:00:32.049736 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/de27c4f6-c0d2-40b8-bd37-0674db8e9821-socket-dir\") pod \"csi-node-driver-vj6mt\" (UID: \"de27c4f6-c0d2-40b8-bd37-0674db8e9821\") " pod="calico-system/csi-node-driver-vj6mt" May 14 00:00:32.050317 kubelet[3280]: E0514 00:00:32.050228 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.050317 kubelet[3280]: W0514 00:00:32.050244 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.050317 kubelet[3280]: E0514 00:00:32.050261 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:32.051721 containerd[1735]: time="2025-05-14T00:00:32.050638003Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-n5cf8,Uid:19f6124b-9bfb-4b82-9190-f05cc81702d8,Namespace:calico-system,Attempt:0,}" May 14 00:00:32.051902 kubelet[3280]: E0514 00:00:32.051349 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.051902 kubelet[3280]: W0514 00:00:32.051362 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.051902 kubelet[3280]: E0514 00:00:32.051474 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:32.052177 kubelet[3280]: E0514 00:00:32.051967 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.052177 kubelet[3280]: W0514 00:00:32.051981 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.052313 kubelet[3280]: E0514 00:00:32.052280 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:32.052505 kubelet[3280]: I0514 00:00:32.052398 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/de27c4f6-c0d2-40b8-bd37-0674db8e9821-registration-dir\") pod \"csi-node-driver-vj6mt\" (UID: \"de27c4f6-c0d2-40b8-bd37-0674db8e9821\") " pod="calico-system/csi-node-driver-vj6mt" May 14 00:00:32.052739 kubelet[3280]: E0514 00:00:32.052725 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.052927 kubelet[3280]: W0514 00:00:32.052822 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.052927 kubelet[3280]: E0514 00:00:32.052847 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:32.053399 kubelet[3280]: E0514 00:00:32.053384 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.053596 kubelet[3280]: W0514 00:00:32.053510 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.053596 kubelet[3280]: E0514 00:00:32.053544 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:32.054147 kubelet[3280]: E0514 00:00:32.053997 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.054147 kubelet[3280]: W0514 00:00:32.054011 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.054147 kubelet[3280]: E0514 00:00:32.054045 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:32.054612 kubelet[3280]: E0514 00:00:32.054554 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.054612 kubelet[3280]: W0514 00:00:32.054571 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.054612 kubelet[3280]: E0514 00:00:32.054585 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:32.118810 containerd[1735]: time="2025-05-14T00:00:32.118349116Z" level=info msg="connecting to shim 9c4d46a929bed3365bc6c22c1ecbc7c7ac810bfda3640228c7cca2edc428a7e1" address="unix:///run/containerd/s/9ab1047aaf5fc9d6f3cc1618c01d11ecb947c19f84b21440a6e63d3f4761509a" namespace=k8s.io protocol=ttrpc version=3 May 14 00:00:32.139875 containerd[1735]: time="2025-05-14T00:00:32.139836210Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-67f754989f-vbzhk,Uid:b82f7689-fc25-493b-8c14-8dbc58f28401,Namespace:calico-system,Attempt:0,} returns sandbox id \"97c1c37e695cf47e280304bba2bebd31ef415c813a11308a6d92f15566168c64\"" May 14 00:00:32.142695 containerd[1735]: time="2025-05-14T00:00:32.142424433Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.3\"" May 14 00:00:32.154237 kubelet[3280]: E0514 00:00:32.154036 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.154237 kubelet[3280]: W0514 00:00:32.154082 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.154237 
kubelet[3280]: E0514 00:00:32.154109 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:32.156450 kubelet[3280]: E0514 00:00:32.155043 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.156450 kubelet[3280]: W0514 00:00:32.155062 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.156450 kubelet[3280]: E0514 00:00:32.155873 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:32.156450 kubelet[3280]: E0514 00:00:32.156146 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.156450 kubelet[3280]: W0514 00:00:32.156160 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.156450 kubelet[3280]: E0514 00:00:32.156175 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:32.156886 kubelet[3280]: E0514 00:00:32.156811 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.156998 kubelet[3280]: W0514 00:00:32.156975 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.157134 kubelet[3280]: E0514 00:00:32.157080 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:32.157977 kubelet[3280]: E0514 00:00:32.157962 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.158187 kubelet[3280]: W0514 00:00:32.158016 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.158187 kubelet[3280]: E0514 00:00:32.158103 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:32.158734 kubelet[3280]: E0514 00:00:32.158577 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.158734 kubelet[3280]: W0514 00:00:32.158592 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.159024 kubelet[3280]: E0514 00:00:32.158883 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:32.159520 kubelet[3280]: E0514 00:00:32.159413 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.159520 kubelet[3280]: W0514 00:00:32.159428 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.159520 kubelet[3280]: E0514 00:00:32.159467 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:32.160209 kubelet[3280]: E0514 00:00:32.159964 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.160209 kubelet[3280]: W0514 00:00:32.159979 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.160209 kubelet[3280]: E0514 00:00:32.160001 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:32.160638 kubelet[3280]: E0514 00:00:32.160499 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.160638 kubelet[3280]: W0514 00:00:32.160514 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.160638 kubelet[3280]: E0514 00:00:32.160565 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:32.161347 kubelet[3280]: E0514 00:00:32.160998 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.161347 kubelet[3280]: W0514 00:00:32.161013 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.161347 kubelet[3280]: E0514 00:00:32.161113 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:32.162907 kubelet[3280]: E0514 00:00:32.162452 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.162907 kubelet[3280]: W0514 00:00:32.162468 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.162907 kubelet[3280]: E0514 00:00:32.162587 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:32.163664 kubelet[3280]: E0514 00:00:32.163644 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.163745 kubelet[3280]: W0514 00:00:32.163666 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.163930 kubelet[3280]: E0514 00:00:32.163911 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:32.164144 kubelet[3280]: E0514 00:00:32.164103 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.164144 kubelet[3280]: W0514 00:00:32.164116 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.165490 kubelet[3280]: E0514 00:00:32.164269 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:32.165490 kubelet[3280]: E0514 00:00:32.164602 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.165490 kubelet[3280]: W0514 00:00:32.164613 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.165490 kubelet[3280]: E0514 00:00:32.164700 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:32.165490 kubelet[3280]: E0514 00:00:32.164886 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.165490 kubelet[3280]: W0514 00:00:32.164895 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.165490 kubelet[3280]: E0514 00:00:32.164969 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:32.165490 kubelet[3280]: E0514 00:00:32.165104 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.165490 kubelet[3280]: W0514 00:00:32.165112 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.165490 kubelet[3280]: E0514 00:00:32.165182 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:32.167160 kubelet[3280]: E0514 00:00:32.165338 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.167160 kubelet[3280]: W0514 00:00:32.165349 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.167160 kubelet[3280]: E0514 00:00:32.165365 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:32.167160 kubelet[3280]: E0514 00:00:32.165566 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.167160 kubelet[3280]: W0514 00:00:32.165915 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.167160 kubelet[3280]: E0514 00:00:32.165933 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:32.167160 kubelet[3280]: E0514 00:00:32.166400 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.167160 kubelet[3280]: W0514 00:00:32.166411 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.167160 kubelet[3280]: E0514 00:00:32.166423 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:32.167160 kubelet[3280]: E0514 00:00:32.166878 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.168054 kubelet[3280]: W0514 00:00:32.166889 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.168054 kubelet[3280]: E0514 00:00:32.166907 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:32.168054 kubelet[3280]: E0514 00:00:32.167124 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.168054 kubelet[3280]: W0514 00:00:32.167135 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.168054 kubelet[3280]: E0514 00:00:32.167149 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:32.168054 kubelet[3280]: E0514 00:00:32.167402 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.168054 kubelet[3280]: W0514 00:00:32.167413 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.168054 kubelet[3280]: E0514 00:00:32.167587 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.168054 kubelet[3280]: W0514 00:00:32.167597 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.168054 kubelet[3280]: E0514 00:00:32.167609 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:32.168436 kubelet[3280]: E0514 00:00:32.167818 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.168436 kubelet[3280]: W0514 00:00:32.167828 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.168436 kubelet[3280]: E0514 00:00:32.167841 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:32.168436 kubelet[3280]: E0514 00:00:32.167861 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:32.168436 kubelet[3280]: E0514 00:00:32.168057 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.168436 kubelet[3280]: W0514 00:00:32.168066 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.168436 kubelet[3280]: E0514 00:00:32.168078 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:32.170900 systemd[1]: Started cri-containerd-9c4d46a929bed3365bc6c22c1ecbc7c7ac810bfda3640228c7cca2edc428a7e1.scope - libcontainer container 9c4d46a929bed3365bc6c22c1ecbc7c7ac810bfda3640228c7cca2edc428a7e1. May 14 00:00:32.181892 kubelet[3280]: E0514 00:00:32.181855 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.181892 kubelet[3280]: W0514 00:00:32.181877 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.182040 kubelet[3280]: E0514 00:00:32.181895 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:32.220183 containerd[1735]: time="2025-05-14T00:00:32.220128037Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-n5cf8,Uid:19f6124b-9bfb-4b82-9190-f05cc81702d8,Namespace:calico-system,Attempt:0,} returns sandbox id \"9c4d46a929bed3365bc6c22c1ecbc7c7ac810bfda3640228c7cca2edc428a7e1\"" May 14 00:00:32.518687 kubelet[3280]: E0514 00:00:32.518652 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.518687 kubelet[3280]: W0514 00:00:32.518681 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.518900 kubelet[3280]: E0514 00:00:32.518707 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:32.519015 kubelet[3280]: E0514 00:00:32.518996 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.519080 kubelet[3280]: W0514 00:00:32.519016 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.519080 kubelet[3280]: E0514 00:00:32.519035 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:32.519831 kubelet[3280]: E0514 00:00:32.519804 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.519831 kubelet[3280]: W0514 00:00:32.519825 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.519982 kubelet[3280]: E0514 00:00:32.519840 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:32.520253 kubelet[3280]: E0514 00:00:32.520222 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.520253 kubelet[3280]: W0514 00:00:32.520241 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.520384 kubelet[3280]: E0514 00:00:32.520255 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:32.520875 kubelet[3280]: E0514 00:00:32.520851 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.521000 kubelet[3280]: W0514 00:00:32.520981 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.521064 kubelet[3280]: E0514 00:00:32.521006 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:32.521627 kubelet[3280]: E0514 00:00:32.521605 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.522269 kubelet[3280]: W0514 00:00:32.522241 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.522361 kubelet[3280]: E0514 00:00:32.522273 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:32.522726 kubelet[3280]: E0514 00:00:32.522533 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.522726 kubelet[3280]: W0514 00:00:32.522560 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.522726 kubelet[3280]: E0514 00:00:32.522578 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:32.522923 kubelet[3280]: E0514 00:00:32.522905 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.522997 kubelet[3280]: W0514 00:00:32.522979 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.523038 kubelet[3280]: E0514 00:00:32.522996 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:32.523845 kubelet[3280]: E0514 00:00:32.523827 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.523929 kubelet[3280]: W0514 00:00:32.523843 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.523929 kubelet[3280]: E0514 00:00:32.523882 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:32.524370 kubelet[3280]: E0514 00:00:32.524348 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.524370 kubelet[3280]: W0514 00:00:32.524367 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.524501 kubelet[3280]: E0514 00:00:32.524384 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:32.524873 kubelet[3280]: E0514 00:00:32.524854 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.524873 kubelet[3280]: W0514 00:00:32.524871 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.524996 kubelet[3280]: E0514 00:00:32.524885 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:32.525577 kubelet[3280]: E0514 00:00:32.525553 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.525577 kubelet[3280]: W0514 00:00:32.525573 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.525714 kubelet[3280]: E0514 00:00:32.525588 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:32.526412 kubelet[3280]: E0514 00:00:32.526389 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.526412 kubelet[3280]: W0514 00:00:32.526408 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.526625 kubelet[3280]: E0514 00:00:32.526422 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:32.527122 kubelet[3280]: E0514 00:00:32.526711 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.527122 kubelet[3280]: W0514 00:00:32.526726 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.527122 kubelet[3280]: E0514 00:00:32.526740 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:32.527698 kubelet[3280]: E0514 00:00:32.527675 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.527787 kubelet[3280]: W0514 00:00:32.527708 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.527787 kubelet[3280]: E0514 00:00:32.527723 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:32.528526 kubelet[3280]: E0514 00:00:32.528507 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.528526 kubelet[3280]: W0514 00:00:32.528524 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.528674 kubelet[3280]: E0514 00:00:32.528537 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:32.528768 kubelet[3280]: E0514 00:00:32.528752 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.528838 kubelet[3280]: W0514 00:00:32.528768 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.528838 kubelet[3280]: E0514 00:00:32.528782 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:32.529731 kubelet[3280]: E0514 00:00:32.529388 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.529731 kubelet[3280]: W0514 00:00:32.529402 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.529731 kubelet[3280]: E0514 00:00:32.529416 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:32.529920 kubelet[3280]: E0514 00:00:32.529902 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.529981 kubelet[3280]: W0514 00:00:32.529922 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.529981 kubelet[3280]: E0514 00:00:32.529936 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:32.530442 kubelet[3280]: E0514 00:00:32.530421 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.530442 kubelet[3280]: W0514 00:00:32.530441 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.530573 kubelet[3280]: E0514 00:00:32.530456 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:32.530749 kubelet[3280]: E0514 00:00:32.530728 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.530749 kubelet[3280]: W0514 00:00:32.530747 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.530867 kubelet[3280]: E0514 00:00:32.530761 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:32.530973 kubelet[3280]: E0514 00:00:32.530956 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.531034 kubelet[3280]: W0514 00:00:32.530973 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.531034 kubelet[3280]: E0514 00:00:32.530986 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:32.531782 kubelet[3280]: E0514 00:00:32.531176 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.531782 kubelet[3280]: W0514 00:00:32.531188 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.531782 kubelet[3280]: E0514 00:00:32.531200 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:32.531782 kubelet[3280]: E0514 00:00:32.531429 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.531782 kubelet[3280]: W0514 00:00:32.531443 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.531782 kubelet[3280]: E0514 00:00:32.531456 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:32.531782 kubelet[3280]: E0514 00:00:32.531646 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:32.531782 kubelet[3280]: W0514 00:00:32.531656 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:32.531782 kubelet[3280]: E0514 00:00:32.531668 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:33.772660 kubelet[3280]: E0514 00:00:33.772612 3280 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vj6mt" podUID="de27c4f6-c0d2-40b8-bd37-0674db8e9821" May 14 00:00:34.957107 containerd[1735]: time="2025-05-14T00:00:34.957046197Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:00:34.959778 containerd[1735]: time="2025-05-14T00:00:34.959707622Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.3: active requests=0, bytes read=30426870" May 14 00:00:34.964327 containerd[1735]: time="2025-05-14T00:00:34.964255664Z" level=info msg="ImageCreate event name:\"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:00:34.969681 containerd[1735]: time="2025-05-14T00:00:34.969631714Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f5516aa6a78f00931d2625f3012dcf2c69d141ce41483b8d59c6ec6330a18620\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:00:34.970187 containerd[1735]: time="2025-05-14T00:00:34.970154719Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.3\" with image id \"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f5516aa6a78f00931d2625f3012dcf2c69d141ce41483b8d59c6ec6330a18620\", size \"31919484\" in 2.827689385s" May 14 00:00:34.970251 containerd[1735]: time="2025-05-14T00:00:34.970193820Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.3\" returns image reference \"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\"" May 14 00:00:34.972358 containerd[1735]: time="2025-05-14T00:00:34.971615533Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\"" May 14 00:00:34.982358 containerd[1735]: time="2025-05-14T00:00:34.981500525Z" level=info msg="CreateContainer within sandbox \"97c1c37e695cf47e280304bba2bebd31ef415c813a11308a6d92f15566168c64\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" May 14 00:00:35.018915 containerd[1735]: time="2025-05-14T00:00:35.018873074Z" level=info msg="Container 9281e0a831f011a15393e39c57b63d7293c96752bfb6ecb532efbc8c5b8d4f6d: CDI devices from CRI Config.CDIDevices: []" May 14 00:00:35.044426 containerd[1735]: time="2025-05-14T00:00:35.044379613Z" level=info msg="CreateContainer within sandbox \"97c1c37e695cf47e280304bba2bebd31ef415c813a11308a6d92f15566168c64\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"9281e0a831f011a15393e39c57b63d7293c96752bfb6ecb532efbc8c5b8d4f6d\"" May 14 00:00:35.044986 containerd[1735]: time="2025-05-14T00:00:35.044779516Z" level=info msg="StartContainer for \"9281e0a831f011a15393e39c57b63d7293c96752bfb6ecb532efbc8c5b8d4f6d\"" May 14 00:00:35.046083 containerd[1735]: time="2025-05-14T00:00:35.046052228Z" level=info 
msg="connecting to shim 9281e0a831f011a15393e39c57b63d7293c96752bfb6ecb532efbc8c5b8d4f6d" address="unix:///run/containerd/s/10300fe148435f43a8c54d33a8921b2644bddaab7fe0c9c708d86f19ead1fabb" protocol=ttrpc version=3 May 14 00:00:35.073499 systemd[1]: Started cri-containerd-9281e0a831f011a15393e39c57b63d7293c96752bfb6ecb532efbc8c5b8d4f6d.scope - libcontainer container 9281e0a831f011a15393e39c57b63d7293c96752bfb6ecb532efbc8c5b8d4f6d. May 14 00:00:35.124533 containerd[1735]: time="2025-05-14T00:00:35.124484961Z" level=info msg="StartContainer for \"9281e0a831f011a15393e39c57b63d7293c96752bfb6ecb532efbc8c5b8d4f6d\" returns successfully" May 14 00:00:35.774127 kubelet[3280]: E0514 00:00:35.772857 3280 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vj6mt" podUID="de27c4f6-c0d2-40b8-bd37-0674db8e9821" May 14 00:00:35.918381 kubelet[3280]: I0514 00:00:35.917866 3280 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-67f754989f-vbzhk" podStartSLOduration=2.088612071 podStartE2EDuration="4.917845571s" podCreationTimestamp="2025-05-14 00:00:31 +0000 UTC" firstStartedPulling="2025-05-14 00:00:32.141858728 +0000 UTC m=+11.296818224" lastFinishedPulling="2025-05-14 00:00:34.971092228 +0000 UTC m=+14.126051724" observedRunningTime="2025-05-14 00:00:35.917076763 +0000 UTC m=+15.072036159" watchObservedRunningTime="2025-05-14 00:00:35.917845571 +0000 UTC m=+15.072804967" May 14 00:00:35.954345 kubelet[3280]: E0514 00:00:35.954161 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:35.954345 kubelet[3280]: W0514 00:00:35.954187 3280 driver-call.go:149] FlexVolume: driver call failed: executable: 
/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:35.954345 kubelet[3280]: E0514 00:00:35.954213 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:35.954927 kubelet[3280]: E0514 00:00:35.954802 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:35.954927 kubelet[3280]: W0514 00:00:35.954820 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:35.954927 kubelet[3280]: E0514 00:00:35.954835 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:35.955208 kubelet[3280]: E0514 00:00:35.955057 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:35.955208 kubelet[3280]: W0514 00:00:35.955068 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:35.955492 kubelet[3280]: E0514 00:00:35.955289 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:35.955728 kubelet[3280]: E0514 00:00:35.955608 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:35.955728 kubelet[3280]: W0514 00:00:35.955622 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:35.955728 kubelet[3280]: E0514 00:00:35.955636 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:35.956681 kubelet[3280]: E0514 00:00:35.956456 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:35.956681 kubelet[3280]: W0514 00:00:35.956473 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:35.956681 kubelet[3280]: E0514 00:00:35.956488 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:35.957118 kubelet[3280]: E0514 00:00:35.956961 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:35.957118 kubelet[3280]: W0514 00:00:35.956978 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:35.957118 kubelet[3280]: E0514 00:00:35.957019 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:35.957624 kubelet[3280]: E0514 00:00:35.957503 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:35.957624 kubelet[3280]: W0514 00:00:35.957519 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:35.957624 kubelet[3280]: E0514 00:00:35.957533 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:35.958011 kubelet[3280]: E0514 00:00:35.957875 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:35.958011 kubelet[3280]: W0514 00:00:35.957889 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:35.958011 kubelet[3280]: E0514 00:00:35.957906 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:35.958672 kubelet[3280]: E0514 00:00:35.958559 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:35.958672 kubelet[3280]: W0514 00:00:35.958573 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:35.958672 kubelet[3280]: E0514 00:00:35.958588 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:35.958892 kubelet[3280]: E0514 00:00:35.958880 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:35.958969 kubelet[3280]: W0514 00:00:35.958956 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:35.959173 kubelet[3280]: E0514 00:00:35.959069 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:35.959554 kubelet[3280]: E0514 00:00:35.959472 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:35.959554 kubelet[3280]: W0514 00:00:35.959487 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:35.959554 kubelet[3280]: E0514 00:00:35.959501 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:35.959888 kubelet[3280]: E0514 00:00:35.959875 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:35.960060 kubelet[3280]: W0514 00:00:35.959959 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:35.960060 kubelet[3280]: E0514 00:00:35.959979 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:35.960520 kubelet[3280]: E0514 00:00:35.960404 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:35.960520 kubelet[3280]: W0514 00:00:35.960419 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:35.960520 kubelet[3280]: E0514 00:00:35.960434 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:35.961020 kubelet[3280]: E0514 00:00:35.960821 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:35.961020 kubelet[3280]: W0514 00:00:35.960836 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:35.961020 kubelet[3280]: E0514 00:00:35.960850 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:35.961490 kubelet[3280]: E0514 00:00:35.961217 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:35.961490 kubelet[3280]: W0514 00:00:35.961232 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:35.961490 kubelet[3280]: E0514 00:00:35.961247 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:36.905667 kubelet[3280]: I0514 00:00:36.905632 3280 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 14 00:00:36.968993 kubelet[3280]: E0514 00:00:36.968956 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:36.968993 kubelet[3280]: W0514 00:00:36.968983 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:36.968993 kubelet[3280]: E0514 00:00:36.969011 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:36.969419 kubelet[3280]: E0514 00:00:36.969244 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:36.969419 kubelet[3280]: W0514 00:00:36.969256 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:36.969419 kubelet[3280]: E0514 00:00:36.969276 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:00:37.008747 kubelet[3280]: E0514 00:00:37.008729 3280 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:00:37.008747 kubelet[3280]: W0514 00:00:37.008743 3280 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:00:37.008828 kubelet[3280]: E0514 00:00:37.008755 3280 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:00:37.319061 containerd[1735]: time="2025-05-14T00:00:37.319018658Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:00:37.322003 containerd[1735]: time="2025-05-14T00:00:37.321920285Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3: active requests=0, bytes read=5366937" May 14 00:00:37.327625 containerd[1735]: time="2025-05-14T00:00:37.327556637Z" level=info msg="ImageCreate event name:\"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:00:37.333126 containerd[1735]: time="2025-05-14T00:00:37.333045689Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:eeaa2bb4f9b1aa61adde43ce6dea95eee89291f96963548e108d9a2dfbc5edd1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:00:37.334560 containerd[1735]: time="2025-05-14T00:00:37.334364101Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" with image id \"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:eeaa2bb4f9b1aa61adde43ce6dea95eee89291f96963548e108d9a2dfbc5edd1\", size \"6859519\" in 2.362710568s" May 14 00:00:37.334560 containerd[1735]: time="2025-05-14T00:00:37.334410201Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" returns image reference \"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\"" May 14 00:00:37.337800 containerd[1735]: time="2025-05-14T00:00:37.337624731Z" level=info msg="CreateContainer within sandbox \"9c4d46a929bed3365bc6c22c1ecbc7c7ac810bfda3640228c7cca2edc428a7e1\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" May 14 00:00:37.369677 containerd[1735]: time="2025-05-14T00:00:37.367666912Z" level=info msg="Container e6103a4177136df6e5b5c7e5be6e2a47882b3d5166760b4e1a8de68a01d56ded: CDI devices from CRI Config.CDIDevices: []" May 14 00:00:37.388246 containerd[1735]: time="2025-05-14T00:00:37.388200504Z" level=info msg="CreateContainer within sandbox \"9c4d46a929bed3365bc6c22c1ecbc7c7ac810bfda3640228c7cca2edc428a7e1\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"e6103a4177136df6e5b5c7e5be6e2a47882b3d5166760b4e1a8de68a01d56ded\"" May 14 00:00:37.389900 containerd[1735]: time="2025-05-14T00:00:37.388800109Z" level=info msg="StartContainer for \"e6103a4177136df6e5b5c7e5be6e2a47882b3d5166760b4e1a8de68a01d56ded\"" May 14 00:00:37.390582 containerd[1735]: time="2025-05-14T00:00:37.390550526Z" level=info msg="connecting to shim e6103a4177136df6e5b5c7e5be6e2a47882b3d5166760b4e1a8de68a01d56ded" address="unix:///run/containerd/s/9ab1047aaf5fc9d6f3cc1618c01d11ecb947c19f84b21440a6e63d3f4761509a" protocol=ttrpc version=3 May 14 00:00:37.418455 systemd[1]: Started cri-containerd-e6103a4177136df6e5b5c7e5be6e2a47882b3d5166760b4e1a8de68a01d56ded.scope - libcontainer container 
e6103a4177136df6e5b5c7e5be6e2a47882b3d5166760b4e1a8de68a01d56ded. May 14 00:00:37.465778 containerd[1735]: time="2025-05-14T00:00:37.465731828Z" level=info msg="StartContainer for \"e6103a4177136df6e5b5c7e5be6e2a47882b3d5166760b4e1a8de68a01d56ded\" returns successfully" May 14 00:00:37.470642 systemd[1]: cri-containerd-e6103a4177136df6e5b5c7e5be6e2a47882b3d5166760b4e1a8de68a01d56ded.scope: Deactivated successfully. May 14 00:00:37.473151 containerd[1735]: time="2025-05-14T00:00:37.473105897Z" level=info msg="received exit event container_id:\"e6103a4177136df6e5b5c7e5be6e2a47882b3d5166760b4e1a8de68a01d56ded\" id:\"e6103a4177136df6e5b5c7e5be6e2a47882b3d5166760b4e1a8de68a01d56ded\" pid:4066 exited_at:{seconds:1747180837 nanos:472666593}" May 14 00:00:37.473551 containerd[1735]: time="2025-05-14T00:00:37.473508500Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e6103a4177136df6e5b5c7e5be6e2a47882b3d5166760b4e1a8de68a01d56ded\" id:\"e6103a4177136df6e5b5c7e5be6e2a47882b3d5166760b4e1a8de68a01d56ded\" pid:4066 exited_at:{seconds:1747180837 nanos:472666593}" May 14 00:00:37.504762 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e6103a4177136df6e5b5c7e5be6e2a47882b3d5166760b4e1a8de68a01d56ded-rootfs.mount: Deactivated successfully. 
May 14 00:00:37.773502 kubelet[3280]: E0514 00:00:37.773001 3280 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vj6mt" podUID="de27c4f6-c0d2-40b8-bd37-0674db8e9821" May 14 00:00:39.773677 kubelet[3280]: E0514 00:00:39.773106 3280 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vj6mt" podUID="de27c4f6-c0d2-40b8-bd37-0674db8e9821" May 14 00:00:39.918878 containerd[1735]: time="2025-05-14T00:00:39.918548937Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.3\"" May 14 00:00:41.774332 kubelet[3280]: E0514 00:00:41.773291 3280 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vj6mt" podUID="de27c4f6-c0d2-40b8-bd37-0674db8e9821" May 14 00:00:43.774280 kubelet[3280]: E0514 00:00:43.772950 3280 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vj6mt" podUID="de27c4f6-c0d2-40b8-bd37-0674db8e9821" May 14 00:00:45.772853 kubelet[3280]: E0514 00:00:45.772399 3280 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" 
pod="calico-system/csi-node-driver-vj6mt" podUID="de27c4f6-c0d2-40b8-bd37-0674db8e9821" May 14 00:00:47.774101 kubelet[3280]: E0514 00:00:47.773056 3280 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vj6mt" podUID="de27c4f6-c0d2-40b8-bd37-0674db8e9821" May 14 00:00:49.230908 kubelet[3280]: I0514 00:00:49.230555 3280 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 14 00:00:49.585051 containerd[1735]: time="2025-05-14T00:00:49.584895123Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:00:49.629150 containerd[1735]: time="2025-05-14T00:00:49.628986735Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.3: active requests=0, bytes read=97793683" May 14 00:00:49.680925 containerd[1735]: time="2025-05-14T00:00:49.680833420Z" level=info msg="ImageCreate event name:\"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:00:49.739327 containerd[1735]: time="2025-05-14T00:00:49.738108056Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:4505ec8f976470994b6a94295a4dabac0cb98375db050e959a22603e00ada90b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:00:49.739327 containerd[1735]: time="2025-05-14T00:00:49.739271067Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.3\" with image id \"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:4505ec8f976470994b6a94295a4dabac0cb98375db050e959a22603e00ada90b\", size \"99286305\" in 9.82067173s" May 14 
00:00:49.739327 containerd[1735]: time="2025-05-14T00:00:49.739327068Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.3\" returns image reference \"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\"" May 14 00:00:49.743640 containerd[1735]: time="2025-05-14T00:00:49.743602908Z" level=info msg="CreateContainer within sandbox \"9c4d46a929bed3365bc6c22c1ecbc7c7ac810bfda3640228c7cca2edc428a7e1\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" May 14 00:00:49.772491 kubelet[3280]: E0514 00:00:49.772446 3280 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vj6mt" podUID="de27c4f6-c0d2-40b8-bd37-0674db8e9821" May 14 00:00:49.888569 containerd[1735]: time="2025-05-14T00:00:49.888446963Z" level=info msg="Container dae324f7ac4d10e1ad7e057d5da6efe79b7a51d9728f64d6e8f90ef8057f3ce5: CDI devices from CRI Config.CDIDevices: []" May 14 00:00:50.045703 containerd[1735]: time="2025-05-14T00:00:50.045655933Z" level=info msg="CreateContainer within sandbox \"9c4d46a929bed3365bc6c22c1ecbc7c7ac810bfda3640228c7cca2edc428a7e1\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"dae324f7ac4d10e1ad7e057d5da6efe79b7a51d9728f64d6e8f90ef8057f3ce5\"" May 14 00:00:50.047637 containerd[1735]: time="2025-05-14T00:00:50.046177638Z" level=info msg="StartContainer for \"dae324f7ac4d10e1ad7e057d5da6efe79b7a51d9728f64d6e8f90ef8057f3ce5\"" May 14 00:00:50.048160 containerd[1735]: time="2025-05-14T00:00:50.048103256Z" level=info msg="connecting to shim dae324f7ac4d10e1ad7e057d5da6efe79b7a51d9728f64d6e8f90ef8057f3ce5" address="unix:///run/containerd/s/9ab1047aaf5fc9d6f3cc1618c01d11ecb947c19f84b21440a6e63d3f4761509a" protocol=ttrpc version=3 May 14 00:00:50.073473 systemd[1]: Started 
cri-containerd-dae324f7ac4d10e1ad7e057d5da6efe79b7a51d9728f64d6e8f90ef8057f3ce5.scope - libcontainer container dae324f7ac4d10e1ad7e057d5da6efe79b7a51d9728f64d6e8f90ef8057f3ce5. May 14 00:00:50.114028 containerd[1735]: time="2025-05-14T00:00:50.113905172Z" level=info msg="StartContainer for \"dae324f7ac4d10e1ad7e057d5da6efe79b7a51d9728f64d6e8f90ef8057f3ce5\" returns successfully" May 14 00:00:51.773561 kubelet[3280]: E0514 00:00:51.773006 3280 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vj6mt" podUID="de27c4f6-c0d2-40b8-bd37-0674db8e9821" May 14 00:00:53.773177 kubelet[3280]: E0514 00:00:53.772671 3280 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vj6mt" podUID="de27c4f6-c0d2-40b8-bd37-0674db8e9821" May 14 00:00:55.772413 kubelet[3280]: E0514 00:00:55.772138 3280 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vj6mt" podUID="de27c4f6-c0d2-40b8-bd37-0674db8e9821" May 14 00:00:56.330313 systemd[1]: cri-containerd-dae324f7ac4d10e1ad7e057d5da6efe79b7a51d9728f64d6e8f90ef8057f3ce5.scope: Deactivated successfully. May 14 00:00:56.330658 systemd[1]: cri-containerd-dae324f7ac4d10e1ad7e057d5da6efe79b7a51d9728f64d6e8f90ef8057f3ce5.scope: Consumed 465ms CPU time, 173.3M memory peak, 154M written to disk. 
May 14 00:00:56.332327 containerd[1735]: time="2025-05-14T00:00:56.332064244Z" level=info msg="received exit event container_id:\"dae324f7ac4d10e1ad7e057d5da6efe79b7a51d9728f64d6e8f90ef8057f3ce5\" id:\"dae324f7ac4d10e1ad7e057d5da6efe79b7a51d9728f64d6e8f90ef8057f3ce5\" pid:4127 exited_at:{seconds:1747180856 nanos:330677631}" May 14 00:00:56.332907 containerd[1735]: time="2025-05-14T00:00:56.332641149Z" level=info msg="TaskExit event in podsandbox handler container_id:\"dae324f7ac4d10e1ad7e057d5da6efe79b7a51d9728f64d6e8f90ef8057f3ce5\" id:\"dae324f7ac4d10e1ad7e057d5da6efe79b7a51d9728f64d6e8f90ef8057f3ce5\" pid:4127 exited_at:{seconds:1747180856 nanos:330677631}" May 14 00:00:56.332956 kubelet[3280]: I0514 00:00:56.332828 3280 kubelet_node_status.go:488] "Fast updating node status as it just became ready" May 14 00:00:56.367591 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-dae324f7ac4d10e1ad7e057d5da6efe79b7a51d9728f64d6e8f90ef8057f3ce5-rootfs.mount: Deactivated successfully. May 14 00:00:56.390778 systemd[1]: Created slice kubepods-burstable-podeebab07c_a2e0_4962_9267_45fd956a2e27.slice - libcontainer container kubepods-burstable-podeebab07c_a2e0_4962_9267_45fd956a2e27.slice. May 14 00:00:56.405126 systemd[1]: Created slice kubepods-besteffort-pod6fe88591_92ff_4491_90ae_c32f13e57639.slice - libcontainer container kubepods-besteffort-pod6fe88591_92ff_4491_90ae_c32f13e57639.slice. 
May 14 00:00:56.888570 kubelet[3280]: W0514 00:00:56.396702 3280 reflector.go:561] object-"kube-system"/"coredns": failed to list *v1.ConfigMap: configmaps "coredns" is forbidden: User "system:node:ci-4284.0.0-n-c527831f7b" cannot list resource "configmaps" in API group "" in the namespace "kube-system": no relationship found between node 'ci-4284.0.0-n-c527831f7b' and this object May 14 00:00:56.888570 kubelet[3280]: E0514 00:00:56.396738 3280 reflector.go:158] "Unhandled Error" err="object-\"kube-system\"/\"coredns\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"coredns\" is forbidden: User \"system:node:ci-4284.0.0-n-c527831f7b\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'ci-4284.0.0-n-c527831f7b' and this object" logger="UnhandledError" May 14 00:00:56.888570 kubelet[3280]: I0514 00:00:56.446039 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fxtg\" (UniqueName: \"kubernetes.io/projected/abd4dce2-aa29-4536-9c3c-73cb1b731afc-kube-api-access-6fxtg\") pod \"calico-apiserver-6d47754b8-jnbd2\" (UID: \"abd4dce2-aa29-4536-9c3c-73cb1b731afc\") " pod="calico-apiserver/calico-apiserver-6d47754b8-jnbd2" May 14 00:00:56.888570 kubelet[3280]: I0514 00:00:56.446076 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/318fe614-9b16-49bc-840b-cb2287a86e40-config-volume\") pod \"coredns-6f6b679f8f-zl6h5\" (UID: \"318fe614-9b16-49bc-840b-cb2287a86e40\") " pod="kube-system/coredns-6f6b679f8f-zl6h5" May 14 00:00:56.888570 kubelet[3280]: I0514 00:00:56.446103 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lmfn\" (UniqueName: \"kubernetes.io/projected/6386bb04-f08c-46cd-8bd4-cc9e0fdf6662-kube-api-access-4lmfn\") pod 
\"calico-apiserver-6d47754b8-l5pcr\" (UID: \"6386bb04-f08c-46cd-8bd4-cc9e0fdf6662\") " pod="calico-apiserver/calico-apiserver-6d47754b8-l5pcr" May 14 00:00:56.418812 systemd[1]: Created slice kubepods-besteffort-podabd4dce2_aa29_4536_9c3c_73cb1b731afc.slice - libcontainer container kubepods-besteffort-podabd4dce2_aa29_4536_9c3c_73cb1b731afc.slice. May 14 00:00:56.889202 kubelet[3280]: I0514 00:00:56.446128 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/abd4dce2-aa29-4536-9c3c-73cb1b731afc-calico-apiserver-certs\") pod \"calico-apiserver-6d47754b8-jnbd2\" (UID: \"abd4dce2-aa29-4536-9c3c-73cb1b731afc\") " pod="calico-apiserver/calico-apiserver-6d47754b8-jnbd2" May 14 00:00:56.889202 kubelet[3280]: I0514 00:00:56.446162 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhhd4\" (UniqueName: \"kubernetes.io/projected/eebab07c-a2e0-4962-9267-45fd956a2e27-kube-api-access-qhhd4\") pod \"coredns-6f6b679f8f-2dhjc\" (UID: \"eebab07c-a2e0-4962-9267-45fd956a2e27\") " pod="kube-system/coredns-6f6b679f8f-2dhjc" May 14 00:00:56.889202 kubelet[3280]: I0514 00:00:56.446186 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54qx4\" (UniqueName: \"kubernetes.io/projected/6fe88591-92ff-4491-90ae-c32f13e57639-kube-api-access-54qx4\") pod \"calico-kube-controllers-b8c5bc6b6-6m9m2\" (UID: \"6fe88591-92ff-4491-90ae-c32f13e57639\") " pod="calico-system/calico-kube-controllers-b8c5bc6b6-6m9m2" May 14 00:00:56.889202 kubelet[3280]: I0514 00:00:56.446209 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fe88591-92ff-4491-90ae-c32f13e57639-tigera-ca-bundle\") pod \"calico-kube-controllers-b8c5bc6b6-6m9m2\" (UID: 
\"6fe88591-92ff-4491-90ae-c32f13e57639\") " pod="calico-system/calico-kube-controllers-b8c5bc6b6-6m9m2" May 14 00:00:56.889202 kubelet[3280]: I0514 00:00:56.446231 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzggj\" (UniqueName: \"kubernetes.io/projected/318fe614-9b16-49bc-840b-cb2287a86e40-kube-api-access-zzggj\") pod \"coredns-6f6b679f8f-zl6h5\" (UID: \"318fe614-9b16-49bc-840b-cb2287a86e40\") " pod="kube-system/coredns-6f6b679f8f-zl6h5" May 14 00:00:56.426383 systemd[1]: Created slice kubepods-burstable-pod318fe614_9b16_49bc_840b_cb2287a86e40.slice - libcontainer container kubepods-burstable-pod318fe614_9b16_49bc_840b_cb2287a86e40.slice. May 14 00:00:56.889530 kubelet[3280]: I0514 00:00:56.446257 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eebab07c-a2e0-4962-9267-45fd956a2e27-config-volume\") pod \"coredns-6f6b679f8f-2dhjc\" (UID: \"eebab07c-a2e0-4962-9267-45fd956a2e27\") " pod="kube-system/coredns-6f6b679f8f-2dhjc" May 14 00:00:56.889530 kubelet[3280]: I0514 00:00:56.446279 3280 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/6386bb04-f08c-46cd-8bd4-cc9e0fdf6662-calico-apiserver-certs\") pod \"calico-apiserver-6d47754b8-l5pcr\" (UID: \"6386bb04-f08c-46cd-8bd4-cc9e0fdf6662\") " pod="calico-apiserver/calico-apiserver-6d47754b8-l5pcr" May 14 00:00:56.437940 systemd[1]: Created slice kubepods-besteffort-pod6386bb04_f08c_46cd_8bd4_cc9e0fdf6662.slice - libcontainer container kubepods-besteffort-pod6386bb04_f08c_46cd_8bd4_cc9e0fdf6662.slice. 
May 14 00:00:57.193290 containerd[1735]: time="2025-05-14T00:00:57.193246272Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6d47754b8-l5pcr,Uid:6386bb04-f08c-46cd-8bd4-cc9e0fdf6662,Namespace:calico-apiserver,Attempt:0,}" May 14 00:00:57.193668 containerd[1735]: time="2025-05-14T00:00:57.193246272Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6d47754b8-jnbd2,Uid:abd4dce2-aa29-4536-9c3c-73cb1b731afc,Namespace:calico-apiserver,Attempt:0,}" May 14 00:00:57.213078 containerd[1735]: time="2025-05-14T00:00:57.213033856Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-b8c5bc6b6-6m9m2,Uid:6fe88591-92ff-4491-90ae-c32f13e57639,Namespace:calico-system,Attempt:0,}" May 14 00:00:57.547860 kubelet[3280]: E0514 00:00:57.547728 3280 configmap.go:193] Couldn't get configMap kube-system/coredns: failed to sync configmap cache: timed out waiting for the condition May 14 00:00:57.547860 kubelet[3280]: E0514 00:00:57.547836 3280 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eebab07c-a2e0-4962-9267-45fd956a2e27-config-volume podName:eebab07c-a2e0-4962-9267-45fd956a2e27 nodeName:}" failed. No retries permitted until 2025-05-14 00:00:58.047815077 +0000 UTC m=+37.202774573 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/eebab07c-a2e0-4962-9267-45fd956a2e27-config-volume") pod "coredns-6f6b679f8f-2dhjc" (UID: "eebab07c-a2e0-4962-9267-45fd956a2e27") : failed to sync configmap cache: timed out waiting for the condition May 14 00:00:57.548416 kubelet[3280]: E0514 00:00:57.548374 3280 configmap.go:193] Couldn't get configMap kube-system/coredns: failed to sync configmap cache: timed out waiting for the condition May 14 00:00:57.548533 kubelet[3280]: E0514 00:00:57.548448 3280 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/318fe614-9b16-49bc-840b-cb2287a86e40-config-volume podName:318fe614-9b16-49bc-840b-cb2287a86e40 nodeName:}" failed. No retries permitted until 2025-05-14 00:00:58.048431383 +0000 UTC m=+37.203390779 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/318fe614-9b16-49bc-840b-cb2287a86e40-config-volume") pod "coredns-6f6b679f8f-zl6h5" (UID: "318fe614-9b16-49bc-840b-cb2287a86e40") : failed to sync configmap cache: timed out waiting for the condition May 14 00:00:57.779232 systemd[1]: Created slice kubepods-besteffort-podde27c4f6_c0d2_40b8_bd37_0674db8e9821.slice - libcontainer container kubepods-besteffort-podde27c4f6_c0d2_40b8_bd37_0674db8e9821.slice. 
May 14 00:00:57.781561 containerd[1735]: time="2025-05-14T00:00:57.781518555Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vj6mt,Uid:de27c4f6-c0d2-40b8-bd37-0674db8e9821,Namespace:calico-system,Attempt:0,}" May 14 00:00:58.092774 containerd[1735]: time="2025-05-14T00:00:58.092721356Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-2dhjc,Uid:eebab07c-a2e0-4962-9267-45fd956a2e27,Namespace:kube-system,Attempt:0,}" May 14 00:00:58.113652 containerd[1735]: time="2025-05-14T00:00:58.113437449Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-zl6h5,Uid:318fe614-9b16-49bc-840b-cb2287a86e40,Namespace:kube-system,Attempt:0,}" May 14 00:00:58.219322 containerd[1735]: time="2025-05-14T00:00:58.219249136Z" level=error msg="Failed to destroy network for sandbox \"fbe051f15528651800cbf5014454df4ecc6e072dd7fdabcda65db6e932e7bf49\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 00:00:58.225437 containerd[1735]: time="2025-05-14T00:00:58.225076690Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6d47754b8-jnbd2,Uid:abd4dce2-aa29-4536-9c3c-73cb1b731afc,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"fbe051f15528651800cbf5014454df4ecc6e072dd7fdabcda65db6e932e7bf49\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 00:00:58.227353 kubelet[3280]: E0514 00:00:58.227213 3280 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fbe051f15528651800cbf5014454df4ecc6e072dd7fdabcda65db6e932e7bf49\": plugin type=\"calico\" failed (add): 
stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 00:00:58.227764 kubelet[3280]: E0514 00:00:58.227413 3280 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fbe051f15528651800cbf5014454df4ecc6e072dd7fdabcda65db6e932e7bf49\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6d47754b8-jnbd2" May 14 00:00:58.227764 kubelet[3280]: E0514 00:00:58.227463 3280 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fbe051f15528651800cbf5014454df4ecc6e072dd7fdabcda65db6e932e7bf49\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6d47754b8-jnbd2" May 14 00:00:58.227764 kubelet[3280]: E0514 00:00:58.227555 3280 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6d47754b8-jnbd2_calico-apiserver(abd4dce2-aa29-4536-9c3c-73cb1b731afc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6d47754b8-jnbd2_calico-apiserver(abd4dce2-aa29-4536-9c3c-73cb1b731afc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fbe051f15528651800cbf5014454df4ecc6e072dd7fdabcda65db6e932e7bf49\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6d47754b8-jnbd2" podUID="abd4dce2-aa29-4536-9c3c-73cb1b731afc" May 14 
00:00:58.254499 containerd[1735]: time="2025-05-14T00:00:58.254113961Z" level=error msg="Failed to destroy network for sandbox \"650736c14659be23e09fe13ba5e376aa7b2a1bdcd960ae94a05065c95891b173\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 00:00:58.259807 containerd[1735]: time="2025-05-14T00:00:58.259743013Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-b8c5bc6b6-6m9m2,Uid:6fe88591-92ff-4491-90ae-c32f13e57639,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"650736c14659be23e09fe13ba5e376aa7b2a1bdcd960ae94a05065c95891b173\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 00:00:58.261942 kubelet[3280]: E0514 00:00:58.260538 3280 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"650736c14659be23e09fe13ba5e376aa7b2a1bdcd960ae94a05065c95891b173\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 00:00:58.261942 kubelet[3280]: E0514 00:00:58.260751 3280 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"650736c14659be23e09fe13ba5e376aa7b2a1bdcd960ae94a05065c95891b173\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-b8c5bc6b6-6m9m2" May 14 00:00:58.261942 kubelet[3280]: E0514 00:00:58.260784 3280 
kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"650736c14659be23e09fe13ba5e376aa7b2a1bdcd960ae94a05065c95891b173\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-b8c5bc6b6-6m9m2" May 14 00:00:58.262182 kubelet[3280]: E0514 00:00:58.260859 3280 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-b8c5bc6b6-6m9m2_calico-system(6fe88591-92ff-4491-90ae-c32f13e57639)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-b8c5bc6b6-6m9m2_calico-system(6fe88591-92ff-4491-90ae-c32f13e57639)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"650736c14659be23e09fe13ba5e376aa7b2a1bdcd960ae94a05065c95891b173\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-b8c5bc6b6-6m9m2" podUID="6fe88591-92ff-4491-90ae-c32f13e57639" May 14 00:00:58.266232 containerd[1735]: time="2025-05-14T00:00:58.266063472Z" level=error msg="Failed to destroy network for sandbox \"f176a57d5002a74287c9673459f365af999d6089dd8040bd8e2bca72efb98136\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 00:00:58.269915 containerd[1735]: time="2025-05-14T00:00:58.269785407Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6d47754b8-l5pcr,Uid:6386bb04-f08c-46cd-8bd4-cc9e0fdf6662,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to 
setup network for sandbox \"f176a57d5002a74287c9673459f365af999d6089dd8040bd8e2bca72efb98136\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 00:00:58.270237 kubelet[3280]: E0514 00:00:58.270101 3280 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f176a57d5002a74287c9673459f365af999d6089dd8040bd8e2bca72efb98136\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 00:00:58.270352 kubelet[3280]: E0514 00:00:58.270265 3280 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f176a57d5002a74287c9673459f365af999d6089dd8040bd8e2bca72efb98136\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6d47754b8-l5pcr" May 14 00:00:58.270513 kubelet[3280]: E0514 00:00:58.270484 3280 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f176a57d5002a74287c9673459f365af999d6089dd8040bd8e2bca72efb98136\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6d47754b8-l5pcr" May 14 00:00:58.270799 kubelet[3280]: E0514 00:00:58.270671 3280 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6d47754b8-l5pcr_calico-apiserver(6386bb04-f08c-46cd-8bd4-cc9e0fdf6662)\" with CreatePodSandboxError: 
\"Failed to create sandbox for pod \\\"calico-apiserver-6d47754b8-l5pcr_calico-apiserver(6386bb04-f08c-46cd-8bd4-cc9e0fdf6662)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f176a57d5002a74287c9673459f365af999d6089dd8040bd8e2bca72efb98136\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6d47754b8-l5pcr" podUID="6386bb04-f08c-46cd-8bd4-cc9e0fdf6662" May 14 00:00:58.306110 containerd[1735]: time="2025-05-14T00:00:58.306050945Z" level=error msg="Failed to destroy network for sandbox \"054c65219e60f0c643e187da3222210cf719062eca6ad9ff8a6e406061dccb07\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 00:00:58.310803 containerd[1735]: time="2025-05-14T00:00:58.310655588Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vj6mt,Uid:de27c4f6-c0d2-40b8-bd37-0674db8e9821,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"054c65219e60f0c643e187da3222210cf719062eca6ad9ff8a6e406061dccb07\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 00:00:58.312622 kubelet[3280]: E0514 00:00:58.311451 3280 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"054c65219e60f0c643e187da3222210cf719062eca6ad9ff8a6e406061dccb07\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 00:00:58.312622 
kubelet[3280]: E0514 00:00:58.311523 3280 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"054c65219e60f0c643e187da3222210cf719062eca6ad9ff8a6e406061dccb07\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-vj6mt" May 14 00:00:58.312622 kubelet[3280]: E0514 00:00:58.311547 3280 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"054c65219e60f0c643e187da3222210cf719062eca6ad9ff8a6e406061dccb07\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-vj6mt" May 14 00:00:58.312849 kubelet[3280]: E0514 00:00:58.311612 3280 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-vj6mt_calico-system(de27c4f6-c0d2-40b8-bd37-0674db8e9821)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-vj6mt_calico-system(de27c4f6-c0d2-40b8-bd37-0674db8e9821)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"054c65219e60f0c643e187da3222210cf719062eca6ad9ff8a6e406061dccb07\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-vj6mt" podUID="de27c4f6-c0d2-40b8-bd37-0674db8e9821" May 14 00:00:58.313400 containerd[1735]: time="2025-05-14T00:00:58.313240412Z" level=error msg="Failed to destroy network for sandbox \"707f7a91787d9897b98450f1d8fe0ad94d0188649d5f52fda90cc8d77ff0206c\"" error="plugin type=\"calico\" failed 
(delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 00:00:58.318682 containerd[1735]: time="2025-05-14T00:00:58.318637462Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-2dhjc,Uid:eebab07c-a2e0-4962-9267-45fd956a2e27,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"707f7a91787d9897b98450f1d8fe0ad94d0188649d5f52fda90cc8d77ff0206c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 00:00:58.319094 kubelet[3280]: E0514 00:00:58.319053 3280 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"707f7a91787d9897b98450f1d8fe0ad94d0188649d5f52fda90cc8d77ff0206c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 00:00:58.319277 kubelet[3280]: E0514 00:00:58.319253 3280 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"707f7a91787d9897b98450f1d8fe0ad94d0188649d5f52fda90cc8d77ff0206c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-2dhjc" May 14 00:00:58.319552 kubelet[3280]: E0514 00:00:58.319422 3280 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"707f7a91787d9897b98450f1d8fe0ad94d0188649d5f52fda90cc8d77ff0206c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: 
no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-2dhjc" May 14 00:00:58.320315 kubelet[3280]: E0514 00:00:58.319534 3280 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-2dhjc_kube-system(eebab07c-a2e0-4962-9267-45fd956a2e27)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-2dhjc_kube-system(eebab07c-a2e0-4962-9267-45fd956a2e27)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"707f7a91787d9897b98450f1d8fe0ad94d0188649d5f52fda90cc8d77ff0206c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-2dhjc" podUID="eebab07c-a2e0-4962-9267-45fd956a2e27" May 14 00:00:58.321483 containerd[1735]: time="2025-05-14T00:00:58.321441388Z" level=error msg="Failed to destroy network for sandbox \"9324e1e2e1c67700d3f368ac55ae4f23cd1ff72418625969191368ae53e881d1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 00:00:58.325133 containerd[1735]: time="2025-05-14T00:00:58.325092222Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-zl6h5,Uid:318fe614-9b16-49bc-840b-cb2287a86e40,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9324e1e2e1c67700d3f368ac55ae4f23cd1ff72418625969191368ae53e881d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 00:00:58.325316 kubelet[3280]: E0514 00:00:58.325279 3280 log.go:32] 
"RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9324e1e2e1c67700d3f368ac55ae4f23cd1ff72418625969191368ae53e881d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 00:00:58.325396 kubelet[3280]: E0514 00:00:58.325339 3280 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9324e1e2e1c67700d3f368ac55ae4f23cd1ff72418625969191368ae53e881d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-zl6h5" May 14 00:00:58.325396 kubelet[3280]: E0514 00:00:58.325363 3280 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9324e1e2e1c67700d3f368ac55ae4f23cd1ff72418625969191368ae53e881d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-zl6h5" May 14 00:00:58.325489 kubelet[3280]: E0514 00:00:58.325408 3280 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-zl6h5_kube-system(318fe614-9b16-49bc-840b-cb2287a86e40)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-zl6h5_kube-system(318fe614-9b16-49bc-840b-cb2287a86e40)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9324e1e2e1c67700d3f368ac55ae4f23cd1ff72418625969191368ae53e881d1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-zl6h5" podUID="318fe614-9b16-49bc-840b-cb2287a86e40" May 14 00:00:58.966802 containerd[1735]: time="2025-05-14T00:00:58.966443401Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.3\"" May 14 00:00:58.989633 systemd[1]: run-netns-cni\x2d0321c412\x2d5484\x2d82b6\x2d6b57\x2d4e4c3b2e2f3d.mount: Deactivated successfully. May 14 00:00:58.989833 systemd[1]: run-netns-cni\x2db610868f\x2d2c99\x2d16b3\x2dc1a6\x2dfd2bb127faa9.mount: Deactivated successfully. May 14 00:00:58.989912 systemd[1]: run-netns-cni\x2dc9fa1fc1\x2d59d5\x2ddc8e\x2d9270\x2d058e94dba434.mount: Deactivated successfully. May 14 00:00:58.989985 systemd[1]: run-netns-cni\x2d0d1747b0\x2dd4f9\x2ddf45\x2d50a2\x2db950e9234f91.mount: Deactivated successfully. May 14 00:00:58.990070 systemd[1]: run-netns-cni\x2dbc7b485e\x2d327a\x2d5b33\x2d009e\x2de16d90d1f8a2.mount: Deactivated successfully. May 14 00:01:06.726456 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3564461313.mount: Deactivated successfully. 
May 14 00:01:06.829966 containerd[1735]: time="2025-05-14T00:01:06.829899007Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:01:07.077357 containerd[1735]: time="2025-05-14T00:01:07.077019830Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.3: active requests=0, bytes read=144068748" May 14 00:01:07.283243 containerd[1735]: time="2025-05-14T00:01:07.282229660Z" level=info msg="ImageCreate event name:\"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:01:07.332801 containerd[1735]: time="2025-05-14T00:01:07.332570433Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:750e267b4f8217e0ca9e4107228370190d1a2499b72112ad04370ab9b4553916\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:01:07.334139 containerd[1735]: time="2025-05-14T00:01:07.333498342Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.3\" with image id \"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:750e267b4f8217e0ca9e4107228370190d1a2499b72112ad04370ab9b4553916\", size \"144068610\" in 8.367000141s" May 14 00:01:07.334139 containerd[1735]: time="2025-05-14T00:01:07.333543542Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.3\" returns image reference \"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\"" May 14 00:01:07.343800 containerd[1735]: time="2025-05-14T00:01:07.343658037Z" level=info msg="CreateContainer within sandbox \"9c4d46a929bed3365bc6c22c1ecbc7c7ac810bfda3640228c7cca2edc428a7e1\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" May 14 00:01:08.694937 containerd[1735]: time="2025-05-14T00:01:08.692540819Z" level=info msg="Container 
4f4b3c7db80069125d3c110e3c1a0c95e9e2c7cf6a9c323e94eadf8b5e16292b: CDI devices from CRI Config.CDIDevices: []" May 14 00:01:08.773699 containerd[1735]: time="2025-05-14T00:01:08.773632481Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vj6mt,Uid:de27c4f6-c0d2-40b8-bd37-0674db8e9821,Namespace:calico-system,Attempt:0,}" May 14 00:01:08.889320 containerd[1735]: time="2025-05-14T00:01:08.889187067Z" level=info msg="CreateContainer within sandbox \"9c4d46a929bed3365bc6c22c1ecbc7c7ac810bfda3640228c7cca2edc428a7e1\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"4f4b3c7db80069125d3c110e3c1a0c95e9e2c7cf6a9c323e94eadf8b5e16292b\"" May 14 00:01:08.891266 containerd[1735]: time="2025-05-14T00:01:08.891116586Z" level=info msg="StartContainer for \"4f4b3c7db80069125d3c110e3c1a0c95e9e2c7cf6a9c323e94eadf8b5e16292b\"" May 14 00:01:08.896137 containerd[1735]: time="2025-05-14T00:01:08.895804330Z" level=info msg="connecting to shim 4f4b3c7db80069125d3c110e3c1a0c95e9e2c7cf6a9c323e94eadf8b5e16292b" address="unix:///run/containerd/s/9ab1047aaf5fc9d6f3cc1618c01d11ecb947c19f84b21440a6e63d3f4761509a" protocol=ttrpc version=3 May 14 00:01:08.924493 systemd[1]: Started cri-containerd-4f4b3c7db80069125d3c110e3c1a0c95e9e2c7cf6a9c323e94eadf8b5e16292b.scope - libcontainer container 4f4b3c7db80069125d3c110e3c1a0c95e9e2c7cf6a9c323e94eadf8b5e16292b. May 14 00:01:08.969627 containerd[1735]: time="2025-05-14T00:01:08.969349821Z" level=error msg="Failed to destroy network for sandbox \"4805797b57da1189ebefb03f379e0162b28eb41bd0de93678e8293729f1dba51\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 00:01:08.975219 systemd[1]: run-netns-cni\x2d5f656b2e\x2d6612\x2d4625\x2d655e\x2d1ab37a7fc76c.mount: Deactivated successfully. 
May 14 00:01:08.977571 containerd[1735]: time="2025-05-14T00:01:08.977137794Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vj6mt,Uid:de27c4f6-c0d2-40b8-bd37-0674db8e9821,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4805797b57da1189ebefb03f379e0162b28eb41bd0de93678e8293729f1dba51\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 00:01:08.977846 kubelet[3280]: E0514 00:01:08.977405 3280 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4805797b57da1189ebefb03f379e0162b28eb41bd0de93678e8293729f1dba51\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 00:01:08.977846 kubelet[3280]: E0514 00:01:08.977661 3280 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4805797b57da1189ebefb03f379e0162b28eb41bd0de93678e8293729f1dba51\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-vj6mt" May 14 00:01:08.977846 kubelet[3280]: E0514 00:01:08.977693 3280 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4805797b57da1189ebefb03f379e0162b28eb41bd0de93678e8293729f1dba51\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-vj6mt" 
May 14 00:01:08.978899 kubelet[3280]: E0514 00:01:08.978355 3280 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-vj6mt_calico-system(de27c4f6-c0d2-40b8-bd37-0674db8e9821)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-vj6mt_calico-system(de27c4f6-c0d2-40b8-bd37-0674db8e9821)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4805797b57da1189ebefb03f379e0162b28eb41bd0de93678e8293729f1dba51\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-vj6mt" podUID="de27c4f6-c0d2-40b8-bd37-0674db8e9821" May 14 00:01:08.999651 containerd[1735]: time="2025-05-14T00:01:08.999605206Z" level=info msg="StartContainer for \"4f4b3c7db80069125d3c110e3c1a0c95e9e2c7cf6a9c323e94eadf8b5e16292b\" returns successfully" May 14 00:01:09.257986 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. May 14 00:01:09.258138 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
May 14 00:01:09.774007 containerd[1735]: time="2025-05-14T00:01:09.773611282Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6d47754b8-jnbd2,Uid:abd4dce2-aa29-4536-9c3c-73cb1b731afc,Namespace:calico-apiserver,Attempt:0,}" May 14 00:01:09.990692 systemd-networkd[1559]: calie83ebffe94c: Link UP May 14 00:01:09.991701 systemd-networkd[1559]: calie83ebffe94c: Gained carrier May 14 00:01:10.010337 containerd[1735]: 2025-05-14 00:01:09.844 [INFO][4454] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 14 00:01:10.010337 containerd[1735]: 2025-05-14 00:01:09.856 [INFO][4454] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284.0.0--n--c527831f7b-k8s-calico--apiserver--6d47754b8--jnbd2-eth0 calico-apiserver-6d47754b8- calico-apiserver abd4dce2-aa29-4536-9c3c-73cb1b731afc 723 0 2025-05-14 00:00:31 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6d47754b8 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4284.0.0-n-c527831f7b calico-apiserver-6d47754b8-jnbd2 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calie83ebffe94c [] []}} ContainerID="37a7eef959268e10fc96e0d0a1d776429d62cfb07c1477be1d6e0deef9bebb69" Namespace="calico-apiserver" Pod="calico-apiserver-6d47754b8-jnbd2" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-calico--apiserver--6d47754b8--jnbd2-" May 14 00:01:10.010337 containerd[1735]: 2025-05-14 00:01:09.856 [INFO][4454] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="37a7eef959268e10fc96e0d0a1d776429d62cfb07c1477be1d6e0deef9bebb69" Namespace="calico-apiserver" Pod="calico-apiserver-6d47754b8-jnbd2" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-calico--apiserver--6d47754b8--jnbd2-eth0" May 14 00:01:10.010337 
containerd[1735]: 2025-05-14 00:01:09.880 [INFO][4465] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="37a7eef959268e10fc96e0d0a1d776429d62cfb07c1477be1d6e0deef9bebb69" HandleID="k8s-pod-network.37a7eef959268e10fc96e0d0a1d776429d62cfb07c1477be1d6e0deef9bebb69" Workload="ci--4284.0.0--n--c527831f7b-k8s-calico--apiserver--6d47754b8--jnbd2-eth0" May 14 00:01:10.011033 containerd[1735]: 2025-05-14 00:01:09.888 [INFO][4465] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="37a7eef959268e10fc96e0d0a1d776429d62cfb07c1477be1d6e0deef9bebb69" HandleID="k8s-pod-network.37a7eef959268e10fc96e0d0a1d776429d62cfb07c1477be1d6e0deef9bebb69" Workload="ci--4284.0.0--n--c527831f7b-k8s-calico--apiserver--6d47754b8--jnbd2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000051ee0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4284.0.0-n-c527831f7b", "pod":"calico-apiserver-6d47754b8-jnbd2", "timestamp":"2025-05-14 00:01:09.880705089 +0000 UTC"}, Hostname:"ci-4284.0.0-n-c527831f7b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 14 00:01:10.011033 containerd[1735]: 2025-05-14 00:01:09.888 [INFO][4465] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 14 00:01:10.011033 containerd[1735]: 2025-05-14 00:01:09.888 [INFO][4465] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 14 00:01:10.011033 containerd[1735]: 2025-05-14 00:01:09.888 [INFO][4465] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284.0.0-n-c527831f7b' May 14 00:01:10.011033 containerd[1735]: 2025-05-14 00:01:09.890 [INFO][4465] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.37a7eef959268e10fc96e0d0a1d776429d62cfb07c1477be1d6e0deef9bebb69" host="ci-4284.0.0-n-c527831f7b" May 14 00:01:10.011033 containerd[1735]: 2025-05-14 00:01:09.893 [INFO][4465] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284.0.0-n-c527831f7b" May 14 00:01:10.011033 containerd[1735]: 2025-05-14 00:01:09.897 [INFO][4465] ipam/ipam.go 489: Trying affinity for 192.168.70.128/26 host="ci-4284.0.0-n-c527831f7b" May 14 00:01:10.011033 containerd[1735]: 2025-05-14 00:01:09.899 [INFO][4465] ipam/ipam.go 155: Attempting to load block cidr=192.168.70.128/26 host="ci-4284.0.0-n-c527831f7b" May 14 00:01:10.011033 containerd[1735]: 2025-05-14 00:01:09.901 [INFO][4465] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.70.128/26 host="ci-4284.0.0-n-c527831f7b" May 14 00:01:10.011552 containerd[1735]: 2025-05-14 00:01:09.901 [INFO][4465] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.70.128/26 handle="k8s-pod-network.37a7eef959268e10fc96e0d0a1d776429d62cfb07c1477be1d6e0deef9bebb69" host="ci-4284.0.0-n-c527831f7b" May 14 00:01:10.011552 containerd[1735]: 2025-05-14 00:01:09.903 [INFO][4465] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.37a7eef959268e10fc96e0d0a1d776429d62cfb07c1477be1d6e0deef9bebb69 May 14 00:01:10.011552 containerd[1735]: 2025-05-14 00:01:09.909 [INFO][4465] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.70.128/26 handle="k8s-pod-network.37a7eef959268e10fc96e0d0a1d776429d62cfb07c1477be1d6e0deef9bebb69" host="ci-4284.0.0-n-c527831f7b" May 14 00:01:10.011552 containerd[1735]: 2025-05-14 00:01:09.915 [INFO][4465] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.70.129/26] block=192.168.70.128/26 handle="k8s-pod-network.37a7eef959268e10fc96e0d0a1d776429d62cfb07c1477be1d6e0deef9bebb69" host="ci-4284.0.0-n-c527831f7b" May 14 00:01:10.011552 containerd[1735]: 2025-05-14 00:01:09.915 [INFO][4465] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.70.129/26] handle="k8s-pod-network.37a7eef959268e10fc96e0d0a1d776429d62cfb07c1477be1d6e0deef9bebb69" host="ci-4284.0.0-n-c527831f7b" May 14 00:01:10.011552 containerd[1735]: 2025-05-14 00:01:09.915 [INFO][4465] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 14 00:01:10.011552 containerd[1735]: 2025-05-14 00:01:09.915 [INFO][4465] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.70.129/26] IPv6=[] ContainerID="37a7eef959268e10fc96e0d0a1d776429d62cfb07c1477be1d6e0deef9bebb69" HandleID="k8s-pod-network.37a7eef959268e10fc96e0d0a1d776429d62cfb07c1477be1d6e0deef9bebb69" Workload="ci--4284.0.0--n--c527831f7b-k8s-calico--apiserver--6d47754b8--jnbd2-eth0" May 14 00:01:10.011870 containerd[1735]: 2025-05-14 00:01:09.918 [INFO][4454] cni-plugin/k8s.go 386: Populated endpoint ContainerID="37a7eef959268e10fc96e0d0a1d776429d62cfb07c1477be1d6e0deef9bebb69" Namespace="calico-apiserver" Pod="calico-apiserver-6d47754b8-jnbd2" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-calico--apiserver--6d47754b8--jnbd2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284.0.0--n--c527831f7b-k8s-calico--apiserver--6d47754b8--jnbd2-eth0", GenerateName:"calico-apiserver-6d47754b8-", Namespace:"calico-apiserver", SelfLink:"", UID:"abd4dce2-aa29-4536-9c3c-73cb1b731afc", ResourceVersion:"723", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 0, 0, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"6d47754b8", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284.0.0-n-c527831f7b", ContainerID:"", Pod:"calico-apiserver-6d47754b8-jnbd2", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.70.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie83ebffe94c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 00:01:10.012001 containerd[1735]: 2025-05-14 00:01:09.918 [INFO][4454] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.70.129/32] ContainerID="37a7eef959268e10fc96e0d0a1d776429d62cfb07c1477be1d6e0deef9bebb69" Namespace="calico-apiserver" Pod="calico-apiserver-6d47754b8-jnbd2" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-calico--apiserver--6d47754b8--jnbd2-eth0" May 14 00:01:10.012001 containerd[1735]: 2025-05-14 00:01:09.918 [INFO][4454] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie83ebffe94c ContainerID="37a7eef959268e10fc96e0d0a1d776429d62cfb07c1477be1d6e0deef9bebb69" Namespace="calico-apiserver" Pod="calico-apiserver-6d47754b8-jnbd2" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-calico--apiserver--6d47754b8--jnbd2-eth0" May 14 00:01:10.012001 containerd[1735]: 2025-05-14 00:01:09.990 [INFO][4454] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="37a7eef959268e10fc96e0d0a1d776429d62cfb07c1477be1d6e0deef9bebb69" Namespace="calico-apiserver" Pod="calico-apiserver-6d47754b8-jnbd2" 
WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-calico--apiserver--6d47754b8--jnbd2-eth0" May 14 00:01:10.012150 containerd[1735]: 2025-05-14 00:01:09.990 [INFO][4454] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="37a7eef959268e10fc96e0d0a1d776429d62cfb07c1477be1d6e0deef9bebb69" Namespace="calico-apiserver" Pod="calico-apiserver-6d47754b8-jnbd2" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-calico--apiserver--6d47754b8--jnbd2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284.0.0--n--c527831f7b-k8s-calico--apiserver--6d47754b8--jnbd2-eth0", GenerateName:"calico-apiserver-6d47754b8-", Namespace:"calico-apiserver", SelfLink:"", UID:"abd4dce2-aa29-4536-9c3c-73cb1b731afc", ResourceVersion:"723", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 0, 0, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6d47754b8", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284.0.0-n-c527831f7b", ContainerID:"37a7eef959268e10fc96e0d0a1d776429d62cfb07c1477be1d6e0deef9bebb69", Pod:"calico-apiserver-6d47754b8-jnbd2", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.70.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie83ebffe94c", MAC:"12:a8:f3:52:38:fb", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 00:01:10.012262 containerd[1735]: 2025-05-14 00:01:10.007 [INFO][4454] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="37a7eef959268e10fc96e0d0a1d776429d62cfb07c1477be1d6e0deef9bebb69" Namespace="calico-apiserver" Pod="calico-apiserver-6d47754b8-jnbd2" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-calico--apiserver--6d47754b8--jnbd2-eth0" May 14 00:01:10.036030 kubelet[3280]: I0514 00:01:10.035111 3280 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-n5cf8" podStartSLOduration=3.924124557 podStartE2EDuration="39.035088141s" podCreationTimestamp="2025-05-14 00:00:31 +0000 UTC" firstStartedPulling="2025-05-14 00:00:32.223623768 +0000 UTC m=+11.378583164" lastFinishedPulling="2025-05-14 00:01:07.334587352 +0000 UTC m=+46.489546748" observedRunningTime="2025-05-14 00:01:10.034778038 +0000 UTC m=+49.189737534" watchObservedRunningTime="2025-05-14 00:01:10.035088141 +0000 UTC m=+49.190047637" May 14 00:01:10.095852 containerd[1735]: time="2025-05-14T00:01:10.095808811Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4f4b3c7db80069125d3c110e3c1a0c95e9e2c7cf6a9c323e94eadf8b5e16292b\" id:\"68d14cbd9a9efce388b0aa76a446fc874050d9d57b7b6dfc97138edb46034065\" pid:4496 exit_status:1 exited_at:{seconds:1747180870 nanos:95459808}" May 14 00:01:10.348563 containerd[1735]: time="2025-05-14T00:01:10.347772480Z" level=info msg="connecting to shim 37a7eef959268e10fc96e0d0a1d776429d62cfb07c1477be1d6e0deef9bebb69" address="unix:///run/containerd/s/ddcf7995889688cdaa5a4b42dbb7d4723a0c270ba3b99f19f8dd3ffb03d9d94c" namespace=k8s.io protocol=ttrpc version=3 May 14 00:01:10.374468 systemd[1]: Started cri-containerd-37a7eef959268e10fc96e0d0a1d776429d62cfb07c1477be1d6e0deef9bebb69.scope - libcontainer container 37a7eef959268e10fc96e0d0a1d776429d62cfb07c1477be1d6e0deef9bebb69. 
May 14 00:01:10.423744 containerd[1735]: time="2025-05-14T00:01:10.423691094Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6d47754b8-jnbd2,Uid:abd4dce2-aa29-4536-9c3c-73cb1b731afc,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"37a7eef959268e10fc96e0d0a1d776429d62cfb07c1477be1d6e0deef9bebb69\"" May 14 00:01:10.425735 containerd[1735]: time="2025-05-14T00:01:10.425458911Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\"" May 14 00:01:10.838447 kernel: bpftool[4672]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set May 14 00:01:11.102327 containerd[1735]: time="2025-05-14T00:01:11.101889770Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4f4b3c7db80069125d3c110e3c1a0c95e9e2c7cf6a9c323e94eadf8b5e16292b\" id:\"cc3e7f3dd501dd7bd4f177b3cd5369e7903980fe5b9f29798936a7633bb2b453\" pid:4686 exit_status:1 exited_at:{seconds:1747180871 nanos:101530667}" May 14 00:01:11.143235 systemd-networkd[1559]: vxlan.calico: Link UP May 14 00:01:11.143245 systemd-networkd[1559]: vxlan.calico: Gained carrier May 14 00:01:11.700550 systemd-networkd[1559]: calie83ebffe94c: Gained IPv6LL May 14 00:01:11.774827 containerd[1735]: time="2025-05-14T00:01:11.774455693Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-2dhjc,Uid:eebab07c-a2e0-4962-9267-45fd956a2e27,Namespace:kube-system,Attempt:0,}" May 14 00:01:12.103989 systemd-networkd[1559]: cali161e281d3e2: Link UP May 14 00:01:12.104226 systemd-networkd[1559]: cali161e281d3e2: Gained carrier May 14 00:01:12.126190 containerd[1735]: 2025-05-14 00:01:12.026 [INFO][4766] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284.0.0--n--c527831f7b-k8s-coredns--6f6b679f8f--2dhjc-eth0 coredns-6f6b679f8f- kube-system eebab07c-a2e0-4962-9267-45fd956a2e27 719 0 2025-05-14 00:00:21 +0000 UTC map[k8s-app:kube-dns pod-template-hash:6f6b679f8f 
projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4284.0.0-n-c527831f7b coredns-6f6b679f8f-2dhjc eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali161e281d3e2 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="fe932abde74669703cbb0cc0b277720e9f02bc890298bbf41b60dcc5fe0f855a" Namespace="kube-system" Pod="coredns-6f6b679f8f-2dhjc" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-coredns--6f6b679f8f--2dhjc-" May 14 00:01:12.126190 containerd[1735]: 2025-05-14 00:01:12.026 [INFO][4766] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="fe932abde74669703cbb0cc0b277720e9f02bc890298bbf41b60dcc5fe0f855a" Namespace="kube-system" Pod="coredns-6f6b679f8f-2dhjc" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-coredns--6f6b679f8f--2dhjc-eth0" May 14 00:01:12.126190 containerd[1735]: 2025-05-14 00:01:12.062 [INFO][4777] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fe932abde74669703cbb0cc0b277720e9f02bc890298bbf41b60dcc5fe0f855a" HandleID="k8s-pod-network.fe932abde74669703cbb0cc0b277720e9f02bc890298bbf41b60dcc5fe0f855a" Workload="ci--4284.0.0--n--c527831f7b-k8s-coredns--6f6b679f8f--2dhjc-eth0" May 14 00:01:12.126779 containerd[1735]: 2025-05-14 00:01:12.072 [INFO][4777] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="fe932abde74669703cbb0cc0b277720e9f02bc890298bbf41b60dcc5fe0f855a" HandleID="k8s-pod-network.fe932abde74669703cbb0cc0b277720e9f02bc890298bbf41b60dcc5fe0f855a" Workload="ci--4284.0.0--n--c527831f7b-k8s-coredns--6f6b679f8f--2dhjc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000293110), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4284.0.0-n-c527831f7b", "pod":"coredns-6f6b679f8f-2dhjc", "timestamp":"2025-05-14 00:01:12.062890305 +0000 UTC"}, Hostname:"ci-4284.0.0-n-c527831f7b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, 
MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 14 00:01:12.126779 containerd[1735]: 2025-05-14 00:01:12.072 [INFO][4777] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 14 00:01:12.126779 containerd[1735]: 2025-05-14 00:01:12.072 [INFO][4777] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 14 00:01:12.126779 containerd[1735]: 2025-05-14 00:01:12.072 [INFO][4777] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284.0.0-n-c527831f7b' May 14 00:01:12.126779 containerd[1735]: 2025-05-14 00:01:12.074 [INFO][4777] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.fe932abde74669703cbb0cc0b277720e9f02bc890298bbf41b60dcc5fe0f855a" host="ci-4284.0.0-n-c527831f7b" May 14 00:01:12.126779 containerd[1735]: 2025-05-14 00:01:12.077 [INFO][4777] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284.0.0-n-c527831f7b" May 14 00:01:12.126779 containerd[1735]: 2025-05-14 00:01:12.081 [INFO][4777] ipam/ipam.go 489: Trying affinity for 192.168.70.128/26 host="ci-4284.0.0-n-c527831f7b" May 14 00:01:12.126779 containerd[1735]: 2025-05-14 00:01:12.082 [INFO][4777] ipam/ipam.go 155: Attempting to load block cidr=192.168.70.128/26 host="ci-4284.0.0-n-c527831f7b" May 14 00:01:12.126779 containerd[1735]: 2025-05-14 00:01:12.084 [INFO][4777] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.70.128/26 host="ci-4284.0.0-n-c527831f7b" May 14 00:01:12.127149 containerd[1735]: 2025-05-14 00:01:12.084 [INFO][4777] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.70.128/26 handle="k8s-pod-network.fe932abde74669703cbb0cc0b277720e9f02bc890298bbf41b60dcc5fe0f855a" host="ci-4284.0.0-n-c527831f7b" May 14 00:01:12.127149 containerd[1735]: 2025-05-14 00:01:12.085 [INFO][4777] ipam/ipam.go 1685: Creating new handle: 
k8s-pod-network.fe932abde74669703cbb0cc0b277720e9f02bc890298bbf41b60dcc5fe0f855a May 14 00:01:12.127149 containerd[1735]: 2025-05-14 00:01:12.090 [INFO][4777] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.70.128/26 handle="k8s-pod-network.fe932abde74669703cbb0cc0b277720e9f02bc890298bbf41b60dcc5fe0f855a" host="ci-4284.0.0-n-c527831f7b" May 14 00:01:12.127149 containerd[1735]: 2025-05-14 00:01:12.098 [INFO][4777] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.70.130/26] block=192.168.70.128/26 handle="k8s-pod-network.fe932abde74669703cbb0cc0b277720e9f02bc890298bbf41b60dcc5fe0f855a" host="ci-4284.0.0-n-c527831f7b" May 14 00:01:12.127149 containerd[1735]: 2025-05-14 00:01:12.098 [INFO][4777] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.70.130/26] handle="k8s-pod-network.fe932abde74669703cbb0cc0b277720e9f02bc890298bbf41b60dcc5fe0f855a" host="ci-4284.0.0-n-c527831f7b" May 14 00:01:12.127149 containerd[1735]: 2025-05-14 00:01:12.098 [INFO][4777] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 14 00:01:12.127149 containerd[1735]: 2025-05-14 00:01:12.098 [INFO][4777] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.70.130/26] IPv6=[] ContainerID="fe932abde74669703cbb0cc0b277720e9f02bc890298bbf41b60dcc5fe0f855a" HandleID="k8s-pod-network.fe932abde74669703cbb0cc0b277720e9f02bc890298bbf41b60dcc5fe0f855a" Workload="ci--4284.0.0--n--c527831f7b-k8s-coredns--6f6b679f8f--2dhjc-eth0" May 14 00:01:12.127449 containerd[1735]: 2025-05-14 00:01:12.100 [INFO][4766] cni-plugin/k8s.go 386: Populated endpoint ContainerID="fe932abde74669703cbb0cc0b277720e9f02bc890298bbf41b60dcc5fe0f855a" Namespace="kube-system" Pod="coredns-6f6b679f8f-2dhjc" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-coredns--6f6b679f8f--2dhjc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284.0.0--n--c527831f7b-k8s-coredns--6f6b679f8f--2dhjc-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"eebab07c-a2e0-4962-9267-45fd956a2e27", ResourceVersion:"719", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 0, 0, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284.0.0-n-c527831f7b", ContainerID:"", Pod:"coredns-6f6b679f8f-2dhjc", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.70.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"cali161e281d3e2", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 00:01:12.127449 containerd[1735]: 2025-05-14 00:01:12.100 [INFO][4766] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.70.130/32] ContainerID="fe932abde74669703cbb0cc0b277720e9f02bc890298bbf41b60dcc5fe0f855a" Namespace="kube-system" Pod="coredns-6f6b679f8f-2dhjc" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-coredns--6f6b679f8f--2dhjc-eth0" May 14 00:01:12.127449 containerd[1735]: 2025-05-14 00:01:12.100 [INFO][4766] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali161e281d3e2 ContainerID="fe932abde74669703cbb0cc0b277720e9f02bc890298bbf41b60dcc5fe0f855a" Namespace="kube-system" Pod="coredns-6f6b679f8f-2dhjc" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-coredns--6f6b679f8f--2dhjc-eth0" May 14 00:01:12.127449 containerd[1735]: 2025-05-14 00:01:12.103 [INFO][4766] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="fe932abde74669703cbb0cc0b277720e9f02bc890298bbf41b60dcc5fe0f855a" Namespace="kube-system" Pod="coredns-6f6b679f8f-2dhjc" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-coredns--6f6b679f8f--2dhjc-eth0" May 14 00:01:12.127449 containerd[1735]: 2025-05-14 00:01:12.105 [INFO][4766] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="fe932abde74669703cbb0cc0b277720e9f02bc890298bbf41b60dcc5fe0f855a" Namespace="kube-system" Pod="coredns-6f6b679f8f-2dhjc" 
WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-coredns--6f6b679f8f--2dhjc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284.0.0--n--c527831f7b-k8s-coredns--6f6b679f8f--2dhjc-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"eebab07c-a2e0-4962-9267-45fd956a2e27", ResourceVersion:"719", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 0, 0, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284.0.0-n-c527831f7b", ContainerID:"fe932abde74669703cbb0cc0b277720e9f02bc890298bbf41b60dcc5fe0f855a", Pod:"coredns-6f6b679f8f-2dhjc", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.70.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali161e281d3e2", MAC:"4e:5a:40:49:5f:e2", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 00:01:12.127449 containerd[1735]: 2025-05-14 00:01:12.121 [INFO][4766] 
cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="fe932abde74669703cbb0cc0b277720e9f02bc890298bbf41b60dcc5fe0f855a" Namespace="kube-system" Pod="coredns-6f6b679f8f-2dhjc" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-coredns--6f6b679f8f--2dhjc-eth0" May 14 00:01:12.276516 systemd-networkd[1559]: vxlan.calico: Gained IPv6LL May 14 00:01:12.773694 containerd[1735]: time="2025-05-14T00:01:12.773525210Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-zl6h5,Uid:318fe614-9b16-49bc-840b-cb2287a86e40,Namespace:kube-system,Attempt:0,}" May 14 00:01:12.773694 containerd[1735]: time="2025-05-14T00:01:12.773525110Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-b8c5bc6b6-6m9m2,Uid:6fe88591-92ff-4491-90ae-c32f13e57639,Namespace:calico-system,Attempt:0,}" May 14 00:01:13.058500 containerd[1735]: time="2025-05-14T00:01:13.058284343Z" level=info msg="connecting to shim fe932abde74669703cbb0cc0b277720e9f02bc890298bbf41b60dcc5fe0f855a" address="unix:///run/containerd/s/7b4bdb611e059cdf269d8a290f4727a91c1aedbcbb41439d3fa8722de81a16f7" namespace=k8s.io protocol=ttrpc version=3 May 14 00:01:13.067922 systemd-networkd[1559]: cali9272e245f08: Link UP May 14 00:01:13.068209 systemd-networkd[1559]: cali9272e245f08: Gained carrier May 14 00:01:13.106660 containerd[1735]: 2025-05-14 00:01:12.955 [INFO][4798] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284.0.0--n--c527831f7b-k8s-coredns--6f6b679f8f--zl6h5-eth0 coredns-6f6b679f8f- kube-system 318fe614-9b16-49bc-840b-cb2287a86e40 727 0 2025-05-14 00:00:21 +0000 UTC map[k8s-app:kube-dns pod-template-hash:6f6b679f8f projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4284.0.0-n-c527831f7b coredns-6f6b679f8f-zl6h5 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali9272e245f08 [{dns UDP 53 
0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="3d162f6ee12a37cf6784eb321c00eee9b7777f3e0c827a663c51e8b283a40999" Namespace="kube-system" Pod="coredns-6f6b679f8f-zl6h5" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-coredns--6f6b679f8f--zl6h5-" May 14 00:01:13.106660 containerd[1735]: 2025-05-14 00:01:12.955 [INFO][4798] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="3d162f6ee12a37cf6784eb321c00eee9b7777f3e0c827a663c51e8b283a40999" Namespace="kube-system" Pod="coredns-6f6b679f8f-zl6h5" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-coredns--6f6b679f8f--zl6h5-eth0" May 14 00:01:13.106660 containerd[1735]: 2025-05-14 00:01:12.996 [INFO][4811] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3d162f6ee12a37cf6784eb321c00eee9b7777f3e0c827a663c51e8b283a40999" HandleID="k8s-pod-network.3d162f6ee12a37cf6784eb321c00eee9b7777f3e0c827a663c51e8b283a40999" Workload="ci--4284.0.0--n--c527831f7b-k8s-coredns--6f6b679f8f--zl6h5-eth0" May 14 00:01:13.106660 containerd[1735]: 2025-05-14 00:01:13.014 [INFO][4811] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="3d162f6ee12a37cf6784eb321c00eee9b7777f3e0c827a663c51e8b283a40999" HandleID="k8s-pod-network.3d162f6ee12a37cf6784eb321c00eee9b7777f3e0c827a663c51e8b283a40999" Workload="ci--4284.0.0--n--c527831f7b-k8s-coredns--6f6b679f8f--zl6h5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000291480), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4284.0.0-n-c527831f7b", "pod":"coredns-6f6b679f8f-zl6h5", "timestamp":"2025-05-14 00:01:12.996795074 +0000 UTC"}, Hostname:"ci-4284.0.0-n-c527831f7b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 14 00:01:13.106660 containerd[1735]: 2025-05-14 00:01:13.014 [INFO][4811] ipam/ipam_plugin.go 353: About to acquire 
host-wide IPAM lock. May 14 00:01:13.106660 containerd[1735]: 2025-05-14 00:01:13.014 [INFO][4811] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 14 00:01:13.106660 containerd[1735]: 2025-05-14 00:01:13.014 [INFO][4811] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284.0.0-n-c527831f7b' May 14 00:01:13.106660 containerd[1735]: 2025-05-14 00:01:13.018 [INFO][4811] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.3d162f6ee12a37cf6784eb321c00eee9b7777f3e0c827a663c51e8b283a40999" host="ci-4284.0.0-n-c527831f7b" May 14 00:01:13.106660 containerd[1735]: 2025-05-14 00:01:13.021 [INFO][4811] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284.0.0-n-c527831f7b" May 14 00:01:13.106660 containerd[1735]: 2025-05-14 00:01:13.026 [INFO][4811] ipam/ipam.go 489: Trying affinity for 192.168.70.128/26 host="ci-4284.0.0-n-c527831f7b" May 14 00:01:13.106660 containerd[1735]: 2025-05-14 00:01:13.029 [INFO][4811] ipam/ipam.go 155: Attempting to load block cidr=192.168.70.128/26 host="ci-4284.0.0-n-c527831f7b" May 14 00:01:13.106660 containerd[1735]: 2025-05-14 00:01:13.032 [INFO][4811] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.70.128/26 host="ci-4284.0.0-n-c527831f7b" May 14 00:01:13.106660 containerd[1735]: 2025-05-14 00:01:13.032 [INFO][4811] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.70.128/26 handle="k8s-pod-network.3d162f6ee12a37cf6784eb321c00eee9b7777f3e0c827a663c51e8b283a40999" host="ci-4284.0.0-n-c527831f7b" May 14 00:01:13.106660 containerd[1735]: 2025-05-14 00:01:13.033 [INFO][4811] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.3d162f6ee12a37cf6784eb321c00eee9b7777f3e0c827a663c51e8b283a40999 May 14 00:01:13.106660 containerd[1735]: 2025-05-14 00:01:13.039 [INFO][4811] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.70.128/26 
handle="k8s-pod-network.3d162f6ee12a37cf6784eb321c00eee9b7777f3e0c827a663c51e8b283a40999" host="ci-4284.0.0-n-c527831f7b" May 14 00:01:13.106660 containerd[1735]: 2025-05-14 00:01:13.060 [INFO][4811] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.70.131/26] block=192.168.70.128/26 handle="k8s-pod-network.3d162f6ee12a37cf6784eb321c00eee9b7777f3e0c827a663c51e8b283a40999" host="ci-4284.0.0-n-c527831f7b" May 14 00:01:13.106660 containerd[1735]: 2025-05-14 00:01:13.060 [INFO][4811] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.70.131/26] handle="k8s-pod-network.3d162f6ee12a37cf6784eb321c00eee9b7777f3e0c827a663c51e8b283a40999" host="ci-4284.0.0-n-c527831f7b" May 14 00:01:13.106660 containerd[1735]: 2025-05-14 00:01:13.060 [INFO][4811] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 14 00:01:13.106660 containerd[1735]: 2025-05-14 00:01:13.060 [INFO][4811] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.70.131/26] IPv6=[] ContainerID="3d162f6ee12a37cf6784eb321c00eee9b7777f3e0c827a663c51e8b283a40999" HandleID="k8s-pod-network.3d162f6ee12a37cf6784eb321c00eee9b7777f3e0c827a663c51e8b283a40999" Workload="ci--4284.0.0--n--c527831f7b-k8s-coredns--6f6b679f8f--zl6h5-eth0" May 14 00:01:13.109609 containerd[1735]: 2025-05-14 00:01:13.063 [INFO][4798] cni-plugin/k8s.go 386: Populated endpoint ContainerID="3d162f6ee12a37cf6784eb321c00eee9b7777f3e0c827a663c51e8b283a40999" Namespace="kube-system" Pod="coredns-6f6b679f8f-zl6h5" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-coredns--6f6b679f8f--zl6h5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284.0.0--n--c527831f7b-k8s-coredns--6f6b679f8f--zl6h5-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"318fe614-9b16-49bc-840b-cb2287a86e40", ResourceVersion:"727", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 0, 0, 21, 0, 
time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284.0.0-n-c527831f7b", ContainerID:"", Pod:"coredns-6f6b679f8f-zl6h5", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.70.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali9272e245f08", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 00:01:13.109609 containerd[1735]: 2025-05-14 00:01:13.063 [INFO][4798] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.70.131/32] ContainerID="3d162f6ee12a37cf6784eb321c00eee9b7777f3e0c827a663c51e8b283a40999" Namespace="kube-system" Pod="coredns-6f6b679f8f-zl6h5" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-coredns--6f6b679f8f--zl6h5-eth0" May 14 00:01:13.109609 containerd[1735]: 2025-05-14 00:01:13.063 [INFO][4798] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9272e245f08 ContainerID="3d162f6ee12a37cf6784eb321c00eee9b7777f3e0c827a663c51e8b283a40999" Namespace="kube-system" Pod="coredns-6f6b679f8f-zl6h5" 
WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-coredns--6f6b679f8f--zl6h5-eth0" May 14 00:01:13.109609 containerd[1735]: 2025-05-14 00:01:13.070 [INFO][4798] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3d162f6ee12a37cf6784eb321c00eee9b7777f3e0c827a663c51e8b283a40999" Namespace="kube-system" Pod="coredns-6f6b679f8f-zl6h5" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-coredns--6f6b679f8f--zl6h5-eth0" May 14 00:01:13.109609 containerd[1735]: 2025-05-14 00:01:13.071 [INFO][4798] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="3d162f6ee12a37cf6784eb321c00eee9b7777f3e0c827a663c51e8b283a40999" Namespace="kube-system" Pod="coredns-6f6b679f8f-zl6h5" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-coredns--6f6b679f8f--zl6h5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284.0.0--n--c527831f7b-k8s-coredns--6f6b679f8f--zl6h5-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"318fe614-9b16-49bc-840b-cb2287a86e40", ResourceVersion:"727", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 0, 0, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284.0.0-n-c527831f7b", ContainerID:"3d162f6ee12a37cf6784eb321c00eee9b7777f3e0c827a663c51e8b283a40999", Pod:"coredns-6f6b679f8f-zl6h5", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.70.131/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali9272e245f08", MAC:"7a:1f:da:3d:b8:ac", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 00:01:13.109609 containerd[1735]: 2025-05-14 00:01:13.100 [INFO][4798] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="3d162f6ee12a37cf6784eb321c00eee9b7777f3e0c827a663c51e8b283a40999" Namespace="kube-system" Pod="coredns-6f6b679f8f-zl6h5" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-coredns--6f6b679f8f--zl6h5-eth0" May 14 00:01:13.120508 systemd[1]: Started cri-containerd-fe932abde74669703cbb0cc0b277720e9f02bc890298bbf41b60dcc5fe0f855a.scope - libcontainer container fe932abde74669703cbb0cc0b277720e9f02bc890298bbf41b60dcc5fe0f855a. 
May 14 00:01:13.620537 systemd-networkd[1559]: cali161e281d3e2: Gained IPv6LL May 14 00:01:13.774148 containerd[1735]: time="2025-05-14T00:01:13.773946962Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6d47754b8-l5pcr,Uid:6386bb04-f08c-46cd-8bd4-cc9e0fdf6662,Namespace:calico-apiserver,Attempt:0,}" May 14 00:01:14.260593 systemd-networkd[1559]: cali9272e245f08: Gained IPv6LL May 14 00:01:14.778594 systemd-networkd[1559]: calidfe8a8b32df: Link UP May 14 00:01:14.779546 systemd-networkd[1559]: calidfe8a8b32df: Gained carrier May 14 00:01:14.796899 containerd[1735]: 2025-05-14 00:01:14.617 [INFO][4878] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284.0.0--n--c527831f7b-k8s-calico--kube--controllers--b8c5bc6b6--6m9m2-eth0 calico-kube-controllers-b8c5bc6b6- calico-system 6fe88591-92ff-4491-90ae-c32f13e57639 726 0 2025-05-14 00:00:31 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:b8c5bc6b6 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4284.0.0-n-c527831f7b calico-kube-controllers-b8c5bc6b6-6m9m2 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calidfe8a8b32df [] []}} ContainerID="3864873fc08781fcf24f86a30b959837be4ae832cf40c411e9d2c7f236d3d959" Namespace="calico-system" Pod="calico-kube-controllers-b8c5bc6b6-6m9m2" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-calico--kube--controllers--b8c5bc6b6--6m9m2-" May 14 00:01:14.796899 containerd[1735]: 2025-05-14 00:01:14.618 [INFO][4878] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="3864873fc08781fcf24f86a30b959837be4ae832cf40c411e9d2c7f236d3d959" Namespace="calico-system" Pod="calico-kube-controllers-b8c5bc6b6-6m9m2" 
WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-calico--kube--controllers--b8c5bc6b6--6m9m2-eth0" May 14 00:01:14.796899 containerd[1735]: 2025-05-14 00:01:14.642 [INFO][4890] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3864873fc08781fcf24f86a30b959837be4ae832cf40c411e9d2c7f236d3d959" HandleID="k8s-pod-network.3864873fc08781fcf24f86a30b959837be4ae832cf40c411e9d2c7f236d3d959" Workload="ci--4284.0.0--n--c527831f7b-k8s-calico--kube--controllers--b8c5bc6b6--6m9m2-eth0" May 14 00:01:14.796899 containerd[1735]: 2025-05-14 00:01:14.651 [INFO][4890] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="3864873fc08781fcf24f86a30b959837be4ae832cf40c411e9d2c7f236d3d959" HandleID="k8s-pod-network.3864873fc08781fcf24f86a30b959837be4ae832cf40c411e9d2c7f236d3d959" Workload="ci--4284.0.0--n--c527831f7b-k8s-calico--kube--controllers--b8c5bc6b6--6m9m2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000311280), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4284.0.0-n-c527831f7b", "pod":"calico-kube-controllers-b8c5bc6b6-6m9m2", "timestamp":"2025-05-14 00:01:14.642027389 +0000 UTC"}, Hostname:"ci-4284.0.0-n-c527831f7b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 14 00:01:14.796899 containerd[1735]: 2025-05-14 00:01:14.651 [INFO][4890] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 14 00:01:14.796899 containerd[1735]: 2025-05-14 00:01:14.651 [INFO][4890] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 14 00:01:14.796899 containerd[1735]: 2025-05-14 00:01:14.651 [INFO][4890] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284.0.0-n-c527831f7b' May 14 00:01:14.796899 containerd[1735]: 2025-05-14 00:01:14.653 [INFO][4890] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.3864873fc08781fcf24f86a30b959837be4ae832cf40c411e9d2c7f236d3d959" host="ci-4284.0.0-n-c527831f7b" May 14 00:01:14.796899 containerd[1735]: 2025-05-14 00:01:14.750 [INFO][4890] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284.0.0-n-c527831f7b" May 14 00:01:14.796899 containerd[1735]: 2025-05-14 00:01:14.756 [INFO][4890] ipam/ipam.go 489: Trying affinity for 192.168.70.128/26 host="ci-4284.0.0-n-c527831f7b" May 14 00:01:14.796899 containerd[1735]: 2025-05-14 00:01:14.758 [INFO][4890] ipam/ipam.go 155: Attempting to load block cidr=192.168.70.128/26 host="ci-4284.0.0-n-c527831f7b" May 14 00:01:14.796899 containerd[1735]: 2025-05-14 00:01:14.760 [INFO][4890] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.70.128/26 host="ci-4284.0.0-n-c527831f7b" May 14 00:01:14.796899 containerd[1735]: 2025-05-14 00:01:14.760 [INFO][4890] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.70.128/26 handle="k8s-pod-network.3864873fc08781fcf24f86a30b959837be4ae832cf40c411e9d2c7f236d3d959" host="ci-4284.0.0-n-c527831f7b" May 14 00:01:14.796899 containerd[1735]: 2025-05-14 00:01:14.761 [INFO][4890] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.3864873fc08781fcf24f86a30b959837be4ae832cf40c411e9d2c7f236d3d959 May 14 00:01:14.796899 containerd[1735]: 2025-05-14 00:01:14.765 [INFO][4890] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.70.128/26 handle="k8s-pod-network.3864873fc08781fcf24f86a30b959837be4ae832cf40c411e9d2c7f236d3d959" host="ci-4284.0.0-n-c527831f7b" May 14 00:01:14.796899 containerd[1735]: 2025-05-14 00:01:14.773 [INFO][4890] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.70.132/26] block=192.168.70.128/26 handle="k8s-pod-network.3864873fc08781fcf24f86a30b959837be4ae832cf40c411e9d2c7f236d3d959" host="ci-4284.0.0-n-c527831f7b"
May 14 00:01:14.796899 containerd[1735]: 2025-05-14 00:01:14.773 [INFO][4890] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.70.132/26] handle="k8s-pod-network.3864873fc08781fcf24f86a30b959837be4ae832cf40c411e9d2c7f236d3d959" host="ci-4284.0.0-n-c527831f7b"
May 14 00:01:14.796899 containerd[1735]: 2025-05-14 00:01:14.773 [INFO][4890] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
May 14 00:01:14.796899 containerd[1735]: 2025-05-14 00:01:14.773 [INFO][4890] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.70.132/26] IPv6=[] ContainerID="3864873fc08781fcf24f86a30b959837be4ae832cf40c411e9d2c7f236d3d959" HandleID="k8s-pod-network.3864873fc08781fcf24f86a30b959837be4ae832cf40c411e9d2c7f236d3d959" Workload="ci--4284.0.0--n--c527831f7b-k8s-calico--kube--controllers--b8c5bc6b6--6m9m2-eth0"
May 14 00:01:14.799013 containerd[1735]: 2025-05-14 00:01:14.775 [INFO][4878] cni-plugin/k8s.go 386: Populated endpoint ContainerID="3864873fc08781fcf24f86a30b959837be4ae832cf40c411e9d2c7f236d3d959" Namespace="calico-system" Pod="calico-kube-controllers-b8c5bc6b6-6m9m2" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-calico--kube--controllers--b8c5bc6b6--6m9m2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284.0.0--n--c527831f7b-k8s-calico--kube--controllers--b8c5bc6b6--6m9m2-eth0", GenerateName:"calico-kube-controllers-b8c5bc6b6-", Namespace:"calico-system", SelfLink:"", UID:"6fe88591-92ff-4491-90ae-c32f13e57639", ResourceVersion:"726", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 0, 0, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"b8c5bc6b6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284.0.0-n-c527831f7b", ContainerID:"", Pod:"calico-kube-controllers-b8c5bc6b6-6m9m2", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.70.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calidfe8a8b32df", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
May 14 00:01:14.799013 containerd[1735]: 2025-05-14 00:01:14.775 [INFO][4878] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.70.132/32] ContainerID="3864873fc08781fcf24f86a30b959837be4ae832cf40c411e9d2c7f236d3d959" Namespace="calico-system" Pod="calico-kube-controllers-b8c5bc6b6-6m9m2" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-calico--kube--controllers--b8c5bc6b6--6m9m2-eth0"
May 14 00:01:14.799013 containerd[1735]: 2025-05-14 00:01:14.775 [INFO][4878] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calidfe8a8b32df ContainerID="3864873fc08781fcf24f86a30b959837be4ae832cf40c411e9d2c7f236d3d959" Namespace="calico-system" Pod="calico-kube-controllers-b8c5bc6b6-6m9m2" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-calico--kube--controllers--b8c5bc6b6--6m9m2-eth0"
May 14 00:01:14.799013 containerd[1735]: 2025-05-14 00:01:14.779 [INFO][4878] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3864873fc08781fcf24f86a30b959837be4ae832cf40c411e9d2c7f236d3d959" Namespace="calico-system" Pod="calico-kube-controllers-b8c5bc6b6-6m9m2" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-calico--kube--controllers--b8c5bc6b6--6m9m2-eth0"
May 14 00:01:14.799013 containerd[1735]: 2025-05-14 00:01:14.780 [INFO][4878] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="3864873fc08781fcf24f86a30b959837be4ae832cf40c411e9d2c7f236d3d959" Namespace="calico-system" Pod="calico-kube-controllers-b8c5bc6b6-6m9m2" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-calico--kube--controllers--b8c5bc6b6--6m9m2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284.0.0--n--c527831f7b-k8s-calico--kube--controllers--b8c5bc6b6--6m9m2-eth0", GenerateName:"calico-kube-controllers-b8c5bc6b6-", Namespace:"calico-system", SelfLink:"", UID:"6fe88591-92ff-4491-90ae-c32f13e57639", ResourceVersion:"726", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 0, 0, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"b8c5bc6b6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284.0.0-n-c527831f7b", ContainerID:"3864873fc08781fcf24f86a30b959837be4ae832cf40c411e9d2c7f236d3d959", Pod:"calico-kube-controllers-b8c5bc6b6-6m9m2", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.70.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calidfe8a8b32df", MAC:"fe:d9:a5:6d:c4:4a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
May 14 00:01:14.799013 containerd[1735]: 2025-05-14 00:01:14.793 [INFO][4878] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="3864873fc08781fcf24f86a30b959837be4ae832cf40c411e9d2c7f236d3d959" Namespace="calico-system" Pod="calico-kube-controllers-b8c5bc6b6-6m9m2" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-calico--kube--controllers--b8c5bc6b6--6m9m2-eth0"
May 14 00:01:14.827759 containerd[1735]: time="2025-05-14T00:01:14.827712907Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-2dhjc,Uid:eebab07c-a2e0-4962-9267-45fd956a2e27,Namespace:kube-system,Attempt:0,} returns sandbox id \"fe932abde74669703cbb0cc0b277720e9f02bc890298bbf41b60dcc5fe0f855a\""
May 14 00:01:14.831564 containerd[1735]: time="2025-05-14T00:01:14.831521242Z" level=info msg="CreateContainer within sandbox \"fe932abde74669703cbb0cc0b277720e9f02bc890298bbf41b60dcc5fe0f855a\" for container &ContainerMetadata{Name:coredns,Attempt:0,}"
May 14 00:01:15.224745 systemd-networkd[1559]: cali31c12c44a8e: Link UP
May 14 00:01:15.225982 systemd-networkd[1559]: cali31c12c44a8e: Gained carrier
May 14 00:01:15.246053 containerd[1735]: 2025-05-14 00:01:15.160 [INFO][4912] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284.0.0--n--c527831f7b-k8s-calico--apiserver--6d47754b8--l5pcr-eth0 calico-apiserver-6d47754b8- calico-apiserver 6386bb04-f08c-46cd-8bd4-cc9e0fdf6662 725 0 2025-05-14 00:00:31 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6d47754b8 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4284.0.0-n-c527831f7b calico-apiserver-6d47754b8-l5pcr eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali31c12c44a8e [] []}} ContainerID="3679ee07b0a36b2917bfd957d5654b50c72fe83d87a33e2a8cca5b295e8c7c86" Namespace="calico-apiserver" Pod="calico-apiserver-6d47754b8-l5pcr" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-calico--apiserver--6d47754b8--l5pcr-"
May 14 00:01:15.246053 containerd[1735]: 2025-05-14 00:01:15.160 [INFO][4912] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="3679ee07b0a36b2917bfd957d5654b50c72fe83d87a33e2a8cca5b295e8c7c86" Namespace="calico-apiserver" Pod="calico-apiserver-6d47754b8-l5pcr" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-calico--apiserver--6d47754b8--l5pcr-eth0"
May 14 00:01:15.246053 containerd[1735]: 2025-05-14 00:01:15.184 [INFO][4925] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3679ee07b0a36b2917bfd957d5654b50c72fe83d87a33e2a8cca5b295e8c7c86" HandleID="k8s-pod-network.3679ee07b0a36b2917bfd957d5654b50c72fe83d87a33e2a8cca5b295e8c7c86" Workload="ci--4284.0.0--n--c527831f7b-k8s-calico--apiserver--6d47754b8--l5pcr-eth0"
May 14 00:01:15.246053 containerd[1735]: 2025-05-14 00:01:15.192 [INFO][4925] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="3679ee07b0a36b2917bfd957d5654b50c72fe83d87a33e2a8cca5b295e8c7c86" HandleID="k8s-pod-network.3679ee07b0a36b2917bfd957d5654b50c72fe83d87a33e2a8cca5b295e8c7c86" Workload="ci--4284.0.0--n--c527831f7b-k8s-calico--apiserver--6d47754b8--l5pcr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0001fe0a0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4284.0.0-n-c527831f7b", "pod":"calico-apiserver-6d47754b8-l5pcr", "timestamp":"2025-05-14 00:01:15.184146503 +0000 UTC"}, Hostname:"ci-4284.0.0-n-c527831f7b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
May 14 00:01:15.246053 containerd[1735]: 2025-05-14 00:01:15.192 [INFO][4925] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
May 14 00:01:15.246053 containerd[1735]: 2025-05-14 00:01:15.192 [INFO][4925] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
May 14 00:01:15.246053 containerd[1735]: 2025-05-14 00:01:15.192 [INFO][4925] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284.0.0-n-c527831f7b'
May 14 00:01:15.246053 containerd[1735]: 2025-05-14 00:01:15.194 [INFO][4925] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.3679ee07b0a36b2917bfd957d5654b50c72fe83d87a33e2a8cca5b295e8c7c86" host="ci-4284.0.0-n-c527831f7b"
May 14 00:01:15.246053 containerd[1735]: 2025-05-14 00:01:15.198 [INFO][4925] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284.0.0-n-c527831f7b"
May 14 00:01:15.246053 containerd[1735]: 2025-05-14 00:01:15.202 [INFO][4925] ipam/ipam.go 489: Trying affinity for 192.168.70.128/26 host="ci-4284.0.0-n-c527831f7b"
May 14 00:01:15.246053 containerd[1735]: 2025-05-14 00:01:15.203 [INFO][4925] ipam/ipam.go 155: Attempting to load block cidr=192.168.70.128/26 host="ci-4284.0.0-n-c527831f7b"
May 14 00:01:15.246053 containerd[1735]: 2025-05-14 00:01:15.205 [INFO][4925] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.70.128/26 host="ci-4284.0.0-n-c527831f7b"
May 14 00:01:15.246053 containerd[1735]: 2025-05-14 00:01:15.205 [INFO][4925] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.70.128/26 handle="k8s-pod-network.3679ee07b0a36b2917bfd957d5654b50c72fe83d87a33e2a8cca5b295e8c7c86" host="ci-4284.0.0-n-c527831f7b"
May 14 00:01:15.246053 containerd[1735]: 2025-05-14 00:01:15.206 [INFO][4925] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.3679ee07b0a36b2917bfd957d5654b50c72fe83d87a33e2a8cca5b295e8c7c86
May 14 00:01:15.246053 containerd[1735]: 2025-05-14 00:01:15.210 [INFO][4925] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.70.128/26 handle="k8s-pod-network.3679ee07b0a36b2917bfd957d5654b50c72fe83d87a33e2a8cca5b295e8c7c86" host="ci-4284.0.0-n-c527831f7b"
May 14 00:01:15.246053 containerd[1735]: 2025-05-14 00:01:15.218 [INFO][4925] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.70.133/26] block=192.168.70.128/26 handle="k8s-pod-network.3679ee07b0a36b2917bfd957d5654b50c72fe83d87a33e2a8cca5b295e8c7c86" host="ci-4284.0.0-n-c527831f7b"
May 14 00:01:15.246053 containerd[1735]: 2025-05-14 00:01:15.218 [INFO][4925] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.70.133/26] handle="k8s-pod-network.3679ee07b0a36b2917bfd957d5654b50c72fe83d87a33e2a8cca5b295e8c7c86" host="ci-4284.0.0-n-c527831f7b"
May 14 00:01:15.246053 containerd[1735]: 2025-05-14 00:01:15.218 [INFO][4925] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
May 14 00:01:15.246053 containerd[1735]: 2025-05-14 00:01:15.218 [INFO][4925] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.70.133/26] IPv6=[] ContainerID="3679ee07b0a36b2917bfd957d5654b50c72fe83d87a33e2a8cca5b295e8c7c86" HandleID="k8s-pod-network.3679ee07b0a36b2917bfd957d5654b50c72fe83d87a33e2a8cca5b295e8c7c86" Workload="ci--4284.0.0--n--c527831f7b-k8s-calico--apiserver--6d47754b8--l5pcr-eth0"
May 14 00:01:15.248800 containerd[1735]: 2025-05-14 00:01:15.220 [INFO][4912] cni-plugin/k8s.go 386: Populated endpoint ContainerID="3679ee07b0a36b2917bfd957d5654b50c72fe83d87a33e2a8cca5b295e8c7c86" Namespace="calico-apiserver" Pod="calico-apiserver-6d47754b8-l5pcr" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-calico--apiserver--6d47754b8--l5pcr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284.0.0--n--c527831f7b-k8s-calico--apiserver--6d47754b8--l5pcr-eth0", GenerateName:"calico-apiserver-6d47754b8-", Namespace:"calico-apiserver", SelfLink:"", UID:"6386bb04-f08c-46cd-8bd4-cc9e0fdf6662", ResourceVersion:"725", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 0, 0, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6d47754b8", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284.0.0-n-c527831f7b", ContainerID:"", Pod:"calico-apiserver-6d47754b8-l5pcr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.70.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali31c12c44a8e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
May 14 00:01:15.248800 containerd[1735]: 2025-05-14 00:01:15.220 [INFO][4912] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.70.133/32] ContainerID="3679ee07b0a36b2917bfd957d5654b50c72fe83d87a33e2a8cca5b295e8c7c86" Namespace="calico-apiserver" Pod="calico-apiserver-6d47754b8-l5pcr" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-calico--apiserver--6d47754b8--l5pcr-eth0"
May 14 00:01:15.248800 containerd[1735]: 2025-05-14 00:01:15.220 [INFO][4912] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali31c12c44a8e ContainerID="3679ee07b0a36b2917bfd957d5654b50c72fe83d87a33e2a8cca5b295e8c7c86" Namespace="calico-apiserver" Pod="calico-apiserver-6d47754b8-l5pcr" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-calico--apiserver--6d47754b8--l5pcr-eth0"
May 14 00:01:15.248800 containerd[1735]: 2025-05-14 00:01:15.226 [INFO][4912] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3679ee07b0a36b2917bfd957d5654b50c72fe83d87a33e2a8cca5b295e8c7c86" Namespace="calico-apiserver" Pod="calico-apiserver-6d47754b8-l5pcr" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-calico--apiserver--6d47754b8--l5pcr-eth0"
May 14 00:01:15.248800 containerd[1735]: 2025-05-14 00:01:15.227 [INFO][4912] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="3679ee07b0a36b2917bfd957d5654b50c72fe83d87a33e2a8cca5b295e8c7c86" Namespace="calico-apiserver" Pod="calico-apiserver-6d47754b8-l5pcr" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-calico--apiserver--6d47754b8--l5pcr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284.0.0--n--c527831f7b-k8s-calico--apiserver--6d47754b8--l5pcr-eth0", GenerateName:"calico-apiserver-6d47754b8-", Namespace:"calico-apiserver", SelfLink:"", UID:"6386bb04-f08c-46cd-8bd4-cc9e0fdf6662", ResourceVersion:"725", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 0, 0, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6d47754b8", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284.0.0-n-c527831f7b", ContainerID:"3679ee07b0a36b2917bfd957d5654b50c72fe83d87a33e2a8cca5b295e8c7c86", Pod:"calico-apiserver-6d47754b8-l5pcr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.70.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali31c12c44a8e", MAC:"a2:59:fa:f4:59:96", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
May 14 00:01:15.248800 containerd[1735]: 2025-05-14 00:01:15.243 [INFO][4912] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="3679ee07b0a36b2917bfd957d5654b50c72fe83d87a33e2a8cca5b295e8c7c86" Namespace="calico-apiserver" Pod="calico-apiserver-6d47754b8-l5pcr" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-calico--apiserver--6d47754b8--l5pcr-eth0"
May 14 00:01:15.732201 containerd[1735]: time="2025-05-14T00:01:15.732067170Z" level=info msg="Container 9e9f4645739df758bd828bcdd0b927bb607cc53fda5ea8edae878927e454248c: CDI devices from CRI Config.CDIDevices: []"
May 14 00:01:15.739693 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4263494461.mount: Deactivated successfully.
May 14 00:01:16.101338 containerd[1735]: time="2025-05-14T00:01:16.100916281Z" level=info msg="connecting to shim 3d162f6ee12a37cf6784eb321c00eee9b7777f3e0c827a663c51e8b283a40999" address="unix:///run/containerd/s/b8b6c9fc115e4516bbcdf7b4870b9de3ddfcd1210024d0dfa3e5b04698e4d3d4" namespace=k8s.io protocol=ttrpc version=3
May 14 00:01:16.129431 systemd[1]: Started cri-containerd-3d162f6ee12a37cf6784eb321c00eee9b7777f3e0c827a663c51e8b283a40999.scope - libcontainer container 3d162f6ee12a37cf6784eb321c00eee9b7777f3e0c827a663c51e8b283a40999.
May 14 00:01:16.153436 containerd[1735]: time="2025-05-14T00:01:16.153380266Z" level=info msg="connecting to shim 3864873fc08781fcf24f86a30b959837be4ae832cf40c411e9d2c7f236d3d959" address="unix:///run/containerd/s/f2a84cca9ab9503966daeeb1fd262d7248d22700376d6b1b6c6c270bd99d6e9d" namespace=k8s.io protocol=ttrpc version=3
May 14 00:01:16.188488 systemd[1]: Started cri-containerd-3864873fc08781fcf24f86a30b959837be4ae832cf40c411e9d2c7f236d3d959.scope - libcontainer container 3864873fc08781fcf24f86a30b959837be4ae832cf40c411e9d2c7f236d3d959.
May 14 00:01:16.232770 containerd[1735]: time="2025-05-14T00:01:16.232626099Z" level=info msg="CreateContainer within sandbox \"fe932abde74669703cbb0cc0b277720e9f02bc890298bbf41b60dcc5fe0f855a\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"9e9f4645739df758bd828bcdd0b927bb607cc53fda5ea8edae878927e454248c\""
May 14 00:01:16.234663 containerd[1735]: time="2025-05-14T00:01:16.234595917Z" level=info msg="StartContainer for \"9e9f4645739df758bd828bcdd0b927bb607cc53fda5ea8edae878927e454248c\""
May 14 00:01:16.237130 containerd[1735]: time="2025-05-14T00:01:16.237062440Z" level=info msg="connecting to shim 9e9f4645739df758bd828bcdd0b927bb607cc53fda5ea8edae878927e454248c" address="unix:///run/containerd/s/7b4bdb611e059cdf269d8a290f4727a91c1aedbcbb41439d3fa8722de81a16f7" protocol=ttrpc version=3
May 14 00:01:16.269757 systemd[1]: Started cri-containerd-9e9f4645739df758bd828bcdd0b927bb607cc53fda5ea8edae878927e454248c.scope - libcontainer container 9e9f4645739df758bd828bcdd0b927bb607cc53fda5ea8edae878927e454248c.
May 14 00:01:16.390183 containerd[1735]: time="2025-05-14T00:01:16.389382849Z" level=info msg="StartContainer for \"9e9f4645739df758bd828bcdd0b927bb607cc53fda5ea8edae878927e454248c\" returns successfully"
May 14 00:01:16.483193 containerd[1735]: time="2025-05-14T00:01:16.483123716Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-zl6h5,Uid:318fe614-9b16-49bc-840b-cb2287a86e40,Namespace:kube-system,Attempt:0,} returns sandbox id \"3d162f6ee12a37cf6784eb321c00eee9b7777f3e0c827a663c51e8b283a40999\""
May 14 00:01:16.488246 containerd[1735]: time="2025-05-14T00:01:16.488176263Z" level=info msg="CreateContainer within sandbox \"3d162f6ee12a37cf6784eb321c00eee9b7777f3e0c827a663c51e8b283a40999\" for container &ContainerMetadata{Name:coredns,Attempt:0,}"
May 14 00:01:16.564695 systemd-networkd[1559]: calidfe8a8b32df: Gained IPv6LL
May 14 00:01:16.577225 containerd[1735]: time="2025-05-14T00:01:16.576709281Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-b8c5bc6b6-6m9m2,Uid:6fe88591-92ff-4491-90ae-c32f13e57639,Namespace:calico-system,Attempt:0,} returns sandbox id \"3864873fc08781fcf24f86a30b959837be4ae832cf40c411e9d2c7f236d3d959\""
May 14 00:01:16.692423 systemd-networkd[1559]: cali31c12c44a8e: Gained IPv6LL
May 14 00:01:16.837389 containerd[1735]: time="2025-05-14T00:01:16.836979488Z" level=info msg="connecting to shim 3679ee07b0a36b2917bfd957d5654b50c72fe83d87a33e2a8cca5b295e8c7c86" address="unix:///run/containerd/s/f37d1bd75a2b2ef0d4ad4d5c3d6d270792e7810941ff5bc1ab8ba42821146bfa" namespace=k8s.io protocol=ttrpc version=3
May 14 00:01:16.859470 systemd[1]: Started cri-containerd-3679ee07b0a36b2917bfd957d5654b50c72fe83d87a33e2a8cca5b295e8c7c86.scope - libcontainer container 3679ee07b0a36b2917bfd957d5654b50c72fe83d87a33e2a8cca5b295e8c7c86.
May 14 00:01:16.989086 containerd[1735]: time="2025-05-14T00:01:16.988906193Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6d47754b8-l5pcr,Uid:6386bb04-f08c-46cd-8bd4-cc9e0fdf6662,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"3679ee07b0a36b2917bfd957d5654b50c72fe83d87a33e2a8cca5b295e8c7c86\""
May 14 00:01:17.038885 containerd[1735]: time="2025-05-14T00:01:17.038048748Z" level=info msg="Container f2506f8bd2b2078c47f3374c24a0866511e3b598fc55cf41fad9cb38a47c685e: CDI devices from CRI Config.CDIDevices: []"
May 14 00:01:17.050158 kubelet[3280]: I0514 00:01:17.050075 3280 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-2dhjc" podStartSLOduration=56.050052359 podStartE2EDuration="56.050052359s" podCreationTimestamp="2025-05-14 00:00:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-14 00:01:17.049348252 +0000 UTC m=+56.204307648" watchObservedRunningTime="2025-05-14 00:01:17.050052359 +0000 UTC m=+56.205011755"
May 14 00:01:17.335641 containerd[1735]: time="2025-05-14T00:01:17.335425498Z" level=info msg="CreateContainer within sandbox \"3d162f6ee12a37cf6784eb321c00eee9b7777f3e0c827a663c51e8b283a40999\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"f2506f8bd2b2078c47f3374c24a0866511e3b598fc55cf41fad9cb38a47c685e\""
May 14 00:01:17.337095 containerd[1735]: time="2025-05-14T00:01:17.336908412Z" level=info msg="StartContainer for \"f2506f8bd2b2078c47f3374c24a0866511e3b598fc55cf41fad9cb38a47c685e\""
May 14 00:01:17.338772 containerd[1735]: time="2025-05-14T00:01:17.338737629Z" level=info msg="connecting to shim f2506f8bd2b2078c47f3374c24a0866511e3b598fc55cf41fad9cb38a47c685e" address="unix:///run/containerd/s/b8b6c9fc115e4516bbcdf7b4870b9de3ddfcd1210024d0dfa3e5b04698e4d3d4" protocol=ttrpc version=3
May 14 00:01:17.364481 systemd[1]: Started cri-containerd-f2506f8bd2b2078c47f3374c24a0866511e3b598fc55cf41fad9cb38a47c685e.scope - libcontainer container f2506f8bd2b2078c47f3374c24a0866511e3b598fc55cf41fad9cb38a47c685e.
May 14 00:01:17.428362 containerd[1735]: time="2025-05-14T00:01:17.428289957Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 14 00:01:17.432540 containerd[1735]: time="2025-05-14T00:01:17.432511196Z" level=info msg="StartContainer for \"f2506f8bd2b2078c47f3374c24a0866511e3b598fc55cf41fad9cb38a47c685e\" returns successfully"
May 14 00:01:17.475562 containerd[1735]: time="2025-05-14T00:01:17.475461093Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=43021437"
May 14 00:01:17.481136 containerd[1735]: time="2025-05-14T00:01:17.481056545Z" level=info msg="ImageCreate event name:\"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 14 00:01:17.539916 containerd[1735]: time="2025-05-14T00:01:17.539826788Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 14 00:01:17.541044 containerd[1735]: time="2025-05-14T00:01:17.540853298Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"44514075\" in 7.115321687s"
May 14 00:01:17.541044 containerd[1735]: time="2025-05-14T00:01:17.540903098Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\""
May 14 00:01:17.543420 containerd[1735]: time="2025-05-14T00:01:17.543104218Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\""
May 14 00:01:17.544473 containerd[1735]: time="2025-05-14T00:01:17.544418531Z" level=info msg="CreateContainer within sandbox \"37a7eef959268e10fc96e0d0a1d776429d62cfb07c1477be1d6e0deef9bebb69\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
May 14 00:01:17.727451 containerd[1735]: time="2025-05-14T00:01:17.727382623Z" level=info msg="Container b6d9a3a87ac3f20ec43edf7dae5dc6f972b72251cd0a5db06b1d9f3009ac9405: CDI devices from CRI Config.CDIDevices: []"
May 14 00:01:17.829915 containerd[1735]: time="2025-05-14T00:01:17.829858170Z" level=info msg="CreateContainer within sandbox \"37a7eef959268e10fc96e0d0a1d776429d62cfb07c1477be1d6e0deef9bebb69\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"b6d9a3a87ac3f20ec43edf7dae5dc6f972b72251cd0a5db06b1d9f3009ac9405\""
May 14 00:01:17.832100 containerd[1735]: time="2025-05-14T00:01:17.830492376Z" level=info msg="StartContainer for \"b6d9a3a87ac3f20ec43edf7dae5dc6f972b72251cd0a5db06b1d9f3009ac9405\""
May 14 00:01:17.832100 containerd[1735]: time="2025-05-14T00:01:17.831990690Z" level=info msg="connecting to shim b6d9a3a87ac3f20ec43edf7dae5dc6f972b72251cd0a5db06b1d9f3009ac9405" address="unix:///run/containerd/s/ddcf7995889688cdaa5a4b42dbb7d4723a0c270ba3b99f19f8dd3ffb03d9d94c" protocol=ttrpc version=3
May 14 00:01:17.858477 systemd[1]: Started cri-containerd-b6d9a3a87ac3f20ec43edf7dae5dc6f972b72251cd0a5db06b1d9f3009ac9405.scope - libcontainer container b6d9a3a87ac3f20ec43edf7dae5dc6f972b72251cd0a5db06b1d9f3009ac9405.
May 14 00:01:17.942290 containerd[1735]: time="2025-05-14T00:01:17.942249710Z" level=info msg="StartContainer for \"b6d9a3a87ac3f20ec43edf7dae5dc6f972b72251cd0a5db06b1d9f3009ac9405\" returns successfully"
May 14 00:01:18.060072 kubelet[3280]: I0514 00:01:18.059901 3280 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6d47754b8-jnbd2" podStartSLOduration=39.942829095 podStartE2EDuration="47.059878098s" podCreationTimestamp="2025-05-14 00:00:31 +0000 UTC" firstStartedPulling="2025-05-14 00:01:10.425092607 +0000 UTC m=+49.580052103" lastFinishedPulling="2025-05-14 00:01:17.54214161 +0000 UTC m=+56.697101106" observedRunningTime="2025-05-14 00:01:18.057462875 +0000 UTC m=+57.212422271" watchObservedRunningTime="2025-05-14 00:01:18.059878098 +0000 UTC m=+57.214837494"
May 14 00:01:18.078467 kubelet[3280]: I0514 00:01:18.078341 3280 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-zl6h5" podStartSLOduration=57.078320368 podStartE2EDuration="57.078320368s" podCreationTimestamp="2025-05-14 00:00:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-14 00:01:18.07738916 +0000 UTC m=+57.232348556" watchObservedRunningTime="2025-05-14 00:01:18.078320368 +0000 UTC m=+57.233279864"
May 14 00:01:19.046233 kubelet[3280]: I0514 00:01:19.046194 3280 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
May 14 00:01:21.540319 containerd[1735]: time="2025-05-14T00:01:21.540241493Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 14 00:01:21.576464 containerd[1735]: time="2025-05-14T00:01:21.576375198Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.3: active requests=0, bytes read=34789138"
May 14 00:01:21.630460 containerd[1735]: time="2025-05-14T00:01:21.630358955Z" level=info msg="ImageCreate event name:\"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 14 00:01:21.680375 containerd[1735]: time="2025-05-14T00:01:21.678786664Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:feaab0197035d474845e0f8137a99a78cab274f0a3cac4d5485cf9b1bdf9ffa9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 14 00:01:21.680375 containerd[1735]: time="2025-05-14T00:01:21.679867273Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" with image id \"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:feaab0197035d474845e0f8137a99a78cab274f0a3cac4d5485cf9b1bdf9ffa9\", size \"36281728\" in 4.136598153s"
May 14 00:01:21.680375 containerd[1735]: time="2025-05-14T00:01:21.679909874Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" returns image reference \"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\""
May 14 00:01:21.682483 containerd[1735]: time="2025-05-14T00:01:21.682448095Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\""
May 14 00:01:21.702326 containerd[1735]: time="2025-05-14T00:01:21.702271263Z" level=info msg="CreateContainer within sandbox \"3864873fc08781fcf24f86a30b959837be4ae832cf40c411e9d2c7f236d3d959\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}"
May 14 00:01:21.843641 containerd[1735]: time="2025-05-14T00:01:21.841336839Z" level=info msg="Container fdf92332ce19f27a77b3f63e142bb1655cdae4e8d781024b56543a52882ef650: CDI devices from CRI Config.CDIDevices: []"
May 14 00:01:21.936387 containerd[1735]: time="2025-05-14T00:01:21.936341643Z" level=info msg="CreateContainer within sandbox \"3864873fc08781fcf24f86a30b959837be4ae832cf40c411e9d2c7f236d3d959\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"fdf92332ce19f27a77b3f63e142bb1655cdae4e8d781024b56543a52882ef650\""
May 14 00:01:21.937097 containerd[1735]: time="2025-05-14T00:01:21.937060149Z" level=info msg="StartContainer for \"fdf92332ce19f27a77b3f63e142bb1655cdae4e8d781024b56543a52882ef650\""
May 14 00:01:21.939108 containerd[1735]: time="2025-05-14T00:01:21.938967165Z" level=info msg="connecting to shim fdf92332ce19f27a77b3f63e142bb1655cdae4e8d781024b56543a52882ef650" address="unix:///run/containerd/s/f2a84cca9ab9503966daeeb1fd262d7248d22700376d6b1b6c6c270bd99d6e9d" protocol=ttrpc version=3
May 14 00:01:21.968511 systemd[1]: Started cri-containerd-fdf92332ce19f27a77b3f63e142bb1655cdae4e8d781024b56543a52882ef650.scope - libcontainer container fdf92332ce19f27a77b3f63e142bb1655cdae4e8d781024b56543a52882ef650.
May 14 00:01:22.094895 containerd[1735]: time="2025-05-14T00:01:22.094362679Z" level=info msg="StartContainer for \"fdf92332ce19f27a77b3f63e142bb1655cdae4e8d781024b56543a52882ef650\" returns successfully"
May 14 00:01:22.531639 containerd[1735]: time="2025-05-14T00:01:22.531580577Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 14 00:01:22.578176 containerd[1735]: time="2025-05-14T00:01:22.578084170Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=77"
May 14 00:01:22.580279 containerd[1735]: time="2025-05-14T00:01:22.580237488Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"44514075\" in 897.594091ms"
May 14 00:01:22.580489 containerd[1735]: time="2025-05-14T00:01:22.580280289Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\""
May 14 00:01:22.583553 containerd[1735]: time="2025-05-14T00:01:22.583481216Z" level=info msg="CreateContainer within sandbox \"3679ee07b0a36b2917bfd957d5654b50c72fe83d87a33e2a8cca5b295e8c7c86\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
May 14 00:01:22.742382 containerd[1735]: time="2025-05-14T00:01:22.742264059Z" level=info msg="Container 4d6a3b01359ca4a20eac056c5b2e89e3adb0d44a01f4048c4942a52e6ef90826: CDI devices from CRI Config.CDIDevices: []"
May 14 00:01:22.773274 containerd[1735]: time="2025-05-14T00:01:22.773212620Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vj6mt,Uid:de27c4f6-c0d2-40b8-bd37-0674db8e9821,Namespace:calico-system,Attempt:0,}"
May 14 00:01:22.915777 containerd[1735]: time="2025-05-14T00:01:22.915641625Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4f4b3c7db80069125d3c110e3c1a0c95e9e2c7cf6a9c323e94eadf8b5e16292b\" id:\"b100e33e57f92f7e5ab8988a0c81531e1655d56b4dbda12022a3582531dd2b4a\" pid:5269 exited_at:{seconds:1747180882 nanos:915193121}"
May 14 00:01:22.932330 containerd[1735]: time="2025-05-14T00:01:22.930401450Z" level=info msg="CreateContainer within sandbox \"3679ee07b0a36b2917bfd957d5654b50c72fe83d87a33e2a8cca5b295e8c7c86\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"4d6a3b01359ca4a20eac056c5b2e89e3adb0d44a01f4048c4942a52e6ef90826\""
May 14 00:01:22.941871 containerd[1735]: time="2025-05-14T00:01:22.932636469Z" level=info msg="StartContainer for \"4d6a3b01359ca4a20eac056c5b2e89e3adb0d44a01f4048c4942a52e6ef90826\""
May 14 00:01:22.941871 containerd[1735]: time="2025-05-14T00:01:22.934961888Z" level=info msg="connecting to shim 4d6a3b01359ca4a20eac056c5b2e89e3adb0d44a01f4048c4942a52e6ef90826" address="unix:///run/containerd/s/f37d1bd75a2b2ef0d4ad4d5c3d6d270792e7810941ff5bc1ab8ba42821146bfa" protocol=ttrpc version=3
May 14 00:01:22.981510 systemd[1]: Started cri-containerd-4d6a3b01359ca4a20eac056c5b2e89e3adb0d44a01f4048c4942a52e6ef90826.scope - libcontainer container 4d6a3b01359ca4a20eac056c5b2e89e3adb0d44a01f4048c4942a52e6ef90826.
May 14 00:01:23.088108 containerd[1735]: time="2025-05-14T00:01:23.088069583Z" level=info msg="StartContainer for \"4d6a3b01359ca4a20eac056c5b2e89e3adb0d44a01f4048c4942a52e6ef90826\" returns successfully"
May 14 00:01:23.102705 kubelet[3280]: I0514 00:01:23.102665 3280 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
May 14 00:01:23.244033 kubelet[3280]: I0514 00:01:23.243966 3280 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6d47754b8-l5pcr" podStartSLOduration=46.654434124 podStartE2EDuration="52.243939601s" podCreationTimestamp="2025-05-14 00:00:31 +0000 UTC" firstStartedPulling="2025-05-14 00:01:16.991563918 +0000 UTC m=+56.146523314" lastFinishedPulling="2025-05-14 00:01:22.581069295 +0000 UTC m=+61.736028791" observedRunningTime="2025-05-14 00:01:23.168815866 +0000 UTC m=+62.323775262" watchObservedRunningTime="2025-05-14 00:01:23.243939601 +0000 UTC m=+62.398899097"
May 14 00:01:23.260606 containerd[1735]: time="2025-05-14T00:01:23.260552542Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fdf92332ce19f27a77b3f63e142bb1655cdae4e8d781024b56543a52882ef650\" id:\"5da4cd5c8f3cd804d075e99caf751c39245cd9dc3a56ca21615166fe39113df1\" pid:5345 exited_at:{seconds:1747180883 nanos:259879236}"
May 14 00:01:23.352912 kubelet[3280]: I0514 00:01:23.352556 3280 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-b8c5bc6b6-6m9m2" podStartSLOduration=47.249700932 podStartE2EDuration="52.35252572s" podCreationTimestamp="2025-05-14 00:00:31 +0000 UTC" firstStartedPulling="2025-05-14 00:01:16.578337896 +0000 UTC m=+55.733297292" lastFinishedPulling="2025-05-14 00:01:21.681162684 +0000 UTC m=+60.836122080" observedRunningTime="2025-05-14 00:01:23.321235855 +0000 UTC m=+62.476195251" watchObservedRunningTime="2025-05-14 00:01:23.35252572 +0000 UTC m=+62.507485116"
May 14 00:01:23.388905 systemd-networkd[1559]: cali6425c2c50b1: Link UP
May 14 00:01:23.389505 systemd-networkd[1559]: cali6425c2c50b1: Gained carrier
May 14 00:01:23.418873 containerd[1735]: 2025-05-14 00:01:23.025 [INFO][5294] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284.0.0--n--c527831f7b-k8s-csi--node--driver--vj6mt-eth0 csi-node-driver- calico-system de27c4f6-c0d2-40b8-bd37-0674db8e9821 598 0 2025-05-14 00:00:31 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:5bcd8f69 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4284.0.0-n-c527831f7b csi-node-driver-vj6mt eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali6425c2c50b1 [] []}} ContainerID="99f457dd21460b7b9ae33ad9824ef40f3bf3f32b60140725078bcb3526b75de9" Namespace="calico-system" Pod="csi-node-driver-vj6mt" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-csi--node--driver--vj6mt-"
May 14 00:01:23.418873 containerd[1735]: 2025-05-14 00:01:23.025 [INFO][5294] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="99f457dd21460b7b9ae33ad9824ef40f3bf3f32b60140725078bcb3526b75de9" Namespace="calico-system" Pod="csi-node-driver-vj6mt"
WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-csi--node--driver--vj6mt-eth0" May 14 00:01:23.418873 containerd[1735]: 2025-05-14 00:01:23.070 [INFO][5315] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="99f457dd21460b7b9ae33ad9824ef40f3bf3f32b60140725078bcb3526b75de9" HandleID="k8s-pod-network.99f457dd21460b7b9ae33ad9824ef40f3bf3f32b60140725078bcb3526b75de9" Workload="ci--4284.0.0--n--c527831f7b-k8s-csi--node--driver--vj6mt-eth0" May 14 00:01:23.418873 containerd[1735]: 2025-05-14 00:01:23.203 [INFO][5315] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="99f457dd21460b7b9ae33ad9824ef40f3bf3f32b60140725078bcb3526b75de9" HandleID="k8s-pod-network.99f457dd21460b7b9ae33ad9824ef40f3bf3f32b60140725078bcb3526b75de9" Workload="ci--4284.0.0--n--c527831f7b-k8s-csi--node--driver--vj6mt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002ed180), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4284.0.0-n-c527831f7b", "pod":"csi-node-driver-vj6mt", "timestamp":"2025-05-14 00:01:23.067823412 +0000 UTC"}, Hostname:"ci-4284.0.0-n-c527831f7b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 14 00:01:23.418873 containerd[1735]: 2025-05-14 00:01:23.203 [INFO][5315] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 14 00:01:23.418873 containerd[1735]: 2025-05-14 00:01:23.204 [INFO][5315] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 14 00:01:23.418873 containerd[1735]: 2025-05-14 00:01:23.204 [INFO][5315] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284.0.0-n-c527831f7b' May 14 00:01:23.418873 containerd[1735]: 2025-05-14 00:01:23.242 [INFO][5315] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.99f457dd21460b7b9ae33ad9824ef40f3bf3f32b60140725078bcb3526b75de9" host="ci-4284.0.0-n-c527831f7b" May 14 00:01:23.418873 containerd[1735]: 2025-05-14 00:01:23.299 [INFO][5315] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284.0.0-n-c527831f7b" May 14 00:01:23.418873 containerd[1735]: 2025-05-14 00:01:23.343 [INFO][5315] ipam/ipam.go 489: Trying affinity for 192.168.70.128/26 host="ci-4284.0.0-n-c527831f7b" May 14 00:01:23.418873 containerd[1735]: 2025-05-14 00:01:23.347 [INFO][5315] ipam/ipam.go 155: Attempting to load block cidr=192.168.70.128/26 host="ci-4284.0.0-n-c527831f7b" May 14 00:01:23.418873 containerd[1735]: 2025-05-14 00:01:23.351 [INFO][5315] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.70.128/26 host="ci-4284.0.0-n-c527831f7b" May 14 00:01:23.418873 containerd[1735]: 2025-05-14 00:01:23.351 [INFO][5315] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.70.128/26 handle="k8s-pod-network.99f457dd21460b7b9ae33ad9824ef40f3bf3f32b60140725078bcb3526b75de9" host="ci-4284.0.0-n-c527831f7b" May 14 00:01:23.418873 containerd[1735]: 2025-05-14 00:01:23.354 [INFO][5315] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.99f457dd21460b7b9ae33ad9824ef40f3bf3f32b60140725078bcb3526b75de9 May 14 00:01:23.418873 containerd[1735]: 2025-05-14 00:01:23.364 [INFO][5315] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.70.128/26 handle="k8s-pod-network.99f457dd21460b7b9ae33ad9824ef40f3bf3f32b60140725078bcb3526b75de9" host="ci-4284.0.0-n-c527831f7b" May 14 00:01:23.418873 containerd[1735]: 2025-05-14 00:01:23.379 [INFO][5315] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.70.134/26] block=192.168.70.128/26 handle="k8s-pod-network.99f457dd21460b7b9ae33ad9824ef40f3bf3f32b60140725078bcb3526b75de9" host="ci-4284.0.0-n-c527831f7b" May 14 00:01:23.418873 containerd[1735]: 2025-05-14 00:01:23.379 [INFO][5315] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.70.134/26] handle="k8s-pod-network.99f457dd21460b7b9ae33ad9824ef40f3bf3f32b60140725078bcb3526b75de9" host="ci-4284.0.0-n-c527831f7b" May 14 00:01:23.418873 containerd[1735]: 2025-05-14 00:01:23.379 [INFO][5315] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 14 00:01:23.418873 containerd[1735]: 2025-05-14 00:01:23.379 [INFO][5315] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.70.134/26] IPv6=[] ContainerID="99f457dd21460b7b9ae33ad9824ef40f3bf3f32b60140725078bcb3526b75de9" HandleID="k8s-pod-network.99f457dd21460b7b9ae33ad9824ef40f3bf3f32b60140725078bcb3526b75de9" Workload="ci--4284.0.0--n--c527831f7b-k8s-csi--node--driver--vj6mt-eth0" May 14 00:01:23.421889 containerd[1735]: 2025-05-14 00:01:23.382 [INFO][5294] cni-plugin/k8s.go 386: Populated endpoint ContainerID="99f457dd21460b7b9ae33ad9824ef40f3bf3f32b60140725078bcb3526b75de9" Namespace="calico-system" Pod="csi-node-driver-vj6mt" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-csi--node--driver--vj6mt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284.0.0--n--c527831f7b-k8s-csi--node--driver--vj6mt-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"de27c4f6-c0d2-40b8-bd37-0674db8e9821", ResourceVersion:"598", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 0, 0, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"5bcd8f69", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284.0.0-n-c527831f7b", ContainerID:"", Pod:"csi-node-driver-vj6mt", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.70.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali6425c2c50b1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 00:01:23.421889 containerd[1735]: 2025-05-14 00:01:23.383 [INFO][5294] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.70.134/32] ContainerID="99f457dd21460b7b9ae33ad9824ef40f3bf3f32b60140725078bcb3526b75de9" Namespace="calico-system" Pod="csi-node-driver-vj6mt" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-csi--node--driver--vj6mt-eth0" May 14 00:01:23.421889 containerd[1735]: 2025-05-14 00:01:23.383 [INFO][5294] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6425c2c50b1 ContainerID="99f457dd21460b7b9ae33ad9824ef40f3bf3f32b60140725078bcb3526b75de9" Namespace="calico-system" Pod="csi-node-driver-vj6mt" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-csi--node--driver--vj6mt-eth0" May 14 00:01:23.421889 containerd[1735]: 2025-05-14 00:01:23.389 [INFO][5294] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="99f457dd21460b7b9ae33ad9824ef40f3bf3f32b60140725078bcb3526b75de9" Namespace="calico-system" Pod="csi-node-driver-vj6mt" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-csi--node--driver--vj6mt-eth0" May 14 00:01:23.421889 containerd[1735]: 2025-05-14 00:01:23.391 
[INFO][5294] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="99f457dd21460b7b9ae33ad9824ef40f3bf3f32b60140725078bcb3526b75de9" Namespace="calico-system" Pod="csi-node-driver-vj6mt" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-csi--node--driver--vj6mt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284.0.0--n--c527831f7b-k8s-csi--node--driver--vj6mt-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"de27c4f6-c0d2-40b8-bd37-0674db8e9821", ResourceVersion:"598", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 0, 0, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"5bcd8f69", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284.0.0-n-c527831f7b", ContainerID:"99f457dd21460b7b9ae33ad9824ef40f3bf3f32b60140725078bcb3526b75de9", Pod:"csi-node-driver-vj6mt", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.70.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali6425c2c50b1", MAC:"7e:b4:e3:93:c1:6d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 00:01:23.421889 containerd[1735]: 2025-05-14 00:01:23.415 [INFO][5294] cni-plugin/k8s.go 500: Wrote updated endpoint to 
datastore ContainerID="99f457dd21460b7b9ae33ad9824ef40f3bf3f32b60140725078bcb3526b75de9" Namespace="calico-system" Pod="csi-node-driver-vj6mt" WorkloadEndpoint="ci--4284.0.0--n--c527831f7b-k8s-csi--node--driver--vj6mt-eth0" May 14 00:01:23.744268 containerd[1735]: time="2025-05-14T00:01:23.744177632Z" level=info msg="connecting to shim 99f457dd21460b7b9ae33ad9824ef40f3bf3f32b60140725078bcb3526b75de9" address="unix:///run/containerd/s/c5d91a648de5835003198c214a21981cc48054afe438a252441f980a2c0eb346" namespace=k8s.io protocol=ttrpc version=3 May 14 00:01:23.784469 systemd[1]: Started cri-containerd-99f457dd21460b7b9ae33ad9824ef40f3bf3f32b60140725078bcb3526b75de9.scope - libcontainer container 99f457dd21460b7b9ae33ad9824ef40f3bf3f32b60140725078bcb3526b75de9. May 14 00:01:23.825925 containerd[1735]: time="2025-05-14T00:01:23.825878623Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vj6mt,Uid:de27c4f6-c0d2-40b8-bd37-0674db8e9821,Namespace:calico-system,Attempt:0,} returns sandbox id \"99f457dd21460b7b9ae33ad9824ef40f3bf3f32b60140725078bcb3526b75de9\"" May 14 00:01:23.828005 containerd[1735]: time="2025-05-14T00:01:23.827914540Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.3\"" May 14 00:01:24.500464 systemd-networkd[1559]: cali6425c2c50b1: Gained IPv6LL May 14 00:01:25.334308 containerd[1735]: time="2025-05-14T00:01:25.334249480Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:01:25.337647 containerd[1735]: time="2025-05-14T00:01:25.337564908Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.3: active requests=0, bytes read=7912898" May 14 00:01:25.343282 containerd[1735]: time="2025-05-14T00:01:25.343208656Z" level=info msg="ImageCreate event name:\"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:01:25.348266 
containerd[1735]: time="2025-05-14T00:01:25.348211698Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:72455a36febc7c56ec8881007f4805caed5764026a0694e4f86a2503209b2d31\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:01:25.349240 containerd[1735]: time="2025-05-14T00:01:25.348835903Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.3\" with image id \"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:72455a36febc7c56ec8881007f4805caed5764026a0694e4f86a2503209b2d31\", size \"9405520\" in 1.520858762s" May 14 00:01:25.349240 containerd[1735]: time="2025-05-14T00:01:25.348875604Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.3\" returns image reference \"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\"" May 14 00:01:25.351698 containerd[1735]: time="2025-05-14T00:01:25.351674027Z" level=info msg="CreateContainer within sandbox \"99f457dd21460b7b9ae33ad9824ef40f3bf3f32b60140725078bcb3526b75de9\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" May 14 00:01:25.379566 containerd[1735]: time="2025-05-14T00:01:25.377588547Z" level=info msg="Container b19ea18abd5837ce81ad54eca9ab80eed0ab71bae1d5a251d31ecc8be3e2ceaa: CDI devices from CRI Config.CDIDevices: []" May 14 00:01:25.383623 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1864637268.mount: Deactivated successfully. 
May 14 00:01:25.403040 containerd[1735]: time="2025-05-14T00:01:25.402992761Z" level=info msg="CreateContainer within sandbox \"99f457dd21460b7b9ae33ad9824ef40f3bf3f32b60140725078bcb3526b75de9\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"b19ea18abd5837ce81ad54eca9ab80eed0ab71bae1d5a251d31ecc8be3e2ceaa\"" May 14 00:01:25.403729 containerd[1735]: time="2025-05-14T00:01:25.403594467Z" level=info msg="StartContainer for \"b19ea18abd5837ce81ad54eca9ab80eed0ab71bae1d5a251d31ecc8be3e2ceaa\"" May 14 00:01:25.405922 containerd[1735]: time="2025-05-14T00:01:25.405882486Z" level=info msg="connecting to shim b19ea18abd5837ce81ad54eca9ab80eed0ab71bae1d5a251d31ecc8be3e2ceaa" address="unix:///run/containerd/s/c5d91a648de5835003198c214a21981cc48054afe438a252441f980a2c0eb346" protocol=ttrpc version=3 May 14 00:01:25.431482 systemd[1]: Started cri-containerd-b19ea18abd5837ce81ad54eca9ab80eed0ab71bae1d5a251d31ecc8be3e2ceaa.scope - libcontainer container b19ea18abd5837ce81ad54eca9ab80eed0ab71bae1d5a251d31ecc8be3e2ceaa. 
May 14 00:01:25.473100 containerd[1735]: time="2025-05-14T00:01:25.472961653Z" level=info msg="StartContainer for \"b19ea18abd5837ce81ad54eca9ab80eed0ab71bae1d5a251d31ecc8be3e2ceaa\" returns successfully" May 14 00:01:25.474698 containerd[1735]: time="2025-05-14T00:01:25.474658168Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\"" May 14 00:01:27.265228 containerd[1735]: time="2025-05-14T00:01:27.265186111Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fdf92332ce19f27a77b3f63e142bb1655cdae4e8d781024b56543a52882ef650\" id:\"c4c14615367a30f9026ae061bfcb443ac3a24ee5d46dd17076912cb4ec4b58ca\" pid:5468 exited_at:{seconds:1747180887 nanos:264570706}" May 14 00:01:27.717545 containerd[1735]: time="2025-05-14T00:01:27.717490236Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:01:27.719820 containerd[1735]: time="2025-05-14T00:01:27.719739655Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3: active requests=0, bytes read=13991773" May 14 00:01:27.725600 containerd[1735]: time="2025-05-14T00:01:27.725538904Z" level=info msg="ImageCreate event name:\"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:01:27.731213 containerd[1735]: time="2025-05-14T00:01:27.730594747Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:3f15090a9bb45773d1fd019455ec3d3f3746f3287c35d8013e497b38d8237324\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:01:27.731213 containerd[1735]: time="2025-05-14T00:01:27.731056851Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" with image id \"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\", repo tag 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:3f15090a9bb45773d1fd019455ec3d3f3746f3287c35d8013e497b38d8237324\", size \"15484347\" in 2.256355983s" May 14 00:01:27.731213 containerd[1735]: time="2025-05-14T00:01:27.731093951Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" returns image reference \"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\"" May 14 00:01:27.735355 containerd[1735]: time="2025-05-14T00:01:27.734479080Z" level=info msg="CreateContainer within sandbox \"99f457dd21460b7b9ae33ad9824ef40f3bf3f32b60140725078bcb3526b75de9\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" May 14 00:01:27.760324 containerd[1735]: time="2025-05-14T00:01:27.757167072Z" level=info msg="Container 11e71faee144418e7fe26abbe2bf471bee40b4dce98338ae75c751a22a7d37e7: CDI devices from CRI Config.CDIDevices: []" May 14 00:01:27.784591 containerd[1735]: time="2025-05-14T00:01:27.783775497Z" level=info msg="CreateContainer within sandbox \"99f457dd21460b7b9ae33ad9824ef40f3bf3f32b60140725078bcb3526b75de9\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"11e71faee144418e7fe26abbe2bf471bee40b4dce98338ae75c751a22a7d37e7\"" May 14 00:01:27.787254 containerd[1735]: time="2025-05-14T00:01:27.787071425Z" level=info msg="StartContainer for \"11e71faee144418e7fe26abbe2bf471bee40b4dce98338ae75c751a22a7d37e7\"" May 14 00:01:27.790471 containerd[1735]: time="2025-05-14T00:01:27.790431853Z" level=info msg="connecting to shim 11e71faee144418e7fe26abbe2bf471bee40b4dce98338ae75c751a22a7d37e7" address="unix:///run/containerd/s/c5d91a648de5835003198c214a21981cc48054afe438a252441f980a2c0eb346" protocol=ttrpc version=3 May 14 00:01:27.816450 systemd[1]: Started cri-containerd-11e71faee144418e7fe26abbe2bf471bee40b4dce98338ae75c751a22a7d37e7.scope - libcontainer container 
11e71faee144418e7fe26abbe2bf471bee40b4dce98338ae75c751a22a7d37e7. May 14 00:01:27.882242 containerd[1735]: time="2025-05-14T00:01:27.881522023Z" level=info msg="StartContainer for \"11e71faee144418e7fe26abbe2bf471bee40b4dce98338ae75c751a22a7d37e7\" returns successfully" May 14 00:01:27.925997 kubelet[3280]: I0514 00:01:27.925966 3280 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 May 14 00:01:27.926495 kubelet[3280]: I0514 00:01:27.926085 3280 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock May 14 00:01:28.157094 kubelet[3280]: I0514 00:01:28.156932 3280 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-vj6mt" podStartSLOduration=53.251921327 podStartE2EDuration="57.156908253s" podCreationTimestamp="2025-05-14 00:00:31 +0000 UTC" firstStartedPulling="2025-05-14 00:01:23.827425036 +0000 UTC m=+62.982384432" lastFinishedPulling="2025-05-14 00:01:27.732411962 +0000 UTC m=+66.887371358" observedRunningTime="2025-05-14 00:01:28.155702542 +0000 UTC m=+67.310661938" watchObservedRunningTime="2025-05-14 00:01:28.156908253 +0000 UTC m=+67.311867649" May 14 00:01:52.916739 containerd[1735]: time="2025-05-14T00:01:52.916578917Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4f4b3c7db80069125d3c110e3c1a0c95e9e2c7cf6a9c323e94eadf8b5e16292b\" id:\"86a76d7931ceb4a946b584a67da5b299fd554cf71c70fa096d38834619fc20b9\" pid:5548 exited_at:{seconds:1747180912 nanos:916227714}" May 14 00:01:57.255259 containerd[1735]: time="2025-05-14T00:01:57.255197581Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fdf92332ce19f27a77b3f63e142bb1655cdae4e8d781024b56543a52882ef650\" id:\"454d7c6555f7eebbb58c3ef2b7822c9abf07533cebf15f2556006664805f30dd\" pid:5574 exited_at:{seconds:1747180917 
nanos:254959079}" May 14 00:02:06.744179 containerd[1735]: time="2025-05-14T00:02:06.744060911Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fdf92332ce19f27a77b3f63e142bb1655cdae4e8d781024b56543a52882ef650\" id:\"ac80d3a81bb0c2b523dbda6e60b895a3d263f81cfe5258c47949c8ec8d0b40a7\" pid:5597 exited_at:{seconds:1747180926 nanos:743852009}" May 14 00:02:22.914402 containerd[1735]: time="2025-05-14T00:02:22.914348499Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4f4b3c7db80069125d3c110e3c1a0c95e9e2c7cf6a9c323e94eadf8b5e16292b\" id:\"63d9363313771adfe58b215fc143bb01adb3e666ecf7407296951fde0b86e076\" pid:5619 exited_at:{seconds:1747180942 nanos:913820494}" May 14 00:02:27.253998 containerd[1735]: time="2025-05-14T00:02:27.253848635Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fdf92332ce19f27a77b3f63e142bb1655cdae4e8d781024b56543a52882ef650\" id:\"78845866e233ecd54962618109989bf0bbadd23e9f8afb68d11d2ecd5acce6a1\" pid:5646 exited_at:{seconds:1747180947 nanos:253570732}" May 14 00:02:32.441643 systemd[1]: Started sshd@7-10.200.8.5:22-10.200.16.10:40542.service - OpenSSH per-connection server daemon (10.200.16.10:40542). May 14 00:02:33.072856 sshd[5665]: Accepted publickey for core from 10.200.16.10 port 40542 ssh2: RSA SHA256:kdsm4aPxgwFO/vR4uHEnGUnhOKZ6XU57pxl25IkKi98 May 14 00:02:33.074490 sshd-session[5665]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 00:02:33.081144 systemd-logind[1706]: New session 10 of user core. May 14 00:02:33.085483 systemd[1]: Started session-10.scope - Session 10 of User core. May 14 00:02:33.606608 sshd[5667]: Connection closed by 10.200.16.10 port 40542 May 14 00:02:33.607489 sshd-session[5665]: pam_unix(sshd:session): session closed for user core May 14 00:02:33.612122 systemd[1]: sshd@7-10.200.8.5:22-10.200.16.10:40542.service: Deactivated successfully. May 14 00:02:33.614781 systemd[1]: session-10.scope: Deactivated successfully. 
May 14 00:02:33.615664 systemd-logind[1706]: Session 10 logged out. Waiting for processes to exit. May 14 00:02:33.616563 systemd-logind[1706]: Removed session 10. May 14 00:02:38.720266 systemd[1]: Started sshd@8-10.200.8.5:22-10.200.16.10:44258.service - OpenSSH per-connection server daemon (10.200.16.10:44258). May 14 00:02:39.352396 sshd[5680]: Accepted publickey for core from 10.200.16.10 port 44258 ssh2: RSA SHA256:kdsm4aPxgwFO/vR4uHEnGUnhOKZ6XU57pxl25IkKi98 May 14 00:02:39.354006 sshd-session[5680]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 00:02:39.358864 systemd-logind[1706]: New session 11 of user core. May 14 00:02:39.364455 systemd[1]: Started session-11.scope - Session 11 of User core. May 14 00:02:39.857101 sshd[5682]: Connection closed by 10.200.16.10 port 44258 May 14 00:02:39.858079 sshd-session[5680]: pam_unix(sshd:session): session closed for user core May 14 00:02:39.861700 systemd[1]: sshd@8-10.200.8.5:22-10.200.16.10:44258.service: Deactivated successfully. May 14 00:02:39.864180 systemd[1]: session-11.scope: Deactivated successfully. May 14 00:02:39.865941 systemd-logind[1706]: Session 11 logged out. Waiting for processes to exit. May 14 00:02:39.866954 systemd-logind[1706]: Removed session 11. May 14 00:02:44.970745 systemd[1]: Started sshd@9-10.200.8.5:22-10.200.16.10:44266.service - OpenSSH per-connection server daemon (10.200.16.10:44266). May 14 00:02:45.606158 sshd[5708]: Accepted publickey for core from 10.200.16.10 port 44266 ssh2: RSA SHA256:kdsm4aPxgwFO/vR4uHEnGUnhOKZ6XU57pxl25IkKi98 May 14 00:02:45.607695 sshd-session[5708]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 00:02:45.612390 systemd-logind[1706]: New session 12 of user core. May 14 00:02:45.616457 systemd[1]: Started session-12.scope - Session 12 of User core. 
May 14 00:02:46.109887 sshd[5715]: Connection closed by 10.200.16.10 port 44266 May 14 00:02:46.110721 sshd-session[5708]: pam_unix(sshd:session): session closed for user core May 14 00:02:46.114898 systemd[1]: sshd@9-10.200.8.5:22-10.200.16.10:44266.service: Deactivated successfully. May 14 00:02:46.117044 systemd[1]: session-12.scope: Deactivated successfully. May 14 00:02:46.118026 systemd-logind[1706]: Session 12 logged out. Waiting for processes to exit. May 14 00:02:46.119060 systemd-logind[1706]: Removed session 12. May 14 00:02:51.222783 systemd[1]: Started sshd@10-10.200.8.5:22-10.200.16.10:36700.service - OpenSSH per-connection server daemon (10.200.16.10:36700). May 14 00:02:51.855604 sshd[5728]: Accepted publickey for core from 10.200.16.10 port 36700 ssh2: RSA SHA256:kdsm4aPxgwFO/vR4uHEnGUnhOKZ6XU57pxl25IkKi98 May 14 00:02:51.857098 sshd-session[5728]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 00:02:51.861577 systemd-logind[1706]: New session 13 of user core. May 14 00:02:51.867451 systemd[1]: Started session-13.scope - Session 13 of User core. May 14 00:02:52.356928 sshd[5730]: Connection closed by 10.200.16.10 port 36700 May 14 00:02:52.357291 sshd-session[5728]: pam_unix(sshd:session): session closed for user core May 14 00:02:52.365065 systemd[1]: sshd@10-10.200.8.5:22-10.200.16.10:36700.service: Deactivated successfully. May 14 00:02:52.368689 systemd[1]: session-13.scope: Deactivated successfully. May 14 00:02:52.370842 systemd-logind[1706]: Session 13 logged out. Waiting for processes to exit. May 14 00:02:52.372226 systemd-logind[1706]: Removed session 13. 
May 14 00:02:52.911437 containerd[1735]: time="2025-05-14T00:02:52.911204453Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4f4b3c7db80069125d3c110e3c1a0c95e9e2c7cf6a9c323e94eadf8b5e16292b\" id:\"cad1adaa8613db0eff5691f06161f84d7bd821b31d41268614907d581141d247\" pid:5754 exited_at:{seconds:1747180972 nanos:910837349}" May 14 00:02:57.252890 containerd[1735]: time="2025-05-14T00:02:57.252838760Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fdf92332ce19f27a77b3f63e142bb1655cdae4e8d781024b56543a52882ef650\" id:\"50e85ad34830fc4cff946b825035618fedf1d0ce7bd475a858d8571201e08413\" pid:5780 exited_at:{seconds:1747180977 nanos:252573258}" May 14 00:02:57.470124 systemd[1]: Started sshd@11-10.200.8.5:22-10.200.16.10:36712.service - OpenSSH per-connection server daemon (10.200.16.10:36712). May 14 00:02:58.107656 sshd[5790]: Accepted publickey for core from 10.200.16.10 port 36712 ssh2: RSA SHA256:kdsm4aPxgwFO/vR4uHEnGUnhOKZ6XU57pxl25IkKi98 May 14 00:02:58.109214 sshd-session[5790]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 00:02:58.114764 systemd-logind[1706]: New session 14 of user core. May 14 00:02:58.122455 systemd[1]: Started session-14.scope - Session 14 of User core. May 14 00:02:58.608867 sshd[5792]: Connection closed by 10.200.16.10 port 36712 May 14 00:02:58.609626 sshd-session[5790]: pam_unix(sshd:session): session closed for user core May 14 00:02:58.613468 systemd[1]: sshd@11-10.200.8.5:22-10.200.16.10:36712.service: Deactivated successfully. May 14 00:02:58.616202 systemd[1]: session-14.scope: Deactivated successfully. May 14 00:02:58.617004 systemd-logind[1706]: Session 14 logged out. Waiting for processes to exit. May 14 00:02:58.618025 systemd-logind[1706]: Removed session 14. May 14 00:02:58.721635 systemd[1]: Started sshd@12-10.200.8.5:22-10.200.16.10:37272.service - OpenSSH per-connection server daemon (10.200.16.10:37272). 
May 14 00:02:59.353792 sshd[5805]: Accepted publickey for core from 10.200.16.10 port 37272 ssh2: RSA SHA256:kdsm4aPxgwFO/vR4uHEnGUnhOKZ6XU57pxl25IkKi98
May 14 00:02:59.355244 sshd-session[5805]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 00:02:59.359603 systemd-logind[1706]: New session 15 of user core.
May 14 00:02:59.371453 systemd[1]: Started session-15.scope - Session 15 of User core.
May 14 00:02:59.895576 sshd[5807]: Connection closed by 10.200.16.10 port 37272
May 14 00:02:59.896349 sshd-session[5805]: pam_unix(sshd:session): session closed for user core
May 14 00:02:59.900422 systemd[1]: sshd@12-10.200.8.5:22-10.200.16.10:37272.service: Deactivated successfully.
May 14 00:02:59.902741 systemd[1]: session-15.scope: Deactivated successfully.
May 14 00:02:59.903718 systemd-logind[1706]: Session 15 logged out. Waiting for processes to exit.
May 14 00:02:59.904781 systemd-logind[1706]: Removed session 15.
May 14 00:03:00.010694 systemd[1]: Started sshd@13-10.200.8.5:22-10.200.16.10:37280.service - OpenSSH per-connection server daemon (10.200.16.10:37280).
May 14 00:03:00.662353 sshd[5817]: Accepted publickey for core from 10.200.16.10 port 37280 ssh2: RSA SHA256:kdsm4aPxgwFO/vR4uHEnGUnhOKZ6XU57pxl25IkKi98
May 14 00:03:00.661778 sshd-session[5817]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 00:03:00.667329 systemd-logind[1706]: New session 16 of user core.
May 14 00:03:00.673467 systemd[1]: Started session-16.scope - Session 16 of User core.
May 14 00:03:01.160260 sshd[5819]: Connection closed by 10.200.16.10 port 37280
May 14 00:03:01.161051 sshd-session[5817]: pam_unix(sshd:session): session closed for user core
May 14 00:03:01.164080 systemd[1]: sshd@13-10.200.8.5:22-10.200.16.10:37280.service: Deactivated successfully.
May 14 00:03:01.166268 systemd[1]: session-16.scope: Deactivated successfully.
May 14 00:03:01.167822 systemd-logind[1706]: Session 16 logged out. Waiting for processes to exit.
May 14 00:03:01.169287 systemd-logind[1706]: Removed session 16.
May 14 00:03:06.275359 systemd[1]: Started sshd@14-10.200.8.5:22-10.200.16.10:37294.service - OpenSSH per-connection server daemon (10.200.16.10:37294).
May 14 00:03:06.747026 containerd[1735]: time="2025-05-14T00:03:06.746955473Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fdf92332ce19f27a77b3f63e142bb1655cdae4e8d781024b56543a52882ef650\" id:\"d198606757bbbec5fb361d70ce4c19fe8df9ede5d70890c9adde400a6ab7da06\" pid:5845 exited_at:{seconds:1747180986 nanos:746757671}"
May 14 00:03:06.921796 sshd[5831]: Accepted publickey for core from 10.200.16.10 port 37294 ssh2: RSA SHA256:kdsm4aPxgwFO/vR4uHEnGUnhOKZ6XU57pxl25IkKi98
May 14 00:03:06.923535 sshd-session[5831]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 00:03:06.928034 systemd-logind[1706]: New session 17 of user core.
May 14 00:03:06.939456 systemd[1]: Started session-17.scope - Session 17 of User core.
May 14 00:03:07.424797 sshd[5854]: Connection closed by 10.200.16.10 port 37294
May 14 00:03:07.425571 sshd-session[5831]: pam_unix(sshd:session): session closed for user core
May 14 00:03:07.428554 systemd[1]: sshd@14-10.200.8.5:22-10.200.16.10:37294.service: Deactivated successfully.
May 14 00:03:07.430960 systemd[1]: session-17.scope: Deactivated successfully.
May 14 00:03:07.432814 systemd-logind[1706]: Session 17 logged out. Waiting for processes to exit.
May 14 00:03:07.433876 systemd-logind[1706]: Removed session 17.
May 14 00:03:12.537663 systemd[1]: Started sshd@15-10.200.8.5:22-10.200.16.10:55916.service - OpenSSH per-connection server daemon (10.200.16.10:55916).
May 14 00:03:13.175156 sshd[5865]: Accepted publickey for core from 10.200.16.10 port 55916 ssh2: RSA SHA256:kdsm4aPxgwFO/vR4uHEnGUnhOKZ6XU57pxl25IkKi98
May 14 00:03:13.176793 sshd-session[5865]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 00:03:13.181414 systemd-logind[1706]: New session 18 of user core.
May 14 00:03:13.186445 systemd[1]: Started session-18.scope - Session 18 of User core.
May 14 00:03:13.679318 sshd[5867]: Connection closed by 10.200.16.10 port 55916
May 14 00:03:13.681258 sshd-session[5865]: pam_unix(sshd:session): session closed for user core
May 14 00:03:13.686234 systemd[1]: sshd@15-10.200.8.5:22-10.200.16.10:55916.service: Deactivated successfully.
May 14 00:03:13.689319 systemd[1]: session-18.scope: Deactivated successfully.
May 14 00:03:13.691996 systemd-logind[1706]: Session 18 logged out. Waiting for processes to exit.
May 14 00:03:13.693137 systemd-logind[1706]: Removed session 18.
May 14 00:03:18.796542 systemd[1]: Started sshd@16-10.200.8.5:22-10.200.16.10:48376.service - OpenSSH per-connection server daemon (10.200.16.10:48376).
May 14 00:03:19.438824 sshd[5885]: Accepted publickey for core from 10.200.16.10 port 48376 ssh2: RSA SHA256:kdsm4aPxgwFO/vR4uHEnGUnhOKZ6XU57pxl25IkKi98
May 14 00:03:19.440323 sshd-session[5885]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 00:03:19.444575 systemd-logind[1706]: New session 19 of user core.
May 14 00:03:19.451458 systemd[1]: Started session-19.scope - Session 19 of User core.
May 14 00:03:19.943844 sshd[5887]: Connection closed by 10.200.16.10 port 48376
May 14 00:03:19.944593 sshd-session[5885]: pam_unix(sshd:session): session closed for user core
May 14 00:03:19.948408 systemd[1]: sshd@16-10.200.8.5:22-10.200.16.10:48376.service: Deactivated successfully.
May 14 00:03:19.950594 systemd[1]: session-19.scope: Deactivated successfully.
May 14 00:03:19.951460 systemd-logind[1706]: Session 19 logged out. Waiting for processes to exit.
May 14 00:03:19.952546 systemd-logind[1706]: Removed session 19.
May 14 00:03:22.915744 containerd[1735]: time="2025-05-14T00:03:22.915681629Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4f4b3c7db80069125d3c110e3c1a0c95e9e2c7cf6a9c323e94eadf8b5e16292b\" id:\"12730097299f1726489f82ff111ce739d0c888a8dcc0d6c50414fd1d65f90875\" pid:5911 exited_at:{seconds:1747181002 nanos:915069824}"
May 14 00:03:25.061565 systemd[1]: Started sshd@17-10.200.8.5:22-10.200.16.10:48388.service - OpenSSH per-connection server daemon (10.200.16.10:48388).
May 14 00:03:25.703453 sshd[5926]: Accepted publickey for core from 10.200.16.10 port 48388 ssh2: RSA SHA256:kdsm4aPxgwFO/vR4uHEnGUnhOKZ6XU57pxl25IkKi98
May 14 00:03:25.704971 sshd-session[5926]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 00:03:25.709369 systemd-logind[1706]: New session 20 of user core.
May 14 00:03:25.716447 systemd[1]: Started session-20.scope - Session 20 of User core.
May 14 00:03:26.210170 sshd[5928]: Connection closed by 10.200.16.10 port 48388
May 14 00:03:26.211224 sshd-session[5926]: pam_unix(sshd:session): session closed for user core
May 14 00:03:26.215855 systemd[1]: sshd@17-10.200.8.5:22-10.200.16.10:48388.service: Deactivated successfully.
May 14 00:03:26.218853 systemd[1]: session-20.scope: Deactivated successfully.
May 14 00:03:26.219892 systemd-logind[1706]: Session 20 logged out. Waiting for processes to exit.
May 14 00:03:26.221051 systemd-logind[1706]: Removed session 20.
May 14 00:03:26.321684 systemd[1]: Started sshd@18-10.200.8.5:22-10.200.16.10:48394.service - OpenSSH per-connection server daemon (10.200.16.10:48394).
May 14 00:03:26.956431 sshd[5940]: Accepted publickey for core from 10.200.16.10 port 48394 ssh2: RSA SHA256:kdsm4aPxgwFO/vR4uHEnGUnhOKZ6XU57pxl25IkKi98
May 14 00:03:26.957962 sshd-session[5940]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 00:03:26.962675 systemd-logind[1706]: New session 21 of user core.
May 14 00:03:26.968445 systemd[1]: Started session-21.scope - Session 21 of User core.
May 14 00:03:27.252970 containerd[1735]: time="2025-05-14T00:03:27.252832605Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fdf92332ce19f27a77b3f63e142bb1655cdae4e8d781024b56543a52882ef650\" id:\"568ad60f4b4044e6db38617ffd54e6dacae45881e0b269d21fc88a1aa255adad\" pid:5955 exited_at:{seconds:1747181007 nanos:252312500}"
May 14 00:03:27.519981 sshd[5942]: Connection closed by 10.200.16.10 port 48394
May 14 00:03:27.520889 sshd-session[5940]: pam_unix(sshd:session): session closed for user core
May 14 00:03:27.524570 systemd[1]: sshd@18-10.200.8.5:22-10.200.16.10:48394.service: Deactivated successfully.
May 14 00:03:27.526534 systemd[1]: session-21.scope: Deactivated successfully.
May 14 00:03:27.527401 systemd-logind[1706]: Session 21 logged out. Waiting for processes to exit.
May 14 00:03:27.528542 systemd-logind[1706]: Removed session 21.
May 14 00:03:27.641930 systemd[1]: Started sshd@19-10.200.8.5:22-10.200.16.10:48404.service - OpenSSH per-connection server daemon (10.200.16.10:48404).
May 14 00:03:28.273938 sshd[5973]: Accepted publickey for core from 10.200.16.10 port 48404 ssh2: RSA SHA256:kdsm4aPxgwFO/vR4uHEnGUnhOKZ6XU57pxl25IkKi98
May 14 00:03:28.275573 sshd-session[5973]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 00:03:28.281315 systemd-logind[1706]: New session 22 of user core.
May 14 00:03:28.288462 systemd[1]: Started session-22.scope - Session 22 of User core.
May 14 00:03:30.526761 sshd[5975]: Connection closed by 10.200.16.10 port 48404
May 14 00:03:30.527161 sshd-session[5973]: pam_unix(sshd:session): session closed for user core
May 14 00:03:30.530885 systemd[1]: sshd@19-10.200.8.5:22-10.200.16.10:48404.service: Deactivated successfully.
May 14 00:03:30.533525 systemd[1]: session-22.scope: Deactivated successfully.
May 14 00:03:30.533825 systemd[1]: session-22.scope: Consumed 550ms CPU time, 67.2M memory peak.
May 14 00:03:30.535468 systemd-logind[1706]: Session 22 logged out. Waiting for processes to exit.
May 14 00:03:30.536767 systemd-logind[1706]: Removed session 22.
May 14 00:03:30.639865 systemd[1]: Started sshd@20-10.200.8.5:22-10.200.16.10:38682.service - OpenSSH per-connection server daemon (10.200.16.10:38682).
May 14 00:03:31.276219 sshd[5992]: Accepted publickey for core from 10.200.16.10 port 38682 ssh2: RSA SHA256:kdsm4aPxgwFO/vR4uHEnGUnhOKZ6XU57pxl25IkKi98
May 14 00:03:31.277905 sshd-session[5992]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 00:03:31.282249 systemd-logind[1706]: New session 23 of user core.
May 14 00:03:31.286433 systemd[1]: Started session-23.scope - Session 23 of User core.
May 14 00:03:31.888577 sshd[5994]: Connection closed by 10.200.16.10 port 38682
May 14 00:03:31.889600 sshd-session[5992]: pam_unix(sshd:session): session closed for user core
May 14 00:03:31.894284 systemd[1]: sshd@20-10.200.8.5:22-10.200.16.10:38682.service: Deactivated successfully.
May 14 00:03:31.896971 systemd[1]: session-23.scope: Deactivated successfully.
May 14 00:03:31.897904 systemd-logind[1706]: Session 23 logged out. Waiting for processes to exit.
May 14 00:03:31.899136 systemd-logind[1706]: Removed session 23.
May 14 00:03:32.003906 systemd[1]: Started sshd@21-10.200.8.5:22-10.200.16.10:38692.service - OpenSSH per-connection server daemon (10.200.16.10:38692).
May 14 00:03:32.641861 sshd[6004]: Accepted publickey for core from 10.200.16.10 port 38692 ssh2: RSA SHA256:kdsm4aPxgwFO/vR4uHEnGUnhOKZ6XU57pxl25IkKi98
May 14 00:03:32.643530 sshd-session[6004]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 00:03:32.648538 systemd-logind[1706]: New session 24 of user core.
May 14 00:03:32.654452 systemd[1]: Started session-24.scope - Session 24 of User core.
May 14 00:03:33.140080 sshd[6006]: Connection closed by 10.200.16.10 port 38692
May 14 00:03:33.141102 sshd-session[6004]: pam_unix(sshd:session): session closed for user core
May 14 00:03:33.145417 systemd[1]: sshd@21-10.200.8.5:22-10.200.16.10:38692.service: Deactivated successfully.
May 14 00:03:33.147836 systemd[1]: session-24.scope: Deactivated successfully.
May 14 00:03:33.148699 systemd-logind[1706]: Session 24 logged out. Waiting for processes to exit.
May 14 00:03:33.149661 systemd-logind[1706]: Removed session 24.
May 14 00:03:38.252699 systemd[1]: Started sshd@22-10.200.8.5:22-10.200.16.10:38698.service - OpenSSH per-connection server daemon (10.200.16.10:38698).
May 14 00:03:38.888532 sshd[6018]: Accepted publickey for core from 10.200.16.10 port 38698 ssh2: RSA SHA256:kdsm4aPxgwFO/vR4uHEnGUnhOKZ6XU57pxl25IkKi98
May 14 00:03:38.889882 sshd-session[6018]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 00:03:38.894212 systemd-logind[1706]: New session 25 of user core.
May 14 00:03:38.898471 systemd[1]: Started session-25.scope - Session 25 of User core.
May 14 00:03:39.386604 sshd[6021]: Connection closed by 10.200.16.10 port 38698
May 14 00:03:39.387521 sshd-session[6018]: pam_unix(sshd:session): session closed for user core
May 14 00:03:39.391375 systemd[1]: sshd@22-10.200.8.5:22-10.200.16.10:38698.service: Deactivated successfully.
May 14 00:03:39.393885 systemd[1]: session-25.scope: Deactivated successfully.
May 14 00:03:39.395513 systemd-logind[1706]: Session 25 logged out. Waiting for processes to exit.
May 14 00:03:39.396615 systemd-logind[1706]: Removed session 25.
May 14 00:03:44.498767 systemd[1]: Started sshd@23-10.200.8.5:22-10.200.16.10:45420.service - OpenSSH per-connection server daemon (10.200.16.10:45420).
May 14 00:03:45.137606 sshd[6037]: Accepted publickey for core from 10.200.16.10 port 45420 ssh2: RSA SHA256:kdsm4aPxgwFO/vR4uHEnGUnhOKZ6XU57pxl25IkKi98
May 14 00:03:45.139048 sshd-session[6037]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 00:03:45.144561 systemd-logind[1706]: New session 26 of user core.
May 14 00:03:45.150472 systemd[1]: Started session-26.scope - Session 26 of User core.
May 14 00:03:45.643066 sshd[6042]: Connection closed by 10.200.16.10 port 45420
May 14 00:03:45.643970 sshd-session[6037]: pam_unix(sshd:session): session closed for user core
May 14 00:03:45.648582 systemd[1]: sshd@23-10.200.8.5:22-10.200.16.10:45420.service: Deactivated successfully.
May 14 00:03:45.650829 systemd[1]: session-26.scope: Deactivated successfully.
May 14 00:03:45.651668 systemd-logind[1706]: Session 26 logged out. Waiting for processes to exit.
May 14 00:03:45.652651 systemd-logind[1706]: Removed session 26.
May 14 00:03:50.759579 systemd[1]: Started sshd@24-10.200.8.5:22-10.200.16.10:50574.service - OpenSSH per-connection server daemon (10.200.16.10:50574).
May 14 00:03:51.395437 sshd[6054]: Accepted publickey for core from 10.200.16.10 port 50574 ssh2: RSA SHA256:kdsm4aPxgwFO/vR4uHEnGUnhOKZ6XU57pxl25IkKi98
May 14 00:03:51.396917 sshd-session[6054]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 00:03:51.402498 systemd-logind[1706]: New session 27 of user core.
May 14 00:03:51.408465 systemd[1]: Started session-27.scope - Session 27 of User core.
May 14 00:03:51.901420 sshd[6056]: Connection closed by 10.200.16.10 port 50574
May 14 00:03:51.902245 sshd-session[6054]: pam_unix(sshd:session): session closed for user core
May 14 00:03:51.905769 systemd[1]: sshd@24-10.200.8.5:22-10.200.16.10:50574.service: Deactivated successfully.
May 14 00:03:51.908702 systemd[1]: session-27.scope: Deactivated successfully.
May 14 00:03:51.910867 systemd-logind[1706]: Session 27 logged out. Waiting for processes to exit.
May 14 00:03:51.911885 systemd-logind[1706]: Removed session 27.
May 14 00:03:52.913624 containerd[1735]: time="2025-05-14T00:03:52.913575094Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4f4b3c7db80069125d3c110e3c1a0c95e9e2c7cf6a9c323e94eadf8b5e16292b\" id:\"90367a5b79dce6c4a5d4d406f697eea6e2e0cc7b0fd1cb3572ca6d1a5ce542d1\" pid:6085 exited_at:{seconds:1747181032 nanos:913220591}"
May 14 00:03:57.014723 systemd[1]: Started sshd@25-10.200.8.5:22-10.200.16.10:50582.service - OpenSSH per-connection server daemon (10.200.16.10:50582).
May 14 00:03:57.258869 containerd[1735]: time="2025-05-14T00:03:57.258823743Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fdf92332ce19f27a77b3f63e142bb1655cdae4e8d781024b56543a52882ef650\" id:\"096a6162f2f90f2fbf20262dbdf8397d41eb79b84f3d8c56bec22fde0bda9d80\" pid:6114 exited_at:{seconds:1747181037 nanos:258592241}"
May 14 00:03:57.647236 sshd[6100]: Accepted publickey for core from 10.200.16.10 port 50582 ssh2: RSA SHA256:kdsm4aPxgwFO/vR4uHEnGUnhOKZ6XU57pxl25IkKi98
May 14 00:03:57.648980 sshd-session[6100]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 00:03:57.653567 systemd-logind[1706]: New session 28 of user core.
May 14 00:03:57.658481 systemd[1]: Started session-28.scope - Session 28 of User core.
May 14 00:03:58.147317 sshd[6123]: Connection closed by 10.200.16.10 port 50582
May 14 00:03:58.148081 sshd-session[6100]: pam_unix(sshd:session): session closed for user core
May 14 00:03:58.152340 systemd[1]: sshd@25-10.200.8.5:22-10.200.16.10:50582.service: Deactivated successfully.
May 14 00:03:58.154814 systemd[1]: session-28.scope: Deactivated successfully.
May 14 00:03:58.155778 systemd-logind[1706]: Session 28 logged out. Waiting for processes to exit.
May 14 00:03:58.156698 systemd-logind[1706]: Removed session 28.
May 14 00:04:03.260659 systemd[1]: Started sshd@26-10.200.8.5:22-10.200.16.10:39954.service - OpenSSH per-connection server daemon (10.200.16.10:39954).
May 14 00:04:03.900058 sshd[6138]: Accepted publickey for core from 10.200.16.10 port 39954 ssh2: RSA SHA256:kdsm4aPxgwFO/vR4uHEnGUnhOKZ6XU57pxl25IkKi98
May 14 00:04:03.903198 sshd-session[6138]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 00:04:03.910360 systemd-logind[1706]: New session 29 of user core.
May 14 00:04:03.918457 systemd[1]: Started session-29.scope - Session 29 of User core.
May 14 00:04:04.401076 sshd[6142]: Connection closed by 10.200.16.10 port 39954
May 14 00:04:04.401958 sshd-session[6138]: pam_unix(sshd:session): session closed for user core
May 14 00:04:04.406688 systemd[1]: sshd@26-10.200.8.5:22-10.200.16.10:39954.service: Deactivated successfully.
May 14 00:04:04.409224 systemd[1]: session-29.scope: Deactivated successfully.
May 14 00:04:04.410727 systemd-logind[1706]: Session 29 logged out. Waiting for processes to exit.
May 14 00:04:04.411770 systemd-logind[1706]: Removed session 29.
May 14 00:04:06.746260 containerd[1735]: time="2025-05-14T00:04:06.746211656Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fdf92332ce19f27a77b3f63e142bb1655cdae4e8d781024b56543a52882ef650\" id:\"0d4070717b58f070bf0c198f585da170e4a9eeed231bee3aacdba529a0bc8c66\" pid:6165 exited_at:{seconds:1747181046 nanos:745979454}"
May 14 00:04:09.516356 systemd[1]: Started sshd@27-10.200.8.5:22-10.200.16.10:48894.service - OpenSSH per-connection server daemon (10.200.16.10:48894).
May 14 00:04:10.152371 sshd[6175]: Accepted publickey for core from 10.200.16.10 port 48894 ssh2: RSA SHA256:kdsm4aPxgwFO/vR4uHEnGUnhOKZ6XU57pxl25IkKi98
May 14 00:04:10.154136 sshd-session[6175]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 00:04:10.159000 systemd-logind[1706]: New session 30 of user core.
May 14 00:04:10.167453 systemd[1]: Started session-30.scope - Session 30 of User core.
May 14 00:04:10.655640 sshd[6177]: Connection closed by 10.200.16.10 port 48894
May 14 00:04:10.657614 sshd-session[6175]: pam_unix(sshd:session): session closed for user core
May 14 00:04:10.663561 systemd[1]: sshd@27-10.200.8.5:22-10.200.16.10:48894.service: Deactivated successfully.
May 14 00:04:10.666970 systemd[1]: session-30.scope: Deactivated successfully.
May 14 00:04:10.668192 systemd-logind[1706]: Session 30 logged out. Waiting for processes to exit.
May 14 00:04:10.669250 systemd-logind[1706]: Removed session 30.