May 14 18:09:26.928985 kernel: Linux version 6.12.20-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Wed May 14 16:37:27 -00 2025 May 14 18:09:26.929012 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlyprintk=ttyS0,115200 flatcar.first_boot=detected flatcar.oem.id=azure flatcar.autologin verity.usrhash=adf4ab3cd3fc72d424aa1ba920dfa0e67212fa35eadab2c698966b09b9e294b0 May 14 18:09:26.929022 kernel: BIOS-provided physical RAM map: May 14 18:09:26.929028 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable May 14 18:09:26.929034 kernel: BIOS-e820: [mem 0x00000000000c0000-0x00000000000fffff] reserved May 14 18:09:26.929039 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000003ff40fff] usable May 14 18:09:26.929048 kernel: BIOS-e820: [mem 0x000000003ff41000-0x000000003ffc4fff] reserved May 14 18:09:26.929054 kernel: BIOS-e820: [mem 0x000000003ffc5000-0x000000003ffd1fff] usable May 14 18:09:26.929060 kernel: BIOS-e820: [mem 0x000000003ffd2000-0x000000003fffafff] ACPI data May 14 18:09:26.929066 kernel: BIOS-e820: [mem 0x000000003fffb000-0x000000003fffefff] ACPI NVS May 14 18:09:26.929072 kernel: BIOS-e820: [mem 0x000000003ffff000-0x000000003fffffff] usable May 14 18:09:26.929078 kernel: BIOS-e820: [mem 0x0000000100000000-0x00000002bfffffff] usable May 14 18:09:26.929084 kernel: printk: legacy bootconsole [earlyser0] enabled May 14 18:09:26.929090 kernel: NX (Execute Disable) protection: active May 14 18:09:26.929100 kernel: APIC: Static calls initialized May 14 18:09:26.929106 kernel: efi: EFI v2.7 by Microsoft May 14 18:09:26.929112 kernel: efi: ACPI=0x3fffa000 ACPI 2.0=0x3fffa014 SMBIOS=0x3ff85000 SMBIOS 3.0=0x3ff83000 MEMATTR=0x3ebb9a98 
RNG=0x3ffd2018 May 14 18:09:26.929119 kernel: random: crng init done May 14 18:09:26.929126 kernel: secureboot: Secure boot disabled May 14 18:09:26.929132 kernel: SMBIOS 3.1.0 present. May 14 18:09:26.929139 kernel: DMI: Microsoft Corporation Virtual Machine/Virtual Machine, BIOS Hyper-V UEFI Release v4.1 11/21/2024 May 14 18:09:26.929145 kernel: DMI: Memory slots populated: 2/2 May 14 18:09:26.929153 kernel: Hypervisor detected: Microsoft Hyper-V May 14 18:09:26.929160 kernel: Hyper-V: privilege flags low 0xae7f, high 0x3b8030, hints 0x9e4e24, misc 0xe0bed7b2 May 14 18:09:26.929166 kernel: Hyper-V: Nested features: 0x3e0101 May 14 18:09:26.929173 kernel: Hyper-V: LAPIC Timer Frequency: 0x30d40 May 14 18:09:26.929179 kernel: Hyper-V: Using hypercall for remote TLB flush May 14 18:09:26.929186 kernel: clocksource: hyperv_clocksource_tsc_page: mask: 0xffffffffffffffff max_cycles: 0x24e6a1710, max_idle_ns: 440795202120 ns May 14 18:09:26.929192 kernel: clocksource: hyperv_clocksource_msr: mask: 0xffffffffffffffff max_cycles: 0x24e6a1710, max_idle_ns: 440795202120 ns May 14 18:09:26.929199 kernel: tsc: Detected 2299.999 MHz processor May 14 18:09:26.929205 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved May 14 18:09:26.929213 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable May 14 18:09:26.929220 kernel: last_pfn = 0x2c0000 max_arch_pfn = 0x10000000000 May 14 18:09:26.929228 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs May 14 18:09:26.929235 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT May 14 18:09:26.929242 kernel: e820: update [mem 0x48000000-0xffffffff] usable ==> reserved May 14 18:09:26.929249 kernel: last_pfn = 0x40000 max_arch_pfn = 0x10000000000 May 14 18:09:26.929255 kernel: Using GB pages for direct mapping May 14 18:09:26.929262 kernel: ACPI: Early table checksum verification disabled May 14 18:09:26.929269 kernel: ACPI: RSDP 0x000000003FFFA014 000024 (v02 VRTUAL) May 
14 18:09:26.929278 kernel: ACPI: XSDT 0x000000003FFF90E8 00005C (v01 VRTUAL MICROSFT 00000001 MSFT 00000001) May 14 18:09:26.929287 kernel: ACPI: FACP 0x000000003FFF8000 000114 (v06 VRTUAL MICROSFT 00000001 MSFT 00000001) May 14 18:09:26.929294 kernel: ACPI: DSDT 0x000000003FFD6000 01E11C (v02 MSFTVM DSDT01 00000001 INTL 20230628) May 14 18:09:26.929301 kernel: ACPI: FACS 0x000000003FFFE000 000040 May 14 18:09:26.929309 kernel: ACPI: OEM0 0x000000003FFF7000 000064 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001) May 14 18:09:26.929316 kernel: ACPI: SPCR 0x000000003FFF6000 000050 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001) May 14 18:09:26.929324 kernel: ACPI: WAET 0x000000003FFF5000 000028 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001) May 14 18:09:26.929330 kernel: ACPI: APIC 0x000000003FFD5000 000052 (v05 HVLITE HVLITETB 00000000 MSHV 00000000) May 14 18:09:26.929337 kernel: ACPI: SRAT 0x000000003FFD4000 0000A0 (v03 HVLITE HVLITETB 00000000 MSHV 00000000) May 14 18:09:26.929343 kernel: ACPI: BGRT 0x000000003FFD3000 000038 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001) May 14 18:09:26.929350 kernel: ACPI: Reserving FACP table memory at [mem 0x3fff8000-0x3fff8113] May 14 18:09:26.929356 kernel: ACPI: Reserving DSDT table memory at [mem 0x3ffd6000-0x3fff411b] May 14 18:09:26.929364 kernel: ACPI: Reserving FACS table memory at [mem 0x3fffe000-0x3fffe03f] May 14 18:09:26.929371 kernel: ACPI: Reserving OEM0 table memory at [mem 0x3fff7000-0x3fff7063] May 14 18:09:26.929378 kernel: ACPI: Reserving SPCR table memory at [mem 0x3fff6000-0x3fff604f] May 14 18:09:26.929387 kernel: ACPI: Reserving WAET table memory at [mem 0x3fff5000-0x3fff5027] May 14 18:09:26.929395 kernel: ACPI: Reserving APIC table memory at [mem 0x3ffd5000-0x3ffd5051] May 14 18:09:26.929402 kernel: ACPI: Reserving SRAT table memory at [mem 0x3ffd4000-0x3ffd409f] May 14 18:09:26.929409 kernel: ACPI: Reserving BGRT table memory at [mem 0x3ffd3000-0x3ffd3037] May 14 18:09:26.929417 kernel: ACPI: SRAT: Node 0 PXM 0 
[mem 0x00000000-0x3fffffff] May 14 18:09:26.929424 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x2bfffffff] May 14 18:09:26.929431 kernel: NUMA: Node 0 [mem 0x00001000-0x3fffffff] + [mem 0x100000000-0x2bfffffff] -> [mem 0x00001000-0x2bfffffff] May 14 18:09:26.929439 kernel: NODE_DATA(0) allocated [mem 0x2bfff8dc0-0x2bfffffff] May 14 18:09:26.929445 kernel: Zone ranges: May 14 18:09:26.929454 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] May 14 18:09:26.929461 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff] May 14 18:09:26.929469 kernel: Normal [mem 0x0000000100000000-0x00000002bfffffff] May 14 18:09:26.929476 kernel: Device empty May 14 18:09:26.929484 kernel: Movable zone start for each node May 14 18:09:26.929491 kernel: Early memory node ranges May 14 18:09:26.929498 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff] May 14 18:09:26.929506 kernel: node 0: [mem 0x0000000000100000-0x000000003ff40fff] May 14 18:09:26.929513 kernel: node 0: [mem 0x000000003ffc5000-0x000000003ffd1fff] May 14 18:09:26.929521 kernel: node 0: [mem 0x000000003ffff000-0x000000003fffffff] May 14 18:09:26.929528 kernel: node 0: [mem 0x0000000100000000-0x00000002bfffffff] May 14 18:09:26.929535 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x00000002bfffffff] May 14 18:09:26.929542 kernel: On node 0, zone DMA: 1 pages in unavailable ranges May 14 18:09:26.929549 kernel: On node 0, zone DMA: 96 pages in unavailable ranges May 14 18:09:26.929555 kernel: On node 0, zone DMA32: 132 pages in unavailable ranges May 14 18:09:26.929563 kernel: On node 0, zone DMA32: 45 pages in unavailable ranges May 14 18:09:26.929569 kernel: ACPI: PM-Timer IO Port: 0x408 May 14 18:09:26.929575 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 May 14 18:09:26.929583 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) May 14 18:09:26.929589 kernel: ACPI: Using ACPI (MADT) for SMP configuration information May 14 18:09:26.929596 kernel: 
ACPI: SPCR: console: uart,io,0x3f8,115200 May 14 18:09:26.929601 kernel: TSC deadline timer available May 14 18:09:26.929608 kernel: CPU topo: Max. logical packages: 1 May 14 18:09:26.929615 kernel: CPU topo: Max. logical dies: 1 May 14 18:09:26.929620 kernel: CPU topo: Max. dies per package: 1 May 14 18:09:26.929626 kernel: CPU topo: Max. threads per core: 2 May 14 18:09:26.929632 kernel: CPU topo: Num. cores per package: 1 May 14 18:09:26.929640 kernel: CPU topo: Num. threads per package: 2 May 14 18:09:26.929646 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs May 14 18:09:26.929653 kernel: [mem 0x40000000-0xffffffff] available for PCI devices May 14 18:09:26.929660 kernel: Booting paravirtualized kernel on Hyper-V May 14 18:09:26.929667 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns May 14 18:09:26.929673 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1 May 14 18:09:26.929680 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576 May 14 18:09:26.929687 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152 May 14 18:09:26.929694 kernel: pcpu-alloc: [0] 0 1 May 14 18:09:26.929702 kernel: Hyper-V: PV spinlocks enabled May 14 18:09:26.929709 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear) May 14 18:09:26.929717 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlyprintk=ttyS0,115200 flatcar.first_boot=detected flatcar.oem.id=azure flatcar.autologin verity.usrhash=adf4ab3cd3fc72d424aa1ba920dfa0e67212fa35eadab2c698966b09b9e294b0 May 14 18:09:26.929724 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. 
May 14 18:09:26.929731 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear) May 14 18:09:26.929737 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) May 14 18:09:26.929798 kernel: Fallback order for Node 0: 0 May 14 18:09:26.929806 kernel: Built 1 zonelists, mobility grouping on. Total pages: 2096878 May 14 18:09:26.929814 kernel: Policy zone: Normal May 14 18:09:26.929821 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off May 14 18:09:26.929828 kernel: software IO TLB: area num 2. May 14 18:09:26.929835 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 May 14 18:09:26.929842 kernel: ftrace: allocating 40065 entries in 157 pages May 14 18:09:26.929849 kernel: ftrace: allocated 157 pages with 5 groups May 14 18:09:26.929855 kernel: Dynamic Preempt: voluntary May 14 18:09:26.929863 kernel: rcu: Preemptible hierarchical RCU implementation. May 14 18:09:26.929870 kernel: rcu: RCU event tracing is enabled. May 14 18:09:26.929878 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. May 14 18:09:26.929891 kernel: Trampoline variant of Tasks RCU enabled. May 14 18:09:26.929899 kernel: Rude variant of Tasks RCU enabled. May 14 18:09:26.929908 kernel: Tracing variant of Tasks RCU enabled. May 14 18:09:26.929915 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. May 14 18:09:26.929923 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 May 14 18:09:26.929931 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. May 14 18:09:26.929938 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. May 14 18:09:26.929945 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. 
May 14 18:09:26.929953 kernel: Using NULL legacy PIC May 14 18:09:26.929960 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 0 May 14 18:09:26.929969 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. May 14 18:09:26.929977 kernel: Console: colour dummy device 80x25 May 14 18:09:26.929984 kernel: printk: legacy console [tty1] enabled May 14 18:09:26.929991 kernel: printk: legacy console [ttyS0] enabled May 14 18:09:26.929999 kernel: printk: legacy bootconsole [earlyser0] disabled May 14 18:09:26.930006 kernel: ACPI: Core revision 20240827 May 14 18:09:26.930015 kernel: Failed to register legacy timer interrupt May 14 18:09:26.930022 kernel: APIC: Switch to symmetric I/O mode setup May 14 18:09:26.930029 kernel: x2apic enabled May 14 18:09:26.930036 kernel: APIC: Switched APIC routing to: physical x2apic May 14 18:09:26.930043 kernel: Hyper-V: Host Build 10.0.26100.1221-1-0 May 14 18:09:26.930051 kernel: Hyper-V: enabling crash_kexec_post_notifiers May 14 18:09:26.930058 kernel: Hyper-V: Disabling IBT because of Hyper-V bug May 14 18:09:26.930066 kernel: Hyper-V: Using IPI hypercalls May 14 18:09:26.930073 kernel: APIC: send_IPI() replaced with hv_send_ipi() May 14 18:09:26.930082 kernel: APIC: send_IPI_mask() replaced with hv_send_ipi_mask() May 14 18:09:26.930089 kernel: APIC: send_IPI_mask_allbutself() replaced with hv_send_ipi_mask_allbutself() May 14 18:09:26.930097 kernel: APIC: send_IPI_allbutself() replaced with hv_send_ipi_allbutself() May 14 18:09:26.930104 kernel: APIC: send_IPI_all() replaced with hv_send_ipi_all() May 14 18:09:26.930112 kernel: APIC: send_IPI_self() replaced with hv_send_ipi_self() May 14 18:09:26.930119 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x2127345424d, max_idle_ns: 440795318347 ns May 14 18:09:26.930126 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 
4599.99 BogoMIPS (lpj=2299999) May 14 18:09:26.930134 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated May 14 18:09:26.930141 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0 May 14 18:09:26.930150 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0 May 14 18:09:26.930157 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization May 14 18:09:26.930164 kernel: Spectre V2 : Mitigation: Retpolines May 14 18:09:26.930172 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch May 14 18:09:26.930179 kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT May 14 18:09:26.930186 kernel: RETBleed: WARNING: Spectre v2 mitigation leaves CPU vulnerable to RETBleed attacks, data leaks possible! May 14 18:09:26.930193 kernel: RETBleed: Vulnerable May 14 18:09:26.930200 kernel: Speculative Store Bypass: Vulnerable May 14 18:09:26.930207 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' May 14 18:09:26.930214 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' May 14 18:09:26.930221 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' May 14 18:09:26.930230 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask' May 14 18:09:26.930237 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256' May 14 18:09:26.930244 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256' May 14 18:09:26.930252 kernel: x86/fpu: Supporting XSAVE feature 0x800: 'Control-flow User registers' May 14 18:09:26.930259 kernel: x86/fpu: Supporting XSAVE feature 0x20000: 'AMX Tile config' May 14 18:09:26.930266 kernel: x86/fpu: Supporting XSAVE feature 0x40000: 'AMX Tile data' May 14 18:09:26.930273 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 May 14 18:09:26.930280 kernel: x86/fpu: xstate_offset[5]: 832, xstate_sizes[5]: 64 May 14 18:09:26.930287 kernel: x86/fpu: xstate_offset[6]: 896, xstate_sizes[6]: 
512 May 14 18:09:26.930294 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 1024 May 14 18:09:26.930303 kernel: x86/fpu: xstate_offset[11]: 2432, xstate_sizes[11]: 16 May 14 18:09:26.930310 kernel: x86/fpu: xstate_offset[17]: 2496, xstate_sizes[17]: 64 May 14 18:09:26.930317 kernel: x86/fpu: xstate_offset[18]: 2560, xstate_sizes[18]: 8192 May 14 18:09:26.930324 kernel: x86/fpu: Enabled xstate features 0x608e7, context size is 10752 bytes, using 'compacted' format. May 14 18:09:26.930332 kernel: Freeing SMP alternatives memory: 32K May 14 18:09:26.930341 kernel: pid_max: default: 32768 minimum: 301 May 14 18:09:26.930349 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima May 14 18:09:26.930357 kernel: landlock: Up and running. May 14 18:09:26.930365 kernel: SELinux: Initializing. May 14 18:09:26.930373 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear) May 14 18:09:26.930380 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear) May 14 18:09:26.930388 kernel: smpboot: CPU0: Intel INTEL(R) XEON(R) PLATINUM 8573C (family: 0x6, model: 0xcf, stepping: 0x2) May 14 18:09:26.930396 kernel: Performance Events: unsupported p6 CPU model 207 no PMU driver, software events only. May 14 18:09:26.930404 kernel: signal: max sigframe size: 11952 May 14 18:09:26.930412 kernel: rcu: Hierarchical SRCU implementation. May 14 18:09:26.930419 kernel: rcu: Max phase no-delay instances is 400. May 14 18:09:26.930427 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level May 14 18:09:26.930434 kernel: NMI watchdog: Perf NMI watchdog permanently disabled May 14 18:09:26.930442 kernel: smp: Bringing up secondary CPUs ... May 14 18:09:26.930450 kernel: smpboot: x86: Booting SMP configuration: May 14 18:09:26.930457 kernel: .... 
node #0, CPUs: #1 May 14 18:09:26.930466 kernel: smp: Brought up 1 node, 2 CPUs May 14 18:09:26.930474 kernel: smpboot: Total of 2 processors activated (9199.99 BogoMIPS) May 14 18:09:26.930482 kernel: Memory: 8082316K/8387512K available (14336K kernel code, 2438K rwdata, 9944K rodata, 54424K init, 2536K bss, 299988K reserved, 0K cma-reserved) May 14 18:09:26.930489 kernel: devtmpfs: initialized May 14 18:09:26.930496 kernel: x86/mm: Memory block size: 128MB May 14 18:09:26.930504 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x3fffb000-0x3fffefff] (16384 bytes) May 14 18:09:26.930511 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns May 14 18:09:26.930519 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) May 14 18:09:26.930527 kernel: pinctrl core: initialized pinctrl subsystem May 14 18:09:26.930536 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family May 14 18:09:26.930543 kernel: audit: initializing netlink subsys (disabled) May 14 18:09:26.930551 kernel: audit: type=2000 audit(1747246163.028:1): state=initialized audit_enabled=0 res=1 May 14 18:09:26.930558 kernel: thermal_sys: Registered thermal governor 'step_wise' May 14 18:09:26.930565 kernel: thermal_sys: Registered thermal governor 'user_space' May 14 18:09:26.930573 kernel: cpuidle: using governor menu May 14 18:09:26.930580 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 May 14 18:09:26.930588 kernel: dca service started, version 1.12.1 May 14 18:09:26.930595 kernel: e820: reserve RAM buffer [mem 0x3ff41000-0x3fffffff] May 14 18:09:26.930604 kernel: e820: reserve RAM buffer [mem 0x3ffd2000-0x3fffffff] May 14 18:09:26.930612 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
May 14 18:09:26.930620 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages May 14 18:09:26.930627 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page May 14 18:09:26.930635 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages May 14 18:09:26.930642 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page May 14 18:09:26.930650 kernel: ACPI: Added _OSI(Module Device) May 14 18:09:26.930657 kernel: ACPI: Added _OSI(Processor Device) May 14 18:09:26.930665 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) May 14 18:09:26.930674 kernel: ACPI: Added _OSI(Processor Aggregator Device) May 14 18:09:26.930681 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded May 14 18:09:26.930689 kernel: ACPI: Interpreter enabled May 14 18:09:26.930697 kernel: ACPI: PM: (supports S0 S5) May 14 18:09:26.930704 kernel: ACPI: Using IOAPIC for interrupt routing May 14 18:09:26.930712 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug May 14 18:09:26.930719 kernel: PCI: Ignoring E820 reservations for host bridge windows May 14 18:09:26.930726 kernel: ACPI: Enabled 1 GPEs in block 00 to 0F May 14 18:09:26.930733 kernel: iommu: Default domain type: Translated May 14 18:09:26.930757 kernel: iommu: DMA domain TLB invalidation policy: lazy mode May 14 18:09:26.930765 kernel: efivars: Registered efivars operations May 14 18:09:26.930772 kernel: PCI: Using ACPI for IRQ routing May 14 18:09:26.930780 kernel: PCI: System does not support PCI May 14 18:09:26.930787 kernel: vgaarb: loaded May 14 18:09:26.930794 kernel: clocksource: Switched to clocksource tsc-early May 14 18:09:26.930802 kernel: VFS: Disk quotas dquot_6.6.0 May 14 18:09:26.930809 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) May 14 18:09:26.930816 kernel: pnp: PnP ACPI init May 14 18:09:26.930824 kernel: pnp: PnP ACPI: found 3 devices May 14 18:09:26.930832 kernel: clocksource: acpi_pm: mask: 
0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns May 14 18:09:26.930839 kernel: NET: Registered PF_INET protocol family May 14 18:09:26.930846 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear) May 14 18:09:26.930854 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear) May 14 18:09:26.930861 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) May 14 18:09:26.930869 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear) May 14 18:09:26.930876 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear) May 14 18:09:26.930884 kernel: TCP: Hash tables configured (established 65536 bind 65536) May 14 18:09:26.930893 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear) May 14 18:09:26.930908 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear) May 14 18:09:26.930916 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family May 14 18:09:26.930924 kernel: NET: Registered PF_XDP protocol family May 14 18:09:26.930931 kernel: PCI: CLS 0 bytes, default 64 May 14 18:09:26.930938 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) May 14 18:09:26.930946 kernel: software IO TLB: mapped [mem 0x000000003aa59000-0x000000003ea59000] (64MB) May 14 18:09:26.930953 kernel: RAPL PMU: API unit is 2^-32 Joules, 1 fixed counters, 10737418240 ms ovfl timer May 14 18:09:26.930961 kernel: RAPL PMU: hw unit of domain psys 2^-0 Joules May 14 18:09:26.930972 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x2127345424d, max_idle_ns: 440795318347 ns May 14 18:09:26.930983 kernel: clocksource: Switched to clocksource tsc May 14 18:09:26.930991 kernel: Initialise system trusted keyrings May 14 18:09:26.931000 kernel: workingset: timestamp_bits=39 max_order=21 bucket_order=0 May 14 18:09:26.931009 kernel: Key type asymmetric registered May 14 18:09:26.931017 kernel: Asymmetric key parser 
'x509' registered May 14 18:09:26.931024 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) May 14 18:09:26.931032 kernel: io scheduler mq-deadline registered May 14 18:09:26.931041 kernel: io scheduler kyber registered May 14 18:09:26.931051 kernel: io scheduler bfq registered May 14 18:09:26.931060 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 May 14 18:09:26.931068 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled May 14 18:09:26.931077 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A May 14 18:09:26.931087 kernel: 00:01: ttyS1 at I/O 0x2f8 (irq = 3, base_baud = 115200) is a 16550A May 14 18:09:26.931095 kernel: serial8250: ttyS2 at I/O 0x3e8 (irq = 4, base_baud = 115200) is a 16550A May 14 18:09:26.931104 kernel: i8042: PNP: No PS/2 controller found. May 14 18:09:26.931245 kernel: rtc_cmos 00:02: registered as rtc0 May 14 18:09:26.931335 kernel: rtc_cmos 00:02: setting system clock to 2025-05-14T18:09:26 UTC (1747246166) May 14 18:09:26.931410 kernel: rtc_cmos 00:02: alarms up to one month, 114 bytes nvram May 14 18:09:26.931418 kernel: intel_pstate: Intel P-state driver initializing May 14 18:09:26.931425 kernel: efifb: probing for efifb May 14 18:09:26.931431 kernel: efifb: framebuffer at 0x40000000, using 3072k, total 3072k May 14 18:09:26.931439 kernel: efifb: mode is 1024x768x32, linelength=4096, pages=1 May 14 18:09:26.931446 kernel: efifb: scrolling: redraw May 14 18:09:26.931454 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 May 14 18:09:26.931463 kernel: Console: switching to colour frame buffer device 128x48 May 14 18:09:26.931470 kernel: fb0: EFI VGA frame buffer device May 14 18:09:26.931477 kernel: pstore: Using crash dump compression: deflate May 14 18:09:26.931484 kernel: pstore: Registered efi_pstore as persistent store backend May 14 18:09:26.931492 kernel: NET: Registered PF_INET6 protocol family May 14 18:09:26.931500 kernel: Segment Routing with IPv6 May 
14 18:09:26.931507 kernel: In-situ OAM (IOAM) with IPv6 May 14 18:09:26.931515 kernel: NET: Registered PF_PACKET protocol family May 14 18:09:26.931522 kernel: Key type dns_resolver registered May 14 18:09:26.931531 kernel: IPI shorthand broadcast: enabled May 14 18:09:26.931539 kernel: sched_clock: Marking stable (2737147393, 83522770)->(3101151743, -280481580) May 14 18:09:26.931547 kernel: registered taskstats version 1 May 14 18:09:26.931554 kernel: Loading compiled-in X.509 certificates May 14 18:09:26.931562 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.20-flatcar: 41e2a150aa08ec2528be2394819b3db677e5f4ef' May 14 18:09:26.931570 kernel: Demotion targets for Node 0: null May 14 18:09:26.931578 kernel: Key type .fscrypt registered May 14 18:09:26.931585 kernel: Key type fscrypt-provisioning registered May 14 18:09:26.931593 kernel: ima: No TPM chip found, activating TPM-bypass! May 14 18:09:26.931768 kernel: ima: Allocated hash algorithm: sha1 May 14 18:09:26.931778 kernel: ima: No architecture policies found May 14 18:09:26.931786 kernel: clk: Disabling unused clocks May 14 18:09:26.931794 kernel: Warning: unable to open an initial console. May 14 18:09:26.931802 kernel: Freeing unused kernel image (initmem) memory: 54424K May 14 18:09:26.931810 kernel: Write protecting the kernel read-only data: 24576k May 14 18:09:26.931818 kernel: Freeing unused kernel image (rodata/data gap) memory: 296K May 14 18:09:26.931826 kernel: Run /init as init process May 14 18:09:26.931834 kernel: with arguments: May 14 18:09:26.931844 kernel: /init May 14 18:09:26.931851 kernel: with environment: May 14 18:09:26.931859 kernel: HOME=/ May 14 18:09:26.931867 kernel: TERM=linux May 14 18:09:26.931874 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a May 14 18:09:26.931884 systemd[1]: Successfully made /usr/ read-only. 
May 14 18:09:26.931896 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 14 18:09:26.931905 systemd[1]: Detected virtualization microsoft. May 14 18:09:26.931915 systemd[1]: Detected architecture x86-64. May 14 18:09:26.931923 systemd[1]: Running in initrd. May 14 18:09:26.931931 systemd[1]: No hostname configured, using default hostname. May 14 18:09:26.931940 systemd[1]: Hostname set to . May 14 18:09:26.931948 systemd[1]: Initializing machine ID from random generator. May 14 18:09:26.931956 systemd[1]: Queued start job for default target initrd.target. May 14 18:09:26.931965 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 14 18:09:26.931973 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 14 18:09:26.931984 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... May 14 18:09:26.931993 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... May 14 18:09:26.932001 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... May 14 18:09:26.932010 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... May 14 18:09:26.932020 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... May 14 18:09:26.932028 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... 
May 14 18:09:26.932036 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 14 18:09:26.932046 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. May 14 18:09:26.932054 systemd[1]: Reached target paths.target - Path Units. May 14 18:09:26.932063 systemd[1]: Reached target slices.target - Slice Units. May 14 18:09:26.932071 systemd[1]: Reached target swap.target - Swaps. May 14 18:09:26.932078 systemd[1]: Reached target timers.target - Timer Units. May 14 18:09:26.932086 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. May 14 18:09:26.932094 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 14 18:09:26.932102 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). May 14 18:09:26.932112 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. May 14 18:09:26.932119 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. May 14 18:09:26.932127 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. May 14 18:09:26.932135 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. May 14 18:09:26.932142 systemd[1]: Reached target sockets.target - Socket Units. May 14 18:09:26.932150 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... May 14 18:09:26.932158 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 14 18:09:26.932165 systemd[1]: Finished network-cleanup.service - Network Cleanup. May 14 18:09:26.932174 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). May 14 18:09:26.932184 systemd[1]: Starting systemd-fsck-usr.service... May 14 18:09:26.932192 systemd[1]: Starting systemd-journald.service - Journal Service... 
May 14 18:09:26.932201 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... May 14 18:09:26.932217 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 14 18:09:26.932244 systemd-journald[205]: Collecting audit messages is disabled. May 14 18:09:26.932266 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. May 14 18:09:26.932276 systemd-journald[205]: Journal started May 14 18:09:26.932299 systemd-journald[205]: Runtime Journal (/run/log/journal/7f610dd827e248fea78d46427c4a1ac5) is 8M, max 159M, 151M free. May 14 18:09:26.939327 systemd[1]: Started systemd-journald.service - Journal Service. May 14 18:09:26.938685 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 14 18:09:26.938824 systemd[1]: Finished systemd-fsck-usr.service. May 14 18:09:26.940868 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... May 14 18:09:26.943843 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 14 18:09:26.956405 systemd-modules-load[207]: Inserted module 'overlay' May 14 18:09:26.964805 systemd-tmpfiles[215]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. May 14 18:09:26.968852 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 14 18:09:26.975553 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 14 18:09:26.980220 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 14 18:09:26.989754 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. May 14 18:09:26.991125 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
May 14 18:09:26.995142 kernel: Bridge firewalling registered May 14 18:09:26.991441 systemd-modules-load[207]: Inserted module 'br_netfilter' May 14 18:09:26.993924 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 14 18:09:27.001952 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 14 18:09:27.004850 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... May 14 18:09:27.010017 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... May 14 18:09:27.020244 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 14 18:09:27.024707 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 14 18:09:27.031833 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 14 18:09:27.034756 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... May 14 18:09:27.055638 dracut-cmdline[247]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlyprintk=ttyS0,115200 flatcar.first_boot=detected flatcar.oem.id=azure flatcar.autologin verity.usrhash=adf4ab3cd3fc72d424aa1ba920dfa0e67212fa35eadab2c698966b09b9e294b0 May 14 18:09:27.057052 systemd-resolved[245]: Positive Trust Anchors: May 14 18:09:27.057059 systemd-resolved[245]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
May 14 18:09:27.057089 systemd-resolved[245]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 14 18:09:27.059410 systemd-resolved[245]: Defaulting to hostname 'linux'. May 14 18:09:27.060728 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 14 18:09:27.073715 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 14 18:09:27.126757 kernel: SCSI subsystem initialized May 14 18:09:27.133755 kernel: Loading iSCSI transport class v2.0-870. May 14 18:09:27.140755 kernel: iscsi: registered transport (tcp) May 14 18:09:27.156812 kernel: iscsi: registered transport (qla4xxx) May 14 18:09:27.156850 kernel: QLogic iSCSI HBA Driver May 14 18:09:27.168133 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... May 14 18:09:27.176414 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. May 14 18:09:27.179192 systemd[1]: Reached target network-pre.target - Preparation for Network. May 14 18:09:27.207428 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. May 14 18:09:27.210670 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
May 14 18:09:27.251755 kernel: raid6: avx512x4 gen() 45903 MB/s May 14 18:09:27.268755 kernel: raid6: avx512x2 gen() 44731 MB/s May 14 18:09:27.285750 kernel: raid6: avx512x1 gen() 29994 MB/s May 14 18:09:27.303750 kernel: raid6: avx2x4 gen() 41724 MB/s May 14 18:09:27.320751 kernel: raid6: avx2x2 gen() 43552 MB/s May 14 18:09:27.338196 kernel: raid6: avx2x1 gen() 31867 MB/s May 14 18:09:27.338212 kernel: raid6: using algorithm avx512x4 gen() 45903 MB/s May 14 18:09:27.356875 kernel: raid6: .... xor() 7990 MB/s, rmw enabled May 14 18:09:27.356893 kernel: raid6: using avx512x2 recovery algorithm May 14 18:09:27.372765 kernel: xor: automatically using best checksumming function avx May 14 18:09:27.473755 kernel: Btrfs loaded, zoned=no, fsverity=no May 14 18:09:27.478034 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. May 14 18:09:27.479860 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 14 18:09:27.500360 systemd-udevd[455]: Using default interface naming scheme 'v255'. May 14 18:09:27.503846 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 14 18:09:27.509182 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... May 14 18:09:27.533826 dracut-pre-trigger[465]: rd.md=0: removing MD RAID activation May 14 18:09:27.549107 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. May 14 18:09:27.550026 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 14 18:09:27.584400 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 14 18:09:27.588221 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... May 14 18:09:27.628767 kernel: cryptd: max_cpu_qlen set to 1000 May 14 18:09:27.637770 kernel: AES CTR mode by8 optimization enabled May 14 18:09:27.645324 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. 
May 14 18:09:27.645422 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 14 18:09:27.658790 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... May 14 18:09:27.666196 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 14 18:09:27.678938 kernel: hv_vmbus: Vmbus version:5.3 May 14 18:09:27.687393 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 14 18:09:27.687465 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 14 18:09:27.699890 kernel: pps_core: LinuxPPS API ver. 1 registered May 14 18:09:27.699909 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti May 14 18:09:27.699427 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 14 18:09:27.706765 kernel: PTP clock support registered May 14 18:09:27.710816 kernel: hv_vmbus: registering driver hv_storvsc May 14 18:09:27.714055 kernel: hv_vmbus: registering driver hyperv_keyboard May 14 18:09:27.717768 kernel: input: AT Translated Set 2 keyboard as /devices/LNXSYSTM:00/LNXSYBUS:00/ACPI0004:00/MSFT1000:00/d34b2567-b9b6-42b9-8778-0a4ec0b955bf/serio0/input/input0 May 14 18:09:27.717811 kernel: hid: raw HID events driver (C) Jiri Kosina May 14 18:09:27.720145 kernel: scsi host0: storvsc_host_t May 14 18:09:27.727200 kernel: scsi 0:0:0:2: CD-ROM Msft Virtual DVD-ROM 1.0 PQ: 0 ANSI: 5 May 14 18:09:27.727247 kernel: hv_utils: Registering HyperV Utility Driver May 14 18:09:27.727258 kernel: hv_vmbus: registering driver hv_utils May 14 18:09:27.760698 kernel: hv_utils: Shutdown IC version 3.2 May 14 18:09:27.762807 kernel: hv_utils: Heartbeat IC version 3.0 May 14 18:09:27.763107 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 14 18:09:27.764889 kernel: hv_utils: TimeSync IC version 4.0 May 14 18:09:27.683539 systemd-resolved[245]: Clock change detected. Flushing caches. 
May 14 18:09:27.687674 systemd-journald[205]: Time jumped backwards, rotating. May 14 18:09:27.687711 kernel: hv_vmbus: registering driver hv_pci May 14 18:09:27.691637 kernel: hv_vmbus: registering driver hv_netvsc May 14 18:09:27.691672 kernel: hv_pci 7ad35d50-c05b-47ab-b3a0-56a9a845852b: PCI VMBus probing: Using version 0x10004 May 14 18:09:27.741193 kernel: hv_pci 7ad35d50-c05b-47ab-b3a0-56a9a845852b: PCI host bridge to bus c05b:00 May 14 18:09:27.741299 kernel: hv_vmbus: registering driver hid_hyperv May 14 18:09:27.741309 kernel: input: Microsoft Vmbus HID-compliant Mouse as /devices/0006:045E:0621.0001/input/input1 May 14 18:09:27.741318 kernel: pci_bus c05b:00: root bus resource [mem 0xfc0000000-0xfc007ffff window] May 14 18:09:27.741408 kernel: hid-hyperv 0006:045E:0621.0001: input: VIRTUAL HID v0.01 Mouse [Microsoft Vmbus HID-compliant Mouse] on May 14 18:09:27.741485 kernel: pci_bus c05b:00: No busn resource found for root bus, will use [bus 00-ff] May 14 18:09:27.741562 kernel: pci c05b:00:00.0: [1414:00a9] type 00 class 0x010802 PCIe Endpoint May 14 18:09:27.741655 kernel: hv_netvsc f8615163-0000-1000-2000-7c1e52342850 (unnamed net_device) (uninitialized): VF slot 1 added May 14 18:09:27.741731 kernel: pci c05b:00:00.0: BAR 0 [mem 0xfc0000000-0xfc007ffff 64bit] May 14 18:09:27.741847 kernel: sr 0:0:0:2: [sr0] scsi-1 drive May 14 18:09:27.741936 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 May 14 18:09:27.741945 kernel: sr 0:0:0:2: Attached scsi CD-ROM sr0 May 14 18:09:27.742025 kernel: pci c05b:00:00.0: 32.000 Gb/s available PCIe bandwidth, limited by 2.5 GT/s PCIe x16 link at c05b:00:00.0 (capable of 1024.000 Gb/s with 64.0 GT/s PCIe x16 link) May 14 18:09:27.742120 kernel: pci_bus c05b:00: busn_res: [bus 00-ff] end is updated to 00 May 14 18:09:27.742193 kernel: pci c05b:00:00.0: BAR 0 [mem 0xfc0000000-0xfc007ffff 64bit]: assigned May 14 18:09:27.753759 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#179 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
May 14 18:09:27.756922 kernel: nvme nvme0: pci function c05b:00:00.0 May 14 18:09:27.757093 kernel: nvme c05b:00:00.0: enabling device (0000 -> 0002) May 14 18:09:27.980699 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#296 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001 May 14 18:09:27.980831 kernel: nvme nvme0: 2/0/0 default/read/poll queues May 14 18:09:27.980926 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 May 14 18:09:28.361376 kernel: nvme nvme0: using unchecked data buffer May 14 18:09:28.744765 kernel: hv_pci 00000001-7870-47b5-b203-907d12ca697e: PCI VMBus probing: Using version 0x10004 May 14 18:09:28.801910 kernel: hv_pci 00000001-7870-47b5-b203-907d12ca697e: PCI host bridge to bus 7870:00 May 14 18:09:28.802004 kernel: pci_bus 7870:00: root bus resource [mem 0xfc2000000-0xfc4007fff window] May 14 18:09:28.802095 kernel: pci_bus 7870:00: No busn resource found for root bus, will use [bus 00-ff] May 14 18:09:28.802170 kernel: pci 7870:00:00.0: [1414:00ba] type 00 class 0x020000 PCIe Endpoint May 14 18:09:28.802278 kernel: pci 7870:00:00.0: BAR 0 [mem 0xfc2000000-0xfc3ffffff 64bit pref] May 14 18:09:28.802368 kernel: pci 7870:00:00.0: BAR 4 [mem 0xfc4000000-0xfc4007fff 64bit pref] May 14 18:09:28.802450 kernel: pci 7870:00:00.0: enabling Extended Tags May 14 18:09:28.802514 kernel: pci_bus 7870:00: busn_res: [bus 00-ff] end is updated to 00 May 14 18:09:28.802591 kernel: pci 7870:00:00.0: BAR 0 [mem 0xfc2000000-0xfc3ffffff 64bit pref]: assigned May 14 18:09:28.802663 kernel: pci 7870:00:00.0: BAR 4 [mem 0xfc4000000-0xfc4007fff 64bit pref]: assigned May 14 18:09:28.802722 kernel: mana 7870:00:00.0: enabling device (0000 -> 0002) May 14 18:09:28.802833 kernel: mana 7870:00:00.0: Microsoft Azure Network Adapter protocol version: 0.1.1 May 14 18:09:28.802901 kernel: hv_netvsc f8615163-0000-1000-2000-7c1e52342850 eth0: VF registering: eth1 May 14 18:09:28.802969 kernel: mana 7870:00:00.0 eth1: joined to eth0 May 14 18:09:28.745895 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - MSFT NVMe Accelerator v1.0 EFI-SYSTEM.
May 14 18:09:28.807858 kernel: mana 7870:00:00.0 enP30832s1: renamed from eth1 May 14 18:09:28.818098 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - MSFT NVMe Accelerator v1.0 OEM. May 14 18:09:28.904728 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - MSFT NVMe Accelerator v1.0 ROOT. May 14 18:09:28.917227 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - MSFT NVMe Accelerator v1.0 USR-A. May 14 18:09:28.918128 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - MSFT NVMe Accelerator v1.0 USR-A. May 14 18:09:28.922811 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... May 14 18:09:29.041679 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. May 14 18:09:29.049723 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. May 14 18:09:29.053216 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 14 18:09:29.054480 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 14 18:09:29.057619 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... May 14 18:09:29.075506 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. May 14 18:09:29.951472 disk-uuid[666]: The operation has completed successfully. May 14 18:09:29.953870 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 May 14 18:09:30.004344 systemd[1]: disk-uuid.service: Deactivated successfully. May 14 18:09:30.004415 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. May 14 18:09:30.028121 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... May 14 18:09:30.039587 sh[715]: Success May 14 18:09:30.066993 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
May 14 18:09:30.067045 kernel: device-mapper: uevent: version 1.0.3 May 14 18:09:30.068313 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev May 14 18:09:30.075763 kernel: device-mapper: verity: sha256 using shash "sha256-ni" May 14 18:09:30.339046 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. May 14 18:09:30.344396 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... May 14 18:09:30.353404 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. May 14 18:09:30.364635 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay' May 14 18:09:30.364672 kernel: BTRFS: device fsid dedcf745-d4ff-44ac-b61c-5ec1bad114c7 devid 1 transid 39 /dev/mapper/usr (254:0) scanned by mount (728) May 14 18:09:30.368276 kernel: BTRFS info (device dm-0): first mount of filesystem dedcf745-d4ff-44ac-b61c-5ec1bad114c7 May 14 18:09:30.368308 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm May 14 18:09:30.369786 kernel: BTRFS info (device dm-0): using free-space-tree May 14 18:09:30.671275 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. May 14 18:09:30.676046 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. May 14 18:09:30.678675 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. May 14 18:09:30.681231 systemd[1]: Starting ignition-setup.service - Ignition (setup)... May 14 18:09:30.689368 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
May 14 18:09:30.713803 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/nvme0n1p6 (259:5) scanned by mount (751) May 14 18:09:30.722344 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 9b1e3c61-417b-43c0-b064-c7db19a42998 May 14 18:09:30.722387 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-intel) checksum algorithm May 14 18:09:30.724078 kernel: BTRFS info (device nvme0n1p6): using free-space-tree May 14 18:09:30.745764 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem 9b1e3c61-417b-43c0-b064-c7db19a42998 May 14 18:09:30.746338 systemd[1]: Finished ignition-setup.service - Ignition (setup). May 14 18:09:30.751848 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... May 14 18:09:30.761847 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 14 18:09:30.766460 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 14 18:09:30.787663 systemd-networkd[897]: lo: Link UP May 14 18:09:30.787670 systemd-networkd[897]: lo: Gained carrier May 14 18:09:30.789026 systemd-networkd[897]: Enumeration completed May 14 18:09:30.794881 kernel: mana 7870:00:00.0 enP30832s1: Configured vPort 0 PD 18 DB 16 May 14 18:09:30.789339 systemd-networkd[897]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 14 18:09:30.801256 kernel: mana 7870:00:00.0 enP30832s1: Configured steering vPort 0 entries 64 May 14 18:09:30.801447 kernel: hv_netvsc f8615163-0000-1000-2000-7c1e52342850 eth0: Data path switched to VF: enP30832s1 May 14 18:09:30.789342 systemd-networkd[897]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 14 18:09:30.789598 systemd[1]: Started systemd-networkd.service - Network Configuration. May 14 18:09:30.792686 systemd[1]: Reached target network.target - Network. 
May 14 18:09:30.801846 systemd-networkd[897]: enP30832s1: Link UP May 14 18:09:30.801903 systemd-networkd[897]: eth0: Link UP May 14 18:09:30.801976 systemd-networkd[897]: eth0: Gained carrier May 14 18:09:30.801984 systemd-networkd[897]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 14 18:09:30.805909 systemd-networkd[897]: enP30832s1: Gained carrier May 14 18:09:30.813774 systemd-networkd[897]: eth0: DHCPv4 address 10.200.8.38/24, gateway 10.200.8.1 acquired from 168.63.129.16 May 14 18:09:31.691598 ignition[888]: Ignition 2.21.0 May 14 18:09:31.691609 ignition[888]: Stage: fetch-offline May 14 18:09:31.691686 ignition[888]: no configs at "/usr/lib/ignition/base.d" May 14 18:09:31.694695 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). May 14 18:09:31.691692 ignition[888]: no config dir at "/usr/lib/ignition/base.platform.d/azure" May 14 18:09:31.691781 ignition[888]: parsed url from cmdline: "" May 14 18:09:31.699867 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
May 14 18:09:31.691783 ignition[888]: no config URL provided May 14 18:09:31.691787 ignition[888]: reading system config file "/usr/lib/ignition/user.ign" May 14 18:09:31.691792 ignition[888]: no config at "/usr/lib/ignition/user.ign" May 14 18:09:31.691796 ignition[888]: failed to fetch config: resource requires networking May 14 18:09:31.691936 ignition[888]: Ignition finished successfully May 14 18:09:31.720783 ignition[908]: Ignition 2.21.0 May 14 18:09:31.720791 ignition[908]: Stage: fetch May 14 18:09:31.720974 ignition[908]: no configs at "/usr/lib/ignition/base.d" May 14 18:09:31.720982 ignition[908]: no config dir at "/usr/lib/ignition/base.platform.d/azure" May 14 18:09:31.721046 ignition[908]: parsed url from cmdline: "" May 14 18:09:31.721048 ignition[908]: no config URL provided May 14 18:09:31.721052 ignition[908]: reading system config file "/usr/lib/ignition/user.ign" May 14 18:09:31.721057 ignition[908]: no config at "/usr/lib/ignition/user.ign" May 14 18:09:31.721084 ignition[908]: GET http://169.254.169.254/metadata/instance/compute/userData?api-version=2021-01-01&format=text: attempt #1 May 14 18:09:31.815153 ignition[908]: GET result: OK May 14 18:09:31.815247 ignition[908]: config has been read from IMDS userdata May 14 18:09:31.815282 ignition[908]: parsing config with SHA512: 066c89ca9e764dd6987c93c6ab7630523cce780867a7cd942a554505089bde6f96f19e4212ea18e27d8f662dd0b5ede53c742411994affefaa0dcbf257fb7d15 May 14 18:09:31.821581 unknown[908]: fetched base config from "system" May 14 18:09:31.821589 unknown[908]: fetched base config from "system" May 14 18:09:31.821902 ignition[908]: fetch: fetch complete May 14 18:09:31.821593 unknown[908]: fetched user config from "azure" May 14 18:09:31.821906 ignition[908]: fetch: fetch passed May 14 18:09:31.824043 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). 
May 14 18:09:31.821939 ignition[908]: Ignition finished successfully May 14 18:09:31.824966 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... May 14 18:09:31.848583 ignition[914]: Ignition 2.21.0 May 14 18:09:31.848592 ignition[914]: Stage: kargs May 14 18:09:31.848782 ignition[914]: no configs at "/usr/lib/ignition/base.d" May 14 18:09:31.850930 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). May 14 18:09:31.848789 ignition[914]: no config dir at "/usr/lib/ignition/base.platform.d/azure" May 14 18:09:31.855410 systemd[1]: Starting ignition-disks.service - Ignition (disks)... May 14 18:09:31.849853 ignition[914]: kargs: kargs passed May 14 18:09:31.849889 ignition[914]: Ignition finished successfully May 14 18:09:31.871616 ignition[920]: Ignition 2.21.0 May 14 18:09:31.871624 ignition[920]: Stage: disks May 14 18:09:31.873443 systemd[1]: Finished ignition-disks.service - Ignition (disks). May 14 18:09:31.871798 ignition[920]: no configs at "/usr/lib/ignition/base.d" May 14 18:09:31.876940 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. May 14 18:09:31.871803 ignition[920]: no config dir at "/usr/lib/ignition/base.platform.d/azure" May 14 18:09:31.879183 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. May 14 18:09:31.872419 ignition[920]: disks: disks passed May 14 18:09:31.882544 systemd[1]: Reached target local-fs.target - Local File Systems. May 14 18:09:31.872446 ignition[920]: Ignition finished successfully May 14 18:09:31.886550 systemd[1]: Reached target sysinit.target - System Initialization. May 14 18:09:31.894654 systemd[1]: Reached target basic.target - Basic System. May 14 18:09:31.897674 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... 
May 14 18:09:31.954567 systemd-fsck[928]: ROOT: clean, 15/7326000 files, 477845/7359488 blocks May 14 18:09:31.957907 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. May 14 18:09:31.962351 systemd[1]: Mounting sysroot.mount - /sysroot... May 14 18:09:32.239759 kernel: EXT4-fs (nvme0n1p9): mounted filesystem d6072e19-4548-4806-a012-87bb17c59f4c r/w with ordered data mode. Quota mode: none. May 14 18:09:32.240153 systemd[1]: Mounted sysroot.mount - /sysroot. May 14 18:09:32.240975 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. May 14 18:09:32.261985 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 14 18:09:32.265534 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... May 14 18:09:32.272544 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... May 14 18:09:32.276712 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). May 14 18:09:32.276770 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. May 14 18:09:32.285767 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. May 14 18:09:32.289837 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/nvme0n1p6 (259:5) scanned by mount (937) May 14 18:09:32.289500 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... May 14 18:09:32.296655 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 9b1e3c61-417b-43c0-b064-c7db19a42998 May 14 18:09:32.296701 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-intel) checksum algorithm May 14 18:09:32.298241 kernel: BTRFS info (device nvme0n1p6): using free-space-tree May 14 18:09:32.303532 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
May 14 18:09:32.478843 systemd-networkd[897]: eth0: Gained IPv6LL May 14 18:09:32.734881 systemd-networkd[897]: enP30832s1: Gained IPv6LL May 14 18:09:32.857000 coreos-metadata[939]: May 14 18:09:32.856 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1 May 14 18:09:32.863089 coreos-metadata[939]: May 14 18:09:32.863 INFO Fetch successful May 14 18:09:32.864219 coreos-metadata[939]: May 14 18:09:32.863 INFO Fetching http://169.254.169.254/metadata/instance/compute/name?api-version=2017-08-01&format=text: Attempt #1 May 14 18:09:32.877367 coreos-metadata[939]: May 14 18:09:32.877 INFO Fetch successful May 14 18:09:32.891446 coreos-metadata[939]: May 14 18:09:32.891 INFO wrote hostname ci-4334.0.0-a-c37eb65ec1 to /sysroot/etc/hostname May 14 18:09:32.894179 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. May 14 18:09:32.997438 initrd-setup-root[967]: cut: /sysroot/etc/passwd: No such file or directory May 14 18:09:33.016133 initrd-setup-root[974]: cut: /sysroot/etc/group: No such file or directory May 14 18:09:33.023119 initrd-setup-root[981]: cut: /sysroot/etc/shadow: No such file or directory May 14 18:09:33.026909 initrd-setup-root[988]: cut: /sysroot/etc/gshadow: No such file or directory May 14 18:09:34.032279 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. May 14 18:09:34.034867 systemd[1]: Starting ignition-mount.service - Ignition (mount)... May 14 18:09:34.040360 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... May 14 18:09:34.050382 systemd[1]: sysroot-oem.mount: Deactivated successfully. May 14 18:09:34.053941 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem 9b1e3c61-417b-43c0-b064-c7db19a42998 May 14 18:09:34.069099 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
May 14 18:09:34.074123 ignition[1055]: INFO : Ignition 2.21.0 May 14 18:09:34.074123 ignition[1055]: INFO : Stage: mount May 14 18:09:34.077915 ignition[1055]: INFO : no configs at "/usr/lib/ignition/base.d" May 14 18:09:34.077915 ignition[1055]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" May 14 18:09:34.077915 ignition[1055]: INFO : mount: mount passed May 14 18:09:34.077915 ignition[1055]: INFO : Ignition finished successfully May 14 18:09:34.077202 systemd[1]: Finished ignition-mount.service - Ignition (mount). May 14 18:09:34.079879 systemd[1]: Starting ignition-files.service - Ignition (files)... May 14 18:09:34.092510 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 14 18:09:34.109762 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/nvme0n1p6 (259:5) scanned by mount (1068) May 14 18:09:34.112223 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 9b1e3c61-417b-43c0-b064-c7db19a42998 May 14 18:09:34.112249 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-intel) checksum algorithm May 14 18:09:34.112260 kernel: BTRFS info (device nvme0n1p6): using free-space-tree May 14 18:09:34.117199 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
May 14 18:09:34.138933 ignition[1084]: INFO : Ignition 2.21.0 May 14 18:09:34.138933 ignition[1084]: INFO : Stage: files May 14 18:09:34.140837 ignition[1084]: INFO : no configs at "/usr/lib/ignition/base.d" May 14 18:09:34.140837 ignition[1084]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" May 14 18:09:34.140837 ignition[1084]: DEBUG : files: compiled without relabeling support, skipping May 14 18:09:34.153833 ignition[1084]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" May 14 18:09:34.153833 ignition[1084]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" May 14 18:09:34.170332 ignition[1084]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" May 14 18:09:34.173832 ignition[1084]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" May 14 18:09:34.173832 ignition[1084]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" May 14 18:09:34.170590 unknown[1084]: wrote ssh authorized keys file for user: core May 14 18:09:34.185512 ignition[1084]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" May 14 18:09:34.190802 ignition[1084]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 May 14 18:09:34.604721 ignition[1084]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK May 14 18:09:34.960396 ignition[1084]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" May 14 18:09:34.960396 ignition[1084]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" May 14 18:09:34.967857 ignition[1084]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" 
May 14 18:09:34.967857 ignition[1084]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" May 14 18:09:34.967857 ignition[1084]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" May 14 18:09:34.967857 ignition[1084]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" May 14 18:09:34.967857 ignition[1084]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" May 14 18:09:34.967857 ignition[1084]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" May 14 18:09:34.967857 ignition[1084]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" May 14 18:09:34.991800 ignition[1084]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" May 14 18:09:34.991800 ignition[1084]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" May 14 18:09:34.991800 ignition[1084]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" May 14 18:09:34.991800 ignition[1084]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" May 14 18:09:34.991800 ignition[1084]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" May 14 18:09:34.991800 ignition[1084]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET 
https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.31.0-x86-64.raw: attempt #1 May 14 18:09:35.608758 ignition[1084]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK May 14 18:09:36.800594 ignition[1084]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" May 14 18:09:36.800594 ignition[1084]: INFO : files: op(b): [started] processing unit "prepare-helm.service" May 14 18:09:36.843154 ignition[1084]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 14 18:09:36.849801 ignition[1084]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 14 18:09:36.849801 ignition[1084]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" May 14 18:09:36.854836 ignition[1084]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" May 14 18:09:36.854836 ignition[1084]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" May 14 18:09:36.854836 ignition[1084]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" May 14 18:09:36.854836 ignition[1084]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" May 14 18:09:36.854836 ignition[1084]: INFO : files: files passed May 14 18:09:36.854836 ignition[1084]: INFO : Ignition finished successfully May 14 18:09:36.854787 systemd[1]: Finished ignition-files.service - Ignition (files). May 14 18:09:36.856274 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... May 14 18:09:36.872868 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... 
May 14 18:09:36.876710 systemd[1]: ignition-quench.service: Deactivated successfully.
May 14 18:09:36.876805 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
May 14 18:09:36.892941 initrd-setup-root-after-ignition[1115]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
May 14 18:09:36.892941 initrd-setup-root-after-ignition[1115]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
May 14 18:09:36.897446 initrd-setup-root-after-ignition[1119]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
May 14 18:09:36.896090 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
May 14 18:09:36.902334 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
May 14 18:09:36.905851 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
May 14 18:09:36.935846 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
May 14 18:09:36.935917 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
May 14 18:09:36.939102 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
May 14 18:09:36.943068 systemd[1]: Reached target initrd.target - Initrd Default Target.
May 14 18:09:36.946645 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
May 14 18:09:36.947232 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
May 14 18:09:36.966564 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
May 14 18:09:36.968150 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
May 14 18:09:36.981406 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
May 14 18:09:36.981547 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
May 14 18:09:36.981830 systemd[1]: Stopped target timers.target - Timer Units.
May 14 18:09:36.982127 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
May 14 18:09:36.982214 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
May 14 18:09:36.983038 systemd[1]: Stopped target initrd.target - Initrd Default Target.
May 14 18:09:36.983205 systemd[1]: Stopped target basic.target - Basic System.
May 14 18:09:36.998913 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
May 14 18:09:37.002890 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
May 14 18:09:37.005384 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
May 14 18:09:37.005676 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
May 14 18:09:37.005962 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
May 14 18:09:37.006240 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
May 14 18:09:37.006768 systemd[1]: Stopped target sysinit.target - System Initialization.
May 14 18:09:37.007276 systemd[1]: Stopped target local-fs.target - Local File Systems.
May 14 18:09:37.007791 systemd[1]: Stopped target swap.target - Swaps.
May 14 18:09:37.008025 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
May 14 18:09:37.008130 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
May 14 18:09:37.009060 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
May 14 18:09:37.068487 ignition[1139]: INFO : Ignition 2.21.0
May 14 18:09:37.068487 ignition[1139]: INFO : Stage: umount
May 14 18:09:37.068487 ignition[1139]: INFO : no configs at "/usr/lib/ignition/base.d"
May 14 18:09:37.068487 ignition[1139]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
May 14 18:09:37.068487 ignition[1139]: INFO : umount: umount passed
May 14 18:09:37.068487 ignition[1139]: INFO : Ignition finished successfully
May 14 18:09:37.009349 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 14 18:09:37.009675 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
May 14 18:09:37.013567 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 14 18:09:37.019835 systemd[1]: dracut-initqueue.service: Deactivated successfully.
May 14 18:09:37.019951 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
May 14 18:09:37.022520 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
May 14 18:09:37.022635 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
May 14 18:09:37.022760 systemd[1]: ignition-files.service: Deactivated successfully.
May 14 18:09:37.022859 systemd[1]: Stopped ignition-files.service - Ignition (files).
May 14 18:09:37.023223 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
May 14 18:09:37.023314 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
May 14 18:09:37.025925 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
May 14 18:09:37.027855 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
May 14 18:09:37.028353 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
May 14 18:09:37.028488 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
May 14 18:09:37.029116 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
May 14 18:09:37.029845 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
May 14 18:09:37.043822 systemd[1]: initrd-cleanup.service: Deactivated successfully.
May 14 18:09:37.044823 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
May 14 18:09:37.060864 systemd[1]: sysroot-boot.mount: Deactivated successfully.
May 14 18:09:37.069789 systemd[1]: ignition-mount.service: Deactivated successfully.
May 14 18:09:37.069856 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
May 14 18:09:37.071302 systemd[1]: ignition-disks.service: Deactivated successfully.
May 14 18:09:37.071399 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
May 14 18:09:37.075965 systemd[1]: ignition-kargs.service: Deactivated successfully.
May 14 18:09:37.076007 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
May 14 18:09:37.076899 systemd[1]: ignition-fetch.service: Deactivated successfully.
May 14 18:09:37.076934 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
May 14 18:09:37.079975 systemd[1]: Stopped target network.target - Network.
May 14 18:09:37.086339 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
May 14 18:09:37.086384 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
May 14 18:09:37.088174 systemd[1]: Stopped target paths.target - Path Units.
May 14 18:09:37.091946 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
May 14 18:09:37.095876 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 14 18:09:37.123932 systemd[1]: Stopped target slices.target - Slice Units.
May 14 18:09:37.124809 systemd[1]: Stopped target sockets.target - Socket Units.
May 14 18:09:37.127416 systemd[1]: iscsid.socket: Deactivated successfully.
May 14 18:09:37.127451 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
May 14 18:09:37.131534 systemd[1]: iscsiuio.socket: Deactivated successfully.
May 14 18:09:37.131569 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
May 14 18:09:37.132485 systemd[1]: ignition-setup.service: Deactivated successfully.
May 14 18:09:37.132527 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
May 14 18:09:37.137864 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
May 14 18:09:37.137905 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
May 14 18:09:37.142883 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
May 14 18:09:37.146090 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
May 14 18:09:37.154937 systemd[1]: systemd-resolved.service: Deactivated successfully.
May 14 18:09:37.155017 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
May 14 18:09:37.160343 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
May 14 18:09:37.160500 systemd[1]: systemd-networkd.service: Deactivated successfully.
May 14 18:09:37.160573 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
May 14 18:09:37.163633 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
May 14 18:09:37.164209 systemd[1]: Stopped target network-pre.target - Preparation for Network.
May 14 18:09:37.168094 systemd[1]: systemd-networkd.socket: Deactivated successfully.
May 14 18:09:37.168131 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
May 14 18:09:37.173636 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
May 14 18:09:37.175194 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
May 14 18:09:37.175247 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
May 14 18:09:37.175431 systemd[1]: systemd-sysctl.service: Deactivated successfully.
May 14 18:09:37.175464 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
May 14 18:09:37.208180 kernel: hv_netvsc f8615163-0000-1000-2000-7c1e52342850 eth0: Data path switched from VF: enP30832s1
May 14 18:09:37.208336 kernel: mana 7870:00:00.0 enP30832s1: Configured steering vPort 0 entries 64
May 14 18:09:37.180326 systemd[1]: systemd-modules-load.service: Deactivated successfully.
May 14 18:09:37.180365 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
May 14 18:09:37.180524 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
May 14 18:09:37.180555 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
May 14 18:09:37.180787 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
May 14 18:09:37.181754 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
May 14 18:09:37.181800 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
May 14 18:09:37.197284 systemd[1]: systemd-udevd.service: Deactivated successfully.
May 14 18:09:37.197399 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
May 14 18:09:37.203997 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
May 14 18:09:37.204057 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
May 14 18:09:37.210830 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
May 14 18:09:37.210864 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
May 14 18:09:37.214822 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
May 14 18:09:37.214872 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
May 14 18:09:37.232783 systemd[1]: dracut-cmdline.service: Deactivated successfully.
May 14 18:09:37.233235 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
May 14 18:09:37.239565 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
May 14 18:09:37.239618 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
May 14 18:09:37.243856 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
May 14 18:09:37.246433 systemd[1]: systemd-network-generator.service: Deactivated successfully.
May 14 18:09:37.247765 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
May 14 18:09:37.250810 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
May 14 18:09:37.250855 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
May 14 18:09:37.258990 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
May 14 18:09:37.259036 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
May 14 18:09:37.266862 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
May 14 18:09:37.266911 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
May 14 18:09:37.267768 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
May 14 18:09:37.267799 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
May 14 18:09:37.269097 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
May 14 18:09:37.269138 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully.
May 14 18:09:37.269164 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
May 14 18:09:37.269191 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
May 14 18:09:37.269427 systemd[1]: network-cleanup.service: Deactivated successfully.
May 14 18:09:37.269498 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
May 14 18:09:37.280165 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
May 14 18:09:37.280228 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
May 14 18:09:39.476418 systemd[1]: sysroot-boot.service: Deactivated successfully.
May 14 18:09:39.476555 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
May 14 18:09:39.477593 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
May 14 18:09:39.482034 systemd[1]: initrd-setup-root.service: Deactivated successfully.
May 14 18:09:39.482088 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
May 14 18:09:39.484856 systemd[1]: Starting initrd-switch-root.service - Switch Root...
May 14 18:09:39.515991 systemd[1]: Switching root.
May 14 18:09:39.609988 systemd-journald[205]: Journal stopped
May 14 18:09:46.255203 systemd-journald[205]: Received SIGTERM from PID 1 (systemd).
May 14 18:09:46.255231 kernel: SELinux: policy capability network_peer_controls=1
May 14 18:09:46.255243 kernel: SELinux: policy capability open_perms=1
May 14 18:09:46.255252 kernel: SELinux: policy capability extended_socket_class=1
May 14 18:09:46.255260 kernel: SELinux: policy capability always_check_network=0
May 14 18:09:46.255268 kernel: SELinux: policy capability cgroup_seclabel=1
May 14 18:09:46.255279 kernel: SELinux: policy capability nnp_nosuid_transition=1
May 14 18:09:46.255288 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
May 14 18:09:46.255297 kernel: SELinux: policy capability ioctl_skip_cloexec=0
May 14 18:09:46.255304 kernel: SELinux: policy capability userspace_initial_context=0
May 14 18:09:46.255312 kernel: audit: type=1403 audit(1747246183.395:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
May 14 18:09:46.255321 systemd[1]: Successfully loaded SELinux policy in 164.082ms.
May 14 18:09:46.255330 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.937ms.
May 14 18:09:46.255344 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
May 14 18:09:46.255353 systemd[1]: Detected virtualization microsoft.
May 14 18:09:46.255362 systemd[1]: Detected architecture x86-64.
May 14 18:09:46.255370 systemd[1]: Detected first boot.
May 14 18:09:46.255378 systemd[1]: Hostname set to .
May 14 18:09:46.255389 systemd[1]: Initializing machine ID from random generator.
May 14 18:09:46.255397 zram_generator::config[1183]: No configuration found.
May 14 18:09:46.255406 kernel: Guest personality initialized and is inactive
May 14 18:09:46.255413 kernel: VMCI host device registered (name=vmci, major=10, minor=124)
May 14 18:09:46.255421 kernel: Initialized host personality
May 14 18:09:46.255428 kernel: NET: Registered PF_VSOCK protocol family
May 14 18:09:46.255436 systemd[1]: Populated /etc with preset unit settings.
May 14 18:09:46.255446 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
May 14 18:09:46.255455 systemd[1]: initrd-switch-root.service: Deactivated successfully.
May 14 18:09:46.255463 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
May 14 18:09:46.255778 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
May 14 18:09:46.255793 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
May 14 18:09:46.255803 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
May 14 18:09:46.255812 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
May 14 18:09:46.255825 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
May 14 18:09:46.255833 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
May 14 18:09:46.255843 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
May 14 18:09:46.255852 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
May 14 18:09:46.255860 systemd[1]: Created slice user.slice - User and Session Slice.
May 14 18:09:46.255869 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 14 18:09:46.255879 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 14 18:09:46.255888 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
May 14 18:09:46.255899 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
May 14 18:09:46.255911 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
May 14 18:09:46.255921 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
May 14 18:09:46.255930 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
May 14 18:09:46.255939 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 14 18:09:46.255948 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
May 14 18:09:46.255957 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
May 14 18:09:46.255966 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
May 14 18:09:46.255977 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
May 14 18:09:46.255985 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
May 14 18:09:46.255994 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
May 14 18:09:46.256002 systemd[1]: Reached target remote-fs.target - Remote File Systems.
May 14 18:09:46.256011 systemd[1]: Reached target slices.target - Slice Units.
May 14 18:09:46.256020 systemd[1]: Reached target swap.target - Swaps.
May 14 18:09:46.256029 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
May 14 18:09:46.256038 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
May 14 18:09:46.256049 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
May 14 18:09:46.256058 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
May 14 18:09:46.256067 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
May 14 18:09:46.256075 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
May 14 18:09:46.256084 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
May 14 18:09:46.256095 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
May 14 18:09:46.256104 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
May 14 18:09:46.256114 systemd[1]: Mounting media.mount - External Media Directory...
May 14 18:09:46.256124 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 14 18:09:46.256133 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
May 14 18:09:46.256142 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
May 14 18:09:46.256151 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
May 14 18:09:46.256161 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
May 14 18:09:46.256172 systemd[1]: Reached target machines.target - Containers.
May 14 18:09:46.256182 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
May 14 18:09:46.256191 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 14 18:09:46.256201 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
May 14 18:09:46.256210 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
May 14 18:09:46.256220 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
May 14 18:09:46.256229 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
May 14 18:09:46.256238 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
May 14 18:09:46.256249 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
May 14 18:09:46.256258 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
May 14 18:09:46.256268 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
May 14 18:09:46.256277 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
May 14 18:09:46.256286 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
May 14 18:09:46.256296 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
May 14 18:09:46.256305 systemd[1]: Stopped systemd-fsck-usr.service.
May 14 18:09:46.256315 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 14 18:09:46.256327 systemd[1]: Starting systemd-journald.service - Journal Service...
May 14 18:09:46.256336 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
May 14 18:09:46.256345 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
May 14 18:09:46.256354 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
May 14 18:09:46.256363 kernel: fuse: init (API version 7.41)
May 14 18:09:46.256372 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
May 14 18:09:46.256381 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
May 14 18:09:46.256390 systemd[1]: verity-setup.service: Deactivated successfully.
May 14 18:09:46.256399 systemd[1]: Stopped verity-setup.service.
May 14 18:09:46.256410 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 14 18:09:46.256419 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
May 14 18:09:46.256428 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
May 14 18:09:46.256436 systemd[1]: Mounted media.mount - External Media Directory.
May 14 18:09:46.256445 kernel: loop: module loaded
May 14 18:09:46.256454 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
May 14 18:09:46.256463 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
May 14 18:09:46.256473 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
May 14 18:09:46.256483 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
May 14 18:09:46.256492 systemd[1]: modprobe@configfs.service: Deactivated successfully.
May 14 18:09:46.256502 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
May 14 18:09:46.256511 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
May 14 18:09:46.256519 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
May 14 18:09:46.256528 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
May 14 18:09:46.256537 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
May 14 18:09:46.256546 systemd[1]: modprobe@fuse.service: Deactivated successfully.
May 14 18:09:46.256555 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
May 14 18:09:46.256566 systemd[1]: modprobe@loop.service: Deactivated successfully.
May 14 18:09:46.256575 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
May 14 18:09:46.256584 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
May 14 18:09:46.256593 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
May 14 18:09:46.256625 systemd-journald[1269]: Collecting audit messages is disabled.
May 14 18:09:46.256649 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
May 14 18:09:46.256660 systemd-journald[1269]: Journal started
May 14 18:09:46.256686 systemd-journald[1269]: Runtime Journal (/run/log/journal/c22bde849388404bbe6a0b78c100e1e6) is 8M, max 159M, 151M free.
May 14 18:09:45.833088 systemd[1]: Queued start job for default target multi-user.target.
May 14 18:09:45.841197 systemd[1]: Unnecessary job was removed for dev-nvme0n1p6.device - /dev/nvme0n1p6.
May 14 18:09:45.841495 systemd[1]: systemd-journald.service: Deactivated successfully.
May 14 18:09:46.261196 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
May 14 18:09:46.266231 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
May 14 18:09:46.266270 systemd[1]: Reached target local-fs.target - Local File Systems.
May 14 18:09:46.270754 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
May 14 18:09:46.275935 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
May 14 18:09:46.278761 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 14 18:09:46.286957 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
May 14 18:09:46.295812 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
May 14 18:09:46.295865 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
May 14 18:09:46.302842 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
May 14 18:09:46.309250 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
May 14 18:09:46.314668 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
May 14 18:09:46.321830 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
May 14 18:09:46.326762 systemd[1]: Started systemd-journald.service - Journal Service.
May 14 18:09:46.336913 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
May 14 18:09:46.339061 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
May 14 18:09:46.343055 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
May 14 18:09:46.346976 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
May 14 18:09:46.358086 systemd[1]: Reached target network-pre.target - Preparation for Network.
May 14 18:09:46.362849 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
May 14 18:09:46.372038 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
May 14 18:09:46.434930 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
May 14 18:09:46.449699 systemd-journald[1269]: Time spent on flushing to /var/log/journal/c22bde849388404bbe6a0b78c100e1e6 is 956.131ms for 985 entries.
May 14 18:09:46.449699 systemd-journald[1269]: System Journal (/var/log/journal/c22bde849388404bbe6a0b78c100e1e6) is 11.8M, max 2.6G, 2.6G free.
May 14 18:09:48.282105 systemd-journald[1269]: Received client request to flush runtime journal.
May 14 18:09:48.282144 kernel: ACPI: bus type drm_connector registered
May 14 18:09:48.282162 kernel: loop0: detected capacity change from 0 to 146240
May 14 18:09:48.282177 systemd-journald[1269]: /var/log/journal/c22bde849388404bbe6a0b78c100e1e6/system.journal: Realtime clock jumped backwards relative to last journal entry, rotating.
May 14 18:09:48.282199 systemd-journald[1269]: Rotating system journal.
May 14 18:09:46.472484 systemd[1]: modprobe@drm.service: Deactivated successfully.
May 14 18:09:46.472594 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
May 14 18:09:46.545216 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
May 14 18:09:46.575823 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
May 14 18:09:46.579949 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
May 14 18:09:46.584432 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
May 14 18:09:46.588871 systemd-tmpfiles[1306]: ACLs are not supported, ignoring.
May 14 18:09:46.588880 systemd-tmpfiles[1306]: ACLs are not supported, ignoring.
May 14 18:09:46.600913 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
May 14 18:09:46.604570 systemd[1]: Starting systemd-sysusers.service - Create System Users...
May 14 18:09:47.527138 systemd[1]: Finished systemd-sysusers.service - Create System Users.
May 14 18:09:47.532523 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
May 14 18:09:47.548477 systemd-tmpfiles[1338]: ACLs are not supported, ignoring.
May 14 18:09:47.548486 systemd-tmpfiles[1338]: ACLs are not supported, ignoring.
May 14 18:09:47.550555 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
May 14 18:09:48.282864 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
May 14 18:09:48.283426 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
May 14 18:09:48.287058 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
May 14 18:09:48.521770 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
May 14 18:09:48.560768 kernel: loop1: detected capacity change from 0 to 28536
May 14 18:09:49.466483 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
May 14 18:09:49.469593 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
May 14 18:09:49.498181 systemd-udevd[1350]: Using default interface naming scheme 'v255'.
May 14 18:09:49.516769 kernel: loop2: detected capacity change from 0 to 113872
May 14 18:09:49.750465 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
May 14 18:09:49.755789 systemd[1]: Starting systemd-networkd.service - Network Configuration...
May 14 18:09:49.814833 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
May 14 18:09:49.833758 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#150 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
May 14 18:09:49.839856 kernel: hv_vmbus: registering driver hyperv_fb
May 14 18:09:49.841770 kernel: hyperv_fb: Synthvid Version major 3, minor 5
May 14 18:09:49.843760 kernel: hyperv_fb: Screen resolution: 1024x768, Color depth: 32, Frame buffer size: 8388608
May 14 18:09:49.844783 kernel: Console: switching to colour dummy device 80x25
May 14 18:09:49.849569 kernel: Console: switching to colour frame buffer device 128x48
May 14 18:09:49.856762 kernel: mousedev: PS/2 mouse device common for all mice
May 14 18:09:49.898223 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
May 14 18:09:49.918761 kernel: hv_vmbus: registering driver hv_balloon
May 14 18:09:49.920763 kernel: hv_balloon: Using Dynamic Memory protocol version 2.0
May 14 18:09:49.993356 systemd[1]: Started systemd-userdbd.service - User Database Manager.
May 14 18:09:50.046946 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 14 18:09:50.054415 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 14 18:09:50.054906 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 14 18:09:50.058287 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 14 18:09:50.075704 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 14 18:09:50.075934 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 14 18:09:50.078840 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 14 18:09:50.209758 kernel: kvm_intel: Using Hyper-V Enlightened VMCS May 14 18:09:50.297767 kernel: loop3: detected capacity change from 0 to 205544 May 14 18:09:50.491432 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - MSFT NVMe Accelerator v1.0 OEM. May 14 18:09:50.493358 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... May 14 18:09:50.689408 systemd-networkd[1358]: lo: Link UP May 14 18:09:50.689416 systemd-networkd[1358]: lo: Gained carrier May 14 18:09:50.690903 systemd-networkd[1358]: Enumeration completed May 14 18:09:50.690969 systemd[1]: Started systemd-networkd.service - Network Configuration. May 14 18:09:50.693233 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... May 14 18:09:50.694898 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... May 14 18:09:50.698074 systemd-networkd[1358]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 14 18:09:50.698128 systemd-networkd[1358]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. 
May 14 18:09:50.700768 kernel: mana 7870:00:00.0 enP30832s1: Configured vPort 0 PD 18 DB 16 May 14 18:09:50.703770 kernel: mana 7870:00:00.0 enP30832s1: Configured steering vPort 0 entries 64 May 14 18:09:50.705474 systemd-networkd[1358]: enP30832s1: Link UP May 14 18:09:50.705610 systemd-networkd[1358]: eth0: Link UP May 14 18:09:50.705647 systemd-networkd[1358]: eth0: Gained carrier May 14 18:09:50.705686 systemd-networkd[1358]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 14 18:09:50.705797 kernel: hv_netvsc f8615163-0000-1000-2000-7c1e52342850 eth0: Data path switched to VF: enP30832s1 May 14 18:09:50.714951 systemd-networkd[1358]: enP30832s1: Gained carrier May 14 18:09:50.750690 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. May 14 18:09:50.755588 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. May 14 18:09:50.935761 kernel: loop4: detected capacity change from 0 to 146240 May 14 18:09:50.949762 kernel: loop5: detected capacity change from 0 to 28536 May 14 18:09:50.969760 kernel: loop6: detected capacity change from 0 to 113872 May 14 18:09:50.983785 kernel: loop7: detected capacity change from 0 to 205544 May 14 18:09:51.179173 (sd-merge)[1448]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-azure'. May 14 18:09:51.179596 (sd-merge)[1448]: Merged extensions into '/usr'. May 14 18:09:51.182715 systemd[1]: Reload requested from client PID 1305 ('systemd-sysext') (unit systemd-sysext.service)... May 14 18:09:51.182726 systemd[1]: Reloading... May 14 18:09:51.241770 zram_generator::config[1486]: No configuration found. 
May 14 18:09:51.309910 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 14 18:09:51.428153 systemd[1]: Reloading finished in 245 ms. May 14 18:09:51.455115 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. May 14 18:09:51.456868 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 14 18:09:51.467583 systemd[1]: Starting ensure-sysext.service... May 14 18:09:51.470847 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 14 18:09:51.481265 systemd[1]: Reload requested from client PID 1539 ('systemctl') (unit ensure-sysext.service)... May 14 18:09:51.481282 systemd[1]: Reloading... May 14 18:09:51.487781 systemd-tmpfiles[1540]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. May 14 18:09:51.488005 systemd-tmpfiles[1540]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. May 14 18:09:51.488226 systemd-tmpfiles[1540]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. May 14 18:09:51.488459 systemd-tmpfiles[1540]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. May 14 18:09:51.489094 systemd-tmpfiles[1540]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. May 14 18:09:51.489352 systemd-tmpfiles[1540]: ACLs are not supported, ignoring. May 14 18:09:51.489427 systemd-tmpfiles[1540]: ACLs are not supported, ignoring. May 14 18:09:51.492854 systemd-tmpfiles[1540]: Detected autofs mount point /boot during canonicalization of boot. 
May 14 18:09:51.492860 systemd-tmpfiles[1540]: Skipping /boot May 14 18:09:51.500641 systemd-tmpfiles[1540]: Detected autofs mount point /boot during canonicalization of boot. May 14 18:09:51.500655 systemd-tmpfiles[1540]: Skipping /boot May 14 18:09:51.535765 zram_generator::config[1567]: No configuration found. May 14 18:09:51.613036 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 14 18:09:51.687446 systemd[1]: Reloading finished in 205 ms. May 14 18:09:51.722211 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 14 18:09:51.730381 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 14 18:09:51.745473 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... May 14 18:09:51.748936 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... May 14 18:09:51.752955 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 14 18:09:51.755858 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... May 14 18:09:51.761437 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 14 18:09:51.761593 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 14 18:09:51.762811 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 14 18:09:51.770843 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 14 18:09:51.773917 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 14 18:09:51.776848 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. 
May 14 18:09:51.776958 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 14 18:09:51.777038 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 14 18:09:51.784273 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 14 18:09:51.784880 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 14 18:09:51.789492 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 14 18:09:51.789666 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 14 18:09:51.792503 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 14 18:09:51.794490 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 14 18:09:51.794656 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 14 18:09:51.794801 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 14 18:09:51.795694 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. May 14 18:09:51.801468 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 14 18:09:51.801910 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 14 18:09:51.804163 systemd[1]: modprobe@loop.service: Deactivated successfully. 
May 14 18:09:51.804331 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 14 18:09:51.809238 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 14 18:09:51.809366 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 14 18:09:51.812554 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. May 14 18:09:51.822415 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 14 18:09:51.823029 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 14 18:09:51.824922 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 14 18:09:51.828981 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... May 14 18:09:51.831988 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 14 18:09:51.839923 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 14 18:09:51.842891 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 14 18:09:51.842996 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 14 18:09:51.843123 systemd[1]: Reached target time-set.target - System Time Set. May 14 18:09:51.845181 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 14 18:09:51.846223 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 14 18:09:51.846352 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. 
May 14 18:09:51.849149 systemd[1]: modprobe@drm.service: Deactivated successfully. May 14 18:09:51.849283 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 14 18:09:51.851719 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 14 18:09:51.856901 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 14 18:09:51.862064 systemd[1]: Finished ensure-sysext.service. May 14 18:09:51.863431 systemd[1]: modprobe@loop.service: Deactivated successfully. May 14 18:09:51.863541 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 14 18:09:51.868461 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 14 18:09:51.868507 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 14 18:09:51.898514 systemd-resolved[1634]: Positive Trust Anchors: May 14 18:09:51.898526 systemd-resolved[1634]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 14 18:09:51.898556 systemd-resolved[1634]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 14 18:09:51.901805 systemd-resolved[1634]: Using system hostname 'ci-4334.0.0-a-c37eb65ec1'. May 14 18:09:51.903081 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 14 18:09:51.904422 systemd[1]: Reached target network.target - Network. 
May 14 18:09:51.907780 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 14 18:09:51.995967 augenrules[1675]: No rules May 14 18:09:51.996726 systemd[1]: audit-rules.service: Deactivated successfully. May 14 18:09:51.996917 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 14 18:09:52.191863 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. May 14 18:09:52.194921 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). May 14 18:09:52.254889 systemd-networkd[1358]: eth0: Gained IPv6LL May 14 18:09:52.766887 systemd-networkd[1358]: enP30832s1: Gained IPv6LL May 14 18:09:53.749822 systemd-networkd[1358]: eth0: DHCPv4 address 10.200.8.38/24, gateway 10.200.8.1 acquired from 168.63.129.16 May 14 18:09:53.752774 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. May 14 18:09:53.754733 systemd[1]: Reached target network-online.target - Network is Online. May 14 18:09:54.715645 ldconfig[1294]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. May 14 18:09:54.723311 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. May 14 18:09:54.725677 systemd[1]: Starting systemd-update-done.service - Update is Completed... May 14 18:09:54.751094 systemd[1]: Finished systemd-update-done.service - Update is Completed. May 14 18:09:54.752467 systemd[1]: Reached target sysinit.target - System Initialization. May 14 18:09:54.754905 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. May 14 18:09:54.756217 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. 
May 14 18:09:54.758856 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. May 14 18:09:54.760112 systemd[1]: Started logrotate.timer - Daily rotation of log files. May 14 18:09:54.761450 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. May 14 18:09:54.763829 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. May 14 18:09:54.765094 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). May 14 18:09:54.765126 systemd[1]: Reached target paths.target - Path Units. May 14 18:09:54.766091 systemd[1]: Reached target timers.target - Timer Units. May 14 18:09:54.769397 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. May 14 18:09:54.771371 systemd[1]: Starting docker.socket - Docker Socket for the API... May 14 18:09:54.774108 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). May 14 18:09:54.775649 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). May 14 18:09:54.778802 systemd[1]: Reached target ssh-access.target - SSH Access Available. May 14 18:09:54.787201 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. May 14 18:09:54.794220 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. May 14 18:09:54.795854 systemd[1]: Listening on docker.socket - Docker Socket for the API. May 14 18:09:54.799349 systemd[1]: Reached target sockets.target - Socket Units. May 14 18:09:54.800356 systemd[1]: Reached target basic.target - Basic System. May 14 18:09:54.802812 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. May 14 18:09:54.802832 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. 
May 14 18:09:54.804540 systemd[1]: Starting chronyd.service - NTP client/server... May 14 18:09:54.807825 systemd[1]: Starting containerd.service - containerd container runtime... May 14 18:09:54.815393 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... May 14 18:09:54.821900 systemd[1]: Starting dbus.service - D-Bus System Message Bus... May 14 18:09:54.825021 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... May 14 18:09:54.832392 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... May 14 18:09:54.835472 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... May 14 18:09:54.836766 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). May 14 18:09:54.839908 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... May 14 18:09:54.847689 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 14 18:09:54.853893 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... May 14 18:09:54.855375 jq[1694]: false May 14 18:09:54.857581 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... May 14 18:09:54.861843 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... May 14 18:09:54.867234 (chronyd)[1688]: chronyd.service: Referenced but unset environment variable evaluates to an empty string: OPTIONS May 14 18:09:54.868973 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... May 14 18:09:54.872905 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... May 14 18:09:54.878297 systemd[1]: Starting systemd-logind.service - User Login Management... May 14 18:09:54.882516 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). 
May 14 18:09:54.884120 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. May 14 18:09:54.884571 systemd[1]: Starting update-engine.service - Update Engine... May 14 18:09:54.890964 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... May 14 18:09:54.897551 chronyd[1714]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +ASYNCDNS +NTS +SECHASH +IPV6 -DEBUG) May 14 18:09:54.899584 extend-filesystems[1697]: Found loop4 May 14 18:09:54.906163 extend-filesystems[1697]: Found loop5 May 14 18:09:54.906163 extend-filesystems[1697]: Found loop6 May 14 18:09:54.906163 extend-filesystems[1697]: Found loop7 May 14 18:09:54.906163 extend-filesystems[1697]: Found sr0 May 14 18:09:54.906163 extend-filesystems[1697]: Found nvme0n1 May 14 18:09:54.906163 extend-filesystems[1697]: Found nvme0n1p1 May 14 18:09:54.906163 extend-filesystems[1697]: Found nvme0n1p2 May 14 18:09:54.906163 extend-filesystems[1697]: Found nvme0n1p3 May 14 18:09:54.906163 extend-filesystems[1697]: Found usr May 14 18:09:54.906163 extend-filesystems[1697]: Found nvme0n1p4 May 14 18:09:54.906163 extend-filesystems[1697]: Found nvme0n1p6 May 14 18:09:54.906163 extend-filesystems[1697]: Found nvme0n1p7 May 14 18:09:54.906163 extend-filesystems[1697]: Found nvme0n1p9 May 14 18:09:54.906163 extend-filesystems[1697]: Checking size of /dev/nvme0n1p9 May 14 18:09:54.902805 chronyd[1714]: Timezone right/UTC failed leap second check, ignoring May 14 18:09:54.902019 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. May 14 18:09:54.902942 chronyd[1714]: Loaded seccomp filter (level 2) May 14 18:09:54.906712 systemd[1]: Started chronyd.service - NTP client/server. May 14 18:09:54.909781 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. 
May 14 18:09:54.912490 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. May 14 18:09:54.941029 jq[1709]: true May 14 18:09:54.915680 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. May 14 18:09:54.916167 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. May 14 18:09:54.942185 jq[1721]: true May 14 18:09:54.962978 (ntainerd)[1734]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR May 14 18:09:54.966457 systemd[1]: motdgen.service: Deactivated successfully. May 14 18:09:54.966637 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. May 14 18:09:55.004710 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. May 14 18:09:55.009477 extend-filesystems[1697]: Old size kept for /dev/nvme0n1p9 May 14 18:09:55.013067 systemd[1]: extend-filesystems.service: Deactivated successfully. May 14 18:09:55.014104 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. May 14 18:09:55.086381 google_oslogin_nss_cache[1698]: oslogin_cache_refresh[1698]: Refreshing passwd entry cache May 14 18:09:55.085900 oslogin_cache_refresh[1698]: Refreshing passwd entry cache May 14 18:09:55.105157 google_oslogin_nss_cache[1698]: oslogin_cache_refresh[1698]: Failure getting users, quitting May 14 18:09:55.105157 google_oslogin_nss_cache[1698]: oslogin_cache_refresh[1698]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. May 14 18:09:55.105157 google_oslogin_nss_cache[1698]: oslogin_cache_refresh[1698]: Refreshing group entry cache May 14 18:09:55.104797 oslogin_cache_refresh[1698]: Failure getting users, quitting May 14 18:09:55.104813 oslogin_cache_refresh[1698]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. 
May 14 18:09:55.104851 oslogin_cache_refresh[1698]: Refreshing group entry cache May 14 18:09:55.118026 google_oslogin_nss_cache[1698]: oslogin_cache_refresh[1698]: Failure getting groups, quitting May 14 18:09:55.118087 oslogin_cache_refresh[1698]: Failure getting groups, quitting May 14 18:09:55.118131 google_oslogin_nss_cache[1698]: oslogin_cache_refresh[1698]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. May 14 18:09:55.118158 oslogin_cache_refresh[1698]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. May 14 18:09:55.119336 systemd[1]: google-oslogin-cache.service: Deactivated successfully. May 14 18:09:55.119501 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. May 14 18:09:55.144537 update_engine[1707]: I20250514 18:09:55.144228 1707 main.cc:92] Flatcar Update Engine starting May 14 18:09:55.200664 systemd-logind[1706]: New seat seat0. May 14 18:09:55.202867 systemd-logind[1706]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) May 14 18:09:55.202994 systemd[1]: Started systemd-logind.service - User Login Management. May 14 18:09:55.205509 bash[1754]: Updated "/home/core/.ssh/authorized_keys" May 14 18:09:55.206120 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. May 14 18:09:55.210639 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. May 14 18:09:55.219786 tar[1720]: linux-amd64/helm May 14 18:09:55.350396 sshd_keygen[1718]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 May 14 18:09:55.356494 dbus-daemon[1691]: [system] SELinux support is enabled May 14 18:09:55.356905 systemd[1]: Started dbus.service - D-Bus System Message Bus. 
May 14 18:09:55.362917 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). May 14 18:09:55.362947 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. May 14 18:09:55.366860 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). May 14 18:09:55.366882 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. May 14 18:09:55.373125 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. May 14 18:09:55.373593 dbus-daemon[1691]: [system] Successfully activated service 'org.freedesktop.systemd1' May 14 18:09:55.376329 update_engine[1707]: I20250514 18:09:55.375600 1707 update_check_scheduler.cc:74] Next update check in 5m40s May 14 18:09:55.378786 systemd[1]: Started update-engine.service - Update Engine. May 14 18:09:55.382316 systemd[1]: Starting issuegen.service - Generate /run/issue... May 14 18:09:55.385907 systemd[1]: Starting waagent.service - Microsoft Azure Linux Agent... May 14 18:09:55.390481 systemd[1]: Started locksmithd.service - Cluster reboot manager. May 14 18:09:55.409475 systemd[1]: issuegen.service: Deactivated successfully. May 14 18:09:55.409641 systemd[1]: Finished issuegen.service - Generate /run/issue. May 14 18:09:55.419287 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... May 14 18:09:55.445617 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. May 14 18:09:55.448932 systemd[1]: Started waagent.service - Microsoft Azure Linux Agent. May 14 18:09:55.455961 systemd[1]: Started getty@tty1.service - Getty on tty1. May 14 18:09:55.462422 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. 
May 14 18:09:55.464352 systemd[1]: Reached target getty.target - Login Prompts. May 14 18:09:55.606527 coreos-metadata[1690]: May 14 18:09:55.606 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1 May 14 18:09:55.610735 coreos-metadata[1690]: May 14 18:09:55.610 INFO Fetch successful May 14 18:09:55.611590 coreos-metadata[1690]: May 14 18:09:55.611 INFO Fetching http://168.63.129.16/machine/?comp=goalstate: Attempt #1 May 14 18:09:55.615696 coreos-metadata[1690]: May 14 18:09:55.615 INFO Fetch successful May 14 18:09:55.616073 coreos-metadata[1690]: May 14 18:09:55.616 INFO Fetching http://168.63.129.16/machine/eda8e7c4-9757-41b6-82a2-03c07b09f390/90600005%2D1a8e%2D4109%2D9400%2D0dfa0fb9b13a.%5Fci%2D4334.0.0%2Da%2Dc37eb65ec1?comp=config&type=sharedConfig&incarnation=1: Attempt #1 May 14 18:09:55.617707 coreos-metadata[1690]: May 14 18:09:55.617 INFO Fetch successful May 14 18:09:55.620080 coreos-metadata[1690]: May 14 18:09:55.620 INFO Fetching http://169.254.169.254/metadata/instance/compute/vmSize?api-version=2017-08-01&format=text: Attempt #1 May 14 18:09:55.631912 coreos-metadata[1690]: May 14 18:09:55.631 INFO Fetch successful May 14 18:09:55.679687 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. May 14 18:09:55.683711 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. May 14 18:09:55.759862 tar[1720]: linux-amd64/LICENSE May 14 18:09:55.759862 tar[1720]: linux-amd64/README.md May 14 18:09:55.768592 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. May 14 18:09:56.034824 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 14 18:09:56.040771 locksmithd[1801]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" May 14 18:09:56.043039 (kubelet)[1834]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 14 18:09:56.504914 kubelet[1834]: E0514 18:09:56.504830 1834 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 14 18:09:56.507059 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 14 18:09:56.507260 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 14 18:09:56.507719 systemd[1]: kubelet.service: Consumed 760ms CPU time, 234M memory peak. May 14 18:09:56.509465 containerd[1734]: time="2025-05-14T18:09:56Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 May 14 18:09:56.510092 containerd[1734]: time="2025-05-14T18:09:56.510058870Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4 May 14 18:09:56.516118 containerd[1734]: time="2025-05-14T18:09:56.516091207Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="6.424µs" May 14 18:09:56.516118 containerd[1734]: time="2025-05-14T18:09:56.516113608Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 May 14 18:09:56.516200 containerd[1734]: time="2025-05-14T18:09:56.516128800Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 May 14 18:09:56.516249 
containerd[1734]: time="2025-05-14T18:09:56.516236999Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 May 14 18:09:56.516269 containerd[1734]: time="2025-05-14T18:09:56.516249705Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 May 14 18:09:56.516286 containerd[1734]: time="2025-05-14T18:09:56.516267860Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 14 18:09:56.516324 containerd[1734]: time="2025-05-14T18:09:56.516312790Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 14 18:09:56.516345 containerd[1734]: time="2025-05-14T18:09:56.516321992Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 14 18:09:56.516498 containerd[1734]: time="2025-05-14T18:09:56.516483701Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 14 18:09:56.516498 containerd[1734]: time="2025-05-14T18:09:56.516495481Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 14 18:09:56.516546 containerd[1734]: time="2025-05-14T18:09:56.516504289Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 14 18:09:56.516546 containerd[1734]: time="2025-05-14T18:09:56.516511573Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 May 14 18:09:56.516583 containerd[1734]: time="2025-05-14T18:09:56.516561898Z" level=info msg="loading 
plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 May 14 18:09:56.516706 containerd[1734]: time="2025-05-14T18:09:56.516692665Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 14 18:09:56.516729 containerd[1734]: time="2025-05-14T18:09:56.516713587Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 14 18:09:56.516729 containerd[1734]: time="2025-05-14T18:09:56.516722179Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 May 14 18:09:56.516778 containerd[1734]: time="2025-05-14T18:09:56.516764768Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 May 14 18:09:56.516947 containerd[1734]: time="2025-05-14T18:09:56.516935146Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 May 14 18:09:56.516980 containerd[1734]: time="2025-05-14T18:09:56.516974974Z" level=info msg="metadata content store policy set" policy=shared May 14 18:09:56.533692 containerd[1734]: time="2025-05-14T18:09:56.533665926Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 May 14 18:09:56.533778 containerd[1734]: time="2025-05-14T18:09:56.533706621Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 May 14 18:09:56.533778 containerd[1734]: time="2025-05-14T18:09:56.533719713Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 May 14 18:09:56.533778 containerd[1734]: time="2025-05-14T18:09:56.533729951Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 May 14 18:09:56.533778 containerd[1734]: 
time="2025-05-14T18:09:56.533752316Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 May 14 18:09:56.533778 containerd[1734]: time="2025-05-14T18:09:56.533761891Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 May 14 18:09:56.533778 containerd[1734]: time="2025-05-14T18:09:56.533774227Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 May 14 18:09:56.533892 containerd[1734]: time="2025-05-14T18:09:56.533783709Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 May 14 18:09:56.533892 containerd[1734]: time="2025-05-14T18:09:56.533793032Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 May 14 18:09:56.533892 containerd[1734]: time="2025-05-14T18:09:56.533801032Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 May 14 18:09:56.533892 containerd[1734]: time="2025-05-14T18:09:56.533809040Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 May 14 18:09:56.533892 containerd[1734]: time="2025-05-14T18:09:56.533819503Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 May 14 18:09:56.533973 containerd[1734]: time="2025-05-14T18:09:56.533905495Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 May 14 18:09:56.533973 containerd[1734]: time="2025-05-14T18:09:56.533920185Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 May 14 18:09:56.533973 containerd[1734]: time="2025-05-14T18:09:56.533932631Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 May 14 18:09:56.533973 containerd[1734]: 
time="2025-05-14T18:09:56.533947804Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 May 14 18:09:56.533973 containerd[1734]: time="2025-05-14T18:09:56.533964601Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 May 14 18:09:56.534049 containerd[1734]: time="2025-05-14T18:09:56.533974605Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 May 14 18:09:56.534049 containerd[1734]: time="2025-05-14T18:09:56.533984707Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 May 14 18:09:56.534049 containerd[1734]: time="2025-05-14T18:09:56.533993248Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 May 14 18:09:56.534049 containerd[1734]: time="2025-05-14T18:09:56.534003683Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 May 14 18:09:56.534049 containerd[1734]: time="2025-05-14T18:09:56.534012453Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 May 14 18:09:56.534049 containerd[1734]: time="2025-05-14T18:09:56.534021599Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 May 14 18:09:56.534179 containerd[1734]: time="2025-05-14T18:09:56.534081655Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" May 14 18:09:56.534179 containerd[1734]: time="2025-05-14T18:09:56.534095741Z" level=info msg="Start snapshots syncer" May 14 18:09:56.534179 containerd[1734]: time="2025-05-14T18:09:56.534118557Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 May 14 18:09:56.534351 containerd[1734]: time="2025-05-14T18:09:56.534325635Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" May 14 18:09:56.534460 containerd[1734]: time="2025-05-14T18:09:56.534365233Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 May 14 18:09:56.534460 containerd[1734]: time="2025-05-14T18:09:56.534430619Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 May 14 18:09:56.534526 containerd[1734]: time="2025-05-14T18:09:56.534508336Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 May 14 18:09:56.534556 containerd[1734]: time="2025-05-14T18:09:56.534525002Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 May 14 18:09:56.534556 containerd[1734]: time="2025-05-14T18:09:56.534534917Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 May 14 18:09:56.534556 containerd[1734]: time="2025-05-14T18:09:56.534544639Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 May 14 18:09:56.534611 containerd[1734]: time="2025-05-14T18:09:56.534555335Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 May 14 18:09:56.534611 containerd[1734]: time="2025-05-14T18:09:56.534569070Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 May 14 18:09:56.534611 containerd[1734]: time="2025-05-14T18:09:56.534582959Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 May 14 18:09:56.534611 containerd[1734]: time="2025-05-14T18:09:56.534603517Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 May 14 18:09:56.534692 containerd[1734]: time="2025-05-14T18:09:56.534613244Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 May 14 18:09:56.534692 containerd[1734]: time="2025-05-14T18:09:56.534622829Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 May 14 18:09:56.534692 containerd[1734]: time="2025-05-14T18:09:56.534647562Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 14 18:09:56.534692 containerd[1734]: time="2025-05-14T18:09:56.534660085Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 14 18:09:56.534692 containerd[1734]: time="2025-05-14T18:09:56.534668401Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 14 18:09:56.534692 containerd[1734]: time="2025-05-14T18:09:56.534676290Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 14 18:09:56.534692 containerd[1734]: time="2025-05-14T18:09:56.534683137Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 May 14 18:09:56.534692 containerd[1734]: time="2025-05-14T18:09:56.534690793Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 May 14 18:09:56.534692 containerd[1734]: time="2025-05-14T18:09:56.534699529Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 May 14 18:09:56.534948 containerd[1734]: time="2025-05-14T18:09:56.534712117Z" level=info msg="runtime interface created" May 14 18:09:56.534948 containerd[1734]: time="2025-05-14T18:09:56.534716070Z" level=info msg="created NRI interface" May 14 18:09:56.534948 containerd[1734]: time="2025-05-14T18:09:56.534723011Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 May 14 18:09:56.534948 containerd[1734]: time="2025-05-14T18:09:56.534731573Z" level=info msg="Connect containerd service" May 14 18:09:56.534948 containerd[1734]: time="2025-05-14T18:09:56.534762533Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" May 14 18:09:56.535303 
containerd[1734]: time="2025-05-14T18:09:56.535284668Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 14 18:09:57.501922 containerd[1734]: time="2025-05-14T18:09:57.501851208Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc May 14 18:09:57.502016 containerd[1734]: time="2025-05-14T18:09:57.501998306Z" level=info msg="Start subscribing containerd event" May 14 18:09:57.502118 containerd[1734]: time="2025-05-14T18:09:57.502040732Z" level=info msg="Start recovering state" May 14 18:09:57.502143 containerd[1734]: time="2025-05-14T18:09:57.502137153Z" level=info msg="Start event monitor" May 14 18:09:57.502160 containerd[1734]: time="2025-05-14T18:09:57.502152150Z" level=info msg="Start cni network conf syncer for default" May 14 18:09:57.502187 containerd[1734]: time="2025-05-14T18:09:57.502166123Z" level=info msg="Start streaming server" May 14 18:09:57.502187 containerd[1734]: time="2025-05-14T18:09:57.502174728Z" level=info msg="Registered namespace \"k8s.io\" with NRI" May 14 18:09:57.502187 containerd[1734]: time="2025-05-14T18:09:57.502181644Z" level=info msg="runtime interface starting up..." May 14 18:09:57.502240 containerd[1734]: time="2025-05-14T18:09:57.502191855Z" level=info msg="starting plugins..." May 14 18:09:57.502240 containerd[1734]: time="2025-05-14T18:09:57.502202343Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" May 14 18:09:57.502427 containerd[1734]: time="2025-05-14T18:09:57.502051882Z" level=info msg=serving... address=/run/containerd/containerd.sock May 14 18:09:57.502427 containerd[1734]: time="2025-05-14T18:09:57.502331757Z" level=info msg="containerd successfully booted in 0.993096s" May 14 18:09:57.502861 systemd[1]: Started containerd.service - containerd container runtime. 
May 14 18:09:57.507188 systemd[1]: Reached target multi-user.target - Multi-User System. May 14 18:09:57.512378 systemd[1]: Startup finished in 2.854s (kernel) + 16.624s (initrd) + 14.280s (userspace) = 33.758s. May 14 18:09:57.771392 login[1814]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) May 14 18:09:57.773790 login[1815]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) May 14 18:09:57.778301 systemd[1]: Created slice user-500.slice - User Slice of UID 500. May 14 18:09:57.779568 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... May 14 18:09:57.786869 systemd-logind[1706]: New session 1 of user core. May 14 18:09:57.790918 systemd-logind[1706]: New session 2 of user core. May 14 18:09:57.799819 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. May 14 18:09:57.802628 systemd[1]: Starting user@500.service - User Manager for UID 500... May 14 18:09:57.814107 (systemd)[1867]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) May 14 18:09:57.816039 systemd-logind[1706]: New session c1 of user core. 
May 14 18:09:57.877011 waagent[1812]: 2025-05-14T18:09:57.876957Z INFO Daemon Daemon Azure Linux Agent Version: 2.12.0.4 May 14 18:09:57.878881 waagent[1812]: 2025-05-14T18:09:57.877389Z INFO Daemon Daemon OS: flatcar 4334.0.0 May 14 18:09:57.880463 waagent[1812]: 2025-05-14T18:09:57.880423Z INFO Daemon Daemon Python: 3.11.12 May 14 18:09:57.882101 waagent[1812]: 2025-05-14T18:09:57.882049Z INFO Daemon Daemon Run daemon May 14 18:09:57.883531 waagent[1812]: 2025-05-14T18:09:57.883497Z INFO Daemon Daemon No RDMA handler exists for distro='Flatcar Container Linux by Kinvolk' version='4334.0.0' May 14 18:09:57.885932 waagent[1812]: 2025-05-14T18:09:57.885899Z INFO Daemon Daemon Using waagent for provisioning May 14 18:09:57.887516 waagent[1812]: 2025-05-14T18:09:57.887487Z INFO Daemon Daemon Activate resource disk May 14 18:09:57.889179 waagent[1812]: 2025-05-14T18:09:57.889150Z INFO Daemon Daemon Searching gen1 prefix 00000000-0001 or gen2 f8b3781a-1e82-4818-a1c3-63d806ec15bb May 14 18:09:57.893045 waagent[1812]: 2025-05-14T18:09:57.893010Z INFO Daemon Daemon Found device: None May 14 18:09:57.894760 waagent[1812]: 2025-05-14T18:09:57.894482Z ERROR Daemon Daemon Failed to mount resource disk [ResourceDiskError] unable to detect disk topology May 14 18:09:57.897150 waagent[1812]: 2025-05-14T18:09:57.897124Z ERROR Daemon Daemon Event: name=WALinuxAgent, op=ActivateResourceDisk, message=[ResourceDiskError] unable to detect disk topology, duration=0 May 14 18:09:57.900938 waagent[1812]: 2025-05-14T18:09:57.900908Z INFO Daemon Daemon Clean protocol and wireserver endpoint May 14 18:09:57.902691 waagent[1812]: 2025-05-14T18:09:57.902664Z INFO Daemon Daemon Running default provisioning handler May 14 18:09:57.909759 waagent[1812]: 2025-05-14T18:09:57.909526Z INFO Daemon Daemon Unable to get cloud-init enabled status from systemctl: Command '['systemctl', 'is-enabled', 'cloud-init-local.service']' returned non-zero exit status 4. 
May 14 18:09:57.910089 waagent[1812]: 2025-05-14T18:09:57.910060Z INFO Daemon Daemon Unable to get cloud-init enabled status from service: [Errno 2] No such file or directory: 'service' May 14 18:09:57.910216 waagent[1812]: 2025-05-14T18:09:57.910200Z INFO Daemon Daemon cloud-init is enabled: False May 14 18:09:57.910339 waagent[1812]: 2025-05-14T18:09:57.910328Z INFO Daemon Daemon Copying ovf-env.xml May 14 18:09:57.959784 waagent[1812]: 2025-05-14T18:09:57.959726Z INFO Daemon Daemon Successfully mounted dvd May 14 18:09:57.967248 systemd[1867]: Queued start job for default target default.target. May 14 18:09:57.971418 systemd[1]: mnt-cdrom-secure.mount: Deactivated successfully. May 14 18:09:57.972918 systemd[1867]: Created slice app.slice - User Application Slice. May 14 18:09:57.973015 systemd[1867]: Reached target paths.target - Paths. May 14 18:09:57.973050 systemd[1867]: Reached target timers.target - Timers. May 14 18:09:57.976812 waagent[1812]: 2025-05-14T18:09:57.973859Z INFO Daemon Daemon Detect protocol endpoint May 14 18:09:57.976812 waagent[1812]: 2025-05-14T18:09:57.975148Z INFO Daemon Daemon Clean protocol and wireserver endpoint May 14 18:09:57.973932 systemd[1867]: Starting dbus.socket - D-Bus User Message Bus Socket... May 14 18:09:57.977187 waagent[1812]: 2025-05-14T18:09:57.977022Z INFO Daemon Daemon WireServer endpoint is not found. Rerun dhcp handler May 14 18:09:57.979083 waagent[1812]: 2025-05-14T18:09:57.978624Z INFO Daemon Daemon Test for route to 168.63.129.16 May 14 18:09:57.980855 waagent[1812]: 2025-05-14T18:09:57.980822Z INFO Daemon Daemon Route to 168.63.129.16 exists May 14 18:09:57.982114 systemd[1867]: Listening on dbus.socket - D-Bus User Message Bus Socket. May 14 18:09:57.982797 waagent[1812]: 2025-05-14T18:09:57.982754Z INFO Daemon Daemon Wire server endpoint:168.63.129.16 May 14 18:09:57.982981 systemd[1867]: Reached target sockets.target - Sockets. 
May 14 18:09:57.983066 systemd[1867]: Reached target basic.target - Basic System. May 14 18:09:57.983112 systemd[1]: Started user@500.service - User Manager for UID 500. May 14 18:09:57.983485 systemd[1867]: Reached target default.target - Main User Target. May 14 18:09:57.983507 systemd[1867]: Startup finished in 162ms. May 14 18:09:57.987900 systemd[1]: Started session-1.scope - Session 1 of User core. May 14 18:09:57.988944 systemd[1]: Started session-2.scope - Session 2 of User core. May 14 18:09:57.999674 waagent[1812]: 2025-05-14T18:09:57.999650Z INFO Daemon Daemon Fabric preferred wire protocol version:2015-04-05 May 14 18:09:58.000687 waagent[1812]: 2025-05-14T18:09:58.000302Z INFO Daemon Daemon Wire protocol version:2012-11-30 May 14 18:09:58.000687 waagent[1812]: 2025-05-14T18:09:58.000506Z INFO Daemon Daemon Server preferred version:2015-04-05 May 14 18:09:58.053627 waagent[1812]: 2025-05-14T18:09:58.053566Z INFO Daemon Daemon Initializing goal state during protocol detection May 14 18:09:58.053949 waagent[1812]: 2025-05-14T18:09:58.053924Z INFO Daemon Daemon Forcing an update of the goal state. May 14 18:09:58.058499 waagent[1812]: 2025-05-14T18:09:58.058476Z INFO Daemon Fetched a new incarnation for the WireServer goal state [incarnation 1] May 14 18:09:58.071680 waagent[1812]: 2025-05-14T18:09:58.071650Z INFO Daemon Daemon HostGAPlugin version: 1.0.8.164 May 14 18:09:58.072188 waagent[1812]: 2025-05-14T18:09:58.072152Z INFO Daemon May 14 18:09:58.072390 waagent[1812]: 2025-05-14T18:09:58.072241Z INFO Daemon Fetched new vmSettings [HostGAPlugin correlation ID: f7a6475e-a556-4a99-85c7-03ac46243476 eTag: 747909170742729847 source: Fabric] May 14 18:09:58.072549 waagent[1812]: 2025-05-14T18:09:58.072524Z INFO Daemon The vmSettings originated via Fabric; will ignore them. 
May 14 18:09:58.072727 waagent[1812]: 2025-05-14T18:09:58.072706Z INFO Daemon May 14 18:09:58.072811 waagent[1812]: 2025-05-14T18:09:58.072777Z INFO Daemon Fetching full goal state from the WireServer [incarnation 1] May 14 18:09:58.078579 waagent[1812]: 2025-05-14T18:09:58.078249Z INFO Daemon Daemon Downloading artifacts profile blob May 14 18:09:58.161198 waagent[1812]: 2025-05-14T18:09:58.161158Z INFO Daemon Downloaded certificate {'thumbprint': '93BAB49CF7F52CDE832033102B664C4028B68BAB', 'hasPrivateKey': True} May 14 18:09:58.164466 waagent[1812]: 2025-05-14T18:09:58.161531Z INFO Daemon Downloaded certificate {'thumbprint': '5484F3AE3A44010D7E0AB6D3FB3C7D0FDA1F53C9', 'hasPrivateKey': False} May 14 18:09:58.164466 waagent[1812]: 2025-05-14T18:09:58.162064Z INFO Daemon Fetch goal state completed May 14 18:09:58.170727 waagent[1812]: 2025-05-14T18:09:58.170671Z INFO Daemon Daemon Starting provisioning May 14 18:09:58.171239 waagent[1812]: 2025-05-14T18:09:58.170837Z INFO Daemon Daemon Handle ovf-env.xml. May 14 18:09:58.171239 waagent[1812]: 2025-05-14T18:09:58.170990Z INFO Daemon Daemon Set hostname [ci-4334.0.0-a-c37eb65ec1] May 14 18:09:58.187557 waagent[1812]: 2025-05-14T18:09:58.187513Z INFO Daemon Daemon Publish hostname [ci-4334.0.0-a-c37eb65ec1] May 14 18:09:58.188612 waagent[1812]: 2025-05-14T18:09:58.187798Z INFO Daemon Daemon Examine /proc/net/route for primary interface May 14 18:09:58.188612 waagent[1812]: 2025-05-14T18:09:58.188128Z INFO Daemon Daemon Primary interface is [eth0] May 14 18:09:58.200163 systemd-networkd[1358]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 14 18:09:58.200169 systemd-networkd[1358]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. 
May 14 18:09:58.200189 systemd-networkd[1358]: eth0: DHCP lease lost May 14 18:09:58.200913 waagent[1812]: 2025-05-14T18:09:58.200873Z INFO Daemon Daemon Create user account if not exists May 14 18:09:58.201979 waagent[1812]: 2025-05-14T18:09:58.201946Z INFO Daemon Daemon User core already exists, skip useradd May 14 18:09:58.202655 waagent[1812]: 2025-05-14T18:09:58.202067Z INFO Daemon Daemon Configure sudoer May 14 18:09:58.205333 waagent[1812]: 2025-05-14T18:09:58.205287Z INFO Daemon Daemon Configure sshd May 14 18:09:58.208520 waagent[1812]: 2025-05-14T18:09:58.208485Z INFO Daemon Daemon Added a configuration snippet disabling SSH password-based authentication methods. It also configures SSH client probing to keep connections alive. May 14 18:09:58.213326 waagent[1812]: 2025-05-14T18:09:58.208612Z INFO Daemon Daemon Deploy ssh public key. May 14 18:09:58.217849 systemd-networkd[1358]: eth0: DHCPv4 address 10.200.8.38/24, gateway 10.200.8.1 acquired from 168.63.129.16 May 14 18:09:59.279577 waagent[1812]: 2025-05-14T18:09:59.279504Z INFO Daemon Daemon Provisioning complete May 14 18:09:59.289714 waagent[1812]: 2025-05-14T18:09:59.289683Z INFO Daemon Daemon RDMA capabilities are not enabled, skipping May 14 18:09:59.290418 waagent[1812]: 2025-05-14T18:09:59.289900Z INFO Daemon Daemon End of log to /dev/console. The agent will now check for updates and then will process extensions. 
May 14 18:09:59.290418 waagent[1812]: 2025-05-14T18:09:59.290118Z INFO Daemon Daemon Installed Agent WALinuxAgent-2.12.0.4 is the most current agent May 14 18:09:59.380722 waagent[1919]: 2025-05-14T18:09:59.380657Z INFO ExtHandler ExtHandler Azure Linux Agent (Goal State Agent version 2.12.0.4) May 14 18:09:59.380948 waagent[1919]: 2025-05-14T18:09:59.380756Z INFO ExtHandler ExtHandler OS: flatcar 4334.0.0 May 14 18:09:59.380948 waagent[1919]: 2025-05-14T18:09:59.380798Z INFO ExtHandler ExtHandler Python: 3.11.12 May 14 18:09:59.380948 waagent[1919]: 2025-05-14T18:09:59.380834Z INFO ExtHandler ExtHandler CPU Arch: x86_64 May 14 18:09:59.414729 waagent[1919]: 2025-05-14T18:09:59.414688Z INFO ExtHandler ExtHandler Distro: flatcar-4334.0.0; OSUtil: FlatcarUtil; AgentService: waagent; Python: 3.11.12; Arch: x86_64; systemd: True; LISDrivers: Absent; logrotate: logrotate 3.22.0; May 14 18:09:59.414875 waagent[1919]: 2025-05-14T18:09:59.414850Z INFO ExtHandler ExtHandler WireServer endpoint 168.63.129.16 read from file May 14 18:09:59.414934 waagent[1919]: 2025-05-14T18:09:59.414904Z INFO ExtHandler ExtHandler Wire server endpoint:168.63.129.16 May 14 18:09:59.419532 waagent[1919]: 2025-05-14T18:09:59.419484Z INFO ExtHandler Fetched a new incarnation for the WireServer goal state [incarnation 1] May 14 18:09:59.435711 waagent[1919]: 2025-05-14T18:09:59.435681Z INFO ExtHandler ExtHandler HostGAPlugin version: 1.0.8.164 May 14 18:09:59.436027 waagent[1919]: 2025-05-14T18:09:59.436001Z INFO ExtHandler May 14 18:09:59.436070 waagent[1919]: 2025-05-14T18:09:59.436048Z INFO ExtHandler Fetched new vmSettings [HostGAPlugin correlation ID: e77aa561-39af-45e6-8a2e-fbd7222851b7 eTag: 747909170742729847 source: Fabric] May 14 18:09:59.436246 waagent[1919]: 2025-05-14T18:09:59.436226Z INFO ExtHandler The vmSettings originated via Fabric; will ignore them. 
May 14 18:09:59.436515 waagent[1919]: 2025-05-14T18:09:59.436495Z INFO ExtHandler May 14 18:09:59.436544 waagent[1919]: 2025-05-14T18:09:59.436528Z INFO ExtHandler Fetching full goal state from the WireServer [incarnation 1] May 14 18:09:59.441471 waagent[1919]: 2025-05-14T18:09:59.441449Z INFO ExtHandler ExtHandler Downloading artifacts profile blob May 14 18:09:59.519704 waagent[1919]: 2025-05-14T18:09:59.519661Z INFO ExtHandler Downloaded certificate {'thumbprint': '93BAB49CF7F52CDE832033102B664C4028B68BAB', 'hasPrivateKey': True} May 14 18:09:59.519968 waagent[1919]: 2025-05-14T18:09:59.519944Z INFO ExtHandler Downloaded certificate {'thumbprint': '5484F3AE3A44010D7E0AB6D3FB3C7D0FDA1F53C9', 'hasPrivateKey': False} May 14 18:09:59.520203 waagent[1919]: 2025-05-14T18:09:59.520180Z INFO ExtHandler Fetch goal state completed May 14 18:09:59.534427 waagent[1919]: 2025-05-14T18:09:59.534365Z INFO ExtHandler ExtHandler OpenSSL version: OpenSSL 3.3.3 11 Feb 2025 (Library: OpenSSL 3.3.3 11 Feb 2025) May 14 18:09:59.538122 waagent[1919]: 2025-05-14T18:09:59.538079Z INFO ExtHandler ExtHandler WALinuxAgent-2.12.0.4 running as process 1919 May 14 18:09:59.538214 waagent[1919]: 2025-05-14T18:09:59.538192Z INFO ExtHandler ExtHandler ******** AutoUpdate.Enabled is set to False, not processing the operation ******** May 14 18:09:59.538442 waagent[1919]: 2025-05-14T18:09:59.538422Z INFO ExtHandler ExtHandler ******** AutoUpdate.UpdateToLatestVersion is set to False, not processing the operation ******** May 14 18:09:59.539337 waagent[1919]: 2025-05-14T18:09:59.539307Z INFO ExtHandler ExtHandler [CGI] Cgroup monitoring is not supported on ['flatcar', '4334.0.0', '', 'Flatcar Container Linux by Kinvolk'] May 14 18:09:59.539584 waagent[1919]: 2025-05-14T18:09:59.539561Z INFO ExtHandler ExtHandler [CGI] Agent will reset the quotas in case distro: ['flatcar', '4334.0.0', '', 'Flatcar Container Linux by Kinvolk'] went from supported to unsupported May 14 18:09:59.539665 waagent[1919]: 
2025-05-14T18:09:59.539647Z INFO ExtHandler ExtHandler [CGI] Agent cgroups enabled: False May 14 18:09:59.540052 waagent[1919]: 2025-05-14T18:09:59.540031Z INFO ExtHandler ExtHandler Starting setup for Persistent firewall rules May 14 18:09:59.556566 waagent[1919]: 2025-05-14T18:09:59.556541Z INFO ExtHandler ExtHandler Firewalld service not running/unavailable, trying to set up waagent-network-setup.service May 14 18:09:59.556680 waagent[1919]: 2025-05-14T18:09:59.556663Z INFO ExtHandler ExtHandler Successfully updated the Binary file /var/lib/waagent/waagent-network-setup.py for firewall setup May 14 18:09:59.561484 waagent[1919]: 2025-05-14T18:09:59.561292Z INFO ExtHandler ExtHandler Service: waagent-network-setup.service not enabled. Adding it now May 14 18:09:59.566074 systemd[1]: Reload requested from client PID 1936 ('systemctl') (unit waagent.service)... May 14 18:09:59.566084 systemd[1]: Reloading... May 14 18:09:59.631847 zram_generator::config[1976]: No configuration found. May 14 18:09:59.700586 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 14 18:09:59.778211 systemd[1]: Reloading finished in 211 ms. May 14 18:09:59.792786 waagent[1919]: 2025-05-14T18:09:59.790930Z INFO ExtHandler ExtHandler Successfully added and enabled the waagent-network-setup.service May 14 18:09:59.792786 waagent[1919]: 2025-05-14T18:09:59.791009Z INFO ExtHandler ExtHandler Persistent firewall rules setup successfully May 14 18:10:00.089620 waagent[1919]: 2025-05-14T18:10:00.089508Z INFO ExtHandler ExtHandler DROP rule is not available which implies no firewall rules are set yet. Environment thread will set it up. May 14 18:10:00.089923 waagent[1919]: 2025-05-14T18:10:00.089887Z INFO ExtHandler ExtHandler Checking if log collection is allowed at this time [False]. All three conditions must be met: 1. 
configuration enabled [True], 2. cgroups v1 enabled [False] OR cgroups v2 is in use and v2 resource limiting configuration enabled [False], 3. python supported: [True] May 14 18:10:00.090780 waagent[1919]: 2025-05-14T18:10:00.090718Z INFO MonitorHandler ExtHandler WireServer endpoint 168.63.129.16 read from file May 14 18:10:00.090875 waagent[1919]: 2025-05-14T18:10:00.090786Z INFO ExtHandler ExtHandler Starting env monitor service. May 14 18:10:00.090942 waagent[1919]: 2025-05-14T18:10:00.090899Z INFO MonitorHandler ExtHandler Wire server endpoint:168.63.129.16 May 14 18:10:00.091520 waagent[1919]: 2025-05-14T18:10:00.091488Z INFO MonitorHandler ExtHandler Monitor.NetworkConfigurationChanges is disabled. May 14 18:10:00.091801 waagent[1919]: 2025-05-14T18:10:00.091716Z INFO EnvHandler ExtHandler WireServer endpoint 168.63.129.16 read from file May 14 18:10:00.091801 waagent[1919]: 2025-05-14T18:10:00.091772Z INFO ExtHandler ExtHandler Start SendTelemetryHandler service. May 14 18:10:00.092020 waagent[1919]: 2025-05-14T18:10:00.091999Z INFO EnvHandler ExtHandler Wire server endpoint:168.63.129.16 May 14 18:10:00.092087 waagent[1919]: 2025-05-14T18:10:00.092055Z INFO SendTelemetryHandler ExtHandler Successfully started the SendTelemetryHandler thread May 14 18:10:00.092187 waagent[1919]: 2025-05-14T18:10:00.092162Z INFO ExtHandler ExtHandler Start Extension Telemetry service. May 14 18:10:00.092352 waagent[1919]: 2025-05-14T18:10:00.092329Z INFO TelemetryEventsCollector ExtHandler Extension Telemetry pipeline enabled: True May 14 18:10:00.092499 waagent[1919]: 2025-05-14T18:10:00.092481Z INFO ExtHandler ExtHandler Goal State Period: 6 sec. This indicates how often the agent checks for new goal states and reports status. 
May 14 18:10:00.092681 waagent[1919]: 2025-05-14T18:10:00.092654Z INFO TelemetryEventsCollector ExtHandler Successfully started the TelemetryEventsCollector thread May 14 18:10:00.092895 waagent[1919]: 2025-05-14T18:10:00.092850Z INFO MonitorHandler ExtHandler Routing table from /proc/net/route: May 14 18:10:00.092895 waagent[1919]: Iface Destination Gateway Flags RefCnt Use Metric Mask MTU Window IRTT May 14 18:10:00.092895 waagent[1919]: eth0 00000000 0108C80A 0003 0 0 1024 00000000 0 0 0 May 14 18:10:00.092895 waagent[1919]: eth0 0008C80A 00000000 0001 0 0 1024 00FFFFFF 0 0 0 May 14 18:10:00.092895 waagent[1919]: eth0 0108C80A 00000000 0005 0 0 1024 FFFFFFFF 0 0 0 May 14 18:10:00.092895 waagent[1919]: eth0 10813FA8 0108C80A 0007 0 0 1024 FFFFFFFF 0 0 0 May 14 18:10:00.092895 waagent[1919]: eth0 FEA9FEA9 0108C80A 0007 0 0 1024 FFFFFFFF 0 0 0 May 14 18:10:00.093132 waagent[1919]: 2025-05-14T18:10:00.093113Z INFO EnvHandler ExtHandler Configure routes May 14 18:10:00.093798 waagent[1919]: 2025-05-14T18:10:00.093774Z INFO EnvHandler ExtHandler Gateway:None May 14 18:10:00.094331 waagent[1919]: 2025-05-14T18:10:00.094295Z INFO EnvHandler ExtHandler Routes:None May 14 18:10:00.116324 waagent[1919]: 2025-05-14T18:10:00.116288Z INFO ExtHandler ExtHandler May 14 18:10:00.116382 waagent[1919]: 2025-05-14T18:10:00.116340Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState started [incarnation_1 channel: WireServer source: Fabric activity: eb9a205e-2cd5-4a6b-a322-2163bc5fedf6 correlation 3ec46245-6352-4c93-8667-bb309a447566 created: 2025-05-14T18:08:53.986992Z] May 14 18:10:00.116601 waagent[1919]: 2025-05-14T18:10:00.116578Z INFO ExtHandler ExtHandler No extension handlers found, not processing anything. 
May 14 18:10:00.116965 waagent[1919]: 2025-05-14T18:10:00.116940Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState completed [incarnation_1 0 ms]
May 14 18:10:00.205164 waagent[1919]: 2025-05-14T18:10:00.204796Z WARNING ExtHandler ExtHandler Failed to get firewall packets: 'iptables -w -t security -L OUTPUT --zero OUTPUT -nxv' failed: 2 (iptables v1.8.11 (nf_tables): Illegal option `--numeric' with this command
May 14 18:10:00.205164 waagent[1919]: Try `iptables -h' or 'iptables --help' for more information.)
May 14 18:10:00.205164 waagent[1919]: 2025-05-14T18:10:00.205111Z INFO ExtHandler ExtHandler [HEARTBEAT] Agent WALinuxAgent-2.12.0.4 is running as the goal state agent [DEBUG HeartbeatCounter: 0;HeartbeatId: FCFDA591-A84B-45AB-A598-113B7CE14845;DroppedPackets: -1;UpdateGSErrors: 0;AutoUpdate: 0;UpdateMode: SelfUpdate;]
May 14 18:10:00.440206 waagent[1919]: 2025-05-14T18:10:00.440165Z INFO MonitorHandler ExtHandler Network interfaces:
May 14 18:10:00.440206 waagent[1919]: Executing ['ip', '-a', '-o', 'link']:
May 14 18:10:00.440206 waagent[1919]: 1: lo: mtu 65536 qdisc noqueue state UNKNOWN mode DEFAULT group default qlen 1000\ link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
May 14 18:10:00.440206 waagent[1919]: 2: eth0: mtu 1500 qdisc mq state UP mode DEFAULT group default qlen 1000\ link/ether 7c:1e:52:34:28:50 brd ff:ff:ff:ff:ff:ff\ alias Network Device
May 14 18:10:00.440206 waagent[1919]: 3: enP30832s1: mtu 1500 qdisc mq master eth0 state UP mode DEFAULT group default qlen 1000\ link/ether 7c:1e:52:34:28:50 brd ff:ff:ff:ff:ff:ff\ altname enP30832p0s0
May 14 18:10:00.440206 waagent[1919]: Executing ['ip', '-4', '-a', '-o', 'address']:
May 14 18:10:00.440206 waagent[1919]: 1: lo inet 127.0.0.1/8 scope host lo\ valid_lft forever preferred_lft forever
May 14 18:10:00.440206 waagent[1919]: 2: eth0 inet 10.200.8.38/24 metric 1024 brd 10.200.8.255 scope global eth0\ valid_lft forever preferred_lft forever
May 14 18:10:00.440206 waagent[1919]: Executing ['ip', '-6', '-a', '-o', 'address']:
May 14 18:10:00.440206 waagent[1919]: 1: lo inet6 ::1/128 scope host noprefixroute \ valid_lft forever preferred_lft forever
May 14 18:10:00.440206 waagent[1919]: 2: eth0 inet6 fe80::7e1e:52ff:fe34:2850/64 scope link proto kernel_ll \ valid_lft forever preferred_lft forever
May 14 18:10:00.440206 waagent[1919]: 3: enP30832s1 inet6 fe80::7e1e:52ff:fe34:2850/64 scope link proto kernel_ll \ valid_lft forever preferred_lft forever
May 14 18:10:00.863074 waagent[1919]: 2025-05-14T18:10:00.863029Z INFO EnvHandler ExtHandler Created firewall rules for the Azure Fabric:
May 14 18:10:00.863074 waagent[1919]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes)
May 14 18:10:00.863074 waagent[1919]: pkts bytes target prot opt in out source destination
May 14 18:10:00.863074 waagent[1919]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes)
May 14 18:10:00.863074 waagent[1919]: pkts bytes target prot opt in out source destination
May 14 18:10:00.863074 waagent[1919]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes)
May 14 18:10:00.863074 waagent[1919]: pkts bytes target prot opt in out source destination
May 14 18:10:00.863074 waagent[1919]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53
May 14 18:10:00.863074 waagent[1919]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0
May 14 18:10:00.863074 waagent[1919]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW
May 14 18:10:00.865441 waagent[1919]: 2025-05-14T18:10:00.865398Z INFO EnvHandler ExtHandler Current Firewall rules:
May 14 18:10:00.865441 waagent[1919]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes)
May 14 18:10:00.865441 waagent[1919]: pkts bytes target prot opt in out source destination
May 14 18:10:00.865441 waagent[1919]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes)
May 14 18:10:00.865441 waagent[1919]: pkts bytes target prot opt in out source destination
May 14 18:10:00.865441 waagent[1919]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes)
May 14 18:10:00.865441 waagent[1919]: pkts bytes target prot opt in out source destination
May 14 18:10:00.865441 waagent[1919]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53
May 14 18:10:00.865441 waagent[1919]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0
May 14 18:10:00.865441 waagent[1919]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW
May 14 18:10:06.610468 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
May 14 18:10:06.612329 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 14 18:10:11.253847 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
May 14 18:10:11.254969 systemd[1]: Started sshd@0-10.200.8.38:22-10.200.16.10:47842.service - OpenSSH per-connection server daemon (10.200.16.10:47842).
May 14 18:10:12.263253 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 14 18:10:12.268088 (kubelet)[2074]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 14 18:10:12.309938 kubelet[2074]: E0514 18:10:12.309909 2074 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 14 18:10:12.312442 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 14 18:10:12.312583 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 14 18:10:12.312860 systemd[1]: kubelet.service: Consumed 125ms CPU time, 97.6M memory peak.
May 14 18:10:12.314475 sshd[2067]: Accepted publickey for core from 10.200.16.10 port 47842 ssh2: RSA SHA256:Hd5PQzbog4WemQvfyzJrpsOqYyUj2ZNV4jY0cJYqkm4
May 14 18:10:12.315371 sshd-session[2067]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 18:10:12.319062 systemd-logind[1706]: New session 3 of user core.
May 14 18:10:12.321867 systemd[1]: Started session-3.scope - Session 3 of User core.
May 14 18:10:12.869862 systemd[1]: Started sshd@1-10.200.8.38:22-10.200.16.10:47848.service - OpenSSH per-connection server daemon (10.200.16.10:47848).
May 14 18:10:13.502525 sshd[2084]: Accepted publickey for core from 10.200.16.10 port 47848 ssh2: RSA SHA256:Hd5PQzbog4WemQvfyzJrpsOqYyUj2ZNV4jY0cJYqkm4
May 14 18:10:13.503773 sshd-session[2084]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 18:10:13.508070 systemd-logind[1706]: New session 4 of user core.
May 14 18:10:13.527847 systemd[1]: Started session-4.scope - Session 4 of User core.
May 14 18:10:13.949807 sshd[2086]: Connection closed by 10.200.16.10 port 47848
May 14 18:10:13.950281 sshd-session[2084]: pam_unix(sshd:session): session closed for user core
May 14 18:10:13.953305 systemd[1]: sshd@1-10.200.8.38:22-10.200.16.10:47848.service: Deactivated successfully.
May 14 18:10:13.954575 systemd[1]: session-4.scope: Deactivated successfully.
May 14 18:10:13.955175 systemd-logind[1706]: Session 4 logged out. Waiting for processes to exit.
May 14 18:10:13.956143 systemd-logind[1706]: Removed session 4.
May 14 18:10:14.065808 systemd[1]: Started sshd@2-10.200.8.38:22-10.200.16.10:47858.service - OpenSSH per-connection server daemon (10.200.16.10:47858).
May 14 18:10:14.700431 sshd[2092]: Accepted publickey for core from 10.200.16.10 port 47858 ssh2: RSA SHA256:Hd5PQzbog4WemQvfyzJrpsOqYyUj2ZNV4jY0cJYqkm4
May 14 18:10:14.702487 sshd-session[2092]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 18:10:14.706350 systemd-logind[1706]: New session 5 of user core.
May 14 18:10:14.711859 systemd[1]: Started session-5.scope - Session 5 of User core.
May 14 18:10:15.145169 sshd[2094]: Connection closed by 10.200.16.10 port 47858
May 14 18:10:15.145625 sshd-session[2092]: pam_unix(sshd:session): session closed for user core
May 14 18:10:15.148299 systemd[1]: sshd@2-10.200.8.38:22-10.200.16.10:47858.service: Deactivated successfully.
May 14 18:10:15.149672 systemd[1]: session-5.scope: Deactivated successfully.
May 14 18:10:15.150666 systemd-logind[1706]: Session 5 logged out. Waiting for processes to exit.
May 14 18:10:15.151582 systemd-logind[1706]: Removed session 5.
May 14 18:10:15.257495 systemd[1]: Started sshd@3-10.200.8.38:22-10.200.16.10:47860.service - OpenSSH per-connection server daemon (10.200.16.10:47860).
May 14 18:10:15.892547 sshd[2100]: Accepted publickey for core from 10.200.16.10 port 47860 ssh2: RSA SHA256:Hd5PQzbog4WemQvfyzJrpsOqYyUj2ZNV4jY0cJYqkm4
May 14 18:10:15.893791 sshd-session[2100]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 18:10:15.897802 systemd-logind[1706]: New session 6 of user core.
May 14 18:10:15.904883 systemd[1]: Started session-6.scope - Session 6 of User core.
May 14 18:10:16.368541 sshd[2102]: Connection closed by 10.200.16.10 port 47860
May 14 18:10:16.369029 sshd-session[2100]: pam_unix(sshd:session): session closed for user core
May 14 18:10:16.372356 systemd[1]: sshd@3-10.200.8.38:22-10.200.16.10:47860.service: Deactivated successfully.
May 14 18:10:16.373617 systemd[1]: session-6.scope: Deactivated successfully.
May 14 18:10:16.374375 systemd-logind[1706]: Session 6 logged out. Waiting for processes to exit.
May 14 18:10:16.375362 systemd-logind[1706]: Removed session 6.
May 14 18:10:16.479561 systemd[1]: Started sshd@4-10.200.8.38:22-10.200.16.10:47872.service - OpenSSH per-connection server daemon (10.200.16.10:47872).
May 14 18:10:17.112859 sshd[2108]: Accepted publickey for core from 10.200.16.10 port 47872 ssh2: RSA SHA256:Hd5PQzbog4WemQvfyzJrpsOqYyUj2ZNV4jY0cJYqkm4
May 14 18:10:17.114059 sshd-session[2108]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 18:10:17.117788 systemd-logind[1706]: New session 7 of user core.
May 14 18:10:17.131877 systemd[1]: Started session-7.scope - Session 7 of User core.
May 14 18:10:17.537044 sudo[2111]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
May 14 18:10:17.537232 sudo[2111]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 14 18:10:17.569522 sudo[2111]: pam_unix(sudo:session): session closed for user root
May 14 18:10:17.670567 sshd[2110]: Connection closed by 10.200.16.10 port 47872
May 14 18:10:17.671193 sshd-session[2108]: pam_unix(sshd:session): session closed for user core
May 14 18:10:17.674837 systemd[1]: sshd@4-10.200.8.38:22-10.200.16.10:47872.service: Deactivated successfully.
May 14 18:10:17.676052 systemd[1]: session-7.scope: Deactivated successfully.
May 14 18:10:17.676619 systemd-logind[1706]: Session 7 logged out. Waiting for processes to exit.
May 14 18:10:17.677581 systemd-logind[1706]: Removed session 7.
May 14 18:10:17.830849 systemd[1]: Started sshd@5-10.200.8.38:22-10.200.16.10:47888.service - OpenSSH per-connection server daemon (10.200.16.10:47888).
May 14 18:10:18.466886 sshd[2117]: Accepted publickey for core from 10.200.16.10 port 47888 ssh2: RSA SHA256:Hd5PQzbog4WemQvfyzJrpsOqYyUj2ZNV4jY0cJYqkm4
May 14 18:10:18.468096 sshd-session[2117]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 18:10:18.472152 systemd-logind[1706]: New session 8 of user core.
May 14 18:10:18.480878 systemd[1]: Started session-8.scope - Session 8 of User core.
May 14 18:10:18.685398 chronyd[1714]: Selected source PHC0
May 14 18:10:18.813148 sudo[2121]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
May 14 18:10:18.813337 sudo[2121]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 14 18:10:18.818328 sudo[2121]: pam_unix(sudo:session): session closed for user root
May 14 18:10:18.821643 sudo[2120]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
May 14 18:10:18.821861 sudo[2120]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 14 18:10:18.828208 systemd[1]: Starting audit-rules.service - Load Audit Rules...
May 14 18:10:18.855248 augenrules[2143]: No rules
May 14 18:10:18.855656 systemd[1]: audit-rules.service: Deactivated successfully.
May 14 18:10:18.855872 systemd[1]: Finished audit-rules.service - Load Audit Rules.
May 14 18:10:18.856717 sudo[2120]: pam_unix(sudo:session): session closed for user root
May 14 18:10:18.959888 sshd[2119]: Connection closed by 10.200.16.10 port 47888
May 14 18:10:18.960177 sshd-session[2117]: pam_unix(sshd:session): session closed for user core
May 14 18:10:18.963179 systemd[1]: sshd@5-10.200.8.38:22-10.200.16.10:47888.service: Deactivated successfully.
May 14 18:10:18.964332 systemd[1]: session-8.scope: Deactivated successfully.
May 14 18:10:18.964941 systemd-logind[1706]: Session 8 logged out. Waiting for processes to exit.
May 14 18:10:18.965871 systemd-logind[1706]: Removed session 8.
May 14 18:10:19.070483 systemd[1]: Started sshd@6-10.200.8.38:22-10.200.16.10:47182.service - OpenSSH per-connection server daemon (10.200.16.10:47182).
May 14 18:10:19.702982 sshd[2152]: Accepted publickey for core from 10.200.16.10 port 47182 ssh2: RSA SHA256:Hd5PQzbog4WemQvfyzJrpsOqYyUj2ZNV4jY0cJYqkm4
May 14 18:10:19.704153 sshd-session[2152]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 18:10:19.707788 systemd-logind[1706]: New session 9 of user core.
May 14 18:10:19.717873 systemd[1]: Started session-9.scope - Session 9 of User core.
May 14 18:10:20.054879 sudo[2155]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
May 14 18:10:20.055073 sudo[2155]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 14 18:10:21.846115 systemd[1]: Starting docker.service - Docker Application Container Engine...
May 14 18:10:21.860078 (dockerd)[2173]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
May 14 18:10:22.360111 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
May 14 18:10:22.361369 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 14 18:10:23.176955 dockerd[2173]: time="2025-05-14T18:10:23.176914636Z" level=info msg="Starting up"
May 14 18:10:23.177526 dockerd[2173]: time="2025-05-14T18:10:23.177503558Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
May 14 18:10:25.453655 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 14 18:10:25.461001 (kubelet)[2201]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 14 18:10:25.492323 kubelet[2201]: E0514 18:10:25.492274 2201 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 14 18:10:25.493677 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 14 18:10:25.493809 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 14 18:10:25.494105 systemd[1]: kubelet.service: Consumed 107ms CPU time, 96M memory peak.
May 14 18:10:28.724439 dockerd[2173]: time="2025-05-14T18:10:28.724405098Z" level=info msg="Loading containers: start."
May 14 18:10:28.799835 kernel: Initializing XFRM netlink socket
May 14 18:10:29.120521 systemd-networkd[1358]: docker0: Link UP
May 14 18:10:29.136085 dockerd[2173]: time="2025-05-14T18:10:29.136056591Z" level=info msg="Loading containers: done."
May 14 18:10:29.576673 dockerd[2173]: time="2025-05-14T18:10:29.576314589Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
May 14 18:10:29.576673 dockerd[2173]: time="2025-05-14T18:10:29.576409241Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1
May 14 18:10:29.576673 dockerd[2173]: time="2025-05-14T18:10:29.576516907Z" level=info msg="Initializing buildkit"
May 14 18:10:29.786423 dockerd[2173]: time="2025-05-14T18:10:29.786380182Z" level=info msg="Completed buildkit initialization"
May 14 18:10:29.792291 dockerd[2173]: time="2025-05-14T18:10:29.792260598Z" level=info msg="Daemon has completed initialization"
May 14 18:10:29.792534 dockerd[2173]: time="2025-05-14T18:10:29.792311864Z" level=info msg="API listen on /run/docker.sock"
May 14 18:10:29.792578 systemd[1]: Started docker.service - Docker Application Container Engine.
May 14 18:10:31.082971 containerd[1734]: time="2025-05-14T18:10:31.082927612Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.8\""
May 14 18:10:32.905107 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount183839404.mount: Deactivated successfully.
May 14 18:10:35.610303 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
May 14 18:10:35.611978 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 14 18:10:38.062478 kernel: hv_balloon: Max. dynamic memory size: 8192 MB
May 14 18:10:40.392682 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 14 18:10:40.398966 (kubelet)[2412]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 14 18:10:40.429493 kubelet[2412]: E0514 18:10:40.429454 2412 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 14 18:10:40.431015 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 14 18:10:40.431235 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 14 18:10:40.431574 systemd[1]: kubelet.service: Consumed 111ms CPU time, 95.9M memory peak.
May 14 18:10:40.765951 update_engine[1707]: I20250514 18:10:40.765803 1707 update_attempter.cc:509] Updating boot flags...
May 14 18:10:43.771126 containerd[1734]: time="2025-05-14T18:10:43.771057524Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 14 18:10:43.833160 containerd[1734]: time="2025-05-14T18:10:43.833103705Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.8: active requests=0, bytes read=27960995"
May 14 18:10:43.835825 containerd[1734]: time="2025-05-14T18:10:43.835771637Z" level=info msg="ImageCreate event name:\"sha256:e6d208e868a9ca7f89efcb0d5bddc55a62df551cb4fb39c5099a2fe7b0e33adc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 14 18:10:43.893772 containerd[1734]: time="2025-05-14T18:10:43.893695825Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:30090db6a7d53799163ce82dae9e8ddb645fd47db93f2ec9da0cc787fd825625\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 14 18:10:43.894761 containerd[1734]: time="2025-05-14T18:10:43.894631170Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.8\" with image id \"sha256:e6d208e868a9ca7f89efcb0d5bddc55a62df551cb4fb39c5099a2fe7b0e33adc\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.8\", repo digest \"registry.k8s.io/kube-apiserver@sha256:30090db6a7d53799163ce82dae9e8ddb645fd47db93f2ec9da0cc787fd825625\", size \"27957787\" in 12.811658728s"
May 14 18:10:43.894761 containerd[1734]: time="2025-05-14T18:10:43.894671404Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.8\" returns image reference \"sha256:e6d208e868a9ca7f89efcb0d5bddc55a62df551cb4fb39c5099a2fe7b0e33adc\""
May 14 18:10:43.896756 containerd[1734]: time="2025-05-14T18:10:43.896708749Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.8\""
May 14 18:10:47.981185 containerd[1734]: time="2025-05-14T18:10:47.981118630Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 14 18:10:48.026626 containerd[1734]: time="2025-05-14T18:10:48.026569842Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.8: active requests=0, bytes read=24713784"
May 14 18:10:48.073586 containerd[1734]: time="2025-05-14T18:10:48.073512308Z" level=info msg="ImageCreate event name:\"sha256:fbda0bc3bc4bb93c8b2d8627a9aa8d945c200b51e48c88f9b837dde628fc7c8f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 14 18:10:48.121209 containerd[1734]: time="2025-05-14T18:10:48.121125099Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:29eaddc64792a689df48506e78bbc641d063ac8bb92d2e66ae2ad05977420747\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 14 18:10:48.122663 containerd[1734]: time="2025-05-14T18:10:48.122287793Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.8\" with image id \"sha256:fbda0bc3bc4bb93c8b2d8627a9aa8d945c200b51e48c88f9b837dde628fc7c8f\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.8\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:29eaddc64792a689df48506e78bbc641d063ac8bb92d2e66ae2ad05977420747\", size \"26202149\" in 4.225539031s"
May 14 18:10:48.122663 containerd[1734]: time="2025-05-14T18:10:48.122325358Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.8\" returns image reference \"sha256:fbda0bc3bc4bb93c8b2d8627a9aa8d945c200b51e48c88f9b837dde628fc7c8f\""
May 14 18:10:48.122902 containerd[1734]: time="2025-05-14T18:10:48.122789267Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.8\""
May 14 18:10:50.610465 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
May 14 18:10:50.612304 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 14 18:10:53.909619 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 14 18:10:53.924925 (kubelet)[2515]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 14 18:10:53.953732 kubelet[2515]: E0514 18:10:53.953699 2515 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 14 18:10:53.955022 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 14 18:10:53.955141 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 14 18:10:53.955433 systemd[1]: kubelet.service: Consumed 104ms CPU time, 95.5M memory peak.
May 14 18:10:55.274615 containerd[1734]: time="2025-05-14T18:10:55.274489439Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 14 18:10:55.323189 containerd[1734]: time="2025-05-14T18:10:55.323131926Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.8: active requests=0, bytes read=18780394"
May 14 18:10:55.326254 containerd[1734]: time="2025-05-14T18:10:55.326155083Z" level=info msg="ImageCreate event name:\"sha256:2a9c646db0be37003c2b50605a252f7139145411d9e4e0badd8ae07f56ce5eb8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 14 18:10:56.283768 containerd[1734]: time="2025-05-14T18:10:56.283699590Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:22994a2632e81059720480b9f6bdeb133b08d58492d0b36dfd6e9768b159b22a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 14 18:10:56.284841 containerd[1734]: time="2025-05-14T18:10:56.284581707Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.8\" with image id \"sha256:2a9c646db0be37003c2b50605a252f7139145411d9e4e0badd8ae07f56ce5eb8\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.8\", repo digest \"registry.k8s.io/kube-scheduler@sha256:22994a2632e81059720480b9f6bdeb133b08d58492d0b36dfd6e9768b159b22a\", size \"20268777\" in 8.161735945s"
May 14 18:10:56.284841 containerd[1734]: time="2025-05-14T18:10:56.284620971Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.8\" returns image reference \"sha256:2a9c646db0be37003c2b50605a252f7139145411d9e4e0badd8ae07f56ce5eb8\""
May 14 18:10:56.285272 containerd[1734]: time="2025-05-14T18:10:56.285247104Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.8\""
May 14 18:11:00.883445 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount14162910.mount: Deactivated successfully.
May 14 18:11:01.882352 containerd[1734]: time="2025-05-14T18:11:01.882294056Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 14 18:11:01.929884 containerd[1734]: time="2025-05-14T18:11:01.929846831Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.8: active requests=0, bytes read=30354633"
May 14 18:11:01.932720 containerd[1734]: time="2025-05-14T18:11:01.932448090Z" level=info msg="ImageCreate event name:\"sha256:7d73f013cedcf301aef42272c93e4c1174dab1a8eccd96840091ef04b63480f2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 14 18:11:01.977471 containerd[1734]: time="2025-05-14T18:11:01.977407636Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:dd0c9a37670f209947b1ed880f06a2e93e1d41da78c037f52f94b13858769838\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 14 18:11:01.978194 containerd[1734]: time="2025-05-14T18:11:01.978026614Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.8\" with image id \"sha256:7d73f013cedcf301aef42272c93e4c1174dab1a8eccd96840091ef04b63480f2\", repo tag \"registry.k8s.io/kube-proxy:v1.31.8\", repo digest \"registry.k8s.io/kube-proxy@sha256:dd0c9a37670f209947b1ed880f06a2e93e1d41da78c037f52f94b13858769838\", size \"30353644\" in 5.692742426s"
May 14 18:11:01.978194 containerd[1734]: time="2025-05-14T18:11:01.978062701Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.8\" returns image reference \"sha256:7d73f013cedcf301aef42272c93e4c1174dab1a8eccd96840091ef04b63480f2\""
May 14 18:11:01.978797 containerd[1734]: time="2025-05-14T18:11:01.978735365Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\""
May 14 18:11:03.436410 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount505739462.mount: Deactivated successfully.
May 14 18:11:04.110373 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5.
May 14 18:11:04.112046 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 14 18:11:07.101739 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 14 18:11:07.104460 (kubelet)[2545]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 14 18:11:07.134324 kubelet[2545]: E0514 18:11:07.134290 2545 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 14 18:11:07.135295 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 14 18:11:07.135401 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 14 18:11:07.135655 systemd[1]: kubelet.service: Consumed 110ms CPU time, 95.4M memory peak.
May 14 18:11:16.623810 containerd[1734]: time="2025-05-14T18:11:16.623676207Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 14 18:11:16.670784 containerd[1734]: time="2025-05-14T18:11:16.670716789Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=18185769"
May 14 18:11:16.717449 containerd[1734]: time="2025-05-14T18:11:16.717377066Z" level=info msg="ImageCreate event name:\"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 14 18:11:16.721590 containerd[1734]: time="2025-05-14T18:11:16.721552547Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 14 18:11:16.722661 containerd[1734]: time="2025-05-14T18:11:16.722220825Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"18182961\" in 14.743439675s"
May 14 18:11:16.722661 containerd[1734]: time="2025-05-14T18:11:16.722250538Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\""
May 14 18:11:16.722761 containerd[1734]: time="2025-05-14T18:11:16.722686896Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
May 14 18:11:17.360459 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 6.
May 14 18:11:17.362113 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 14 18:11:20.658612 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 14 18:11:20.671983 (kubelet)[2600]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 14 18:11:20.701756 kubelet[2600]: E0514 18:11:20.701708 2600 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 14 18:11:20.702990 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 14 18:11:20.703095 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 14 18:11:20.703381 systemd[1]: kubelet.service: Consumed 110ms CPU time, 95.5M memory peak.
May 14 18:11:23.937887 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2947101681.mount: Deactivated successfully.
May 14 18:11:24.183769 containerd[1734]: time="2025-05-14T18:11:24.183699061Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
May 14 18:11:24.187001 containerd[1734]: time="2025-05-14T18:11:24.186957674Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321146"
May 14 18:11:24.231226 containerd[1734]: time="2025-05-14T18:11:24.230934005Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
May 14 18:11:24.284189 containerd[1734]: time="2025-05-14T18:11:24.284131070Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
May 14 18:11:24.285096 containerd[1734]: time="2025-05-14T18:11:24.284935774Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 7.562226454s"
May 14 18:11:24.285096 containerd[1734]: time="2025-05-14T18:11:24.284973949Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\""
May 14 18:11:24.285665 containerd[1734]: time="2025-05-14T18:11:24.285617109Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\""
May 14 18:11:29.689527 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1112890127.mount: Deactivated successfully.
May 14 18:11:30.860377 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 7.
May 14 18:11:30.862376 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 14 18:11:35.587732 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 14 18:11:35.590638 (kubelet)[2626]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 14 18:11:35.620427 kubelet[2626]: E0514 18:11:35.620372 2626 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 14 18:11:35.622065 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 14 18:11:35.622181 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 14 18:11:35.622475 systemd[1]: kubelet.service: Consumed 113ms CPU time, 95.8M memory peak.
May 14 18:11:44.979738 containerd[1734]: time="2025-05-14T18:11:44.979681944Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 18:11:45.025088 containerd[1734]: time="2025-05-14T18:11:45.025039867Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56780021" May 14 18:11:45.028015 containerd[1734]: time="2025-05-14T18:11:45.027932353Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 18:11:45.073917 containerd[1734]: time="2025-05-14T18:11:45.073832342Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 18:11:45.075419 containerd[1734]: time="2025-05-14T18:11:45.075077614Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 20.789429464s" May 14 18:11:45.075419 containerd[1734]: time="2025-05-14T18:11:45.075115259Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\"" May 14 18:11:45.860554 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 8. May 14 18:11:45.862579 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 14 18:11:48.268955 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM May 14 18:11:48.269075 systemd[1]: kubelet.service: Failed with result 'signal'. 
May 14 18:11:48.269412 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 14 18:11:48.272342 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 14 18:11:48.294885 systemd[1]: Reload requested from client PID 2709 ('systemctl') (unit session-9.scope)... May 14 18:11:48.294905 systemd[1]: Reloading... May 14 18:11:48.364137 zram_generator::config[2750]: No configuration found. May 14 18:11:49.201459 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 14 18:11:49.279930 systemd[1]: Reloading finished in 984 ms. May 14 18:11:50.119861 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM May 14 18:11:50.119977 systemd[1]: kubelet.service: Failed with result 'signal'. May 14 18:11:50.120384 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 14 18:11:50.122315 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 14 18:11:55.702759 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 14 18:11:55.711069 (kubelet)[2821]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 14 18:11:55.743762 kubelet[2821]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 14 18:11:55.743762 kubelet[2821]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. 
May 14 18:11:55.743762 kubelet[2821]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 14 18:11:55.743762 kubelet[2821]: I0514 18:11:55.742694 2821 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 14 18:11:55.943358 kubelet[2821]: I0514 18:11:55.943335 2821 server.go:486] "Kubelet version" kubeletVersion="v1.31.0" May 14 18:11:55.943358 kubelet[2821]: I0514 18:11:55.943352 2821 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 14 18:11:55.943532 kubelet[2821]: I0514 18:11:55.943520 2821 server.go:929] "Client rotation is on, will bootstrap in background" May 14 18:11:55.966498 kubelet[2821]: I0514 18:11:55.966132 2821 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 14 18:11:55.966972 kubelet[2821]: E0514 18:11:55.966862 2821 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.200.8.38:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.200.8.38:6443: connect: connection refused" logger="UnhandledError" May 14 18:11:55.974518 kubelet[2821]: I0514 18:11:55.974503 2821 server.go:1426] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 14 18:11:55.978174 kubelet[2821]: I0514 18:11:55.978159 2821 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" May 14 18:11:55.978265 kubelet[2821]: I0514 18:11:55.978232 2821 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" May 14 18:11:55.978349 kubelet[2821]: I0514 18:11:55.978313 2821 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 14 18:11:55.978517 kubelet[2821]: I0514 18:11:55.978348 2821 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4334.0.0-a-c37eb65ec1","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} May 14 18:11:55.978626 kubelet[2821]: I0514 18:11:55.978519 2821 topology_manager.go:138] "Creating topology manager with none policy" May 14 18:11:55.978626 kubelet[2821]: I0514 18:11:55.978527 2821 container_manager_linux.go:300] "Creating device plugin manager" May 14 18:11:55.978626 kubelet[2821]: I0514 18:11:55.978605 2821 state_mem.go:36] "Initialized new in-memory state store" May 14 18:11:55.980323 kubelet[2821]: I0514 18:11:55.980165 2821 kubelet.go:408] "Attempting to sync node with API server" May 14 18:11:55.980323 kubelet[2821]: I0514 18:11:55.980184 2821 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" May 14 18:11:55.980323 kubelet[2821]: I0514 18:11:55.980208 2821 kubelet.go:314] "Adding apiserver pod source" May 14 18:11:55.980323 kubelet[2821]: I0514 18:11:55.980224 2821 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 14 18:11:55.985080 kubelet[2821]: I0514 18:11:55.984981 2821 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" May 14 18:11:55.986870 kubelet[2821]: I0514 18:11:55.986480 2821 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 14 18:11:55.986870 kubelet[2821]: W0514 18:11:55.986533 2821 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
May 14 18:11:55.986950 kubelet[2821]: I0514 18:11:55.986935 2821 server.go:1269] "Started kubelet" May 14 18:11:55.987063 kubelet[2821]: W0514 18:11:55.987025 2821 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.200.8.38:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4334.0.0-a-c37eb65ec1&limit=500&resourceVersion=0": dial tcp 10.200.8.38:6443: connect: connection refused May 14 18:11:55.987096 kubelet[2821]: E0514 18:11:55.987072 2821 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.200.8.38:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4334.0.0-a-c37eb65ec1&limit=500&resourceVersion=0\": dial tcp 10.200.8.38:6443: connect: connection refused" logger="UnhandledError" May 14 18:11:55.990375 kubelet[2821]: W0514 18:11:55.990002 2821 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.200.8.38:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.200.8.38:6443: connect: connection refused May 14 18:11:55.990375 kubelet[2821]: E0514 18:11:55.990042 2821 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.200.8.38:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.8.38:6443: connect: connection refused" logger="UnhandledError" May 14 18:11:55.990375 kubelet[2821]: I0514 18:11:55.990105 2821 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 May 14 18:11:55.991265 kubelet[2821]: I0514 18:11:55.990849 2821 server.go:460] "Adding debug handlers to kubelet server" May 14 18:11:55.991446 kubelet[2821]: I0514 18:11:55.991410 2821 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 14 
18:11:55.991657 kubelet[2821]: I0514 18:11:55.991646 2821 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 14 18:11:55.993123 kubelet[2821]: I0514 18:11:55.992733 2821 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 14 18:11:55.994306 kubelet[2821]: E0514 18:11:55.991887 2821 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.200.8.38:6443/api/v1/namespaces/default/events\": dial tcp 10.200.8.38:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4334.0.0-a-c37eb65ec1.183f77445f8297ba default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4334.0.0-a-c37eb65ec1,UID:ci-4334.0.0-a-c37eb65ec1,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4334.0.0-a-c37eb65ec1,},FirstTimestamp:2025-05-14 18:11:55.986917306 +0000 UTC m=+0.272920181,LastTimestamp:2025-05-14 18:11:55.986917306 +0000 UTC m=+0.272920181,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4334.0.0-a-c37eb65ec1,}" May 14 18:11:55.994916 kubelet[2821]: I0514 18:11:55.994854 2821 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 14 18:11:55.997167 kubelet[2821]: E0514 18:11:55.996809 2821 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4334.0.0-a-c37eb65ec1\" not found" May 14 18:11:55.997167 kubelet[2821]: I0514 18:11:55.996836 2821 volume_manager.go:289] "Starting Kubelet Volume Manager" May 14 18:11:55.997167 kubelet[2821]: I0514 18:11:55.996958 2821 desired_state_of_world_populator.go:146] "Desired state populator starts to run" May 14 18:11:55.997167 kubelet[2821]: I0514 18:11:55.996998 2821 
reconciler.go:26] "Reconciler: start to sync state" May 14 18:11:55.997290 kubelet[2821]: W0514 18:11:55.997222 2821 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.200.8.38:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.8.38:6443: connect: connection refused May 14 18:11:55.997290 kubelet[2821]: E0514 18:11:55.997255 2821 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.200.8.38:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.200.8.38:6443: connect: connection refused" logger="UnhandledError" May 14 18:11:55.997396 kubelet[2821]: I0514 18:11:55.997380 2821 factory.go:221] Registration of the systemd container factory successfully May 14 18:11:55.997444 kubelet[2821]: I0514 18:11:55.997434 2821 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 14 18:11:55.998520 kubelet[2821]: I0514 18:11:55.998506 2821 factory.go:221] Registration of the containerd container factory successfully May 14 18:11:56.001574 kubelet[2821]: E0514 18:11:56.001530 2821 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.38:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4334.0.0-a-c37eb65ec1?timeout=10s\": dial tcp 10.200.8.38:6443: connect: connection refused" interval="200ms" May 14 18:11:56.005808 kubelet[2821]: E0514 18:11:56.005790 2821 kubelet.go:1478] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 14 18:11:56.025739 kubelet[2821]: I0514 18:11:56.025723 2821 cpu_manager.go:214] "Starting CPU manager" policy="none" May 14 18:11:56.025739 kubelet[2821]: I0514 18:11:56.025735 2821 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" May 14 18:11:56.025739 kubelet[2821]: I0514 18:11:56.025766 2821 state_mem.go:36] "Initialized new in-memory state store" May 14 18:11:56.096899 kubelet[2821]: E0514 18:11:56.096867 2821 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4334.0.0-a-c37eb65ec1\" not found" May 14 18:11:56.197364 kubelet[2821]: E0514 18:11:56.197327 2821 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4334.0.0-a-c37eb65ec1\" not found" May 14 18:11:56.202984 kubelet[2821]: E0514 18:11:56.202952 2821 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.38:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4334.0.0-a-c37eb65ec1?timeout=10s\": dial tcp 10.200.8.38:6443: connect: connection refused" interval="400ms" May 14 18:11:56.298498 kubelet[2821]: E0514 18:11:56.298404 2821 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4334.0.0-a-c37eb65ec1\" not found" May 14 18:11:56.371171 kubelet[2821]: E0514 18:11:56.371074 2821 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.200.8.38:6443/api/v1/namespaces/default/events\": dial tcp 10.200.8.38:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4334.0.0-a-c37eb65ec1.183f77445f8297ba default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4334.0.0-a-c37eb65ec1,UID:ci-4334.0.0-a-c37eb65ec1,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:ci-4334.0.0-a-c37eb65ec1,},FirstTimestamp:2025-05-14 18:11:55.986917306 +0000 UTC m=+0.272920181,LastTimestamp:2025-05-14 18:11:55.986917306 +0000 UTC m=+0.272920181,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4334.0.0-a-c37eb65ec1,}" May 14 18:11:56.399380 kubelet[2821]: E0514 18:11:56.399342 2821 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4334.0.0-a-c37eb65ec1\" not found" May 14 18:11:56.425759 kubelet[2821]: I0514 18:11:56.425727 2821 policy_none.go:49] "None policy: Start" May 14 18:11:56.426570 kubelet[2821]: I0514 18:11:56.426530 2821 memory_manager.go:170] "Starting memorymanager" policy="None" May 14 18:11:56.426570 kubelet[2821]: I0514 18:11:56.426554 2821 state_mem.go:35] "Initializing new in-memory state store" May 14 18:11:56.477002 kubelet[2821]: I0514 18:11:56.476132 2821 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 14 18:11:56.477376 kubelet[2821]: I0514 18:11:56.477363 2821 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" May 14 18:11:56.477553 kubelet[2821]: I0514 18:11:56.477546 2821 status_manager.go:217] "Starting to sync pod status with apiserver" May 14 18:11:56.478358 kubelet[2821]: I0514 18:11:56.478245 2821 kubelet.go:2321] "Starting kubelet main sync loop" May 14 18:11:56.478358 kubelet[2821]: E0514 18:11:56.478284 2821 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 14 18:11:56.478643 kubelet[2821]: W0514 18:11:56.478613 2821 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.200.8.38:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.8.38:6443: connect: connection refused May 14 18:11:56.478679 kubelet[2821]: E0514 18:11:56.478653 2821 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.200.8.38:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.8.38:6443: connect: connection refused" logger="UnhandledError" May 14 18:11:56.479471 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. May 14 18:11:56.488035 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. May 14 18:11:56.490606 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
May 14 18:11:56.497307 kubelet[2821]: I0514 18:11:56.497198 2821 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 14 18:11:56.497499 kubelet[2821]: I0514 18:11:56.497484 2821 eviction_manager.go:189] "Eviction manager: starting control loop" May 14 18:11:56.497731 kubelet[2821]: I0514 18:11:56.497541 2821 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 14 18:11:56.498048 kubelet[2821]: I0514 18:11:56.497801 2821 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 14 18:11:56.499231 kubelet[2821]: E0514 18:11:56.499219 2821 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4334.0.0-a-c37eb65ec1\" not found" May 14 18:11:56.586285 systemd[1]: Created slice kubepods-burstable-podaf41aae107f00c6091edb0715bd663ee.slice - libcontainer container kubepods-burstable-podaf41aae107f00c6091edb0715bd663ee.slice. 
May 14 18:11:56.599427 kubelet[2821]: I0514 18:11:56.599399 2821 kubelet_node_status.go:72] "Attempting to register node" node="ci-4334.0.0-a-c37eb65ec1" May 14 18:11:56.599666 kubelet[2821]: E0514 18:11:56.599633 2821 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.200.8.38:6443/api/v1/nodes\": dial tcp 10.200.8.38:6443: connect: connection refused" node="ci-4334.0.0-a-c37eb65ec1" May 14 18:11:56.601833 kubelet[2821]: I0514 18:11:56.601820 2821 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/af41aae107f00c6091edb0715bd663ee-ca-certs\") pod \"kube-apiserver-ci-4334.0.0-a-c37eb65ec1\" (UID: \"af41aae107f00c6091edb0715bd663ee\") " pod="kube-system/kube-apiserver-ci-4334.0.0-a-c37eb65ec1" May 14 18:11:56.601877 kubelet[2821]: I0514 18:11:56.601864 2821 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/aa9a0fb3491eee452c6cae6fbf8041d9-ca-certs\") pod \"kube-controller-manager-ci-4334.0.0-a-c37eb65ec1\" (UID: \"aa9a0fb3491eee452c6cae6fbf8041d9\") " pod="kube-system/kube-controller-manager-ci-4334.0.0-a-c37eb65ec1" May 14 18:11:56.601898 kubelet[2821]: I0514 18:11:56.601883 2821 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/aa9a0fb3491eee452c6cae6fbf8041d9-k8s-certs\") pod \"kube-controller-manager-ci-4334.0.0-a-c37eb65ec1\" (UID: \"aa9a0fb3491eee452c6cae6fbf8041d9\") " pod="kube-system/kube-controller-manager-ci-4334.0.0-a-c37eb65ec1" May 14 18:11:56.601918 kubelet[2821]: I0514 18:11:56.601908 2821 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/aa9a0fb3491eee452c6cae6fbf8041d9-kubeconfig\") pod 
\"kube-controller-manager-ci-4334.0.0-a-c37eb65ec1\" (UID: \"aa9a0fb3491eee452c6cae6fbf8041d9\") " pod="kube-system/kube-controller-manager-ci-4334.0.0-a-c37eb65ec1" May 14 18:11:56.601985 kubelet[2821]: I0514 18:11:56.601925 2821 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/aa9a0fb3491eee452c6cae6fbf8041d9-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4334.0.0-a-c37eb65ec1\" (UID: \"aa9a0fb3491eee452c6cae6fbf8041d9\") " pod="kube-system/kube-controller-manager-ci-4334.0.0-a-c37eb65ec1" May 14 18:11:56.601985 kubelet[2821]: I0514 18:11:56.601944 2821 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c9856bd9cefba25cb9f09bff67995d9e-kubeconfig\") pod \"kube-scheduler-ci-4334.0.0-a-c37eb65ec1\" (UID: \"c9856bd9cefba25cb9f09bff67995d9e\") " pod="kube-system/kube-scheduler-ci-4334.0.0-a-c37eb65ec1" May 14 18:11:56.601985 kubelet[2821]: I0514 18:11:56.601958 2821 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/af41aae107f00c6091edb0715bd663ee-k8s-certs\") pod \"kube-apiserver-ci-4334.0.0-a-c37eb65ec1\" (UID: \"af41aae107f00c6091edb0715bd663ee\") " pod="kube-system/kube-apiserver-ci-4334.0.0-a-c37eb65ec1" May 14 18:11:56.601985 kubelet[2821]: I0514 18:11:56.601973 2821 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/af41aae107f00c6091edb0715bd663ee-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4334.0.0-a-c37eb65ec1\" (UID: \"af41aae107f00c6091edb0715bd663ee\") " pod="kube-system/kube-apiserver-ci-4334.0.0-a-c37eb65ec1" May 14 18:11:56.602092 kubelet[2821]: I0514 18:11:56.602002 2821 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/aa9a0fb3491eee452c6cae6fbf8041d9-flexvolume-dir\") pod \"kube-controller-manager-ci-4334.0.0-a-c37eb65ec1\" (UID: \"aa9a0fb3491eee452c6cae6fbf8041d9\") " pod="kube-system/kube-controller-manager-ci-4334.0.0-a-c37eb65ec1" May 14 18:11:56.604238 kubelet[2821]: E0514 18:11:56.604046 2821 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.38:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4334.0.0-a-c37eb65ec1?timeout=10s\": dial tcp 10.200.8.38:6443: connect: connection refused" interval="800ms" May 14 18:11:56.607124 systemd[1]: Created slice kubepods-burstable-podaa9a0fb3491eee452c6cae6fbf8041d9.slice - libcontainer container kubepods-burstable-podaa9a0fb3491eee452c6cae6fbf8041d9.slice. May 14 18:11:56.616541 systemd[1]: Created slice kubepods-burstable-podc9856bd9cefba25cb9f09bff67995d9e.slice - libcontainer container kubepods-burstable-podc9856bd9cefba25cb9f09bff67995d9e.slice. 
May 14 18:11:56.801992 kubelet[2821]: I0514 18:11:56.801945 2821 kubelet_node_status.go:72] "Attempting to register node" node="ci-4334.0.0-a-c37eb65ec1" May 14 18:11:56.802321 kubelet[2821]: E0514 18:11:56.802218 2821 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.200.8.38:6443/api/v1/nodes\": dial tcp 10.200.8.38:6443: connect: connection refused" node="ci-4334.0.0-a-c37eb65ec1" May 14 18:11:56.906525 containerd[1734]: time="2025-05-14T18:11:56.906475504Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4334.0.0-a-c37eb65ec1,Uid:af41aae107f00c6091edb0715bd663ee,Namespace:kube-system,Attempt:0,}" May 14 18:11:56.915883 containerd[1734]: time="2025-05-14T18:11:56.915854042Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4334.0.0-a-c37eb65ec1,Uid:aa9a0fb3491eee452c6cae6fbf8041d9,Namespace:kube-system,Attempt:0,}" May 14 18:11:56.919320 containerd[1734]: time="2025-05-14T18:11:56.919298459Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4334.0.0-a-c37eb65ec1,Uid:c9856bd9cefba25cb9f09bff67995d9e,Namespace:kube-system,Attempt:0,}" May 14 18:11:57.013882 kubelet[2821]: W0514 18:11:57.013826 2821 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.200.8.38:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.8.38:6443: connect: connection refused May 14 18:11:57.013984 kubelet[2821]: E0514 18:11:57.013892 2821 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.200.8.38:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.200.8.38:6443: connect: connection refused" logger="UnhandledError" May 14 18:11:57.204098 kubelet[2821]: I0514 18:11:57.204024 2821 kubelet_node_status.go:72] "Attempting to register 
node" node="ci-4334.0.0-a-c37eb65ec1"
May 14 18:11:57.204359 kubelet[2821]: E0514 18:11:57.204332 2821 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.200.8.38:6443/api/v1/nodes\": dial tcp 10.200.8.38:6443: connect: connection refused" node="ci-4334.0.0-a-c37eb65ec1"
May 14 18:11:57.405496 kubelet[2821]: E0514 18:11:57.405443 2821 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.38:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4334.0.0-a-c37eb65ec1?timeout=10s\": dial tcp 10.200.8.38:6443: connect: connection refused" interval="1.6s"
May 14 18:11:57.563727 kubelet[2821]: W0514 18:11:57.563633 2821 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.200.8.38:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.200.8.38:6443: connect: connection refused
May 14 18:11:57.563727 kubelet[2821]: E0514 18:11:57.563690 2821 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.200.8.38:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.8.38:6443: connect: connection refused" logger="UnhandledError"
May 14 18:11:57.564095 kubelet[2821]: W0514 18:11:57.564014 2821 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.200.8.38:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4334.0.0-a-c37eb65ec1&limit=500&resourceVersion=0": dial tcp 10.200.8.38:6443: connect: connection refused
May 14 18:11:57.564095 kubelet[2821]: E0514 18:11:57.564056 2821 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.200.8.38:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4334.0.0-a-c37eb65ec1&limit=500&resourceVersion=0\": dial tcp 10.200.8.38:6443: connect: connection refused" logger="UnhandledError"
May 14 18:11:57.731489 kubelet[2821]: W0514 18:11:57.731460 2821 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.200.8.38:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.8.38:6443: connect: connection refused
May 14 18:11:57.731580 kubelet[2821]: E0514 18:11:57.731499 2821 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.200.8.38:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.8.38:6443: connect: connection refused" logger="UnhandledError"
May 14 18:11:58.005840 kubelet[2821]: I0514 18:11:58.005790 2821 kubelet_node_status.go:72] "Attempting to register node" node="ci-4334.0.0-a-c37eb65ec1"
May 14 18:11:58.006154 kubelet[2821]: E0514 18:11:58.006057 2821 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.200.8.38:6443/api/v1/nodes\": dial tcp 10.200.8.38:6443: connect: connection refused" node="ci-4334.0.0-a-c37eb65ec1"
May 14 18:11:58.011825 kubelet[2821]: E0514 18:11:58.011799 2821 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.200.8.38:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.200.8.38:6443: connect: connection refused" logger="UnhandledError"
May 14 18:11:59.006003 kubelet[2821]: E0514 18:11:59.005925 2821 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.38:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4334.0.0-a-c37eb65ec1?timeout=10s\": dial tcp 10.200.8.38:6443: connect: connection refused" interval="3.2s"
May 14 18:11:59.419346 kubelet[2821]: W0514 18:11:59.419299 2821 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.200.8.38:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.8.38:6443: connect: connection refused
May 14 18:11:59.419472 kubelet[2821]: E0514 18:11:59.419356 2821 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.200.8.38:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.8.38:6443: connect: connection refused" logger="UnhandledError"
May 14 18:11:59.426938 kubelet[2821]: W0514 18:11:59.426914 2821 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.200.8.38:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.8.38:6443: connect: connection refused
May 14 18:11:59.427001 kubelet[2821]: E0514 18:11:59.426948 2821 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.200.8.38:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.200.8.38:6443: connect: connection refused" logger="UnhandledError"
May 14 18:11:59.607803 kubelet[2821]: I0514 18:11:59.607772 2821 kubelet_node_status.go:72] "Attempting to register node" node="ci-4334.0.0-a-c37eb65ec1"
May 14 18:11:59.608167 kubelet[2821]: E0514 18:11:59.608144 2821 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.200.8.38:6443/api/v1/nodes\": dial tcp 10.200.8.38:6443: connect: connection refused" node="ci-4334.0.0-a-c37eb65ec1"
May 14 18:12:00.025946 kubelet[2821]: W0514 18:12:00.025910 2821 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.200.8.38:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4334.0.0-a-c37eb65ec1&limit=500&resourceVersion=0": dial tcp 10.200.8.38:6443: connect: connection refused
May 14 18:12:00.026241 kubelet[2821]: E0514 18:12:00.025954 2821 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.200.8.38:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4334.0.0-a-c37eb65ec1&limit=500&resourceVersion=0\": dial tcp 10.200.8.38:6443: connect: connection refused" logger="UnhandledError"
May 14 18:12:00.334649 containerd[1734]: time="2025-05-14T18:12:00.334543526Z" level=info msg="connecting to shim 4527a1b7c9497a96c387c0985d09714e87764df58df437e833b94bd3a3a1e39a" address="unix:///run/containerd/s/0b48e5a15a720c71ac74bd0a2f62b984d891ed4209086a577fcec80afd9a2f8f" namespace=k8s.io protocol=ttrpc version=3
May 14 18:12:00.352893 systemd[1]: Started cri-containerd-4527a1b7c9497a96c387c0985d09714e87764df58df437e833b94bd3a3a1e39a.scope - libcontainer container 4527a1b7c9497a96c387c0985d09714e87764df58df437e833b94bd3a3a1e39a.
May 14 18:12:00.390321 containerd[1734]: time="2025-05-14T18:12:00.389823605Z" level=info msg="connecting to shim cff0cd43440b754cdca679f976eef48b587788c6ad51cffd0d4d5d554e831881" address="unix:///run/containerd/s/de03fb477b8cd1996420c57e4879dcf1b07b560d13281acda86b524378f7f8df" namespace=k8s.io protocol=ttrpc version=3
May 14 18:12:00.410868 systemd[1]: Started cri-containerd-cff0cd43440b754cdca679f976eef48b587788c6ad51cffd0d4d5d554e831881.scope - libcontainer container cff0cd43440b754cdca679f976eef48b587788c6ad51cffd0d4d5d554e831881.
May 14 18:12:00.674638 kubelet[2821]: W0514 18:12:00.674596 2821 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.200.8.38:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.200.8.38:6443: connect: connection refused
May 14 18:12:00.674776 kubelet[2821]: E0514 18:12:00.674652 2821 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.200.8.38:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.8.38:6443: connect: connection refused" logger="UnhandledError"
May 14 18:12:02.180701 containerd[1734]: time="2025-05-14T18:12:02.180261739Z" level=info msg="connecting to shim f89c3c770b125f9bcca1fc6189744eb615d2a83fea883cddece5cfc42d23be3a" address="unix:///run/containerd/s/4e0f0fe265773a8525ceea94be7551c03aa8c0889c783abc3c1b91bf0ef73a68" namespace=k8s.io protocol=ttrpc version=3
May 14 18:12:02.203903 systemd[1]: Started cri-containerd-f89c3c770b125f9bcca1fc6189744eb615d2a83fea883cddece5cfc42d23be3a.scope - libcontainer container f89c3c770b125f9bcca1fc6189744eb615d2a83fea883cddece5cfc42d23be3a.
May 14 18:12:02.207123 kubelet[2821]: E0514 18:12:02.207089 2821 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.38:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4334.0.0-a-c37eb65ec1?timeout=10s\": dial tcp 10.200.8.38:6443: connect: connection refused" interval="6.4s"
May 14 18:12:02.331879 containerd[1734]: time="2025-05-14T18:12:02.331833114Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4334.0.0-a-c37eb65ec1,Uid:af41aae107f00c6091edb0715bd663ee,Namespace:kube-system,Attempt:0,} returns sandbox id \"4527a1b7c9497a96c387c0985d09714e87764df58df437e833b94bd3a3a1e39a\""
May 14 18:12:02.336064 containerd[1734]: time="2025-05-14T18:12:02.334802180Z" level=info msg="CreateContainer within sandbox \"4527a1b7c9497a96c387c0985d09714e87764df58df437e833b94bd3a3a1e39a\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
May 14 18:12:02.378577 containerd[1734]: time="2025-05-14T18:12:02.378548233Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4334.0.0-a-c37eb65ec1,Uid:aa9a0fb3491eee452c6cae6fbf8041d9,Namespace:kube-system,Attempt:0,} returns sandbox id \"cff0cd43440b754cdca679f976eef48b587788c6ad51cffd0d4d5d554e831881\""
May 14 18:12:02.382424 containerd[1734]: time="2025-05-14T18:12:02.382397843Z" level=info msg="CreateContainer within sandbox \"cff0cd43440b754cdca679f976eef48b587788c6ad51cffd0d4d5d554e831881\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
May 14 18:12:02.383643 kubelet[2821]: E0514 18:12:02.383620 2821 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.200.8.38:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.200.8.38:6443: connect: connection refused" logger="UnhandledError"
May 14 18:12:02.472739 containerd[1734]: time="2025-05-14T18:12:02.472474519Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4334.0.0-a-c37eb65ec1,Uid:c9856bd9cefba25cb9f09bff67995d9e,Namespace:kube-system,Attempt:0,} returns sandbox id \"f89c3c770b125f9bcca1fc6189744eb615d2a83fea883cddece5cfc42d23be3a\""
May 14 18:12:02.474589 containerd[1734]: time="2025-05-14T18:12:02.474564955Z" level=info msg="CreateContainer within sandbox \"f89c3c770b125f9bcca1fc6189744eb615d2a83fea883cddece5cfc42d23be3a\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
May 14 18:12:02.777102 containerd[1734]: time="2025-05-14T18:12:02.776984951Z" level=info msg="Container 09825faad3be53d09ac7216a8a8ef89725074b9cb494d86ba704cd05ce185d08: CDI devices from CRI Config.CDIDevices: []"
May 14 18:12:02.779703 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3942364861.mount: Deactivated successfully.
May 14 18:12:02.809762 kubelet[2821]: I0514 18:12:02.809729 2821 kubelet_node_status.go:72] "Attempting to register node" node="ci-4334.0.0-a-c37eb65ec1"
May 14 18:12:02.810039 kubelet[2821]: E0514 18:12:02.810022 2821 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.200.8.38:6443/api/v1/nodes\": dial tcp 10.200.8.38:6443: connect: connection refused" node="ci-4334.0.0-a-c37eb65ec1"
May 14 18:12:02.880401 containerd[1734]: time="2025-05-14T18:12:02.880362128Z" level=info msg="Container b59ebeadcb84d47de8c85bebe2c57e260d516c809b4df00270fc49f6ff70cdc8: CDI devices from CRI Config.CDIDevices: []"
May 14 18:12:02.931067 containerd[1734]: time="2025-05-14T18:12:02.931037272Z" level=info msg="Container c919679a1966565545bb16325f85dc0ee4d62a3444cec0718cd3fbeb2903040a: CDI devices from CRI Config.CDIDevices: []"
May 14 18:12:03.182946 containerd[1734]: time="2025-05-14T18:12:03.182922981Z" level=info msg="CreateContainer within sandbox \"cff0cd43440b754cdca679f976eef48b587788c6ad51cffd0d4d5d554e831881\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"09825faad3be53d09ac7216a8a8ef89725074b9cb494d86ba704cd05ce185d08\""
May 14 18:12:03.183501 containerd[1734]: time="2025-05-14T18:12:03.183485357Z" level=info msg="StartContainer for \"09825faad3be53d09ac7216a8a8ef89725074b9cb494d86ba704cd05ce185d08\""
May 14 18:12:03.184430 containerd[1734]: time="2025-05-14T18:12:03.184406295Z" level=info msg="connecting to shim 09825faad3be53d09ac7216a8a8ef89725074b9cb494d86ba704cd05ce185d08" address="unix:///run/containerd/s/de03fb477b8cd1996420c57e4879dcf1b07b560d13281acda86b524378f7f8df" protocol=ttrpc version=3
May 14 18:12:03.207889 systemd[1]: Started cri-containerd-09825faad3be53d09ac7216a8a8ef89725074b9cb494d86ba704cd05ce185d08.scope - libcontainer container 09825faad3be53d09ac7216a8a8ef89725074b9cb494d86ba704cd05ce185d08.
May 14 18:12:03.279058 containerd[1734]: time="2025-05-14T18:12:03.279036262Z" level=info msg="CreateContainer within sandbox \"f89c3c770b125f9bcca1fc6189744eb615d2a83fea883cddece5cfc42d23be3a\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"c919679a1966565545bb16325f85dc0ee4d62a3444cec0718cd3fbeb2903040a\""
May 14 18:12:03.279438 containerd[1734]: time="2025-05-14T18:12:03.279420054Z" level=info msg="StartContainer for \"c919679a1966565545bb16325f85dc0ee4d62a3444cec0718cd3fbeb2903040a\""
May 14 18:12:03.280734 containerd[1734]: time="2025-05-14T18:12:03.280711634Z" level=info msg="CreateContainer within sandbox \"4527a1b7c9497a96c387c0985d09714e87764df58df437e833b94bd3a3a1e39a\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"b59ebeadcb84d47de8c85bebe2c57e260d516c809b4df00270fc49f6ff70cdc8\""
May 14 18:12:03.281705 containerd[1734]: time="2025-05-14T18:12:03.281007915Z" level=info msg="connecting to shim c919679a1966565545bb16325f85dc0ee4d62a3444cec0718cd3fbeb2903040a" address="unix:///run/containerd/s/4e0f0fe265773a8525ceea94be7551c03aa8c0889c783abc3c1b91bf0ef73a68" protocol=ttrpc version=3
May 14 18:12:03.281705 containerd[1734]: time="2025-05-14T18:12:03.281582799Z" level=info msg="StartContainer for \"09825faad3be53d09ac7216a8a8ef89725074b9cb494d86ba704cd05ce185d08\" returns successfully"
May 14 18:12:03.282768 containerd[1734]: time="2025-05-14T18:12:03.282051250Z" level=info msg="StartContainer for \"b59ebeadcb84d47de8c85bebe2c57e260d516c809b4df00270fc49f6ff70cdc8\""
May 14 18:12:03.286936 containerd[1734]: time="2025-05-14T18:12:03.286816412Z" level=info msg="connecting to shim b59ebeadcb84d47de8c85bebe2c57e260d516c809b4df00270fc49f6ff70cdc8" address="unix:///run/containerd/s/0b48e5a15a720c71ac74bd0a2f62b984d891ed4209086a577fcec80afd9a2f8f" protocol=ttrpc version=3
May 14 18:12:03.308883 systemd[1]: Started cri-containerd-c919679a1966565545bb16325f85dc0ee4d62a3444cec0718cd3fbeb2903040a.scope - libcontainer container c919679a1966565545bb16325f85dc0ee4d62a3444cec0718cd3fbeb2903040a.
May 14 18:12:03.312480 systemd[1]: Started cri-containerd-b59ebeadcb84d47de8c85bebe2c57e260d516c809b4df00270fc49f6ff70cdc8.scope - libcontainer container b59ebeadcb84d47de8c85bebe2c57e260d516c809b4df00270fc49f6ff70cdc8.
May 14 18:12:03.384123 containerd[1734]: time="2025-05-14T18:12:03.384102775Z" level=info msg="StartContainer for \"b59ebeadcb84d47de8c85bebe2c57e260d516c809b4df00270fc49f6ff70cdc8\" returns successfully"
May 14 18:12:03.388864 containerd[1734]: time="2025-05-14T18:12:03.388801036Z" level=info msg="StartContainer for \"c919679a1966565545bb16325f85dc0ee4d62a3444cec0718cd3fbeb2903040a\" returns successfully"
May 14 18:12:04.993374 kubelet[2821]: I0514 18:12:04.993344 2821 apiserver.go:52] "Watching apiserver"
May 14 18:12:04.997023 kubelet[2821]: I0514 18:12:04.997000 2821 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
May 14 18:12:05.452771 kubelet[2821]: E0514 18:12:05.452731 2821 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "ci-4334.0.0-a-c37eb65ec1" not found
May 14 18:12:05.958057 kubelet[2821]: E0514 18:12:05.958030 2821 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "ci-4334.0.0-a-c37eb65ec1" not found
May 14 18:12:06.499817 kubelet[2821]: E0514 18:12:06.499737 2821 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4334.0.0-a-c37eb65ec1\" not found"
May 14 18:12:06.501077 kubelet[2821]: E0514 18:12:06.500980 2821 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "ci-4334.0.0-a-c37eb65ec1" not found
May 14 18:12:07.558674 kubelet[2821]: E0514 18:12:07.558560 2821 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "ci-4334.0.0-a-c37eb65ec1" not found
May 14 18:12:08.610764 kubelet[2821]: E0514 18:12:08.610719 2821 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4334.0.0-a-c37eb65ec1\" not found" node="ci-4334.0.0-a-c37eb65ec1"
May 14 18:12:09.211725 kubelet[2821]: I0514 18:12:09.211292 2821 kubelet_node_status.go:72] "Attempting to register node" node="ci-4334.0.0-a-c37eb65ec1"
May 14 18:12:09.216093 kubelet[2821]: I0514 18:12:09.215965 2821 kubelet_node_status.go:75] "Successfully registered node" node="ci-4334.0.0-a-c37eb65ec1"
May 14 18:12:09.216093 kubelet[2821]: E0514 18:12:09.215990 2821 kubelet_node_status.go:535] "Error updating node status, will retry" err="error getting node \"ci-4334.0.0-a-c37eb65ec1\": node \"ci-4334.0.0-a-c37eb65ec1\" not found"
May 14 18:12:09.433263 systemd[1]: Reload requested from client PID 3093 ('systemctl') (unit session-9.scope)...
May 14 18:12:09.433275 systemd[1]: Reloading...
May 14 18:12:09.513766 zram_generator::config[3138]: No configuration found.
May 14 18:12:09.590054 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 14 18:12:09.681146 systemd[1]: Reloading finished in 247 ms.
May 14 18:12:09.712830 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
May 14 18:12:09.735452 systemd[1]: kubelet.service: Deactivated successfully.
May 14 18:12:09.735643 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
May 14 18:12:09.735686 systemd[1]: kubelet.service: Consumed 581ms CPU time, 115.6M memory peak.
May 14 18:12:09.737080 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 14 18:12:12.495542 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 14 18:12:12.503282 (kubelet)[3205]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
May 14 18:12:12.536994 kubelet[3205]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 14 18:12:12.536994 kubelet[3205]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
May 14 18:12:12.536994 kubelet[3205]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 14 18:12:12.537236 kubelet[3205]: I0514 18:12:12.537081 3205 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
May 14 18:12:12.541577 kubelet[3205]: I0514 18:12:12.541554 3205 server.go:486] "Kubelet version" kubeletVersion="v1.31.0"
May 14 18:12:12.541577 kubelet[3205]: I0514 18:12:12.541571 3205 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
May 14 18:12:12.541797 kubelet[3205]: I0514 18:12:12.541784 3205 server.go:929] "Client rotation is on, will bootstrap in background"
May 14 18:12:12.542555 kubelet[3205]: I0514 18:12:12.542529 3205 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
May 14 18:12:12.543935 kubelet[3205]: I0514 18:12:12.543828 3205 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
May 14 18:12:12.546833 kubelet[3205]: I0514 18:12:12.546816 3205 server.go:1426] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
May 14 18:12:12.549370 kubelet[3205]: I0514 18:12:12.549350 3205 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
May 14 18:12:12.549460 kubelet[3205]: I0514 18:12:12.549425 3205 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
May 14 18:12:12.549514 kubelet[3205]: I0514 18:12:12.549497 3205 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
May 14 18:12:12.549651 kubelet[3205]: I0514 18:12:12.549515 3205 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4334.0.0-a-c37eb65ec1","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
May 14 18:12:12.549754 kubelet[3205]: I0514 18:12:12.549651 3205 topology_manager.go:138] "Creating topology manager with none policy"
May 14 18:12:12.549754 kubelet[3205]: I0514 18:12:12.549660 3205 container_manager_linux.go:300] "Creating device plugin manager"
May 14 18:12:12.549754 kubelet[3205]: I0514 18:12:12.549687 3205 state_mem.go:36] "Initialized new in-memory state store"
May 14 18:12:12.549828 kubelet[3205]: I0514 18:12:12.549784 3205 kubelet.go:408] "Attempting to sync node with API server"
May 14 18:12:12.549828 kubelet[3205]: I0514 18:12:12.549793 3205 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
May 14 18:12:12.549828 kubelet[3205]: I0514 18:12:12.549816 3205 kubelet.go:314] "Adding apiserver pod source"
May 14 18:12:12.550008 kubelet[3205]: I0514 18:12:12.549830 3205 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
May 14 18:12:12.556317 kubelet[3205]: I0514 18:12:12.555109 3205 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1"
May 14 18:12:12.556317 kubelet[3205]: I0514 18:12:12.555499 3205 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
May 14 18:12:12.558100 kubelet[3205]: I0514 18:12:12.558054 3205 server.go:1269] "Started kubelet"
May 14 18:12:12.563733 kubelet[3205]: I0514 18:12:12.563706 3205 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
May 14 18:12:12.565286 kubelet[3205]: I0514 18:12:12.564647 3205 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
May 14 18:12:12.566759 kubelet[3205]: I0514 18:12:12.566650 3205 server.go:460] "Adding debug handlers to kubelet server"
May 14 18:12:12.569182 kubelet[3205]: I0514 18:12:12.569138 3205 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
May 14 18:12:12.569310 kubelet[3205]: I0514 18:12:12.569297 3205 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
May 14 18:12:12.572278 kubelet[3205]: I0514 18:12:12.572208 3205 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
May 14 18:12:12.573324 kubelet[3205]: I0514 18:12:12.573289 3205 volume_manager.go:289] "Starting Kubelet Volume Manager"
May 14 18:12:12.573501 kubelet[3205]: E0514 18:12:12.573478 3205 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4334.0.0-a-c37eb65ec1\" not found"
May 14 18:12:12.575679 kubelet[3205]: I0514 18:12:12.575581 3205 factory.go:221] Registration of the systemd container factory successfully
May 14 18:12:12.575944 kubelet[3205]: I0514 18:12:12.575930 3205 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
May 14 18:12:12.576324 kubelet[3205]: I0514 18:12:12.576310 3205 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
May 14 18:12:12.576416 kubelet[3205]: I0514 18:12:12.576408 3205 reconciler.go:26] "Reconciler: start to sync state"
May 14 18:12:12.577785 kubelet[3205]: I0514 18:12:12.577769 3205 factory.go:221] Registration of the containerd container factory successfully
May 14 18:12:12.578165 kubelet[3205]: I0514 18:12:12.578105 3205 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
May 14 18:12:12.579358 kubelet[3205]: I0514 18:12:12.579139 3205 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
May 14 18:12:12.579358 kubelet[3205]: I0514 18:12:12.579165 3205 status_manager.go:217] "Starting to sync pod status with apiserver"
May 14 18:12:12.579358 kubelet[3205]: I0514 18:12:12.579178 3205 kubelet.go:2321] "Starting kubelet main sync loop"
May 14 18:12:12.579358 kubelet[3205]: E0514 18:12:12.579204 3205 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
May 14 18:12:12.584911 kubelet[3205]: E0514 18:12:12.584897 3205 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
May 14 18:12:12.619200 kubelet[3205]: I0514 18:12:12.619189 3205 cpu_manager.go:214] "Starting CPU manager" policy="none"
May 14 18:12:12.619264 kubelet[3205]: I0514 18:12:12.619212 3205 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
May 14 18:12:12.619264 kubelet[3205]: I0514 18:12:12.619227 3205 state_mem.go:36] "Initialized new in-memory state store"
May 14 18:12:12.619352 kubelet[3205]: I0514 18:12:12.619337 3205 state_mem.go:88] "Updated default CPUSet" cpuSet=""
May 14 18:12:12.619373 kubelet[3205]: I0514 18:12:12.619347 3205 state_mem.go:96] "Updated CPUSet assignments" assignments={}
May 14 18:12:12.619373 kubelet[3205]: I0514 18:12:12.619362 3205 policy_none.go:49] "None policy: Start"
May 14 18:12:12.619815 kubelet[3205]: I0514 18:12:12.619803 3205 memory_manager.go:170] "Starting memorymanager" policy="None"
May 14 18:12:12.619869 kubelet[3205]: I0514 18:12:12.619818 3205 state_mem.go:35] "Initializing new in-memory state store"
May 14 18:12:12.619955 kubelet[3205]: I0514 18:12:12.619947 3205 state_mem.go:75] "Updated machine memory state"
May 14 18:12:12.622733 kubelet[3205]: I0514 18:12:12.622707 3205 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
May 14 18:12:12.622847 kubelet[3205]: I0514 18:12:12.622838 3205 eviction_manager.go:189] "Eviction manager: starting control loop"
May 14 18:12:12.622880 kubelet[3205]: I0514 18:12:12.622852 3205 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
May 14 18:12:12.623109 kubelet[3205]: I0514 18:12:12.623094 3205 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
May 14 18:12:12.724942 kubelet[3205]: I0514 18:12:12.724927 3205 kubelet_node_status.go:72] "Attempting to register node" node="ci-4334.0.0-a-c37eb65ec1"
May 14 18:12:12.748686 kubelet[3205]: W0514 18:12:12.748577 3205 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
May 14 18:12:12.753676 kubelet[3205]: W0514 18:12:12.753429 3205 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
May 14 18:12:12.755235 kubelet[3205]: W0514 18:12:12.755184 3205 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
May 14 18:12:12.776621 kubelet[3205]: I0514 18:12:12.776602 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/aa9a0fb3491eee452c6cae6fbf8041d9-flexvolume-dir\") pod \"kube-controller-manager-ci-4334.0.0-a-c37eb65ec1\" (UID: \"aa9a0fb3491eee452c6cae6fbf8041d9\") " pod="kube-system/kube-controller-manager-ci-4334.0.0-a-c37eb65ec1"
May 14 18:12:12.776621 kubelet[3205]: I0514 18:12:12.776625 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/aa9a0fb3491eee452c6cae6fbf8041d9-k8s-certs\") pod \"kube-controller-manager-ci-4334.0.0-a-c37eb65ec1\" (UID: \"aa9a0fb3491eee452c6cae6fbf8041d9\") " pod="kube-system/kube-controller-manager-ci-4334.0.0-a-c37eb65ec1"
May 14 18:12:12.776786 kubelet[3205]: I0514 18:12:12.776642 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/aa9a0fb3491eee452c6cae6fbf8041d9-kubeconfig\") pod \"kube-controller-manager-ci-4334.0.0-a-c37eb65ec1\" (UID: \"aa9a0fb3491eee452c6cae6fbf8041d9\") " pod="kube-system/kube-controller-manager-ci-4334.0.0-a-c37eb65ec1"
May 14 18:12:12.776786 kubelet[3205]: I0514 18:12:12.776656 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/aa9a0fb3491eee452c6cae6fbf8041d9-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4334.0.0-a-c37eb65ec1\" (UID: \"aa9a0fb3491eee452c6cae6fbf8041d9\") " pod="kube-system/kube-controller-manager-ci-4334.0.0-a-c37eb65ec1"
May 14 18:12:12.776786 kubelet[3205]: I0514 18:12:12.776670 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c9856bd9cefba25cb9f09bff67995d9e-kubeconfig\") pod \"kube-scheduler-ci-4334.0.0-a-c37eb65ec1\" (UID: \"c9856bd9cefba25cb9f09bff67995d9e\") " pod="kube-system/kube-scheduler-ci-4334.0.0-a-c37eb65ec1"
May 14 18:12:12.776786 kubelet[3205]: I0514 18:12:12.776683 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/af41aae107f00c6091edb0715bd663ee-ca-certs\") pod \"kube-apiserver-ci-4334.0.0-a-c37eb65ec1\" (UID: \"af41aae107f00c6091edb0715bd663ee\") " pod="kube-system/kube-apiserver-ci-4334.0.0-a-c37eb65ec1"
May 14 18:12:12.776786 kubelet[3205]: I0514 18:12:12.776695 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/af41aae107f00c6091edb0715bd663ee-k8s-certs\") pod \"kube-apiserver-ci-4334.0.0-a-c37eb65ec1\" (UID: \"af41aae107f00c6091edb0715bd663ee\") " pod="kube-system/kube-apiserver-ci-4334.0.0-a-c37eb65ec1"
May 14 18:12:12.776884 kubelet[3205]: I0514 18:12:12.776709 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/af41aae107f00c6091edb0715bd663ee-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4334.0.0-a-c37eb65ec1\" (UID: \"af41aae107f00c6091edb0715bd663ee\") " pod="kube-system/kube-apiserver-ci-4334.0.0-a-c37eb65ec1"
May 14 18:12:12.776884 kubelet[3205]: I0514 18:12:12.776734 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/aa9a0fb3491eee452c6cae6fbf8041d9-ca-certs\") pod \"kube-controller-manager-ci-4334.0.0-a-c37eb65ec1\" (UID: \"aa9a0fb3491eee452c6cae6fbf8041d9\") " pod="kube-system/kube-controller-manager-ci-4334.0.0-a-c37eb65ec1"
May 14 18:12:12.858299 kubelet[3205]: I0514 18:12:12.858021 3205 kubelet_node_status.go:111] "Node was previously registered" node="ci-4334.0.0-a-c37eb65ec1"
May 14 18:12:12.858299 kubelet[3205]: I0514 18:12:12.858072 3205 kubelet_node_status.go:75] "Successfully registered node" node="ci-4334.0.0-a-c37eb65ec1"
May 14 18:12:13.013091 kubelet[3205]: I0514 18:12:13.012723 3205 kuberuntime_manager.go:1633] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
May 14 18:12:13.013174 containerd[1734]: time="2025-05-14T18:12:13.013117222Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
May 14 18:12:13.013410 kubelet[3205]: I0514 18:12:13.013390 3205 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
May 14 18:12:13.551705 kubelet[3205]: I0514 18:12:13.551686 3205 apiserver.go:52] "Watching apiserver"
May 14 18:12:13.576790 kubelet[3205]: I0514 18:12:13.576727 3205 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
May 14 18:12:13.652000 kubelet[3205]: W0514 18:12:13.651978 3205 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
May 14 18:12:13.652067 kubelet[3205]: E0514 18:12:13.652027 3205 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-4334.0.0-a-c37eb65ec1\" already exists" pod="kube-system/kube-apiserver-ci-4334.0.0-a-c37eb65ec1"
May 14 18:12:13.657027 kubelet[3205]: W0514 18:12:13.656971 3205 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
May 14 18:12:13.657027 kubelet[3205]: E0514 18:12:13.657015 3205 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-ci-4334.0.0-a-c37eb65ec1\" already exists" pod="kube-system/kube-controller-manager-ci-4334.0.0-a-c37eb65ec1"
May 14 18:12:13.811464 kubelet[3205]: I0514 18:12:13.811298 3205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4334.0.0-a-c37eb65ec1" podStartSLOduration=1.8112846889999998 podStartE2EDuration="1.811284689s" podCreationTimestamp="2025-05-14 18:12:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-14 18:12:13.660543296 +0000 UTC m=+1.153066660" watchObservedRunningTime="2025-05-14 18:12:13.811284689 +0000 UTC m=+1.303808044"
May 14 18:12:13.856178 kubelet[3205]: I0514 18:12:13.856144 3205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4334.0.0-a-c37eb65ec1" podStartSLOduration=1.856133163 podStartE2EDuration="1.856133163s" podCreationTimestamp="2025-05-14 18:12:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-14 18:12:13.811874584 +0000 UTC m=+1.304397943" watchObservedRunningTime="2025-05-14 18:12:13.856133163 +0000 UTC m=+1.348656549"
May 14 18:12:13.953291 kubelet[3205]: I0514 18:12:13.953113 3205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4334.0.0-a-c37eb65ec1" podStartSLOduration=1.953103982 podStartE2EDuration="1.953103982s" podCreationTimestamp="2025-05-14 18:12:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-14 18:12:13.856241447 +0000 UTC m=+1.348764815" watchObservedRunningTime="2025-05-14 18:12:13.953103982 +0000 UTC m=+1.445627342"
May 14 18:12:14.016097 systemd[1]: Created slice kubepods-besteffort-pod09ee6e35_87de_42ed_a47b_36667e564733.slice - libcontainer container kubepods-besteffort-pod09ee6e35_87de_42ed_a47b_36667e564733.slice.
May 14 18:12:14.084577 kubelet[3205]: I0514 18:12:14.084512 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/09ee6e35-87de-42ed-a47b-36667e564733-kube-proxy\") pod \"kube-proxy-q8ftt\" (UID: \"09ee6e35-87de-42ed-a47b-36667e564733\") " pod="kube-system/kube-proxy-q8ftt"
May 14 18:12:14.084577 kubelet[3205]: I0514 18:12:14.084565 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/09ee6e35-87de-42ed-a47b-36667e564733-xtables-lock\") pod \"kube-proxy-q8ftt\" (UID: \"09ee6e35-87de-42ed-a47b-36667e564733\") " pod="kube-system/kube-proxy-q8ftt"
May 14 18:12:14.084668 kubelet[3205]: I0514 18:12:14.084584 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2dwr\" (UniqueName: \"kubernetes.io/projected/09ee6e35-87de-42ed-a47b-36667e564733-kube-api-access-k2dwr\") pod \"kube-proxy-q8ftt\" (UID: \"09ee6e35-87de-42ed-a47b-36667e564733\") " pod="kube-system/kube-proxy-q8ftt"
May 14 18:12:14.084668 kubelet[3205]: I0514 18:12:14.084603 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/09ee6e35-87de-42ed-a47b-36667e564733-lib-modules\") pod \"kube-proxy-q8ftt\" (UID: \"09ee6e35-87de-42ed-a47b-36667e564733\") " pod="kube-system/kube-proxy-q8ftt"
May 14 18:12:14.325960 containerd[1734]: time="2025-05-14T18:12:14.325923066Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-q8ftt,Uid:09ee6e35-87de-42ed-a47b-36667e564733,Namespace:kube-system,Attempt:0,}"
May 14 18:12:14.629281 containerd[1734]: time="2025-05-14T18:12:14.629248433Z" level=info msg="connecting to shim b48ac70a20e8e5d03a774d669428d844575233998bb845b5edddbe604122d3b8" address="unix:///run/containerd/s/c03f4e202e74545e6be24f767e04d4cbf402ec8f50f5e49e8f1ec7f7f452d4a0" namespace=k8s.io protocol=ttrpc version=3
May 14 18:12:14.651915 systemd[1]: Started cri-containerd-b48ac70a20e8e5d03a774d669428d844575233998bb845b5edddbe604122d3b8.scope - libcontainer container b48ac70a20e8e5d03a774d669428d844575233998bb845b5edddbe604122d3b8.
May 14 18:12:14.670384 containerd[1734]: time="2025-05-14T18:12:14.670350876Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-q8ftt,Uid:09ee6e35-87de-42ed-a47b-36667e564733,Namespace:kube-system,Attempt:0,} returns sandbox id \"b48ac70a20e8e5d03a774d669428d844575233998bb845b5edddbe604122d3b8\""
May 14 18:12:14.672646 containerd[1734]: time="2025-05-14T18:12:14.672589312Z" level=info msg="CreateContainer within sandbox \"b48ac70a20e8e5d03a774d669428d844575233998bb845b5edddbe604122d3b8\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
May 14 18:12:15.786217 containerd[1734]: time="2025-05-14T18:12:15.786179420Z" level=info msg="Container 01efc5baf83ebfc0d638803e378f72202c89cbfc070dd424a7c70665dc6bde4f: CDI devices from CRI Config.CDIDevices: []"
May 14 18:12:15.793882 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4223073942.mount: Deactivated successfully.
May 14 18:12:16.221871 containerd[1734]: time="2025-05-14T18:12:16.221779359Z" level=info msg="CreateContainer within sandbox \"b48ac70a20e8e5d03a774d669428d844575233998bb845b5edddbe604122d3b8\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"01efc5baf83ebfc0d638803e378f72202c89cbfc070dd424a7c70665dc6bde4f\""
May 14 18:12:16.222364 containerd[1734]: time="2025-05-14T18:12:16.222337249Z" level=info msg="StartContainer for \"01efc5baf83ebfc0d638803e378f72202c89cbfc070dd424a7c70665dc6bde4f\""
May 14 18:12:16.223952 containerd[1734]: time="2025-05-14T18:12:16.223926260Z" level=info msg="connecting to shim 01efc5baf83ebfc0d638803e378f72202c89cbfc070dd424a7c70665dc6bde4f" address="unix:///run/containerd/s/c03f4e202e74545e6be24f767e04d4cbf402ec8f50f5e49e8f1ec7f7f452d4a0" protocol=ttrpc version=3
May 14 18:12:16.256190 systemd[1]: Started cri-containerd-01efc5baf83ebfc0d638803e378f72202c89cbfc070dd424a7c70665dc6bde4f.scope - libcontainer container 01efc5baf83ebfc0d638803e378f72202c89cbfc070dd424a7c70665dc6bde4f.
May 14 18:12:16.398246 containerd[1734]: time="2025-05-14T18:12:16.398043717Z" level=info msg="StartContainer for \"01efc5baf83ebfc0d638803e378f72202c89cbfc070dd424a7c70665dc6bde4f\" returns successfully"
May 14 18:12:16.597922 systemd[1]: Created slice kubepods-besteffort-pod4d7029a1_c7af_4313_989a_19c3947b47bd.slice - libcontainer container kubepods-besteffort-pod4d7029a1_c7af_4313_989a_19c3947b47bd.slice.
May 14 18:12:16.599183 kubelet[3205]: I0514 18:12:16.599158 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/4d7029a1-c7af-4313-989a-19c3947b47bd-var-lib-calico\") pod \"tigera-operator-6f6897fdc5-b8xtt\" (UID: \"4d7029a1-c7af-4313-989a-19c3947b47bd\") " pod="tigera-operator/tigera-operator-6f6897fdc5-b8xtt"
May 14 18:12:16.601524 kubelet[3205]: I0514 18:12:16.600573 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnqcx\" (UniqueName: \"kubernetes.io/projected/4d7029a1-c7af-4313-989a-19c3947b47bd-kube-api-access-cnqcx\") pod \"tigera-operator-6f6897fdc5-b8xtt\" (UID: \"4d7029a1-c7af-4313-989a-19c3947b47bd\") " pod="tigera-operator/tigera-operator-6f6897fdc5-b8xtt"
May 14 18:12:16.903447 containerd[1734]: time="2025-05-14T18:12:16.903399917Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6f6897fdc5-b8xtt,Uid:4d7029a1-c7af-4313-989a-19c3947b47bd,Namespace:tigera-operator,Attempt:0,}"
May 14 18:12:17.300356 kubelet[3205]: I0514 18:12:17.300270 3205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-q8ftt" podStartSLOduration=4.300254184 podStartE2EDuration="4.300254184s" podCreationTimestamp="2025-05-14 18:12:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-14 18:12:16.762953293 +0000 UTC m=+4.255476656" watchObservedRunningTime="2025-05-14 18:12:17.300254184 +0000 UTC m=+4.792777544"
May 14 18:12:18.724109 containerd[1734]: time="2025-05-14T18:12:18.724066503Z" level=info msg="connecting to shim 8d55992598842ae7914e3d4de1249fc2f6de2d61252f32849983249e190fbf2f" address="unix:///run/containerd/s/d5f87899901e0ba0e8eb329ded1532f66d81c0229d1fdf4ae4df6f4461537921" namespace=k8s.io protocol=ttrpc version=3
May 14 18:12:18.745886 systemd[1]: Started cri-containerd-8d55992598842ae7914e3d4de1249fc2f6de2d61252f32849983249e190fbf2f.scope - libcontainer container 8d55992598842ae7914e3d4de1249fc2f6de2d61252f32849983249e190fbf2f.
May 14 18:12:19.430607 containerd[1734]: time="2025-05-14T18:12:19.430535888Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6f6897fdc5-b8xtt,Uid:4d7029a1-c7af-4313-989a-19c3947b47bd,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"8d55992598842ae7914e3d4de1249fc2f6de2d61252f32849983249e190fbf2f\""
May 14 18:12:19.432695 containerd[1734]: time="2025-05-14T18:12:19.432663289Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.7\""
May 14 18:12:19.465953 sudo[2155]: pam_unix(sudo:session): session closed for user root
May 14 18:12:19.567838 sshd[2154]: Connection closed by 10.200.16.10 port 47182
May 14 18:12:19.568248 sshd-session[2152]: pam_unix(sshd:session): session closed for user core
May 14 18:12:19.571358 systemd[1]: sshd@6-10.200.8.38:22-10.200.16.10:47182.service: Deactivated successfully.
May 14 18:12:19.573220 systemd[1]: session-9.scope: Deactivated successfully.
May 14 18:12:19.573392 systemd[1]: session-9.scope: Consumed 3.165s CPU time, 227.3M memory peak.
May 14 18:12:19.574953 systemd-logind[1706]: Session 9 logged out. Waiting for processes to exit.
May 14 18:12:19.576270 systemd-logind[1706]: Removed session 9.
May 14 18:12:25.604521 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3617262554.mount: Deactivated successfully.
May 14 18:12:26.728868 containerd[1734]: time="2025-05-14T18:12:26.728827974Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 14 18:12:26.775499 containerd[1734]: time="2025-05-14T18:12:26.775469996Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.7: active requests=0, bytes read=22002662"
May 14 18:12:26.780066 containerd[1734]: time="2025-05-14T18:12:26.780028358Z" level=info msg="ImageCreate event name:\"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 14 18:12:26.823757 containerd[1734]: time="2025-05-14T18:12:26.823716065Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:a4a44422d8f2a14e0aaea2031ccb5580f2bf68218c9db444450c1888743305e9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 14 18:12:26.824314 containerd[1734]: time="2025-05-14T18:12:26.824218590Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.7\" with image id \"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\", repo tag \"quay.io/tigera/operator:v1.36.7\", repo digest \"quay.io/tigera/operator@sha256:a4a44422d8f2a14e0aaea2031ccb5580f2bf68218c9db444450c1888743305e9\", size \"21998657\" in 7.391519565s"
May 14 18:12:26.824314 containerd[1734]: time="2025-05-14T18:12:26.824246258Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.7\" returns image reference \"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\""
May 14 18:12:26.826270 containerd[1734]: time="2025-05-14T18:12:26.826244376Z" level=info msg="CreateContainer within sandbox \"8d55992598842ae7914e3d4de1249fc2f6de2d61252f32849983249e190fbf2f\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
May 14 18:12:26.981177 containerd[1734]: time="2025-05-14T18:12:26.980940363Z" level=info msg="Container 134acd70d5d8a043ecd1a346fce3890981007c1d3a44daf808de25fec1b02c4f: CDI devices from CRI Config.CDIDevices: []"
May 14 18:12:27.092326 containerd[1734]: time="2025-05-14T18:12:27.092299568Z" level=info msg="CreateContainer within sandbox \"8d55992598842ae7914e3d4de1249fc2f6de2d61252f32849983249e190fbf2f\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"134acd70d5d8a043ecd1a346fce3890981007c1d3a44daf808de25fec1b02c4f\""
May 14 18:12:27.092761 containerd[1734]: time="2025-05-14T18:12:27.092685564Z" level=info msg="StartContainer for \"134acd70d5d8a043ecd1a346fce3890981007c1d3a44daf808de25fec1b02c4f\""
May 14 18:12:27.093437 containerd[1734]: time="2025-05-14T18:12:27.093350730Z" level=info msg="connecting to shim 134acd70d5d8a043ecd1a346fce3890981007c1d3a44daf808de25fec1b02c4f" address="unix:///run/containerd/s/d5f87899901e0ba0e8eb329ded1532f66d81c0229d1fdf4ae4df6f4461537921" protocol=ttrpc version=3
May 14 18:12:27.110883 systemd[1]: Started cri-containerd-134acd70d5d8a043ecd1a346fce3890981007c1d3a44daf808de25fec1b02c4f.scope - libcontainer container 134acd70d5d8a043ecd1a346fce3890981007c1d3a44daf808de25fec1b02c4f.
May 14 18:12:27.134715 containerd[1734]: time="2025-05-14T18:12:27.134692271Z" level=info msg="StartContainer for \"134acd70d5d8a043ecd1a346fce3890981007c1d3a44daf808de25fec1b02c4f\" returns successfully"
May 14 18:12:31.447489 kubelet[3205]: I0514 18:12:31.447284 3205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6f6897fdc5-b8xtt" podStartSLOduration=8.054261336 podStartE2EDuration="15.447265428s" podCreationTimestamp="2025-05-14 18:12:16 +0000 UTC" firstStartedPulling="2025-05-14 18:12:19.432030247 +0000 UTC m=+6.924553605" lastFinishedPulling="2025-05-14 18:12:26.82503433 +0000 UTC m=+14.317557697" observedRunningTime="2025-05-14 18:12:27.638778904 +0000 UTC m=+15.131302276" watchObservedRunningTime="2025-05-14 18:12:31.447265428 +0000 UTC m=+18.939788790"
May 14 18:12:31.457008 systemd[1]: Created slice kubepods-besteffort-pod283b581c_0c91_42f4_bc6a_cdd938886eff.slice - libcontainer container kubepods-besteffort-pod283b581c_0c91_42f4_bc6a_cdd938886eff.slice.
May 14 18:12:31.489114 kubelet[3205]: I0514 18:12:31.489088 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghmgf\" (UniqueName: \"kubernetes.io/projected/283b581c-0c91-42f4-bc6a-cdd938886eff-kube-api-access-ghmgf\") pod \"calico-typha-799d64d946-hm4hn\" (UID: \"283b581c-0c91-42f4-bc6a-cdd938886eff\") " pod="calico-system/calico-typha-799d64d946-hm4hn"
May 14 18:12:31.489209 kubelet[3205]: I0514 18:12:31.489123 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/283b581c-0c91-42f4-bc6a-cdd938886eff-tigera-ca-bundle\") pod \"calico-typha-799d64d946-hm4hn\" (UID: \"283b581c-0c91-42f4-bc6a-cdd938886eff\") " pod="calico-system/calico-typha-799d64d946-hm4hn"
May 14 18:12:31.489209 kubelet[3205]: I0514 18:12:31.489138 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/283b581c-0c91-42f4-bc6a-cdd938886eff-typha-certs\") pod \"calico-typha-799d64d946-hm4hn\" (UID: \"283b581c-0c91-42f4-bc6a-cdd938886eff\") " pod="calico-system/calico-typha-799d64d946-hm4hn"
May 14 18:12:31.756053 systemd[1]: Created slice kubepods-besteffort-pod41e82bc8_542e_4aed_91e7_0d220ea19b51.slice - libcontainer container kubepods-besteffort-pod41e82bc8_542e_4aed_91e7_0d220ea19b51.slice.
May 14 18:12:31.762269 containerd[1734]: time="2025-05-14T18:12:31.762218658Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-799d64d946-hm4hn,Uid:283b581c-0c91-42f4-bc6a-cdd938886eff,Namespace:calico-system,Attempt:0,}"
May 14 18:12:31.791702 kubelet[3205]: I0514 18:12:31.791664 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcc9l\" (UniqueName: \"kubernetes.io/projected/41e82bc8-542e-4aed-91e7-0d220ea19b51-kube-api-access-qcc9l\") pod \"calico-node-42jns\" (UID: \"41e82bc8-542e-4aed-91e7-0d220ea19b51\") " pod="calico-system/calico-node-42jns"
May 14 18:12:31.791900 kubelet[3205]: I0514 18:12:31.791710 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41e82bc8-542e-4aed-91e7-0d220ea19b51-tigera-ca-bundle\") pod \"calico-node-42jns\" (UID: \"41e82bc8-542e-4aed-91e7-0d220ea19b51\") " pod="calico-system/calico-node-42jns"
May 14 18:12:31.791900 kubelet[3205]: I0514 18:12:31.791732 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/41e82bc8-542e-4aed-91e7-0d220ea19b51-var-lib-calico\") pod \"calico-node-42jns\" (UID: \"41e82bc8-542e-4aed-91e7-0d220ea19b51\") " pod="calico-system/calico-node-42jns"
May 14 18:12:31.791900 kubelet[3205]: I0514 18:12:31.791764 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/41e82bc8-542e-4aed-91e7-0d220ea19b51-xtables-lock\") pod \"calico-node-42jns\" (UID: \"41e82bc8-542e-4aed-91e7-0d220ea19b51\") " pod="calico-system/calico-node-42jns"
May 14 18:12:31.791900 kubelet[3205]: I0514 18:12:31.791783 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/41e82bc8-542e-4aed-91e7-0d220ea19b51-flexvol-driver-host\") pod \"calico-node-42jns\" (UID: \"41e82bc8-542e-4aed-91e7-0d220ea19b51\") " pod="calico-system/calico-node-42jns"
May 14 18:12:31.791900 kubelet[3205]: I0514 18:12:31.791799 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/41e82bc8-542e-4aed-91e7-0d220ea19b51-cni-bin-dir\") pod \"calico-node-42jns\" (UID: \"41e82bc8-542e-4aed-91e7-0d220ea19b51\") " pod="calico-system/calico-node-42jns"
May 14 18:12:31.792057 kubelet[3205]: I0514 18:12:31.791815 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/41e82bc8-542e-4aed-91e7-0d220ea19b51-lib-modules\") pod \"calico-node-42jns\" (UID: \"41e82bc8-542e-4aed-91e7-0d220ea19b51\") " pod="calico-system/calico-node-42jns"
May 14 18:12:31.792057 kubelet[3205]: I0514 18:12:31.791839 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/41e82bc8-542e-4aed-91e7-0d220ea19b51-node-certs\") pod \"calico-node-42jns\" (UID: \"41e82bc8-542e-4aed-91e7-0d220ea19b51\") " pod="calico-system/calico-node-42jns"
May 14 18:12:31.792057 kubelet[3205]: I0514 18:12:31.791862 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/41e82bc8-542e-4aed-91e7-0d220ea19b51-cni-net-dir\") pod \"calico-node-42jns\" (UID: \"41e82bc8-542e-4aed-91e7-0d220ea19b51\") " pod="calico-system/calico-node-42jns"
May 14 18:12:31.792057 kubelet[3205]: I0514 18:12:31.791888 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/41e82bc8-542e-4aed-91e7-0d220ea19b51-policysync\") pod \"calico-node-42jns\" (UID: \"41e82bc8-542e-4aed-91e7-0d220ea19b51\") " pod="calico-system/calico-node-42jns"
May 14 18:12:31.792057 kubelet[3205]: I0514 18:12:31.791903 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/41e82bc8-542e-4aed-91e7-0d220ea19b51-var-run-calico\") pod \"calico-node-42jns\" (UID: \"41e82bc8-542e-4aed-91e7-0d220ea19b51\") " pod="calico-system/calico-node-42jns"
May 14 18:12:31.792147 kubelet[3205]: I0514 18:12:31.791919 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/41e82bc8-542e-4aed-91e7-0d220ea19b51-cni-log-dir\") pod \"calico-node-42jns\" (UID: \"41e82bc8-542e-4aed-91e7-0d220ea19b51\") " pod="calico-system/calico-node-42jns"
May 14 18:12:31.897087 kubelet[3205]: E0514 18:12:31.897028 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 14 18:12:31.897087 kubelet[3205]: W0514 18:12:31.897044 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 14 18:12:31.897087 kubelet[3205]: E0514 18:12:31.897065 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 14 18:12:31.960905 kubelet[3205]: E0514 18:12:31.960889 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 14 18:12:31.961042 kubelet[3205]: W0514 18:12:31.960942 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 14 18:12:31.961042 kubelet[3205]: E0514 18:12:31.960957 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 14 18:12:32.033918 containerd[1734]: time="2025-05-14T18:12:32.033445467Z" level=info msg="connecting to shim 1d8efa3580a3dbb5a81784584d78e829b8ff77c0db43fc6c5b428b807ffe9449" address="unix:///run/containerd/s/ed38470e6da65c844c514524c5b37bd8c4a5da27d2a9e0504800be76976f62b5" namespace=k8s.io protocol=ttrpc version=3
May 14 18:12:32.053572 kubelet[3205]: E0514 18:12:32.052444 3205 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cl9vj" podUID="f8c0ddea-8d34-4405-b1d7-cbac1335ae76"
May 14 18:12:32.053136 systemd[1]: Started cri-containerd-1d8efa3580a3dbb5a81784584d78e829b8ff77c0db43fc6c5b428b807ffe9449.scope - libcontainer container 1d8efa3580a3dbb5a81784584d78e829b8ff77c0db43fc6c5b428b807ffe9449.
May 14 18:12:32.059894 containerd[1734]: time="2025-05-14T18:12:32.059872081Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-42jns,Uid:41e82bc8-542e-4aed-91e7-0d220ea19b51,Namespace:calico-system,Attempt:0,}"
May 14 18:12:32.089244 kubelet[3205]: E0514 18:12:32.089161 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 14 18:12:32.089244 kubelet[3205]: W0514 18:12:32.089176 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 14 18:12:32.089244 kubelet[3205]: E0514 18:12:32.089189 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 14 18:12:32.089459 kubelet[3205]: E0514 18:12:32.089407 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 14 18:12:32.089459 kubelet[3205]: W0514 18:12:32.089415 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 14 18:12:32.089459 kubelet[3205]: E0514 18:12:32.089424 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 14 18:12:32.089882 kubelet[3205]: E0514 18:12:32.089807 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 14 18:12:32.089882 kubelet[3205]: W0514 18:12:32.089822 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 14 18:12:32.089882 kubelet[3205]: E0514 18:12:32.089848 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 14 18:12:32.090103 kubelet[3205]: E0514 18:12:32.090080 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 14 18:12:32.090103 kubelet[3205]: W0514 18:12:32.090089 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 14 18:12:32.090253 kubelet[3205]: E0514 18:12:32.090169 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 14 18:12:32.090480 kubelet[3205]: E0514 18:12:32.090471 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 14 18:12:32.090540 kubelet[3205]: W0514 18:12:32.090528 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 14 18:12:32.090585 kubelet[3205]: E0514 18:12:32.090579 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 14 18:12:32.090722 kubelet[3205]: E0514 18:12:32.090712 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 14 18:12:32.090843 kubelet[3205]: W0514 18:12:32.090794 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 14 18:12:32.090843 kubelet[3205]: E0514 18:12:32.090803 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 14 18:12:32.091019 kubelet[3205]: E0514 18:12:32.090981 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 14 18:12:32.091019 kubelet[3205]: W0514 18:12:32.090988 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 14 18:12:32.091019 kubelet[3205]: E0514 18:12:32.090995 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 14 18:12:32.091221 kubelet[3205]: E0514 18:12:32.091176 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 14 18:12:32.091221 kubelet[3205]: W0514 18:12:32.091182 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 14 18:12:32.091221 kubelet[3205]: E0514 18:12:32.091189 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 14 18:12:32.091447 kubelet[3205]: E0514 18:12:32.091410 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 14 18:12:32.091447 kubelet[3205]: W0514 18:12:32.091417 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 14 18:12:32.091447 kubelet[3205]: E0514 18:12:32.091425 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 14 18:12:32.091609 kubelet[3205]: E0514 18:12:32.091597 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 14 18:12:32.091668 kubelet[3205]: W0514 18:12:32.091602 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 14 18:12:32.091668 kubelet[3205]: E0514 18:12:32.091649 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 14 18:12:32.091806 kubelet[3205]: E0514 18:12:32.091801 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 14 18:12:32.091844 kubelet[3205]: W0514 18:12:32.091839 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 14 18:12:32.091887 kubelet[3205]: E0514 18:12:32.091881 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 14 18:12:32.092025 kubelet[3205]: E0514 18:12:32.092005 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 14 18:12:32.092025 kubelet[3205]: W0514 18:12:32.092011 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 14 18:12:32.092109 kubelet[3205]: E0514 18:12:32.092019 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 14 18:12:32.092205 kubelet[3205]: E0514 18:12:32.092201 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 14 18:12:32.092278 kubelet[3205]: W0514 18:12:32.092236 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 14 18:12:32.092278 kubelet[3205]: E0514 18:12:32.092243 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 14 18:12:32.092408 kubelet[3205]: E0514 18:12:32.092392 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 14 18:12:32.092408 kubelet[3205]: W0514 18:12:32.092397 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 14 18:12:32.092512 kubelet[3205]: E0514 18:12:32.092482 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 14 18:12:32.092703 kubelet[3205]: E0514 18:12:32.092654 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 14 18:12:32.092703 kubelet[3205]: W0514 18:12:32.092665 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 14 18:12:32.092703 kubelet[3205]: E0514 18:12:32.092675 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 14 18:12:32.092956 kubelet[3205]: E0514 18:12:32.092910 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 14 18:12:32.092956 kubelet[3205]: W0514 18:12:32.092917 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 14 18:12:32.092956 kubelet[3205]: E0514 18:12:32.092924 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 14 18:12:32.093125 kubelet[3205]: E0514 18:12:32.093097 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 14 18:12:32.093125 kubelet[3205]: W0514 18:12:32.093102 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 14 18:12:32.093125 kubelet[3205]: E0514 18:12:32.093108 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 14 18:12:32.093302 kubelet[3205]: E0514 18:12:32.093273 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 14 18:12:32.093302 kubelet[3205]: W0514 18:12:32.093279 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 14 18:12:32.093302 kubelet[3205]: E0514 18:12:32.093284 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 18:12:32.093661 kubelet[3205]: E0514 18:12:32.093481 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.093661 kubelet[3205]: W0514 18:12:32.093492 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.093661 kubelet[3205]: E0514 18:12:32.093503 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 18:12:32.093938 kubelet[3205]: E0514 18:12:32.093832 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.093938 kubelet[3205]: W0514 18:12:32.093842 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.093938 kubelet[3205]: E0514 18:12:32.093852 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 18:12:32.094091 kubelet[3205]: E0514 18:12:32.094086 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.094121 kubelet[3205]: W0514 18:12:32.094117 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.094242 kubelet[3205]: E0514 18:12:32.094148 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 18:12:32.094242 kubelet[3205]: I0514 18:12:32.094202 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f8c0ddea-8d34-4405-b1d7-cbac1335ae76-socket-dir\") pod \"csi-node-driver-cl9vj\" (UID: \"f8c0ddea-8d34-4405-b1d7-cbac1335ae76\") " pod="calico-system/csi-node-driver-cl9vj" May 14 18:12:32.094380 kubelet[3205]: E0514 18:12:32.094365 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.094380 kubelet[3205]: W0514 18:12:32.094372 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.094471 kubelet[3205]: E0514 18:12:32.094429 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 18:12:32.094471 kubelet[3205]: I0514 18:12:32.094444 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddpzh\" (UniqueName: \"kubernetes.io/projected/f8c0ddea-8d34-4405-b1d7-cbac1335ae76-kube-api-access-ddpzh\") pod \"csi-node-driver-cl9vj\" (UID: \"f8c0ddea-8d34-4405-b1d7-cbac1335ae76\") " pod="calico-system/csi-node-driver-cl9vj" May 14 18:12:32.094620 kubelet[3205]: E0514 18:12:32.094604 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.094620 kubelet[3205]: W0514 18:12:32.094612 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.094725 kubelet[3205]: E0514 18:12:32.094679 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 18:12:32.094725 kubelet[3205]: I0514 18:12:32.094696 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f8c0ddea-8d34-4405-b1d7-cbac1335ae76-kubelet-dir\") pod \"csi-node-driver-cl9vj\" (UID: \"f8c0ddea-8d34-4405-b1d7-cbac1335ae76\") " pod="calico-system/csi-node-driver-cl9vj" May 14 18:12:32.094905 kubelet[3205]: E0514 18:12:32.094889 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.094905 kubelet[3205]: W0514 18:12:32.094897 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.095012 kubelet[3205]: E0514 18:12:32.094960 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 18:12:32.095012 kubelet[3205]: I0514 18:12:32.094977 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/f8c0ddea-8d34-4405-b1d7-cbac1335ae76-varrun\") pod \"csi-node-driver-cl9vj\" (UID: \"f8c0ddea-8d34-4405-b1d7-cbac1335ae76\") " pod="calico-system/csi-node-driver-cl9vj" May 14 18:12:32.095195 kubelet[3205]: E0514 18:12:32.095167 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.095195 kubelet[3205]: W0514 18:12:32.095184 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.095322 kubelet[3205]: E0514 18:12:32.095260 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 18:12:32.095322 kubelet[3205]: I0514 18:12:32.095277 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f8c0ddea-8d34-4405-b1d7-cbac1335ae76-registration-dir\") pod \"csi-node-driver-cl9vj\" (UID: \"f8c0ddea-8d34-4405-b1d7-cbac1335ae76\") " pod="calico-system/csi-node-driver-cl9vj" May 14 18:12:32.095532 kubelet[3205]: E0514 18:12:32.095516 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.095532 kubelet[3205]: W0514 18:12:32.095524 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.095622 kubelet[3205]: E0514 18:12:32.095598 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 18:12:32.095675 kubelet[3205]: E0514 18:12:32.095664 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.095675 kubelet[3205]: W0514 18:12:32.095673 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.095739 kubelet[3205]: E0514 18:12:32.095702 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 18:12:32.095788 kubelet[3205]: E0514 18:12:32.095784 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.095851 kubelet[3205]: W0514 18:12:32.095788 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.095851 kubelet[3205]: E0514 18:12:32.095802 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 18:12:32.095897 kubelet[3205]: E0514 18:12:32.095883 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.095897 kubelet[3205]: W0514 18:12:32.095888 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.095958 kubelet[3205]: E0514 18:12:32.095896 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 18:12:32.095981 kubelet[3205]: E0514 18:12:32.095966 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.095981 kubelet[3205]: W0514 18:12:32.095970 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.096024 kubelet[3205]: E0514 18:12:32.095980 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 18:12:32.096070 kubelet[3205]: E0514 18:12:32.096059 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.096070 kubelet[3205]: W0514 18:12:32.096065 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.096109 kubelet[3205]: E0514 18:12:32.096071 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 18:12:32.096178 kubelet[3205]: E0514 18:12:32.096169 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.096178 kubelet[3205]: W0514 18:12:32.096176 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.096223 kubelet[3205]: E0514 18:12:32.096182 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 18:12:32.096290 kubelet[3205]: E0514 18:12:32.096283 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.096290 kubelet[3205]: W0514 18:12:32.096289 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.096334 kubelet[3205]: E0514 18:12:32.096294 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 18:12:32.096389 kubelet[3205]: E0514 18:12:32.096382 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.096389 kubelet[3205]: W0514 18:12:32.096388 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.096429 kubelet[3205]: E0514 18:12:32.096393 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 18:12:32.096509 kubelet[3205]: E0514 18:12:32.096499 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.096509 kubelet[3205]: W0514 18:12:32.096507 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.096556 kubelet[3205]: E0514 18:12:32.096514 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 18:12:32.125641 containerd[1734]: time="2025-05-14T18:12:32.125618141Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-799d64d946-hm4hn,Uid:283b581c-0c91-42f4-bc6a-cdd938886eff,Namespace:calico-system,Attempt:0,} returns sandbox id \"1d8efa3580a3dbb5a81784584d78e829b8ff77c0db43fc6c5b428b807ffe9449\"" May 14 18:12:32.126662 containerd[1734]: time="2025-05-14T18:12:32.126635069Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.3\"" May 14 18:12:32.195689 kubelet[3205]: E0514 18:12:32.195674 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.195689 kubelet[3205]: W0514 18:12:32.195686 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.195807 kubelet[3205]: E0514 18:12:32.195697 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 18:12:32.195869 kubelet[3205]: E0514 18:12:32.195847 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.195869 kubelet[3205]: W0514 18:12:32.195867 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.195939 kubelet[3205]: E0514 18:12:32.195878 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 18:12:32.195969 kubelet[3205]: E0514 18:12:32.195961 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.195969 kubelet[3205]: W0514 18:12:32.195966 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.196061 kubelet[3205]: E0514 18:12:32.195972 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 18:12:32.196099 kubelet[3205]: E0514 18:12:32.196066 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.196099 kubelet[3205]: W0514 18:12:32.196073 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.196099 kubelet[3205]: E0514 18:12:32.196086 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 18:12:32.196199 kubelet[3205]: E0514 18:12:32.196173 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.196199 kubelet[3205]: W0514 18:12:32.196178 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.196199 kubelet[3205]: E0514 18:12:32.196187 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 18:12:32.196293 kubelet[3205]: E0514 18:12:32.196258 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.196293 kubelet[3205]: W0514 18:12:32.196262 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.196293 kubelet[3205]: E0514 18:12:32.196278 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 18:12:32.196438 kubelet[3205]: E0514 18:12:32.196431 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.196469 kubelet[3205]: W0514 18:12:32.196448 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.196469 kubelet[3205]: E0514 18:12:32.196460 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 18:12:32.196543 kubelet[3205]: E0514 18:12:32.196536 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.196565 kubelet[3205]: W0514 18:12:32.196543 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.196565 kubelet[3205]: E0514 18:12:32.196553 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 18:12:32.196641 kubelet[3205]: E0514 18:12:32.196631 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.196641 kubelet[3205]: W0514 18:12:32.196638 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.196692 kubelet[3205]: E0514 18:12:32.196646 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 18:12:32.196794 kubelet[3205]: E0514 18:12:32.196769 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.196794 kubelet[3205]: W0514 18:12:32.196775 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.196794 kubelet[3205]: E0514 18:12:32.196781 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 18:12:32.196884 kubelet[3205]: E0514 18:12:32.196859 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.196884 kubelet[3205]: W0514 18:12:32.196864 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.196884 kubelet[3205]: E0514 18:12:32.196871 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 18:12:32.197002 kubelet[3205]: E0514 18:12:32.196958 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.197002 kubelet[3205]: W0514 18:12:32.196962 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.197002 kubelet[3205]: E0514 18:12:32.196970 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 18:12:32.197066 kubelet[3205]: E0514 18:12:32.197045 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.197066 kubelet[3205]: W0514 18:12:32.197049 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.197066 kubelet[3205]: E0514 18:12:32.197055 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 18:12:32.197142 kubelet[3205]: E0514 18:12:32.197135 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.197142 kubelet[3205]: W0514 18:12:32.197141 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.197192 kubelet[3205]: E0514 18:12:32.197151 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 18:12:32.197284 kubelet[3205]: E0514 18:12:32.197277 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.197284 kubelet[3205]: W0514 18:12:32.197283 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.197330 kubelet[3205]: E0514 18:12:32.197297 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 18:12:32.197400 kubelet[3205]: E0514 18:12:32.197378 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.197400 kubelet[3205]: W0514 18:12:32.197398 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.197466 kubelet[3205]: E0514 18:12:32.197406 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 18:12:32.197502 kubelet[3205]: E0514 18:12:32.197494 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.197502 kubelet[3205]: W0514 18:12:32.197500 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.197577 kubelet[3205]: E0514 18:12:32.197569 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 18:12:32.197634 kubelet[3205]: E0514 18:12:32.197584 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.197634 kubelet[3205]: W0514 18:12:32.197605 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.197634 kubelet[3205]: E0514 18:12:32.197615 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 18:12:32.197767 kubelet[3205]: E0514 18:12:32.197676 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.197767 kubelet[3205]: W0514 18:12:32.197681 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.197767 kubelet[3205]: E0514 18:12:32.197692 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 18:12:32.197767 kubelet[3205]: E0514 18:12:32.197764 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.197767 kubelet[3205]: W0514 18:12:32.197768 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.197885 kubelet[3205]: E0514 18:12:32.197780 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 18:12:32.197885 kubelet[3205]: E0514 18:12:32.197857 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.197885 kubelet[3205]: W0514 18:12:32.197861 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.197885 kubelet[3205]: E0514 18:12:32.197874 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 18:12:32.197978 kubelet[3205]: E0514 18:12:32.197974 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.198006 kubelet[3205]: W0514 18:12:32.197978 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.198006 kubelet[3205]: E0514 18:12:32.197991 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 18:12:32.198073 kubelet[3205]: E0514 18:12:32.198064 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.198073 kubelet[3205]: W0514 18:12:32.198072 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.198129 kubelet[3205]: E0514 18:12:32.198077 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 18:12:32.198169 kubelet[3205]: E0514 18:12:32.198149 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.198169 kubelet[3205]: W0514 18:12:32.198154 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.198169 kubelet[3205]: E0514 18:12:32.198160 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 18:12:32.198299 kubelet[3205]: E0514 18:12:32.198268 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.198299 kubelet[3205]: W0514 18:12:32.198288 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.198299 kubelet[3205]: E0514 18:12:32.198295 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 18:12:32.215028 kubelet[3205]: E0514 18:12:32.214977 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:32.215028 kubelet[3205]: W0514 18:12:32.214991 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:32.215028 kubelet[3205]: E0514 18:12:32.215003 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 18:12:32.378911 containerd[1734]: time="2025-05-14T18:12:32.378801809Z" level=info msg="connecting to shim 164481cda957baa615a28b334f99e0d87a8f3ebbc03e6a6f5296cdc8a49411b7" address="unix:///run/containerd/s/379c474226fd13bf940cb8978440bcda9592673df3fd98ae004ef42a5b1c0f2d" namespace=k8s.io protocol=ttrpc version=3 May 14 18:12:32.396904 systemd[1]: Started cri-containerd-164481cda957baa615a28b334f99e0d87a8f3ebbc03e6a6f5296cdc8a49411b7.scope - libcontainer container 164481cda957baa615a28b334f99e0d87a8f3ebbc03e6a6f5296cdc8a49411b7. 
May 14 18:12:32.422246 containerd[1734]: time="2025-05-14T18:12:32.422228384Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-42jns,Uid:41e82bc8-542e-4aed-91e7-0d220ea19b51,Namespace:calico-system,Attempt:0,} returns sandbox id \"164481cda957baa615a28b334f99e0d87a8f3ebbc03e6a6f5296cdc8a49411b7\"" May 14 18:12:33.580430 kubelet[3205]: E0514 18:12:33.580385 3205 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cl9vj" podUID="f8c0ddea-8d34-4405-b1d7-cbac1335ae76" May 14 18:12:35.580507 kubelet[3205]: E0514 18:12:35.580319 3205 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cl9vj" podUID="f8c0ddea-8d34-4405-b1d7-cbac1335ae76" May 14 18:12:36.929586 containerd[1734]: time="2025-05-14T18:12:36.929332033Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 18:12:36.978206 containerd[1734]: time="2025-05-14T18:12:36.978173846Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.3: active requests=0, bytes read=30426870" May 14 18:12:36.981297 containerd[1734]: time="2025-05-14T18:12:36.981238332Z" level=info msg="ImageCreate event name:\"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 18:12:37.022473 containerd[1734]: time="2025-05-14T18:12:37.022431641Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f5516aa6a78f00931d2625f3012dcf2c69d141ce41483b8d59c6ec6330a18620\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 18:12:37.023027 containerd[1734]: time="2025-05-14T18:12:37.022950428Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.3\" with image id \"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f5516aa6a78f00931d2625f3012dcf2c69d141ce41483b8d59c6ec6330a18620\", size \"31919484\" in 4.896115303s" May 14 18:12:37.023027 containerd[1734]: time="2025-05-14T18:12:37.022974728Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.3\" returns image reference \"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\"" May 14 18:12:37.024088 containerd[1734]: time="2025-05-14T18:12:37.023987503Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\"" May 14 18:12:37.035035 containerd[1734]: time="2025-05-14T18:12:37.034915525Z" level=info msg="CreateContainer within sandbox \"1d8efa3580a3dbb5a81784584d78e829b8ff77c0db43fc6c5b428b807ffe9449\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" May 14 18:12:37.284991 containerd[1734]: time="2025-05-14T18:12:37.284214810Z" level=info msg="Container c796eb5cf37db77da78c5c54044ed582c680b16f715d4d105fd3341dc34a82ce: CDI devices from CRI Config.CDIDevices: []" May 14 18:12:37.535580 containerd[1734]: time="2025-05-14T18:12:37.535511873Z" level=info msg="CreateContainer within sandbox \"1d8efa3580a3dbb5a81784584d78e829b8ff77c0db43fc6c5b428b807ffe9449\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"c796eb5cf37db77da78c5c54044ed582c680b16f715d4d105fd3341dc34a82ce\"" May 14 18:12:37.535973 containerd[1734]: time="2025-05-14T18:12:37.535909766Z" level=info msg="StartContainer for \"c796eb5cf37db77da78c5c54044ed582c680b16f715d4d105fd3341dc34a82ce\"" May 14 18:12:37.536870 containerd[1734]: time="2025-05-14T18:12:37.536845643Z" level=info 
msg="connecting to shim c796eb5cf37db77da78c5c54044ed582c680b16f715d4d105fd3341dc34a82ce" address="unix:///run/containerd/s/ed38470e6da65c844c514524c5b37bd8c4a5da27d2a9e0504800be76976f62b5" protocol=ttrpc version=3 May 14 18:12:37.558912 systemd[1]: Started cri-containerd-c796eb5cf37db77da78c5c54044ed582c680b16f715d4d105fd3341dc34a82ce.scope - libcontainer container c796eb5cf37db77da78c5c54044ed582c680b16f715d4d105fd3341dc34a82ce. May 14 18:12:37.579940 kubelet[3205]: E0514 18:12:37.579893 3205 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cl9vj" podUID="f8c0ddea-8d34-4405-b1d7-cbac1335ae76" May 14 18:12:37.600960 containerd[1734]: time="2025-05-14T18:12:37.600674443Z" level=info msg="StartContainer for \"c796eb5cf37db77da78c5c54044ed582c680b16f715d4d105fd3341dc34a82ce\" returns successfully" May 14 18:12:37.733023 kubelet[3205]: E0514 18:12:37.733003 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:37.733023 kubelet[3205]: W0514 18:12:37.733022 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:37.733123 kubelet[3205]: E0514 18:12:37.733037 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 18:12:37.733848 kubelet[3205]: E0514 18:12:37.733154 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:37.733848 kubelet[3205]: W0514 18:12:37.733161 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:37.733848 kubelet[3205]: E0514 18:12:37.733167 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 18:12:37.733848 kubelet[3205]: E0514 18:12:37.733254 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:37.733848 kubelet[3205]: W0514 18:12:37.733258 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:37.733848 kubelet[3205]: E0514 18:12:37.733264 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 18:12:37.733848 kubelet[3205]: E0514 18:12:37.733344 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:37.733848 kubelet[3205]: W0514 18:12:37.733348 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:37.733848 kubelet[3205]: E0514 18:12:37.733353 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 18:12:37.733848 kubelet[3205]: E0514 18:12:37.733437 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:37.734100 kubelet[3205]: W0514 18:12:37.733441 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:37.734100 kubelet[3205]: E0514 18:12:37.733474 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 18:12:37.734100 kubelet[3205]: E0514 18:12:37.733558 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:37.734100 kubelet[3205]: W0514 18:12:37.733565 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:37.734100 kubelet[3205]: E0514 18:12:37.733571 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 18:12:37.734100 kubelet[3205]: E0514 18:12:37.733651 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:37.734100 kubelet[3205]: W0514 18:12:37.733656 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:37.734100 kubelet[3205]: E0514 18:12:37.733663 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 18:12:37.734100 kubelet[3205]: E0514 18:12:37.733761 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:37.734100 kubelet[3205]: W0514 18:12:37.733772 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:37.734296 kubelet[3205]: E0514 18:12:37.733780 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 18:12:37.734411 kubelet[3205]: E0514 18:12:37.734399 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:37.734434 kubelet[3205]: W0514 18:12:37.734412 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:37.734434 kubelet[3205]: E0514 18:12:37.734424 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 18:12:37.734548 kubelet[3205]: E0514 18:12:37.734541 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:37.734573 kubelet[3205]: W0514 18:12:37.734548 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:37.734573 kubelet[3205]: E0514 18:12:37.734555 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 18:12:37.734828 kubelet[3205]: E0514 18:12:37.734638 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:37.734828 kubelet[3205]: W0514 18:12:37.734643 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:37.734828 kubelet[3205]: E0514 18:12:37.734649 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 18:12:37.734828 kubelet[3205]: E0514 18:12:37.734729 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:37.734828 kubelet[3205]: W0514 18:12:37.734734 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:37.734828 kubelet[3205]: E0514 18:12:37.734739 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 18:12:37.734969 kubelet[3205]: E0514 18:12:37.734904 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:37.734969 kubelet[3205]: W0514 18:12:37.734909 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:37.734969 kubelet[3205]: E0514 18:12:37.734917 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 18:12:37.735124 kubelet[3205]: E0514 18:12:37.735077 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:37.735124 kubelet[3205]: W0514 18:12:37.735086 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:37.735124 kubelet[3205]: E0514 18:12:37.735094 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 18:12:37.735254 kubelet[3205]: E0514 18:12:37.735249 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:37.735318 kubelet[3205]: W0514 18:12:37.735280 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:37.735318 kubelet[3205]: E0514 18:12:37.735288 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 18:12:37.736486 kubelet[3205]: E0514 18:12:37.736473 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:37.736652 kubelet[3205]: W0514 18:12:37.736527 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:37.736652 kubelet[3205]: E0514 18:12:37.736541 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 18:12:37.736872 kubelet[3205]: E0514 18:12:37.736855 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:37.736872 kubelet[3205]: W0514 18:12:37.736863 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:37.736991 kubelet[3205]: E0514 18:12:37.736919 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 18:12:37.737137 kubelet[3205]: E0514 18:12:37.737122 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:37.737137 kubelet[3205]: W0514 18:12:37.737129 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:37.737256 kubelet[3205]: E0514 18:12:37.737172 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 18:12:37.737470 kubelet[3205]: E0514 18:12:37.737354 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:37.737470 kubelet[3205]: W0514 18:12:37.737361 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:37.737470 kubelet[3205]: E0514 18:12:37.737418 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 18:12:37.737590 kubelet[3205]: E0514 18:12:37.737541 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:37.737590 kubelet[3205]: W0514 18:12:37.737548 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:37.737590 kubelet[3205]: E0514 18:12:37.737557 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 18:12:37.737819 kubelet[3205]: E0514 18:12:37.737669 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:37.737819 kubelet[3205]: W0514 18:12:37.737675 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:37.737819 kubelet[3205]: E0514 18:12:37.737689 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 18:12:37.737907 kubelet[3205]: E0514 18:12:37.737890 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:37.737907 kubelet[3205]: W0514 18:12:37.737901 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:37.737968 kubelet[3205]: E0514 18:12:37.737948 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 18:12:37.738058 kubelet[3205]: E0514 18:12:37.738013 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:37.738058 kubelet[3205]: W0514 18:12:37.738020 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:37.738095 kubelet[3205]: E0514 18:12:37.738086 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 18:12:37.738172 kubelet[3205]: E0514 18:12:37.738159 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:37.738172 kubelet[3205]: W0514 18:12:37.738166 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:37.738229 kubelet[3205]: E0514 18:12:37.738182 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 18:12:37.738307 kubelet[3205]: E0514 18:12:37.738280 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:37.738307 kubelet[3205]: W0514 18:12:37.738302 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:37.738374 kubelet[3205]: E0514 18:12:37.738311 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 18:12:37.738404 kubelet[3205]: E0514 18:12:37.738400 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:37.738424 kubelet[3205]: W0514 18:12:37.738405 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:37.738424 kubelet[3205]: E0514 18:12:37.738412 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 18:12:37.738540 kubelet[3205]: E0514 18:12:37.738515 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:37.738540 kubelet[3205]: W0514 18:12:37.738536 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:37.738599 kubelet[3205]: E0514 18:12:37.738544 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 18:12:37.738875 kubelet[3205]: E0514 18:12:37.738861 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:37.738875 kubelet[3205]: W0514 18:12:37.738873 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:37.738947 kubelet[3205]: E0514 18:12:37.738890 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 18:12:37.739006 kubelet[3205]: E0514 18:12:37.738997 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:37.739006 kubelet[3205]: W0514 18:12:37.739004 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:37.739052 kubelet[3205]: E0514 18:12:37.739017 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 18:12:37.739106 kubelet[3205]: E0514 18:12:37.739098 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:37.739106 kubelet[3205]: W0514 18:12:37.739104 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:37.739144 kubelet[3205]: E0514 18:12:37.739116 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 18:12:37.739228 kubelet[3205]: E0514 18:12:37.739220 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:37.739228 kubelet[3205]: W0514 18:12:37.739227 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:37.739268 kubelet[3205]: E0514 18:12:37.739239 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 18:12:37.739388 kubelet[3205]: E0514 18:12:37.739379 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:37.739415 kubelet[3205]: W0514 18:12:37.739407 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:37.739435 kubelet[3205]: E0514 18:12:37.739423 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 18:12:37.739562 kubelet[3205]: E0514 18:12:37.739537 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:37.739562 kubelet[3205]: W0514 18:12:37.739559 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:37.739605 kubelet[3205]: E0514 18:12:37.739566 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 18:12:38.645492 kubelet[3205]: I0514 18:12:38.645469 3205 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 14 18:12:38.741340 kubelet[3205]: E0514 18:12:38.741318 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:38.741340 kubelet[3205]: W0514 18:12:38.741334 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:38.741463 kubelet[3205]: E0514 18:12:38.741349 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 18:12:38.741463 kubelet[3205]: E0514 18:12:38.741442 3205 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 18:12:38.741463 kubelet[3205]: W0514 18:12:38.741447 3205 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 18:12:38.741463 kubelet[3205]: E0514 18:12:38.741454 3205 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 18:12:39.580188 kubelet[3205]: E0514 18:12:39.580135 3205 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cl9vj" podUID="f8c0ddea-8d34-4405-b1d7-cbac1335ae76" May 14 18:12:40.122965 containerd[1734]: time="2025-05-14T18:12:40.122924421Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 18:12:40.127640 containerd[1734]: time="2025-05-14T18:12:40.127608619Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3: active requests=0, bytes read=5366937" May 14 18:12:40.173932 containerd[1734]: time="2025-05-14T18:12:40.173876072Z" level=info msg="ImageCreate event name:\"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 18:12:40.217287 containerd[1734]: time="2025-05-14T18:12:40.217257338Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:eeaa2bb4f9b1aa61adde43ce6dea95eee89291f96963548e108d9a2dfbc5edd1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 18:12:40.217755 containerd[1734]: time="2025-05-14T18:12:40.217686874Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" with image id \"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:eeaa2bb4f9b1aa61adde43ce6dea95eee89291f96963548e108d9a2dfbc5edd1\", size \"6859519\" in 3.193659415s" May 14 18:12:40.217755 containerd[1734]: time="2025-05-14T18:12:40.217712174Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" returns image reference \"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\"" May 14 18:12:40.219549 containerd[1734]: time="2025-05-14T18:12:40.219512896Z" level=info msg="CreateContainer within sandbox \"164481cda957baa615a28b334f99e0d87a8f3ebbc03e6a6f5296cdc8a49411b7\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" May 14 18:12:40.379800 containerd[1734]: time="2025-05-14T18:12:40.379617420Z" level=info msg="Container 9ac6799244715ab40c2f32cb277da464ee1b5dec9980e5bd21fde2c85cd4694b: CDI devices from CRI Config.CDIDevices: []" May 14 18:12:40.519778 containerd[1734]: time="2025-05-14T18:12:40.519717509Z" level=info msg="CreateContainer within sandbox \"164481cda957baa615a28b334f99e0d87a8f3ebbc03e6a6f5296cdc8a49411b7\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"9ac6799244715ab40c2f32cb277da464ee1b5dec9980e5bd21fde2c85cd4694b\"" May 14 18:12:40.520056 containerd[1734]: time="2025-05-14T18:12:40.520035202Z" level=info msg="StartContainer for \"9ac6799244715ab40c2f32cb277da464ee1b5dec9980e5bd21fde2c85cd4694b\"" May 14 18:12:40.521410 containerd[1734]: 
time="2025-05-14T18:12:40.521351354Z" level=info msg="connecting to shim 9ac6799244715ab40c2f32cb277da464ee1b5dec9980e5bd21fde2c85cd4694b" address="unix:///run/containerd/s/379c474226fd13bf940cb8978440bcda9592673df3fd98ae004ef42a5b1c0f2d" protocol=ttrpc version=3 May 14 18:12:40.542864 systemd[1]: Started cri-containerd-9ac6799244715ab40c2f32cb277da464ee1b5dec9980e5bd21fde2c85cd4694b.scope - libcontainer container 9ac6799244715ab40c2f32cb277da464ee1b5dec9980e5bd21fde2c85cd4694b. May 14 18:12:40.570336 containerd[1734]: time="2025-05-14T18:12:40.570308705Z" level=info msg="StartContainer for \"9ac6799244715ab40c2f32cb277da464ee1b5dec9980e5bd21fde2c85cd4694b\" returns successfully" May 14 18:12:40.574988 systemd[1]: cri-containerd-9ac6799244715ab40c2f32cb277da464ee1b5dec9980e5bd21fde2c85cd4694b.scope: Deactivated successfully. May 14 18:12:40.577766 containerd[1734]: time="2025-05-14T18:12:40.577726459Z" level=info msg="received exit event container_id:\"9ac6799244715ab40c2f32cb277da464ee1b5dec9980e5bd21fde2c85cd4694b\" id:\"9ac6799244715ab40c2f32cb277da464ee1b5dec9980e5bd21fde2c85cd4694b\" pid:3894 exited_at:{seconds:1747246360 nanos:577401545}" May 14 18:12:40.577914 containerd[1734]: time="2025-05-14T18:12:40.577822716Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9ac6799244715ab40c2f32cb277da464ee1b5dec9980e5bd21fde2c85cd4694b\" id:\"9ac6799244715ab40c2f32cb277da464ee1b5dec9980e5bd21fde2c85cd4694b\" pid:3894 exited_at:{seconds:1747246360 nanos:577401545}" May 14 18:12:40.593035 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9ac6799244715ab40c2f32cb277da464ee1b5dec9980e5bd21fde2c85cd4694b-rootfs.mount: Deactivated successfully. 
May 14 18:12:40.662584 kubelet[3205]: I0514 18:12:40.660714 3205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-799d64d946-hm4hn" podStartSLOduration=4.763492217 podStartE2EDuration="9.660694325s" podCreationTimestamp="2025-05-14 18:12:31 +0000 UTC" firstStartedPulling="2025-05-14 18:12:32.126417445 +0000 UTC m=+19.618940793" lastFinishedPulling="2025-05-14 18:12:37.023619547 +0000 UTC m=+24.516142901" observedRunningTime="2025-05-14 18:12:37.664530783 +0000 UTC m=+25.157054143" watchObservedRunningTime="2025-05-14 18:12:40.660694325 +0000 UTC m=+28.153217753" May 14 18:12:41.579868 kubelet[3205]: E0514 18:12:41.579830 3205 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cl9vj" podUID="f8c0ddea-8d34-4405-b1d7-cbac1335ae76" May 14 18:12:43.579527 kubelet[3205]: E0514 18:12:43.579497 3205 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cl9vj" podUID="f8c0ddea-8d34-4405-b1d7-cbac1335ae76" May 14 18:12:45.579648 kubelet[3205]: E0514 18:12:45.579584 3205 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cl9vj" podUID="f8c0ddea-8d34-4405-b1d7-cbac1335ae76" May 14 18:12:47.580146 kubelet[3205]: E0514 18:12:47.580051 3205 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cl9vj" podUID="f8c0ddea-8d34-4405-b1d7-cbac1335ae76" May 14 18:12:49.579760 kubelet[3205]: E0514 18:12:49.579708 3205 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cl9vj" podUID="f8c0ddea-8d34-4405-b1d7-cbac1335ae76" May 14 18:12:50.577949 containerd[1734]: time="2025-05-14T18:12:50.577894784Z" level=error msg="failed to handle container TaskExit event container_id:\"9ac6799244715ab40c2f32cb277da464ee1b5dec9980e5bd21fde2c85cd4694b\" id:\"9ac6799244715ab40c2f32cb277da464ee1b5dec9980e5bd21fde2c85cd4694b\" pid:3894 exited_at:{seconds:1747246360 nanos:577401545}" error="failed to stop container: failed to delete task: context deadline exceeded" May 14 18:12:51.580037 kubelet[3205]: E0514 18:12:51.579993 3205 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cl9vj" podUID="f8c0ddea-8d34-4405-b1d7-cbac1335ae76" May 14 18:12:52.502815 containerd[1734]: time="2025-05-14T18:12:52.502740875Z" level=info msg="TaskExit event container_id:\"9ac6799244715ab40c2f32cb277da464ee1b5dec9980e5bd21fde2c85cd4694b\" id:\"9ac6799244715ab40c2f32cb277da464ee1b5dec9980e5bd21fde2c85cd4694b\" pid:3894 exited_at:{seconds:1747246360 nanos:577401545}" May 14 18:12:53.579729 kubelet[3205]: E0514 18:12:53.579698 3205 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" 
pod="calico-system/csi-node-driver-cl9vj" podUID="f8c0ddea-8d34-4405-b1d7-cbac1335ae76"
May 14 18:12:54.503282 containerd[1734]: time="2025-05-14T18:12:54.503239465Z" level=error msg="get state for 9ac6799244715ab40c2f32cb277da464ee1b5dec9980e5bd21fde2c85cd4694b" error="context deadline exceeded"
May 14 18:12:54.503282 containerd[1734]: time="2025-05-14T18:12:54.503271019Z" level=warning msg="unknown status" status=0
May 14 18:12:55.580263 kubelet[3205]: E0514 18:12:55.580193 3205 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cl9vj" podUID="f8c0ddea-8d34-4405-b1d7-cbac1335ae76"
May 14 18:12:56.504286 containerd[1734]: time="2025-05-14T18:12:56.504216280Z" level=error msg="get state for 9ac6799244715ab40c2f32cb277da464ee1b5dec9980e5bd21fde2c85cd4694b" error="context deadline exceeded"
May 14 18:12:56.504286 containerd[1734]: time="2025-05-14T18:12:56.504275435Z" level=warning msg="unknown status" status=0
May 14 18:12:57.579909 kubelet[3205]: E0514 18:12:57.579844 3205 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cl9vj" podUID="f8c0ddea-8d34-4405-b1d7-cbac1335ae76"
May 14 18:12:58.026589 kubelet[3205]: I0514 18:12:58.026317 3205 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
May 14 18:12:58.118116 containerd[1734]: time="2025-05-14T18:12:58.118058835Z" level=error msg="ttrpc: received message on inactive stream" stream=31
May 14 18:12:58.118116 containerd[1734]: time="2025-05-14T18:12:58.118115157Z" level=error msg="ttrpc: received message on inactive stream" stream=35
May 14 18:12:58.118575 containerd[1734]: time="2025-05-14T18:12:58.118123354Z" level=error msg="ttrpc: received message on inactive stream" stream=39
May 14 18:12:58.119374 containerd[1734]: time="2025-05-14T18:12:58.119346498Z" level=info msg="Ensure that container 9ac6799244715ab40c2f32cb277da464ee1b5dec9980e5bd21fde2c85cd4694b in task-service has been cleanup successfully"
May 14 18:12:58.678502 containerd[1734]: time="2025-05-14T18:12:58.678468274Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.3\""
May 14 18:12:59.579625 kubelet[3205]: E0514 18:12:59.579593 3205 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cl9vj" podUID="f8c0ddea-8d34-4405-b1d7-cbac1335ae76"
May 14 18:13:01.579860 kubelet[3205]: E0514 18:13:01.579816 3205 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cl9vj" podUID="f8c0ddea-8d34-4405-b1d7-cbac1335ae76"
May 14 18:13:03.579711 kubelet[3205]: E0514 18:13:03.579679 3205 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cl9vj" podUID="f8c0ddea-8d34-4405-b1d7-cbac1335ae76"
May 14 18:13:05.579563 kubelet[3205]: E0514 18:13:05.579519 3205 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cl9vj" podUID="f8c0ddea-8d34-4405-b1d7-cbac1335ae76"
May 14 18:13:07.580594 kubelet[3205]: E0514 18:13:07.580558 3205 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cl9vj" podUID="f8c0ddea-8d34-4405-b1d7-cbac1335ae76"
May 14 18:13:08.523650 containerd[1734]: time="2025-05-14T18:13:08.523591543Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 14 18:13:08.526093 containerd[1734]: time="2025-05-14T18:13:08.526066281Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.3: active requests=0, bytes read=97793683"
May 14 18:13:08.572880 containerd[1734]: time="2025-05-14T18:13:08.572814082Z" level=info msg="ImageCreate event name:\"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 14 18:13:08.619091 containerd[1734]: time="2025-05-14T18:13:08.619047530Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:4505ec8f976470994b6a94295a4dabac0cb98375db050e959a22603e00ada90b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 14 18:13:08.619636 containerd[1734]: time="2025-05-14T18:13:08.619559706Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.3\" with image id \"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:4505ec8f976470994b6a94295a4dabac0cb98375db050e959a22603e00ada90b\", size \"99286305\" in 9.941056642s"
May 14 18:13:08.619636 containerd[1734]: time="2025-05-14T18:13:08.619584105Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.3\" returns image reference \"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\""
May 14 18:13:08.621527 containerd[1734]: time="2025-05-14T18:13:08.621502898Z" level=info msg="CreateContainer within sandbox \"164481cda957baa615a28b334f99e0d87a8f3ebbc03e6a6f5296cdc8a49411b7\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
May 14 18:13:08.827030 containerd[1734]: time="2025-05-14T18:13:08.826971012Z" level=info msg="Container 8a61e7da802874ef32356e1e4c8ae6f90c1047af98bf404c4073fe2b878a69ad: CDI devices from CRI Config.CDIDevices: []"
May 14 18:13:08.979562 containerd[1734]: time="2025-05-14T18:13:08.979541619Z" level=info msg="CreateContainer within sandbox \"164481cda957baa615a28b334f99e0d87a8f3ebbc03e6a6f5296cdc8a49411b7\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"8a61e7da802874ef32356e1e4c8ae6f90c1047af98bf404c4073fe2b878a69ad\""
May 14 18:13:08.979937 containerd[1734]: time="2025-05-14T18:13:08.979876869Z" level=info msg="StartContainer for \"8a61e7da802874ef32356e1e4c8ae6f90c1047af98bf404c4073fe2b878a69ad\""
May 14 18:13:08.981234 containerd[1734]: time="2025-05-14T18:13:08.981169884Z" level=info msg="connecting to shim 8a61e7da802874ef32356e1e4c8ae6f90c1047af98bf404c4073fe2b878a69ad" address="unix:///run/containerd/s/379c474226fd13bf940cb8978440bcda9592673df3fd98ae004ef42a5b1c0f2d" protocol=ttrpc version=3
May 14 18:13:09.001862 systemd[1]: Started cri-containerd-8a61e7da802874ef32356e1e4c8ae6f90c1047af98bf404c4073fe2b878a69ad.scope - libcontainer container 8a61e7da802874ef32356e1e4c8ae6f90c1047af98bf404c4073fe2b878a69ad.
May 14 18:13:09.033541 containerd[1734]: time="2025-05-14T18:13:09.033522063Z" level=info msg="StartContainer for \"8a61e7da802874ef32356e1e4c8ae6f90c1047af98bf404c4073fe2b878a69ad\" returns successfully"
May 14 18:13:09.579636 kubelet[3205]: E0514 18:13:09.579583 3205 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cl9vj" podUID="f8c0ddea-8d34-4405-b1d7-cbac1335ae76"
May 14 18:13:11.580052 kubelet[3205]: E0514 18:13:11.579988 3205 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cl9vj" podUID="f8c0ddea-8d34-4405-b1d7-cbac1335ae76"
May 14 18:13:13.579608 kubelet[3205]: E0514 18:13:13.579564 3205 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cl9vj" podUID="f8c0ddea-8d34-4405-b1d7-cbac1335ae76"
May 14 18:13:15.579768 kubelet[3205]: E0514 18:13:15.579706 3205 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cl9vj" podUID="f8c0ddea-8d34-4405-b1d7-cbac1335ae76"
May 14 18:13:17.579959 kubelet[3205]: E0514 18:13:17.579920 3205 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cl9vj" podUID="f8c0ddea-8d34-4405-b1d7-cbac1335ae76"
May 14 18:13:18.044380 systemd[1]: cri-containerd-8a61e7da802874ef32356e1e4c8ae6f90c1047af98bf404c4073fe2b878a69ad.scope: Deactivated successfully.
May 14 18:13:18.044644 systemd[1]: cri-containerd-8a61e7da802874ef32356e1e4c8ae6f90c1047af98bf404c4073fe2b878a69ad.scope: Consumed 336ms CPU time, 172.8M memory peak, 154M written to disk.
May 14 18:13:18.045701 containerd[1734]: time="2025-05-14T18:13:18.045671670Z" level=info msg="received exit event container_id:\"8a61e7da802874ef32356e1e4c8ae6f90c1047af98bf404c4073fe2b878a69ad\" id:\"8a61e7da802874ef32356e1e4c8ae6f90c1047af98bf404c4073fe2b878a69ad\" pid:3958 exited_at:{seconds:1747246398 nanos:45496526}"
May 14 18:13:18.045993 containerd[1734]: time="2025-05-14T18:13:18.045961604Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8a61e7da802874ef32356e1e4c8ae6f90c1047af98bf404c4073fe2b878a69ad\" id:\"8a61e7da802874ef32356e1e4c8ae6f90c1047af98bf404c4073fe2b878a69ad\" pid:3958 exited_at:{seconds:1747246398 nanos:45496526}"
May 14 18:13:18.064705 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-8a61e7da802874ef32356e1e4c8ae6f90c1047af98bf404c4073fe2b878a69ad-rootfs.mount: Deactivated successfully.
May 14 18:13:18.064984 kubelet[3205]: I0514 18:13:18.064962 3205 kubelet_node_status.go:488] "Fast updating node status as it just became ready"
May 14 18:13:18.101806 systemd[1]: Created slice kubepods-burstable-pod02387ae7_7df0_4320_8070_8518fe0bd4d8.slice - libcontainer container kubepods-burstable-pod02387ae7_7df0_4320_8070_8518fe0bd4d8.slice.
May 14 18:13:18.113303 systemd[1]: Created slice kubepods-besteffort-pod0272f273_8188_48f8_bfc1_35ee033246c4.slice - libcontainer container kubepods-besteffort-pod0272f273_8188_48f8_bfc1_35ee033246c4.slice.
May 14 18:13:18.118308 systemd[1]: Created slice kubepods-burstable-podf855f8c6_0735_4302_b463_7204c3026e15.slice - libcontainer container kubepods-burstable-podf855f8c6_0735_4302_b463_7204c3026e15.slice.
May 14 18:13:18.126276 systemd[1]: Created slice kubepods-besteffort-pod0c3f601d_87ef_4fb7_a83d_21758b09a12d.slice - libcontainer container kubepods-besteffort-pod0c3f601d_87ef_4fb7_a83d_21758b09a12d.slice.
May 14 18:13:18.130312 systemd[1]: Created slice kubepods-besteffort-podb7e2c3b5_0b2e_4624_9ffb_7b15307add96.slice - libcontainer container kubepods-besteffort-podb7e2c3b5_0b2e_4624_9ffb_7b15307add96.slice.
May 14 18:13:18.252614 kubelet[3205]: I0514 18:13:18.252587 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z8vb\" (UniqueName: \"kubernetes.io/projected/0272f273-8188-48f8-bfc1-35ee033246c4-kube-api-access-8z8vb\") pod \"calico-apiserver-6cb7cd99bf-2dhkq\" (UID: \"0272f273-8188-48f8-bfc1-35ee033246c4\") " pod="calico-apiserver/calico-apiserver-6cb7cd99bf-2dhkq"
May 14 18:13:18.252614 kubelet[3205]: I0514 18:13:18.252615 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jghr\" (UniqueName: \"kubernetes.io/projected/f855f8c6-0735-4302-b463-7204c3026e15-kube-api-access-2jghr\") pod \"coredns-6f6b679f8f-clvtl\" (UID: \"f855f8c6-0735-4302-b463-7204c3026e15\") " pod="kube-system/coredns-6f6b679f8f-clvtl"
May 14 18:13:18.252716 kubelet[3205]: I0514 18:13:18.252632 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/0c3f601d-87ef-4fb7-a83d-21758b09a12d-calico-apiserver-certs\") pod \"calico-apiserver-6cb7cd99bf-z6fr6\" (UID: \"0c3f601d-87ef-4fb7-a83d-21758b09a12d\") " pod="calico-apiserver/calico-apiserver-6cb7cd99bf-z6fr6"
May 14 18:13:18.252716 kubelet[3205]: I0514 18:13:18.252648 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/02387ae7-7df0-4320-8070-8518fe0bd4d8-config-volume\") pod \"coredns-6f6b679f8f-v4tq6\" (UID: \"02387ae7-7df0-4320-8070-8518fe0bd4d8\") " pod="kube-system/coredns-6f6b679f8f-v4tq6"
May 14 18:13:18.252716 kubelet[3205]: I0514 18:13:18.252667 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tgpf\" (UniqueName: \"kubernetes.io/projected/02387ae7-7df0-4320-8070-8518fe0bd4d8-kube-api-access-2tgpf\") pod \"coredns-6f6b679f8f-v4tq6\" (UID: \"02387ae7-7df0-4320-8070-8518fe0bd4d8\") " pod="kube-system/coredns-6f6b679f8f-v4tq6"
May 14 18:13:18.252716 kubelet[3205]: I0514 18:13:18.252681 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/0272f273-8188-48f8-bfc1-35ee033246c4-calico-apiserver-certs\") pod \"calico-apiserver-6cb7cd99bf-2dhkq\" (UID: \"0272f273-8188-48f8-bfc1-35ee033246c4\") " pod="calico-apiserver/calico-apiserver-6cb7cd99bf-2dhkq"
May 14 18:13:18.252716 kubelet[3205]: I0514 18:13:18.252699 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7e2c3b5-0b2e-4624-9ffb-7b15307add96-tigera-ca-bundle\") pod \"calico-kube-controllers-58d5b579d7-9wc6k\" (UID: \"b7e2c3b5-0b2e-4624-9ffb-7b15307add96\") " pod="calico-system/calico-kube-controllers-58d5b579d7-9wc6k"
May 14 18:13:18.252838 kubelet[3205]: I0514 18:13:18.252716 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f855f8c6-0735-4302-b463-7204c3026e15-config-volume\") pod \"coredns-6f6b679f8f-clvtl\" (UID: \"f855f8c6-0735-4302-b463-7204c3026e15\") " pod="kube-system/coredns-6f6b679f8f-clvtl"
May 14 18:13:18.252838 kubelet[3205]: I0514 18:13:18.252733 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2tmh\" (UniqueName: \"kubernetes.io/projected/b7e2c3b5-0b2e-4624-9ffb-7b15307add96-kube-api-access-z2tmh\") pod \"calico-kube-controllers-58d5b579d7-9wc6k\" (UID: \"b7e2c3b5-0b2e-4624-9ffb-7b15307add96\") " pod="calico-system/calico-kube-controllers-58d5b579d7-9wc6k"
May 14 18:13:18.252838 kubelet[3205]: I0514 18:13:18.252767 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz7gn\" (UniqueName: \"kubernetes.io/projected/0c3f601d-87ef-4fb7-a83d-21758b09a12d-kube-api-access-tz7gn\") pod \"calico-apiserver-6cb7cd99bf-z6fr6\" (UID: \"0c3f601d-87ef-4fb7-a83d-21758b09a12d\") " pod="calico-apiserver/calico-apiserver-6cb7cd99bf-z6fr6"
May 14 18:13:18.406085 containerd[1734]: time="2025-05-14T18:13:18.406060659Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-v4tq6,Uid:02387ae7-7df0-4320-8070-8518fe0bd4d8,Namespace:kube-system,Attempt:0,}"
May 14 18:13:18.416516 containerd[1734]: time="2025-05-14T18:13:18.416492400Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6cb7cd99bf-2dhkq,Uid:0272f273-8188-48f8-bfc1-35ee033246c4,Namespace:calico-apiserver,Attempt:0,}"
May 14 18:13:18.423145 containerd[1734]: time="2025-05-14T18:13:18.423121013Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-clvtl,Uid:f855f8c6-0735-4302-b463-7204c3026e15,Namespace:kube-system,Attempt:0,}"
May 14 18:13:18.429782 containerd[1734]: time="2025-05-14T18:13:18.429759836Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6cb7cd99bf-z6fr6,Uid:0c3f601d-87ef-4fb7-a83d-21758b09a12d,Namespace:calico-apiserver,Attempt:0,}"
May 14 18:13:18.432205 containerd[1734]: time="2025-05-14T18:13:18.432182685Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-58d5b579d7-9wc6k,Uid:b7e2c3b5-0b2e-4624-9ffb-7b15307add96,Namespace:calico-system,Attempt:0,}"
May 14 18:13:19.584524 systemd[1]: Created slice kubepods-besteffort-podf8c0ddea_8d34_4405_b1d7_cbac1335ae76.slice - libcontainer container kubepods-besteffort-podf8c0ddea_8d34_4405_b1d7_cbac1335ae76.slice.
May 14 18:13:19.586287 containerd[1734]: time="2025-05-14T18:13:19.586255935Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cl9vj,Uid:f8c0ddea-8d34-4405-b1d7-cbac1335ae76,Namespace:calico-system,Attempt:0,}"
May 14 18:13:27.101726 containerd[1734]: time="2025-05-14T18:13:27.101682925Z" level=error msg="Failed to destroy network for sandbox \"796690cb85cecd4ac93429c07eca5c117435a34a82f4d93dd06145b825c61932\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 14 18:13:27.103830 systemd[1]: run-netns-cni\x2d3378496d\x2d1261\x2d9b82\x2dfd58\x2d4177099e06a3.mount: Deactivated successfully.
May 14 18:13:28.101297 containerd[1734]: time="2025-05-14T18:13:28.101257476Z" level=error msg="Failed to destroy network for sandbox \"5caf8fa4d2f9dee64e2ec9f0615c1ab6a6a72b1beb146e8491abd1561d5ba46a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 14 18:13:28.103386 systemd[1]: run-netns-cni\x2d04ab954f\x2d16df\x2dd2cd\x2d8024\x2d6f4dd7f961fd.mount: Deactivated successfully.
May 14 18:13:28.129227 containerd[1734]: time="2025-05-14T18:13:28.129184297Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6cb7cd99bf-2dhkq,Uid:0272f273-8188-48f8-bfc1-35ee033246c4,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"796690cb85cecd4ac93429c07eca5c117435a34a82f4d93dd06145b825c61932\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 14 18:13:28.129512 kubelet[3205]: E0514 18:13:28.129405 3205 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"796690cb85cecd4ac93429c07eca5c117435a34a82f4d93dd06145b825c61932\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 14 18:13:28.129512 kubelet[3205]: E0514 18:13:28.129477 3205 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"796690cb85cecd4ac93429c07eca5c117435a34a82f4d93dd06145b825c61932\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6cb7cd99bf-2dhkq"
May 14 18:13:28.129512 kubelet[3205]: E0514 18:13:28.129495 3205 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"796690cb85cecd4ac93429c07eca5c117435a34a82f4d93dd06145b825c61932\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6cb7cd99bf-2dhkq"
May 14 18:13:28.129728 kubelet[3205]: E0514 18:13:28.129537 3205 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6cb7cd99bf-2dhkq_calico-apiserver(0272f273-8188-48f8-bfc1-35ee033246c4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6cb7cd99bf-2dhkq_calico-apiserver(0272f273-8188-48f8-bfc1-35ee033246c4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"796690cb85cecd4ac93429c07eca5c117435a34a82f4d93dd06145b825c61932\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6cb7cd99bf-2dhkq" podUID="0272f273-8188-48f8-bfc1-35ee033246c4"
May 14 18:13:28.207308 containerd[1734]: time="2025-05-14T18:13:28.207274630Z" level=error msg="Failed to destroy network for sandbox \"b7d0d11c5171c72657b10f88cf0fb81f21dd0a1beea4ec33390dcd312cad79f8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 14 18:13:28.208956 systemd[1]: run-netns-cni\x2d82e53712\x2df96f\x2d18be\x2d225d\x2df4e9ee1f29d2.mount: Deactivated successfully.
May 14 18:13:28.254367 containerd[1734]: time="2025-05-14T18:13:28.254328987Z" level=error msg="Failed to destroy network for sandbox \"fd402485a3ccc7ad53f2e08fcb4d3564c5ecc0af79729aa8f045bc855ae96b30\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 14 18:13:28.256020 systemd[1]: run-netns-cni\x2d01fd8c23\x2dea3d\x2d6e0c\x2dd2c0\x2d779dd5206f15.mount: Deactivated successfully.
May 14 18:13:28.323564 containerd[1734]: time="2025-05-14T18:13:28.323395734Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-v4tq6,Uid:02387ae7-7df0-4320-8070-8518fe0bd4d8,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5caf8fa4d2f9dee64e2ec9f0615c1ab6a6a72b1beb146e8491abd1561d5ba46a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 14 18:13:28.323718 kubelet[3205]: E0514 18:13:28.323689 3205 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5caf8fa4d2f9dee64e2ec9f0615c1ab6a6a72b1beb146e8491abd1561d5ba46a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 14 18:13:28.323848 kubelet[3205]: E0514 18:13:28.323779 3205 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5caf8fa4d2f9dee64e2ec9f0615c1ab6a6a72b1beb146e8491abd1561d5ba46a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-v4tq6"
May 14 18:13:28.323848 kubelet[3205]: E0514 18:13:28.323800 3205 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5caf8fa4d2f9dee64e2ec9f0615c1ab6a6a72b1beb146e8491abd1561d5ba46a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-v4tq6"
May 14 18:13:28.324118 kubelet[3205]: E0514 18:13:28.323936 3205 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-v4tq6_kube-system(02387ae7-7df0-4320-8070-8518fe0bd4d8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-v4tq6_kube-system(02387ae7-7df0-4320-8070-8518fe0bd4d8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5caf8fa4d2f9dee64e2ec9f0615c1ab6a6a72b1beb146e8491abd1561d5ba46a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-v4tq6" podUID="02387ae7-7df0-4320-8070-8518fe0bd4d8"
May 14 18:13:28.346215 containerd[1734]: time="2025-05-14T18:13:28.345859310Z" level=error msg="Failed to destroy network for sandbox \"38754d751c6450e3c9e4d317b8799fee5ac0f18fd769b7dd4933fc51c1cdd097\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 14 18:13:28.348041 systemd[1]: run-netns-cni\x2d51cb40ed\x2dc914\x2d1d4a\x2d79ce\x2d66e23cfdb87b.mount: Deactivated successfully.
May 14 18:13:28.361615 containerd[1734]: time="2025-05-14T18:13:28.361570965Z" level=error msg="Failed to destroy network for sandbox \"293c62aa39d96ba280141cbe5dc24a36e55d61d36363c6783d217c99c4166599\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 14 18:13:28.363194 systemd[1]: run-netns-cni\x2de695caf9\x2d48a1\x2d624e\x2dc20a\x2dee13b109d78c.mount: Deactivated successfully.
May 14 18:13:28.366724 containerd[1734]: time="2025-05-14T18:13:28.366691705Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-clvtl,Uid:f855f8c6-0735-4302-b463-7204c3026e15,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b7d0d11c5171c72657b10f88cf0fb81f21dd0a1beea4ec33390dcd312cad79f8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 14 18:13:28.367150 kubelet[3205]: E0514 18:13:28.366868 3205 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b7d0d11c5171c72657b10f88cf0fb81f21dd0a1beea4ec33390dcd312cad79f8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 14 18:13:28.367150 kubelet[3205]: E0514 18:13:28.366912 3205 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b7d0d11c5171c72657b10f88cf0fb81f21dd0a1beea4ec33390dcd312cad79f8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-clvtl"
May 14 18:13:28.367150 kubelet[3205]: E0514 18:13:28.366929 3205 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b7d0d11c5171c72657b10f88cf0fb81f21dd0a1beea4ec33390dcd312cad79f8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-clvtl"
May 14 18:13:28.367296 kubelet[3205]: E0514 18:13:28.366965 3205 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-clvtl_kube-system(f855f8c6-0735-4302-b463-7204c3026e15)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-clvtl_kube-system(f855f8c6-0735-4302-b463-7204c3026e15)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b7d0d11c5171c72657b10f88cf0fb81f21dd0a1beea4ec33390dcd312cad79f8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-clvtl" podUID="f855f8c6-0735-4302-b463-7204c3026e15"
May 14 18:13:28.433189 containerd[1734]: time="2025-05-14T18:13:28.433154018Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6cb7cd99bf-z6fr6,Uid:0c3f601d-87ef-4fb7-a83d-21758b09a12d,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"fd402485a3ccc7ad53f2e08fcb4d3564c5ecc0af79729aa8f045bc855ae96b30\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 14 18:13:28.433362 kubelet[3205]: E0514 18:13:28.433318 3205 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fd402485a3ccc7ad53f2e08fcb4d3564c5ecc0af79729aa8f045bc855ae96b30\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 14 18:13:28.433423 kubelet[3205]: E0514 18:13:28.433378 3205 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fd402485a3ccc7ad53f2e08fcb4d3564c5ecc0af79729aa8f045bc855ae96b30\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6cb7cd99bf-z6fr6"
May 14 18:13:28.433423 kubelet[3205]: E0514 18:13:28.433397 3205 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fd402485a3ccc7ad53f2e08fcb4d3564c5ecc0af79729aa8f045bc855ae96b30\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6cb7cd99bf-z6fr6"
May 14 18:13:28.433469 kubelet[3205]: E0514 18:13:28.433432 3205 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6cb7cd99bf-z6fr6_calico-apiserver(0c3f601d-87ef-4fb7-a83d-21758b09a12d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6cb7cd99bf-z6fr6_calico-apiserver(0c3f601d-87ef-4fb7-a83d-21758b09a12d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fd402485a3ccc7ad53f2e08fcb4d3564c5ecc0af79729aa8f045bc855ae96b30\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6cb7cd99bf-z6fr6" podUID="0c3f601d-87ef-4fb7-a83d-21758b09a12d"
May 14 18:13:28.441356 containerd[1734]: time="2025-05-14T18:13:28.441326749Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-58d5b579d7-9wc6k,Uid:b7e2c3b5-0b2e-4624-9ffb-7b15307add96,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"38754d751c6450e3c9e4d317b8799fee5ac0f18fd769b7dd4933fc51c1cdd097\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 14 18:13:28.441569 kubelet[3205]: E0514 18:13:28.441517 3205 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"38754d751c6450e3c9e4d317b8799fee5ac0f18fd769b7dd4933fc51c1cdd097\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 14 18:13:28.441635 kubelet[3205]: E0514 18:13:28.441586 3205 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"38754d751c6450e3c9e4d317b8799fee5ac0f18fd769b7dd4933fc51c1cdd097\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-58d5b579d7-9wc6k"
May 14 18:13:28.441635 kubelet[3205]: E0514 18:13:28.441604 3205 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"38754d751c6450e3c9e4d317b8799fee5ac0f18fd769b7dd4933fc51c1cdd097\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-58d5b579d7-9wc6k"
May 14 18:13:28.441698 kubelet[3205]: E0514 18:13:28.441639 3205 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-58d5b579d7-9wc6k_calico-system(b7e2c3b5-0b2e-4624-9ffb-7b15307add96)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-58d5b579d7-9wc6k_calico-system(b7e2c3b5-0b2e-4624-9ffb-7b15307add96)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"38754d751c6450e3c9e4d317b8799fee5ac0f18fd769b7dd4933fc51c1cdd097\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-58d5b579d7-9wc6k" podUID="b7e2c3b5-0b2e-4624-9ffb-7b15307add96"
May 14 18:13:28.478565 containerd[1734]: time="2025-05-14T18:13:28.478532039Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cl9vj,Uid:f8c0ddea-8d34-4405-b1d7-cbac1335ae76,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"293c62aa39d96ba280141cbe5dc24a36e55d61d36363c6783d217c99c4166599\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 14 18:13:28.478690 kubelet[3205]: E0514 18:13:28.478672 3205 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"293c62aa39d96ba280141cbe5dc24a36e55d61d36363c6783d217c99c4166599\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 14 18:13:28.478730 kubelet[3205]: E0514 18:13:28.478705 3205 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"293c62aa39d96ba280141cbe5dc24a36e55d61d36363c6783d217c99c4166599\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-cl9vj"
May 14 18:13:28.478730 kubelet[3205]: E0514 18:13:28.478721 3205 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"293c62aa39d96ba280141cbe5dc24a36e55d61d36363c6783d217c99c4166599\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-cl9vj"
May 14 18:13:28.478789 kubelet[3205]: E0514 18:13:28.478764 3205 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-cl9vj_calico-system(f8c0ddea-8d34-4405-b1d7-cbac1335ae76)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-cl9vj_calico-system(f8c0ddea-8d34-4405-b1d7-cbac1335ae76)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"293c62aa39d96ba280141cbe5dc24a36e55d61d36363c6783d217c99c4166599\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-cl9vj" podUID="f8c0ddea-8d34-4405-b1d7-cbac1335ae76"
May 14 18:13:28.729601 containerd[1734]: time="2025-05-14T18:13:28.729498497Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.3\""
May 14 18:13:39.580070 containerd[1734]: time="2025-05-14T18:13:39.580020782Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6cb7cd99bf-z6fr6,Uid:0c3f601d-87ef-4fb7-a83d-21758b09a12d,Namespace:calico-apiserver,Attempt:0,}"
May 14 18:13:39.827586 containerd[1734]: time="2025-05-14T18:13:39.580126800Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6cb7cd99bf-2dhkq,Uid:0272f273-8188-48f8-bfc1-35ee033246c4,Namespace:calico-apiserver,Attempt:0,}"
May 14 18:13:40.253576 containerd[1734]: time="2025-05-14T18:13:40.253534773Z" level=error msg="Failed to destroy network for sandbox \"a60532a59418203f24dafd504a0b61e91f2044a0bb6ba3c3c0bc50c95e16298b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 14 18:13:40.255516 systemd[1]: run-netns-cni\x2df5b3e402\x2dff5e\x2d5aa9\x2de38b\x2dd5a81b54084a.mount: Deactivated successfully.
May 14 18:13:40.302661 containerd[1734]: time="2025-05-14T18:13:40.302635427Z" level=error msg="Failed to destroy network for sandbox \"bd607c887013a54e9103cbf8b5143036b2b8e7ccb1ded80f80bdb78bf6b16403\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 14 18:13:40.304308 systemd[1]: run-netns-cni\x2d99100314\x2d09d9\x2d0c58\x2d016a\x2ded214825bfd6.mount: Deactivated successfully.
May 14 18:13:40.320930 containerd[1734]: time="2025-05-14T18:13:40.320798422Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6cb7cd99bf-2dhkq,Uid:0272f273-8188-48f8-bfc1-35ee033246c4,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a60532a59418203f24dafd504a0b61e91f2044a0bb6ba3c3c0bc50c95e16298b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 18:13:40.321878 kubelet[3205]: E0514 18:13:40.321844 3205 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a60532a59418203f24dafd504a0b61e91f2044a0bb6ba3c3c0bc50c95e16298b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 18:13:40.322134 kubelet[3205]: E0514 18:13:40.321902 3205 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a60532a59418203f24dafd504a0b61e91f2044a0bb6ba3c3c0bc50c95e16298b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6cb7cd99bf-2dhkq" May 14 18:13:40.322134 kubelet[3205]: E0514 18:13:40.321920 3205 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a60532a59418203f24dafd504a0b61e91f2044a0bb6ba3c3c0bc50c95e16298b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-6cb7cd99bf-2dhkq" May 14 18:13:40.322134 kubelet[3205]: E0514 18:13:40.321964 3205 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6cb7cd99bf-2dhkq_calico-apiserver(0272f273-8188-48f8-bfc1-35ee033246c4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6cb7cd99bf-2dhkq_calico-apiserver(0272f273-8188-48f8-bfc1-35ee033246c4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a60532a59418203f24dafd504a0b61e91f2044a0bb6ba3c3c0bc50c95e16298b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6cb7cd99bf-2dhkq" podUID="0272f273-8188-48f8-bfc1-35ee033246c4" May 14 18:13:40.367167 containerd[1734]: time="2025-05-14T18:13:40.367122279Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6cb7cd99bf-z6fr6,Uid:0c3f601d-87ef-4fb7-a83d-21758b09a12d,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"bd607c887013a54e9103cbf8b5143036b2b8e7ccb1ded80f80bdb78bf6b16403\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 18:13:40.367296 kubelet[3205]: E0514 18:13:40.367271 3205 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bd607c887013a54e9103cbf8b5143036b2b8e7ccb1ded80f80bdb78bf6b16403\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 18:13:40.367538 kubelet[3205]: E0514 18:13:40.367431 3205 
kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bd607c887013a54e9103cbf8b5143036b2b8e7ccb1ded80f80bdb78bf6b16403\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6cb7cd99bf-z6fr6" May 14 18:13:40.367538 kubelet[3205]: E0514 18:13:40.367484 3205 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bd607c887013a54e9103cbf8b5143036b2b8e7ccb1ded80f80bdb78bf6b16403\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6cb7cd99bf-z6fr6" May 14 18:13:40.367663 kubelet[3205]: E0514 18:13:40.367524 3205 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6cb7cd99bf-z6fr6_calico-apiserver(0c3f601d-87ef-4fb7-a83d-21758b09a12d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6cb7cd99bf-z6fr6_calico-apiserver(0c3f601d-87ef-4fb7-a83d-21758b09a12d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bd607c887013a54e9103cbf8b5143036b2b8e7ccb1ded80f80bdb78bf6b16403\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6cb7cd99bf-z6fr6" podUID="0c3f601d-87ef-4fb7-a83d-21758b09a12d" May 14 18:13:40.581246 containerd[1734]: time="2025-05-14T18:13:40.580647820Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:csi-node-driver-cl9vj,Uid:f8c0ddea-8d34-4405-b1d7-cbac1335ae76,Namespace:calico-system,Attempt:0,}" May 14 18:13:40.581246 containerd[1734]: time="2025-05-14T18:13:40.580969790Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-v4tq6,Uid:02387ae7-7df0-4320-8070-8518fe0bd4d8,Namespace:kube-system,Attempt:0,}" May 14 18:13:40.582552 containerd[1734]: time="2025-05-14T18:13:40.581474495Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-58d5b579d7-9wc6k,Uid:b7e2c3b5-0b2e-4624-9ffb-7b15307add96,Namespace:calico-system,Attempt:0,}" May 14 18:13:40.729243 containerd[1734]: time="2025-05-14T18:13:40.729123386Z" level=error msg="Failed to destroy network for sandbox \"f5608c035ccdf8a9474ff885b50ad6d5bd96be80a7392431c4aab108d5a18560\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 18:13:40.774350 containerd[1734]: time="2025-05-14T18:13:40.774219282Z" level=error msg="Failed to destroy network for sandbox \"7f57fbede13678f558007883b4f702dd482ea0820760164ea093a36643cf47af\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 18:13:40.867471 containerd[1734]: time="2025-05-14T18:13:40.867437544Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-v4tq6,Uid:02387ae7-7df0-4320-8070-8518fe0bd4d8,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f5608c035ccdf8a9474ff885b50ad6d5bd96be80a7392431c4aab108d5a18560\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 
18:13:40.867803 kubelet[3205]: E0514 18:13:40.867772 3205 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f5608c035ccdf8a9474ff885b50ad6d5bd96be80a7392431c4aab108d5a18560\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 18:13:40.867872 kubelet[3205]: E0514 18:13:40.867820 3205 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f5608c035ccdf8a9474ff885b50ad6d5bd96be80a7392431c4aab108d5a18560\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-v4tq6" May 14 18:13:40.868036 kubelet[3205]: E0514 18:13:40.868014 3205 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f5608c035ccdf8a9474ff885b50ad6d5bd96be80a7392431c4aab108d5a18560\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-v4tq6" May 14 18:13:40.868339 kubelet[3205]: E0514 18:13:40.868064 3205 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-v4tq6_kube-system(02387ae7-7df0-4320-8070-8518fe0bd4d8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-v4tq6_kube-system(02387ae7-7df0-4320-8070-8518fe0bd4d8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f5608c035ccdf8a9474ff885b50ad6d5bd96be80a7392431c4aab108d5a18560\\\": plugin type=\\\"calico\\\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-v4tq6" podUID="02387ae7-7df0-4320-8070-8518fe0bd4d8" May 14 18:13:40.870469 containerd[1734]: time="2025-05-14T18:13:40.870400037Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cl9vj,Uid:f8c0ddea-8d34-4405-b1d7-cbac1335ae76,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7f57fbede13678f558007883b4f702dd482ea0820760164ea093a36643cf47af\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 18:13:40.870768 kubelet[3205]: E0514 18:13:40.870604 3205 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7f57fbede13678f558007883b4f702dd482ea0820760164ea093a36643cf47af\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 18:13:40.870829 kubelet[3205]: E0514 18:13:40.870787 3205 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7f57fbede13678f558007883b4f702dd482ea0820760164ea093a36643cf47af\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-cl9vj" May 14 18:13:40.870829 kubelet[3205]: E0514 18:13:40.870815 3205 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"7f57fbede13678f558007883b4f702dd482ea0820760164ea093a36643cf47af\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-cl9vj" May 14 18:13:40.870927 kubelet[3205]: E0514 18:13:40.870907 3205 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-cl9vj_calico-system(f8c0ddea-8d34-4405-b1d7-cbac1335ae76)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-cl9vj_calico-system(f8c0ddea-8d34-4405-b1d7-cbac1335ae76)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7f57fbede13678f558007883b4f702dd482ea0820760164ea093a36643cf47af\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-cl9vj" podUID="f8c0ddea-8d34-4405-b1d7-cbac1335ae76" May 14 18:13:40.874414 containerd[1734]: time="2025-05-14T18:13:40.874060749Z" level=error msg="Failed to destroy network for sandbox \"c19e518c87c5c2eef6df82b92b69c1b2cd4ccbd7415d16a8f9d0ed94f432e575\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 18:13:40.916755 containerd[1734]: time="2025-05-14T18:13:40.916425890Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-58d5b579d7-9wc6k,Uid:b7e2c3b5-0b2e-4624-9ffb-7b15307add96,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c19e518c87c5c2eef6df82b92b69c1b2cd4ccbd7415d16a8f9d0ed94f432e575\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that 
the calico/node container is running and has mounted /var/lib/calico/" May 14 18:13:40.916854 kubelet[3205]: E0514 18:13:40.916710 3205 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c19e518c87c5c2eef6df82b92b69c1b2cd4ccbd7415d16a8f9d0ed94f432e575\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 18:13:40.916889 kubelet[3205]: E0514 18:13:40.916849 3205 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c19e518c87c5c2eef6df82b92b69c1b2cd4ccbd7415d16a8f9d0ed94f432e575\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-58d5b579d7-9wc6k" May 14 18:13:40.916889 kubelet[3205]: E0514 18:13:40.916868 3205 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c19e518c87c5c2eef6df82b92b69c1b2cd4ccbd7415d16a8f9d0ed94f432e575\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-58d5b579d7-9wc6k" May 14 18:13:40.916964 kubelet[3205]: E0514 18:13:40.916902 3205 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-58d5b579d7-9wc6k_calico-system(b7e2c3b5-0b2e-4624-9ffb-7b15307add96)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-58d5b579d7-9wc6k_calico-system(b7e2c3b5-0b2e-4624-9ffb-7b15307add96)\\\": rpc error: code = Unknown desc = failed to setup 
network for sandbox \\\"c19e518c87c5c2eef6df82b92b69c1b2cd4ccbd7415d16a8f9d0ed94f432e575\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-58d5b579d7-9wc6k" podUID="b7e2c3b5-0b2e-4624-9ffb-7b15307add96" May 14 18:13:41.580541 containerd[1734]: time="2025-05-14T18:13:41.580512104Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-clvtl,Uid:f855f8c6-0735-4302-b463-7204c3026e15,Namespace:kube-system,Attempt:0,}" May 14 18:13:41.677661 containerd[1734]: time="2025-05-14T18:13:41.677619781Z" level=error msg="Failed to destroy network for sandbox \"94565b469e893687d2c3f1d76e763e7edd6b84c5ee72254ae4121e8de5353f21\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 18:13:41.680296 systemd[1]: run-netns-cni\x2de2d35bb7\x2d0392\x2d2738\x2db8a2\x2d55a74920ef58.mount: Deactivated successfully. 
May 14 18:13:41.682393 containerd[1734]: time="2025-05-14T18:13:41.682208116Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-clvtl,Uid:f855f8c6-0735-4302-b463-7204c3026e15,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"94565b469e893687d2c3f1d76e763e7edd6b84c5ee72254ae4121e8de5353f21\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 18:13:41.682551 kubelet[3205]: E0514 18:13:41.682529 3205 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"94565b469e893687d2c3f1d76e763e7edd6b84c5ee72254ae4121e8de5353f21\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 18:13:41.682736 kubelet[3205]: E0514 18:13:41.682574 3205 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"94565b469e893687d2c3f1d76e763e7edd6b84c5ee72254ae4121e8de5353f21\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-clvtl" May 14 18:13:41.682736 kubelet[3205]: E0514 18:13:41.682593 3205 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"94565b469e893687d2c3f1d76e763e7edd6b84c5ee72254ae4121e8de5353f21\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="kube-system/coredns-6f6b679f8f-clvtl" May 14 18:13:41.682736 kubelet[3205]: E0514 18:13:41.682637 3205 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-clvtl_kube-system(f855f8c6-0735-4302-b463-7204c3026e15)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-clvtl_kube-system(f855f8c6-0735-4302-b463-7204c3026e15)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"94565b469e893687d2c3f1d76e763e7edd6b84c5ee72254ae4121e8de5353f21\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-clvtl" podUID="f855f8c6-0735-4302-b463-7204c3026e15" May 14 18:13:41.976246 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2580525996.mount: Deactivated successfully. May 14 18:13:42.179081 containerd[1734]: time="2025-05-14T18:13:42.179039651Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 18:13:42.228175 containerd[1734]: time="2025-05-14T18:13:42.228106561Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.3: active requests=0, bytes read=144068748" May 14 18:13:42.273027 containerd[1734]: time="2025-05-14T18:13:42.272955416Z" level=info msg="ImageCreate event name:\"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 18:13:42.276642 containerd[1734]: time="2025-05-14T18:13:42.276614691Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:750e267b4f8217e0ca9e4107228370190d1a2499b72112ad04370ab9b4553916\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 18:13:42.277068 containerd[1734]: time="2025-05-14T18:13:42.276971111Z" 
level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.3\" with image id \"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:750e267b4f8217e0ca9e4107228370190d1a2499b72112ad04370ab9b4553916\", size \"144068610\" in 13.547428097s" May 14 18:13:42.277068 containerd[1734]: time="2025-05-14T18:13:42.276998894Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.3\" returns image reference \"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\"" May 14 18:13:42.287459 containerd[1734]: time="2025-05-14T18:13:42.287406768Z" level=info msg="CreateContainer within sandbox \"164481cda957baa615a28b334f99e0d87a8f3ebbc03e6a6f5296cdc8a49411b7\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" May 14 18:13:42.625757 containerd[1734]: time="2025-05-14T18:13:42.625614144Z" level=info msg="Container ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037: CDI devices from CRI Config.CDIDevices: []" May 14 18:13:42.726112 containerd[1734]: time="2025-05-14T18:13:42.726088184Z" level=info msg="CreateContainer within sandbox \"164481cda957baa615a28b334f99e0d87a8f3ebbc03e6a6f5296cdc8a49411b7\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037\"" May 14 18:13:42.726860 containerd[1734]: time="2025-05-14T18:13:42.726440595Z" level=info msg="StartContainer for \"ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037\"" May 14 18:13:42.727912 containerd[1734]: time="2025-05-14T18:13:42.727855716Z" level=info msg="connecting to shim ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037" address="unix:///run/containerd/s/379c474226fd13bf940cb8978440bcda9592673df3fd98ae004ef42a5b1c0f2d" protocol=ttrpc version=3 May 14 18:13:42.742904 systemd[1]: Started 
cri-containerd-ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037.scope - libcontainer container ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037. May 14 18:13:42.785526 containerd[1734]: time="2025-05-14T18:13:42.785495185Z" level=info msg="StartContainer for \"ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037\" returns successfully" May 14 18:13:43.348542 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. May 14 18:13:43.348658 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. May 14 18:13:43.365419 systemd[1]: cri-containerd-ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037.scope: Deactivated successfully. May 14 18:13:43.367670 containerd[1734]: time="2025-05-14T18:13:43.367541389Z" level=info msg="received exit event container_id:\"ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037\" id:\"ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037\" pid:4385 exit_status:1 exited_at:{seconds:1747246423 nanos:367352510}" May 14 18:13:43.367786 containerd[1734]: time="2025-05-14T18:13:43.367641962Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037\" id:\"ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037\" pid:4385 exit_status:1 exited_at:{seconds:1747246423 nanos:367352510}" May 14 18:13:43.384521 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037-rootfs.mount: Deactivated successfully. 
May 14 18:13:43.775616 kubelet[3205]: I0514 18:13:43.775088 3205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-42jns" podStartSLOduration=2.920285698 podStartE2EDuration="1m12.774723938s" podCreationTimestamp="2025-05-14 18:12:31 +0000 UTC" firstStartedPulling="2025-05-14 18:12:32.4232322 +0000 UTC m=+19.915755545" lastFinishedPulling="2025-05-14 18:13:42.27767043 +0000 UTC m=+89.770193785" observedRunningTime="2025-05-14 18:13:43.773091328 +0000 UTC m=+91.265614685" watchObservedRunningTime="2025-05-14 18:13:43.774723938 +0000 UTC m=+91.267247414" May 14 18:13:45.762077 containerd[1734]: time="2025-05-14T18:13:45.761421699Z" level=error msg="get state for ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037" error="context deadline exceeded" May 14 18:13:45.762077 containerd[1734]: time="2025-05-14T18:13:45.761501592Z" level=warning msg="unknown status" status=0 May 14 18:13:45.762513 containerd[1734]: time="2025-05-14T18:13:45.762098867Z" level=error msg="get state for ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037" error="context deadline exceeded" May 14 18:13:45.762513 containerd[1734]: time="2025-05-14T18:13:45.762114277Z" level=warning msg="unknown status" status=0 May 14 18:13:48.760982 containerd[1734]: time="2025-05-14T18:13:48.760938787Z" level=info msg="StopContainer for \"ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037\" with timeout 2 (s)" May 14 18:13:50.762162 containerd[1734]: time="2025-05-14T18:13:50.762092247Z" level=error msg="get state for ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037" error="context deadline exceeded" May 14 18:13:50.762162 containerd[1734]: time="2025-05-14T18:13:50.762145375Z" level=warning msg="unknown status" status=0 May 14 18:13:51.363517 containerd[1734]: time="2025-05-14T18:13:50.762180352Z" level=info msg="Stop container \"ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037\" with signal 
terminated" May 14 18:13:51.580731 containerd[1734]: time="2025-05-14T18:13:51.580697443Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-v4tq6,Uid:02387ae7-7df0-4320-8070-8518fe0bd4d8,Namespace:kube-system,Attempt:0,}" May 14 18:13:52.581373 containerd[1734]: time="2025-05-14T18:13:52.581190899Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6cb7cd99bf-2dhkq,Uid:0272f273-8188-48f8-bfc1-35ee033246c4,Namespace:calico-apiserver,Attempt:0,}" May 14 18:13:53.367894 containerd[1734]: time="2025-05-14T18:13:53.367832867Z" level=error msg="failed to handle container TaskExit event container_id:\"ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037\" id:\"ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037\" pid:4385 exit_status:1 exited_at:{seconds:1747246423 nanos:367352510}" error="failed to stop container: failed to delete task: context deadline exceeded" May 14 18:13:53.580506 containerd[1734]: time="2025-05-14T18:13:53.580450413Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-58d5b579d7-9wc6k,Uid:b7e2c3b5-0b2e-4624-9ffb-7b15307add96,Namespace:calico-system,Attempt:0,}" May 14 18:13:54.502662 containerd[1734]: time="2025-05-14T18:13:54.502513418Z" level=info msg="TaskExit event container_id:\"ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037\" id:\"ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037\" pid:4385 exit_status:1 exited_at:{seconds:1747246423 nanos:367352510}" May 14 18:13:54.581006 containerd[1734]: time="2025-05-14T18:13:54.580939097Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-clvtl,Uid:f855f8c6-0735-4302-b463-7204c3026e15,Namespace:kube-system,Attempt:0,}" May 14 18:13:54.581771 containerd[1734]: time="2025-05-14T18:13:54.581272101Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-6cb7cd99bf-z6fr6,Uid:0c3f601d-87ef-4fb7-a83d-21758b09a12d,Namespace:calico-apiserver,Attempt:0,}" May 14 18:13:54.582178 containerd[1734]: time="2025-05-14T18:13:54.582160675Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cl9vj,Uid:f8c0ddea-8d34-4405-b1d7-cbac1335ae76,Namespace:calico-system,Attempt:0,}" May 14 18:13:56.503422 containerd[1734]: time="2025-05-14T18:13:56.503343636Z" level=error msg="get state for ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037" error="context deadline exceeded" May 14 18:13:56.503422 containerd[1734]: time="2025-05-14T18:13:56.503410022Z" level=warning msg="unknown status" status=0 May 14 18:13:58.505303 containerd[1734]: time="2025-05-14T18:13:58.505227815Z" level=error msg="get state for ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037" error="context deadline exceeded" May 14 18:13:58.505303 containerd[1734]: time="2025-05-14T18:13:58.505295609Z" level=warning msg="unknown status" status=0 May 14 18:14:00.506985 containerd[1734]: time="2025-05-14T18:14:00.506901350Z" level=error msg="get state for ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037" error="context deadline exceeded" May 14 18:14:00.506985 containerd[1734]: time="2025-05-14T18:14:00.506974697Z" level=warning msg="unknown status" status=0 May 14 18:14:01.851546 containerd[1734]: time="2025-05-14T18:14:01.851506966Z" level=error msg="Failed to destroy network for sandbox \"895e47b9982549526378e563f0f26f5e557390295a1911a51e501e28a155ed61\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 18:14:01.853663 systemd[1]: run-netns-cni\x2db917dc06\x2db096\x2d27ec\x2d33f1\x2db215b36b7dea.mount: Deactivated successfully. 
May 14 18:14:03.376759 containerd[1734]: time="2025-05-14T18:14:03.376654938Z" level=error msg="ttrpc: received message on inactive stream" stream=31 May 14 18:14:03.376759 containerd[1734]: time="2025-05-14T18:14:03.376726379Z" level=error msg="ttrpc: received message on inactive stream" stream=33 May 14 18:14:03.376759 containerd[1734]: time="2025-05-14T18:14:03.376739351Z" level=error msg="ttrpc: received message on inactive stream" stream=35 May 14 18:14:03.377280 containerd[1734]: time="2025-05-14T18:14:03.376777327Z" level=error msg="ttrpc: received message on inactive stream" stream=43 May 14 18:14:03.377280 containerd[1734]: time="2025-05-14T18:14:03.376788157Z" level=error msg="ttrpc: received message on inactive stream" stream=49 May 14 18:14:03.377280 containerd[1734]: time="2025-05-14T18:14:03.376797629Z" level=error msg="ttrpc: received message on inactive stream" stream=51 May 14 18:14:03.377280 containerd[1734]: time="2025-05-14T18:14:03.376805898Z" level=error msg="ttrpc: received message on inactive stream" stream=53 May 14 18:14:03.378769 containerd[1734]: time="2025-05-14T18:14:03.377830612Z" level=info msg="Ensure that container ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037 in task-service has been cleanup successfully" May 14 18:14:03.379888 containerd[1734]: time="2025-05-14T18:14:03.379858919Z" level=error msg="ExecSync for \"ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037\" failed" error="rpc error: code = Unknown desc = failed to exec in container: failed to create exec \"2bffe9dee8a5c9e552e8483fd4cf23d3265910980aedb8a57ba8c4e13fe3dd1d\": cannot exec in a deleted state" May 14 18:14:03.379940 containerd[1734]: time="2025-05-14T18:14:03.379876982Z" level=error msg="ExecSync for \"ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037\" failed" error="rpc error: code = Unknown desc = failed to exec in container: failed to create exec 
\"314247fb7b5b3d70c62313b3a6d3bb101c163498d52501f35c7b5d8be75c777a\": cannot exec in a deleted state" May 14 18:14:03.380131 kubelet[3205]: E0514 18:14:03.380060 3205 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = failed to exec in container: failed to create exec \"2bffe9dee8a5c9e552e8483fd4cf23d3265910980aedb8a57ba8c4e13fe3dd1d\": cannot exec in a deleted state" containerID="ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037" cmd=["/bin/calico-node","-shutdown"] May 14 18:14:03.380383 kubelet[3205]: E0514 18:14:03.380158 3205 kuberuntime_container.go:691] "PreStop hook failed" err="rpc error: code = Unknown desc = failed to exec in container: failed to create exec \"2bffe9dee8a5c9e552e8483fd4cf23d3265910980aedb8a57ba8c4e13fe3dd1d\": cannot exec in a deleted state" pod="calico-system/calico-node-42jns" podUID="41e82bc8-542e-4aed-91e7-0d220ea19b51" containerName="calico-node" containerID="containerd://ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037" May 14 18:14:03.380494 kubelet[3205]: E0514 18:14:03.380060 3205 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = failed to exec in container: failed to create exec \"314247fb7b5b3d70c62313b3a6d3bb101c163498d52501f35c7b5d8be75c777a\": cannot exec in a deleted state" containerID="ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037" cmd=["/bin/calico-node","-bird-ready","-felix-ready"] May 14 18:14:03.426102 containerd[1734]: time="2025-05-14T18:14:03.426051563Z" level=error msg="ExecSync for \"ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037\" failed" error="rpc error: code = Unknown desc = failed to exec in container: container is in CONTAINER_EXITED state" May 14 18:14:03.426628 kubelet[3205]: E0514 18:14:03.426454 3205 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = failed to exec in container: container is in CONTAINER_EXITED 
state" containerID="ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037" cmd=["/bin/calico-node","-bird-ready","-felix-ready"] May 14 18:14:03.430059 containerd[1734]: time="2025-05-14T18:14:03.430026915Z" level=error msg="ExecSync for \"ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037\" failed" error="rpc error: code = Unknown desc = failed to exec in container: container is in CONTAINER_EXITED state" May 14 18:14:03.430279 kubelet[3205]: E0514 18:14:03.430145 3205 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = failed to exec in container: container is in CONTAINER_EXITED state" containerID="ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037" cmd=["/bin/calico-node","-bird-ready","-felix-ready"] May 14 18:14:03.431299 containerd[1734]: time="2025-05-14T18:14:03.430571795Z" level=info msg="StopContainer for \"ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037\" returns successfully" May 14 18:14:03.431299 containerd[1734]: time="2025-05-14T18:14:03.430861894Z" level=error msg="ExecSync for \"ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037\" failed" error="rpc error: code = Unknown desc = failed to exec in container: container is in CONTAINER_EXITED state" May 14 18:14:03.431299 containerd[1734]: time="2025-05-14T18:14:03.431132895Z" level=error msg="ExecSync for \"ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037\" failed" error="rpc error: code = Unknown desc = failed to exec in container: container is in CONTAINER_EXITED state" May 14 18:14:03.431413 kubelet[3205]: E0514 18:14:03.430944 3205 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = failed to exec in container: container is in CONTAINER_EXITED state" containerID="ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037" cmd=["/bin/calico-node","-bird-ready","-felix-ready"] May 14 18:14:03.431413 kubelet[3205]: E0514 
18:14:03.431208 3205 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = failed to exec in container: container is in CONTAINER_EXITED state" containerID="ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037" cmd=["/bin/calico-node","-bird-ready","-felix-ready"] May 14 18:14:03.431465 containerd[1734]: time="2025-05-14T18:14:03.431347969Z" level=error msg="ExecSync for \"ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037\" failed" error="rpc error: code = Unknown desc = failed to exec in container: container is in CONTAINER_EXITED state" May 14 18:14:03.431489 kubelet[3205]: E0514 18:14:03.431412 3205 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = failed to exec in container: container is in CONTAINER_EXITED state" containerID="ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037" cmd=["/bin/calico-node","-bird-ready","-felix-ready"] May 14 18:14:03.432542 containerd[1734]: time="2025-05-14T18:14:03.432238829Z" level=info msg="StopPodSandbox for \"164481cda957baa615a28b334f99e0d87a8f3ebbc03e6a6f5296cdc8a49411b7\"" May 14 18:14:03.432542 containerd[1734]: time="2025-05-14T18:14:03.432295116Z" level=info msg="Container to stop \"ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" May 14 18:14:03.432542 containerd[1734]: time="2025-05-14T18:14:03.432307098Z" level=info msg="Container to stop \"9ac6799244715ab40c2f32cb277da464ee1b5dec9980e5bd21fde2c85cd4694b\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" May 14 18:14:03.432542 containerd[1734]: time="2025-05-14T18:14:03.432315282Z" level=info msg="Container to stop \"8a61e7da802874ef32356e1e4c8ae6f90c1047af98bf404c4073fe2b878a69ad\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" May 14 18:14:03.438458 systemd[1]: 
cri-containerd-164481cda957baa615a28b334f99e0d87a8f3ebbc03e6a6f5296cdc8a49411b7.scope: Deactivated successfully. May 14 18:14:03.443193 containerd[1734]: time="2025-05-14T18:14:03.443113012Z" level=info msg="TaskExit event in podsandbox handler container_id:\"164481cda957baa615a28b334f99e0d87a8f3ebbc03e6a6f5296cdc8a49411b7\" id:\"164481cda957baa615a28b334f99e0d87a8f3ebbc03e6a6f5296cdc8a49411b7\" pid:3756 exit_status:137 exited_at:{seconds:1747246443 nanos:442905076}" May 14 18:14:03.461286 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-164481cda957baa615a28b334f99e0d87a8f3ebbc03e6a6f5296cdc8a49411b7-rootfs.mount: Deactivated successfully. May 14 18:14:04.670304 containerd[1734]: time="2025-05-14T18:14:04.670174000Z" level=info msg="shim disconnected" id=164481cda957baa615a28b334f99e0d87a8f3ebbc03e6a6f5296cdc8a49411b7 namespace=k8s.io May 14 18:14:04.670304 containerd[1734]: time="2025-05-14T18:14:04.670211150Z" level=warning msg="cleaning up after shim disconnected" id=164481cda957baa615a28b334f99e0d87a8f3ebbc03e6a6f5296cdc8a49411b7 namespace=k8s.io May 14 18:14:04.670304 containerd[1734]: time="2025-05-14T18:14:04.670221424Z" level=info msg="cleaning up dead shim" namespace=k8s.io May 14 18:14:04.680143 containerd[1734]: time="2025-05-14T18:14:04.680111978Z" level=info msg="TearDown network for sandbox \"164481cda957baa615a28b334f99e0d87a8f3ebbc03e6a6f5296cdc8a49411b7\" successfully" May 14 18:14:04.680143 containerd[1734]: time="2025-05-14T18:14:04.680141034Z" level=info msg="StopPodSandbox for \"164481cda957baa615a28b334f99e0d87a8f3ebbc03e6a6f5296cdc8a49411b7\" returns successfully" May 14 18:14:04.682123 containerd[1734]: time="2025-05-14T18:14:04.680249004Z" level=info msg="received exit event sandbox_id:\"164481cda957baa615a28b334f99e0d87a8f3ebbc03e6a6f5296cdc8a49411b7\" exit_status:137 exited_at:{seconds:1747246443 nanos:442905076}" May 14 18:14:04.683498 systemd[1]: 
run-containerd-io.containerd.grpc.v1.cri-sandboxes-164481cda957baa615a28b334f99e0d87a8f3ebbc03e6a6f5296cdc8a49411b7-shm.mount: Deactivated successfully. May 14 18:14:04.722301 kubelet[3205]: E0514 18:14:04.722178 3205 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="41e82bc8-542e-4aed-91e7-0d220ea19b51" containerName="flexvol-driver" May 14 18:14:04.722301 kubelet[3205]: E0514 18:14:04.722228 3205 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="41e82bc8-542e-4aed-91e7-0d220ea19b51" containerName="install-cni" May 14 18:14:04.722301 kubelet[3205]: E0514 18:14:04.722236 3205 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="41e82bc8-542e-4aed-91e7-0d220ea19b51" containerName="calico-node" May 14 18:14:04.722301 kubelet[3205]: I0514 18:14:04.722265 3205 memory_manager.go:354] "RemoveStaleState removing state" podUID="41e82bc8-542e-4aed-91e7-0d220ea19b51" containerName="calico-node" May 14 18:14:04.728884 systemd[1]: Created slice kubepods-besteffort-pod8adb9c92_ebf2_4942_8944_284ce86f2177.slice - libcontainer container kubepods-besteffort-pod8adb9c92_ebf2_4942_8944_284ce86f2177.slice. May 14 18:14:04.791032 containerd[1734]: time="2025-05-14T18:14:04.790395877Z" level=error msg="Failed to destroy network for sandbox \"0d4739f03df52d3b7025f82a3b5888cd3f92bbd3240fc3bc401ee3c2452f9ec2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 18:14:04.792569 systemd[1]: run-netns-cni\x2d3b96d00f\x2dd74d\x2dd993\x2da55a\x2df7840daa5363.mount: Deactivated successfully. 
May 14 18:14:04.803264 kubelet[3205]: I0514 18:14:04.803227 3205 scope.go:117] "RemoveContainer" containerID="ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037" May 14 18:14:04.807602 containerd[1734]: time="2025-05-14T18:14:04.807567153Z" level=info msg="RemoveContainer for \"ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037\"" May 14 18:14:04.811046 containerd[1734]: time="2025-05-14T18:14:04.811005139Z" level=error msg="Failed to destroy network for sandbox \"35da5adb7d1e0eac73a20962856fa970afc7c5cf7087110b27f4d746324d1099\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 18:14:04.812481 containerd[1734]: time="2025-05-14T18:14:04.812449799Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-v4tq6,Uid:02387ae7-7df0-4320-8070-8518fe0bd4d8,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"895e47b9982549526378e563f0f26f5e557390295a1911a51e501e28a155ed61\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 18:14:04.812758 kubelet[3205]: E0514 18:14:04.812715 3205 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"895e47b9982549526378e563f0f26f5e557390295a1911a51e501e28a155ed61\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 18:14:04.812902 kubelet[3205]: E0514 18:14:04.812854 3205 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"895e47b9982549526378e563f0f26f5e557390295a1911a51e501e28a155ed61\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-v4tq6" May 14 18:14:04.812902 kubelet[3205]: E0514 18:14:04.812877 3205 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"895e47b9982549526378e563f0f26f5e557390295a1911a51e501e28a155ed61\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-v4tq6" May 14 18:14:04.812976 systemd[1]: run-netns-cni\x2d435262aa\x2d12e3\x2d1689\x2df679\x2d6003f2cc4489.mount: Deactivated successfully. May 14 18:14:04.813314 kubelet[3205]: E0514 18:14:04.812989 3205 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-v4tq6_kube-system(02387ae7-7df0-4320-8070-8518fe0bd4d8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-v4tq6_kube-system(02387ae7-7df0-4320-8070-8518fe0bd4d8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"895e47b9982549526378e563f0f26f5e557390295a1911a51e501e28a155ed61\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-v4tq6" podUID="02387ae7-7df0-4320-8070-8518fe0bd4d8" May 14 18:14:04.873759 kubelet[3205]: I0514 18:14:04.873517 3205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/41e82bc8-542e-4aed-91e7-0d220ea19b51-policysync\") pod 
\"41e82bc8-542e-4aed-91e7-0d220ea19b51\" (UID: \"41e82bc8-542e-4aed-91e7-0d220ea19b51\") " May 14 18:14:04.873759 kubelet[3205]: I0514 18:14:04.873548 3205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41e82bc8-542e-4aed-91e7-0d220ea19b51-tigera-ca-bundle\") pod \"41e82bc8-542e-4aed-91e7-0d220ea19b51\" (UID: \"41e82bc8-542e-4aed-91e7-0d220ea19b51\") " May 14 18:14:04.873759 kubelet[3205]: I0514 18:14:04.873558 3205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41e82bc8-542e-4aed-91e7-0d220ea19b51-policysync" (OuterVolumeSpecName: "policysync") pod "41e82bc8-542e-4aed-91e7-0d220ea19b51" (UID: "41e82bc8-542e-4aed-91e7-0d220ea19b51"). InnerVolumeSpecName "policysync". PluginName "kubernetes.io/host-path", VolumeGidValue "" May 14 18:14:04.873759 kubelet[3205]: I0514 18:14:04.873564 3205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/41e82bc8-542e-4aed-91e7-0d220ea19b51-var-run-calico\") pod \"41e82bc8-542e-4aed-91e7-0d220ea19b51\" (UID: \"41e82bc8-542e-4aed-91e7-0d220ea19b51\") " May 14 18:14:04.873759 kubelet[3205]: I0514 18:14:04.873579 3205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/41e82bc8-542e-4aed-91e7-0d220ea19b51-lib-modules\") pod \"41e82bc8-542e-4aed-91e7-0d220ea19b51\" (UID: \"41e82bc8-542e-4aed-91e7-0d220ea19b51\") " May 14 18:14:04.873759 kubelet[3205]: I0514 18:14:04.873593 3205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/41e82bc8-542e-4aed-91e7-0d220ea19b51-cni-bin-dir\") pod \"41e82bc8-542e-4aed-91e7-0d220ea19b51\" (UID: \"41e82bc8-542e-4aed-91e7-0d220ea19b51\") " May 14 18:14:04.873932 kubelet[3205]: I0514 18:14:04.873609 3205 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/41e82bc8-542e-4aed-91e7-0d220ea19b51-cni-log-dir\") pod \"41e82bc8-542e-4aed-91e7-0d220ea19b51\" (UID: \"41e82bc8-542e-4aed-91e7-0d220ea19b51\") " May 14 18:14:04.873932 kubelet[3205]: I0514 18:14:04.873629 3205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcc9l\" (UniqueName: \"kubernetes.io/projected/41e82bc8-542e-4aed-91e7-0d220ea19b51-kube-api-access-qcc9l\") pod \"41e82bc8-542e-4aed-91e7-0d220ea19b51\" (UID: \"41e82bc8-542e-4aed-91e7-0d220ea19b51\") " May 14 18:14:04.873932 kubelet[3205]: I0514 18:14:04.873644 3205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/41e82bc8-542e-4aed-91e7-0d220ea19b51-xtables-lock\") pod \"41e82bc8-542e-4aed-91e7-0d220ea19b51\" (UID: \"41e82bc8-542e-4aed-91e7-0d220ea19b51\") " May 14 18:14:04.873932 kubelet[3205]: I0514 18:14:04.873661 3205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/41e82bc8-542e-4aed-91e7-0d220ea19b51-node-certs\") pod \"41e82bc8-542e-4aed-91e7-0d220ea19b51\" (UID: \"41e82bc8-542e-4aed-91e7-0d220ea19b51\") " May 14 18:14:04.873932 kubelet[3205]: I0514 18:14:04.873675 3205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/41e82bc8-542e-4aed-91e7-0d220ea19b51-var-lib-calico\") pod \"41e82bc8-542e-4aed-91e7-0d220ea19b51\" (UID: \"41e82bc8-542e-4aed-91e7-0d220ea19b51\") " May 14 18:14:04.873932 kubelet[3205]: I0514 18:14:04.873690 3205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/41e82bc8-542e-4aed-91e7-0d220ea19b51-flexvol-driver-host\") pod \"41e82bc8-542e-4aed-91e7-0d220ea19b51\" (UID: 
\"41e82bc8-542e-4aed-91e7-0d220ea19b51\") " May 14 18:14:04.874066 kubelet[3205]: I0514 18:14:04.873705 3205 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/41e82bc8-542e-4aed-91e7-0d220ea19b51-cni-net-dir\") pod \"41e82bc8-542e-4aed-91e7-0d220ea19b51\" (UID: \"41e82bc8-542e-4aed-91e7-0d220ea19b51\") " May 14 18:14:04.874259 kubelet[3205]: I0514 18:14:04.874129 3205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41e82bc8-542e-4aed-91e7-0d220ea19b51-cni-net-dir" (OuterVolumeSpecName: "cni-net-dir") pod "41e82bc8-542e-4aed-91e7-0d220ea19b51" (UID: "41e82bc8-542e-4aed-91e7-0d220ea19b51"). InnerVolumeSpecName "cni-net-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" May 14 18:14:04.874259 kubelet[3205]: I0514 18:14:04.874157 3205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41e82bc8-542e-4aed-91e7-0d220ea19b51-var-run-calico" (OuterVolumeSpecName: "var-run-calico") pod "41e82bc8-542e-4aed-91e7-0d220ea19b51" (UID: "41e82bc8-542e-4aed-91e7-0d220ea19b51"). InnerVolumeSpecName "var-run-calico". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" May 14 18:14:04.874259 kubelet[3205]: I0514 18:14:04.874210 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/8adb9c92-ebf2-4942-8944-284ce86f2177-cni-log-dir\") pod \"calico-node-wjvg9\" (UID: \"8adb9c92-ebf2-4942-8944-284ce86f2177\") " pod="calico-system/calico-node-wjvg9" May 14 18:14:04.874259 kubelet[3205]: I0514 18:14:04.874232 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/8adb9c92-ebf2-4942-8944-284ce86f2177-node-certs\") pod \"calico-node-wjvg9\" (UID: \"8adb9c92-ebf2-4942-8944-284ce86f2177\") " pod="calico-system/calico-node-wjvg9" May 14 18:14:04.874259 kubelet[3205]: I0514 18:14:04.874246 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/8adb9c92-ebf2-4942-8944-284ce86f2177-cni-net-dir\") pod \"calico-node-wjvg9\" (UID: \"8adb9c92-ebf2-4942-8944-284ce86f2177\") " pod="calico-system/calico-node-wjvg9" May 14 18:14:04.874452 kubelet[3205]: I0514 18:14:04.874393 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/8adb9c92-ebf2-4942-8944-284ce86f2177-flexvol-driver-host\") pod \"calico-node-wjvg9\" (UID: \"8adb9c92-ebf2-4942-8944-284ce86f2177\") " pod="calico-system/calico-node-wjvg9" May 14 18:14:04.874452 kubelet[3205]: I0514 18:14:04.874412 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/8adb9c92-ebf2-4942-8944-284ce86f2177-policysync\") pod \"calico-node-wjvg9\" (UID: \"8adb9c92-ebf2-4942-8944-284ce86f2177\") " pod="calico-system/calico-node-wjvg9" May 14 18:14:04.874452 
kubelet[3205]: I0514 18:14:04.874430 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99l8b\" (UniqueName: \"kubernetes.io/projected/8adb9c92-ebf2-4942-8944-284ce86f2177-kube-api-access-99l8b\") pod \"calico-node-wjvg9\" (UID: \"8adb9c92-ebf2-4942-8944-284ce86f2177\") " pod="calico-system/calico-node-wjvg9" May 14 18:14:04.874596 kubelet[3205]: I0514 18:14:04.874170 3205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41e82bc8-542e-4aed-91e7-0d220ea19b51-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "41e82bc8-542e-4aed-91e7-0d220ea19b51" (UID: "41e82bc8-542e-4aed-91e7-0d220ea19b51"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" May 14 18:14:04.874596 kubelet[3205]: I0514 18:14:04.874542 3205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41e82bc8-542e-4aed-91e7-0d220ea19b51-cni-bin-dir" (OuterVolumeSpecName: "cni-bin-dir") pod "41e82bc8-542e-4aed-91e7-0d220ea19b51" (UID: "41e82bc8-542e-4aed-91e7-0d220ea19b51"). InnerVolumeSpecName "cni-bin-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" May 14 18:14:04.874596 kubelet[3205]: I0514 18:14:04.874553 3205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41e82bc8-542e-4aed-91e7-0d220ea19b51-cni-log-dir" (OuterVolumeSpecName: "cni-log-dir") pod "41e82bc8-542e-4aed-91e7-0d220ea19b51" (UID: "41e82bc8-542e-4aed-91e7-0d220ea19b51"). InnerVolumeSpecName "cni-log-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" May 14 18:14:04.874736 kubelet[3205]: I0514 18:14:04.874674 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/8adb9c92-ebf2-4942-8944-284ce86f2177-var-lib-calico\") pod \"calico-node-wjvg9\" (UID: \"8adb9c92-ebf2-4942-8944-284ce86f2177\") " pod="calico-system/calico-node-wjvg9" May 14 18:14:04.874736 kubelet[3205]: I0514 18:14:04.874694 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8adb9c92-ebf2-4942-8944-284ce86f2177-lib-modules\") pod \"calico-node-wjvg9\" (UID: \"8adb9c92-ebf2-4942-8944-284ce86f2177\") " pod="calico-system/calico-node-wjvg9" May 14 18:14:04.874736 kubelet[3205]: I0514 18:14:04.874708 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8adb9c92-ebf2-4942-8944-284ce86f2177-tigera-ca-bundle\") pod \"calico-node-wjvg9\" (UID: \"8adb9c92-ebf2-4942-8944-284ce86f2177\") " pod="calico-system/calico-node-wjvg9" May 14 18:14:04.874909 kubelet[3205]: I0514 18:14:04.874840 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/8adb9c92-ebf2-4942-8944-284ce86f2177-cni-bin-dir\") pod \"calico-node-wjvg9\" (UID: \"8adb9c92-ebf2-4942-8944-284ce86f2177\") " pod="calico-system/calico-node-wjvg9" May 14 18:14:04.874909 kubelet[3205]: I0514 18:14:04.874860 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/8adb9c92-ebf2-4942-8944-284ce86f2177-xtables-lock\") pod \"calico-node-wjvg9\" (UID: \"8adb9c92-ebf2-4942-8944-284ce86f2177\") " pod="calico-system/calico-node-wjvg9" May 14 
18:14:04.874909 kubelet[3205]: I0514 18:14:04.874878 3205 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/8adb9c92-ebf2-4942-8944-284ce86f2177-var-run-calico\") pod \"calico-node-wjvg9\" (UID: \"8adb9c92-ebf2-4942-8944-284ce86f2177\") " pod="calico-system/calico-node-wjvg9" May 14 18:14:04.874909 kubelet[3205]: I0514 18:14:04.874897 3205 reconciler_common.go:288] "Volume detached for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/41e82bc8-542e-4aed-91e7-0d220ea19b51-policysync\") on node \"ci-4334.0.0-a-c37eb65ec1\" DevicePath \"\"" May 14 18:14:04.875001 kubelet[3205]: I0514 18:14:04.874916 3205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41e82bc8-542e-4aed-91e7-0d220ea19b51-xtables-lock" (OuterVolumeSpecName: "xtables-lock") pod "41e82bc8-542e-4aed-91e7-0d220ea19b51" (UID: "41e82bc8-542e-4aed-91e7-0d220ea19b51"). InnerVolumeSpecName "xtables-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" May 14 18:14:04.875179 kubelet[3205]: I0514 18:14:04.875034 3205 reconciler_common.go:288] "Volume detached for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/41e82bc8-542e-4aed-91e7-0d220ea19b51-var-run-calico\") on node \"ci-4334.0.0-a-c37eb65ec1\" DevicePath \"\"" May 14 18:14:04.875179 kubelet[3205]: I0514 18:14:04.875045 3205 reconciler_common.go:288] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/41e82bc8-542e-4aed-91e7-0d220ea19b51-lib-modules\") on node \"ci-4334.0.0-a-c37eb65ec1\" DevicePath \"\"" May 14 18:14:04.875179 kubelet[3205]: I0514 18:14:04.875053 3205 reconciler_common.go:288] "Volume detached for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/41e82bc8-542e-4aed-91e7-0d220ea19b51-cni-bin-dir\") on node \"ci-4334.0.0-a-c37eb65ec1\" DevicePath \"\"" May 14 18:14:04.875179 kubelet[3205]: I0514 18:14:04.875060 3205 reconciler_common.go:288] "Volume detached for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/41e82bc8-542e-4aed-91e7-0d220ea19b51-cni-log-dir\") on node \"ci-4334.0.0-a-c37eb65ec1\" DevicePath \"\"" May 14 18:14:04.875179 kubelet[3205]: I0514 18:14:04.875068 3205 reconciler_common.go:288] "Volume detached for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/41e82bc8-542e-4aed-91e7-0d220ea19b51-cni-net-dir\") on node \"ci-4334.0.0-a-c37eb65ec1\" DevicePath \"\"" May 14 18:14:04.875755 kubelet[3205]: I0514 18:14:04.875700 3205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41e82bc8-542e-4aed-91e7-0d220ea19b51-var-lib-calico" (OuterVolumeSpecName: "var-lib-calico") pod "41e82bc8-542e-4aed-91e7-0d220ea19b51" (UID: "41e82bc8-542e-4aed-91e7-0d220ea19b51"). InnerVolumeSpecName "var-lib-calico". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" May 14 18:14:04.875755 kubelet[3205]: I0514 18:14:04.875731 3205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41e82bc8-542e-4aed-91e7-0d220ea19b51-flexvol-driver-host" (OuterVolumeSpecName: "flexvol-driver-host") pod "41e82bc8-542e-4aed-91e7-0d220ea19b51" (UID: "41e82bc8-542e-4aed-91e7-0d220ea19b51"). InnerVolumeSpecName "flexvol-driver-host". PluginName "kubernetes.io/host-path", VolumeGidValue "" May 14 18:14:04.877477 systemd[1]: var-lib-kubelet-pods-41e82bc8\x2d542e\x2d4aed\x2d91e7\x2d0d220ea19b51-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dnode-1.mount: Deactivated successfully. May 14 18:14:04.880712 kubelet[3205]: I0514 18:14:04.880690 3205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41e82bc8-542e-4aed-91e7-0d220ea19b51-kube-api-access-qcc9l" (OuterVolumeSpecName: "kube-api-access-qcc9l") pod "41e82bc8-542e-4aed-91e7-0d220ea19b51" (UID: "41e82bc8-542e-4aed-91e7-0d220ea19b51"). InnerVolumeSpecName "kube-api-access-qcc9l". PluginName "kubernetes.io/projected", VolumeGidValue "" May 14 18:14:04.881187 kubelet[3205]: I0514 18:14:04.880878 3205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41e82bc8-542e-4aed-91e7-0d220ea19b51-node-certs" (OuterVolumeSpecName: "node-certs") pod "41e82bc8-542e-4aed-91e7-0d220ea19b51" (UID: "41e82bc8-542e-4aed-91e7-0d220ea19b51"). InnerVolumeSpecName "node-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" May 14 18:14:04.881868 kubelet[3205]: I0514 18:14:04.881494 3205 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41e82bc8-542e-4aed-91e7-0d220ea19b51-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "41e82bc8-542e-4aed-91e7-0d220ea19b51" (UID: "41e82bc8-542e-4aed-91e7-0d220ea19b51"). InnerVolumeSpecName "tigera-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" May 14 18:14:04.910461 containerd[1734]: time="2025-05-14T18:14:04.910423689Z" level=error msg="Failed to destroy network for sandbox \"a11a46a2cac53984d24083881a048837a7d3400f1b7ae3bf0fafead859497fe5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 18:14:04.957664 containerd[1734]: time="2025-05-14T18:14:04.957577749Z" level=error msg="Failed to destroy network for sandbox \"0cc970dfa2ec6f41b1897dc59ea21411df00f39c1923844e39a6a027a68d818c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 18:14:04.976337 kubelet[3205]: I0514 18:14:04.976289 3205 reconciler_common.go:288] "Volume detached for volume \"kube-api-access-qcc9l\" (UniqueName: \"kubernetes.io/projected/41e82bc8-542e-4aed-91e7-0d220ea19b51-kube-api-access-qcc9l\") on node \"ci-4334.0.0-a-c37eb65ec1\" DevicePath \"\"" May 14 18:14:04.976337 kubelet[3205]: I0514 18:14:04.976309 3205 reconciler_common.go:288] "Volume detached for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/41e82bc8-542e-4aed-91e7-0d220ea19b51-xtables-lock\") on node \"ci-4334.0.0-a-c37eb65ec1\" DevicePath \"\"" May 14 18:14:04.976337 kubelet[3205]: I0514 18:14:04.976319 3205 reconciler_common.go:288] "Volume detached for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/41e82bc8-542e-4aed-91e7-0d220ea19b51-node-certs\") on node \"ci-4334.0.0-a-c37eb65ec1\" DevicePath \"\"" May 14 18:14:04.976531 kubelet[3205]: I0514 18:14:04.976452 3205 reconciler_common.go:288] "Volume detached for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/41e82bc8-542e-4aed-91e7-0d220ea19b51-var-lib-calico\") on node \"ci-4334.0.0-a-c37eb65ec1\" DevicePath \"\"" May 14 
18:14:04.976531 kubelet[3205]: I0514 18:14:04.976466 3205 reconciler_common.go:288] "Volume detached for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/41e82bc8-542e-4aed-91e7-0d220ea19b51-flexvol-driver-host\") on node \"ci-4334.0.0-a-c37eb65ec1\" DevicePath \"\"" May 14 18:14:04.976531 kubelet[3205]: I0514 18:14:04.976476 3205 reconciler_common.go:288] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41e82bc8-542e-4aed-91e7-0d220ea19b51-tigera-ca-bundle\") on node \"ci-4334.0.0-a-c37eb65ec1\" DevicePath \"\"" May 14 18:14:05.003112 containerd[1734]: time="2025-05-14T18:14:05.003088445Z" level=error msg="Failed to destroy network for sandbox \"810f3047f71d38c1363a3720abcb11611cb5ba776fe8281618055fb401eeeb52\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 18:14:05.020316 containerd[1734]: time="2025-05-14T18:14:05.020216870Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6cb7cd99bf-2dhkq,Uid:0272f273-8188-48f8-bfc1-35ee033246c4,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d4739f03df52d3b7025f82a3b5888cd3f92bbd3240fc3bc401ee3c2452f9ec2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 18:14:05.020409 kubelet[3205]: E0514 18:14:05.020385 3205 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d4739f03df52d3b7025f82a3b5888cd3f92bbd3240fc3bc401ee3c2452f9ec2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" May 14 18:14:05.020442 kubelet[3205]: E0514 18:14:05.020428 3205 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d4739f03df52d3b7025f82a3b5888cd3f92bbd3240fc3bc401ee3c2452f9ec2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6cb7cd99bf-2dhkq" May 14 18:14:05.020469 kubelet[3205]: E0514 18:14:05.020451 3205 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d4739f03df52d3b7025f82a3b5888cd3f92bbd3240fc3bc401ee3c2452f9ec2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6cb7cd99bf-2dhkq" May 14 18:14:05.020510 kubelet[3205]: E0514 18:14:05.020482 3205 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6cb7cd99bf-2dhkq_calico-apiserver(0272f273-8188-48f8-bfc1-35ee033246c4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6cb7cd99bf-2dhkq_calico-apiserver(0272f273-8188-48f8-bfc1-35ee033246c4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0d4739f03df52d3b7025f82a3b5888cd3f92bbd3240fc3bc401ee3c2452f9ec2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6cb7cd99bf-2dhkq" podUID="0272f273-8188-48f8-bfc1-35ee033246c4" May 14 18:14:05.066043 containerd[1734]: time="2025-05-14T18:14:05.065987134Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-node-wjvg9,Uid:8adb9c92-ebf2-4942-8944-284ce86f2177,Namespace:calico-system,Attempt:0,}" May 14 18:14:05.106263 systemd[1]: Removed slice kubepods-besteffort-pod41e82bc8_542e_4aed_91e7_0d220ea19b51.slice - libcontainer container kubepods-besteffort-pod41e82bc8_542e_4aed_91e7_0d220ea19b51.slice. May 14 18:14:05.106367 systemd[1]: kubepods-besteffort-pod41e82bc8_542e_4aed_91e7_0d220ea19b51.slice: Consumed 416ms CPU time, 188.2M memory peak, 160.4M written to disk. May 14 18:14:05.129290 containerd[1734]: time="2025-05-14T18:14:05.129247990Z" level=info msg="RemoveContainer for \"ac4e488d9a5307b8769f049c630ee16f15a26a0acbb958efcbfb04972a0f8037\" returns successfully" May 14 18:14:05.129481 kubelet[3205]: I0514 18:14:05.129464 3205 scope.go:117] "RemoveContainer" containerID="8a61e7da802874ef32356e1e4c8ae6f90c1047af98bf404c4073fe2b878a69ad" May 14 18:14:05.131031 containerd[1734]: time="2025-05-14T18:14:05.130999429Z" level=info msg="RemoveContainer for \"8a61e7da802874ef32356e1e4c8ae6f90c1047af98bf404c4073fe2b878a69ad\"" May 14 18:14:05.135797 containerd[1734]: time="2025-05-14T18:14:05.135769822Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-58d5b579d7-9wc6k,Uid:b7e2c3b5-0b2e-4624-9ffb-7b15307add96,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"35da5adb7d1e0eac73a20962856fa970afc7c5cf7087110b27f4d746324d1099\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 18:14:05.135923 kubelet[3205]: E0514 18:14:05.135905 3205 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"35da5adb7d1e0eac73a20962856fa970afc7c5cf7087110b27f4d746324d1099\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 18:14:05.135972 kubelet[3205]: E0514 18:14:05.135941 3205 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"35da5adb7d1e0eac73a20962856fa970afc7c5cf7087110b27f4d746324d1099\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-58d5b579d7-9wc6k" May 14 18:14:05.135972 kubelet[3205]: E0514 18:14:05.135961 3205 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"35da5adb7d1e0eac73a20962856fa970afc7c5cf7087110b27f4d746324d1099\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-58d5b579d7-9wc6k" May 14 18:14:05.136017 kubelet[3205]: E0514 18:14:05.135994 3205 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-58d5b579d7-9wc6k_calico-system(b7e2c3b5-0b2e-4624-9ffb-7b15307add96)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-58d5b579d7-9wc6k_calico-system(b7e2c3b5-0b2e-4624-9ffb-7b15307add96)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"35da5adb7d1e0eac73a20962856fa970afc7c5cf7087110b27f4d746324d1099\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-58d5b579d7-9wc6k" podUID="b7e2c3b5-0b2e-4624-9ffb-7b15307add96" May 14 18:14:05.174200 
containerd[1734]: time="2025-05-14T18:14:05.174165683Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-clvtl,Uid:f855f8c6-0735-4302-b463-7204c3026e15,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a11a46a2cac53984d24083881a048837a7d3400f1b7ae3bf0fafead859497fe5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 18:14:05.174368 kubelet[3205]: E0514 18:14:05.174349 3205 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a11a46a2cac53984d24083881a048837a7d3400f1b7ae3bf0fafead859497fe5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 18:14:05.174414 kubelet[3205]: E0514 18:14:05.174390 3205 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a11a46a2cac53984d24083881a048837a7d3400f1b7ae3bf0fafead859497fe5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-clvtl" May 14 18:14:05.174414 kubelet[3205]: E0514 18:14:05.174407 3205 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a11a46a2cac53984d24083881a048837a7d3400f1b7ae3bf0fafead859497fe5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-clvtl" May 14 
18:14:05.174477 kubelet[3205]: E0514 18:14:05.174440 3205 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-clvtl_kube-system(f855f8c6-0735-4302-b463-7204c3026e15)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-clvtl_kube-system(f855f8c6-0735-4302-b463-7204c3026e15)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a11a46a2cac53984d24083881a048837a7d3400f1b7ae3bf0fafead859497fe5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-clvtl" podUID="f855f8c6-0735-4302-b463-7204c3026e15" May 14 18:14:05.221435 containerd[1734]: time="2025-05-14T18:14:05.221340077Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6cb7cd99bf-z6fr6,Uid:0c3f601d-87ef-4fb7-a83d-21758b09a12d,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0cc970dfa2ec6f41b1897dc59ea21411df00f39c1923844e39a6a027a68d818c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 18:14:05.222110 kubelet[3205]: E0514 18:14:05.221481 3205 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0cc970dfa2ec6f41b1897dc59ea21411df00f39c1923844e39a6a027a68d818c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 18:14:05.222110 kubelet[3205]: E0514 18:14:05.221523 3205 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to 
setup network for sandbox \"0cc970dfa2ec6f41b1897dc59ea21411df00f39c1923844e39a6a027a68d818c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6cb7cd99bf-z6fr6" May 14 18:14:05.222110 kubelet[3205]: E0514 18:14:05.221539 3205 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0cc970dfa2ec6f41b1897dc59ea21411df00f39c1923844e39a6a027a68d818c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6cb7cd99bf-z6fr6" May 14 18:14:05.222211 kubelet[3205]: E0514 18:14:05.221572 3205 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6cb7cd99bf-z6fr6_calico-apiserver(0c3f601d-87ef-4fb7-a83d-21758b09a12d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6cb7cd99bf-z6fr6_calico-apiserver(0c3f601d-87ef-4fb7-a83d-21758b09a12d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0cc970dfa2ec6f41b1897dc59ea21411df00f39c1923844e39a6a027a68d818c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6cb7cd99bf-z6fr6" podUID="0c3f601d-87ef-4fb7-a83d-21758b09a12d" May 14 18:14:05.519596 containerd[1734]: time="2025-05-14T18:14:05.519493924Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cl9vj,Uid:f8c0ddea-8d34-4405-b1d7-cbac1335ae76,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for 
sandbox \"810f3047f71d38c1363a3720abcb11611cb5ba776fe8281618055fb401eeeb52\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 18:14:05.520047 kubelet[3205]: E0514 18:14:05.519709 3205 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"810f3047f71d38c1363a3720abcb11611cb5ba776fe8281618055fb401eeeb52\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 18:14:05.520047 kubelet[3205]: E0514 18:14:05.519789 3205 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"810f3047f71d38c1363a3720abcb11611cb5ba776fe8281618055fb401eeeb52\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-cl9vj" May 14 18:14:05.520047 kubelet[3205]: E0514 18:14:05.519814 3205 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"810f3047f71d38c1363a3720abcb11611cb5ba776fe8281618055fb401eeeb52\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-cl9vj" May 14 18:14:05.520163 kubelet[3205]: E0514 18:14:05.519877 3205 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-cl9vj_calico-system(f8c0ddea-8d34-4405-b1d7-cbac1335ae76)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"csi-node-driver-cl9vj_calico-system(f8c0ddea-8d34-4405-b1d7-cbac1335ae76)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"810f3047f71d38c1363a3720abcb11611cb5ba776fe8281618055fb401eeeb52\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-cl9vj" podUID="f8c0ddea-8d34-4405-b1d7-cbac1335ae76" May 14 18:14:05.578895 systemd[1]: run-netns-cni\x2df3e1c749\x2db920\x2df7fc\x2dd1e7\x2d52cf7fdba56a.mount: Deactivated successfully. May 14 18:14:05.579131 systemd[1]: run-netns-cni\x2da0a3939c\x2d183c\x2df4c6\x2dff2c\x2d6af95df763c6.mount: Deactivated successfully. May 14 18:14:05.579222 systemd[1]: run-netns-cni\x2db347f133\x2db1e9\x2dafe3\x2d19bf\x2d6496c74b11f4.mount: Deactivated successfully. May 14 18:14:05.579309 systemd[1]: var-lib-kubelet-pods-41e82bc8\x2d542e\x2d4aed\x2d91e7\x2d0d220ea19b51-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dqcc9l.mount: Deactivated successfully. May 14 18:14:05.579395 systemd[1]: var-lib-kubelet-pods-41e82bc8\x2d542e\x2d4aed\x2d91e7\x2d0d220ea19b51-volumes-kubernetes.io\x7esecret-node\x2dcerts.mount: Deactivated successfully. 
May 14 18:14:05.724099 containerd[1734]: time="2025-05-14T18:14:05.724072098Z" level=info msg="RemoveContainer for \"8a61e7da802874ef32356e1e4c8ae6f90c1047af98bf404c4073fe2b878a69ad\" returns successfully" May 14 18:14:05.724427 kubelet[3205]: I0514 18:14:05.724224 3205 scope.go:117] "RemoveContainer" containerID="9ac6799244715ab40c2f32cb277da464ee1b5dec9980e5bd21fde2c85cd4694b" May 14 18:14:05.726182 containerd[1734]: time="2025-05-14T18:14:05.726156781Z" level=info msg="RemoveContainer for \"9ac6799244715ab40c2f32cb277da464ee1b5dec9980e5bd21fde2c85cd4694b\"" May 14 18:14:05.773017 systemd[1]: Started sshd@7-10.200.8.38:22-10.200.16.10:51642.service - OpenSSH per-connection server daemon (10.200.16.10:51642). May 14 18:14:05.819058 containerd[1734]: time="2025-05-14T18:14:05.819004513Z" level=info msg="RemoveContainer for \"9ac6799244715ab40c2f32cb277da464ee1b5dec9980e5bd21fde2c85cd4694b\" returns successfully" May 14 18:14:06.080803 containerd[1734]: time="2025-05-14T18:14:06.080690442Z" level=info msg="connecting to shim b6d54cd146b9495076ca4aa695300b7500daf35e7b54e830b2ebe4f924a20d08" address="unix:///run/containerd/s/393a1d84673ce1e8fab368dddf80cf8fcaa8274c28af89edf750b5051d9b68ea" namespace=k8s.io protocol=ttrpc version=3 May 14 18:14:06.102877 systemd[1]: Started cri-containerd-b6d54cd146b9495076ca4aa695300b7500daf35e7b54e830b2ebe4f924a20d08.scope - libcontainer container b6d54cd146b9495076ca4aa695300b7500daf35e7b54e830b2ebe4f924a20d08. 
May 14 18:14:06.121497 containerd[1734]: time="2025-05-14T18:14:06.121474433Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-wjvg9,Uid:8adb9c92-ebf2-4942-8944-284ce86f2177,Namespace:calico-system,Attempt:0,} returns sandbox id \"b6d54cd146b9495076ca4aa695300b7500daf35e7b54e830b2ebe4f924a20d08\"" May 14 18:14:06.123500 containerd[1734]: time="2025-05-14T18:14:06.123481071Z" level=info msg="CreateContainer within sandbox \"b6d54cd146b9495076ca4aa695300b7500daf35e7b54e830b2ebe4f924a20d08\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" May 14 18:14:06.332798 containerd[1734]: time="2025-05-14T18:14:06.329970377Z" level=info msg="Container f97b5e07a9c22d884d492871d350d6d75bcd9f5893a14071eae4165e7a34a609: CDI devices from CRI Config.CDIDevices: []" May 14 18:14:06.406433 sshd[4662]: Accepted publickey for core from 10.200.16.10 port 51642 ssh2: RSA SHA256:Hd5PQzbog4WemQvfyzJrpsOqYyUj2ZNV4jY0cJYqkm4 May 14 18:14:06.407618 sshd-session[4662]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 18:14:06.411415 systemd-logind[1706]: New session 10 of user core. May 14 18:14:06.416882 systemd[1]: Started session-10.scope - Session 10 of User core. 
May 14 18:14:06.471643 containerd[1734]: time="2025-05-14T18:14:06.471620242Z" level=info msg="CreateContainer within sandbox \"b6d54cd146b9495076ca4aa695300b7500daf35e7b54e830b2ebe4f924a20d08\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"f97b5e07a9c22d884d492871d350d6d75bcd9f5893a14071eae4165e7a34a609\"" May 14 18:14:06.472091 containerd[1734]: time="2025-05-14T18:14:06.472065680Z" level=info msg="StartContainer for \"f97b5e07a9c22d884d492871d350d6d75bcd9f5893a14071eae4165e7a34a609\"" May 14 18:14:06.473892 containerd[1734]: time="2025-05-14T18:14:06.473858731Z" level=info msg="connecting to shim f97b5e07a9c22d884d492871d350d6d75bcd9f5893a14071eae4165e7a34a609" address="unix:///run/containerd/s/393a1d84673ce1e8fab368dddf80cf8fcaa8274c28af89edf750b5051d9b68ea" protocol=ttrpc version=3 May 14 18:14:06.493888 systemd[1]: Started cri-containerd-f97b5e07a9c22d884d492871d350d6d75bcd9f5893a14071eae4165e7a34a609.scope - libcontainer container f97b5e07a9c22d884d492871d350d6d75bcd9f5893a14071eae4165e7a34a609. May 14 18:14:06.519476 containerd[1734]: time="2025-05-14T18:14:06.519441766Z" level=info msg="StartContainer for \"f97b5e07a9c22d884d492871d350d6d75bcd9f5893a14071eae4165e7a34a609\" returns successfully" May 14 18:14:06.526478 systemd[1]: cri-containerd-f97b5e07a9c22d884d492871d350d6d75bcd9f5893a14071eae4165e7a34a609.scope: Deactivated successfully. May 14 18:14:06.526904 systemd[1]: cri-containerd-f97b5e07a9c22d884d492871d350d6d75bcd9f5893a14071eae4165e7a34a609.scope: Consumed 21ms CPU time, 8.2M memory peak, 6.3M written to disk. 
May 14 18:14:06.528610 containerd[1734]: time="2025-05-14T18:14:06.528571238Z" level=info msg="received exit event container_id:\"f97b5e07a9c22d884d492871d350d6d75bcd9f5893a14071eae4165e7a34a609\" id:\"f97b5e07a9c22d884d492871d350d6d75bcd9f5893a14071eae4165e7a34a609\" pid:4724 exited_at:{seconds:1747246446 nanos:528419217}" May 14 18:14:06.528795 containerd[1734]: time="2025-05-14T18:14:06.528735021Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f97b5e07a9c22d884d492871d350d6d75bcd9f5893a14071eae4165e7a34a609\" id:\"f97b5e07a9c22d884d492871d350d6d75bcd9f5893a14071eae4165e7a34a609\" pid:4724 exited_at:{seconds:1747246446 nanos:528419217}" May 14 18:14:06.581804 kubelet[3205]: I0514 18:14:06.581778 3205 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41e82bc8-542e-4aed-91e7-0d220ea19b51" path="/var/lib/kubelet/pods/41e82bc8-542e-4aed-91e7-0d220ea19b51/volumes" May 14 18:14:07.530992 sshd[4711]: Connection closed by 10.200.16.10 port 51642 May 14 18:14:07.531560 sshd-session[4662]: pam_unix(sshd:session): session closed for user core May 14 18:14:07.534586 systemd[1]: sshd@7-10.200.8.38:22-10.200.16.10:51642.service: Deactivated successfully. May 14 18:14:07.536269 systemd[1]: session-10.scope: Deactivated successfully. May 14 18:14:07.537857 systemd-logind[1706]: Session 10 logged out. Waiting for processes to exit. May 14 18:14:07.538645 systemd-logind[1706]: Removed session 10. 
May 14 18:14:10.821874 containerd[1734]: time="2025-05-14T18:14:10.821825680Z" level=info msg="CreateContainer within sandbox \"b6d54cd146b9495076ca4aa695300b7500daf35e7b54e830b2ebe4f924a20d08\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" May 14 18:14:11.520904 containerd[1734]: time="2025-05-14T18:14:11.520864482Z" level=info msg="Container 716760d7f27b1a95c265606f7caefc2bfc53f747cce29085a36601c73263a03a: CDI devices from CRI Config.CDIDevices: []" May 14 18:14:11.678335 containerd[1734]: time="2025-05-14T18:14:11.678292891Z" level=info msg="CreateContainer within sandbox \"b6d54cd146b9495076ca4aa695300b7500daf35e7b54e830b2ebe4f924a20d08\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"716760d7f27b1a95c265606f7caefc2bfc53f747cce29085a36601c73263a03a\"" May 14 18:14:11.678693 containerd[1734]: time="2025-05-14T18:14:11.678655828Z" level=info msg="StartContainer for \"716760d7f27b1a95c265606f7caefc2bfc53f747cce29085a36601c73263a03a\"" May 14 18:14:11.679998 containerd[1734]: time="2025-05-14T18:14:11.679962271Z" level=info msg="connecting to shim 716760d7f27b1a95c265606f7caefc2bfc53f747cce29085a36601c73263a03a" address="unix:///run/containerd/s/393a1d84673ce1e8fab368dddf80cf8fcaa8274c28af89edf750b5051d9b68ea" protocol=ttrpc version=3 May 14 18:14:11.698910 systemd[1]: Started cri-containerd-716760d7f27b1a95c265606f7caefc2bfc53f747cce29085a36601c73263a03a.scope - libcontainer container 716760d7f27b1a95c265606f7caefc2bfc53f747cce29085a36601c73263a03a. 
May 14 18:14:11.732416 containerd[1734]: time="2025-05-14T18:14:11.732331343Z" level=info msg="StartContainer for \"716760d7f27b1a95c265606f7caefc2bfc53f747cce29085a36601c73263a03a\" returns successfully" May 14 18:14:11.992902 containerd[1734]: time="2025-05-14T18:14:11.992869355Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: failed to load CNI config list file /etc/cni/net.d/10-calico.conflist: error parsing configuration list: unexpected end of JSON input: invalid cni config: failed to load cni config" May 14 18:14:11.994694 systemd[1]: cri-containerd-716760d7f27b1a95c265606f7caefc2bfc53f747cce29085a36601c73263a03a.scope: Deactivated successfully. May 14 18:14:11.995887 containerd[1734]: time="2025-05-14T18:14:11.995864512Z" level=info msg="received exit event container_id:\"716760d7f27b1a95c265606f7caefc2bfc53f747cce29085a36601c73263a03a\" id:\"716760d7f27b1a95c265606f7caefc2bfc53f747cce29085a36601c73263a03a\" pid:4781 exited_at:{seconds:1747246451 nanos:995638400}" May 14 18:14:11.995973 containerd[1734]: time="2025-05-14T18:14:11.995873467Z" level=info msg="TaskExit event in podsandbox handler container_id:\"716760d7f27b1a95c265606f7caefc2bfc53f747cce29085a36601c73263a03a\" id:\"716760d7f27b1a95c265606f7caefc2bfc53f747cce29085a36601c73263a03a\" pid:4781 exited_at:{seconds:1747246451 nanos:995638400}" May 14 18:14:12.010035 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-716760d7f27b1a95c265606f7caefc2bfc53f747cce29085a36601c73263a03a-rootfs.mount: Deactivated successfully. 
May 14 18:14:12.925839 containerd[1734]: time="2025-05-14T18:14:12.580079631Z" level=info msg="StopPodSandbox for \"164481cda957baa615a28b334f99e0d87a8f3ebbc03e6a6f5296cdc8a49411b7\"" May 14 18:14:12.925839 containerd[1734]: time="2025-05-14T18:14:12.580185358Z" level=info msg="TearDown network for sandbox \"164481cda957baa615a28b334f99e0d87a8f3ebbc03e6a6f5296cdc8a49411b7\" successfully" May 14 18:14:12.925839 containerd[1734]: time="2025-05-14T18:14:12.580195054Z" level=info msg="StopPodSandbox for \"164481cda957baa615a28b334f99e0d87a8f3ebbc03e6a6f5296cdc8a49411b7\" returns successfully" May 14 18:14:12.925839 containerd[1734]: time="2025-05-14T18:14:12.580466727Z" level=info msg="RemovePodSandbox for \"164481cda957baa615a28b334f99e0d87a8f3ebbc03e6a6f5296cdc8a49411b7\"" May 14 18:14:12.925839 containerd[1734]: time="2025-05-14T18:14:12.580482486Z" level=info msg="Forcibly stopping sandbox \"164481cda957baa615a28b334f99e0d87a8f3ebbc03e6a6f5296cdc8a49411b7\"" May 14 18:14:12.925839 containerd[1734]: time="2025-05-14T18:14:12.580541406Z" level=info msg="TearDown network for sandbox \"164481cda957baa615a28b334f99e0d87a8f3ebbc03e6a6f5296cdc8a49411b7\" successfully" May 14 18:14:12.646914 systemd[1]: Started sshd@8-10.200.8.38:22-10.200.16.10:33486.service - OpenSSH per-connection server daemon (10.200.16.10:33486). May 14 18:14:12.926557 containerd[1734]: time="2025-05-14T18:14:12.926166978Z" level=info msg="Ensure that sandbox 164481cda957baa615a28b334f99e0d87a8f3ebbc03e6a6f5296cdc8a49411b7 in task-service has been cleanup successfully" May 14 18:14:13.322010 sshd[4813]: Accepted publickey for core from 10.200.16.10 port 33486 ssh2: RSA SHA256:Hd5PQzbog4WemQvfyzJrpsOqYyUj2ZNV4jY0cJYqkm4 May 14 18:14:13.293400 systemd-logind[1706]: New session 11 of user core. May 14 18:14:13.289619 sshd-session[4813]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 18:14:13.299881 systemd[1]: Started session-11.scope - Session 11 of User core. 
May 14 18:14:13.763994 containerd[1734]: time="2025-05-14T18:14:13.763867856Z" level=info msg="RemovePodSandbox \"164481cda957baa615a28b334f99e0d87a8f3ebbc03e6a6f5296cdc8a49411b7\" returns successfully" May 14 18:14:13.776823 sshd[4815]: Connection closed by 10.200.16.10 port 33486 May 14 18:14:13.777212 sshd-session[4813]: pam_unix(sshd:session): session closed for user core May 14 18:14:13.779902 systemd[1]: sshd@8-10.200.8.38:22-10.200.16.10:33486.service: Deactivated successfully. May 14 18:14:13.782302 systemd[1]: session-11.scope: Deactivated successfully. May 14 18:14:13.783091 systemd-logind[1706]: Session 11 logged out. Waiting for processes to exit. May 14 18:14:13.787380 systemd-logind[1706]: Removed session 11. May 14 18:14:13.842294 containerd[1734]: time="2025-05-14T18:14:13.842269990Z" level=info msg="CreateContainer within sandbox \"b6d54cd146b9495076ca4aa695300b7500daf35e7b54e830b2ebe4f924a20d08\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" May 14 18:14:14.115777 containerd[1734]: time="2025-05-14T18:14:14.114814241Z" level=info msg="Container 42ee62a33139b6c4fb3dbe070aa1de09d911cd4bb817783c31a97b968b7f553c: CDI devices from CRI Config.CDIDevices: []" May 14 18:14:14.230941 containerd[1734]: time="2025-05-14T18:14:14.230914743Z" level=info msg="CreateContainer within sandbox \"b6d54cd146b9495076ca4aa695300b7500daf35e7b54e830b2ebe4f924a20d08\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"42ee62a33139b6c4fb3dbe070aa1de09d911cd4bb817783c31a97b968b7f553c\"" May 14 18:14:14.231340 containerd[1734]: time="2025-05-14T18:14:14.231298660Z" level=info msg="StartContainer for \"42ee62a33139b6c4fb3dbe070aa1de09d911cd4bb817783c31a97b968b7f553c\"" May 14 18:14:14.232607 containerd[1734]: time="2025-05-14T18:14:14.232570200Z" level=info msg="connecting to shim 42ee62a33139b6c4fb3dbe070aa1de09d911cd4bb817783c31a97b968b7f553c" 
address="unix:///run/containerd/s/393a1d84673ce1e8fab368dddf80cf8fcaa8274c28af89edf750b5051d9b68ea" protocol=ttrpc version=3 May 14 18:14:14.256888 systemd[1]: Started cri-containerd-42ee62a33139b6c4fb3dbe070aa1de09d911cd4bb817783c31a97b968b7f553c.scope - libcontainer container 42ee62a33139b6c4fb3dbe070aa1de09d911cd4bb817783c31a97b968b7f553c. May 14 18:14:14.287656 containerd[1734]: time="2025-05-14T18:14:14.287599084Z" level=info msg="StartContainer for \"42ee62a33139b6c4fb3dbe070aa1de09d911cd4bb817783c31a97b968b7f553c\" returns successfully" May 14 18:14:14.853079 kubelet[3205]: I0514 18:14:14.852717 3205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-wjvg9" podStartSLOduration=10.852699816 podStartE2EDuration="10.852699816s" podCreationTimestamp="2025-05-14 18:14:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-14 18:14:14.851921708 +0000 UTC m=+122.344445068" watchObservedRunningTime="2025-05-14 18:14:14.852699816 +0000 UTC m=+122.345223177" May 14 18:14:14.878720 containerd[1734]: time="2025-05-14T18:14:14.878671324Z" level=info msg="TaskExit event in podsandbox handler container_id:\"42ee62a33139b6c4fb3dbe070aa1de09d911cd4bb817783c31a97b968b7f553c\" id:\"b0681ece60ae685e0680baa4f78b6900a529bd03a8c3245d974ebed14e3ae34f\" pid:4893 exit_status:1 exited_at:{seconds:1747246454 nanos:878398532}" May 14 18:14:15.873158 systemd-networkd[1358]: vxlan.calico: Link UP May 14 18:14:15.873166 systemd-networkd[1358]: vxlan.calico: Gained carrier May 14 18:14:15.889966 containerd[1734]: time="2025-05-14T18:14:15.889932944Z" level=info msg="TaskExit event in podsandbox handler container_id:\"42ee62a33139b6c4fb3dbe070aa1de09d911cd4bb817783c31a97b968b7f553c\" id:\"5043ba5458214e93b320c8a30f7a02d6ef3f9e4a7d3dc7411654d18d19d43586\" pid:5053 exit_status:1 exited_at:{seconds:1747246455 nanos:889502191}" May 14 18:14:16.580856 
containerd[1734]: time="2025-05-14T18:14:16.580820505Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-clvtl,Uid:f855f8c6-0735-4302-b463-7204c3026e15,Namespace:kube-system,Attempt:0,}" May 14 18:14:16.685144 systemd-networkd[1358]: cali22829625ffa: Link UP May 14 18:14:16.685674 systemd-networkd[1358]: cali22829625ffa: Gained carrier May 14 18:14:16.697583 containerd[1734]: 2025-05-14 18:14:16.637 [INFO][5117] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4334.0.0--a--c37eb65ec1-k8s-coredns--6f6b679f8f--clvtl-eth0 coredns-6f6b679f8f- kube-system f855f8c6-0735-4302-b463-7204c3026e15 803 0 2025-05-14 18:12:14 +0000 UTC map[k8s-app:kube-dns pod-template-hash:6f6b679f8f projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4334.0.0-a-c37eb65ec1 coredns-6f6b679f8f-clvtl eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali22829625ffa [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="1df149c3f5facd032a441f5f715ec4c32554082c55b49f83e5fbc5d1d9a20121" Namespace="kube-system" Pod="coredns-6f6b679f8f-clvtl" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-coredns--6f6b679f8f--clvtl-" May 14 18:14:16.697583 containerd[1734]: 2025-05-14 18:14:16.637 [INFO][5117] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="1df149c3f5facd032a441f5f715ec4c32554082c55b49f83e5fbc5d1d9a20121" Namespace="kube-system" Pod="coredns-6f6b679f8f-clvtl" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-coredns--6f6b679f8f--clvtl-eth0" May 14 18:14:16.697583 containerd[1734]: 2025-05-14 18:14:16.656 [INFO][5129] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1df149c3f5facd032a441f5f715ec4c32554082c55b49f83e5fbc5d1d9a20121" HandleID="k8s-pod-network.1df149c3f5facd032a441f5f715ec4c32554082c55b49f83e5fbc5d1d9a20121" 
Workload="ci--4334.0.0--a--c37eb65ec1-k8s-coredns--6f6b679f8f--clvtl-eth0" May 14 18:14:16.697786 containerd[1734]: 2025-05-14 18:14:16.662 [INFO][5129] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1df149c3f5facd032a441f5f715ec4c32554082c55b49f83e5fbc5d1d9a20121" HandleID="k8s-pod-network.1df149c3f5facd032a441f5f715ec4c32554082c55b49f83e5fbc5d1d9a20121" Workload="ci--4334.0.0--a--c37eb65ec1-k8s-coredns--6f6b679f8f--clvtl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00031b540), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4334.0.0-a-c37eb65ec1", "pod":"coredns-6f6b679f8f-clvtl", "timestamp":"2025-05-14 18:14:16.656308384 +0000 UTC"}, Hostname:"ci-4334.0.0-a-c37eb65ec1", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 14 18:14:16.697786 containerd[1734]: 2025-05-14 18:14:16.662 [INFO][5129] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 14 18:14:16.697786 containerd[1734]: 2025-05-14 18:14:16.662 [INFO][5129] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 14 18:14:16.697786 containerd[1734]: 2025-05-14 18:14:16.662 [INFO][5129] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4334.0.0-a-c37eb65ec1' May 14 18:14:16.697786 containerd[1734]: 2025-05-14 18:14:16.664 [INFO][5129] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.1df149c3f5facd032a441f5f715ec4c32554082c55b49f83e5fbc5d1d9a20121" host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:16.697786 containerd[1734]: 2025-05-14 18:14:16.666 [INFO][5129] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:16.697786 containerd[1734]: 2025-05-14 18:14:16.669 [INFO][5129] ipam/ipam.go 489: Trying affinity for 192.168.10.64/26 host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:16.697786 containerd[1734]: 2025-05-14 18:14:16.670 [INFO][5129] ipam/ipam.go 155: Attempting to load block cidr=192.168.10.64/26 host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:16.697786 containerd[1734]: 2025-05-14 18:14:16.671 [INFO][5129] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.10.64/26 host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:16.697993 containerd[1734]: 2025-05-14 18:14:16.671 [INFO][5129] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.10.64/26 handle="k8s-pod-network.1df149c3f5facd032a441f5f715ec4c32554082c55b49f83e5fbc5d1d9a20121" host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:16.697993 containerd[1734]: 2025-05-14 18:14:16.672 [INFO][5129] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.1df149c3f5facd032a441f5f715ec4c32554082c55b49f83e5fbc5d1d9a20121 May 14 18:14:16.697993 containerd[1734]: 2025-05-14 18:14:16.678 [INFO][5129] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.10.64/26 handle="k8s-pod-network.1df149c3f5facd032a441f5f715ec4c32554082c55b49f83e5fbc5d1d9a20121" host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:16.697993 containerd[1734]: 2025-05-14 18:14:16.682 [INFO][5129] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.10.65/26] block=192.168.10.64/26 handle="k8s-pod-network.1df149c3f5facd032a441f5f715ec4c32554082c55b49f83e5fbc5d1d9a20121" host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:16.697993 containerd[1734]: 2025-05-14 18:14:16.682 [INFO][5129] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.10.65/26] handle="k8s-pod-network.1df149c3f5facd032a441f5f715ec4c32554082c55b49f83e5fbc5d1d9a20121" host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:16.697993 containerd[1734]: 2025-05-14 18:14:16.682 [INFO][5129] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 14 18:14:16.697993 containerd[1734]: 2025-05-14 18:14:16.682 [INFO][5129] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.10.65/26] IPv6=[] ContainerID="1df149c3f5facd032a441f5f715ec4c32554082c55b49f83e5fbc5d1d9a20121" HandleID="k8s-pod-network.1df149c3f5facd032a441f5f715ec4c32554082c55b49f83e5fbc5d1d9a20121" Workload="ci--4334.0.0--a--c37eb65ec1-k8s-coredns--6f6b679f8f--clvtl-eth0" May 14 18:14:16.698131 containerd[1734]: 2025-05-14 18:14:16.683 [INFO][5117] cni-plugin/k8s.go 386: Populated endpoint ContainerID="1df149c3f5facd032a441f5f715ec4c32554082c55b49f83e5fbc5d1d9a20121" Namespace="kube-system" Pod="coredns-6f6b679f8f-clvtl" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-coredns--6f6b679f8f--clvtl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334.0.0--a--c37eb65ec1-k8s-coredns--6f6b679f8f--clvtl-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"f855f8c6-0735-4302-b463-7204c3026e15", ResourceVersion:"803", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 18, 12, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334.0.0-a-c37eb65ec1", ContainerID:"", Pod:"coredns-6f6b679f8f-clvtl", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.10.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali22829625ffa", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 18:14:16.698131 containerd[1734]: 2025-05-14 18:14:16.683 [INFO][5117] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.10.65/32] ContainerID="1df149c3f5facd032a441f5f715ec4c32554082c55b49f83e5fbc5d1d9a20121" Namespace="kube-system" Pod="coredns-6f6b679f8f-clvtl" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-coredns--6f6b679f8f--clvtl-eth0" May 14 18:14:16.698131 containerd[1734]: 2025-05-14 18:14:16.683 [INFO][5117] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali22829625ffa ContainerID="1df149c3f5facd032a441f5f715ec4c32554082c55b49f83e5fbc5d1d9a20121" Namespace="kube-system" Pod="coredns-6f6b679f8f-clvtl" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-coredns--6f6b679f8f--clvtl-eth0" May 14 18:14:16.698131 containerd[1734]: 2025-05-14 18:14:16.685 [INFO][5117] cni-plugin/dataplane_linux.go 508: Disabling IPv4 
forwarding ContainerID="1df149c3f5facd032a441f5f715ec4c32554082c55b49f83e5fbc5d1d9a20121" Namespace="kube-system" Pod="coredns-6f6b679f8f-clvtl" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-coredns--6f6b679f8f--clvtl-eth0" May 14 18:14:16.698131 containerd[1734]: 2025-05-14 18:14:16.685 [INFO][5117] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="1df149c3f5facd032a441f5f715ec4c32554082c55b49f83e5fbc5d1d9a20121" Namespace="kube-system" Pod="coredns-6f6b679f8f-clvtl" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-coredns--6f6b679f8f--clvtl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334.0.0--a--c37eb65ec1-k8s-coredns--6f6b679f8f--clvtl-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"f855f8c6-0735-4302-b463-7204c3026e15", ResourceVersion:"803", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 18, 12, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334.0.0-a-c37eb65ec1", ContainerID:"1df149c3f5facd032a441f5f715ec4c32554082c55b49f83e5fbc5d1d9a20121", Pod:"coredns-6f6b679f8f-clvtl", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.10.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali22829625ffa", MAC:"f2:e9:4a:ee:01:f0", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 18:14:16.698131 containerd[1734]: 2025-05-14 18:14:16.695 [INFO][5117] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="1df149c3f5facd032a441f5f715ec4c32554082c55b49f83e5fbc5d1d9a20121" Namespace="kube-system" Pod="coredns-6f6b679f8f-clvtl" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-coredns--6f6b679f8f--clvtl-eth0" May 14 18:14:17.125319 containerd[1734]: time="2025-05-14T18:14:17.125280423Z" level=info msg="connecting to shim 1df149c3f5facd032a441f5f715ec4c32554082c55b49f83e5fbc5d1d9a20121" address="unix:///run/containerd/s/5282e70e5eaeb3af86031920c1aaeea1d29e8d11306fecc3dce0e4a195b86b9c" namespace=k8s.io protocol=ttrpc version=3 May 14 18:14:17.147887 systemd[1]: Started cri-containerd-1df149c3f5facd032a441f5f715ec4c32554082c55b49f83e5fbc5d1d9a20121.scope - libcontainer container 1df149c3f5facd032a441f5f715ec4c32554082c55b49f83e5fbc5d1d9a20121. 
May 14 18:14:17.190563 containerd[1734]: time="2025-05-14T18:14:17.190545077Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-clvtl,Uid:f855f8c6-0735-4302-b463-7204c3026e15,Namespace:kube-system,Attempt:0,} returns sandbox id \"1df149c3f5facd032a441f5f715ec4c32554082c55b49f83e5fbc5d1d9a20121\"" May 14 18:14:17.192537 containerd[1734]: time="2025-05-14T18:14:17.192517771Z" level=info msg="CreateContainer within sandbox \"1df149c3f5facd032a441f5f715ec4c32554082c55b49f83e5fbc5d1d9a20121\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 14 18:14:17.406856 systemd-networkd[1358]: vxlan.calico: Gained IPv6LL May 14 18:14:17.414767 containerd[1734]: time="2025-05-14T18:14:17.414460996Z" level=info msg="Container 48c273b90696b77455bc504eeb7ffcfbd1fad9c6f0cbb18d6853eb39df658a8a: CDI devices from CRI Config.CDIDevices: []" May 14 18:14:17.418860 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1435602300.mount: Deactivated successfully. May 14 18:14:17.529069 containerd[1734]: time="2025-05-14T18:14:17.529043203Z" level=info msg="CreateContainer within sandbox \"1df149c3f5facd032a441f5f715ec4c32554082c55b49f83e5fbc5d1d9a20121\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"48c273b90696b77455bc504eeb7ffcfbd1fad9c6f0cbb18d6853eb39df658a8a\"" May 14 18:14:17.529475 containerd[1734]: time="2025-05-14T18:14:17.529417880Z" level=info msg="StartContainer for \"48c273b90696b77455bc504eeb7ffcfbd1fad9c6f0cbb18d6853eb39df658a8a\"" May 14 18:14:17.530293 containerd[1734]: time="2025-05-14T18:14:17.530249146Z" level=info msg="connecting to shim 48c273b90696b77455bc504eeb7ffcfbd1fad9c6f0cbb18d6853eb39df658a8a" address="unix:///run/containerd/s/5282e70e5eaeb3af86031920c1aaeea1d29e8d11306fecc3dce0e4a195b86b9c" protocol=ttrpc version=3 May 14 18:14:17.547882 systemd[1]: Started cri-containerd-48c273b90696b77455bc504eeb7ffcfbd1fad9c6f0cbb18d6853eb39df658a8a.scope - libcontainer container 
48c273b90696b77455bc504eeb7ffcfbd1fad9c6f0cbb18d6853eb39df658a8a. May 14 18:14:17.574312 containerd[1734]: time="2025-05-14T18:14:17.574291303Z" level=info msg="StartContainer for \"48c273b90696b77455bc504eeb7ffcfbd1fad9c6f0cbb18d6853eb39df658a8a\" returns successfully" May 14 18:14:17.580506 containerd[1734]: time="2025-05-14T18:14:17.580443574Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6cb7cd99bf-z6fr6,Uid:0c3f601d-87ef-4fb7-a83d-21758b09a12d,Namespace:calico-apiserver,Attempt:0,}" May 14 18:14:17.739345 systemd-networkd[1358]: cali24b4a6f1036: Link UP May 14 18:14:17.739463 systemd-networkd[1358]: cali24b4a6f1036: Gained carrier May 14 18:14:17.751330 containerd[1734]: 2025-05-14 18:14:17.688 [INFO][5231] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4334.0.0--a--c37eb65ec1-k8s-calico--apiserver--6cb7cd99bf--z6fr6-eth0 calico-apiserver-6cb7cd99bf- calico-apiserver 0c3f601d-87ef-4fb7-a83d-21758b09a12d 800 0 2025-05-14 18:12:30 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6cb7cd99bf projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4334.0.0-a-c37eb65ec1 calico-apiserver-6cb7cd99bf-z6fr6 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali24b4a6f1036 [] []}} ContainerID="cb8162aac16f6172eadaca968230602ba90fd6347a2ef369762c235f582f5c91" Namespace="calico-apiserver" Pod="calico-apiserver-6cb7cd99bf-z6fr6" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-calico--apiserver--6cb7cd99bf--z6fr6-" May 14 18:14:17.751330 containerd[1734]: 2025-05-14 18:14:17.688 [INFO][5231] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="cb8162aac16f6172eadaca968230602ba90fd6347a2ef369762c235f582f5c91" Namespace="calico-apiserver" 
Pod="calico-apiserver-6cb7cd99bf-z6fr6" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-calico--apiserver--6cb7cd99bf--z6fr6-eth0" May 14 18:14:17.751330 containerd[1734]: 2025-05-14 18:14:17.708 [INFO][5243] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cb8162aac16f6172eadaca968230602ba90fd6347a2ef369762c235f582f5c91" HandleID="k8s-pod-network.cb8162aac16f6172eadaca968230602ba90fd6347a2ef369762c235f582f5c91" Workload="ci--4334.0.0--a--c37eb65ec1-k8s-calico--apiserver--6cb7cd99bf--z6fr6-eth0" May 14 18:14:17.751330 containerd[1734]: 2025-05-14 18:14:17.714 [INFO][5243] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="cb8162aac16f6172eadaca968230602ba90fd6347a2ef369762c235f582f5c91" HandleID="k8s-pod-network.cb8162aac16f6172eadaca968230602ba90fd6347a2ef369762c235f582f5c91" Workload="ci--4334.0.0--a--c37eb65ec1-k8s-calico--apiserver--6cb7cd99bf--z6fr6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000290b20), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4334.0.0-a-c37eb65ec1", "pod":"calico-apiserver-6cb7cd99bf-z6fr6", "timestamp":"2025-05-14 18:14:17.708485927 +0000 UTC"}, Hostname:"ci-4334.0.0-a-c37eb65ec1", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 14 18:14:17.751330 containerd[1734]: 2025-05-14 18:14:17.714 [INFO][5243] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 14 18:14:17.751330 containerd[1734]: 2025-05-14 18:14:17.714 [INFO][5243] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 14 18:14:17.751330 containerd[1734]: 2025-05-14 18:14:17.714 [INFO][5243] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4334.0.0-a-c37eb65ec1' May 14 18:14:17.751330 containerd[1734]: 2025-05-14 18:14:17.716 [INFO][5243] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.cb8162aac16f6172eadaca968230602ba90fd6347a2ef369762c235f582f5c91" host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:17.751330 containerd[1734]: 2025-05-14 18:14:17.719 [INFO][5243] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:17.751330 containerd[1734]: 2025-05-14 18:14:17.722 [INFO][5243] ipam/ipam.go 489: Trying affinity for 192.168.10.64/26 host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:17.751330 containerd[1734]: 2025-05-14 18:14:17.723 [INFO][5243] ipam/ipam.go 155: Attempting to load block cidr=192.168.10.64/26 host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:17.751330 containerd[1734]: 2025-05-14 18:14:17.725 [INFO][5243] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.10.64/26 host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:17.751330 containerd[1734]: 2025-05-14 18:14:17.725 [INFO][5243] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.10.64/26 handle="k8s-pod-network.cb8162aac16f6172eadaca968230602ba90fd6347a2ef369762c235f582f5c91" host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:17.751330 containerd[1734]: 2025-05-14 18:14:17.726 [INFO][5243] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.cb8162aac16f6172eadaca968230602ba90fd6347a2ef369762c235f582f5c91 May 14 18:14:17.751330 containerd[1734]: 2025-05-14 18:14:17.729 [INFO][5243] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.10.64/26 handle="k8s-pod-network.cb8162aac16f6172eadaca968230602ba90fd6347a2ef369762c235f582f5c91" host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:17.751330 containerd[1734]: 2025-05-14 18:14:17.736 [INFO][5243] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.10.66/26] block=192.168.10.64/26 handle="k8s-pod-network.cb8162aac16f6172eadaca968230602ba90fd6347a2ef369762c235f582f5c91" host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:17.751330 containerd[1734]: 2025-05-14 18:14:17.736 [INFO][5243] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.10.66/26] handle="k8s-pod-network.cb8162aac16f6172eadaca968230602ba90fd6347a2ef369762c235f582f5c91" host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:17.751330 containerd[1734]: 2025-05-14 18:14:17.736 [INFO][5243] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 14 18:14:17.751330 containerd[1734]: 2025-05-14 18:14:17.736 [INFO][5243] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.10.66/26] IPv6=[] ContainerID="cb8162aac16f6172eadaca968230602ba90fd6347a2ef369762c235f582f5c91" HandleID="k8s-pod-network.cb8162aac16f6172eadaca968230602ba90fd6347a2ef369762c235f582f5c91" Workload="ci--4334.0.0--a--c37eb65ec1-k8s-calico--apiserver--6cb7cd99bf--z6fr6-eth0" May 14 18:14:17.751978 containerd[1734]: 2025-05-14 18:14:17.737 [INFO][5231] cni-plugin/k8s.go 386: Populated endpoint ContainerID="cb8162aac16f6172eadaca968230602ba90fd6347a2ef369762c235f582f5c91" Namespace="calico-apiserver" Pod="calico-apiserver-6cb7cd99bf-z6fr6" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-calico--apiserver--6cb7cd99bf--z6fr6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334.0.0--a--c37eb65ec1-k8s-calico--apiserver--6cb7cd99bf--z6fr6-eth0", GenerateName:"calico-apiserver-6cb7cd99bf-", Namespace:"calico-apiserver", SelfLink:"", UID:"0c3f601d-87ef-4fb7-a83d-21758b09a12d", ResourceVersion:"800", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 18, 12, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"6cb7cd99bf", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334.0.0-a-c37eb65ec1", ContainerID:"", Pod:"calico-apiserver-6cb7cd99bf-z6fr6", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.10.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali24b4a6f1036", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 18:14:17.751978 containerd[1734]: 2025-05-14 18:14:17.737 [INFO][5231] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.10.66/32] ContainerID="cb8162aac16f6172eadaca968230602ba90fd6347a2ef369762c235f582f5c91" Namespace="calico-apiserver" Pod="calico-apiserver-6cb7cd99bf-z6fr6" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-calico--apiserver--6cb7cd99bf--z6fr6-eth0" May 14 18:14:17.751978 containerd[1734]: 2025-05-14 18:14:17.737 [INFO][5231] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali24b4a6f1036 ContainerID="cb8162aac16f6172eadaca968230602ba90fd6347a2ef369762c235f582f5c91" Namespace="calico-apiserver" Pod="calico-apiserver-6cb7cd99bf-z6fr6" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-calico--apiserver--6cb7cd99bf--z6fr6-eth0" May 14 18:14:17.751978 containerd[1734]: 2025-05-14 18:14:17.739 [INFO][5231] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="cb8162aac16f6172eadaca968230602ba90fd6347a2ef369762c235f582f5c91" Namespace="calico-apiserver" Pod="calico-apiserver-6cb7cd99bf-z6fr6" 
WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-calico--apiserver--6cb7cd99bf--z6fr6-eth0" May 14 18:14:17.751978 containerd[1734]: 2025-05-14 18:14:17.740 [INFO][5231] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="cb8162aac16f6172eadaca968230602ba90fd6347a2ef369762c235f582f5c91" Namespace="calico-apiserver" Pod="calico-apiserver-6cb7cd99bf-z6fr6" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-calico--apiserver--6cb7cd99bf--z6fr6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334.0.0--a--c37eb65ec1-k8s-calico--apiserver--6cb7cd99bf--z6fr6-eth0", GenerateName:"calico-apiserver-6cb7cd99bf-", Namespace:"calico-apiserver", SelfLink:"", UID:"0c3f601d-87ef-4fb7-a83d-21758b09a12d", ResourceVersion:"800", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 18, 12, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6cb7cd99bf", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334.0.0-a-c37eb65ec1", ContainerID:"cb8162aac16f6172eadaca968230602ba90fd6347a2ef369762c235f582f5c91", Pod:"calico-apiserver-6cb7cd99bf-z6fr6", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.10.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali24b4a6f1036", MAC:"fa:9a:c6:fd:33:01", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 18:14:17.751978 containerd[1734]: 2025-05-14 18:14:17.750 [INFO][5231] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="cb8162aac16f6172eadaca968230602ba90fd6347a2ef369762c235f582f5c91" Namespace="calico-apiserver" Pod="calico-apiserver-6cb7cd99bf-z6fr6" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-calico--apiserver--6cb7cd99bf--z6fr6-eth0" May 14 18:14:17.852380 kubelet[3205]: I0514 18:14:17.852322 3205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-clvtl" podStartSLOduration=123.852304474 podStartE2EDuration="2m3.852304474s" podCreationTimestamp="2025-05-14 18:12:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-14 18:14:17.851417891 +0000 UTC m=+125.343941250" watchObservedRunningTime="2025-05-14 18:14:17.852304474 +0000 UTC m=+125.344827833" May 14 18:14:18.085193 containerd[1734]: time="2025-05-14T18:14:18.085100121Z" level=info msg="connecting to shim cb8162aac16f6172eadaca968230602ba90fd6347a2ef369762c235f582f5c91" address="unix:///run/containerd/s/b366b09fa0dfe21b4bb4e9a1670a731adbe2d78203ad1ba302bc2bac9909376c" namespace=k8s.io protocol=ttrpc version=3 May 14 18:14:18.102892 systemd[1]: Started cri-containerd-cb8162aac16f6172eadaca968230602ba90fd6347a2ef369762c235f582f5c91.scope - libcontainer container cb8162aac16f6172eadaca968230602ba90fd6347a2ef369762c235f582f5c91. 
May 14 18:14:18.134613 containerd[1734]: time="2025-05-14T18:14:18.134589176Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6cb7cd99bf-z6fr6,Uid:0c3f601d-87ef-4fb7-a83d-21758b09a12d,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"cb8162aac16f6172eadaca968230602ba90fd6347a2ef369762c235f582f5c91\"" May 14 18:14:18.135767 containerd[1734]: time="2025-05-14T18:14:18.135715424Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\"" May 14 18:14:18.366903 systemd-networkd[1358]: cali22829625ffa: Gained IPv6LL May 14 18:14:18.580433 containerd[1734]: time="2025-05-14T18:14:18.580377099Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cl9vj,Uid:f8c0ddea-8d34-4405-b1d7-cbac1335ae76,Namespace:calico-system,Attempt:0,}" May 14 18:14:18.690779 systemd-networkd[1358]: cali9329e3c0df6: Link UP May 14 18:14:18.691637 systemd-networkd[1358]: cali9329e3c0df6: Gained carrier May 14 18:14:18.703310 containerd[1734]: 2025-05-14 18:14:18.640 [INFO][5316] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4334.0.0--a--c37eb65ec1-k8s-csi--node--driver--cl9vj-eth0 csi-node-driver- calico-system f8c0ddea-8d34-4405-b1d7-cbac1335ae76 625 0 2025-05-14 18:12:31 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:5bcd8f69 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4334.0.0-a-c37eb65ec1 csi-node-driver-cl9vj eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali9329e3c0df6 [] []}} ContainerID="3f4ba02ca3c310df6a7b37ccb71401eb294171641c0ca3a0998019eae2eca82e" Namespace="calico-system" Pod="csi-node-driver-cl9vj" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-csi--node--driver--cl9vj-" May 14 18:14:18.703310 
containerd[1734]: 2025-05-14 18:14:18.640 [INFO][5316] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="3f4ba02ca3c310df6a7b37ccb71401eb294171641c0ca3a0998019eae2eca82e" Namespace="calico-system" Pod="csi-node-driver-cl9vj" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-csi--node--driver--cl9vj-eth0" May 14 18:14:18.703310 containerd[1734]: 2025-05-14 18:14:18.660 [INFO][5330] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3f4ba02ca3c310df6a7b37ccb71401eb294171641c0ca3a0998019eae2eca82e" HandleID="k8s-pod-network.3f4ba02ca3c310df6a7b37ccb71401eb294171641c0ca3a0998019eae2eca82e" Workload="ci--4334.0.0--a--c37eb65ec1-k8s-csi--node--driver--cl9vj-eth0" May 14 18:14:18.703310 containerd[1734]: 2025-05-14 18:14:18.666 [INFO][5330] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="3f4ba02ca3c310df6a7b37ccb71401eb294171641c0ca3a0998019eae2eca82e" HandleID="k8s-pod-network.3f4ba02ca3c310df6a7b37ccb71401eb294171641c0ca3a0998019eae2eca82e" Workload="ci--4334.0.0--a--c37eb65ec1-k8s-csi--node--driver--cl9vj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000291230), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4334.0.0-a-c37eb65ec1", "pod":"csi-node-driver-cl9vj", "timestamp":"2025-05-14 18:14:18.660717193 +0000 UTC"}, Hostname:"ci-4334.0.0-a-c37eb65ec1", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 14 18:14:18.703310 containerd[1734]: 2025-05-14 18:14:18.666 [INFO][5330] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 14 18:14:18.703310 containerd[1734]: 2025-05-14 18:14:18.666 [INFO][5330] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 14 18:14:18.703310 containerd[1734]: 2025-05-14 18:14:18.666 [INFO][5330] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4334.0.0-a-c37eb65ec1' May 14 18:14:18.703310 containerd[1734]: 2025-05-14 18:14:18.668 [INFO][5330] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.3f4ba02ca3c310df6a7b37ccb71401eb294171641c0ca3a0998019eae2eca82e" host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:18.703310 containerd[1734]: 2025-05-14 18:14:18.670 [INFO][5330] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:18.703310 containerd[1734]: 2025-05-14 18:14:18.673 [INFO][5330] ipam/ipam.go 489: Trying affinity for 192.168.10.64/26 host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:18.703310 containerd[1734]: 2025-05-14 18:14:18.674 [INFO][5330] ipam/ipam.go 155: Attempting to load block cidr=192.168.10.64/26 host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:18.703310 containerd[1734]: 2025-05-14 18:14:18.676 [INFO][5330] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.10.64/26 host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:18.703310 containerd[1734]: 2025-05-14 18:14:18.676 [INFO][5330] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.10.64/26 handle="k8s-pod-network.3f4ba02ca3c310df6a7b37ccb71401eb294171641c0ca3a0998019eae2eca82e" host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:18.703310 containerd[1734]: 2025-05-14 18:14:18.677 [INFO][5330] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.3f4ba02ca3c310df6a7b37ccb71401eb294171641c0ca3a0998019eae2eca82e May 14 18:14:18.703310 containerd[1734]: 2025-05-14 18:14:18.683 [INFO][5330] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.10.64/26 handle="k8s-pod-network.3f4ba02ca3c310df6a7b37ccb71401eb294171641c0ca3a0998019eae2eca82e" host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:18.703310 containerd[1734]: 2025-05-14 18:14:18.687 [INFO][5330] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.10.67/26] block=192.168.10.64/26 handle="k8s-pod-network.3f4ba02ca3c310df6a7b37ccb71401eb294171641c0ca3a0998019eae2eca82e" host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:18.703310 containerd[1734]: 2025-05-14 18:14:18.687 [INFO][5330] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.10.67/26] handle="k8s-pod-network.3f4ba02ca3c310df6a7b37ccb71401eb294171641c0ca3a0998019eae2eca82e" host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:18.703310 containerd[1734]: 2025-05-14 18:14:18.687 [INFO][5330] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 14 18:14:18.703310 containerd[1734]: 2025-05-14 18:14:18.687 [INFO][5330] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.10.67/26] IPv6=[] ContainerID="3f4ba02ca3c310df6a7b37ccb71401eb294171641c0ca3a0998019eae2eca82e" HandleID="k8s-pod-network.3f4ba02ca3c310df6a7b37ccb71401eb294171641c0ca3a0998019eae2eca82e" Workload="ci--4334.0.0--a--c37eb65ec1-k8s-csi--node--driver--cl9vj-eth0" May 14 18:14:18.704599 containerd[1734]: 2025-05-14 18:14:18.689 [INFO][5316] cni-plugin/k8s.go 386: Populated endpoint ContainerID="3f4ba02ca3c310df6a7b37ccb71401eb294171641c0ca3a0998019eae2eca82e" Namespace="calico-system" Pod="csi-node-driver-cl9vj" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-csi--node--driver--cl9vj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334.0.0--a--c37eb65ec1-k8s-csi--node--driver--cl9vj-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"f8c0ddea-8d34-4405-b1d7-cbac1335ae76", ResourceVersion:"625", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 18, 12, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"5bcd8f69", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334.0.0-a-c37eb65ec1", ContainerID:"", Pod:"csi-node-driver-cl9vj", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.10.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali9329e3c0df6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 18:14:18.704599 containerd[1734]: 2025-05-14 18:14:18.689 [INFO][5316] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.10.67/32] ContainerID="3f4ba02ca3c310df6a7b37ccb71401eb294171641c0ca3a0998019eae2eca82e" Namespace="calico-system" Pod="csi-node-driver-cl9vj" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-csi--node--driver--cl9vj-eth0" May 14 18:14:18.704599 containerd[1734]: 2025-05-14 18:14:18.689 [INFO][5316] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9329e3c0df6 ContainerID="3f4ba02ca3c310df6a7b37ccb71401eb294171641c0ca3a0998019eae2eca82e" Namespace="calico-system" Pod="csi-node-driver-cl9vj" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-csi--node--driver--cl9vj-eth0" May 14 18:14:18.704599 containerd[1734]: 2025-05-14 18:14:18.691 [INFO][5316] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3f4ba02ca3c310df6a7b37ccb71401eb294171641c0ca3a0998019eae2eca82e" Namespace="calico-system" Pod="csi-node-driver-cl9vj" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-csi--node--driver--cl9vj-eth0" May 14 18:14:18.704599 containerd[1734]: 2025-05-14 18:14:18.691 
[INFO][5316] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="3f4ba02ca3c310df6a7b37ccb71401eb294171641c0ca3a0998019eae2eca82e" Namespace="calico-system" Pod="csi-node-driver-cl9vj" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-csi--node--driver--cl9vj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334.0.0--a--c37eb65ec1-k8s-csi--node--driver--cl9vj-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"f8c0ddea-8d34-4405-b1d7-cbac1335ae76", ResourceVersion:"625", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 18, 12, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"5bcd8f69", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334.0.0-a-c37eb65ec1", ContainerID:"3f4ba02ca3c310df6a7b37ccb71401eb294171641c0ca3a0998019eae2eca82e", Pod:"csi-node-driver-cl9vj", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.10.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali9329e3c0df6", MAC:"26:78:aa:cf:09:48", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 18:14:18.704599 containerd[1734]: 2025-05-14 18:14:18.700 [INFO][5316] cni-plugin/k8s.go 500: Wrote updated endpoint to 
datastore ContainerID="3f4ba02ca3c310df6a7b37ccb71401eb294171641c0ca3a0998019eae2eca82e" Namespace="calico-system" Pod="csi-node-driver-cl9vj" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-csi--node--driver--cl9vj-eth0" May 14 18:14:18.891250 systemd[1]: Started sshd@9-10.200.8.38:22-10.200.16.10:55344.service - OpenSSH per-connection server daemon (10.200.16.10:55344). May 14 18:14:19.129212 containerd[1734]: time="2025-05-14T18:14:19.129150788Z" level=info msg="connecting to shim 3f4ba02ca3c310df6a7b37ccb71401eb294171641c0ca3a0998019eae2eca82e" address="unix:///run/containerd/s/fa0f5024c5cfaf968a7887e4813dc6b36c9867a611971280240ecec9c2d91466" namespace=k8s.io protocol=ttrpc version=3 May 14 18:14:19.152863 systemd[1]: Started cri-containerd-3f4ba02ca3c310df6a7b37ccb71401eb294171641c0ca3a0998019eae2eca82e.scope - libcontainer container 3f4ba02ca3c310df6a7b37ccb71401eb294171641c0ca3a0998019eae2eca82e. May 14 18:14:19.327996 containerd[1734]: time="2025-05-14T18:14:19.327954259Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cl9vj,Uid:f8c0ddea-8d34-4405-b1d7-cbac1335ae76,Namespace:calico-system,Attempt:0,} returns sandbox id \"3f4ba02ca3c310df6a7b37ccb71401eb294171641c0ca3a0998019eae2eca82e\"" May 14 18:14:19.390918 systemd-networkd[1358]: cali24b4a6f1036: Gained IPv6LL May 14 18:14:19.523703 sshd[5350]: Accepted publickey for core from 10.200.16.10 port 55344 ssh2: RSA SHA256:Hd5PQzbog4WemQvfyzJrpsOqYyUj2ZNV4jY0cJYqkm4 May 14 18:14:19.524800 sshd-session[5350]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 18:14:19.528319 systemd-logind[1706]: New session 12 of user core. May 14 18:14:19.532889 systemd[1]: Started session-12.scope - Session 12 of User core. 
May 14 18:14:19.580751 containerd[1734]: time="2025-05-14T18:14:19.580705715Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-v4tq6,Uid:02387ae7-7df0-4320-8070-8518fe0bd4d8,Namespace:kube-system,Attempt:0,}" May 14 18:14:19.846094 systemd-networkd[1358]: cali29b9bb522f3: Link UP May 14 18:14:19.847151 systemd-networkd[1358]: cali29b9bb522f3: Gained carrier May 14 18:14:19.859963 containerd[1734]: 2025-05-14 18:14:19.794 [INFO][5399] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4334.0.0--a--c37eb65ec1-k8s-coredns--6f6b679f8f--v4tq6-eth0 coredns-6f6b679f8f- kube-system 02387ae7-7df0-4320-8070-8518fe0bd4d8 797 0 2025-05-14 18:12:14 +0000 UTC map[k8s-app:kube-dns pod-template-hash:6f6b679f8f projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4334.0.0-a-c37eb65ec1 coredns-6f6b679f8f-v4tq6 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali29b9bb522f3 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="6268536b6ec8fa754292b2fef715bacbb53d4ecd7a4ba813fc15bfea04ac1686" Namespace="kube-system" Pod="coredns-6f6b679f8f-v4tq6" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-coredns--6f6b679f8f--v4tq6-" May 14 18:14:19.859963 containerd[1734]: 2025-05-14 18:14:19.794 [INFO][5399] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="6268536b6ec8fa754292b2fef715bacbb53d4ecd7a4ba813fc15bfea04ac1686" Namespace="kube-system" Pod="coredns-6f6b679f8f-v4tq6" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-coredns--6f6b679f8f--v4tq6-eth0" May 14 18:14:19.859963 containerd[1734]: 2025-05-14 18:14:19.813 [INFO][5413] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6268536b6ec8fa754292b2fef715bacbb53d4ecd7a4ba813fc15bfea04ac1686" 
HandleID="k8s-pod-network.6268536b6ec8fa754292b2fef715bacbb53d4ecd7a4ba813fc15bfea04ac1686" Workload="ci--4334.0.0--a--c37eb65ec1-k8s-coredns--6f6b679f8f--v4tq6-eth0" May 14 18:14:19.859963 containerd[1734]: 2025-05-14 18:14:19.820 [INFO][5413] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="6268536b6ec8fa754292b2fef715bacbb53d4ecd7a4ba813fc15bfea04ac1686" HandleID="k8s-pod-network.6268536b6ec8fa754292b2fef715bacbb53d4ecd7a4ba813fc15bfea04ac1686" Workload="ci--4334.0.0--a--c37eb65ec1-k8s-coredns--6f6b679f8f--v4tq6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000382aa0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4334.0.0-a-c37eb65ec1", "pod":"coredns-6f6b679f8f-v4tq6", "timestamp":"2025-05-14 18:14:19.813633617 +0000 UTC"}, Hostname:"ci-4334.0.0-a-c37eb65ec1", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 14 18:14:19.859963 containerd[1734]: 2025-05-14 18:14:19.820 [INFO][5413] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 14 18:14:19.859963 containerd[1734]: 2025-05-14 18:14:19.820 [INFO][5413] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 14 18:14:19.859963 containerd[1734]: 2025-05-14 18:14:19.820 [INFO][5413] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4334.0.0-a-c37eb65ec1' May 14 18:14:19.859963 containerd[1734]: 2025-05-14 18:14:19.822 [INFO][5413] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.6268536b6ec8fa754292b2fef715bacbb53d4ecd7a4ba813fc15bfea04ac1686" host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:19.859963 containerd[1734]: 2025-05-14 18:14:19.825 [INFO][5413] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:19.859963 containerd[1734]: 2025-05-14 18:14:19.828 [INFO][5413] ipam/ipam.go 489: Trying affinity for 192.168.10.64/26 host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:19.859963 containerd[1734]: 2025-05-14 18:14:19.829 [INFO][5413] ipam/ipam.go 155: Attempting to load block cidr=192.168.10.64/26 host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:19.859963 containerd[1734]: 2025-05-14 18:14:19.831 [INFO][5413] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.10.64/26 host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:19.859963 containerd[1734]: 2025-05-14 18:14:19.831 [INFO][5413] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.10.64/26 handle="k8s-pod-network.6268536b6ec8fa754292b2fef715bacbb53d4ecd7a4ba813fc15bfea04ac1686" host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:19.859963 containerd[1734]: 2025-05-14 18:14:19.832 [INFO][5413] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.6268536b6ec8fa754292b2fef715bacbb53d4ecd7a4ba813fc15bfea04ac1686 May 14 18:14:19.859963 containerd[1734]: 2025-05-14 18:14:19.835 [INFO][5413] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.10.64/26 handle="k8s-pod-network.6268536b6ec8fa754292b2fef715bacbb53d4ecd7a4ba813fc15bfea04ac1686" host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:19.859963 containerd[1734]: 2025-05-14 18:14:19.842 [INFO][5413] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.10.68/26] block=192.168.10.64/26 handle="k8s-pod-network.6268536b6ec8fa754292b2fef715bacbb53d4ecd7a4ba813fc15bfea04ac1686" host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:19.859963 containerd[1734]: 2025-05-14 18:14:19.842 [INFO][5413] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.10.68/26] handle="k8s-pod-network.6268536b6ec8fa754292b2fef715bacbb53d4ecd7a4ba813fc15bfea04ac1686" host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:19.859963 containerd[1734]: 2025-05-14 18:14:19.842 [INFO][5413] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 14 18:14:19.859963 containerd[1734]: 2025-05-14 18:14:19.842 [INFO][5413] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.10.68/26] IPv6=[] ContainerID="6268536b6ec8fa754292b2fef715bacbb53d4ecd7a4ba813fc15bfea04ac1686" HandleID="k8s-pod-network.6268536b6ec8fa754292b2fef715bacbb53d4ecd7a4ba813fc15bfea04ac1686" Workload="ci--4334.0.0--a--c37eb65ec1-k8s-coredns--6f6b679f8f--v4tq6-eth0" May 14 18:14:19.860531 containerd[1734]: 2025-05-14 18:14:19.844 [INFO][5399] cni-plugin/k8s.go 386: Populated endpoint ContainerID="6268536b6ec8fa754292b2fef715bacbb53d4ecd7a4ba813fc15bfea04ac1686" Namespace="kube-system" Pod="coredns-6f6b679f8f-v4tq6" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-coredns--6f6b679f8f--v4tq6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334.0.0--a--c37eb65ec1-k8s-coredns--6f6b679f8f--v4tq6-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"02387ae7-7df0-4320-8070-8518fe0bd4d8", ResourceVersion:"797", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 18, 12, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334.0.0-a-c37eb65ec1", ContainerID:"", Pod:"coredns-6f6b679f8f-v4tq6", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.10.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali29b9bb522f3", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 18:14:19.860531 containerd[1734]: 2025-05-14 18:14:19.844 [INFO][5399] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.10.68/32] ContainerID="6268536b6ec8fa754292b2fef715bacbb53d4ecd7a4ba813fc15bfea04ac1686" Namespace="kube-system" Pod="coredns-6f6b679f8f-v4tq6" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-coredns--6f6b679f8f--v4tq6-eth0" May 14 18:14:19.860531 containerd[1734]: 2025-05-14 18:14:19.844 [INFO][5399] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali29b9bb522f3 ContainerID="6268536b6ec8fa754292b2fef715bacbb53d4ecd7a4ba813fc15bfea04ac1686" Namespace="kube-system" Pod="coredns-6f6b679f8f-v4tq6" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-coredns--6f6b679f8f--v4tq6-eth0" May 14 18:14:19.860531 containerd[1734]: 2025-05-14 18:14:19.848 [INFO][5399] cni-plugin/dataplane_linux.go 508: Disabling IPv4 
forwarding ContainerID="6268536b6ec8fa754292b2fef715bacbb53d4ecd7a4ba813fc15bfea04ac1686" Namespace="kube-system" Pod="coredns-6f6b679f8f-v4tq6" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-coredns--6f6b679f8f--v4tq6-eth0" May 14 18:14:19.860531 containerd[1734]: 2025-05-14 18:14:19.848 [INFO][5399] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="6268536b6ec8fa754292b2fef715bacbb53d4ecd7a4ba813fc15bfea04ac1686" Namespace="kube-system" Pod="coredns-6f6b679f8f-v4tq6" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-coredns--6f6b679f8f--v4tq6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334.0.0--a--c37eb65ec1-k8s-coredns--6f6b679f8f--v4tq6-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"02387ae7-7df0-4320-8070-8518fe0bd4d8", ResourceVersion:"797", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 18, 12, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334.0.0-a-c37eb65ec1", ContainerID:"6268536b6ec8fa754292b2fef715bacbb53d4ecd7a4ba813fc15bfea04ac1686", Pod:"coredns-6f6b679f8f-v4tq6", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.10.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali29b9bb522f3", MAC:"b6:f5:81:29:1f:79", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 18:14:19.860531 containerd[1734]: 2025-05-14 18:14:19.858 [INFO][5399] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="6268536b6ec8fa754292b2fef715bacbb53d4ecd7a4ba813fc15bfea04ac1686" Namespace="kube-system" Pod="coredns-6f6b679f8f-v4tq6" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-coredns--6f6b679f8f--v4tq6-eth0" May 14 18:14:20.028236 sshd[5398]: Connection closed by 10.200.16.10 port 55344 May 14 18:14:20.028585 sshd-session[5350]: pam_unix(sshd:session): session closed for user core May 14 18:14:20.030774 systemd[1]: sshd@9-10.200.8.38:22-10.200.16.10:55344.service: Deactivated successfully. May 14 18:14:20.032420 systemd[1]: session-12.scope: Deactivated successfully. May 14 18:14:20.033938 systemd-logind[1706]: Session 12 logged out. Waiting for processes to exit. May 14 18:14:20.034582 systemd-logind[1706]: Removed session 12. May 14 18:14:20.287649 containerd[1734]: time="2025-05-14T18:14:20.287615402Z" level=info msg="connecting to shim 6268536b6ec8fa754292b2fef715bacbb53d4ecd7a4ba813fc15bfea04ac1686" address="unix:///run/containerd/s/5e64867d219b2923d6b09b5ed9f4770bdcebc8ac98c6f6cdf26238f1c7e419ab" namespace=k8s.io protocol=ttrpc version=3 May 14 18:14:20.314876 systemd[1]: Started cri-containerd-6268536b6ec8fa754292b2fef715bacbb53d4ecd7a4ba813fc15bfea04ac1686.scope - libcontainer container 6268536b6ec8fa754292b2fef715bacbb53d4ecd7a4ba813fc15bfea04ac1686. 
May 14 18:14:20.425651 containerd[1734]: time="2025-05-14T18:14:20.425622657Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-v4tq6,Uid:02387ae7-7df0-4320-8070-8518fe0bd4d8,Namespace:kube-system,Attempt:0,} returns sandbox id \"6268536b6ec8fa754292b2fef715bacbb53d4ecd7a4ba813fc15bfea04ac1686\"" May 14 18:14:20.427585 containerd[1734]: time="2025-05-14T18:14:20.427500343Z" level=info msg="CreateContainer within sandbox \"6268536b6ec8fa754292b2fef715bacbb53d4ecd7a4ba813fc15bfea04ac1686\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 14 18:14:20.478940 systemd-networkd[1358]: cali9329e3c0df6: Gained IPv6LL May 14 18:14:20.580206 containerd[1734]: time="2025-05-14T18:14:20.580141984Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-58d5b579d7-9wc6k,Uid:b7e2c3b5-0b2e-4624-9ffb-7b15307add96,Namespace:calico-system,Attempt:0,}" May 14 18:14:20.580762 containerd[1734]: time="2025-05-14T18:14:20.580645587Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6cb7cd99bf-2dhkq,Uid:0272f273-8188-48f8-bfc1-35ee033246c4,Namespace:calico-apiserver,Attempt:0,}" May 14 18:14:20.839400 systemd-networkd[1358]: cali4872644e5a8: Link UP May 14 18:14:20.840557 systemd-networkd[1358]: cali4872644e5a8: Gained carrier May 14 18:14:20.853111 containerd[1734]: 2025-05-14 18:14:20.791 [INFO][5493] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4334.0.0--a--c37eb65ec1-k8s-calico--kube--controllers--58d5b579d7--9wc6k-eth0 calico-kube-controllers-58d5b579d7- calico-system b7e2c3b5-0b2e-4624-9ffb-7b15307add96 801 0 2025-05-14 18:12:32 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:58d5b579d7 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s 
ci-4334.0.0-a-c37eb65ec1 calico-kube-controllers-58d5b579d7-9wc6k eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali4872644e5a8 [] []}} ContainerID="c7b1b6539cc481f188191335ce37cf0c1fe4bf83680aedc99229dcdc8a961b96" Namespace="calico-system" Pod="calico-kube-controllers-58d5b579d7-9wc6k" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-calico--kube--controllers--58d5b579d7--9wc6k-" May 14 18:14:20.853111 containerd[1734]: 2025-05-14 18:14:20.791 [INFO][5493] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="c7b1b6539cc481f188191335ce37cf0c1fe4bf83680aedc99229dcdc8a961b96" Namespace="calico-system" Pod="calico-kube-controllers-58d5b579d7-9wc6k" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-calico--kube--controllers--58d5b579d7--9wc6k-eth0" May 14 18:14:20.853111 containerd[1734]: 2025-05-14 18:14:20.809 [INFO][5505] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c7b1b6539cc481f188191335ce37cf0c1fe4bf83680aedc99229dcdc8a961b96" HandleID="k8s-pod-network.c7b1b6539cc481f188191335ce37cf0c1fe4bf83680aedc99229dcdc8a961b96" Workload="ci--4334.0.0--a--c37eb65ec1-k8s-calico--kube--controllers--58d5b579d7--9wc6k-eth0" May 14 18:14:20.853111 containerd[1734]: 2025-05-14 18:14:20.815 [INFO][5505] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c7b1b6539cc481f188191335ce37cf0c1fe4bf83680aedc99229dcdc8a961b96" HandleID="k8s-pod-network.c7b1b6539cc481f188191335ce37cf0c1fe4bf83680aedc99229dcdc8a961b96" Workload="ci--4334.0.0--a--c37eb65ec1-k8s-calico--kube--controllers--58d5b579d7--9wc6k-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000313ab0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4334.0.0-a-c37eb65ec1", "pod":"calico-kube-controllers-58d5b579d7-9wc6k", "timestamp":"2025-05-14 18:14:20.809631804 +0000 UTC"}, Hostname:"ci-4334.0.0-a-c37eb65ec1", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, 
MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 14 18:14:20.853111 containerd[1734]: 2025-05-14 18:14:20.816 [INFO][5505] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 14 18:14:20.853111 containerd[1734]: 2025-05-14 18:14:20.816 [INFO][5505] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 14 18:14:20.853111 containerd[1734]: 2025-05-14 18:14:20.816 [INFO][5505] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4334.0.0-a-c37eb65ec1' May 14 18:14:20.853111 containerd[1734]: 2025-05-14 18:14:20.817 [INFO][5505] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.c7b1b6539cc481f188191335ce37cf0c1fe4bf83680aedc99229dcdc8a961b96" host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:20.853111 containerd[1734]: 2025-05-14 18:14:20.819 [INFO][5505] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:20.853111 containerd[1734]: 2025-05-14 18:14:20.822 [INFO][5505] ipam/ipam.go 489: Trying affinity for 192.168.10.64/26 host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:20.853111 containerd[1734]: 2025-05-14 18:14:20.823 [INFO][5505] ipam/ipam.go 155: Attempting to load block cidr=192.168.10.64/26 host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:20.853111 containerd[1734]: 2025-05-14 18:14:20.825 [INFO][5505] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.10.64/26 host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:20.853111 containerd[1734]: 2025-05-14 18:14:20.825 [INFO][5505] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.10.64/26 handle="k8s-pod-network.c7b1b6539cc481f188191335ce37cf0c1fe4bf83680aedc99229dcdc8a961b96" host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:20.853111 containerd[1734]: 2025-05-14 18:14:20.826 [INFO][5505] ipam/ipam.go 1685: Creating new handle: 
k8s-pod-network.c7b1b6539cc481f188191335ce37cf0c1fe4bf83680aedc99229dcdc8a961b96 May 14 18:14:20.853111 containerd[1734]: 2025-05-14 18:14:20.829 [INFO][5505] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.10.64/26 handle="k8s-pod-network.c7b1b6539cc481f188191335ce37cf0c1fe4bf83680aedc99229dcdc8a961b96" host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:20.853111 containerd[1734]: 2025-05-14 18:14:20.836 [INFO][5505] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.10.69/26] block=192.168.10.64/26 handle="k8s-pod-network.c7b1b6539cc481f188191335ce37cf0c1fe4bf83680aedc99229dcdc8a961b96" host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:20.853111 containerd[1734]: 2025-05-14 18:14:20.836 [INFO][5505] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.10.69/26] handle="k8s-pod-network.c7b1b6539cc481f188191335ce37cf0c1fe4bf83680aedc99229dcdc8a961b96" host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:20.853111 containerd[1734]: 2025-05-14 18:14:20.836 [INFO][5505] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 14 18:14:20.853111 containerd[1734]: 2025-05-14 18:14:20.836 [INFO][5505] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.10.69/26] IPv6=[] ContainerID="c7b1b6539cc481f188191335ce37cf0c1fe4bf83680aedc99229dcdc8a961b96" HandleID="k8s-pod-network.c7b1b6539cc481f188191335ce37cf0c1fe4bf83680aedc99229dcdc8a961b96" Workload="ci--4334.0.0--a--c37eb65ec1-k8s-calico--kube--controllers--58d5b579d7--9wc6k-eth0" May 14 18:14:20.853881 containerd[1734]: 2025-05-14 18:14:20.837 [INFO][5493] cni-plugin/k8s.go 386: Populated endpoint ContainerID="c7b1b6539cc481f188191335ce37cf0c1fe4bf83680aedc99229dcdc8a961b96" Namespace="calico-system" Pod="calico-kube-controllers-58d5b579d7-9wc6k" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-calico--kube--controllers--58d5b579d7--9wc6k-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334.0.0--a--c37eb65ec1-k8s-calico--kube--controllers--58d5b579d7--9wc6k-eth0", GenerateName:"calico-kube-controllers-58d5b579d7-", Namespace:"calico-system", SelfLink:"", UID:"b7e2c3b5-0b2e-4624-9ffb-7b15307add96", ResourceVersion:"801", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 18, 12, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"58d5b579d7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334.0.0-a-c37eb65ec1", ContainerID:"", Pod:"calico-kube-controllers-58d5b579d7-9wc6k", Endpoint:"eth0", 
ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.10.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali4872644e5a8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 18:14:20.853881 containerd[1734]: 2025-05-14 18:14:20.837 [INFO][5493] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.10.69/32] ContainerID="c7b1b6539cc481f188191335ce37cf0c1fe4bf83680aedc99229dcdc8a961b96" Namespace="calico-system" Pod="calico-kube-controllers-58d5b579d7-9wc6k" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-calico--kube--controllers--58d5b579d7--9wc6k-eth0" May 14 18:14:20.853881 containerd[1734]: 2025-05-14 18:14:20.837 [INFO][5493] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4872644e5a8 ContainerID="c7b1b6539cc481f188191335ce37cf0c1fe4bf83680aedc99229dcdc8a961b96" Namespace="calico-system" Pod="calico-kube-controllers-58d5b579d7-9wc6k" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-calico--kube--controllers--58d5b579d7--9wc6k-eth0" May 14 18:14:20.853881 containerd[1734]: 2025-05-14 18:14:20.840 [INFO][5493] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c7b1b6539cc481f188191335ce37cf0c1fe4bf83680aedc99229dcdc8a961b96" Namespace="calico-system" Pod="calico-kube-controllers-58d5b579d7-9wc6k" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-calico--kube--controllers--58d5b579d7--9wc6k-eth0" May 14 18:14:20.853881 containerd[1734]: 2025-05-14 18:14:20.841 [INFO][5493] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="c7b1b6539cc481f188191335ce37cf0c1fe4bf83680aedc99229dcdc8a961b96" Namespace="calico-system" Pod="calico-kube-controllers-58d5b579d7-9wc6k" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-calico--kube--controllers--58d5b579d7--9wc6k-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334.0.0--a--c37eb65ec1-k8s-calico--kube--controllers--58d5b579d7--9wc6k-eth0", GenerateName:"calico-kube-controllers-58d5b579d7-", Namespace:"calico-system", SelfLink:"", UID:"b7e2c3b5-0b2e-4624-9ffb-7b15307add96", ResourceVersion:"801", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 18, 12, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"58d5b579d7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334.0.0-a-c37eb65ec1", ContainerID:"c7b1b6539cc481f188191335ce37cf0c1fe4bf83680aedc99229dcdc8a961b96", Pod:"calico-kube-controllers-58d5b579d7-9wc6k", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.10.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali4872644e5a8", MAC:"e6:2b:6c:cb:05:02", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 18:14:20.853881 containerd[1734]: 2025-05-14 18:14:20.851 [INFO][5493] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="c7b1b6539cc481f188191335ce37cf0c1fe4bf83680aedc99229dcdc8a961b96" Namespace="calico-system" Pod="calico-kube-controllers-58d5b579d7-9wc6k" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-calico--kube--controllers--58d5b579d7--9wc6k-eth0" May 14 
18:14:20.927881 systemd-networkd[1358]: cali29b9bb522f3: Gained IPv6LL May 14 18:14:20.983563 containerd[1734]: time="2025-05-14T18:14:20.983029842Z" level=info msg="Container c792b174956bc6153131f290896a905444cad74329e4e44df79ef453ab273ad0: CDI devices from CRI Config.CDIDevices: []" May 14 18:14:21.026568 systemd-networkd[1358]: cali709ad593fca: Link UP May 14 18:14:21.027112 systemd-networkd[1358]: cali709ad593fca: Gained carrier May 14 18:14:21.038237 containerd[1734]: 2025-05-14 18:14:20.950 [INFO][5525] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4334.0.0--a--c37eb65ec1-k8s-calico--apiserver--6cb7cd99bf--2dhkq-eth0 calico-apiserver-6cb7cd99bf- calico-apiserver 0272f273-8188-48f8-bfc1-35ee033246c4 802 0 2025-05-14 18:12:30 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6cb7cd99bf projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4334.0.0-a-c37eb65ec1 calico-apiserver-6cb7cd99bf-2dhkq eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali709ad593fca [] []}} ContainerID="5ece2aadf44e46524bc8131cf8b8aee6c842c844de4b75a776e6cea4ef7eacbc" Namespace="calico-apiserver" Pod="calico-apiserver-6cb7cd99bf-2dhkq" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-calico--apiserver--6cb7cd99bf--2dhkq-" May 14 18:14:21.038237 containerd[1734]: 2025-05-14 18:14:20.971 [INFO][5525] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="5ece2aadf44e46524bc8131cf8b8aee6c842c844de4b75a776e6cea4ef7eacbc" Namespace="calico-apiserver" Pod="calico-apiserver-6cb7cd99bf-2dhkq" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-calico--apiserver--6cb7cd99bf--2dhkq-eth0" May 14 18:14:21.038237 containerd[1734]: 2025-05-14 18:14:20.994 [INFO][5539] ipam/ipam_plugin.go 225: Calico CNI IPAM request count 
IPv4=1 IPv6=0 ContainerID="5ece2aadf44e46524bc8131cf8b8aee6c842c844de4b75a776e6cea4ef7eacbc" HandleID="k8s-pod-network.5ece2aadf44e46524bc8131cf8b8aee6c842c844de4b75a776e6cea4ef7eacbc" Workload="ci--4334.0.0--a--c37eb65ec1-k8s-calico--apiserver--6cb7cd99bf--2dhkq-eth0" May 14 18:14:21.038237 containerd[1734]: 2025-05-14 18:14:20.999 [INFO][5539] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5ece2aadf44e46524bc8131cf8b8aee6c842c844de4b75a776e6cea4ef7eacbc" HandleID="k8s-pod-network.5ece2aadf44e46524bc8131cf8b8aee6c842c844de4b75a776e6cea4ef7eacbc" Workload="ci--4334.0.0--a--c37eb65ec1-k8s-calico--apiserver--6cb7cd99bf--2dhkq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003bc5b0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4334.0.0-a-c37eb65ec1", "pod":"calico-apiserver-6cb7cd99bf-2dhkq", "timestamp":"2025-05-14 18:14:20.994077694 +0000 UTC"}, Hostname:"ci-4334.0.0-a-c37eb65ec1", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 14 18:14:21.038237 containerd[1734]: 2025-05-14 18:14:20.999 [INFO][5539] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 14 18:14:21.038237 containerd[1734]: 2025-05-14 18:14:21.000 [INFO][5539] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 14 18:14:21.038237 containerd[1734]: 2025-05-14 18:14:21.000 [INFO][5539] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4334.0.0-a-c37eb65ec1' May 14 18:14:21.038237 containerd[1734]: 2025-05-14 18:14:21.001 [INFO][5539] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.5ece2aadf44e46524bc8131cf8b8aee6c842c844de4b75a776e6cea4ef7eacbc" host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:21.038237 containerd[1734]: 2025-05-14 18:14:21.004 [INFO][5539] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:21.038237 containerd[1734]: 2025-05-14 18:14:21.007 [INFO][5539] ipam/ipam.go 489: Trying affinity for 192.168.10.64/26 host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:21.038237 containerd[1734]: 2025-05-14 18:14:21.008 [INFO][5539] ipam/ipam.go 155: Attempting to load block cidr=192.168.10.64/26 host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:21.038237 containerd[1734]: 2025-05-14 18:14:21.009 [INFO][5539] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.10.64/26 host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:21.038237 containerd[1734]: 2025-05-14 18:14:21.009 [INFO][5539] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.10.64/26 handle="k8s-pod-network.5ece2aadf44e46524bc8131cf8b8aee6c842c844de4b75a776e6cea4ef7eacbc" host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:21.038237 containerd[1734]: 2025-05-14 18:14:21.011 [INFO][5539] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.5ece2aadf44e46524bc8131cf8b8aee6c842c844de4b75a776e6cea4ef7eacbc May 14 18:14:21.038237 containerd[1734]: 2025-05-14 18:14:21.016 [INFO][5539] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.10.64/26 handle="k8s-pod-network.5ece2aadf44e46524bc8131cf8b8aee6c842c844de4b75a776e6cea4ef7eacbc" host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:21.038237 containerd[1734]: 2025-05-14 18:14:21.023 [INFO][5539] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.10.70/26] block=192.168.10.64/26 handle="k8s-pod-network.5ece2aadf44e46524bc8131cf8b8aee6c842c844de4b75a776e6cea4ef7eacbc" host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:21.038237 containerd[1734]: 2025-05-14 18:14:21.023 [INFO][5539] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.10.70/26] handle="k8s-pod-network.5ece2aadf44e46524bc8131cf8b8aee6c842c844de4b75a776e6cea4ef7eacbc" host="ci-4334.0.0-a-c37eb65ec1" May 14 18:14:21.038237 containerd[1734]: 2025-05-14 18:14:21.023 [INFO][5539] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 14 18:14:21.038237 containerd[1734]: 2025-05-14 18:14:21.023 [INFO][5539] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.10.70/26] IPv6=[] ContainerID="5ece2aadf44e46524bc8131cf8b8aee6c842c844de4b75a776e6cea4ef7eacbc" HandleID="k8s-pod-network.5ece2aadf44e46524bc8131cf8b8aee6c842c844de4b75a776e6cea4ef7eacbc" Workload="ci--4334.0.0--a--c37eb65ec1-k8s-calico--apiserver--6cb7cd99bf--2dhkq-eth0" May 14 18:14:21.038700 containerd[1734]: 2025-05-14 18:14:21.025 [INFO][5525] cni-plugin/k8s.go 386: Populated endpoint ContainerID="5ece2aadf44e46524bc8131cf8b8aee6c842c844de4b75a776e6cea4ef7eacbc" Namespace="calico-apiserver" Pod="calico-apiserver-6cb7cd99bf-2dhkq" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-calico--apiserver--6cb7cd99bf--2dhkq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334.0.0--a--c37eb65ec1-k8s-calico--apiserver--6cb7cd99bf--2dhkq-eth0", GenerateName:"calico-apiserver-6cb7cd99bf-", Namespace:"calico-apiserver", SelfLink:"", UID:"0272f273-8188-48f8-bfc1-35ee033246c4", ResourceVersion:"802", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 18, 12, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"6cb7cd99bf", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334.0.0-a-c37eb65ec1", ContainerID:"", Pod:"calico-apiserver-6cb7cd99bf-2dhkq", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.10.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali709ad593fca", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 18:14:21.038700 containerd[1734]: 2025-05-14 18:14:21.025 [INFO][5525] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.10.70/32] ContainerID="5ece2aadf44e46524bc8131cf8b8aee6c842c844de4b75a776e6cea4ef7eacbc" Namespace="calico-apiserver" Pod="calico-apiserver-6cb7cd99bf-2dhkq" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-calico--apiserver--6cb7cd99bf--2dhkq-eth0" May 14 18:14:21.038700 containerd[1734]: 2025-05-14 18:14:21.025 [INFO][5525] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali709ad593fca ContainerID="5ece2aadf44e46524bc8131cf8b8aee6c842c844de4b75a776e6cea4ef7eacbc" Namespace="calico-apiserver" Pod="calico-apiserver-6cb7cd99bf-2dhkq" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-calico--apiserver--6cb7cd99bf--2dhkq-eth0" May 14 18:14:21.038700 containerd[1734]: 2025-05-14 18:14:21.027 [INFO][5525] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5ece2aadf44e46524bc8131cf8b8aee6c842c844de4b75a776e6cea4ef7eacbc" Namespace="calico-apiserver" Pod="calico-apiserver-6cb7cd99bf-2dhkq" 
WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-calico--apiserver--6cb7cd99bf--2dhkq-eth0" May 14 18:14:21.038700 containerd[1734]: 2025-05-14 18:14:21.027 [INFO][5525] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="5ece2aadf44e46524bc8131cf8b8aee6c842c844de4b75a776e6cea4ef7eacbc" Namespace="calico-apiserver" Pod="calico-apiserver-6cb7cd99bf-2dhkq" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-calico--apiserver--6cb7cd99bf--2dhkq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334.0.0--a--c37eb65ec1-k8s-calico--apiserver--6cb7cd99bf--2dhkq-eth0", GenerateName:"calico-apiserver-6cb7cd99bf-", Namespace:"calico-apiserver", SelfLink:"", UID:"0272f273-8188-48f8-bfc1-35ee033246c4", ResourceVersion:"802", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 18, 12, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6cb7cd99bf", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334.0.0-a-c37eb65ec1", ContainerID:"5ece2aadf44e46524bc8131cf8b8aee6c842c844de4b75a776e6cea4ef7eacbc", Pod:"calico-apiserver-6cb7cd99bf-2dhkq", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.10.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali709ad593fca", MAC:"2a:ff:5a:4e:31:af", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 18:14:21.038700 containerd[1734]: 2025-05-14 18:14:21.036 [INFO][5525] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="5ece2aadf44e46524bc8131cf8b8aee6c842c844de4b75a776e6cea4ef7eacbc" Namespace="calico-apiserver" Pod="calico-apiserver-6cb7cd99bf-2dhkq" WorkloadEndpoint="ci--4334.0.0--a--c37eb65ec1-k8s-calico--apiserver--6cb7cd99bf--2dhkq-eth0" May 14 18:14:21.282279 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1827572473.mount: Deactivated successfully. May 14 18:14:21.527552 containerd[1734]: time="2025-05-14T18:14:21.527464487Z" level=info msg="CreateContainer within sandbox \"6268536b6ec8fa754292b2fef715bacbb53d4ecd7a4ba813fc15bfea04ac1686\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"c792b174956bc6153131f290896a905444cad74329e4e44df79ef453ab273ad0\"" May 14 18:14:21.528985 containerd[1734]: time="2025-05-14T18:14:21.528882160Z" level=info msg="StartContainer for \"c792b174956bc6153131f290896a905444cad74329e4e44df79ef453ab273ad0\"" May 14 18:14:21.529883 containerd[1734]: time="2025-05-14T18:14:21.529831533Z" level=info msg="connecting to shim c792b174956bc6153131f290896a905444cad74329e4e44df79ef453ab273ad0" address="unix:///run/containerd/s/5e64867d219b2923d6b09b5ed9f4770bdcebc8ac98c6f6cdf26238f1c7e419ab" protocol=ttrpc version=3 May 14 18:14:21.548217 systemd[1]: Started cri-containerd-c792b174956bc6153131f290896a905444cad74329e4e44df79ef453ab273ad0.scope - libcontainer container c792b174956bc6153131f290896a905444cad74329e4e44df79ef453ab273ad0. 
May 14 18:14:21.623763 containerd[1734]: time="2025-05-14T18:14:21.623233772Z" level=info msg="StartContainer for \"c792b174956bc6153131f290896a905444cad74329e4e44df79ef453ab273ad0\" returns successfully" May 14 18:14:21.863150 kubelet[3205]: I0514 18:14:21.862981 3205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-v4tq6" podStartSLOduration=127.862964012 podStartE2EDuration="2m7.862964012s" podCreationTimestamp="2025-05-14 18:12:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-14 18:14:21.862286892 +0000 UTC m=+129.354810272" watchObservedRunningTime="2025-05-14 18:14:21.862964012 +0000 UTC m=+129.355487366" May 14 18:14:22.031131 containerd[1734]: time="2025-05-14T18:14:22.031105811Z" level=info msg="connecting to shim 5ece2aadf44e46524bc8131cf8b8aee6c842c844de4b75a776e6cea4ef7eacbc" address="unix:///run/containerd/s/bdcf3b9d855c690c86251067c12756062fe5efca59875dd27dba1a3ee24d29db" namespace=k8s.io protocol=ttrpc version=3 May 14 18:14:22.034392 containerd[1734]: time="2025-05-14T18:14:22.033981171Z" level=info msg="connecting to shim c7b1b6539cc481f188191335ce37cf0c1fe4bf83680aedc99229dcdc8a961b96" address="unix:///run/containerd/s/b0196effd262e3a4dcb4c9463aa7671eda8a3ad82f00b705a052773cad5993a1" namespace=k8s.io protocol=ttrpc version=3 May 14 18:14:22.060920 systemd[1]: Started cri-containerd-5ece2aadf44e46524bc8131cf8b8aee6c842c844de4b75a776e6cea4ef7eacbc.scope - libcontainer container 5ece2aadf44e46524bc8131cf8b8aee6c842c844de4b75a776e6cea4ef7eacbc. May 14 18:14:22.066209 systemd[1]: Started cri-containerd-c7b1b6539cc481f188191335ce37cf0c1fe4bf83680aedc99229dcdc8a961b96.scope - libcontainer container c7b1b6539cc481f188191335ce37cf0c1fe4bf83680aedc99229dcdc8a961b96. 
May 14 18:14:22.110702 containerd[1734]: time="2025-05-14T18:14:22.110613131Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6cb7cd99bf-2dhkq,Uid:0272f273-8188-48f8-bfc1-35ee033246c4,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"5ece2aadf44e46524bc8131cf8b8aee6c842c844de4b75a776e6cea4ef7eacbc\"" May 14 18:14:22.128187 containerd[1734]: time="2025-05-14T18:14:22.128076243Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-58d5b579d7-9wc6k,Uid:b7e2c3b5-0b2e-4624-9ffb-7b15307add96,Namespace:calico-system,Attempt:0,} returns sandbox id \"c7b1b6539cc481f188191335ce37cf0c1fe4bf83680aedc99229dcdc8a961b96\"" May 14 18:14:22.526904 systemd-networkd[1358]: cali4872644e5a8: Gained IPv6LL May 14 18:14:22.654854 systemd-networkd[1358]: cali709ad593fca: Gained IPv6LL May 14 18:14:25.157671 systemd[1]: Started sshd@10-10.200.8.38:22-10.200.16.10:55358.service - OpenSSH per-connection server daemon (10.200.16.10:55358). May 14 18:14:25.322833 containerd[1734]: time="2025-05-14T18:14:25.322801154Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 18:14:25.324944 containerd[1734]: time="2025-05-14T18:14:25.324907263Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=43021437" May 14 18:14:25.370206 containerd[1734]: time="2025-05-14T18:14:25.370146448Z" level=info msg="ImageCreate event name:\"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 18:14:25.422475 containerd[1734]: time="2025-05-14T18:14:25.422233442Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 18:14:25.423008 containerd[1734]: 
time="2025-05-14T18:14:25.422922241Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"44514075\" in 7.287179418s" May 14 18:14:25.423008 containerd[1734]: time="2025-05-14T18:14:25.422952545Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\"" May 14 18:14:25.423967 containerd[1734]: time="2025-05-14T18:14:25.423934263Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.3\"" May 14 18:14:25.424986 containerd[1734]: time="2025-05-14T18:14:25.424959725Z" level=info msg="CreateContainer within sandbox \"cb8162aac16f6172eadaca968230602ba90fd6347a2ef369762c235f582f5c91\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 14 18:14:25.671343 containerd[1734]: time="2025-05-14T18:14:25.671271876Z" level=info msg="Container 1189e6c32bd56e2c2a95f7ec1e3ff3f56769ced87375d30486dbdbd4a9b2752b: CDI devices from CRI Config.CDIDevices: []" May 14 18:14:25.793578 sshd[5703]: Accepted publickey for core from 10.200.16.10 port 55358 ssh2: RSA SHA256:Hd5PQzbog4WemQvfyzJrpsOqYyUj2ZNV4jY0cJYqkm4 May 14 18:14:25.794619 sshd-session[5703]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 18:14:25.798928 systemd-logind[1706]: New session 13 of user core. May 14 18:14:25.803885 systemd[1]: Started session-13.scope - Session 13 of User core. 
May 14 18:14:25.816665 containerd[1734]: time="2025-05-14T18:14:25.816582845Z" level=info msg="CreateContainer within sandbox \"cb8162aac16f6172eadaca968230602ba90fd6347a2ef369762c235f582f5c91\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"1189e6c32bd56e2c2a95f7ec1e3ff3f56769ced87375d30486dbdbd4a9b2752b\"" May 14 18:14:25.817128 containerd[1734]: time="2025-05-14T18:14:25.817014735Z" level=info msg="StartContainer for \"1189e6c32bd56e2c2a95f7ec1e3ff3f56769ced87375d30486dbdbd4a9b2752b\"" May 14 18:14:25.818025 containerd[1734]: time="2025-05-14T18:14:25.817995860Z" level=info msg="connecting to shim 1189e6c32bd56e2c2a95f7ec1e3ff3f56769ced87375d30486dbdbd4a9b2752b" address="unix:///run/containerd/s/b366b09fa0dfe21b4bb4e9a1670a731adbe2d78203ad1ba302bc2bac9909376c" protocol=ttrpc version=3 May 14 18:14:25.838969 systemd[1]: Started cri-containerd-1189e6c32bd56e2c2a95f7ec1e3ff3f56769ced87375d30486dbdbd4a9b2752b.scope - libcontainer container 1189e6c32bd56e2c2a95f7ec1e3ff3f56769ced87375d30486dbdbd4a9b2752b. May 14 18:14:25.880659 containerd[1734]: time="2025-05-14T18:14:25.880633848Z" level=info msg="StartContainer for \"1189e6c32bd56e2c2a95f7ec1e3ff3f56769ced87375d30486dbdbd4a9b2752b\" returns successfully" May 14 18:14:26.307412 sshd[5705]: Connection closed by 10.200.16.10 port 55358 May 14 18:14:26.307821 sshd-session[5703]: pam_unix(sshd:session): session closed for user core May 14 18:14:26.311892 systemd[1]: sshd@10-10.200.8.38:22-10.200.16.10:55358.service: Deactivated successfully. May 14 18:14:26.314946 systemd[1]: session-13.scope: Deactivated successfully. May 14 18:14:26.316381 systemd-logind[1706]: Session 13 logged out. Waiting for processes to exit. May 14 18:14:26.317256 systemd-logind[1706]: Removed session 13. 
May 14 18:14:26.876618 kubelet[3205]: I0514 18:14:26.876111 3205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6cb7cd99bf-z6fr6" podStartSLOduration=109.58799224 podStartE2EDuration="1m56.876095099s" podCreationTimestamp="2025-05-14 18:12:30 +0000 UTC" firstStartedPulling="2025-05-14 18:14:18.135491098 +0000 UTC m=+125.628014452" lastFinishedPulling="2025-05-14 18:14:25.423593956 +0000 UTC m=+132.916117311" observedRunningTime="2025-05-14 18:14:26.875928264 +0000 UTC m=+134.368451624" watchObservedRunningTime="2025-05-14 18:14:26.876095099 +0000 UTC m=+134.368618461" May 14 18:14:28.575435 containerd[1734]: time="2025-05-14T18:14:28.575382524Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 18:14:28.577467 containerd[1734]: time="2025-05-14T18:14:28.577420919Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.3: active requests=0, bytes read=7912898" May 14 18:14:28.622212 containerd[1734]: time="2025-05-14T18:14:28.622154047Z" level=info msg="ImageCreate event name:\"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 18:14:28.672151 containerd[1734]: time="2025-05-14T18:14:28.672120814Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:72455a36febc7c56ec8881007f4805caed5764026a0694e4f86a2503209b2d31\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 18:14:28.672729 containerd[1734]: time="2025-05-14T18:14:28.672641009Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.3\" with image id \"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:72455a36febc7c56ec8881007f4805caed5764026a0694e4f86a2503209b2d31\", size \"9405520\" in 
3.248581545s" May 14 18:14:28.672729 containerd[1734]: time="2025-05-14T18:14:28.672668081Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.3\" returns image reference \"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\"" May 14 18:14:28.673684 containerd[1734]: time="2025-05-14T18:14:28.673526500Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\"" May 14 18:14:28.674460 containerd[1734]: time="2025-05-14T18:14:28.674437520Z" level=info msg="CreateContainer within sandbox \"3f4ba02ca3c310df6a7b37ccb71401eb294171641c0ca3a0998019eae2eca82e\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" May 14 18:14:28.832167 containerd[1734]: time="2025-05-14T18:14:28.831443605Z" level=info msg="Container 4cd657981cef23b1cb84ed645274acfd9479bcce4e538f8faacaee8a2ebe6f10: CDI devices from CRI Config.CDIDevices: []" May 14 18:14:28.972411 containerd[1734]: time="2025-05-14T18:14:28.972390186Z" level=info msg="CreateContainer within sandbox \"3f4ba02ca3c310df6a7b37ccb71401eb294171641c0ca3a0998019eae2eca82e\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"4cd657981cef23b1cb84ed645274acfd9479bcce4e538f8faacaee8a2ebe6f10\"" May 14 18:14:28.972762 containerd[1734]: time="2025-05-14T18:14:28.972717822Z" level=info msg="StartContainer for \"4cd657981cef23b1cb84ed645274acfd9479bcce4e538f8faacaee8a2ebe6f10\"" May 14 18:14:28.973816 containerd[1734]: time="2025-05-14T18:14:28.973784485Z" level=info msg="connecting to shim 4cd657981cef23b1cb84ed645274acfd9479bcce4e538f8faacaee8a2ebe6f10" address="unix:///run/containerd/s/fa0f5024c5cfaf968a7887e4813dc6b36c9867a611971280240ecec9c2d91466" protocol=ttrpc version=3 May 14 18:14:28.992897 systemd[1]: Started cri-containerd-4cd657981cef23b1cb84ed645274acfd9479bcce4e538f8faacaee8a2ebe6f10.scope - libcontainer container 4cd657981cef23b1cb84ed645274acfd9479bcce4e538f8faacaee8a2ebe6f10. 
May 14 18:14:29.067622 containerd[1734]: time="2025-05-14T18:14:29.067601744Z" level=info msg="StartContainer for \"4cd657981cef23b1cb84ed645274acfd9479bcce4e538f8faacaee8a2ebe6f10\" returns successfully" May 14 18:14:29.568830 containerd[1734]: time="2025-05-14T18:14:29.568796864Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 18:14:29.615544 containerd[1734]: time="2025-05-14T18:14:29.615509798Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=77" May 14 18:14:29.616995 containerd[1734]: time="2025-05-14T18:14:29.616953987Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"44514075\" in 943.402962ms" May 14 18:14:29.616995 containerd[1734]: time="2025-05-14T18:14:29.616994306Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\"" May 14 18:14:29.617676 containerd[1734]: time="2025-05-14T18:14:29.617654045Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\"" May 14 18:14:29.618541 containerd[1734]: time="2025-05-14T18:14:29.618521265Z" level=info msg="CreateContainer within sandbox \"5ece2aadf44e46524bc8131cf8b8aee6c842c844de4b75a776e6cea4ef7eacbc\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 14 18:14:29.867858 containerd[1734]: time="2025-05-14T18:14:29.867830818Z" level=info msg="Container aac1f60029bd88004bada10ed6ead1a1df69819e8e58562930b3b937fd6629c7: CDI devices from CRI Config.CDIDevices: []" May 14 18:14:30.120369 
containerd[1734]: time="2025-05-14T18:14:30.120301820Z" level=info msg="CreateContainer within sandbox \"5ece2aadf44e46524bc8131cf8b8aee6c842c844de4b75a776e6cea4ef7eacbc\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"aac1f60029bd88004bada10ed6ead1a1df69819e8e58562930b3b937fd6629c7\"" May 14 18:14:30.120759 containerd[1734]: time="2025-05-14T18:14:30.120724745Z" level=info msg="StartContainer for \"aac1f60029bd88004bada10ed6ead1a1df69819e8e58562930b3b937fd6629c7\"" May 14 18:14:30.122070 containerd[1734]: time="2025-05-14T18:14:30.122048778Z" level=info msg="connecting to shim aac1f60029bd88004bada10ed6ead1a1df69819e8e58562930b3b937fd6629c7" address="unix:///run/containerd/s/bdcf3b9d855c690c86251067c12756062fe5efca59875dd27dba1a3ee24d29db" protocol=ttrpc version=3 May 14 18:14:30.144884 systemd[1]: Started cri-containerd-aac1f60029bd88004bada10ed6ead1a1df69819e8e58562930b3b937fd6629c7.scope - libcontainer container aac1f60029bd88004bada10ed6ead1a1df69819e8e58562930b3b937fd6629c7. 
May 14 18:14:30.428435 containerd[1734]: time="2025-05-14T18:14:30.428063621Z" level=info msg="StartContainer for \"aac1f60029bd88004bada10ed6ead1a1df69819e8e58562930b3b937fd6629c7\" returns successfully" May 14 18:14:30.888914 kubelet[3205]: I0514 18:14:30.888822 3205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6cb7cd99bf-2dhkq" podStartSLOduration=113.382965148 podStartE2EDuration="2m0.888804327s" podCreationTimestamp="2025-05-14 18:12:30 +0000 UTC" firstStartedPulling="2025-05-14 18:14:22.111675109 +0000 UTC m=+129.604198462" lastFinishedPulling="2025-05-14 18:14:29.61751428 +0000 UTC m=+137.110037641" observedRunningTime="2025-05-14 18:14:30.888683627 +0000 UTC m=+138.381207006" watchObservedRunningTime="2025-05-14 18:14:30.888804327 +0000 UTC m=+138.381327685" May 14 18:14:31.421513 systemd[1]: Started sshd@11-10.200.8.38:22-10.200.16.10:54964.service - OpenSSH per-connection server daemon (10.200.16.10:54964). May 14 18:14:32.058038 sshd[5839]: Accepted publickey for core from 10.200.16.10 port 54964 ssh2: RSA SHA256:Hd5PQzbog4WemQvfyzJrpsOqYyUj2ZNV4jY0cJYqkm4 May 14 18:14:32.059310 sshd-session[5839]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 18:14:32.063728 systemd-logind[1706]: New session 14 of user core. May 14 18:14:32.074879 systemd[1]: Started session-14.scope - Session 14 of User core. May 14 18:14:32.560479 sshd[5841]: Connection closed by 10.200.16.10 port 54964 May 14 18:14:32.560895 sshd-session[5839]: pam_unix(sshd:session): session closed for user core May 14 18:14:32.563563 systemd[1]: sshd@11-10.200.8.38:22-10.200.16.10:54964.service: Deactivated successfully. May 14 18:14:32.565148 systemd[1]: session-14.scope: Deactivated successfully. May 14 18:14:32.565936 systemd-logind[1706]: Session 14 logged out. Waiting for processes to exit. May 14 18:14:32.567092 systemd-logind[1706]: Removed session 14. 
May 14 18:14:35.110140 containerd[1734]: time="2025-05-14T18:14:35.110097211Z" level=info msg="TaskExit event in podsandbox handler container_id:\"42ee62a33139b6c4fb3dbe070aa1de09d911cd4bb817783c31a97b968b7f553c\" id:\"61a579ae40b91340c6385f57c72fa84c2deaf1dbfd32c359290b44ab8a2d2838\" pid:5871 exited_at:{seconds:1747246475 nanos:109810465}" May 14 18:14:35.818422 containerd[1734]: time="2025-05-14T18:14:35.818378680Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 18:14:35.820822 containerd[1734]: time="2025-05-14T18:14:35.820788490Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.3: active requests=0, bytes read=34789138" May 14 18:14:35.885571 containerd[1734]: time="2025-05-14T18:14:35.885513428Z" level=info msg="ImageCreate event name:\"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 18:14:35.930096 containerd[1734]: time="2025-05-14T18:14:35.930038083Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:feaab0197035d474845e0f8137a99a78cab274f0a3cac4d5485cf9b1bdf9ffa9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 18:14:35.930528 containerd[1734]: time="2025-05-14T18:14:35.930502747Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" with image id \"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:feaab0197035d474845e0f8137a99a78cab274f0a3cac4d5485cf9b1bdf9ffa9\", size \"36281728\" in 6.312535632s" May 14 18:14:35.930569 containerd[1734]: time="2025-05-14T18:14:35.930531108Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" returns image reference 
\"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\"" May 14 18:14:35.931440 containerd[1734]: time="2025-05-14T18:14:35.931355511Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\"" May 14 18:14:35.941043 containerd[1734]: time="2025-05-14T18:14:35.940929286Z" level=info msg="CreateContainer within sandbox \"c7b1b6539cc481f188191335ce37cf0c1fe4bf83680aedc99229dcdc8a961b96\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" May 14 18:14:36.124451 containerd[1734]: time="2025-05-14T18:14:36.124398375Z" level=info msg="Container 210d2686a9522baf4ff501a148e46a4f46c581521a0c5d7c8b0876b9f209cbb6: CDI devices from CRI Config.CDIDevices: []" May 14 18:14:36.266071 containerd[1734]: time="2025-05-14T18:14:36.266047053Z" level=info msg="CreateContainer within sandbox \"c7b1b6539cc481f188191335ce37cf0c1fe4bf83680aedc99229dcdc8a961b96\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"210d2686a9522baf4ff501a148e46a4f46c581521a0c5d7c8b0876b9f209cbb6\"" May 14 18:14:36.266563 containerd[1734]: time="2025-05-14T18:14:36.266462505Z" level=info msg="StartContainer for \"210d2686a9522baf4ff501a148e46a4f46c581521a0c5d7c8b0876b9f209cbb6\"" May 14 18:14:36.267597 containerd[1734]: time="2025-05-14T18:14:36.267565077Z" level=info msg="connecting to shim 210d2686a9522baf4ff501a148e46a4f46c581521a0c5d7c8b0876b9f209cbb6" address="unix:///run/containerd/s/b0196effd262e3a4dcb4c9463aa7671eda8a3ad82f00b705a052773cad5993a1" protocol=ttrpc version=3 May 14 18:14:36.287871 systemd[1]: Started cri-containerd-210d2686a9522baf4ff501a148e46a4f46c581521a0c5d7c8b0876b9f209cbb6.scope - libcontainer container 210d2686a9522baf4ff501a148e46a4f46c581521a0c5d7c8b0876b9f209cbb6. 
May 14 18:14:36.332387 containerd[1734]: time="2025-05-14T18:14:36.332368393Z" level=info msg="StartContainer for \"210d2686a9522baf4ff501a148e46a4f46c581521a0c5d7c8b0876b9f209cbb6\" returns successfully" May 14 18:14:36.901806 kubelet[3205]: I0514 18:14:36.901496 3205 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-58d5b579d7-9wc6k" podStartSLOduration=111.099408925 podStartE2EDuration="2m4.901477707s" podCreationTimestamp="2025-05-14 18:12:32 +0000 UTC" firstStartedPulling="2025-05-14 18:14:22.129130441 +0000 UTC m=+129.621653798" lastFinishedPulling="2025-05-14 18:14:35.931199227 +0000 UTC m=+143.423722580" observedRunningTime="2025-05-14 18:14:36.899638411 +0000 UTC m=+144.392161773" watchObservedRunningTime="2025-05-14 18:14:36.901477707 +0000 UTC m=+144.394001076" May 14 18:14:36.926400 containerd[1734]: time="2025-05-14T18:14:36.926361918Z" level=info msg="TaskExit event in podsandbox handler container_id:\"210d2686a9522baf4ff501a148e46a4f46c581521a0c5d7c8b0876b9f209cbb6\" id:\"163e1d82a7982db6801c9c85dc0d9f32831e6608dae099f2f948a72fb777e03d\" pid:5934 exited_at:{seconds:1747246476 nanos:926101839}" May 14 18:14:37.672522 systemd[1]: Started sshd@12-10.200.8.38:22-10.200.16.10:54970.service - OpenSSH per-connection server daemon (10.200.16.10:54970). May 14 18:14:38.308610 sshd[5945]: Accepted publickey for core from 10.200.16.10 port 54970 ssh2: RSA SHA256:Hd5PQzbog4WemQvfyzJrpsOqYyUj2ZNV4jY0cJYqkm4 May 14 18:14:38.309728 sshd-session[5945]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 18:14:38.316376 systemd-logind[1706]: New session 15 of user core. May 14 18:14:38.321895 systemd[1]: Started session-15.scope - Session 15 of User core. 
May 14 18:14:38.525739 containerd[1734]: time="2025-05-14T18:14:38.525702375Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 18:14:38.572805 containerd[1734]: time="2025-05-14T18:14:38.572731792Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3: active requests=0, bytes read=13991773" May 14 18:14:38.619520 containerd[1734]: time="2025-05-14T18:14:38.619461098Z" level=info msg="ImageCreate event name:\"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 18:14:38.669146 containerd[1734]: time="2025-05-14T18:14:38.669118739Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:3f15090a9bb45773d1fd019455ec3d3f3746f3287c35d8013e497b38d8237324\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 18:14:38.669665 containerd[1734]: time="2025-05-14T18:14:38.669429776Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" with image id \"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:3f15090a9bb45773d1fd019455ec3d3f3746f3287c35d8013e497b38d8237324\", size \"15484347\" in 2.73804588s" May 14 18:14:38.669665 containerd[1734]: time="2025-05-14T18:14:38.669455307Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" returns image reference \"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\"" May 14 18:14:38.672151 containerd[1734]: time="2025-05-14T18:14:38.672127705Z" level=info msg="CreateContainer within sandbox \"3f4ba02ca3c310df6a7b37ccb71401eb294171641c0ca3a0998019eae2eca82e\" for container 
&ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" May 14 18:14:38.815074 sshd[5951]: Connection closed by 10.200.16.10 port 54970 May 14 18:14:38.815457 sshd-session[5945]: pam_unix(sshd:session): session closed for user core May 14 18:14:38.818100 systemd[1]: sshd@12-10.200.8.38:22-10.200.16.10:54970.service: Deactivated successfully. May 14 18:14:38.819653 systemd[1]: session-15.scope: Deactivated successfully. May 14 18:14:38.820371 systemd-logind[1706]: Session 15 logged out. Waiting for processes to exit. May 14 18:14:38.821468 systemd-logind[1706]: Removed session 15. May 14 18:14:38.827355 containerd[1734]: time="2025-05-14T18:14:38.827301039Z" level=info msg="Container e274fcf2c7298b3bcfe9f3c03bf9b1d62cce597ae710d122799eefa2d0e765cd: CDI devices from CRI Config.CDIDevices: []" May 14 18:14:38.926528 containerd[1734]: time="2025-05-14T18:14:38.926500136Z" level=info msg="CreateContainer within sandbox \"3f4ba02ca3c310df6a7b37ccb71401eb294171641c0ca3a0998019eae2eca82e\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"e274fcf2c7298b3bcfe9f3c03bf9b1d62cce597ae710d122799eefa2d0e765cd\"" May 14 18:14:38.926906 containerd[1734]: time="2025-05-14T18:14:38.926839624Z" level=info msg="StartContainer for \"e274fcf2c7298b3bcfe9f3c03bf9b1d62cce597ae710d122799eefa2d0e765cd\"" May 14 18:14:38.927871 containerd[1734]: time="2025-05-14T18:14:38.927838417Z" level=info msg="connecting to shim e274fcf2c7298b3bcfe9f3c03bf9b1d62cce597ae710d122799eefa2d0e765cd" address="unix:///run/containerd/s/fa0f5024c5cfaf968a7887e4813dc6b36c9867a611971280240ecec9c2d91466" protocol=ttrpc version=3 May 14 18:14:38.947859 systemd[1]: Started cri-containerd-e274fcf2c7298b3bcfe9f3c03bf9b1d62cce597ae710d122799eefa2d0e765cd.scope - libcontainer container e274fcf2c7298b3bcfe9f3c03bf9b1d62cce597ae710d122799eefa2d0e765cd. 
May 14 18:14:38.975389 containerd[1734]: time="2025-05-14T18:14:38.975354753Z" level=info msg="StartContainer for \"e274fcf2c7298b3bcfe9f3c03bf9b1d62cce597ae710d122799eefa2d0e765cd\" returns successfully" May 14 18:14:39.704983 kubelet[3205]: I0514 18:14:39.704948 3205 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 May 14 18:14:39.704983 kubelet[3205]: I0514 18:14:39.704986 3205 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock May 14 18:14:43.929427 systemd[1]: Started sshd@13-10.200.8.38:22-10.200.16.10:52458.service - OpenSSH per-connection server daemon (10.200.16.10:52458). May 14 18:14:44.562884 sshd[5996]: Accepted publickey for core from 10.200.16.10 port 52458 ssh2: RSA SHA256:Hd5PQzbog4WemQvfyzJrpsOqYyUj2ZNV4jY0cJYqkm4 May 14 18:14:44.563963 sshd-session[5996]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 18:14:44.567782 systemd-logind[1706]: New session 16 of user core. May 14 18:14:44.572912 systemd[1]: Started session-16.scope - Session 16 of User core. May 14 18:14:45.052943 sshd[5999]: Connection closed by 10.200.16.10 port 52458 May 14 18:14:45.053377 sshd-session[5996]: pam_unix(sshd:session): session closed for user core May 14 18:14:45.055974 systemd[1]: sshd@13-10.200.8.38:22-10.200.16.10:52458.service: Deactivated successfully. May 14 18:14:45.057428 systemd[1]: session-16.scope: Deactivated successfully. May 14 18:14:45.058129 systemd-logind[1706]: Session 16 logged out. Waiting for processes to exit. May 14 18:14:45.059347 systemd-logind[1706]: Removed session 16. 
May 14 18:14:48.469149 containerd[1734]: time="2025-05-14T18:14:48.469106305Z" level=info msg="TaskExit event in podsandbox handler container_id:\"210d2686a9522baf4ff501a148e46a4f46c581521a0c5d7c8b0876b9f209cbb6\" id:\"28c2fc10bd269130e91e5a9e982bcd007bf2f633890cf7412976c9241a96c6b2\" pid:6027 exited_at:{seconds:1747246488 nanos:468914691}" May 14 18:14:50.169911 systemd[1]: Started sshd@14-10.200.8.38:22-10.200.16.10:50282.service - OpenSSH per-connection server daemon (10.200.16.10:50282). May 14 18:14:50.808200 sshd[6037]: Accepted publickey for core from 10.200.16.10 port 50282 ssh2: RSA SHA256:Hd5PQzbog4WemQvfyzJrpsOqYyUj2ZNV4jY0cJYqkm4 May 14 18:14:50.809229 sshd-session[6037]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 18:14:50.814554 systemd-logind[1706]: New session 17 of user core. May 14 18:14:50.817873 systemd[1]: Started session-17.scope - Session 17 of User core. May 14 18:14:51.312359 sshd[6042]: Connection closed by 10.200.16.10 port 50282 May 14 18:14:51.312804 sshd-session[6037]: pam_unix(sshd:session): session closed for user core May 14 18:14:51.315119 systemd[1]: sshd@14-10.200.8.38:22-10.200.16.10:50282.service: Deactivated successfully. May 14 18:14:51.316644 systemd[1]: session-17.scope: Deactivated successfully. May 14 18:14:51.318186 systemd-logind[1706]: Session 17 logged out. Waiting for processes to exit. May 14 18:14:51.318966 systemd-logind[1706]: Removed session 17. May 14 18:14:56.429583 systemd[1]: Started sshd@15-10.200.8.38:22-10.200.16.10:50298.service - OpenSSH per-connection server daemon (10.200.16.10:50298). May 14 18:14:57.063892 sshd[6061]: Accepted publickey for core from 10.200.16.10 port 50298 ssh2: RSA SHA256:Hd5PQzbog4WemQvfyzJrpsOqYyUj2ZNV4jY0cJYqkm4 May 14 18:14:57.064922 sshd-session[6061]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 18:14:57.068536 systemd-logind[1706]: New session 18 of user core. 
May 14 18:14:57.072883 systemd[1]: Started session-18.scope - Session 18 of User core. May 14 18:14:57.557937 sshd[6063]: Connection closed by 10.200.16.10 port 50298 May 14 18:14:57.558333 sshd-session[6061]: pam_unix(sshd:session): session closed for user core May 14 18:14:57.560913 systemd[1]: sshd@15-10.200.8.38:22-10.200.16.10:50298.service: Deactivated successfully. May 14 18:14:57.562348 systemd[1]: session-18.scope: Deactivated successfully. May 14 18:14:57.563042 systemd-logind[1706]: Session 18 logged out. Waiting for processes to exit. May 14 18:14:57.564059 systemd-logind[1706]: Removed session 18. May 14 18:15:02.677674 systemd[1]: Started sshd@16-10.200.8.38:22-10.200.16.10:47918.service - OpenSSH per-connection server daemon (10.200.16.10:47918). May 14 18:15:03.311374 sshd[6083]: Accepted publickey for core from 10.200.16.10 port 47918 ssh2: RSA SHA256:Hd5PQzbog4WemQvfyzJrpsOqYyUj2ZNV4jY0cJYqkm4 May 14 18:15:03.312675 sshd-session[6083]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 18:15:03.316933 systemd-logind[1706]: New session 19 of user core. May 14 18:15:03.319902 systemd[1]: Started session-19.scope - Session 19 of User core. May 14 18:15:03.800961 sshd[6085]: Connection closed by 10.200.16.10 port 47918 May 14 18:15:03.801361 sshd-session[6083]: pam_unix(sshd:session): session closed for user core May 14 18:15:03.804046 systemd[1]: sshd@16-10.200.8.38:22-10.200.16.10:47918.service: Deactivated successfully. May 14 18:15:03.805606 systemd[1]: session-19.scope: Deactivated successfully. May 14 18:15:03.806278 systemd-logind[1706]: Session 19 logged out. Waiting for processes to exit. May 14 18:15:03.807445 systemd-logind[1706]: Removed session 19. May 14 18:15:03.912368 systemd[1]: Started sshd@17-10.200.8.38:22-10.200.16.10:47934.service - OpenSSH per-connection server daemon (10.200.16.10:47934). 
May 14 18:15:04.543464 sshd[6099]: Accepted publickey for core from 10.200.16.10 port 47934 ssh2: RSA SHA256:Hd5PQzbog4WemQvfyzJrpsOqYyUj2ZNV4jY0cJYqkm4 May 14 18:15:04.544492 sshd-session[6099]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 18:15:04.548599 systemd-logind[1706]: New session 20 of user core. May 14 18:15:04.552895 systemd[1]: Started session-20.scope - Session 20 of User core. May 14 18:15:05.059312 sshd[6101]: Connection closed by 10.200.16.10 port 47934 May 14 18:15:05.059787 sshd-session[6099]: pam_unix(sshd:session): session closed for user core May 14 18:15:05.062723 systemd[1]: sshd@17-10.200.8.38:22-10.200.16.10:47934.service: Deactivated successfully. May 14 18:15:05.064174 systemd[1]: session-20.scope: Deactivated successfully. May 14 18:15:05.064870 systemd-logind[1706]: Session 20 logged out. Waiting for processes to exit. May 14 18:15:05.065897 systemd-logind[1706]: Removed session 20. May 14 18:15:05.108625 containerd[1734]: time="2025-05-14T18:15:05.108586787Z" level=info msg="TaskExit event in podsandbox handler container_id:\"42ee62a33139b6c4fb3dbe070aa1de09d911cd4bb817783c31a97b968b7f553c\" id:\"433b828906c4eba926b5d5ba53038ad7eaff9c7754899550d3fbb3e0d025721c\" pid:6122 exited_at:{seconds:1747246505 nanos:108367030}" May 14 18:15:05.175243 systemd[1]: Started sshd@18-10.200.8.38:22-10.200.16.10:47938.service - OpenSSH per-connection server daemon (10.200.16.10:47938). May 14 18:15:05.807110 sshd[6136]: Accepted publickey for core from 10.200.16.10 port 47938 ssh2: RSA SHA256:Hd5PQzbog4WemQvfyzJrpsOqYyUj2ZNV4jY0cJYqkm4 May 14 18:15:05.808365 sshd-session[6136]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 18:15:05.811786 systemd-logind[1706]: New session 21 of user core. May 14 18:15:05.820891 systemd[1]: Started session-21.scope - Session 21 of User core. 
May 14 18:15:06.298972 sshd[6138]: Connection closed by 10.200.16.10 port 47938 May 14 18:15:06.299380 sshd-session[6136]: pam_unix(sshd:session): session closed for user core May 14 18:15:06.301516 systemd[1]: sshd@18-10.200.8.38:22-10.200.16.10:47938.service: Deactivated successfully. May 14 18:15:06.304097 systemd[1]: session-21.scope: Deactivated successfully. May 14 18:15:06.304897 systemd-logind[1706]: Session 21 logged out. Waiting for processes to exit. May 14 18:15:06.305622 systemd-logind[1706]: Removed session 21. May 14 18:15:11.411661 systemd[1]: Started sshd@19-10.200.8.38:22-10.200.16.10:35262.service - OpenSSH per-connection server daemon (10.200.16.10:35262). May 14 18:15:12.050627 sshd[6150]: Accepted publickey for core from 10.200.16.10 port 35262 ssh2: RSA SHA256:Hd5PQzbog4WemQvfyzJrpsOqYyUj2ZNV4jY0cJYqkm4 May 14 18:15:12.051896 sshd-session[6150]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 18:15:12.056275 systemd-logind[1706]: New session 22 of user core. May 14 18:15:12.060904 systemd[1]: Started session-22.scope - Session 22 of User core. May 14 18:15:12.554794 sshd[6152]: Connection closed by 10.200.16.10 port 35262 May 14 18:15:12.555231 sshd-session[6150]: pam_unix(sshd:session): session closed for user core May 14 18:15:12.558045 systemd[1]: sshd@19-10.200.8.38:22-10.200.16.10:35262.service: Deactivated successfully. May 14 18:15:12.559595 systemd[1]: session-22.scope: Deactivated successfully. May 14 18:15:12.560264 systemd-logind[1706]: Session 22 logged out. Waiting for processes to exit. May 14 18:15:12.561323 systemd-logind[1706]: Removed session 22. May 14 18:15:17.666694 systemd[1]: Started sshd@20-10.200.8.38:22-10.200.16.10:35264.service - OpenSSH per-connection server daemon (10.200.16.10:35264). 
May 14 18:15:18.301139 sshd[6168]: Accepted publickey for core from 10.200.16.10 port 35264 ssh2: RSA SHA256:Hd5PQzbog4WemQvfyzJrpsOqYyUj2ZNV4jY0cJYqkm4 May 14 18:15:18.302372 sshd-session[6168]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 18:15:18.306607 systemd-logind[1706]: New session 23 of user core. May 14 18:15:18.312864 systemd[1]: Started session-23.scope - Session 23 of User core. May 14 18:15:18.475108 containerd[1734]: time="2025-05-14T18:15:18.475076288Z" level=info msg="TaskExit event in podsandbox handler container_id:\"210d2686a9522baf4ff501a148e46a4f46c581521a0c5d7c8b0876b9f209cbb6\" id:\"a82233b8266e2d0c22fee57127b294d18c5490171790dcb0ed0c7f090f55d3ad\" pid:6196 exited_at:{seconds:1747246518 nanos:474397333}" May 14 18:15:18.475735 containerd[1734]: time="2025-05-14T18:15:18.475714125Z" level=info msg="TaskExit event in podsandbox handler container_id:\"210d2686a9522baf4ff501a148e46a4f46c581521a0c5d7c8b0876b9f209cbb6\" id:\"2890892e13df5c6eb48e339b5fadbb4d6490392bc225c670a296c73d6c4228ae\" pid:6195 exited_at:{seconds:1747246518 nanos:475414499}" May 14 18:15:18.785449 sshd[6170]: Connection closed by 10.200.16.10 port 35264 May 14 18:15:18.785836 sshd-session[6168]: pam_unix(sshd:session): session closed for user core May 14 18:15:18.788278 systemd[1]: sshd@20-10.200.8.38:22-10.200.16.10:35264.service: Deactivated successfully. May 14 18:15:18.789786 systemd[1]: session-23.scope: Deactivated successfully. May 14 18:15:18.790502 systemd-logind[1706]: Session 23 logged out. Waiting for processes to exit. May 14 18:15:18.791454 systemd-logind[1706]: Removed session 23. May 14 18:15:23.904764 systemd[1]: Started sshd@21-10.200.8.38:22-10.200.16.10:42486.service - OpenSSH per-connection server daemon (10.200.16.10:42486). 
May 14 18:15:24.534400 sshd[6225]: Accepted publickey for core from 10.200.16.10 port 42486 ssh2: RSA SHA256:Hd5PQzbog4WemQvfyzJrpsOqYyUj2ZNV4jY0cJYqkm4 May 14 18:15:24.535573 sshd-session[6225]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 18:15:24.540869 systemd-logind[1706]: New session 24 of user core. May 14 18:15:24.546044 systemd[1]: Started session-24.scope - Session 24 of User core. May 14 18:15:25.032324 sshd[6230]: Connection closed by 10.200.16.10 port 42486 May 14 18:15:25.032709 sshd-session[6225]: pam_unix(sshd:session): session closed for user core May 14 18:15:25.034779 systemd[1]: sshd@21-10.200.8.38:22-10.200.16.10:42486.service: Deactivated successfully. May 14 18:15:25.036439 systemd[1]: session-24.scope: Deactivated successfully. May 14 18:15:25.038112 systemd-logind[1706]: Session 24 logged out. Waiting for processes to exit. May 14 18:15:25.038939 systemd-logind[1706]: Removed session 24. May 14 18:15:30.148945 systemd[1]: Started sshd@22-10.200.8.38:22-10.200.16.10:36716.service - OpenSSH per-connection server daemon (10.200.16.10:36716). May 14 18:15:30.787476 sshd[6243]: Accepted publickey for core from 10.200.16.10 port 36716 ssh2: RSA SHA256:Hd5PQzbog4WemQvfyzJrpsOqYyUj2ZNV4jY0cJYqkm4 May 14 18:15:30.788695 sshd-session[6243]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 18:15:30.793225 systemd-logind[1706]: New session 25 of user core. May 14 18:15:30.798869 systemd[1]: Started session-25.scope - Session 25 of User core. May 14 18:15:31.274944 sshd[6245]: Connection closed by 10.200.16.10 port 36716 May 14 18:15:31.275322 sshd-session[6243]: pam_unix(sshd:session): session closed for user core May 14 18:15:31.277992 systemd[1]: sshd@22-10.200.8.38:22-10.200.16.10:36716.service: Deactivated successfully. May 14 18:15:31.279588 systemd[1]: session-25.scope: Deactivated successfully. May 14 18:15:31.280190 systemd-logind[1706]: Session 25 logged out. 
Waiting for processes to exit. May 14 18:15:31.281300 systemd-logind[1706]: Removed session 25. May 14 18:15:35.112769 containerd[1734]: time="2025-05-14T18:15:35.112690967Z" level=info msg="TaskExit event in podsandbox handler container_id:\"42ee62a33139b6c4fb3dbe070aa1de09d911cd4bb817783c31a97b968b7f553c\" id:\"18fb2d7aa2f6ee679ada5adcb46fc23819aba7b206b2f4c7a170df0d8c5938ac\" pid:6269 exited_at:{seconds:1747246535 nanos:112436311}" May 14 18:15:35.757476 update_engine[1707]: I20250514 18:15:35.757426 1707 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs May 14 18:15:35.757476 update_engine[1707]: I20250514 18:15:35.757472 1707 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs May 14 18:15:35.757861 update_engine[1707]: I20250514 18:15:35.757611 1707 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs May 14 18:15:35.758000 update_engine[1707]: I20250514 18:15:35.757987 1707 omaha_request_params.cc:62] Current group set to developer May 14 18:15:35.758115 update_engine[1707]: I20250514 18:15:35.758097 1707 update_attempter.cc:499] Already updated boot flags. Skipping. May 14 18:15:35.758543 update_engine[1707]: I20250514 18:15:35.758150 1707 update_attempter.cc:643] Scheduling an action processor start. 
May 14 18:15:35.758543 update_engine[1707]: I20250514 18:15:35.758170 1707 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction May 14 18:15:35.758543 update_engine[1707]: I20250514 18:15:35.758200 1707 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs May 14 18:15:35.758543 update_engine[1707]: I20250514 18:15:35.758246 1707 omaha_request_action.cc:271] Posting an Omaha request to disabled May 14 18:15:35.758543 update_engine[1707]: I20250514 18:15:35.758251 1707 omaha_request_action.cc:272] Request: May 14 18:15:35.758543 update_engine[1707]: May 14 18:15:35.758543 update_engine[1707]: May 14 18:15:35.758543 update_engine[1707]: May 14 18:15:35.758543 update_engine[1707]: May 14 18:15:35.758543 update_engine[1707]: May 14 18:15:35.758543 update_engine[1707]: May 14 18:15:35.758543 update_engine[1707]: May 14 18:15:35.758543 update_engine[1707]: May 14 18:15:35.758543 update_engine[1707]: I20250514 18:15:35.758256 1707 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 14 18:15:35.758844 locksmithd[1801]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 May 14 18:15:35.759316 update_engine[1707]: I20250514 18:15:35.759290 1707 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 14 18:15:35.759577 update_engine[1707]: I20250514 18:15:35.759555 1707 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. May 14 18:15:35.802195 update_engine[1707]: E20250514 18:15:35.802162 1707 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 14 18:15:35.802282 update_engine[1707]: I20250514 18:15:35.802245 1707 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 May 14 18:15:36.387540 systemd[1]: Started sshd@23-10.200.8.38:22-10.200.16.10:36728.service - OpenSSH per-connection server daemon (10.200.16.10:36728). 
May 14 18:15:37.020525 sshd[6289]: Accepted publickey for core from 10.200.16.10 port 36728 ssh2: RSA SHA256:Hd5PQzbog4WemQvfyzJrpsOqYyUj2ZNV4jY0cJYqkm4 May 14 18:15:37.023001 sshd-session[6289]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 18:15:37.030000 systemd-logind[1706]: New session 26 of user core. May 14 18:15:37.033086 systemd[1]: Started session-26.scope - Session 26 of User core. May 14 18:15:37.521325 sshd[6291]: Connection closed by 10.200.16.10 port 36728 May 14 18:15:37.521472 sshd-session[6289]: pam_unix(sshd:session): session closed for user core May 14 18:15:37.524561 systemd[1]: sshd@23-10.200.8.38:22-10.200.16.10:36728.service: Deactivated successfully. May 14 18:15:37.526952 systemd[1]: session-26.scope: Deactivated successfully. May 14 18:15:37.527632 systemd-logind[1706]: Session 26 logged out. Waiting for processes to exit. May 14 18:15:37.529529 systemd-logind[1706]: Removed session 26. May 14 18:15:42.639704 systemd[1]: Started sshd@24-10.200.8.38:22-10.200.16.10:45832.service - OpenSSH per-connection server daemon (10.200.16.10:45832). May 14 18:15:43.301371 sshd[6305]: Accepted publickey for core from 10.200.16.10 port 45832 ssh2: RSA SHA256:Hd5PQzbog4WemQvfyzJrpsOqYyUj2ZNV4jY0cJYqkm4 May 14 18:15:43.302308 sshd-session[6305]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 18:15:43.306294 systemd-logind[1706]: New session 27 of user core. May 14 18:15:43.311937 systemd[1]: Started session-27.scope - Session 27 of User core. May 14 18:15:43.796320 sshd[6307]: Connection closed by 10.200.16.10 port 45832 May 14 18:15:43.796719 sshd-session[6305]: pam_unix(sshd:session): session closed for user core May 14 18:15:43.798782 systemd[1]: sshd@24-10.200.8.38:22-10.200.16.10:45832.service: Deactivated successfully. May 14 18:15:43.800399 systemd[1]: session-27.scope: Deactivated successfully. May 14 18:15:43.802100 systemd-logind[1706]: Session 27 logged out. 
Waiting for processes to exit. May 14 18:15:43.803244 systemd-logind[1706]: Removed session 27. May 14 18:15:45.754094 update_engine[1707]: I20250514 18:15:45.754037 1707 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 14 18:15:45.754451 update_engine[1707]: I20250514 18:15:45.754264 1707 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 14 18:15:45.754527 update_engine[1707]: I20250514 18:15:45.754491 1707 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. May 14 18:15:45.908290 update_engine[1707]: E20250514 18:15:45.908246 1707 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 14 18:15:45.908388 update_engine[1707]: I20250514 18:15:45.908313 1707 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 May 14 18:15:48.465284 containerd[1734]: time="2025-05-14T18:15:48.465160112Z" level=info msg="TaskExit event in podsandbox handler container_id:\"210d2686a9522baf4ff501a148e46a4f46c581521a0c5d7c8b0876b9f209cbb6\" id:\"7e0cde2011c37a091f8c326a1abff6d30f8eefc6004bfbad52bc81c92b14c646\" pid:6333 exited_at:{seconds:1747246548 nanos:464958858}" May 14 18:15:48.911810 systemd[1]: Started sshd@25-10.200.8.38:22-10.200.16.10:57604.service - OpenSSH per-connection server daemon (10.200.16.10:57604). May 14 18:15:49.551447 sshd[6343]: Accepted publickey for core from 10.200.16.10 port 57604 ssh2: RSA SHA256:Hd5PQzbog4WemQvfyzJrpsOqYyUj2ZNV4jY0cJYqkm4 May 14 18:15:49.552547 sshd-session[6343]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 18:15:49.556498 systemd-logind[1706]: New session 28 of user core. May 14 18:15:49.563881 systemd[1]: Started session-28.scope - Session 28 of User core. 
May 14 18:15:50.040576 sshd[6345]: Connection closed by 10.200.16.10 port 57604 May 14 18:15:50.040974 sshd-session[6343]: pam_unix(sshd:session): session closed for user core May 14 18:15:50.043894 systemd[1]: sshd@25-10.200.8.38:22-10.200.16.10:57604.service: Deactivated successfully. May 14 18:15:50.045458 systemd[1]: session-28.scope: Deactivated successfully. May 14 18:15:50.046223 systemd-logind[1706]: Session 28 logged out. Waiting for processes to exit. May 14 18:15:50.047297 systemd-logind[1706]: Removed session 28. May 14 18:15:50.153258 systemd[1]: Started sshd@26-10.200.8.38:22-10.200.16.10:57620.service - OpenSSH per-connection server daemon (10.200.16.10:57620). May 14 18:15:50.787617 sshd[6356]: Accepted publickey for core from 10.200.16.10 port 57620 ssh2: RSA SHA256:Hd5PQzbog4WemQvfyzJrpsOqYyUj2ZNV4jY0cJYqkm4 May 14 18:15:50.788879 sshd-session[6356]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 18:15:50.792796 systemd-logind[1706]: New session 29 of user core. May 14 18:15:50.798873 systemd[1]: Started session-29.scope - Session 29 of User core. May 14 18:15:51.342289 sshd[6358]: Connection closed by 10.200.16.10 port 57620 May 14 18:15:51.342818 sshd-session[6356]: pam_unix(sshd:session): session closed for user core May 14 18:15:51.345306 systemd[1]: sshd@26-10.200.8.38:22-10.200.16.10:57620.service: Deactivated successfully. May 14 18:15:51.346987 systemd[1]: session-29.scope: Deactivated successfully. May 14 18:15:51.348685 systemd-logind[1706]: Session 29 logged out. Waiting for processes to exit. May 14 18:15:51.349538 systemd-logind[1706]: Removed session 29. May 14 18:15:51.456181 systemd[1]: Started sshd@27-10.200.8.38:22-10.200.16.10:57636.service - OpenSSH per-connection server daemon (10.200.16.10:57636). 
May 14 18:15:52.088979 sshd[6367]: Accepted publickey for core from 10.200.16.10 port 57636 ssh2: RSA SHA256:Hd5PQzbog4WemQvfyzJrpsOqYyUj2ZNV4jY0cJYqkm4 May 14 18:15:52.089879 sshd-session[6367]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 18:15:52.093588 systemd-logind[1706]: New session 30 of user core. May 14 18:15:52.096888 systemd[1]: Started session-30.scope - Session 30 of User core. May 14 18:15:54.075853 sshd[6369]: Connection closed by 10.200.16.10 port 57636 May 14 18:15:54.076367 sshd-session[6367]: pam_unix(sshd:session): session closed for user core May 14 18:15:54.079353 systemd[1]: sshd@27-10.200.8.38:22-10.200.16.10:57636.service: Deactivated successfully. May 14 18:15:54.080926 systemd[1]: session-30.scope: Deactivated successfully. May 14 18:15:54.081140 systemd[1]: session-30.scope: Consumed 366ms CPU time, 69.8M memory peak. May 14 18:15:54.081677 systemd-logind[1706]: Session 30 logged out. Waiting for processes to exit. May 14 18:15:54.082883 systemd-logind[1706]: Removed session 30. May 14 18:15:54.186004 systemd[1]: Started sshd@28-10.200.8.38:22-10.200.16.10:57640.service - OpenSSH per-connection server daemon (10.200.16.10:57640). May 14 18:15:54.822241 sshd[6403]: Accepted publickey for core from 10.200.16.10 port 57640 ssh2: RSA SHA256:Hd5PQzbog4WemQvfyzJrpsOqYyUj2ZNV4jY0cJYqkm4 May 14 18:15:54.823645 sshd-session[6403]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 18:15:54.828316 systemd-logind[1706]: New session 31 of user core. May 14 18:15:54.833065 systemd[1]: Started session-31.scope - Session 31 of User core. May 14 18:15:55.387233 sshd[6405]: Connection closed by 10.200.16.10 port 57640 May 14 18:15:55.387617 sshd-session[6403]: pam_unix(sshd:session): session closed for user core May 14 18:15:55.390362 systemd[1]: sshd@28-10.200.8.38:22-10.200.16.10:57640.service: Deactivated successfully. 
May 14 18:15:55.392048 systemd[1]: session-31.scope: Deactivated successfully. May 14 18:15:55.392661 systemd-logind[1706]: Session 31 logged out. Waiting for processes to exit. May 14 18:15:55.393686 systemd-logind[1706]: Removed session 31. May 14 18:15:55.499188 systemd[1]: Started sshd@29-10.200.8.38:22-10.200.16.10:57652.service - OpenSSH per-connection server daemon (10.200.16.10:57652). May 14 18:15:55.754213 update_engine[1707]: I20250514 18:15:55.754130 1707 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 14 18:15:55.754437 update_engine[1707]: I20250514 18:15:55.754330 1707 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 14 18:15:55.754648 update_engine[1707]: I20250514 18:15:55.754568 1707 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. May 14 18:15:55.794177 update_engine[1707]: E20250514 18:15:55.794147 1707 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 14 18:15:55.794256 update_engine[1707]: I20250514 18:15:55.794202 1707 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 May 14 18:15:56.132994 sshd[6415]: Accepted publickey for core from 10.200.16.10 port 57652 ssh2: RSA SHA256:Hd5PQzbog4WemQvfyzJrpsOqYyUj2ZNV4jY0cJYqkm4 May 14 18:15:56.135421 sshd-session[6415]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 18:15:56.139273 systemd-logind[1706]: New session 32 of user core. May 14 18:15:56.143852 systemd[1]: Started session-32.scope - Session 32 of User core. May 14 18:15:56.620497 sshd[6417]: Connection closed by 10.200.16.10 port 57652 May 14 18:15:56.620877 sshd-session[6415]: pam_unix(sshd:session): session closed for user core May 14 18:15:56.623302 systemd[1]: sshd@29-10.200.8.38:22-10.200.16.10:57652.service: Deactivated successfully. May 14 18:15:56.624765 systemd[1]: session-32.scope: Deactivated successfully. May 14 18:15:56.625391 systemd-logind[1706]: Session 32 logged out. 
Waiting for processes to exit. May 14 18:15:56.626432 systemd-logind[1706]: Removed session 32. May 14 18:16:01.739913 systemd[1]: Started sshd@30-10.200.8.38:22-10.200.16.10:52820.service - OpenSSH per-connection server daemon (10.200.16.10:52820). May 14 18:16:02.373493 sshd[6430]: Accepted publickey for core from 10.200.16.10 port 52820 ssh2: RSA SHA256:Hd5PQzbog4WemQvfyzJrpsOqYyUj2ZNV4jY0cJYqkm4 May 14 18:16:02.374662 sshd-session[6430]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 18:16:02.378645 systemd-logind[1706]: New session 33 of user core. May 14 18:16:02.381933 systemd[1]: Started session-33.scope - Session 33 of User core. May 14 18:16:02.866784 sshd[6432]: Connection closed by 10.200.16.10 port 52820 May 14 18:16:02.865893 sshd-session[6430]: pam_unix(sshd:session): session closed for user core May 14 18:16:02.868954 systemd-logind[1706]: Session 33 logged out. Waiting for processes to exit. May 14 18:16:02.871015 systemd[1]: sshd@30-10.200.8.38:22-10.200.16.10:52820.service: Deactivated successfully. May 14 18:16:02.875023 systemd[1]: session-33.scope: Deactivated successfully. May 14 18:16:02.878947 systemd-logind[1706]: Removed session 33. May 14 18:16:05.113040 containerd[1734]: time="2025-05-14T18:16:05.112984560Z" level=info msg="TaskExit event in podsandbox handler container_id:\"42ee62a33139b6c4fb3dbe070aa1de09d911cd4bb817783c31a97b968b7f553c\" id:\"8270a08e607efc59368d37397145f6a7535d84a25c4d6c8ae9d87b8871aee2f0\" pid:6455 exited_at:{seconds:1747246565 nanos:112723498}" May 14 18:16:05.754802 update_engine[1707]: I20250514 18:16:05.754716 1707 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 14 18:16:05.755143 update_engine[1707]: I20250514 18:16:05.754992 1707 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 14 18:16:05.755296 update_engine[1707]: I20250514 18:16:05.755270 1707 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
May 14 18:16:05.787724 update_engine[1707]: E20250514 18:16:05.787683  1707 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
May 14 18:16:05.787859 update_engine[1707]: I20250514 18:16:05.787761  1707 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
May 14 18:16:05.787859 update_engine[1707]: I20250514 18:16:05.787770  1707 omaha_request_action.cc:617] Omaha request response:
May 14 18:16:05.787859 update_engine[1707]: E20250514 18:16:05.787848  1707 omaha_request_action.cc:636] Omaha request network transfer failed.
May 14 18:16:05.787940 update_engine[1707]: I20250514 18:16:05.787865  1707 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing.
May 14 18:16:05.787940 update_engine[1707]: I20250514 18:16:05.787870  1707 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
May 14 18:16:05.787940 update_engine[1707]: I20250514 18:16:05.787875  1707 update_attempter.cc:306] Processing Done.
May 14 18:16:05.787940 update_engine[1707]: E20250514 18:16:05.787892  1707 update_attempter.cc:619] Update failed.
May 14 18:16:05.787940 update_engine[1707]: I20250514 18:16:05.787897  1707 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse
May 14 18:16:05.787940 update_engine[1707]: I20250514 18:16:05.787906  1707 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse)
May 14 18:16:05.787940 update_engine[1707]: I20250514 18:16:05.787912  1707 payload_state.cc:103] Ignoring failures until we get a valid Omaha response.
May 14 18:16:05.788330 update_engine[1707]: I20250514 18:16:05.788120  1707 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
May 14 18:16:05.788330 update_engine[1707]: I20250514 18:16:05.788158  1707 omaha_request_action.cc:271] Posting an Omaha request to disabled
May 14 18:16:05.788330 update_engine[1707]: I20250514 18:16:05.788187  1707 omaha_request_action.cc:272] Request:
May 14 18:16:05.788330 update_engine[1707]:
May 14 18:16:05.788330 update_engine[1707]:
May 14 18:16:05.788330 update_engine[1707]:
May 14 18:16:05.788330 update_engine[1707]:
May 14 18:16:05.788330 update_engine[1707]:
May 14 18:16:05.788330 update_engine[1707]:
May 14 18:16:05.788330 update_engine[1707]: I20250514 18:16:05.788194  1707 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
May 14 18:16:05.788593 locksmithd[1801]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0
May 14 18:16:05.788866 update_engine[1707]: I20250514 18:16:05.788359  1707 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
May 14 18:16:05.788866 update_engine[1707]: I20250514 18:16:05.788578  1707 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
May 14 18:16:05.803204 update_engine[1707]: E20250514 18:16:05.803177  1707 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
May 14 18:16:05.803258 update_engine[1707]: I20250514 18:16:05.803220  1707 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
May 14 18:16:05.803258 update_engine[1707]: I20250514 18:16:05.803225  1707 omaha_request_action.cc:617] Omaha request response:
May 14 18:16:05.803258 update_engine[1707]: I20250514 18:16:05.803231  1707 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
May 14 18:16:05.803258 update_engine[1707]: I20250514 18:16:05.803234  1707 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
May 14 18:16:05.803258 update_engine[1707]: I20250514 18:16:05.803238  1707 update_attempter.cc:306] Processing Done.
May 14 18:16:05.803258 update_engine[1707]: I20250514 18:16:05.803242  1707 update_attempter.cc:310] Error event sent.
May 14 18:16:05.803258 update_engine[1707]: I20250514 18:16:05.803248  1707 update_check_scheduler.cc:74] Next update check in 44m51s
May 14 18:16:05.803542 locksmithd[1801]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0
May 14 18:16:07.984456 systemd[1]: Started sshd@31-10.200.8.38:22-10.200.16.10:52830.service - OpenSSH per-connection server daemon (10.200.16.10:52830).
May 14 18:16:08.614797 sshd[6469]: Accepted publickey for core from 10.200.16.10 port 52830 ssh2: RSA SHA256:Hd5PQzbog4WemQvfyzJrpsOqYyUj2ZNV4jY0cJYqkm4
May 14 18:16:08.615820 sshd-session[6469]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 18:16:08.619074 systemd-logind[1706]: New session 34 of user core.
May 14 18:16:08.625876 systemd[1]: Started session-34.scope - Session 34 of User core.
May 14 18:16:09.111748 sshd[6471]: Connection closed by 10.200.16.10 port 52830
May 14 18:16:09.112123 sshd-session[6469]: pam_unix(sshd:session): session closed for user core
May 14 18:16:09.114906 systemd[1]: sshd@31-10.200.8.38:22-10.200.16.10:52830.service: Deactivated successfully.
May 14 18:16:09.116461 systemd[1]: session-34.scope: Deactivated successfully.
May 14 18:16:09.117127 systemd-logind[1706]: Session 34 logged out. Waiting for processes to exit.
May 14 18:16:09.118319 systemd-logind[1706]: Removed session 34.
May 14 18:16:14.224316 systemd[1]: Started sshd@32-10.200.8.38:22-10.200.16.10:36356.service - OpenSSH per-connection server daemon (10.200.16.10:36356).
May 14 18:16:14.861312 sshd[6485]: Accepted publickey for core from 10.200.16.10 port 36356 ssh2: RSA SHA256:Hd5PQzbog4WemQvfyzJrpsOqYyUj2ZNV4jY0cJYqkm4
May 14 18:16:14.862626 sshd-session[6485]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 18:16:14.866106 systemd-logind[1706]: New session 35 of user core.
May 14 18:16:14.871870 systemd[1]: Started session-35.scope - Session 35 of User core.
May 14 18:16:15.457663 sshd[6487]: Connection closed by 10.200.16.10 port 36356
May 14 18:16:15.458077 sshd-session[6485]: pam_unix(sshd:session): session closed for user core
May 14 18:16:15.460724 systemd[1]: sshd@32-10.200.8.38:22-10.200.16.10:36356.service: Deactivated successfully.
May 14 18:16:15.462356 systemd[1]: session-35.scope: Deactivated successfully.
May 14 18:16:15.463084 systemd-logind[1706]: Session 35 logged out. Waiting for processes to exit.
May 14 18:16:15.464140 systemd-logind[1706]: Removed session 35.
May 14 18:16:18.475704 containerd[1734]: time="2025-05-14T18:16:18.475649035Z" level=info msg="TaskExit event in podsandbox handler container_id:\"210d2686a9522baf4ff501a148e46a4f46c581521a0c5d7c8b0876b9f209cbb6\" id:\"2c083ba1fbc74481a16303be8449505ccb16bea613f4ae633eb261fca62b50b9\" pid:6527 exited_at:{seconds:1747246578 nanos:475441038}"
May 14 18:16:18.476249 containerd[1734]: time="2025-05-14T18:16:18.476231372Z" level=info msg="TaskExit event in podsandbox handler container_id:\"210d2686a9522baf4ff501a148e46a4f46c581521a0c5d7c8b0876b9f209cbb6\" id:\"279ffb55cb04168e9304d7ce1481d1e2e75fce2760cf295c6af2cce43887beb6\" pid:6528 exited_at:{seconds:1747246578 nanos:475445824}"
May 14 18:16:20.577957 systemd[1]: Started sshd@33-10.200.8.38:22-10.200.16.10:48420.service - OpenSSH per-connection server daemon (10.200.16.10:48420).
May 14 18:16:21.214428 sshd[6547]: Accepted publickey for core from 10.200.16.10 port 48420 ssh2: RSA SHA256:Hd5PQzbog4WemQvfyzJrpsOqYyUj2ZNV4jY0cJYqkm4
May 14 18:16:21.215366 sshd-session[6547]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 18:16:21.219263 systemd-logind[1706]: New session 36 of user core.
May 14 18:16:21.221880 systemd[1]: Started session-36.scope - Session 36 of User core.
May 14 18:16:21.760806 sshd[6549]: Connection closed by 10.200.16.10 port 48420
May 14 18:16:21.761235 sshd-session[6547]: pam_unix(sshd:session): session closed for user core
May 14 18:16:21.763514 systemd[1]: sshd@33-10.200.8.38:22-10.200.16.10:48420.service: Deactivated successfully.
May 14 18:16:21.765104 systemd[1]: session-36.scope: Deactivated successfully.
May 14 18:16:21.766654 systemd-logind[1706]: Session 36 logged out. Waiting for processes to exit.
May 14 18:16:21.767551 systemd-logind[1706]: Removed session 36.
May 14 18:16:26.873896 systemd[1]: Started sshd@34-10.200.8.38:22-10.200.16.10:48436.service - OpenSSH per-connection server daemon (10.200.16.10:48436).
May 14 18:16:27.507806 sshd[6561]: Accepted publickey for core from 10.200.16.10 port 48436 ssh2: RSA SHA256:Hd5PQzbog4WemQvfyzJrpsOqYyUj2ZNV4jY0cJYqkm4
May 14 18:16:27.509027 sshd-session[6561]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 18:16:27.513371 systemd-logind[1706]: New session 37 of user core.
May 14 18:16:27.517872 systemd[1]: Started session-37.scope - Session 37 of User core.
May 14 18:16:28.080160 sshd[6563]: Connection closed by 10.200.16.10 port 48436
May 14 18:16:28.080575 sshd-session[6561]: pam_unix(sshd:session): session closed for user core
May 14 18:16:28.083250 systemd[1]: sshd@34-10.200.8.38:22-10.200.16.10:48436.service: Deactivated successfully.
May 14 18:16:28.084859 systemd[1]: session-37.scope: Deactivated successfully.
May 14 18:16:28.085541 systemd-logind[1706]: Session 37 logged out. Waiting for processes to exit.
May 14 18:16:28.086583 systemd-logind[1706]: Removed session 37.
May 14 18:16:33.192736 systemd[1]: Started sshd@35-10.200.8.38:22-10.200.16.10:52482.service - OpenSSH per-connection server daemon (10.200.16.10:52482).
May 14 18:16:33.824942 sshd[6575]: Accepted publickey for core from 10.200.16.10 port 52482 ssh2: RSA SHA256:Hd5PQzbog4WemQvfyzJrpsOqYyUj2ZNV4jY0cJYqkm4
May 14 18:16:33.826231 sshd-session[6575]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 18:16:33.830334 systemd-logind[1706]: New session 38 of user core.
May 14 18:16:33.831913 systemd[1]: Started session-38.scope - Session 38 of User core.
May 14 18:16:34.313099 sshd[6577]: Connection closed by 10.200.16.10 port 52482
May 14 18:16:34.313523 sshd-session[6575]: pam_unix(sshd:session): session closed for user core
May 14 18:16:34.315690 systemd[1]: sshd@35-10.200.8.38:22-10.200.16.10:52482.service: Deactivated successfully.
May 14 18:16:34.317455 systemd[1]: session-38.scope: Deactivated successfully.
May 14 18:16:34.318583 systemd-logind[1706]: Session 38 logged out. Waiting for processes to exit.
May 14 18:16:34.319644 systemd-logind[1706]: Removed session 38.
May 14 18:16:35.111765 containerd[1734]: time="2025-05-14T18:16:35.111712086Z" level=info msg="TaskExit event in podsandbox handler container_id:\"42ee62a33139b6c4fb3dbe070aa1de09d911cd4bb817783c31a97b968b7f553c\" id:\"7453978e05e026a8f8b8589386b0d77871782100692f828ceda3a50781db1bba\" pid:6600 exited_at:{seconds:1747246595 nanos:111492018}"
May 14 18:16:39.426802 systemd[1]: Started sshd@36-10.200.8.38:22-10.200.16.10:55838.service - OpenSSH per-connection server daemon (10.200.16.10:55838).
May 14 18:16:40.059818 sshd[6615]: Accepted publickey for core from 10.200.16.10 port 55838 ssh2: RSA SHA256:Hd5PQzbog4WemQvfyzJrpsOqYyUj2ZNV4jY0cJYqkm4
May 14 18:16:40.060871 sshd-session[6615]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 18:16:40.064846 systemd-logind[1706]: New session 39 of user core.
May 14 18:16:40.071903 systemd[1]: Started session-39.scope - Session 39 of User core.
May 14 18:16:40.625263 sshd[6617]: Connection closed by 10.200.16.10 port 55838
May 14 18:16:40.625676 sshd-session[6615]: pam_unix(sshd:session): session closed for user core
May 14 18:16:40.627881 systemd[1]: sshd@36-10.200.8.38:22-10.200.16.10:55838.service: Deactivated successfully.
May 14 18:16:40.629495 systemd[1]: session-39.scope: Deactivated successfully.
May 14 18:16:40.630690 systemd-logind[1706]: Session 39 logged out. Waiting for processes to exit.
May 14 18:16:40.631918 systemd-logind[1706]: Removed session 39.
May 14 18:16:45.739618 systemd[1]: Started sshd@37-10.200.8.38:22-10.200.16.10:55848.service - OpenSSH per-connection server daemon (10.200.16.10:55848).
May 14 18:16:46.371858 sshd[6629]: Accepted publickey for core from 10.200.16.10 port 55848 ssh2: RSA SHA256:Hd5PQzbog4WemQvfyzJrpsOqYyUj2ZNV4jY0cJYqkm4
May 14 18:16:46.372905 sshd-session[6629]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 18:16:46.376251 systemd-logind[1706]: New session 40 of user core.
May 14 18:16:46.383893 systemd[1]: Started session-40.scope - Session 40 of User core.
May 14 18:16:46.856888 sshd[6631]: Connection closed by 10.200.16.10 port 55848
May 14 18:16:46.857517 sshd-session[6629]: pam_unix(sshd:session): session closed for user core
May 14 18:16:46.860272 systemd[1]: sshd@37-10.200.8.38:22-10.200.16.10:55848.service: Deactivated successfully.
May 14 18:16:46.861933 systemd[1]: session-40.scope: Deactivated successfully.
May 14 18:16:46.862650 systemd-logind[1706]: Session 40 logged out. Waiting for processes to exit.
May 14 18:16:46.863832 systemd-logind[1706]: Removed session 40.
May 14 18:16:48.465477 containerd[1734]: time="2025-05-14T18:16:48.465435390Z" level=info msg="TaskExit event in podsandbox handler container_id:\"210d2686a9522baf4ff501a148e46a4f46c581521a0c5d7c8b0876b9f209cbb6\" id:\"48eec43d462d7ef456458bb668de9aa617cd33146a4b26c69ecd7272d3b9422c\" pid:6656 exited_at:{seconds:1747246608 nanos:465092605}"
May 14 18:16:51.971134 systemd[1]: Started sshd@38-10.200.8.38:22-10.200.16.10:40370.service - OpenSSH per-connection server daemon (10.200.16.10:40370).
May 14 18:16:52.607480 sshd[6666]: Accepted publickey for core from 10.200.16.10 port 40370 ssh2: RSA SHA256:Hd5PQzbog4WemQvfyzJrpsOqYyUj2ZNV4jY0cJYqkm4
May 14 18:16:52.608510 sshd-session[6666]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 18:16:52.612552 systemd-logind[1706]: New session 41 of user core.
May 14 18:16:52.616863 systemd[1]: Started session-41.scope - Session 41 of User core.
May 14 18:16:53.104293 sshd[6668]: Connection closed by 10.200.16.10 port 40370
May 14 18:16:53.104720 sshd-session[6666]: pam_unix(sshd:session): session closed for user core
May 14 18:16:53.107641 systemd[1]: sshd@38-10.200.8.38:22-10.200.16.10:40370.service: Deactivated successfully.
May 14 18:16:53.109342 systemd[1]: session-41.scope: Deactivated successfully.
May 14 18:16:53.110168 systemd-logind[1706]: Session 41 logged out. Waiting for processes to exit.
May 14 18:16:53.111218 systemd-logind[1706]: Removed session 41.
May 14 18:16:58.217424 systemd[1]: Started sshd@39-10.200.8.38:22-10.200.16.10:40384.service - OpenSSH per-connection server daemon (10.200.16.10:40384).
May 14 18:16:58.851352 sshd[6686]: Accepted publickey for core from 10.200.16.10 port 40384 ssh2: RSA SHA256:Hd5PQzbog4WemQvfyzJrpsOqYyUj2ZNV4jY0cJYqkm4
May 14 18:16:58.852657 sshd-session[6686]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 18:16:58.856159 systemd-logind[1706]: New session 42 of user core.
May 14 18:16:58.861889 systemd[1]: Started session-42.scope - Session 42 of User core.
May 14 18:16:59.343417 sshd[6688]: Connection closed by 10.200.16.10 port 40384
May 14 18:16:59.343891 sshd-session[6686]: pam_unix(sshd:session): session closed for user core
May 14 18:16:59.346674 systemd[1]: sshd@39-10.200.8.38:22-10.200.16.10:40384.service: Deactivated successfully.
May 14 18:16:59.348307 systemd[1]: session-42.scope: Deactivated successfully.
May 14 18:16:59.348898 systemd-logind[1706]: Session 42 logged out. Waiting for processes to exit.
May 14 18:16:59.350032 systemd-logind[1706]: Removed session 42.
May 14 18:17:02.332606 containerd[1734]: time="2025-05-14T18:17:02.332514590Z" level=warning msg="container event discarded" container=4527a1b7c9497a96c387c0985d09714e87764df58df437e833b94bd3a3a1e39a type=CONTAINER_CREATED_EVENT
May 14 18:17:02.343832 containerd[1734]: time="2025-05-14T18:17:02.343781581Z" level=warning msg="container event discarded" container=4527a1b7c9497a96c387c0985d09714e87764df58df437e833b94bd3a3a1e39a type=CONTAINER_STARTED_EVENT
May 14 18:17:02.389062 containerd[1734]: time="2025-05-14T18:17:02.389011169Z" level=warning msg="container event discarded" container=cff0cd43440b754cdca679f976eef48b587788c6ad51cffd0d4d5d554e831881 type=CONTAINER_CREATED_EVENT
May 14 18:17:02.389062 containerd[1734]: time="2025-05-14T18:17:02.389049439Z" level=warning msg="container event discarded" container=cff0cd43440b754cdca679f976eef48b587788c6ad51cffd0d4d5d554e831881 type=CONTAINER_STARTED_EVENT
May 14 18:17:02.483332 containerd[1734]: time="2025-05-14T18:17:02.483284741Z" level=warning msg="container event discarded" container=f89c3c770b125f9bcca1fc6189744eb615d2a83fea883cddece5cfc42d23be3a type=CONTAINER_CREATED_EVENT
May 14 18:17:02.483332 containerd[1734]: time="2025-05-14T18:17:02.483319858Z" level=warning msg="container event discarded" container=f89c3c770b125f9bcca1fc6189744eb615d2a83fea883cddece5cfc42d23be3a type=CONTAINER_STARTED_EVENT
May 14 18:17:03.192757 containerd[1734]: time="2025-05-14T18:17:03.192711463Z" level=warning msg="container event discarded" container=09825faad3be53d09ac7216a8a8ef89725074b9cb494d86ba704cd05ce185d08 type=CONTAINER_CREATED_EVENT
May 14 18:17:03.287946 containerd[1734]: time="2025-05-14T18:17:03.287895213Z" level=warning msg="container event discarded" container=c919679a1966565545bb16325f85dc0ee4d62a3444cec0718cd3fbeb2903040a type=CONTAINER_CREATED_EVENT
May 14 18:17:03.287946 containerd[1734]: time="2025-05-14T18:17:03.287939670Z" level=warning msg="container event discarded" container=09825faad3be53d09ac7216a8a8ef89725074b9cb494d86ba704cd05ce185d08 type=CONTAINER_STARTED_EVENT
May 14 18:17:03.287946 containerd[1734]: time="2025-05-14T18:17:03.287952328Z" level=warning msg="container event discarded" container=b59ebeadcb84d47de8c85bebe2c57e260d516c809b4df00270fc49f6ff70cdc8 type=CONTAINER_CREATED_EVENT
May 14 18:17:03.393223 containerd[1734]: time="2025-05-14T18:17:03.393159230Z" level=warning msg="container event discarded" container=b59ebeadcb84d47de8c85bebe2c57e260d516c809b4df00270fc49f6ff70cdc8 type=CONTAINER_STARTED_EVENT
May 14 18:17:03.393223 containerd[1734]: time="2025-05-14T18:17:03.393209444Z" level=warning msg="container event discarded" container=c919679a1966565545bb16325f85dc0ee4d62a3444cec0718cd3fbeb2903040a type=CONTAINER_STARTED_EVENT
May 14 18:17:04.459637 systemd[1]: Started sshd@40-10.200.8.38:22-10.200.16.10:39210.service - OpenSSH per-connection server daemon (10.200.16.10:39210).
May 14 18:17:05.091555 sshd[6702]: Accepted publickey for core from 10.200.16.10 port 39210 ssh2: RSA SHA256:Hd5PQzbog4WemQvfyzJrpsOqYyUj2ZNV4jY0cJYqkm4
May 14 18:17:05.092732 sshd-session[6702]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 18:17:05.097946 systemd-logind[1706]: New session 43 of user core.
May 14 18:17:05.105901 systemd[1]: Started session-43.scope - Session 43 of User core.
May 14 18:17:05.118829 containerd[1734]: time="2025-05-14T18:17:05.118793787Z" level=info msg="TaskExit event in podsandbox handler container_id:\"42ee62a33139b6c4fb3dbe070aa1de09d911cd4bb817783c31a97b968b7f553c\" id:\"00d0cf2014774cbb9b7dbd7aef6c9c09fca68bfd3eac8d394d0c9979bb1c7bc3\" pid:6716 exited_at:{seconds:1747246625 nanos:118521823}"
May 14 18:17:05.578091 sshd[6727]: Connection closed by 10.200.16.10 port 39210
May 14 18:17:05.578478 sshd-session[6702]: pam_unix(sshd:session): session closed for user core
May 14 18:17:05.581076 systemd[1]: sshd@40-10.200.8.38:22-10.200.16.10:39210.service: Deactivated successfully.
May 14 18:17:05.582725 systemd[1]: session-43.scope: Deactivated successfully.
May 14 18:17:05.583391 systemd-logind[1706]: Session 43 logged out. Waiting for processes to exit.
May 14 18:17:05.584553 systemd-logind[1706]: Removed session 43.