May 15 12:23:38.952740 kernel: Linux version 6.12.20-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Thu May 15 10:42:41 -00 2025
May 15 12:23:38.952765 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlyprintk=ttyS0,115200 flatcar.first_boot=detected flatcar.oem.id=azure flatcar.autologin verity.usrhash=48287e633374b880fa618bd42bee102ae77c50831859c6cedd6ca9e1aec3dd5c
May 15 12:23:38.952776 kernel: BIOS-provided physical RAM map:
May 15 12:23:38.952783 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable
May 15 12:23:38.952790 kernel: BIOS-e820: [mem 0x00000000000c0000-0x00000000000fffff] reserved
May 15 12:23:38.952798 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000003ff40fff] usable
May 15 12:23:38.952808 kernel: BIOS-e820: [mem 0x000000003ff41000-0x000000003ffc4fff] reserved
May 15 12:23:38.952816 kernel: BIOS-e820: [mem 0x000000003ffc5000-0x000000003ffd1fff] usable
May 15 12:23:38.952823 kernel: BIOS-e820: [mem 0x000000003ffd2000-0x000000003fffafff] ACPI data
May 15 12:23:38.952830 kernel: BIOS-e820: [mem 0x000000003fffb000-0x000000003fffefff] ACPI NVS
May 15 12:23:38.952838 kernel: BIOS-e820: [mem 0x000000003ffff000-0x000000003fffffff] usable
May 15 12:23:38.952845 kernel: BIOS-e820: [mem 0x0000000100000000-0x00000002bfffffff] usable
May 15 12:23:38.952853 kernel: printk: legacy bootconsole [earlyser0] enabled
May 15 12:23:38.952860 kernel: NX (Execute Disable) protection: active
May 15 12:23:38.952871 kernel: APIC: Static calls initialized
May 15 12:23:38.952879 kernel: efi: EFI v2.7 by Microsoft
May 15 12:23:38.952887 kernel: efi: ACPI=0x3fffa000 ACPI 2.0=0x3fffa014 SMBIOS=0x3ff85000 SMBIOS 3.0=0x3ff83000 MEMATTR=0x3ebb9a98 RNG=0x3ffd2018
May 15 12:23:38.952895 kernel: random: crng init done
May 15 12:23:38.952903 kernel: secureboot: Secure boot disabled
May 15 12:23:38.952911 kernel: SMBIOS 3.1.0 present.
May 15 12:23:38.952919 kernel: DMI: Microsoft Corporation Virtual Machine/Virtual Machine, BIOS Hyper-V UEFI Release v4.1 11/21/2024
May 15 12:23:38.952927 kernel: DMI: Memory slots populated: 2/2
May 15 12:23:38.952936 kernel: Hypervisor detected: Microsoft Hyper-V
May 15 12:23:38.952944 kernel: Hyper-V: privilege flags low 0xae7f, high 0x3b8030, hints 0x9e4e24, misc 0xe0bed7b2
May 15 12:23:38.952952 kernel: Hyper-V: Nested features: 0x3e0101
May 15 12:23:38.952960 kernel: Hyper-V: LAPIC Timer Frequency: 0x30d40
May 15 12:23:38.952968 kernel: Hyper-V: Using hypercall for remote TLB flush
May 15 12:23:38.952975 kernel: clocksource: hyperv_clocksource_tsc_page: mask: 0xffffffffffffffff max_cycles: 0x24e6a1710, max_idle_ns: 440795202120 ns
May 15 12:23:38.952983 kernel: clocksource: hyperv_clocksource_msr: mask: 0xffffffffffffffff max_cycles: 0x24e6a1710, max_idle_ns: 440795202120 ns
May 15 12:23:38.952990 kernel: tsc: Detected 2299.999 MHz processor
May 15 12:23:38.952997 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
May 15 12:23:38.953005 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
May 15 12:23:38.953012 kernel: last_pfn = 0x2c0000 max_arch_pfn = 0x10000000000
May 15 12:23:38.953021 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs
May 15 12:23:38.953028 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
May 15 12:23:38.953035 kernel: e820: update [mem 0x48000000-0xffffffff] usable ==> reserved
May 15 12:23:38.953042 kernel: last_pfn = 0x40000 max_arch_pfn = 0x10000000000
May 15 12:23:38.953049 kernel: Using GB pages for direct mapping
May 15 12:23:38.953056 kernel: ACPI: Early table checksum verification disabled
May 15 12:23:38.953063 kernel: ACPI: RSDP 0x000000003FFFA014 000024 (v02 VRTUAL)
May 15 12:23:38.953075 kernel: ACPI: XSDT 0x000000003FFF90E8 00005C (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
May 15 12:23:38.953084 kernel: ACPI: FACP 0x000000003FFF8000 000114 (v06 VRTUAL MICROSFT 00000001 MSFT 00000001)
May 15 12:23:38.953091 kernel: ACPI: DSDT 0x000000003FFD6000 01E11C (v02 MSFTVM DSDT01 00000001 INTL 20230628)
May 15 12:23:38.953099 kernel: ACPI: FACS 0x000000003FFFE000 000040
May 15 12:23:38.953106 kernel: ACPI: OEM0 0x000000003FFF7000 000064 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
May 15 12:23:38.953113 kernel: ACPI: SPCR 0x000000003FFF6000 000050 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
May 15 12:23:38.953123 kernel: ACPI: WAET 0x000000003FFF5000 000028 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
May 15 12:23:38.953130 kernel: ACPI: APIC 0x000000003FFD5000 000052 (v05 HVLITE HVLITETB 00000000 MSHV 00000000)
May 15 12:23:38.953137 kernel: ACPI: SRAT 0x000000003FFD4000 0000A0 (v03 HVLITE HVLITETB 00000000 MSHV 00000000)
May 15 12:23:38.953145 kernel: ACPI: BGRT 0x000000003FFD3000 000038 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
May 15 12:23:38.953152 kernel: ACPI: Reserving FACP table memory at [mem 0x3fff8000-0x3fff8113]
May 15 12:23:38.953159 kernel: ACPI: Reserving DSDT table memory at [mem 0x3ffd6000-0x3fff411b]
May 15 12:23:38.953167 kernel: ACPI: Reserving FACS table memory at [mem 0x3fffe000-0x3fffe03f]
May 15 12:23:38.953174 kernel: ACPI: Reserving OEM0 table memory at [mem 0x3fff7000-0x3fff7063]
May 15 12:23:38.953181 kernel: ACPI: Reserving SPCR table memory at [mem 0x3fff6000-0x3fff604f]
May 15 12:23:38.953190 kernel: ACPI: Reserving WAET table memory at [mem 0x3fff5000-0x3fff5027]
May 15 12:23:38.953197 kernel: ACPI: Reserving APIC table memory at [mem 0x3ffd5000-0x3ffd5051]
May 15 12:23:38.953205 kernel: ACPI: Reserving SRAT table memory at [mem 0x3ffd4000-0x3ffd409f]
May 15 12:23:38.953213 kernel: ACPI: Reserving BGRT table memory at [mem 0x3ffd3000-0x3ffd3037]
May 15 12:23:38.953219 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x3fffffff]
May 15 12:23:38.953226 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x2bfffffff]
May 15 12:23:38.953234 kernel: NUMA: Node 0 [mem 0x00001000-0x3fffffff] + [mem 0x100000000-0x2bfffffff] -> [mem 0x00001000-0x2bfffffff]
May 15 12:23:38.953241 kernel: NODE_DATA(0) allocated [mem 0x2bfff7dc0-0x2bfffefff]
May 15 12:23:38.953248 kernel: Zone ranges:
May 15 12:23:38.953257 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
May 15 12:23:38.953264 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff]
May 15 12:23:38.953271 kernel: Normal [mem 0x0000000100000000-0x00000002bfffffff]
May 15 12:23:38.953278 kernel: Device empty
May 15 12:23:38.954649 kernel: Movable zone start for each node
May 15 12:23:38.954657 kernel: Early memory node ranges
May 15 12:23:38.954664 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff]
May 15 12:23:38.954671 kernel: node 0: [mem 0x0000000000100000-0x000000003ff40fff]
May 15 12:23:38.954678 kernel: node 0: [mem 0x000000003ffc5000-0x000000003ffd1fff]
May 15 12:23:38.954689 kernel: node 0: [mem 0x000000003ffff000-0x000000003fffffff]
May 15 12:23:38.954696 kernel: node 0: [mem 0x0000000100000000-0x00000002bfffffff]
May 15 12:23:38.954703 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x00000002bfffffff]
May 15 12:23:38.954710 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
May 15 12:23:38.954717 kernel: On node 0, zone DMA: 96 pages in unavailable ranges
May 15 12:23:38.954724 kernel: On node 0, zone DMA32: 132 pages in unavailable ranges
May 15 12:23:38.954731 kernel: On node 0, zone DMA32: 45 pages in unavailable ranges
May 15 12:23:38.954738 kernel: ACPI: PM-Timer IO Port: 0x408
May 15 12:23:38.954745 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
May 15 12:23:38.954754 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
May 15 12:23:38.954761 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
May 15 12:23:38.954768 kernel: ACPI: SPCR: console: uart,io,0x3f8,115200
May 15 12:23:38.954776 kernel: TSC deadline timer available
May 15 12:23:38.954783 kernel: CPU topo: Max. logical packages: 1
May 15 12:23:38.954790 kernel: CPU topo: Max. logical dies: 1
May 15 12:23:38.954797 kernel: CPU topo: Max. dies per package: 1
May 15 12:23:38.954803 kernel: CPU topo: Max. threads per core: 2
May 15 12:23:38.954810 kernel: CPU topo: Num. cores per package: 1
May 15 12:23:38.954819 kernel: CPU topo: Num. threads per package: 2
May 15 12:23:38.954826 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs
May 15 12:23:38.954833 kernel: [mem 0x40000000-0xffffffff] available for PCI devices
May 15 12:23:38.954840 kernel: Booting paravirtualized kernel on Hyper-V
May 15 12:23:38.954848 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
May 15 12:23:38.954855 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
May 15 12:23:38.954862 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576
May 15 12:23:38.954869 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152
May 15 12:23:38.954876 kernel: pcpu-alloc: [0] 0 1
May 15 12:23:38.954885 kernel: Hyper-V: PV spinlocks enabled
May 15 12:23:38.954892 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
May 15 12:23:38.954901 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlyprintk=ttyS0,115200 flatcar.first_boot=detected flatcar.oem.id=azure flatcar.autologin verity.usrhash=48287e633374b880fa618bd42bee102ae77c50831859c6cedd6ca9e1aec3dd5c
May 15 12:23:38.954909 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
May 15 12:23:38.954916 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
May 15 12:23:38.954923 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
May 15 12:23:38.954930 kernel: Fallback order for Node 0: 0
May 15 12:23:38.954937 kernel: Built 1 zonelists, mobility grouping on. Total pages: 2096878
May 15 12:23:38.954946 kernel: Policy zone: Normal
May 15 12:23:38.954953 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
May 15 12:23:38.954960 kernel: software IO TLB: area num 2.
May 15 12:23:38.954967 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
May 15 12:23:38.954974 kernel: ftrace: allocating 40065 entries in 157 pages
May 15 12:23:38.954981 kernel: ftrace: allocated 157 pages with 5 groups
May 15 12:23:38.954988 kernel: Dynamic Preempt: voluntary
May 15 12:23:38.954995 kernel: rcu: Preemptible hierarchical RCU implementation.
May 15 12:23:38.955003 kernel: rcu: RCU event tracing is enabled.
May 15 12:23:38.955012 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
May 15 12:23:38.955025 kernel: Trampoline variant of Tasks RCU enabled.
May 15 12:23:38.955032 kernel: Rude variant of Tasks RCU enabled.
May 15 12:23:38.955042 kernel: Tracing variant of Tasks RCU enabled.
May 15 12:23:38.955049 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
May 15 12:23:38.955057 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
May 15 12:23:38.955065 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
May 15 12:23:38.955072 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
May 15 12:23:38.955080 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
May 15 12:23:38.955088 kernel: Using NULL legacy PIC
May 15 12:23:38.955095 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 0
May 15 12:23:38.955105 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
May 15 12:23:38.955112 kernel: Console: colour dummy device 80x25
May 15 12:23:38.955120 kernel: printk: legacy console [tty1] enabled
May 15 12:23:38.955127 kernel: printk: legacy console [ttyS0] enabled
May 15 12:23:38.955135 kernel: printk: legacy bootconsole [earlyser0] disabled
May 15 12:23:38.955143 kernel: ACPI: Core revision 20240827
May 15 12:23:38.955152 kernel: Failed to register legacy timer interrupt
May 15 12:23:38.955159 kernel: APIC: Switch to symmetric I/O mode setup
May 15 12:23:38.955167 kernel: x2apic enabled
May 15 12:23:38.955174 kernel: APIC: Switched APIC routing to: physical x2apic
May 15 12:23:38.955182 kernel: Hyper-V: Host Build 10.0.26100.1221-1-0
May 15 12:23:38.955189 kernel: Hyper-V: enabling crash_kexec_post_notifiers
May 15 12:23:38.955197 kernel: Hyper-V: Disabling IBT because of Hyper-V bug
May 15 12:23:38.955205 kernel: Hyper-V: Using IPI hypercalls
May 15 12:23:38.955212 kernel: APIC: send_IPI() replaced with hv_send_ipi()
May 15 12:23:38.955221 kernel: APIC: send_IPI_mask() replaced with hv_send_ipi_mask()
May 15 12:23:38.955229 kernel: APIC: send_IPI_mask_allbutself() replaced with hv_send_ipi_mask_allbutself()
May 15 12:23:38.955237 kernel: APIC: send_IPI_allbutself() replaced with hv_send_ipi_allbutself()
May 15 12:23:38.955245 kernel: APIC: send_IPI_all() replaced with hv_send_ipi_all()
May 15 12:23:38.955252 kernel: APIC: send_IPI_self() replaced with hv_send_ipi_self()
May 15 12:23:38.955260 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x2127345424d, max_idle_ns: 440795318347 ns
May 15 12:23:38.955268 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 4599.99 BogoMIPS (lpj=2299999)
May 15 12:23:38.955276 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
May 15 12:23:38.955345 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0
May 15 12:23:38.955354 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0
May 15 12:23:38.955361 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
May 15 12:23:38.955367 kernel: Spectre V2 : Mitigation: Retpolines
May 15 12:23:38.955375 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
May 15 12:23:38.955382 kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
May 15 12:23:38.955389 kernel: RETBleed: WARNING: Spectre v2 mitigation leaves CPU vulnerable to RETBleed attacks, data leaks possible!
May 15 12:23:38.955397 kernel: RETBleed: Vulnerable
May 15 12:23:38.955404 kernel: Speculative Store Bypass: Vulnerable
May 15 12:23:38.955411 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
May 15 12:23:38.955418 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
May 15 12:23:38.955425 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
May 15 12:23:38.955434 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask'
May 15 12:23:38.955441 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256'
May 15 12:23:38.955448 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256'
May 15 12:23:38.955456 kernel: x86/fpu: Supporting XSAVE feature 0x800: 'Control-flow User registers'
May 15 12:23:38.955463 kernel: x86/fpu: Supporting XSAVE feature 0x20000: 'AMX Tile config'
May 15 12:23:38.955470 kernel: x86/fpu: Supporting XSAVE feature 0x40000: 'AMX Tile data'
May 15 12:23:38.955477 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
May 15 12:23:38.955484 kernel: x86/fpu: xstate_offset[5]: 832, xstate_sizes[5]: 64
May 15 12:23:38.955491 kernel: x86/fpu: xstate_offset[6]: 896, xstate_sizes[6]: 512
May 15 12:23:38.955498 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 1024
May 15 12:23:38.955507 kernel: x86/fpu: xstate_offset[11]: 2432, xstate_sizes[11]: 16
May 15 12:23:38.955515 kernel: x86/fpu: xstate_offset[17]: 2496, xstate_sizes[17]: 64
May 15 12:23:38.955522 kernel: x86/fpu: xstate_offset[18]: 2560, xstate_sizes[18]: 8192
May 15 12:23:38.955529 kernel: x86/fpu: Enabled xstate features 0x608e7, context size is 10752 bytes, using 'compacted' format.
May 15 12:23:38.955536 kernel: Freeing SMP alternatives memory: 32K
May 15 12:23:38.955543 kernel: pid_max: default: 32768 minimum: 301
May 15 12:23:38.955550 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
May 15 12:23:38.955558 kernel: landlock: Up and running.
May 15 12:23:38.955565 kernel: SELinux: Initializing.
May 15 12:23:38.955573 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
May 15 12:23:38.955580 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
May 15 12:23:38.955587 kernel: smpboot: CPU0: Intel INTEL(R) XEON(R) PLATINUM 8573C (family: 0x6, model: 0xcf, stepping: 0x2)
May 15 12:23:38.955597 kernel: Performance Events: unsupported p6 CPU model 207 no PMU driver, software events only.
May 15 12:23:38.955605 kernel: signal: max sigframe size: 11952
May 15 12:23:38.955612 kernel: rcu: Hierarchical SRCU implementation.
May 15 12:23:38.955620 kernel: rcu: Max phase no-delay instances is 400.
May 15 12:23:38.955628 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
May 15 12:23:38.955636 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
May 15 12:23:38.955644 kernel: smp: Bringing up secondary CPUs ...
May 15 12:23:38.955651 kernel: smpboot: x86: Booting SMP configuration:
May 15 12:23:38.955659 kernel: .... node #0, CPUs: #1
May 15 12:23:38.955668 kernel: smp: Brought up 1 node, 2 CPUs
May 15 12:23:38.955675 kernel: smpboot: Total of 2 processors activated (9199.99 BogoMIPS)
May 15 12:23:38.955684 kernel: Memory: 8081556K/8387512K available (14336K kernel code, 2438K rwdata, 9944K rodata, 54416K init, 2544K bss, 299996K reserved, 0K cma-reserved)
May 15 12:23:38.955691 kernel: devtmpfs: initialized
May 15 12:23:38.955699 kernel: x86/mm: Memory block size: 128MB
May 15 12:23:38.955707 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x3fffb000-0x3fffefff] (16384 bytes)
May 15 12:23:38.955714 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
May 15 12:23:38.955722 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
May 15 12:23:38.955730 kernel: pinctrl core: initialized pinctrl subsystem
May 15 12:23:38.955739 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
May 15 12:23:38.955746 kernel: audit: initializing netlink subsys (disabled)
May 15 12:23:38.955754 kernel: audit: type=2000 audit(1747311816.029:1): state=initialized audit_enabled=0 res=1
May 15 12:23:38.955762 kernel: thermal_sys: Registered thermal governor 'step_wise'
May 15 12:23:38.955769 kernel: thermal_sys: Registered thermal governor 'user_space'
May 15 12:23:38.955776 kernel: cpuidle: using governor menu
May 15 12:23:38.955783 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
May 15 12:23:38.955791 kernel: dca service started, version 1.12.1
May 15 12:23:38.955798 kernel: e820: reserve RAM buffer [mem 0x3ff41000-0x3fffffff]
May 15 12:23:38.955807 kernel: e820: reserve RAM buffer [mem 0x3ffd2000-0x3fffffff]
May 15 12:23:38.955815 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
May 15 12:23:38.955822 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
May 15 12:23:38.955830 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
May 15 12:23:38.955838 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
May 15 12:23:38.955845 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
May 15 12:23:38.955853 kernel: ACPI: Added _OSI(Module Device)
May 15 12:23:38.955861 kernel: ACPI: Added _OSI(Processor Device)
May 15 12:23:38.955868 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
May 15 12:23:38.955878 kernel: ACPI: Added _OSI(Processor Aggregator Device)
May 15 12:23:38.955886 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
May 15 12:23:38.955893 kernel: ACPI: Interpreter enabled
May 15 12:23:38.955901 kernel: ACPI: PM: (supports S0 S5)
May 15 12:23:38.955908 kernel: ACPI: Using IOAPIC for interrupt routing
May 15 12:23:38.955916 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
May 15 12:23:38.955924 kernel: PCI: Ignoring E820 reservations for host bridge windows
May 15 12:23:38.955931 kernel: ACPI: Enabled 1 GPEs in block 00 to 0F
May 15 12:23:38.955939 kernel: iommu: Default domain type: Translated
May 15 12:23:38.955948 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
May 15 12:23:38.955956 kernel: efivars: Registered efivars operations
May 15 12:23:38.955964 kernel: PCI: Using ACPI for IRQ routing
May 15 12:23:38.955972 kernel: PCI: System does not support PCI
May 15 12:23:38.955980 kernel: vgaarb: loaded
May 15 12:23:38.955987 kernel: clocksource: Switched to clocksource tsc-early
May 15 12:23:38.955995 kernel: VFS: Disk quotas dquot_6.6.0
May 15 12:23:38.956003 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
May 15 12:23:38.956011 kernel: pnp: PnP ACPI init
May 15 12:23:38.956020 kernel: pnp: PnP ACPI: found 3 devices
May 15 12:23:38.956033 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
May 15 12:23:38.956041 kernel: NET: Registered PF_INET protocol family
May 15 12:23:38.956049 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
May 15 12:23:38.956057 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
May 15 12:23:38.956065 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
May 15 12:23:38.956072 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
May 15 12:23:38.956080 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear)
May 15 12:23:38.956088 kernel: TCP: Hash tables configured (established 65536 bind 65536)
May 15 12:23:38.956098 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
May 15 12:23:38.956105 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
May 15 12:23:38.956113 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
May 15 12:23:38.956121 kernel: NET: Registered PF_XDP protocol family
May 15 12:23:38.956129 kernel: PCI: CLS 0 bytes, default 64
May 15 12:23:38.956137 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
May 15 12:23:38.956145 kernel: software IO TLB: mapped [mem 0x000000003aa59000-0x000000003ea59000] (64MB)
May 15 12:23:38.956153 kernel: RAPL PMU: API unit is 2^-32 Joules, 1 fixed counters, 10737418240 ms ovfl timer
May 15 12:23:38.956160 kernel: RAPL PMU: hw unit of domain psys 2^-0 Joules
May 15 12:23:38.956170 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x2127345424d, max_idle_ns: 440795318347 ns
May 15 12:23:38.956178 kernel: clocksource: Switched to clocksource tsc
May 15 12:23:38.956186 kernel: Initialise system trusted keyrings
May 15 12:23:38.956193 kernel: workingset: timestamp_bits=39 max_order=21 bucket_order=0
May 15 12:23:38.956201 kernel: Key type asymmetric registered
May 15 12:23:38.956208 kernel: Asymmetric key parser 'x509' registered
May 15 12:23:38.956216 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
May 15 12:23:38.956224 kernel: io scheduler mq-deadline registered
May 15 12:23:38.956232 kernel: io scheduler kyber registered
May 15 12:23:38.956242 kernel: io scheduler bfq registered
May 15 12:23:38.956249 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
May 15 12:23:38.956257 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
May 15 12:23:38.956265 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
May 15 12:23:38.956273 kernel: 00:01: ttyS1 at I/O 0x2f8 (irq = 3, base_baud = 115200) is a 16550A
May 15 12:23:38.957705 kernel: serial8250: ttyS2 at I/O 0x3e8 (irq = 4, base_baud = 115200) is a 16550A
May 15 12:23:38.957728 kernel: i8042: PNP: No PS/2 controller found.
May 15 12:23:38.957857 kernel: rtc_cmos 00:02: registered as rtc0
May 15 12:23:38.957929 kernel: rtc_cmos 00:02: setting system clock to 2025-05-15T12:23:38 UTC (1747311818)
May 15 12:23:38.957990 kernel: rtc_cmos 00:02: alarms up to one month, 114 bytes nvram
May 15 12:23:38.958000 kernel: intel_pstate: Intel P-state driver initializing
May 15 12:23:38.958008 kernel: efifb: probing for efifb
May 15 12:23:38.958016 kernel: efifb: framebuffer at 0x40000000, using 3072k, total 3072k
May 15 12:23:38.958023 kernel: efifb: mode is 1024x768x32, linelength=4096, pages=1
May 15 12:23:38.958031 kernel: efifb: scrolling: redraw
May 15 12:23:38.958039 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0
May 15 12:23:38.958048 kernel: Console: switching to colour frame buffer device 128x48
May 15 12:23:38.958056 kernel: fb0: EFI VGA frame buffer device
May 15 12:23:38.958064 kernel: pstore: Using crash dump compression: deflate
May 15 12:23:38.958072 kernel: pstore: Registered efi_pstore as persistent store backend
May 15 12:23:38.958081 kernel: NET: Registered PF_INET6 protocol family
May 15 12:23:38.958089 kernel: Segment Routing with IPv6
May 15 12:23:38.958096 kernel: In-situ OAM (IOAM) with IPv6
May 15 12:23:38.958104 kernel: NET: Registered PF_PACKET protocol family
May 15 12:23:38.958112 kernel: Key type dns_resolver registered
May 15 12:23:38.958122 kernel: IPI shorthand broadcast: enabled
May 15 12:23:38.958129 kernel: sched_clock: Marking stable (2822003695, 87155475)->(3203492256, -294333086)
May 15 12:23:38.958138 kernel: registered taskstats version 1
May 15 12:23:38.958145 kernel: Loading compiled-in X.509 certificates
May 15 12:23:38.958153 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.20-flatcar: 05e05785144663be6df1db78301487421c4773b6'
May 15 12:23:38.958161 kernel: Demotion targets for Node 0: null
May 15 12:23:38.958168 kernel: Key type .fscrypt registered
May 15 12:23:38.958175 kernel: Key type fscrypt-provisioning registered
May 15 12:23:38.958184 kernel: ima: No TPM chip found, activating TPM-bypass!
May 15 12:23:38.958194 kernel: ima: Allocated hash algorithm: sha1
May 15 12:23:38.958202 kernel: ima: No architecture policies found
May 15 12:23:38.958209 kernel: clk: Disabling unused clocks
May 15 12:23:38.958216 kernel: Warning: unable to open an initial console.
May 15 12:23:38.958223 kernel: Freeing unused kernel image (initmem) memory: 54416K
May 15 12:23:38.958231 kernel: Write protecting the kernel read-only data: 24576k
May 15 12:23:38.958239 kernel: Freeing unused kernel image (rodata/data gap) memory: 296K
May 15 12:23:38.958247 kernel: Run /init as init process
May 15 12:23:38.958254 kernel: with arguments:
May 15 12:23:38.958264 kernel: /init
May 15 12:23:38.958272 kernel: with environment:
May 15 12:23:38.958295 kernel: HOME=/
May 15 12:23:38.958303 kernel: TERM=linux
May 15 12:23:38.958311 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
May 15 12:23:38.958320 systemd[1]: Successfully made /usr/ read-only.
May 15 12:23:38.958333 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
May 15 12:23:38.958342 systemd[1]: Detected virtualization microsoft.
May 15 12:23:38.958351 systemd[1]: Detected architecture x86-64.
May 15 12:23:38.958359 systemd[1]: Running in initrd.
May 15 12:23:38.958366 systemd[1]: No hostname configured, using default hostname.
May 15 12:23:38.958374 systemd[1]: Hostname set to .
May 15 12:23:38.958382 systemd[1]: Initializing machine ID from random generator.
May 15 12:23:38.958391 systemd[1]: Queued start job for default target initrd.target.
May 15 12:23:38.958399 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 15 12:23:38.958408 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 15 12:23:38.958420 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
May 15 12:23:38.958429 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
May 15 12:23:38.958438 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
May 15 12:23:38.958447 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
May 15 12:23:38.958457 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
May 15 12:23:38.958466 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
May 15 12:23:38.958477 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 15 12:23:38.958485 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
May 15 12:23:38.958521 systemd[1]: Reached target paths.target - Path Units.
May 15 12:23:38.958530 systemd[1]: Reached target slices.target - Slice Units.
May 15 12:23:38.958539 systemd[1]: Reached target swap.target - Swaps.
May 15 12:23:38.958548 systemd[1]: Reached target timers.target - Timer Units.
May 15 12:23:38.958556 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
May 15 12:23:38.958565 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
May 15 12:23:38.958573 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
May 15 12:23:38.958583 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
May 15 12:23:38.958591 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
May 15 12:23:38.958600 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
May 15 12:23:38.958608 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
May 15 12:23:38.958617 systemd[1]: Reached target sockets.target - Socket Units.
May 15 12:23:38.958625 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
May 15 12:23:38.958634 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
May 15 12:23:38.958642 systemd[1]: Finished network-cleanup.service - Network Cleanup.
May 15 12:23:38.958651 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
May 15 12:23:38.958661 systemd[1]: Starting systemd-fsck-usr.service...
May 15 12:23:38.958669 systemd[1]: Starting systemd-journald.service - Journal Service...
May 15 12:23:38.958678 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
May 15 12:23:38.958695 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 15 12:23:38.958705 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
May 15 12:23:38.958716 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
May 15 12:23:38.958742 systemd-journald[205]: Collecting audit messages is disabled.
May 15 12:23:38.958769 systemd[1]: Finished systemd-fsck-usr.service.
May 15 12:23:38.958778 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
May 15 12:23:38.958788 systemd-journald[205]: Journal started
May 15 12:23:38.958810 systemd-journald[205]: Runtime Journal (/run/log/journal/3344f7ade2994f339d7ffd58f033c663) is 8M, max 159M, 151M free.
May 15 12:23:38.956503 systemd-modules-load[206]: Inserted module 'overlay'
May 15 12:23:38.962297 systemd[1]: Started systemd-journald.service - Journal Service.
May 15 12:23:38.971931 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
May 15 12:23:38.976679 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
May 15 12:23:38.985395 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
May 15 12:23:38.993299 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
May 15 12:23:38.994415 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
May 15 12:23:38.998372 kernel: Bridge firewalling registered
May 15 12:23:38.999275 systemd-modules-load[206]: Inserted module 'br_netfilter'
May 15 12:23:38.999406 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
May 15 12:23:39.002014 systemd-tmpfiles[221]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. May 15 12:23:39.008493 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 15 12:23:39.013391 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 15 12:23:39.018409 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... May 15 12:23:39.022575 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 15 12:23:39.026354 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 15 12:23:39.031786 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 15 12:23:39.035106 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... May 15 12:23:39.041659 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 15 12:23:39.049389 dracut-cmdline[244]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlyprintk=ttyS0,115200 flatcar.first_boot=detected flatcar.oem.id=azure flatcar.autologin verity.usrhash=48287e633374b880fa618bd42bee102ae77c50831859c6cedd6ca9e1aec3dd5c May 15 12:23:39.082059 systemd-resolved[245]: Positive Trust Anchors: May 15 12:23:39.082071 systemd-resolved[245]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 15 12:23:39.082103 systemd-resolved[245]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 15 12:23:39.098925 systemd-resolved[245]: Defaulting to hostname 'linux'. May 15 12:23:39.101605 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 15 12:23:39.103520 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 15 12:23:39.116297 kernel: SCSI subsystem initialized May 15 12:23:39.123297 kernel: Loading iSCSI transport class v2.0-870. May 15 12:23:39.131300 kernel: iscsi: registered transport (tcp) May 15 12:23:39.146569 kernel: iscsi: registered transport (qla4xxx) May 15 12:23:39.146614 kernel: QLogic iSCSI HBA Driver May 15 12:23:39.157739 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... May 15 12:23:39.169933 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. May 15 12:23:39.172168 systemd[1]: Reached target network-pre.target - Preparation for Network. May 15 12:23:39.199038 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. May 15 12:23:39.202154 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... 
May 15 12:23:39.242297 kernel: raid6: avx512x4 gen() 46470 MB/s May 15 12:23:39.259297 kernel: raid6: avx512x2 gen() 45445 MB/s May 15 12:23:39.276292 kernel: raid6: avx512x1 gen() 28948 MB/s May 15 12:23:39.294291 kernel: raid6: avx2x4 gen() 41553 MB/s May 15 12:23:39.311295 kernel: raid6: avx2x2 gen() 44069 MB/s May 15 12:23:39.328782 kernel: raid6: avx2x1 gen() 30249 MB/s May 15 12:23:39.328815 kernel: raid6: using algorithm avx512x4 gen() 46470 MB/s May 15 12:23:39.347468 kernel: raid6: .... xor() 7850 MB/s, rmw enabled May 15 12:23:39.347490 kernel: raid6: using avx512x2 recovery algorithm May 15 12:23:39.363296 kernel: xor: automatically using best checksumming function avx May 15 12:23:39.466298 kernel: Btrfs loaded, zoned=no, fsverity=no May 15 12:23:39.470627 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. May 15 12:23:39.480412 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 15 12:23:39.495120 systemd-udevd[454]: Using default interface naming scheme 'v255'. May 15 12:23:39.498874 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 15 12:23:39.505087 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... May 15 12:23:39.520522 dracut-pre-trigger[466]: rd.md=0: removing MD RAID activation May 15 12:23:39.536870 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. May 15 12:23:39.539406 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 15 12:23:39.567954 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 15 12:23:39.575797 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... 
May 15 12:23:39.609305 kernel: cryptd: max_cpu_qlen set to 1000 May 15 12:23:39.616297 kernel: AES CTR mode by8 optimization enabled May 15 12:23:39.633298 kernel: hv_vmbus: Vmbus version:5.3 May 15 12:23:39.662299 kernel: hv_vmbus: registering driver hyperv_keyboard May 15 12:23:39.663036 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 15 12:23:39.663491 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 15 12:23:39.673296 kernel: pps_core: LinuxPPS API ver. 1 registered May 15 12:23:39.673317 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti May 15 12:23:39.674200 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... May 15 12:23:39.679153 kernel: input: AT Translated Set 2 keyboard as /devices/LNXSYSTM:00/LNXSYBUS:00/ACPI0004:00/MSFT1000:00/d34b2567-b9b6-42b9-8778-0a4ec0b955bf/serio0/input/input0 May 15 12:23:39.682600 kernel: hv_vmbus: registering driver hv_pci May 15 12:23:39.685302 kernel: PTP clock support registered May 15 12:23:39.685646 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
May 15 12:23:39.696100 kernel: hv_pci 7ad35d50-c05b-47ab-b3a0-56a9a845852b: PCI VMBus probing: Using version 0x10004 May 15 12:23:40.202405 kernel: hid: raw HID events driver (C) Jiri Kosina May 15 12:23:40.202425 kernel: hv_pci 7ad35d50-c05b-47ab-b3a0-56a9a845852b: PCI host bridge to bus c05b:00 May 15 12:23:40.202625 kernel: pci_bus c05b:00: root bus resource [mem 0xfc0000000-0xfc007ffff window] May 15 12:23:40.203865 kernel: pci_bus c05b:00: No busn resource found for root bus, will use [bus 00-ff] May 15 12:23:40.203980 kernel: pci c05b:00:00.0: [1414:00a9] type 00 class 0x010802 PCIe Endpoint May 15 12:23:40.204081 kernel: pci c05b:00:00.0: BAR 0 [mem 0xfc0000000-0xfc007ffff 64bit] May 15 12:23:40.204165 kernel: hv_utils: Registering HyperV Utility Driver May 15 12:23:40.204176 kernel: hv_vmbus: registering driver hv_utils May 15 12:23:40.204186 kernel: hv_utils: Shutdown IC version 3.2 May 15 12:23:40.204195 kernel: pci c05b:00:00.0: 32.000 Gb/s available PCIe bandwidth, limited by 2.5 GT/s PCIe x16 link at c05b:00:00.0 (capable of 1024.000 Gb/s with 64.0 GT/s PCIe x16 link) May 15 12:23:40.204281 kernel: hv_utils: TimeSync IC version 4.0 May 15 12:23:40.204294 kernel: hv_utils: Heartbeat IC version 3.0 May 15 12:23:40.204303 kernel: pci_bus c05b:00: busn_res: [bus 00-ff] end is updated to 00 May 15 12:23:40.204376 kernel: pci c05b:00:00.0: BAR 0 [mem 0xfc0000000-0xfc007ffff 64bit]: assigned May 15 12:23:40.204454 kernel: hv_vmbus: registering driver hv_storvsc May 15 12:23:40.204464 kernel: hv_vmbus: registering driver hid_hyperv May 15 12:23:40.204473 kernel: scsi host0: storvsc_host_t May 15 12:23:39.700589 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 15 12:23:39.700694 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. 
May 15 12:23:40.210360 kernel: hv_vmbus: registering driver hv_netvsc May 15 12:23:40.210387 kernel: scsi 0:0:0:2: CD-ROM Msft Virtual DVD-ROM 1.0 PQ: 0 ANSI: 5 May 15 12:23:39.709396 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 15 12:23:40.216952 kernel: input: Microsoft Vmbus HID-compliant Mouse as /devices/0006:045E:0621.0001/input/input1 May 15 12:23:40.216974 kernel: hid-hyperv 0006:045E:0621.0001: input: VIRTUAL HID v0.01 Mouse [Microsoft Vmbus HID-compliant Mouse] on May 15 12:23:40.196880 systemd-resolved[245]: Clock change detected. Flushing caches. May 15 12:23:40.230088 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 15 12:23:40.237660 kernel: hv_netvsc f8615163-0000-1000-2000-7c1e52fbce1f (unnamed net_device) (uninitialized): VF slot 1 added May 15 12:23:40.249405 kernel: sr 0:0:0:2: [sr0] scsi-1 drive May 15 12:23:40.250692 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 May 15 12:23:40.250705 kernel: sr 0:0:0:2: Attached scsi CD-ROM sr0 May 15 12:23:40.254954 kernel: nvme nvme0: pci function c05b:00:00.0 May 15 12:23:40.256959 kernel: nvme c05b:00:00.0: enabling device (0000 -> 0002) May 15 12:23:40.553966 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#47 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001 May 15 12:23:40.554076 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#16 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001 May 15 12:23:40.554168 kernel: nvme nvme0: 2/0/0 default/read/poll queues May 15 12:23:40.554267 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 May 15 12:23:40.942929 kernel: nvme nvme0: using unchecked data buffer May 15 12:23:41.249845 kernel: hv_pci 00000001-7870-47b5-b203-907d12ca697e: PCI VMBus probing: Using version 0x10004 May 15 12:23:41.331120 kernel: hv_pci 00000001-7870-47b5-b203-907d12ca697e: PCI host bridge to bus 7870:00 May 15 12:23:41.331215 kernel: pci_bus 7870:00: root bus resource [mem 0xfc2000000-0xfc4007fff window] May 15 
12:23:41.331308 kernel: pci_bus 7870:00: No busn resource found for root bus, will use [bus 00-ff] May 15 12:23:41.331381 kernel: pci 7870:00:00.0: [1414:00ba] type 00 class 0x020000 PCIe Endpoint May 15 12:23:41.331483 kernel: pci 7870:00:00.0: BAR 0 [mem 0xfc2000000-0xfc3ffffff 64bit pref] May 15 12:23:41.331574 kernel: pci 7870:00:00.0: BAR 4 [mem 0xfc4000000-0xfc4007fff 64bit pref] May 15 12:23:41.331661 kernel: pci 7870:00:00.0: enabling Extended Tags May 15 12:23:41.331749 kernel: pci_bus 7870:00: busn_res: [bus 00-ff] end is updated to 00 May 15 12:23:41.331822 kernel: pci 7870:00:00.0: BAR 0 [mem 0xfc2000000-0xfc3ffffff 64bit pref]: assigned May 15 12:23:41.334105 kernel: pci 7870:00:00.0: BAR 4 [mem 0xfc4000000-0xfc4007fff 64bit pref]: assigned May 15 12:23:41.334251 kernel: mana 7870:00:00.0: enabling device (0000 -> 0002) May 15 12:23:41.334351 kernel: mana 7870:00:00.0: Microsoft Azure Network Adapter protocol version: 0.1.1 May 15 12:23:41.334444 kernel: hv_netvsc f8615163-0000-1000-2000-7c1e52fbce1f eth0: VF registering: eth1 May 15 12:23:41.334524 kernel: mana 7870:00:00.0 eth1: joined to eth0 May 15 12:23:41.334610 kernel: mana 7870:00:00.0 enP30832s1: renamed from eth1 May 15 12:23:41.294259 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - MSFT NVMe Accelerator v1.0 OEM. May 15 12:23:41.322299 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - MSFT NVMe Accelerator v1.0 EFI-SYSTEM. May 15 12:23:41.337059 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - MSFT NVMe Accelerator v1.0 ROOT. May 15 12:23:41.385568 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - MSFT NVMe Accelerator v1.0 USR-A. May 15 12:23:41.385667 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - MSFT NVMe Accelerator v1.0 USR-A. May 15 12:23:41.385928 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. 
May 15 12:23:41.395457 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. May 15 12:23:41.401018 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 15 12:23:41.404680 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 15 12:23:41.408553 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... May 15 12:23:41.419017 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... May 15 12:23:41.428923 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 May 15 12:23:41.443395 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. May 15 12:23:42.442425 disk-uuid[670]: The operation has completed successfully. May 15 12:23:42.445059 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 May 15 12:23:42.490663 systemd[1]: disk-uuid.service: Deactivated successfully. May 15 12:23:42.490740 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. May 15 12:23:42.517771 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... May 15 12:23:42.530036 sh[712]: Success May 15 12:23:42.556111 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. May 15 12:23:42.556186 kernel: device-mapper: uevent: version 1.0.3 May 15 12:23:42.557263 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev May 15 12:23:42.565931 kernel: device-mapper: verity: sha256 using shash "sha256-ni" May 15 12:23:42.989177 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. May 15 12:23:42.992800 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... May 15 12:23:43.003646 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
May 15 12:23:43.015276 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay' May 15 12:23:43.015319 kernel: BTRFS: device fsid 2d504097-db49-4d66-a0d5-eeb665b21004 devid 1 transid 41 /dev/mapper/usr (254:0) scanned by mount (725) May 15 12:23:43.017247 kernel: BTRFS info (device dm-0): first mount of filesystem 2d504097-db49-4d66-a0d5-eeb665b21004 May 15 12:23:43.018398 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm May 15 12:23:43.019242 kernel: BTRFS info (device dm-0): using free-space-tree May 15 12:23:43.726245 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. May 15 12:23:43.730324 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. May 15 12:23:43.733624 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. May 15 12:23:43.734188 systemd[1]: Starting ignition-setup.service - Ignition (setup)... May 15 12:23:43.742599 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... May 15 12:23:43.760965 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/nvme0n1p6 (259:5) scanned by mount (748) May 15 12:23:43.765265 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem afd0c70c-d15e-448c-8325-f96e3c3ed3a5 May 15 12:23:43.765309 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-intel) checksum algorithm May 15 12:23:43.766402 kernel: BTRFS info (device nvme0n1p6): using free-space-tree May 15 12:23:43.804841 systemd[1]: Finished ignition-setup.service - Ignition (setup). May 15 12:23:43.810534 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem afd0c70c-d15e-448c-8325-f96e3c3ed3a5 May 15 12:23:43.811218 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... 
May 15 12:23:43.815424 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 15 12:23:43.822832 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 15 12:23:43.849968 systemd-networkd[894]: lo: Link UP May 15 12:23:43.849974 systemd-networkd[894]: lo: Gained carrier May 15 12:23:43.851341 systemd-networkd[894]: Enumeration completed May 15 12:23:43.861255 kernel: mana 7870:00:00.0 enP30832s1: Configured vPort 0 PD 18 DB 16 May 15 12:23:43.861429 kernel: mana 7870:00:00.0 enP30832s1: Configured steering vPort 0 entries 64 May 15 12:23:43.861539 kernel: hv_netvsc f8615163-0000-1000-2000-7c1e52fbce1f eth0: Data path switched to VF: enP30832s1 May 15 12:23:43.851665 systemd-networkd[894]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 15 12:23:43.851668 systemd-networkd[894]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 15 12:23:43.852180 systemd[1]: Started systemd-networkd.service - Network Configuration. May 15 12:23:43.856328 systemd[1]: Reached target network.target - Network. May 15 12:23:43.861463 systemd-networkd[894]: enP30832s1: Link UP May 15 12:23:43.861524 systemd-networkd[894]: eth0: Link UP May 15 12:23:43.861602 systemd-networkd[894]: eth0: Gained carrier May 15 12:23:43.861611 systemd-networkd[894]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
May 15 12:23:43.864051 systemd-networkd[894]: enP30832s1: Gained carrier May 15 12:23:44.212956 systemd-networkd[894]: eth0: DHCPv4 address 10.200.8.32/24, gateway 10.200.8.1 acquired from 168.63.129.16 May 15 12:23:45.308095 systemd-networkd[894]: eth0: Gained IPv6LL May 15 12:23:45.884167 systemd-networkd[894]: enP30832s1: Gained IPv6LL May 15 12:23:46.884130 ignition[889]: Ignition 2.21.0 May 15 12:23:46.884142 ignition[889]: Stage: fetch-offline May 15 12:23:46.886043 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). May 15 12:23:46.884222 ignition[889]: no configs at "/usr/lib/ignition/base.d" May 15 12:23:46.891787 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... May 15 12:23:46.884227 ignition[889]: no config dir at "/usr/lib/ignition/base.platform.d/azure" May 15 12:23:46.884306 ignition[889]: parsed url from cmdline: "" May 15 12:23:46.884310 ignition[889]: no config URL provided May 15 12:23:46.884314 ignition[889]: reading system config file "/usr/lib/ignition/user.ign" May 15 12:23:46.884318 ignition[889]: no config at "/usr/lib/ignition/user.ign" May 15 12:23:46.884322 ignition[889]: failed to fetch config: resource requires networking May 15 12:23:46.884530 ignition[889]: Ignition finished successfully May 15 12:23:46.915386 ignition[903]: Ignition 2.21.0 May 15 12:23:46.915395 ignition[903]: Stage: fetch May 15 12:23:46.915584 ignition[903]: no configs at "/usr/lib/ignition/base.d" May 15 12:23:46.915591 ignition[903]: no config dir at "/usr/lib/ignition/base.platform.d/azure" May 15 12:23:46.915656 ignition[903]: parsed url from cmdline: "" May 15 12:23:46.915658 ignition[903]: no config URL provided May 15 12:23:46.915662 ignition[903]: reading system config file "/usr/lib/ignition/user.ign" May 15 12:23:46.915667 ignition[903]: no config at "/usr/lib/ignition/user.ign" May 15 12:23:46.915696 ignition[903]: GET 
http://169.254.169.254/metadata/instance/compute/userData?api-version=2021-01-01&format=text: attempt #1 May 15 12:23:47.034469 ignition[903]: GET result: OK May 15 12:23:47.034576 ignition[903]: config has been read from IMDS userdata May 15 12:23:47.034623 ignition[903]: parsing config with SHA512: fe3491c447cd6e2f2070d945dcab0e8009fb793fefc80c0c293a9f5818762935c70adf3bbf5f9d9d96fb3b5287e2696a7b8ad58bbcafd7b049a7f4bdea543674 May 15 12:23:47.039459 unknown[903]: fetched base config from "system" May 15 12:23:47.039464 unknown[903]: fetched base config from "system" May 15 12:23:47.039792 ignition[903]: fetch: fetch complete May 15 12:23:47.039468 unknown[903]: fetched user config from "azure" May 15 12:23:47.039796 ignition[903]: fetch: fetch passed May 15 12:23:47.042012 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). May 15 12:23:47.039828 ignition[903]: Ignition finished successfully May 15 12:23:47.047046 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... May 15 12:23:47.073851 ignition[909]: Ignition 2.21.0 May 15 12:23:47.073860 ignition[909]: Stage: kargs May 15 12:23:47.076696 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). May 15 12:23:47.074049 ignition[909]: no configs at "/usr/lib/ignition/base.d" May 15 12:23:47.081965 systemd[1]: Starting ignition-disks.service - Ignition (disks)... May 15 12:23:47.074056 ignition[909]: no config dir at "/usr/lib/ignition/base.platform.d/azure" May 15 12:23:47.075399 ignition[909]: kargs: kargs passed May 15 12:23:47.075429 ignition[909]: Ignition finished successfully May 15 12:23:47.097190 ignition[915]: Ignition 2.21.0 May 15 12:23:47.097199 ignition[915]: Stage: disks May 15 12:23:47.099103 systemd[1]: Finished ignition-disks.service - Ignition (disks). May 15 12:23:47.097365 ignition[915]: no configs at "/usr/lib/ignition/base.d" May 15 12:23:47.107384 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. 
May 15 12:23:47.097372 ignition[915]: no config dir at "/usr/lib/ignition/base.platform.d/azure" May 15 12:23:47.109590 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. May 15 12:23:47.098103 ignition[915]: disks: disks passed May 15 12:23:47.112411 systemd[1]: Reached target local-fs.target - Local File Systems. May 15 12:23:47.098131 ignition[915]: Ignition finished successfully May 15 12:23:47.114955 systemd[1]: Reached target sysinit.target - System Initialization. May 15 12:23:47.117283 systemd[1]: Reached target basic.target - Basic System. May 15 12:23:47.121796 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... May 15 12:23:47.237650 systemd-fsck[923]: ROOT: clean, 15/7326000 files, 477845/7359488 blocks May 15 12:23:47.241600 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. May 15 12:23:47.246984 systemd[1]: Mounting sysroot.mount - /sysroot... May 15 12:23:47.619144 kernel: EXT4-fs (nvme0n1p9): mounted filesystem f7dea4bd-2644-4592-b85b-330f322c4d2b r/w with ordered data mode. Quota mode: none. May 15 12:23:47.619685 systemd[1]: Mounted sysroot.mount - /sysroot. May 15 12:23:47.620091 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. May 15 12:23:47.649040 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 15 12:23:47.652981 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... May 15 12:23:47.667809 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... May 15 12:23:47.671033 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). May 15 12:23:47.671060 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. May 15 12:23:47.677058 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. 
May 15 12:23:47.681467 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... May 15 12:23:47.691561 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/nvme0n1p6 (259:5) scanned by mount (933) May 15 12:23:47.691589 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem afd0c70c-d15e-448c-8325-f96e3c3ed3a5 May 15 12:23:47.692935 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-intel) checksum algorithm May 15 12:23:47.694353 kernel: BTRFS info (device nvme0n1p6): using free-space-tree May 15 12:23:47.700328 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. May 15 12:23:48.713318 coreos-metadata[935]: May 15 12:23:48.713 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1 May 15 12:23:49.163013 initrd-setup-root[962]: cut: /sysroot/etc/passwd: No such file or directory May 15 12:23:49.203008 coreos-metadata[935]: May 15 12:23:49.202 INFO Fetch successful May 15 12:23:49.203955 coreos-metadata[935]: May 15 12:23:49.203 INFO Fetching http://169.254.169.254/metadata/instance/compute/name?api-version=2017-08-01&format=text: Attempt #1 May 15 12:23:49.273169 initrd-setup-root[969]: cut: /sysroot/etc/group: No such file or directory May 15 12:23:49.278167 initrd-setup-root[976]: cut: /sysroot/etc/shadow: No such file or directory May 15 12:23:49.282724 initrd-setup-root[983]: cut: /sysroot/etc/gshadow: No such file or directory May 15 12:23:50.214119 coreos-metadata[935]: May 15 12:23:50.214 INFO Fetch successful May 15 12:23:50.216959 coreos-metadata[935]: May 15 12:23:50.215 INFO wrote hostname ci-4334.0.0-a-81f65144c0 to /sysroot/etc/hostname May 15 12:23:50.216583 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. May 15 12:23:50.415425 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. May 15 12:23:50.416807 systemd[1]: Starting ignition-mount.service - Ignition (mount)... 
May 15 12:23:50.428262 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... May 15 12:23:50.436434 systemd[1]: sysroot-oem.mount: Deactivated successfully. May 15 12:23:50.441376 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem afd0c70c-d15e-448c-8325-f96e3c3ed3a5 May 15 12:23:50.457554 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. May 15 12:23:50.463589 ignition[1051]: INFO : Ignition 2.21.0 May 15 12:23:50.463589 ignition[1051]: INFO : Stage: mount May 15 12:23:50.469152 ignition[1051]: INFO : no configs at "/usr/lib/ignition/base.d" May 15 12:23:50.469152 ignition[1051]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" May 15 12:23:50.469152 ignition[1051]: INFO : mount: mount passed May 15 12:23:50.469152 ignition[1051]: INFO : Ignition finished successfully May 15 12:23:50.467102 systemd[1]: Finished ignition-mount.service - Ignition (mount). May 15 12:23:50.469996 systemd[1]: Starting ignition-files.service - Ignition (files)... May 15 12:23:50.484003 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 15 12:23:50.503922 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/nvme0n1p6 (259:5) scanned by mount (1063) May 15 12:23:50.505935 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem afd0c70c-d15e-448c-8325-f96e3c3ed3a5 May 15 12:23:50.506031 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-intel) checksum algorithm May 15 12:23:50.507373 kernel: BTRFS info (device nvme0n1p6): using free-space-tree May 15 12:23:50.513610 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
May 15 12:23:50.532522 ignition[1079]: INFO : Ignition 2.21.0 May 15 12:23:50.532522 ignition[1079]: INFO : Stage: files May 15 12:23:50.535385 ignition[1079]: INFO : no configs at "/usr/lib/ignition/base.d" May 15 12:23:50.535385 ignition[1079]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" May 15 12:23:50.539260 ignition[1079]: DEBUG : files: compiled without relabeling support, skipping May 15 12:23:50.606663 ignition[1079]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" May 15 12:23:50.606663 ignition[1079]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" May 15 12:23:50.744594 ignition[1079]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" May 15 12:23:50.747717 ignition[1079]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" May 15 12:23:50.747717 ignition[1079]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" May 15 12:23:50.745058 unknown[1079]: wrote ssh authorized keys file for user: core May 15 12:23:50.814758 ignition[1079]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" May 15 12:23:50.817531 ignition[1079]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 May 15 12:23:51.090376 ignition[1079]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK May 15 12:23:51.326952 ignition[1079]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" May 15 12:23:51.332009 ignition[1079]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" May 15 12:23:51.332009 ignition[1079]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" 
May 15 12:23:51.332009 ignition[1079]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" May 15 12:23:51.332009 ignition[1079]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" May 15 12:23:51.332009 ignition[1079]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" May 15 12:23:51.332009 ignition[1079]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" May 15 12:23:51.332009 ignition[1079]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" May 15 12:23:51.332009 ignition[1079]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" May 15 12:23:51.558618 ignition[1079]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" May 15 12:23:51.561317 ignition[1079]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" May 15 12:23:51.561317 ignition[1079]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" May 15 12:23:51.569007 ignition[1079]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" May 15 12:23:51.569007 ignition[1079]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" May 15 12:23:51.569007 ignition[1079]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET 
https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.31.0-x86-64.raw: attempt #1 May 15 12:23:52.149009 ignition[1079]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK May 15 12:23:52.746282 ignition[1079]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" May 15 12:23:52.746282 ignition[1079]: INFO : files: op(b): [started] processing unit "prepare-helm.service" May 15 12:23:53.107421 ignition[1079]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 15 12:23:54.688275 ignition[1079]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 15 12:23:54.688275 ignition[1079]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" May 15 12:23:54.688275 ignition[1079]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" May 15 12:23:54.693570 ignition[1079]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" May 15 12:23:54.693570 ignition[1079]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" May 15 12:23:54.693570 ignition[1079]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" May 15 12:23:54.693570 ignition[1079]: INFO : files: files passed May 15 12:23:54.693570 ignition[1079]: INFO : Ignition finished successfully May 15 12:23:54.690548 systemd[1]: Finished ignition-files.service - Ignition (files). May 15 12:23:54.705034 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... May 15 12:23:54.710726 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... 
May 15 12:23:54.722782 systemd[1]: ignition-quench.service: Deactivated successfully. May 15 12:23:54.722856 systemd[1]: Finished ignition-quench.service - Ignition (record completion). May 15 12:23:55.346156 initrd-setup-root-after-ignition[1110]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 15 12:23:55.346156 initrd-setup-root-after-ignition[1110]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory May 15 12:23:55.353480 initrd-setup-root-after-ignition[1114]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 15 12:23:55.350141 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. May 15 12:23:55.353739 systemd[1]: Reached target ignition-complete.target - Ignition Complete. May 15 12:23:55.358019 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... May 15 12:23:55.395151 systemd[1]: initrd-parse-etc.service: Deactivated successfully. May 15 12:23:55.395240 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. May 15 12:23:55.397970 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. May 15 12:23:55.398363 systemd[1]: Reached target initrd.target - Initrd Default Target. May 15 12:23:55.398453 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. May 15 12:23:55.400053 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... May 15 12:23:55.411310 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 15 12:23:55.417036 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... May 15 12:23:55.437724 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. May 15 12:23:55.438284 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. 
May 15 12:23:55.438782 systemd[1]: Stopped target timers.target - Timer Units. May 15 12:23:55.439345 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. May 15 12:23:55.439446 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 15 12:23:55.439964 systemd[1]: Stopped target initrd.target - Initrd Default Target. May 15 12:23:55.440281 systemd[1]: Stopped target basic.target - Basic System. May 15 12:23:55.452757 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. May 15 12:23:55.456074 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. May 15 12:23:55.457833 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. May 15 12:23:55.458149 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. May 15 12:23:55.458703 systemd[1]: Stopped target remote-fs.target - Remote File Systems. May 15 12:23:55.459013 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. May 15 12:23:55.459135 systemd[1]: Stopped target sysinit.target - System Initialization. May 15 12:23:55.459400 systemd[1]: Stopped target local-fs.target - Local File Systems. May 15 12:23:55.459699 systemd[1]: Stopped target swap.target - Swaps. May 15 12:23:55.460031 systemd[1]: dracut-pre-mount.service: Deactivated successfully. May 15 12:23:55.460159 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. May 15 12:23:55.460816 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. May 15 12:23:55.461206 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 15 12:23:55.461511 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. May 15 12:23:55.461833 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 15 12:23:55.489741 systemd[1]: dracut-initqueue.service: Deactivated successfully. 
May 15 12:23:55.489884 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. May 15 12:23:55.492899 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. May 15 12:23:55.493020 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. May 15 12:23:55.497087 systemd[1]: ignition-files.service: Deactivated successfully. May 15 12:23:55.497198 systemd[1]: Stopped ignition-files.service - Ignition (files). May 15 12:23:55.501041 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. May 15 12:23:55.501132 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. May 15 12:23:55.506659 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... May 15 12:23:55.511971 systemd[1]: kmod-static-nodes.service: Deactivated successfully. May 15 12:23:55.512126 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. May 15 12:23:55.523485 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... May 15 12:23:55.528978 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. May 15 12:23:55.530847 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. May 15 12:23:55.535934 ignition[1134]: INFO : Ignition 2.21.0 May 15 12:23:55.535934 ignition[1134]: INFO : Stage: umount May 15 12:23:55.535934 ignition[1134]: INFO : no configs at "/usr/lib/ignition/base.d" May 15 12:23:55.535934 ignition[1134]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" May 15 12:23:55.548007 ignition[1134]: INFO : umount: umount passed May 15 12:23:55.548007 ignition[1134]: INFO : Ignition finished successfully May 15 12:23:55.536152 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. May 15 12:23:55.537105 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. May 15 12:23:55.547515 systemd[1]: sysroot-boot.mount: Deactivated successfully. 
May 15 12:23:55.548617 systemd[1]: ignition-mount.service: Deactivated successfully. May 15 12:23:55.548685 systemd[1]: Stopped ignition-mount.service - Ignition (mount). May 15 12:23:55.551120 systemd[1]: ignition-disks.service: Deactivated successfully. May 15 12:23:55.551294 systemd[1]: Stopped ignition-disks.service - Ignition (disks). May 15 12:23:55.554440 systemd[1]: ignition-kargs.service: Deactivated successfully. May 15 12:23:55.554488 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). May 15 12:23:55.556054 systemd[1]: ignition-fetch.service: Deactivated successfully. May 15 12:23:55.556083 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). May 15 12:23:55.556410 systemd[1]: Stopped target network.target - Network. May 15 12:23:55.556433 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. May 15 12:23:55.556459 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). May 15 12:23:55.556783 systemd[1]: Stopped target paths.target - Path Units. May 15 12:23:55.556803 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. May 15 12:23:55.556938 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 15 12:23:55.556984 systemd[1]: Stopped target slices.target - Slice Units. May 15 12:23:55.563142 systemd[1]: Stopped target sockets.target - Socket Units. May 15 12:23:55.566206 systemd[1]: iscsid.socket: Deactivated successfully. May 15 12:23:55.566236 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. May 15 12:23:55.573081 systemd[1]: iscsiuio.socket: Deactivated successfully. May 15 12:23:55.573102 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 15 12:23:55.577000 systemd[1]: ignition-setup.service: Deactivated successfully. May 15 12:23:55.577040 systemd[1]: Stopped ignition-setup.service - Ignition (setup). 
May 15 12:23:55.579407 systemd[1]: ignition-setup-pre.service: Deactivated successfully. May 15 12:23:55.579442 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. May 15 12:23:55.584935 systemd[1]: Stopping systemd-networkd.service - Network Configuration... May 15 12:23:55.588052 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... May 15 12:23:55.590729 systemd[1]: initrd-cleanup.service: Deactivated successfully. May 15 12:23:55.590795 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. May 15 12:23:55.595734 systemd[1]: systemd-resolved.service: Deactivated successfully. May 15 12:23:55.595825 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. May 15 12:23:55.598748 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. May 15 12:23:55.599294 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. May 15 12:23:55.599334 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. May 15 12:23:55.608124 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. May 15 12:23:55.608268 systemd[1]: systemd-networkd.service: Deactivated successfully. May 15 12:23:55.608336 systemd[1]: Stopped systemd-networkd.service - Network Configuration. May 15 12:23:55.633633 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. May 15 12:23:55.633975 systemd[1]: Stopped target network-pre.target - Preparation for Network. May 15 12:23:55.638998 systemd[1]: systemd-networkd.socket: Deactivated successfully. May 15 12:23:55.639040 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. May 15 12:23:55.644482 systemd[1]: Stopping network-cleanup.service - Network Cleanup... May 15 12:23:55.647442 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. 
May 15 12:23:55.647488 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 15 12:23:55.648134 systemd[1]: systemd-sysctl.service: Deactivated successfully. May 15 12:23:55.648164 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. May 15 12:23:55.651183 systemd[1]: systemd-modules-load.service: Deactivated successfully. May 15 12:23:55.651219 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. May 15 12:23:55.661217 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... May 15 12:23:55.664930 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. May 15 12:23:55.670174 systemd[1]: systemd-udevd.service: Deactivated successfully. May 15 12:23:55.672039 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. May 15 12:23:55.688000 kernel: hv_netvsc f8615163-0000-1000-2000-7c1e52fbce1f eth0: Data path switched from VF: enP30832s1 May 15 12:23:55.688147 kernel: mana 7870:00:00.0 enP30832s1: Configured steering vPort 0 entries 64 May 15 12:23:55.674486 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. May 15 12:23:55.674548 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. May 15 12:23:55.675191 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. May 15 12:23:55.675220 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. May 15 12:23:55.675430 systemd[1]: dracut-pre-udev.service: Deactivated successfully. May 15 12:23:55.675465 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. May 15 12:23:55.675755 systemd[1]: dracut-cmdline.service: Deactivated successfully. May 15 12:23:55.675784 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. May 15 12:23:55.676322 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. 
May 15 12:23:55.676348 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 15 12:23:55.678019 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... May 15 12:23:55.689695 systemd[1]: systemd-network-generator.service: Deactivated successfully. May 15 12:23:55.689742 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. May 15 12:23:55.693105 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. May 15 12:23:55.693136 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 15 12:23:55.699288 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 15 12:23:55.699328 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 15 12:23:55.702640 systemd[1]: network-cleanup.service: Deactivated successfully. May 15 12:23:55.702704 systemd[1]: Stopped network-cleanup.service - Network Cleanup. May 15 12:23:55.703835 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. May 15 12:23:55.703896 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. May 15 12:23:56.053258 systemd[1]: sysroot-boot.service: Deactivated successfully. May 15 12:23:56.053353 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. May 15 12:23:56.056081 systemd[1]: Reached target initrd-switch-root.target - Switch Root. May 15 12:23:56.059635 systemd[1]: initrd-setup-root.service: Deactivated successfully. May 15 12:23:56.059685 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. May 15 12:23:56.063116 systemd[1]: Starting initrd-switch-root.service - Switch Root... May 15 12:23:56.077696 systemd[1]: Switching root. May 15 12:23:56.129151 systemd-journald[205]: Journal stopped May 15 12:24:00.346655 systemd-journald[205]: Received SIGTERM from PID 1 (systemd). 
May 15 12:24:00.346684 kernel: SELinux: policy capability network_peer_controls=1 May 15 12:24:00.346696 kernel: SELinux: policy capability open_perms=1 May 15 12:24:00.346704 kernel: SELinux: policy capability extended_socket_class=1 May 15 12:24:00.346711 kernel: SELinux: policy capability always_check_network=0 May 15 12:24:00.346719 kernel: SELinux: policy capability cgroup_seclabel=1 May 15 12:24:00.346729 kernel: SELinux: policy capability nnp_nosuid_transition=1 May 15 12:24:00.346737 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 May 15 12:24:00.346745 kernel: SELinux: policy capability ioctl_skip_cloexec=0 May 15 12:24:00.346753 kernel: SELinux: policy capability userspace_initial_context=0 May 15 12:24:00.346761 kernel: audit: type=1403 audit(1747311839.078:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 May 15 12:24:00.346770 systemd[1]: Successfully loaded SELinux policy in 63.282ms. May 15 12:24:00.346781 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 6.283ms. May 15 12:24:00.346792 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 15 12:24:00.346802 systemd[1]: Detected virtualization microsoft. May 15 12:24:00.346811 systemd[1]: Detected architecture x86-64. May 15 12:24:00.346820 systemd[1]: Detected first boot. May 15 12:24:00.346829 systemd[1]: Hostname set to . May 15 12:24:00.346839 systemd[1]: Initializing machine ID from random generator. May 15 12:24:00.346848 zram_generator::config[1178]: No configuration found. 
May 15 12:24:00.346858 kernel: Guest personality initialized and is inactive May 15 12:24:00.346866 kernel: VMCI host device registered (name=vmci, major=10, minor=124) May 15 12:24:00.346874 kernel: Initialized host personality May 15 12:24:00.346883 kernel: NET: Registered PF_VSOCK protocol family May 15 12:24:00.346892 systemd[1]: Populated /etc with preset unit settings. May 15 12:24:00.346903 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. May 15 12:24:00.349204 systemd[1]: initrd-switch-root.service: Deactivated successfully. May 15 12:24:00.349224 systemd[1]: Stopped initrd-switch-root.service - Switch Root. May 15 12:24:00.349233 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. May 15 12:24:00.349242 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. May 15 12:24:00.349252 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. May 15 12:24:00.349260 systemd[1]: Created slice system-getty.slice - Slice /system/getty. May 15 12:24:00.349272 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. May 15 12:24:00.349280 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. May 15 12:24:00.349289 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. May 15 12:24:00.349298 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. May 15 12:24:00.349640 systemd[1]: Created slice user.slice - User and Session Slice. May 15 12:24:00.349654 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 15 12:24:00.349668 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 15 12:24:00.349677 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. 
May 15 12:24:00.349689 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. May 15 12:24:00.349700 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. May 15 12:24:00.349709 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... May 15 12:24:00.349717 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... May 15 12:24:00.349726 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 15 12:24:00.349734 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. May 15 12:24:00.349743 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. May 15 12:24:00.349752 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. May 15 12:24:00.349762 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. May 15 12:24:00.349770 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. May 15 12:24:00.349779 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 15 12:24:00.349787 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 15 12:24:00.349795 systemd[1]: Reached target slices.target - Slice Units. May 15 12:24:00.349804 systemd[1]: Reached target swap.target - Swaps. May 15 12:24:00.349812 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. May 15 12:24:00.349821 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. May 15 12:24:00.349831 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. May 15 12:24:00.349839 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. May 15 12:24:00.349848 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. May 15 12:24:00.349857 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. 
May 15 12:24:00.349865 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. May 15 12:24:00.349875 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... May 15 12:24:00.349884 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... May 15 12:24:00.349919 systemd[1]: Mounting media.mount - External Media Directory... May 15 12:24:00.349929 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 15 12:24:00.349938 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... May 15 12:24:00.349946 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... May 15 12:24:00.349955 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... May 15 12:24:00.349965 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). May 15 12:24:00.349975 systemd[1]: Reached target machines.target - Containers. May 15 12:24:00.349983 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... May 15 12:24:00.349992 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 15 12:24:00.350000 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 15 12:24:00.350009 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... May 15 12:24:00.350017 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 15 12:24:00.350026 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... May 15 12:24:00.350035 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 15 12:24:00.350045 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... 
May 15 12:24:00.350053 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 15 12:24:00.350062 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). May 15 12:24:00.350071 systemd[1]: systemd-fsck-root.service: Deactivated successfully. May 15 12:24:00.350080 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. May 15 12:24:00.350090 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. May 15 12:24:00.350099 systemd[1]: Stopped systemd-fsck-usr.service. May 15 12:24:00.350109 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 15 12:24:00.350118 systemd[1]: Starting systemd-journald.service - Journal Service... May 15 12:24:00.350127 kernel: loop: module loaded May 15 12:24:00.350135 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... May 15 12:24:00.350143 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... May 15 12:24:00.350152 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... May 15 12:24:00.350160 kernel: fuse: init (API version 7.41) May 15 12:24:00.350169 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... May 15 12:24:00.350177 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 15 12:24:00.350186 systemd[1]: verity-setup.service: Deactivated successfully. May 15 12:24:00.350196 systemd[1]: Stopped verity-setup.service. May 15 12:24:00.350204 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). 
May 15 12:24:00.350213 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. May 15 12:24:00.350442 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. May 15 12:24:00.350451 systemd[1]: Mounted media.mount - External Media Directory. May 15 12:24:00.350460 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. May 15 12:24:00.350469 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. May 15 12:24:00.350478 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. May 15 12:24:00.350489 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 15 12:24:00.350498 systemd[1]: modprobe@configfs.service: Deactivated successfully. May 15 12:24:00.350541 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. May 15 12:24:00.350573 systemd-journald[1264]: Collecting audit messages is disabled. May 15 12:24:00.350593 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 15 12:24:00.350604 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 15 12:24:00.350613 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 15 12:24:00.350622 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 15 12:24:00.350631 systemd-journald[1264]: Journal started May 15 12:24:00.350651 systemd-journald[1264]: Runtime Journal (/run/log/journal/32dcf38f4b5940ef9721d90a75293291) is 8M, max 159M, 151M free. May 15 12:23:59.951320 systemd[1]: Queued start job for default target multi-user.target. May 15 12:23:59.970160 systemd[1]: Unnecessary job was removed for dev-nvme0n1p6.device - /dev/nvme0n1p6. May 15 12:23:59.970445 systemd[1]: systemd-journald.service: Deactivated successfully. May 15 12:24:00.354109 systemd[1]: Started systemd-journald.service - Journal Service. 
May 15 12:24:00.379113 kernel: ACPI: bus type drm_connector registered May 15 12:24:00.357464 systemd[1]: modprobe@fuse.service: Deactivated successfully. May 15 12:24:00.357592 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. May 15 12:24:00.359995 systemd[1]: modprobe@loop.service: Deactivated successfully. May 15 12:24:00.360116 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 15 12:24:00.362520 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 15 12:24:00.367445 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. May 15 12:24:00.370480 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. May 15 12:24:00.373430 systemd[1]: modprobe@drm.service: Deactivated successfully. May 15 12:24:00.373963 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 15 12:24:00.381549 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. May 15 12:24:00.388655 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. May 15 12:24:00.394556 systemd[1]: Reached target network-pre.target - Preparation for Network. May 15 12:24:00.399986 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... May 15 12:24:00.402872 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... May 15 12:24:00.405241 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). May 15 12:24:00.405267 systemd[1]: Reached target local-fs.target - Local File Systems. May 15 12:24:00.408154 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. May 15 12:24:00.413617 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... 
May 15 12:24:00.421442 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 15 12:24:00.426215 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... May 15 12:24:00.429415 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... May 15 12:24:00.431546 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 15 12:24:00.433968 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... May 15 12:24:00.435687 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 15 12:24:00.437028 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... May 15 12:24:00.439616 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... May 15 12:24:00.448210 systemd[1]: Starting systemd-sysusers.service - Create System Users... May 15 12:24:00.458781 systemd-journald[1264]: Time spent on flushing to /var/log/journal/32dcf38f4b5940ef9721d90a75293291 is 19.152ms for 979 entries. May 15 12:24:00.458781 systemd-journald[1264]: System Journal (/var/log/journal/32dcf38f4b5940ef9721d90a75293291) is 8M, max 2.6G, 2.6G free. May 15 12:24:00.673525 systemd-journald[1264]: Received client request to flush runtime journal. May 15 12:24:00.673556 kernel: loop0: detected capacity change from 0 to 146240 May 15 12:24:00.451880 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 15 12:24:00.455516 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. May 15 12:24:00.458016 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. May 15 12:24:00.653856 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. 
May 15 12:24:00.656510 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. May 15 12:24:00.660092 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... May 15 12:24:00.673421 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 15 12:24:00.677374 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. May 15 12:24:01.448512 systemd[1]: Finished systemd-sysusers.service - Create System Users. May 15 12:24:01.451302 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 15 12:24:01.893443 systemd-tmpfiles[1333]: ACLs are not supported, ignoring. May 15 12:24:01.893456 systemd-tmpfiles[1333]: ACLs are not supported, ignoring. May 15 12:24:01.896341 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 15 12:24:02.197954 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher May 15 12:24:02.299568 kernel: loop1: detected capacity change from 0 to 205544 May 15 12:24:03.349950 kernel: loop2: detected capacity change from 0 to 28536 May 15 12:24:03.394387 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. May 15 12:24:03.396539 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. May 15 12:24:04.323520 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. May 15 12:24:04.329170 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 15 12:24:04.355566 systemd-udevd[1341]: Using default interface naming scheme 'v255'. May 15 12:24:04.440932 kernel: loop3: detected capacity change from 0 to 113872 May 15 12:24:04.814887 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 15 12:24:04.819445 systemd[1]: Starting systemd-networkd.service - Network Configuration... 
May 15 12:24:04.862564 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. May 15 12:24:04.902046 systemd[1]: Starting systemd-userdbd.service - User Database Manager... May 15 12:24:04.971839 systemd[1]: Started systemd-userdbd.service - User Database Manager. May 15 12:24:05.006934 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#60 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001 May 15 12:24:05.050984 kernel: mousedev: PS/2 mouse device common for all mice May 15 12:24:05.107942 kernel: hv_vmbus: registering driver hv_balloon May 15 12:24:05.107994 kernel: hv_vmbus: registering driver hyperv_fb May 15 12:24:05.111609 kernel: hv_balloon: Using Dynamic Memory protocol version 2.0 May 15 12:24:05.112031 kernel: hyperv_fb: Synthvid Version major 3, minor 5 May 15 12:24:05.114189 kernel: hyperv_fb: Screen resolution: 1024x768, Color depth: 32, Frame buffer size: 8388608 May 15 12:24:05.115447 kernel: Console: switching to colour dummy device 80x25 May 15 12:24:05.119394 kernel: Console: switching to colour frame buffer device 128x48 May 15 12:24:05.159537 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 15 12:24:05.169492 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 15 12:24:05.169654 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 15 12:24:05.172931 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 15 12:24:05.345440 systemd-networkd[1350]: lo: Link UP May 15 12:24:05.345446 systemd-networkd[1350]: lo: Gained carrier May 15 12:24:05.346719 systemd-networkd[1350]: Enumeration completed May 15 12:24:05.346787 systemd[1]: Started systemd-networkd.service - Network Configuration. May 15 12:24:05.349108 systemd-networkd[1350]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
May 15 12:24:05.349112 systemd-networkd[1350]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 15 12:24:05.349609 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... May 15 12:24:05.352932 kernel: mana 7870:00:00.0 enP30832s1: Configured vPort 0 PD 18 DB 16 May 15 12:24:05.353219 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... May 15 12:24:05.362958 kernel: mana 7870:00:00.0 enP30832s1: Configured steering vPort 0 entries 64 May 15 12:24:05.363193 kernel: hv_netvsc f8615163-0000-1000-2000-7c1e52fbce1f eth0: Data path switched to VF: enP30832s1 May 15 12:24:05.364971 systemd-networkd[1350]: enP30832s1: Link UP May 15 12:24:05.365031 systemd-networkd[1350]: eth0: Link UP May 15 12:24:05.365034 systemd-networkd[1350]: eth0: Gained carrier May 15 12:24:05.365048 systemd-networkd[1350]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 15 12:24:05.369082 systemd-networkd[1350]: enP30832s1: Gained carrier May 15 12:24:05.378933 systemd-networkd[1350]: eth0: DHCPv4 address 10.200.8.32/24, gateway 10.200.8.1 acquired from 168.63.129.16 May 15 12:24:05.557878 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. May 15 12:24:05.589631 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - MSFT NVMe Accelerator v1.0 OEM. May 15 12:24:05.599999 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... May 15 12:24:05.615969 kernel: kvm_intel: Using Hyper-V Enlightened VMCS May 15 12:24:05.756739 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. 
May 15 12:24:05.866942 kernel: loop4: detected capacity change from 0 to 146240 May 15 12:24:05.879928 kernel: loop5: detected capacity change from 0 to 205544 May 15 12:24:06.447945 kernel: loop6: detected capacity change from 0 to 28536 May 15 12:24:06.460929 kernel: loop7: detected capacity change from 0 to 113872 May 15 12:24:06.844854 (sd-merge)[1438]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-azure'. May 15 12:24:06.845397 (sd-merge)[1438]: Merged extensions into '/usr'. May 15 12:24:06.850095 systemd[1]: Reload requested from client PID 1319 ('systemd-sysext') (unit systemd-sysext.service)... May 15 12:24:06.850107 systemd[1]: Reloading... May 15 12:24:06.894931 zram_generator::config[1466]: No configuration found. May 15 12:24:06.976224 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 15 12:24:07.062984 systemd[1]: Reloading finished in 212 ms. May 15 12:24:07.094351 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. May 15 12:24:07.097207 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 15 12:24:07.108765 systemd[1]: Starting ensure-sysext.service... May 15 12:24:07.113030 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 15 12:24:07.123790 systemd[1]: Reload requested from client PID 1529 ('systemctl') (unit ensure-sysext.service)... May 15 12:24:07.123874 systemd[1]: Reloading... May 15 12:24:07.128872 systemd-tmpfiles[1530]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. May 15 12:24:07.128894 systemd-tmpfiles[1530]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. 
May 15 12:24:07.129322 systemd-tmpfiles[1530]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. May 15 12:24:07.129546 systemd-tmpfiles[1530]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. May 15 12:24:07.130172 systemd-tmpfiles[1530]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. May 15 12:24:07.130430 systemd-tmpfiles[1530]: ACLs are not supported, ignoring. May 15 12:24:07.130501 systemd-tmpfiles[1530]: ACLs are not supported, ignoring. May 15 12:24:07.166936 zram_generator::config[1558]: No configuration found. May 15 12:24:07.194237 systemd-tmpfiles[1530]: Detected autofs mount point /boot during canonicalization of boot. May 15 12:24:07.194245 systemd-tmpfiles[1530]: Skipping /boot May 15 12:24:07.201983 systemd-tmpfiles[1530]: Detected autofs mount point /boot during canonicalization of boot. May 15 12:24:07.201992 systemd-tmpfiles[1530]: Skipping /boot May 15 12:24:07.253396 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 15 12:24:07.329449 systemd[1]: Reloading finished in 205 ms. May 15 12:24:07.388009 systemd-networkd[1350]: eth0: Gained IPv6LL May 15 12:24:07.388248 systemd-networkd[1350]: enP30832s1: Gained IPv6LL May 15 12:24:07.408310 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. May 15 12:24:07.410653 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 15 12:24:07.417555 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 15 12:24:07.448683 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... May 15 12:24:07.453217 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... 
May 15 12:24:07.459816 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 15 12:24:07.464030 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... May 15 12:24:07.469879 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 15 12:24:07.470081 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 15 12:24:07.472134 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 15 12:24:07.479095 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 15 12:24:07.488181 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 15 12:24:07.493053 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 15 12:24:07.493250 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 15 12:24:07.493332 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 15 12:24:07.497427 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 15 12:24:07.497944 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 15 12:24:07.501144 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 15 12:24:07.501420 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 15 12:24:07.504537 systemd[1]: modprobe@loop.service: Deactivated successfully. May 15 12:24:07.504723 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. 
May 15 12:24:07.513299 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 15 12:24:07.514037 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 15 12:24:07.516103 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 15 12:24:07.520110 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 15 12:24:07.525122 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 15 12:24:07.527276 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 15 12:24:07.527386 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 15 12:24:07.527475 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 15 12:24:07.530155 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. May 15 12:24:07.533320 systemd[1]: modprobe@loop.service: Deactivated successfully. May 15 12:24:07.533449 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 15 12:24:07.536426 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 15 12:24:07.536519 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 15 12:24:07.540276 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 15 12:24:07.540392 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 15 12:24:07.549419 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). 
May 15 12:24:07.549600 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 15 12:24:07.551132 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 15 12:24:07.555450 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... May 15 12:24:07.558722 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 15 12:24:07.563127 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 15 12:24:07.565073 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 15 12:24:07.565177 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 15 12:24:07.565303 systemd[1]: Reached target time-set.target - System Time Set. May 15 12:24:07.567535 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 15 12:24:07.572963 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. May 15 12:24:07.577569 systemd[1]: Finished ensure-sysext.service. May 15 12:24:07.581255 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 15 12:24:07.581371 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 15 12:24:07.582870 systemd[1]: modprobe@drm.service: Deactivated successfully. May 15 12:24:07.583001 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 15 12:24:07.584570 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 15 12:24:07.584678 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. 
May 15 12:24:07.588121 systemd[1]: modprobe@loop.service: Deactivated successfully. May 15 12:24:07.588252 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 15 12:24:07.591383 systemd-resolved[1626]: Positive Trust Anchors: May 15 12:24:07.591393 systemd-resolved[1626]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 15 12:24:07.591425 systemd-resolved[1626]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 15 12:24:07.594406 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 15 12:24:07.594455 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 15 12:24:07.639574 systemd-resolved[1626]: Using system hostname 'ci-4334.0.0-a-81f65144c0'. May 15 12:24:07.745851 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 15 12:24:07.750048 systemd[1]: Reached target network.target - Network. May 15 12:24:07.751827 systemd[1]: Reached target network-online.target - Network is Online. May 15 12:24:07.753215 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 15 12:24:07.944573 augenrules[1670]: No rules May 15 12:24:07.945327 systemd[1]: audit-rules.service: Deactivated successfully. May 15 12:24:07.945502 systemd[1]: Finished audit-rules.service - Load Audit Rules. 
May 15 12:24:08.350767 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. May 15 12:24:08.354123 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). May 15 12:24:11.473975 ldconfig[1314]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. May 15 12:24:11.510109 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. May 15 12:24:11.514156 systemd[1]: Starting systemd-update-done.service - Update is Completed... May 15 12:24:11.530385 systemd[1]: Finished systemd-update-done.service - Update is Completed. May 15 12:24:11.531704 systemd[1]: Reached target sysinit.target - System Initialization. May 15 12:24:11.533148 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. May 15 12:24:11.536087 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. May 15 12:24:11.538973 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. May 15 12:24:11.542071 systemd[1]: Started logrotate.timer - Daily rotation of log files. May 15 12:24:11.545028 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. May 15 12:24:11.547979 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. May 15 12:24:11.549346 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). May 15 12:24:11.549373 systemd[1]: Reached target paths.target - Path Units. May 15 12:24:11.551971 systemd[1]: Reached target timers.target - Timer Units. May 15 12:24:11.644704 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. 
May 15 12:24:11.648038 systemd[1]: Starting docker.socket - Docker Socket for the API... May 15 12:24:11.651165 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). May 15 12:24:11.653031 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). May 15 12:24:11.656014 systemd[1]: Reached target ssh-access.target - SSH Access Available. May 15 12:24:11.665309 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. May 15 12:24:11.667002 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. May 15 12:24:11.670390 systemd[1]: Listening on docker.socket - Docker Socket for the API. May 15 12:24:11.673529 systemd[1]: Reached target sockets.target - Socket Units. May 15 12:24:11.676439 systemd[1]: Reached target basic.target - Basic System. May 15 12:24:11.678993 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. May 15 12:24:11.679018 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. May 15 12:24:11.681167 systemd[1]: Starting chronyd.service - NTP client/server... May 15 12:24:11.684995 systemd[1]: Starting containerd.service - containerd container runtime... May 15 12:24:11.701591 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... May 15 12:24:11.704998 systemd[1]: Starting dbus.service - D-Bus System Message Bus... May 15 12:24:11.709986 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... May 15 12:24:11.718184 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... May 15 12:24:11.721129 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... 
May 15 12:24:11.723385 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). May 15 12:24:11.726026 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... May 15 12:24:11.732998 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:24:11.733264 jq[1687]: false May 15 12:24:11.738038 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... May 15 12:24:11.743664 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... May 15 12:24:11.749451 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... May 15 12:24:11.755027 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... May 15 12:24:11.759785 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... May 15 12:24:11.767106 (chronyd)[1682]: chronyd.service: Referenced but unset environment variable evaluates to an empty string: OPTIONS May 15 12:24:11.773985 systemd[1]: Starting systemd-logind.service - User Login Management... May 15 12:24:11.778691 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). May 15 12:24:11.779079 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. May 15 12:24:11.780012 systemd[1]: Starting update-engine.service - Update Engine... May 15 12:24:11.787514 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... May 15 12:24:11.792495 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. May 15 12:24:11.792659 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. 
May 15 12:24:11.794222 systemd[1]: motdgen.service: Deactivated successfully. May 15 12:24:11.794383 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. May 15 12:24:11.798785 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. May 15 12:24:11.807048 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. May 15 12:24:11.812052 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. May 15 12:24:11.816510 jq[1711]: true May 15 12:24:11.825168 (ntainerd)[1716]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR May 15 12:24:11.827625 extend-filesystems[1688]: Found loop4 May 15 12:24:11.827625 extend-filesystems[1688]: Found loop5 May 15 12:24:11.827625 extend-filesystems[1688]: Found loop6 May 15 12:24:11.827625 extend-filesystems[1688]: Found loop7 May 15 12:24:11.827625 extend-filesystems[1688]: Found sr0 May 15 12:24:11.827625 extend-filesystems[1688]: Found nvme0n1 May 15 12:24:11.827625 extend-filesystems[1688]: Found nvme0n1p1 May 15 12:24:11.827625 extend-filesystems[1688]: Found nvme0n1p2 May 15 12:24:11.827625 extend-filesystems[1688]: Found nvme0n1p3 May 15 12:24:11.827625 extend-filesystems[1688]: Found usr May 15 12:24:11.827625 extend-filesystems[1688]: Found nvme0n1p4 May 15 12:24:11.827625 extend-filesystems[1688]: Found nvme0n1p6 May 15 12:24:11.827625 extend-filesystems[1688]: Found nvme0n1p7 May 15 12:24:11.827625 extend-filesystems[1688]: Found nvme0n1p9 May 15 12:24:11.827625 extend-filesystems[1688]: Checking size of /dev/nvme0n1p9 May 15 12:24:11.860248 chronyd[1745]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +ASYNCDNS +NTS +SECHASH +IPV6 -DEBUG) May 15 12:24:11.863199 jq[1717]: true May 15 12:24:11.862873 chronyd[1745]: Timezone right/UTC failed leap second check, ignoring May 15 12:24:11.863466 chronyd[1745]: Loaded seccomp filter 
(level 2) May 15 12:24:11.864510 systemd[1]: Started chronyd.service - NTP client/server. May 15 12:24:11.899663 google_oslogin_nss_cache[1689]: oslogin_cache_refresh[1689]: Refreshing passwd entry cache May 15 12:24:11.899868 oslogin_cache_refresh[1689]: Refreshing passwd entry cache May 15 12:24:11.910453 google_oslogin_nss_cache[1689]: oslogin_cache_refresh[1689]: Failure getting users, quitting May 15 12:24:11.910497 oslogin_cache_refresh[1689]: Failure getting users, quitting May 15 12:24:11.910541 google_oslogin_nss_cache[1689]: oslogin_cache_refresh[1689]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. May 15 12:24:11.910554 oslogin_cache_refresh[1689]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. May 15 12:24:11.910600 google_oslogin_nss_cache[1689]: oslogin_cache_refresh[1689]: Refreshing group entry cache May 15 12:24:11.910613 oslogin_cache_refresh[1689]: Refreshing group entry cache May 15 12:24:11.946395 google_oslogin_nss_cache[1689]: oslogin_cache_refresh[1689]: Failure getting groups, quitting May 15 12:24:11.946457 oslogin_cache_refresh[1689]: Failure getting groups, quitting May 15 12:24:11.946500 google_oslogin_nss_cache[1689]: oslogin_cache_refresh[1689]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. May 15 12:24:11.946522 oslogin_cache_refresh[1689]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. May 15 12:24:11.947432 systemd[1]: google-oslogin-cache.service: Deactivated successfully. May 15 12:24:11.947600 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. May 15 12:24:11.952288 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. May 15 12:24:12.016584 systemd-logind[1703]: New seat seat0. 
May 15 12:24:12.019253 update_engine[1709]: I20250515 12:24:12.018355 1709 main.cc:92] Flatcar Update Engine starting May 15 12:24:12.019138 systemd-logind[1703]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) May 15 12:24:12.019890 systemd[1]: Started systemd-logind.service - User Login Management. May 15 12:24:12.082646 sshd_keygen[1710]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 May 15 12:24:12.097152 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. May 15 12:24:12.100752 systemd[1]: Starting issuegen.service - Generate /run/issue... May 15 12:24:12.103017 systemd[1]: Starting waagent.service - Microsoft Azure Linux Agent... May 15 12:24:12.121152 systemd[1]: issuegen.service: Deactivated successfully. May 15 12:24:12.121629 systemd[1]: Finished issuegen.service - Generate /run/issue. May 15 12:24:12.124589 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... May 15 12:24:12.127199 systemd[1]: Started waagent.service - Microsoft Azure Linux Agent. May 15 12:24:12.143285 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. May 15 12:24:12.145716 systemd[1]: Started getty@tty1.service - Getty on tty1. May 15 12:24:12.148806 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. May 15 12:24:12.150655 systemd[1]: Reached target getty.target - Login Prompts. May 15 12:24:12.157753 tar[1714]: linux-amd64/helm May 15 12:24:12.162453 extend-filesystems[1688]: Old size kept for /dev/nvme0n1p9 May 15 12:24:12.164948 systemd[1]: extend-filesystems.service: Deactivated successfully. May 15 12:24:12.165136 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. May 15 12:24:12.477705 dbus-daemon[1685]: [system] SELinux support is enabled May 15 12:24:12.478146 systemd[1]: Started dbus.service - D-Bus System Message Bus. 
May 15 12:24:12.592632 update_engine[1709]: I20250515 12:24:12.480898 1709 update_check_scheduler.cc:74] Next update check in 5m4s May 15 12:24:12.483878 dbus-daemon[1685]: [system] Successfully activated service 'org.freedesktop.systemd1' May 15 12:24:12.483185 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). May 15 12:24:12.483206 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. May 15 12:24:12.486181 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). May 15 12:24:12.486200 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. May 15 12:24:12.491107 systemd[1]: Started update-engine.service - Update Engine. May 15 12:24:12.493495 systemd[1]: Started locksmithd.service - Cluster reboot manager. May 15 12:24:12.798240 bash[1743]: Updated "/home/core/.ssh/authorized_keys" May 15 12:24:12.798883 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. May 15 12:24:12.802444 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. 
May 15 12:24:12.816460 coreos-metadata[1684]: May 15 12:24:12.816 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1 May 15 12:24:12.821626 coreos-metadata[1684]: May 15 12:24:12.821 INFO Fetch successful May 15 12:24:12.822404 coreos-metadata[1684]: May 15 12:24:12.822 INFO Fetching http://168.63.129.16/machine/?comp=goalstate: Attempt #1 May 15 12:24:12.825552 coreos-metadata[1684]: May 15 12:24:12.825 INFO Fetch successful May 15 12:24:12.825552 coreos-metadata[1684]: May 15 12:24:12.825 INFO Fetching http://168.63.129.16/machine/665429ba-7293-4207-9aab-cd7264364dca/3c591639%2D9aa5%2D4a5e%2D8319%2D1382cfb71be5.%5Fci%2D4334.0.0%2Da%2D81f65144c0?comp=config&type=sharedConfig&incarnation=1: Attempt #1 May 15 12:24:12.827190 coreos-metadata[1684]: May 15 12:24:12.827 INFO Fetch successful May 15 12:24:12.829433 coreos-metadata[1684]: May 15 12:24:12.827 INFO Fetching http://169.254.169.254/metadata/instance/compute/vmSize?api-version=2017-08-01&format=text: Attempt #1 May 15 12:24:12.843013 coreos-metadata[1684]: May 15 12:24:12.842 INFO Fetch successful May 15 12:24:12.879466 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. May 15 12:24:12.881770 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. May 15 12:24:13.183817 tar[1714]: linux-amd64/LICENSE May 15 12:24:13.184048 tar[1714]: linux-amd64/README.md May 15 12:24:13.195260 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. May 15 12:24:13.470668 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 15 12:24:13.500758 locksmithd[1805]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" May 15 12:24:13.593400 (kubelet)[1825]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:24:14.055505 kubelet[1825]: E0515 12:24:14.055456 1825 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:24:14.056894 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:24:14.057026 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:24:14.057352 systemd[1]: kubelet.service: Consumed 764ms CPU time, 235.5M memory peak. May 15 12:24:14.465028 containerd[1716]: time="2025-05-15T12:24:14Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 May 15 12:24:14.465581 containerd[1716]: time="2025-05-15T12:24:14.465547424Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4 May 15 12:24:14.473342 containerd[1716]: time="2025-05-15T12:24:14.473310285Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="7.059µs" May 15 12:24:14.473342 containerd[1716]: time="2025-05-15T12:24:14.473333998Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 May 15 12:24:14.473429 containerd[1716]: time="2025-05-15T12:24:14.473350425Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 May 15 12:24:14.473489 
containerd[1716]: time="2025-05-15T12:24:14.473476351Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 May 15 12:24:14.473517 containerd[1716]: time="2025-05-15T12:24:14.473489315Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 May 15 12:24:14.473517 containerd[1716]: time="2025-05-15T12:24:14.473507882Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 15 12:24:14.473574 containerd[1716]: time="2025-05-15T12:24:14.473551244Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 15 12:24:14.473574 containerd[1716]: time="2025-05-15T12:24:14.473570752Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 15 12:24:14.473763 containerd[1716]: time="2025-05-15T12:24:14.473749933Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 15 12:24:14.473763 containerd[1716]: time="2025-05-15T12:24:14.473759356Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 15 12:24:14.473803 containerd[1716]: time="2025-05-15T12:24:14.473767727Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 15 12:24:14.473803 containerd[1716]: time="2025-05-15T12:24:14.473774618Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 May 15 12:24:14.473843 containerd[1716]: time="2025-05-15T12:24:14.473823422Z" level=info msg="loading 
plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 May 15 12:24:14.473998 containerd[1716]: time="2025-05-15T12:24:14.473984273Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 15 12:24:14.474023 containerd[1716]: time="2025-05-15T12:24:14.474005209Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 15 12:24:14.474023 containerd[1716]: time="2025-05-15T12:24:14.474013894Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 May 15 12:24:14.474061 containerd[1716]: time="2025-05-15T12:24:14.474053183Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 May 15 12:24:14.474310 containerd[1716]: time="2025-05-15T12:24:14.474286770Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 May 15 12:24:14.474368 containerd[1716]: time="2025-05-15T12:24:14.474351235Z" level=info msg="metadata content store policy set" policy=shared May 15 12:24:14.489280 containerd[1716]: time="2025-05-15T12:24:14.489252449Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 May 15 12:24:14.489386 containerd[1716]: time="2025-05-15T12:24:14.489373131Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 May 15 12:24:14.489534 containerd[1716]: time="2025-05-15T12:24:14.489523407Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 May 15 12:24:14.490007 containerd[1716]: time="2025-05-15T12:24:14.489573853Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 May 15 12:24:14.490007 containerd[1716]: 
time="2025-05-15T12:24:14.489589014Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 May 15 12:24:14.490007 containerd[1716]: time="2025-05-15T12:24:14.489599251Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 May 15 12:24:14.490007 containerd[1716]: time="2025-05-15T12:24:14.489611511Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 May 15 12:24:14.490007 containerd[1716]: time="2025-05-15T12:24:14.489621752Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 May 15 12:24:14.490007 containerd[1716]: time="2025-05-15T12:24:14.489633332Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 May 15 12:24:14.490007 containerd[1716]: time="2025-05-15T12:24:14.489642594Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 May 15 12:24:14.490007 containerd[1716]: time="2025-05-15T12:24:14.489650493Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 May 15 12:24:14.490007 containerd[1716]: time="2025-05-15T12:24:14.489662005Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 May 15 12:24:14.490007 containerd[1716]: time="2025-05-15T12:24:14.489756237Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 May 15 12:24:14.490007 containerd[1716]: time="2025-05-15T12:24:14.489771705Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 May 15 12:24:14.490007 containerd[1716]: time="2025-05-15T12:24:14.489791532Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 May 15 12:24:14.490007 containerd[1716]: 
time="2025-05-15T12:24:14.489802984Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 May 15 12:24:14.490007 containerd[1716]: time="2025-05-15T12:24:14.489812180Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 May 15 12:24:14.490328 containerd[1716]: time="2025-05-15T12:24:14.489820999Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 May 15 12:24:14.490328 containerd[1716]: time="2025-05-15T12:24:14.489831190Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 May 15 12:24:14.490328 containerd[1716]: time="2025-05-15T12:24:14.489841271Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 May 15 12:24:14.490328 containerd[1716]: time="2025-05-15T12:24:14.489851627Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 May 15 12:24:14.490328 containerd[1716]: time="2025-05-15T12:24:14.489859884Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 May 15 12:24:14.490328 containerd[1716]: time="2025-05-15T12:24:14.489873140Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 May 15 12:24:14.490328 containerd[1716]: time="2025-05-15T12:24:14.489950162Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" May 15 12:24:14.490328 containerd[1716]: time="2025-05-15T12:24:14.489963065Z" level=info msg="Start snapshots syncer" May 15 12:24:14.490328 containerd[1716]: time="2025-05-15T12:24:14.489980201Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 May 15 12:24:14.490493 containerd[1716]: time="2025-05-15T12:24:14.490432491Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" May 15 12:24:14.490493 containerd[1716]: time="2025-05-15T12:24:14.490478323Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 May 15 12:24:14.490604 containerd[1716]: time="2025-05-15T12:24:14.490548736Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 May 15 12:24:14.490675 containerd[1716]: time="2025-05-15T12:24:14.490663586Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 May 15 12:24:14.490699 containerd[1716]: time="2025-05-15T12:24:14.490686137Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 May 15 12:24:14.490720 containerd[1716]: time="2025-05-15T12:24:14.490697232Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 May 15 12:24:14.490802 containerd[1716]: time="2025-05-15T12:24:14.490707586Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 May 15 12:24:14.490802 containerd[1716]: time="2025-05-15T12:24:14.490760287Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 May 15 12:24:14.490802 containerd[1716]: time="2025-05-15T12:24:14.490771499Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 May 15 12:24:14.490802 containerd[1716]: time="2025-05-15T12:24:14.490781134Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 May 15 12:24:14.490876 containerd[1716]: time="2025-05-15T12:24:14.490814898Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 May 15 12:24:14.490876 containerd[1716]: time="2025-05-15T12:24:14.490825822Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 May 15 12:24:14.490876 containerd[1716]: time="2025-05-15T12:24:14.490835727Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 May 15 12:24:14.490876 containerd[1716]: time="2025-05-15T12:24:14.490862214Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 15 12:24:14.490962 containerd[1716]: time="2025-05-15T12:24:14.490882699Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 15 12:24:14.490962 containerd[1716]: time="2025-05-15T12:24:14.490891085Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 15 12:24:14.490962 containerd[1716]: time="2025-05-15T12:24:14.490899136Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 15 12:24:14.490962 containerd[1716]: time="2025-05-15T12:24:14.490905621Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 May 15 12:24:14.490962 containerd[1716]: time="2025-05-15T12:24:14.490929209Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 May 15 12:24:14.490962 containerd[1716]: time="2025-05-15T12:24:14.490938629Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 May 15 12:24:14.490962 containerd[1716]: time="2025-05-15T12:24:14.490948771Z" level=info msg="runtime interface created" May 15 12:24:14.490962 containerd[1716]: time="2025-05-15T12:24:14.490953419Z" level=info msg="created NRI interface" May 15 12:24:14.490962 containerd[1716]: time="2025-05-15T12:24:14.490960439Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 May 15 12:24:14.491110 containerd[1716]: time="2025-05-15T12:24:14.490971906Z" level=info msg="Connect containerd service" May 15 12:24:14.491110 containerd[1716]: time="2025-05-15T12:24:14.491007357Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" May 15 12:24:14.491706 
containerd[1716]: time="2025-05-15T12:24:14.491684990Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 15 12:24:15.661765 containerd[1716]: time="2025-05-15T12:24:15.661538493Z" level=info msg="Start subscribing containerd event" May 15 12:24:15.661765 containerd[1716]: time="2025-05-15T12:24:15.661609087Z" level=info msg="Start recovering state" May 15 12:24:15.661765 containerd[1716]: time="2025-05-15T12:24:15.661717104Z" level=info msg="Start event monitor" May 15 12:24:15.661765 containerd[1716]: time="2025-05-15T12:24:15.661732752Z" level=info msg="Start cni network conf syncer for default" May 15 12:24:15.662229 containerd[1716]: time="2025-05-15T12:24:15.661744070Z" level=info msg="Start streaming server" May 15 12:24:15.662229 containerd[1716]: time="2025-05-15T12:24:15.661874416Z" level=info msg="Registered namespace \"k8s.io\" with NRI" May 15 12:24:15.662229 containerd[1716]: time="2025-05-15T12:24:15.661882533Z" level=info msg="runtime interface starting up..." May 15 12:24:15.662229 containerd[1716]: time="2025-05-15T12:24:15.661893943Z" level=info msg="starting plugins..." May 15 12:24:15.662229 containerd[1716]: time="2025-05-15T12:24:15.661907420Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" May 15 12:24:15.664330 containerd[1716]: time="2025-05-15T12:24:15.661807474Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc May 15 12:24:15.664330 containerd[1716]: time="2025-05-15T12:24:15.662296684Z" level=info msg=serving... address=/run/containerd/containerd.sock May 15 12:24:15.664330 containerd[1716]: time="2025-05-15T12:24:15.662345149Z" level=info msg="containerd successfully booted in 1.197759s" May 15 12:24:15.662430 systemd[1]: Started containerd.service - containerd container runtime. 
May 15 12:24:15.664962 systemd[1]: Reached target multi-user.target - Multi-User System. May 15 12:24:15.667785 systemd[1]: Startup finished in 2.940s (kernel) + 19.899s (initrd) + 16.651s (userspace) = 39.490s. May 15 12:24:15.923655 waagent[1783]: 2025-05-15T12:24:15.923559Z INFO Daemon Daemon Azure Linux Agent Version: 2.12.0.4 May 15 12:24:15.931843 waagent[1783]: 2025-05-15T12:24:15.923810Z INFO Daemon Daemon OS: flatcar 4334.0.0 May 15 12:24:15.931843 waagent[1783]: 2025-05-15T12:24:15.924282Z INFO Daemon Daemon Python: 3.11.12 May 15 12:24:15.931843 waagent[1783]: 2025-05-15T12:24:15.924904Z INFO Daemon Daemon Run daemon May 15 12:24:15.931843 waagent[1783]: 2025-05-15T12:24:15.925094Z INFO Daemon Daemon No RDMA handler exists for distro='Flatcar Container Linux by Kinvolk' version='4334.0.0' May 15 12:24:15.931843 waagent[1783]: 2025-05-15T12:24:15.925247Z INFO Daemon Daemon Using waagent for provisioning May 15 12:24:15.931843 waagent[1783]: 2025-05-15T12:24:15.925381Z INFO Daemon Daemon Activate resource disk May 15 12:24:15.931843 waagent[1783]: 2025-05-15T12:24:15.925822Z INFO Daemon Daemon Searching gen1 prefix 00000000-0001 or gen2 f8b3781a-1e82-4818-a1c3-63d806ec15bb May 15 12:24:15.931843 waagent[1783]: 2025-05-15T12:24:15.927314Z INFO Daemon Daemon Found device: None May 15 12:24:15.931843 waagent[1783]: 2025-05-15T12:24:15.927449Z ERROR Daemon Daemon Failed to mount resource disk [ResourceDiskError] unable to detect disk topology May 15 12:24:15.931843 waagent[1783]: 2025-05-15T12:24:15.927500Z ERROR Daemon Daemon Event: name=WALinuxAgent, op=ActivateResourceDisk, message=[ResourceDiskError] unable to detect disk topology, duration=0 May 15 12:24:15.931843 waagent[1783]: 2025-05-15T12:24:15.927929Z INFO Daemon Daemon Clean protocol and wireserver endpoint May 15 12:24:15.931843 waagent[1783]: 2025-05-15T12:24:15.928262Z INFO Daemon Daemon Running default provisioning handler May 15 12:24:15.943595 waagent[1783]: 2025-05-15T12:24:15.943555Z INFO 
Daemon Daemon Unable to get cloud-init enabled status from systemctl: Command '['systemctl', 'is-enabled', 'cloud-init-local.service']' returned non-zero exit status 4. May 15 12:24:15.945934 waagent[1783]: 2025-05-15T12:24:15.945343Z INFO Daemon Daemon Unable to get cloud-init enabled status from service: [Errno 2] No such file or directory: 'service' May 15 12:24:15.945934 waagent[1783]: 2025-05-15T12:24:15.945458Z INFO Daemon Daemon cloud-init is enabled: False May 15 12:24:15.945934 waagent[1783]: 2025-05-15T12:24:15.945707Z INFO Daemon Daemon Copying ovf-env.xml May 15 12:24:15.985925 waagent[1783]: 2025-05-15T12:24:15.984651Z INFO Daemon Daemon Successfully mounted dvd May 15 12:24:15.995062 systemd[1]: mnt-cdrom-secure.mount: Deactivated successfully. May 15 12:24:15.996086 waagent[1783]: 2025-05-15T12:24:15.996050Z INFO Daemon Daemon Detect protocol endpoint May 15 12:24:15.998040 waagent[1783]: 2025-05-15T12:24:15.996203Z INFO Daemon Daemon Clean protocol and wireserver endpoint May 15 12:24:15.998040 waagent[1783]: 2025-05-15T12:24:15.996392Z INFO Daemon Daemon WireServer endpoint is not found. 
Rerun dhcp handler May 15 12:24:15.998040 waagent[1783]: 2025-05-15T12:24:15.996442Z INFO Daemon Daemon Test for route to 168.63.129.16 May 15 12:24:15.998040 waagent[1783]: 2025-05-15T12:24:15.996856Z INFO Daemon Daemon Route to 168.63.129.16 exists May 15 12:24:15.998040 waagent[1783]: 2025-05-15T12:24:15.997120Z INFO Daemon Daemon Wire server endpoint:168.63.129.16 May 15 12:24:16.007774 waagent[1783]: 2025-05-15T12:24:16.007734Z INFO Daemon Daemon Fabric preferred wire protocol version:2015-04-05 May 15 12:24:16.008444 waagent[1783]: 2025-05-15T12:24:16.008004Z INFO Daemon Daemon Wire protocol version:2012-11-30 May 15 12:24:16.008444 waagent[1783]: 2025-05-15T12:24:16.008245Z INFO Daemon Daemon Server preferred version:2015-04-05 May 15 12:24:16.147319 waagent[1783]: 2025-05-15T12:24:16.147266Z INFO Daemon Daemon Initializing goal state during protocol detection May 15 12:24:16.148324 waagent[1783]: 2025-05-15T12:24:16.147442Z INFO Daemon Daemon Forcing an update of the goal state. May 15 12:24:16.151726 waagent[1783]: 2025-05-15T12:24:16.151696Z INFO Daemon Fetched a new incarnation for the WireServer goal state [incarnation 1] May 15 12:24:16.174666 waagent[1783]: 2025-05-15T12:24:16.174611Z INFO Daemon Daemon HostGAPlugin version: 1.0.8.164 May 15 12:24:16.176370 waagent[1783]: 2025-05-15T12:24:16.175052Z INFO Daemon May 15 12:24:16.176370 waagent[1783]: 2025-05-15T12:24:16.175269Z INFO Daemon Fetched new vmSettings [HostGAPlugin correlation ID: 43ee4915-b12d-4bd6-bf9f-cbd1838e0c3e eTag: 15241132347928121620 source: Fabric] May 15 12:24:16.176370 waagent[1783]: 2025-05-15T12:24:16.175494Z INFO Daemon The vmSettings originated via Fabric; will ignore them. 
May 15 12:24:16.176370 waagent[1783]: 2025-05-15T12:24:16.175733Z INFO Daemon May 15 12:24:16.176370 waagent[1783]: 2025-05-15T12:24:16.175866Z INFO Daemon Fetching full goal state from the WireServer [incarnation 1] May 15 12:24:16.187335 waagent[1783]: 2025-05-15T12:24:16.187312Z INFO Daemon Daemon Downloading artifacts profile blob May 15 12:24:16.270865 waagent[1783]: 2025-05-15T12:24:16.270822Z INFO Daemon Downloaded certificate {'thumbprint': '449BD50715E601EEA45CF07A9805DE003608E905', 'hasPrivateKey': True} May 15 12:24:16.273108 waagent[1783]: 2025-05-15T12:24:16.273018Z INFO Daemon Fetch goal state completed May 15 12:24:16.279828 login[1786]: pam_lastlog(login:session): file /var/log/lastlog is locked/write, retrying May 15 12:24:16.281145 login[1787]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) May 15 12:24:16.284246 waagent[1783]: 2025-05-15T12:24:16.284209Z INFO Daemon Daemon Starting provisioning May 15 12:24:16.284565 waagent[1783]: 2025-05-15T12:24:16.284382Z INFO Daemon Daemon Handle ovf-env.xml. May 15 12:24:16.285706 waagent[1783]: 2025-05-15T12:24:16.284570Z INFO Daemon Daemon Set hostname [ci-4334.0.0-a-81f65144c0] May 15 12:24:16.297293 waagent[1783]: 2025-05-15T12:24:16.297251Z INFO Daemon Daemon Publish hostname [ci-4334.0.0-a-81f65144c0] May 15 12:24:16.298727 waagent[1783]: 2025-05-15T12:24:16.297504Z INFO Daemon Daemon Examine /proc/net/route for primary interface May 15 12:24:16.298727 waagent[1783]: 2025-05-15T12:24:16.297757Z INFO Daemon Daemon Primary interface is [eth0] May 15 12:24:16.304112 systemd-networkd[1350]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 15 12:24:16.304118 systemd-networkd[1350]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. 
May 15 12:24:16.304137 systemd-networkd[1350]: eth0: DHCP lease lost May 15 12:24:16.304840 waagent[1783]: 2025-05-15T12:24:16.304796Z INFO Daemon Daemon Create user account if not exists May 15 12:24:16.305812 waagent[1783]: 2025-05-15T12:24:16.305745Z INFO Daemon Daemon User core already exists, skip useradd May 15 12:24:16.306855 waagent[1783]: 2025-05-15T12:24:16.306530Z INFO Daemon Daemon Configure sudoer May 15 12:24:16.313265 systemd[1]: Created slice user-500.slice - User Slice of UID 500. May 15 12:24:16.313949 waagent[1783]: 2025-05-15T12:24:16.313693Z INFO Daemon Daemon Configure sshd May 15 12:24:16.315146 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... May 15 12:24:16.318758 waagent[1783]: 2025-05-15T12:24:16.318093Z INFO Daemon Daemon Added a configuration snippet disabling SSH password-based authentication methods. It also configures SSH client probing to keep connections alive. May 15 12:24:16.318758 waagent[1783]: 2025-05-15T12:24:16.318236Z INFO Daemon Daemon Deploy ssh public key. May 15 12:24:16.325567 systemd-logind[1703]: New session 2 of user core. May 15 12:24:16.328972 systemd-networkd[1350]: eth0: DHCPv4 address 10.200.8.32/24, gateway 10.200.8.1 acquired from 168.63.129.16 May 15 12:24:16.340579 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. May 15 12:24:16.342573 systemd[1]: Starting user@500.service - User Manager for UID 500... May 15 12:24:16.350653 (systemd)[1878]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) May 15 12:24:16.352303 systemd-logind[1703]: New session c1 of user core. May 15 12:24:16.504461 systemd[1878]: Queued start job for default target default.target. May 15 12:24:16.513504 systemd[1878]: Created slice app.slice - User Application Slice. May 15 12:24:16.513531 systemd[1878]: Reached target paths.target - Paths. May 15 12:24:16.513557 systemd[1878]: Reached target timers.target - Timers. 
May 15 12:24:16.514355 systemd[1878]: Starting dbus.socket - D-Bus User Message Bus Socket... May 15 12:24:16.520785 systemd[1878]: Listening on dbus.socket - D-Bus User Message Bus Socket. May 15 12:24:16.520826 systemd[1878]: Reached target sockets.target - Sockets. May 15 12:24:16.520858 systemd[1878]: Reached target basic.target - Basic System. May 15 12:24:16.520880 systemd[1878]: Reached target default.target - Main User Target. May 15 12:24:16.520901 systemd[1878]: Startup finished in 164ms. May 15 12:24:16.521172 systemd[1]: Started user@500.service - User Manager for UID 500. May 15 12:24:16.522124 systemd[1]: Started session-2.scope - Session 2 of User core. May 15 12:24:17.280220 login[1786]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) May 15 12:24:17.285433 systemd-logind[1703]: New session 1 of user core. May 15 12:24:17.291051 systemd[1]: Started session-1.scope - Session 1 of User core. May 15 12:24:17.376050 waagent[1783]: 2025-05-15T12:24:17.376017Z INFO Daemon Daemon Provisioning complete May 15 12:24:17.388198 waagent[1783]: 2025-05-15T12:24:17.388170Z INFO Daemon Daemon RDMA capabilities are not enabled, skipping May 15 12:24:17.389579 waagent[1783]: 2025-05-15T12:24:17.389549Z INFO Daemon Daemon End of log to /dev/console. The agent will now check for updates and then will process extensions. 
May 15 12:24:17.391361 waagent[1783]: 2025-05-15T12:24:17.391087Z INFO Daemon Daemon Installed Agent WALinuxAgent-2.12.0.4 is the most current agent May 15 12:24:17.480108 waagent[1906]: 2025-05-15T12:24:17.480046Z INFO ExtHandler ExtHandler Azure Linux Agent (Goal State Agent version 2.12.0.4) May 15 12:24:17.480333 waagent[1906]: 2025-05-15T12:24:17.480135Z INFO ExtHandler ExtHandler OS: flatcar 4334.0.0 May 15 12:24:17.480333 waagent[1906]: 2025-05-15T12:24:17.480172Z INFO ExtHandler ExtHandler Python: 3.11.12 May 15 12:24:17.480333 waagent[1906]: 2025-05-15T12:24:17.480208Z INFO ExtHandler ExtHandler CPU Arch: x86_64 May 15 12:24:17.904394 waagent[1906]: 2025-05-15T12:24:17.904345Z INFO ExtHandler ExtHandler Distro: flatcar-4334.0.0; OSUtil: FlatcarUtil; AgentService: waagent; Python: 3.11.12; Arch: x86_64; systemd: True; LISDrivers: Absent; logrotate: logrotate 3.22.0; May 15 12:24:17.904524 waagent[1906]: 2025-05-15T12:24:17.904503Z INFO ExtHandler ExtHandler WireServer endpoint 168.63.129.16 read from file May 15 12:24:17.904576 waagent[1906]: 2025-05-15T12:24:17.904551Z INFO ExtHandler ExtHandler Wire server endpoint:168.63.129.16 May 15 12:24:17.913250 waagent[1906]: 2025-05-15T12:24:17.913204Z INFO ExtHandler Fetched a new incarnation for the WireServer goal state [incarnation 1] May 15 12:24:17.926869 waagent[1906]: 2025-05-15T12:24:17.926839Z INFO ExtHandler ExtHandler HostGAPlugin version: 1.0.8.164 May 15 12:24:17.927209 waagent[1906]: 2025-05-15T12:24:17.927181Z INFO ExtHandler May 15 12:24:17.927247 waagent[1906]: 2025-05-15T12:24:17.927232Z INFO ExtHandler Fetched new vmSettings [HostGAPlugin correlation ID: 7a900ae5-3d88-4288-b1b2-5bbcc6bfc1c4 eTag: 15241132347928121620 source: Fabric] May 15 12:24:17.927426 waagent[1906]: 2025-05-15T12:24:17.927407Z INFO ExtHandler The vmSettings originated via Fabric; will ignore them. 
May 15 12:24:17.927722 waagent[1906]: 2025-05-15T12:24:17.927700Z INFO ExtHandler May 15 12:24:17.927752 waagent[1906]: 2025-05-15T12:24:17.927735Z INFO ExtHandler Fetching full goal state from the WireServer [incarnation 1] May 15 12:24:17.931550 waagent[1906]: 2025-05-15T12:24:17.931525Z INFO ExtHandler ExtHandler Downloading artifacts profile blob May 15 12:24:18.198372 waagent[1906]: 2025-05-15T12:24:18.198262Z INFO ExtHandler Downloaded certificate {'thumbprint': '449BD50715E601EEA45CF07A9805DE003608E905', 'hasPrivateKey': True} May 15 12:24:18.198803 waagent[1906]: 2025-05-15T12:24:18.198766Z INFO ExtHandler Fetch goal state completed May 15 12:24:18.209347 waagent[1906]: 2025-05-15T12:24:18.209302Z INFO ExtHandler ExtHandler OpenSSL version: OpenSSL 3.3.3 11 Feb 2025 (Library: OpenSSL 3.3.3 11 Feb 2025) May 15 12:24:18.213262 waagent[1906]: 2025-05-15T12:24:18.213224Z INFO ExtHandler ExtHandler WALinuxAgent-2.12.0.4 running as process 1906 May 15 12:24:18.213358 waagent[1906]: 2025-05-15T12:24:18.213338Z INFO ExtHandler ExtHandler ******** AutoUpdate.Enabled is set to False, not processing the operation ******** May 15 12:24:18.213573 waagent[1906]: 2025-05-15T12:24:18.213554Z INFO ExtHandler ExtHandler ******** AutoUpdate.UpdateToLatestVersion is set to False, not processing the operation ******** May 15 12:24:18.214484 waagent[1906]: 2025-05-15T12:24:18.214457Z INFO ExtHandler ExtHandler [CGI] Cgroup monitoring is not supported on ['flatcar', '4334.0.0', '', 'Flatcar Container Linux by Kinvolk'] May 15 12:24:18.214740 waagent[1906]: 2025-05-15T12:24:18.214717Z INFO ExtHandler ExtHandler [CGI] Agent will reset the quotas in case distro: ['flatcar', '4334.0.0', '', 'Flatcar Container Linux by Kinvolk'] went from supported to unsupported May 15 12:24:18.214843 waagent[1906]: 2025-05-15T12:24:18.214825Z INFO ExtHandler ExtHandler [CGI] Agent cgroups enabled: False May 15 12:24:18.215220 waagent[1906]: 2025-05-15T12:24:18.215198Z INFO ExtHandler ExtHandler 
Starting setup for Persistent firewall rules May 15 12:24:18.275551 waagent[1906]: 2025-05-15T12:24:18.275529Z INFO ExtHandler ExtHandler Firewalld service not running/unavailable, trying to set up waagent-network-setup.service May 15 12:24:18.275656 waagent[1906]: 2025-05-15T12:24:18.275639Z INFO ExtHandler ExtHandler Successfully updated the Binary file /var/lib/waagent/waagent-network-setup.py for firewall setup May 15 12:24:18.280247 waagent[1906]: 2025-05-15T12:24:18.280129Z INFO ExtHandler ExtHandler Service: waagent-network-setup.service not enabled. Adding it now May 15 12:24:18.284898 systemd[1]: Reload requested from client PID 1921 ('systemctl') (unit waagent.service)... May 15 12:24:18.284926 systemd[1]: Reloading... May 15 12:24:18.356933 zram_generator::config[1962]: No configuration found. May 15 12:24:18.455316 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 15 12:24:18.536115 systemd[1]: Reloading finished in 250 ms. May 15 12:24:18.555982 waagent[1906]: 2025-05-15T12:24:18.553995Z INFO ExtHandler ExtHandler Successfully added and enabled the waagent-network-setup.service May 15 12:24:18.555982 waagent[1906]: 2025-05-15T12:24:18.554077Z INFO ExtHandler ExtHandler Persistent firewall rules setup successfully May 15 12:24:20.376790 waagent[1906]: 2025-05-15T12:24:20.376704Z INFO ExtHandler ExtHandler DROP rule is not available which implies no firewall rules are set yet. Environment thread will set it up. May 15 12:24:20.377179 waagent[1906]: 2025-05-15T12:24:20.377133Z INFO ExtHandler ExtHandler Checking if log collection is allowed at this time [False]. All three conditions must be met: 1. configuration enabled [True], 2. cgroups v1 enabled [False] OR cgroups v2 is in use and v2 resource limiting configuration enabled [False], 3. 
python supported: [True] May 15 12:24:20.378001 waagent[1906]: 2025-05-15T12:24:20.377958Z INFO ExtHandler ExtHandler Starting env monitor service. May 15 12:24:20.378607 waagent[1906]: 2025-05-15T12:24:20.378569Z INFO EnvHandler ExtHandler WireServer endpoint 168.63.129.16 read from file May 15 12:24:20.378692 waagent[1906]: 2025-05-15T12:24:20.378613Z INFO ExtHandler ExtHandler Start SendTelemetryHandler service. May 15 12:24:20.378734 waagent[1906]: 2025-05-15T12:24:20.378715Z INFO EnvHandler ExtHandler Wire server endpoint:168.63.129.16 May 15 12:24:20.378893 waagent[1906]: 2025-05-15T12:24:20.378865Z INFO EnvHandler ExtHandler Configure routes May 15 12:24:20.378994 waagent[1906]: 2025-05-15T12:24:20.378956Z INFO EnvHandler ExtHandler Gateway:None May 15 12:24:20.379052 waagent[1906]: 2025-05-15T12:24:20.379028Z INFO EnvHandler ExtHandler Routes:None May 15 12:24:20.379248 waagent[1906]: 2025-05-15T12:24:20.379208Z INFO MonitorHandler ExtHandler WireServer endpoint 168.63.129.16 read from file May 15 12:24:20.379544 waagent[1906]: 2025-05-15T12:24:20.379522Z INFO SendTelemetryHandler ExtHandler Successfully started the SendTelemetryHandler thread May 15 12:24:20.379762 waagent[1906]: 2025-05-15T12:24:20.379744Z INFO ExtHandler ExtHandler Start Extension Telemetry service. May 15 12:24:20.380028 waagent[1906]: 2025-05-15T12:24:20.379954Z INFO MonitorHandler ExtHandler Wire server endpoint:168.63.129.16 May 15 12:24:20.380155 waagent[1906]: 2025-05-15T12:24:20.380137Z INFO TelemetryEventsCollector ExtHandler Extension Telemetry pipeline enabled: True May 15 12:24:20.380331 waagent[1906]: 2025-05-15T12:24:20.380310Z INFO MonitorHandler ExtHandler Monitor.NetworkConfigurationChanges is disabled. 
May 15 12:24:20.380625 waagent[1906]: 2025-05-15T12:24:20.380565Z INFO TelemetryEventsCollector ExtHandler Successfully started the TelemetryEventsCollector thread May 15 12:24:20.380664 waagent[1906]: 2025-05-15T12:24:20.380644Z INFO ExtHandler ExtHandler Goal State Period: 6 sec. This indicates how often the agent checks for new goal states and reports status. May 15 12:24:20.381600 waagent[1906]: 2025-05-15T12:24:20.381566Z INFO MonitorHandler ExtHandler Routing table from /proc/net/route: May 15 12:24:20.381600 waagent[1906]: Iface Destination Gateway Flags RefCnt Use Metric Mask MTU Window IRTT May 15 12:24:20.381600 waagent[1906]: eth0 00000000 0108C80A 0003 0 0 1024 00000000 0 0 0 May 15 12:24:20.381600 waagent[1906]: eth0 0008C80A 00000000 0001 0 0 1024 00FFFFFF 0 0 0 May 15 12:24:20.381600 waagent[1906]: eth0 0108C80A 00000000 0005 0 0 1024 FFFFFFFF 0 0 0 May 15 12:24:20.381600 waagent[1906]: eth0 10813FA8 0108C80A 0007 0 0 1024 FFFFFFFF 0 0 0 May 15 12:24:20.381600 waagent[1906]: eth0 FEA9FEA9 0108C80A 0007 0 0 1024 FFFFFFFF 0 0 0 May 15 12:24:20.437615 waagent[1906]: 2025-05-15T12:24:20.437587Z INFO ExtHandler ExtHandler May 15 12:24:20.437673 waagent[1906]: 2025-05-15T12:24:20.437638Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState started [incarnation_1 channel: WireServer source: Fabric activity: f22c63a2-1954-4f63-9ce8-fdc4f2d41152 correlation 04b72791-5413-405d-941f-75494aeb3af8 created: 2025-05-15T12:22:59.836914Z] May 15 12:24:20.437882 waagent[1906]: 2025-05-15T12:24:20.437864Z INFO ExtHandler ExtHandler No extension handlers found, not processing anything. 
May 15 12:24:20.438229 waagent[1906]: 2025-05-15T12:24:20.438209Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState completed [incarnation_1 0 ms]
May 15 12:24:20.485847 waagent[1906]: 2025-05-15T12:24:20.485807Z INFO MonitorHandler ExtHandler Network interfaces:
May 15 12:24:20.485847 waagent[1906]: Executing ['ip', '-a', '-o', 'link']:
May 15 12:24:20.485847 waagent[1906]: 1: lo: mtu 65536 qdisc noqueue state UNKNOWN mode DEFAULT group default qlen 1000\ link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
May 15 12:24:20.485847 waagent[1906]: 2: eth0: mtu 1500 qdisc mq state UP mode DEFAULT group default qlen 1000\ link/ether 7c:1e:52:fb:ce:1f brd ff:ff:ff:ff:ff:ff\ alias Network Device
May 15 12:24:20.485847 waagent[1906]: 3: enP30832s1: mtu 1500 qdisc mq master eth0 state UP mode DEFAULT group default qlen 1000\ link/ether 7c:1e:52:fb:ce:1f brd ff:ff:ff:ff:ff:ff\ altname enP30832p0s0
May 15 12:24:20.485847 waagent[1906]: Executing ['ip', '-4', '-a', '-o', 'address']:
May 15 12:24:20.485847 waagent[1906]: 1: lo inet 127.0.0.1/8 scope host lo\ valid_lft forever preferred_lft forever
May 15 12:24:20.485847 waagent[1906]: 2: eth0 inet 10.200.8.32/24 metric 1024 brd 10.200.8.255 scope global eth0\ valid_lft forever preferred_lft forever
May 15 12:24:20.485847 waagent[1906]: Executing ['ip', '-6', '-a', '-o', 'address']:
May 15 12:24:20.485847 waagent[1906]: 1: lo inet6 ::1/128 scope host noprefixroute \ valid_lft forever preferred_lft forever
May 15 12:24:20.485847 waagent[1906]: 2: eth0 inet6 fe80::7e1e:52ff:fefb:ce1f/64 scope link proto kernel_ll \ valid_lft forever preferred_lft forever
May 15 12:24:20.485847 waagent[1906]: 3: enP30832s1 inet6 fe80::7e1e:52ff:fefb:ce1f/64 scope link proto kernel_ll \ valid_lft forever preferred_lft forever
May 15 12:24:20.569377 waagent[1906]: 2025-05-15T12:24:20.569339Z WARNING ExtHandler ExtHandler Failed to get firewall packets: 'iptables -w -t security -L OUTPUT --zero OUTPUT -nxv' failed: 2 (iptables v1.8.11 (nf_tables): Illegal option `--numeric' with this command
May 15 12:24:20.569377 waagent[1906]: Try `iptables -h' or 'iptables --help' for more information.)
May 15 12:24:20.570240 waagent[1906]: 2025-05-15T12:24:20.570207Z INFO ExtHandler ExtHandler [HEARTBEAT] Agent WALinuxAgent-2.12.0.4 is running as the goal state agent [DEBUG HeartbeatCounter: 0;HeartbeatId: 10A0813A-196F-4B17-9A39-90BF8A499F58;DroppedPackets: -1;UpdateGSErrors: 0;AutoUpdate: 0;UpdateMode: SelfUpdate;]
May 15 12:24:20.584151 waagent[1906]: 2025-05-15T12:24:20.584115Z INFO EnvHandler ExtHandler Created firewall rules for the Azure Fabric:
May 15 12:24:20.584151 waagent[1906]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes)
May 15 12:24:20.584151 waagent[1906]: pkts bytes target prot opt in out source destination
May 15 12:24:20.584151 waagent[1906]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes)
May 15 12:24:20.584151 waagent[1906]: pkts bytes target prot opt in out source destination
May 15 12:24:20.584151 waagent[1906]: Chain OUTPUT (policy ACCEPT 2 packets, 112 bytes)
May 15 12:24:20.584151 waagent[1906]: pkts bytes target prot opt in out source destination
May 15 12:24:20.584151 waagent[1906]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53
May 15 12:24:20.584151 waagent[1906]: 5 468 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0
May 15 12:24:20.584151 waagent[1906]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW
May 15 12:24:20.586644 waagent[1906]: 2025-05-15T12:24:20.586607Z INFO EnvHandler ExtHandler Current Firewall rules:
May 15 12:24:20.586644 waagent[1906]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes)
May 15 12:24:20.586644 waagent[1906]: pkts bytes target prot opt in out source destination
May 15 12:24:20.586644 waagent[1906]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes)
May 15 12:24:20.586644 waagent[1906]: pkts bytes target prot opt in out source destination
May 15 12:24:20.586644 waagent[1906]: Chain OUTPUT (policy ACCEPT 2 packets, 112 bytes)
May 15 12:24:20.586644 waagent[1906]: pkts bytes target prot opt in out source destination
May 15 12:24:20.586644 waagent[1906]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53
May 15 12:24:20.586644 waagent[1906]: 7 704 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0
May 15 12:24:20.586644 waagent[1906]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW
May 15 12:24:24.209907 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
May 15 12:24:24.211783 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 15 12:24:30.182822 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 15 12:24:30.185837 (kubelet)[2058]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 15 12:24:30.218310 kubelet[2058]: E0515 12:24:30.218269 2058 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 15 12:24:30.220662 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 15 12:24:30.220777 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 15 12:24:30.221098 systemd[1]: kubelet.service: Consumed 118ms CPU time, 98M memory peak.
May 15 12:24:35.645369 chronyd[1745]: Selected source PHC0
May 15 12:24:40.460005 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
May 15 12:24:40.461762 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 15 12:24:43.785754 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 15 12:24:43.790165 (kubelet)[2076]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 15 12:24:43.823555 kubelet[2076]: E0515 12:24:43.823513 2076 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 15 12:24:43.825116 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 15 12:24:43.825232 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 15 12:24:43.825511 systemd[1]: kubelet.service: Consumed 108ms CPU time, 96.3M memory peak.
May 15 12:24:50.089016 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
May 15 12:24:50.090225 systemd[1]: Started sshd@0-10.200.8.32:22-10.200.16.10:55384.service - OpenSSH per-connection server daemon (10.200.16.10:55384).
May 15 12:24:50.769719 sshd[2084]: Accepted publickey for core from 10.200.16.10 port 55384 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:24:50.770999 sshd-session[2084]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:24:50.775262 systemd-logind[1703]: New session 3 of user core.
May 15 12:24:50.784032 systemd[1]: Started session-3.scope - Session 3 of User core.
May 15 12:24:51.346098 systemd[1]: Started sshd@1-10.200.8.32:22-10.200.16.10:45234.service - OpenSSH per-connection server daemon (10.200.16.10:45234).
May 15 12:24:51.984333 sshd[2089]: Accepted publickey for core from 10.200.16.10 port 45234 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:24:51.985551 sshd-session[2089]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:24:51.989709 systemd-logind[1703]: New session 4 of user core.
May 15 12:24:51.997036 systemd[1]: Started session-4.scope - Session 4 of User core.
May 15 12:24:52.436639 sshd[2091]: Connection closed by 10.200.16.10 port 45234
May 15 12:24:52.437384 sshd-session[2089]: pam_unix(sshd:session): session closed for user core
May 15 12:24:52.440408 systemd[1]: sshd@1-10.200.8.32:22-10.200.16.10:45234.service: Deactivated successfully.
May 15 12:24:52.441795 systemd[1]: session-4.scope: Deactivated successfully.
May 15 12:24:52.442438 systemd-logind[1703]: Session 4 logged out. Waiting for processes to exit.
May 15 12:24:52.443439 systemd-logind[1703]: Removed session 4.
May 15 12:24:52.683155 systemd[1]: Started sshd@2-10.200.8.32:22-10.200.16.10:45242.service - OpenSSH per-connection server daemon (10.200.16.10:45242).
May 15 12:24:53.256617 kernel: hv_balloon: Max. dynamic memory size: 8192 MB
May 15 12:24:53.320090 sshd[2097]: Accepted publickey for core from 10.200.16.10 port 45242 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:24:53.321301 sshd-session[2097]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:24:53.325655 systemd-logind[1703]: New session 5 of user core.
May 15 12:24:53.335034 systemd[1]: Started session-5.scope - Session 5 of User core.
May 15 12:24:53.768314 sshd[2099]: Connection closed by 10.200.16.10 port 45242
May 15 12:24:53.768800 sshd-session[2097]: pam_unix(sshd:session): session closed for user core
May 15 12:24:53.771903 systemd[1]: sshd@2-10.200.8.32:22-10.200.16.10:45242.service: Deactivated successfully.
May 15 12:24:53.773328 systemd[1]: session-5.scope: Deactivated successfully.
May 15 12:24:53.773862 systemd-logind[1703]: Session 5 logged out. Waiting for processes to exit.
May 15 12:24:53.774905 systemd-logind[1703]: Removed session 5.
May 15 12:24:53.879466 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
May 15 12:24:53.880679 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 15 12:24:53.883118 systemd[1]: Started sshd@3-10.200.8.32:22-10.200.16.10:45258.service - OpenSSH per-connection server daemon (10.200.16.10:45258).
May 15 12:24:54.519464 sshd[2106]: Accepted publickey for core from 10.200.16.10 port 45258 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:24:55.193018 sshd-session[2106]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:24:55.198519 systemd-logind[1703]: New session 6 of user core.
May 15 12:24:55.205353 systemd[1]: Started session-6.scope - Session 6 of User core.
May 15 12:24:55.558643 sshd[2110]: Connection closed by 10.200.16.10 port 45258
May 15 12:24:55.559141 sshd-session[2106]: pam_unix(sshd:session): session closed for user core
May 15 12:24:55.562272 systemd[1]: sshd@3-10.200.8.32:22-10.200.16.10:45258.service: Deactivated successfully.
May 15 12:24:55.563592 systemd[1]: session-6.scope: Deactivated successfully.
May 15 12:24:55.564236 systemd-logind[1703]: Session 6 logged out. Waiting for processes to exit.
May 15 12:24:55.565151 systemd-logind[1703]: Removed session 6.
May 15 12:24:55.673732 systemd[1]: Started sshd@4-10.200.8.32:22-10.200.16.10:45266.service - OpenSSH per-connection server daemon (10.200.16.10:45266).
May 15 12:24:56.311021 sshd[2116]: Accepted publickey for core from 10.200.16.10 port 45266 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:24:56.312223 sshd-session[2116]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:24:56.316407 systemd-logind[1703]: New session 7 of user core.
May 15 12:24:56.323057 systemd[1]: Started session-7.scope - Session 7 of User core.
May 15 12:24:57.538738 sudo[2119]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
May 15 12:24:57.538967 sudo[2119]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 15 12:24:57.551728 sudo[2119]: pam_unix(sudo:session): session closed for user root
May 15 12:24:57.655093 sshd[2118]: Connection closed by 10.200.16.10 port 45266
May 15 12:24:57.655576 sshd-session[2116]: pam_unix(sshd:session): session closed for user core
May 15 12:24:57.657997 systemd[1]: sshd@4-10.200.8.32:22-10.200.16.10:45266.service: Deactivated successfully.
May 15 12:24:57.659271 systemd[1]: session-7.scope: Deactivated successfully.
May 15 12:24:57.660810 systemd-logind[1703]: Session 7 logged out. Waiting for processes to exit.
May 15 12:24:57.661616 systemd-logind[1703]: Removed session 7.
May 15 12:24:57.773996 systemd[1]: Started sshd@5-10.200.8.32:22-10.200.16.10:45278.service - OpenSSH per-connection server daemon (10.200.16.10:45278).
May 15 12:24:57.996563 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 15 12:24:58.008139 (kubelet)[2131]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 15 12:24:58.036824 kubelet[2131]: E0515 12:24:58.036794 2131 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 15 12:24:58.037667 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 15 12:24:58.037780 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 15 12:24:58.038046 systemd[1]: kubelet.service: Consumed 111ms CPU time, 95.5M memory peak.
May 15 12:24:58.177522 update_engine[1709]: I20250515 12:24:58.177460 1709 update_attempter.cc:509] Updating boot flags...
May 15 12:24:58.424438 sshd[2125]: Accepted publickey for core from 10.200.16.10 port 45278 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:24:58.425481 sshd-session[2125]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:24:58.428956 systemd-logind[1703]: New session 8 of user core.
May 15 12:24:58.438048 systemd[1]: Started session-8.scope - Session 8 of User core.
May 15 12:24:58.770390 sudo[2175]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
May 15 12:24:58.770574 sudo[2175]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 15 12:24:58.775800 sudo[2175]: pam_unix(sudo:session): session closed for user root
May 15 12:24:58.779018 sudo[2174]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
May 15 12:24:58.779200 sudo[2174]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 15 12:24:58.785271 systemd[1]: Starting audit-rules.service - Load Audit Rules...
May 15 12:24:58.811582 augenrules[2197]: No rules
May 15 12:24:58.812456 systemd[1]: audit-rules.service: Deactivated successfully.
May 15 12:24:58.812606 systemd[1]: Finished audit-rules.service - Load Audit Rules.
May 15 12:24:58.813411 sudo[2174]: pam_unix(sudo:session): session closed for user root
May 15 12:24:58.915741 sshd[2161]: Connection closed by 10.200.16.10 port 45278
May 15 12:24:58.916188 sshd-session[2125]: pam_unix(sshd:session): session closed for user core
May 15 12:24:58.918578 systemd[1]: sshd@5-10.200.8.32:22-10.200.16.10:45278.service: Deactivated successfully.
May 15 12:24:58.919863 systemd[1]: session-8.scope: Deactivated successfully.
May 15 12:24:58.921248 systemd-logind[1703]: Session 8 logged out. Waiting for processes to exit.
May 15 12:24:58.921906 systemd-logind[1703]: Removed session 8.
May 15 12:24:59.033859 systemd[1]: Started sshd@6-10.200.8.32:22-10.200.16.10:36626.service - OpenSSH per-connection server daemon (10.200.16.10:36626).
May 15 12:24:59.671045 sshd[2206]: Accepted publickey for core from 10.200.16.10 port 36626 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:24:59.672230 sshd-session[2206]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:24:59.676235 systemd-logind[1703]: New session 9 of user core.
May 15 12:24:59.685049 systemd[1]: Started session-9.scope - Session 9 of User core.
May 15 12:25:00.018340 sudo[2209]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
May 15 12:25:00.018529 sudo[2209]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 15 12:25:01.222469 systemd[1]: Starting docker.service - Docker Application Container Engine...
May 15 12:25:01.231216 (dockerd)[2226]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
May 15 12:25:02.438395 dockerd[2226]: time="2025-05-15T12:25:02.438338849Z" level=info msg="Starting up"
May 15 12:25:02.439359 dockerd[2226]: time="2025-05-15T12:25:02.439333001Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
May 15 12:25:03.362160 dockerd[2226]: time="2025-05-15T12:25:03.362116658Z" level=info msg="Loading containers: start."
May 15 12:25:03.404931 kernel: Initializing XFRM netlink socket
May 15 12:25:03.865688 systemd-networkd[1350]: docker0: Link UP
May 15 12:25:03.933487 dockerd[2226]: time="2025-05-15T12:25:03.933462321Z" level=info msg="Loading containers: done."
May 15 12:25:03.943121 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck2685055321-merged.mount: Deactivated successfully.
May 15 12:25:05.086849 dockerd[2226]: time="2025-05-15T12:25:05.086791621Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
May 15 12:25:05.087292 dockerd[2226]: time="2025-05-15T12:25:05.086940719Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1
May 15 12:25:05.087292 dockerd[2226]: time="2025-05-15T12:25:05.087067523Z" level=info msg="Initializing buildkit"
May 15 12:25:05.537672 dockerd[2226]: time="2025-05-15T12:25:05.537628975Z" level=info msg="Completed buildkit initialization"
May 15 12:25:05.544040 dockerd[2226]: time="2025-05-15T12:25:05.543989394Z" level=info msg="Daemon has completed initialization"
May 15 12:25:05.544703 dockerd[2226]: time="2025-05-15T12:25:05.544098102Z" level=info msg="API listen on /run/docker.sock"
May 15 12:25:05.544207 systemd[1]: Started docker.service - Docker Application Container Engine.
May 15 12:25:06.837246 containerd[1716]: time="2025-05-15T12:25:06.837206328Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.8\""
May 15 12:25:08.209567 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
May 15 12:25:08.211119 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 15 12:25:12.373783 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 15 12:25:12.382102 (kubelet)[2434]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 15 12:25:12.409917 kubelet[2434]: E0515 12:25:12.409883 2434 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 15 12:25:12.410726 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 15 12:25:12.410835 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 15 12:25:12.411084 systemd[1]: kubelet.service: Consumed 107ms CPU time, 93.4M memory peak.
May 15 12:25:13.905820 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3168674658.mount: Deactivated successfully.
May 15 12:25:22.459691 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5.
May 15 12:25:22.461163 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 15 12:25:26.708856 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 15 12:25:26.713198 (kubelet)[2487]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 15 12:25:26.741393 kubelet[2487]: E0515 12:25:26.741360 2487 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 15 12:25:26.742729 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 15 12:25:26.742843 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 15 12:25:26.743126 systemd[1]: kubelet.service: Consumed 109ms CPU time, 95.5M memory peak.
May 15 12:25:30.135617 containerd[1716]: time="2025-05-15T12:25:30.135559837Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 15 12:25:30.494129 containerd[1716]: time="2025-05-15T12:25:30.494068307Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.8: active requests=0, bytes read=27960995"
May 15 12:25:30.542199 containerd[1716]: time="2025-05-15T12:25:30.542133887Z" level=info msg="ImageCreate event name:\"sha256:e6d208e868a9ca7f89efcb0d5bddc55a62df551cb4fb39c5099a2fe7b0e33adc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 15 12:25:30.546068 containerd[1716]: time="2025-05-15T12:25:30.546018902Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:30090db6a7d53799163ce82dae9e8ddb645fd47db93f2ec9da0cc787fd825625\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 15 12:25:30.546921 containerd[1716]: time="2025-05-15T12:25:30.546875001Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.8\" with image id \"sha256:e6d208e868a9ca7f89efcb0d5bddc55a62df551cb4fb39c5099a2fe7b0e33adc\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.8\", repo digest \"registry.k8s.io/kube-apiserver@sha256:30090db6a7d53799163ce82dae9e8ddb645fd47db93f2ec9da0cc787fd825625\", size \"27957787\" in 23.709635889s"
May 15 12:25:30.547086 containerd[1716]: time="2025-05-15T12:25:30.546905331Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.8\" returns image reference \"sha256:e6d208e868a9ca7f89efcb0d5bddc55a62df551cb4fb39c5099a2fe7b0e33adc\""
May 15 12:25:30.548481 containerd[1716]: time="2025-05-15T12:25:30.548461517Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.8\""
May 15 12:25:36.959850 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 6.
May 15 12:25:36.961965 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 15 12:25:42.851873 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 15 12:25:42.860170 (kubelet)[2522]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 15 12:25:42.889337 kubelet[2522]: E0515 12:25:42.889298 2522 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 15 12:25:42.890559 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 15 12:25:42.890670 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 15 12:25:42.890980 systemd[1]: kubelet.service: Consumed 114ms CPU time, 93.5M memory peak.
May 15 12:25:44.580091 containerd[1716]: time="2025-05-15T12:25:44.580019245Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 15 12:25:44.641796 containerd[1716]: time="2025-05-15T12:25:44.641731827Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.8: active requests=0, bytes read=24713784"
May 15 12:25:44.943507 containerd[1716]: time="2025-05-15T12:25:44.943387588Z" level=info msg="ImageCreate event name:\"sha256:fbda0bc3bc4bb93c8b2d8627a9aa8d945c200b51e48c88f9b837dde628fc7c8f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 15 12:25:44.990628 containerd[1716]: time="2025-05-15T12:25:44.990538262Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:29eaddc64792a689df48506e78bbc641d063ac8bb92d2e66ae2ad05977420747\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 15 12:25:44.991799 containerd[1716]: time="2025-05-15T12:25:44.991618094Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.8\" with image id \"sha256:fbda0bc3bc4bb93c8b2d8627a9aa8d945c200b51e48c88f9b837dde628fc7c8f\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.8\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:29eaddc64792a689df48506e78bbc641d063ac8bb92d2e66ae2ad05977420747\", size \"26202149\" in 14.443121193s"
May 15 12:25:44.991799 containerd[1716]: time="2025-05-15T12:25:44.991655332Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.8\" returns image reference \"sha256:fbda0bc3bc4bb93c8b2d8627a9aa8d945c200b51e48c88f9b837dde628fc7c8f\""
May 15 12:25:44.992159 containerd[1716]: time="2025-05-15T12:25:44.992136587Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.8\""
May 15 12:25:51.392688 containerd[1716]: time="2025-05-15T12:25:51.392617122Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 15 12:25:51.438201 containerd[1716]: time="2025-05-15T12:25:51.438153398Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.8: active requests=0, bytes read=18780394"
May 15 12:25:51.440843 containerd[1716]: time="2025-05-15T12:25:51.440793834Z" level=info msg="ImageCreate event name:\"sha256:2a9c646db0be37003c2b50605a252f7139145411d9e4e0badd8ae07f56ce5eb8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 15 12:25:51.831244 containerd[1716]: time="2025-05-15T12:25:51.831184403Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:22994a2632e81059720480b9f6bdeb133b08d58492d0b36dfd6e9768b159b22a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 15 12:25:51.832946 containerd[1716]: time="2025-05-15T12:25:51.832340575Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.8\" with image id \"sha256:2a9c646db0be37003c2b50605a252f7139145411d9e4e0badd8ae07f56ce5eb8\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.8\", repo digest \"registry.k8s.io/kube-scheduler@sha256:22994a2632e81059720480b9f6bdeb133b08d58492d0b36dfd6e9768b159b22a\", size \"20268777\" in 6.840168974s"
May 15 12:25:51.832946 containerd[1716]: time="2025-05-15T12:25:51.832380274Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.8\" returns image reference \"sha256:2a9c646db0be37003c2b50605a252f7139145411d9e4e0badd8ae07f56ce5eb8\""
May 15 12:25:51.833070 containerd[1716]: time="2025-05-15T12:25:51.833011391Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.8\""
May 15 12:25:52.959560 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 7.
May 15 12:25:52.960926 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 15 12:25:56.904746 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 15 12:25:56.909167 (kubelet)[2542]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 15 12:25:56.935436 kubelet[2542]: E0515 12:25:56.935395 2542 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 15 12:25:56.936614 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 15 12:25:56.936736 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 15 12:25:56.937043 systemd[1]: kubelet.service: Consumed 105ms CPU time, 93.8M memory peak.
May 15 12:25:57.609579 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1567945703.mount: Deactivated successfully.
May 15 12:25:59.283882 containerd[1716]: time="2025-05-15T12:25:59.283823104Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 15 12:25:59.288162 containerd[1716]: time="2025-05-15T12:25:59.288130349Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.8: active requests=0, bytes read=30354633"
May 15 12:25:59.331676 containerd[1716]: time="2025-05-15T12:25:59.331619081Z" level=info msg="ImageCreate event name:\"sha256:7d73f013cedcf301aef42272c93e4c1174dab1a8eccd96840091ef04b63480f2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 15 12:25:59.335296 containerd[1716]: time="2025-05-15T12:25:59.335248120Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:dd0c9a37670f209947b1ed880f06a2e93e1d41da78c037f52f94b13858769838\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 15 12:25:59.335723 containerd[1716]: time="2025-05-15T12:25:59.335598436Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.8\" with image id \"sha256:7d73f013cedcf301aef42272c93e4c1174dab1a8eccd96840091ef04b63480f2\", repo tag \"registry.k8s.io/kube-proxy:v1.31.8\", repo digest \"registry.k8s.io/kube-proxy@sha256:dd0c9a37670f209947b1ed880f06a2e93e1d41da78c037f52f94b13858769838\", size \"30353644\" in 7.502560492s"
May 15 12:25:59.335723 containerd[1716]: time="2025-05-15T12:25:59.335624092Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.8\" returns image reference \"sha256:7d73f013cedcf301aef42272c93e4c1174dab1a8eccd96840091ef04b63480f2\""
May 15 12:25:59.336159 containerd[1716]: time="2025-05-15T12:25:59.336122703Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\""
May 15 12:26:01.005720 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3916627857.mount: Deactivated successfully.
May 15 12:26:06.740152 containerd[1716]: time="2025-05-15T12:26:06.740095555Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 15 12:26:06.787286 containerd[1716]: time="2025-05-15T12:26:06.787249790Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=18185769"
May 15 12:26:06.833774 containerd[1716]: time="2025-05-15T12:26:06.833714118Z" level=info msg="ImageCreate event name:\"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 15 12:26:06.838452 containerd[1716]: time="2025-05-15T12:26:06.838415538Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 15 12:26:06.839187 containerd[1716]: time="2025-05-15T12:26:06.839047964Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"18182961\" in 7.502901444s"
May 15 12:26:06.839187 containerd[1716]: time="2025-05-15T12:26:06.839080178Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\""
May 15 12:26:06.839639 containerd[1716]: time="2025-05-15T12:26:06.839601283Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
May 15 12:26:06.959764 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 8.
May 15 12:26:06.961490 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 15 12:26:10.203853 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 15 12:26:10.212134 (kubelet)[2610]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 15 12:26:10.239879 kubelet[2610]: E0515 12:26:10.239847 2610 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 15 12:26:10.240732 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 15 12:26:10.240834 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 15 12:26:10.241119 systemd[1]: kubelet.service: Consumed 105ms CPU time, 95.5M memory peak.
May 15 12:26:12.752479 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1775792858.mount: Deactivated successfully.
May 15 12:26:12.931862 containerd[1716]: time="2025-05-15T12:26:12.931811740Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 15 12:26:12.993280 containerd[1716]: time="2025-05-15T12:26:12.993233063Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321146" May 15 12:26:13.040271 containerd[1716]: time="2025-05-15T12:26:13.040149176Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 15 12:26:13.046463 containerd[1716]: time="2025-05-15T12:26:13.046433394Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 15 12:26:13.046922 containerd[1716]: time="2025-05-15T12:26:13.046869032Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 6.207233157s" May 15 12:26:13.046922 containerd[1716]: time="2025-05-15T12:26:13.046895405Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" May 15 12:26:13.047469 containerd[1716]: time="2025-05-15T12:26:13.047432410Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\"" May 15 12:26:14.592167 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1908513516.mount: 
Deactivated successfully. May 15 12:26:20.459830 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 9. May 15 12:26:20.461648 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:26:24.309828 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:26:24.317126 (kubelet)[2647]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:26:24.347187 kubelet[2647]: E0515 12:26:24.347143 2647 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:26:24.348085 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:26:24.348201 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:26:24.348508 systemd[1]: kubelet.service: Consumed 111ms CPU time, 95.6M memory peak. 
May 15 12:26:28.784328 containerd[1716]: time="2025-05-15T12:26:28.784271596Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:26:28.786638 containerd[1716]: time="2025-05-15T12:26:28.786600867Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56780021" May 15 12:26:28.831459 containerd[1716]: time="2025-05-15T12:26:28.831401924Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:26:28.877980 containerd[1716]: time="2025-05-15T12:26:28.877881634Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:26:28.879260 containerd[1716]: time="2025-05-15T12:26:28.879123945Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 15.831662285s" May 15 12:26:28.879260 containerd[1716]: time="2025-05-15T12:26:28.879162189Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\"" May 15 12:26:32.581492 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:26:32.581664 systemd[1]: kubelet.service: Consumed 111ms CPU time, 95.6M memory peak. May 15 12:26:32.583760 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:26:32.602747 systemd[1]: Reload requested from client PID 2719 ('systemctl') (unit session-9.scope)... 
May 15 12:26:32.602848 systemd[1]: Reloading... May 15 12:26:32.682938 zram_generator::config[2760]: No configuration found. May 15 12:26:32.747975 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 15 12:26:32.831225 systemd[1]: Reloading finished in 228 ms. May 15 12:26:33.788669 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM May 15 12:26:33.788769 systemd[1]: kubelet.service: Failed with result 'signal'. May 15 12:26:33.789159 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:26:33.791661 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:26:39.565804 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:26:39.573233 (kubelet)[2831]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 15 12:26:39.603021 kubelet[2831]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 15 12:26:39.603021 kubelet[2831]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. May 15 12:26:39.603021 kubelet[2831]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
May 15 12:26:39.603281 kubelet[2831]: I0515 12:26:39.603074 2831 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 15 12:26:39.714663 kubelet[2831]: I0515 12:26:39.714636 2831 server.go:486] "Kubelet version" kubeletVersion="v1.31.0" May 15 12:26:39.715107 kubelet[2831]: I0515 12:26:39.715092 2831 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 15 12:26:39.715508 kubelet[2831]: I0515 12:26:39.715455 2831 server.go:929] "Client rotation is on, will bootstrap in background" May 15 12:26:39.738173 kubelet[2831]: I0515 12:26:39.738040 2831 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 15 12:26:39.738670 kubelet[2831]: E0515 12:26:39.738625 2831 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.200.8.32:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.200.8.32:6443: connect: connection refused" logger="UnhandledError" May 15 12:26:39.745081 kubelet[2831]: I0515 12:26:39.745049 2831 server.go:1426] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 15 12:26:39.747819 kubelet[2831]: I0515 12:26:39.747798 2831 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" May 15 12:26:39.747901 kubelet[2831]: I0515 12:26:39.747869 2831 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" May 15 12:26:39.747984 kubelet[2831]: I0515 12:26:39.747966 2831 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 15 12:26:39.748103 kubelet[2831]: I0515 12:26:39.747984 2831 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4334.0.0-a-81f65144c0","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","Topolog
yManagerPolicyOptions":null,"CgroupVersion":2} May 15 12:26:39.748210 kubelet[2831]: I0515 12:26:39.748104 2831 topology_manager.go:138] "Creating topology manager with none policy" May 15 12:26:39.748210 kubelet[2831]: I0515 12:26:39.748113 2831 container_manager_linux.go:300] "Creating device plugin manager" May 15 12:26:39.748210 kubelet[2831]: I0515 12:26:39.748186 2831 state_mem.go:36] "Initialized new in-memory state store" May 15 12:26:39.750266 kubelet[2831]: I0515 12:26:39.750250 2831 kubelet.go:408] "Attempting to sync node with API server" May 15 12:26:39.750266 kubelet[2831]: I0515 12:26:39.750267 2831 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" May 15 12:26:39.750343 kubelet[2831]: I0515 12:26:39.750292 2831 kubelet.go:314] "Adding apiserver pod source" May 15 12:26:39.750343 kubelet[2831]: I0515 12:26:39.750308 2831 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 15 12:26:39.756793 kubelet[2831]: W0515 12:26:39.756045 2831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.200.8.32:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4334.0.0-a-81f65144c0&limit=500&resourceVersion=0": dial tcp 10.200.8.32:6443: connect: connection refused May 15 12:26:39.756793 kubelet[2831]: E0515 12:26:39.756089 2831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.200.8.32:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4334.0.0-a-81f65144c0&limit=500&resourceVersion=0\": dial tcp 10.200.8.32:6443: connect: connection refused" logger="UnhandledError" May 15 12:26:39.756885 kubelet[2831]: W0515 12:26:39.756781 2831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.200.8.32:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.200.8.32:6443: connect: 
connection refused May 15 12:26:39.756885 kubelet[2831]: E0515 12:26:39.756827 2831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.200.8.32:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.8.32:6443: connect: connection refused" logger="UnhandledError" May 15 12:26:39.756954 kubelet[2831]: I0515 12:26:39.756887 2831 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" May 15 12:26:39.760141 kubelet[2831]: I0515 12:26:39.759106 2831 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 15 12:26:39.760141 kubelet[2831]: W0515 12:26:39.759204 2831 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. May 15 12:26:39.760141 kubelet[2831]: I0515 12:26:39.760070 2831 server.go:1269] "Started kubelet" May 15 12:26:39.762509 kubelet[2831]: I0515 12:26:39.762138 2831 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 May 15 12:26:39.763547 kubelet[2831]: I0515 12:26:39.763527 2831 server.go:460] "Adding debug handlers to kubelet server" May 15 12:26:39.765839 kubelet[2831]: I0515 12:26:39.765799 2831 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 15 12:26:39.766134 kubelet[2831]: I0515 12:26:39.766123 2831 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 15 12:26:39.766693 kubelet[2831]: I0515 12:26:39.766670 2831 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 15 12:26:39.769108 kubelet[2831]: E0515 12:26:39.766530 2831 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.200.8.32:6443/api/v1/namespaces/default/events\": dial tcp 
10.200.8.32:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4334.0.0-a-81f65144c0.183fb3019179ad96 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4334.0.0-a-81f65144c0,UID:ci-4334.0.0-a-81f65144c0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4334.0.0-a-81f65144c0,},FirstTimestamp:2025-05-15 12:26:39.760051606 +0000 UTC m=+0.184079309,LastTimestamp:2025-05-15 12:26:39.760051606 +0000 UTC m=+0.184079309,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4334.0.0-a-81f65144c0,}" May 15 12:26:39.770296 kubelet[2831]: I0515 12:26:39.770260 2831 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 15 12:26:39.771318 kubelet[2831]: I0515 12:26:39.771302 2831 volume_manager.go:289] "Starting Kubelet Volume Manager" May 15 12:26:39.771489 kubelet[2831]: E0515 12:26:39.771475 2831 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4334.0.0-a-81f65144c0\" not found" May 15 12:26:39.772765 kubelet[2831]: E0515 12:26:39.772737 2831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.32:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4334.0.0-a-81f65144c0?timeout=10s\": dial tcp 10.200.8.32:6443: connect: connection refused" interval="200ms" May 15 12:26:39.773153 kubelet[2831]: I0515 12:26:39.773138 2831 factory.go:221] Registration of the systemd container factory successfully May 15 12:26:39.773210 kubelet[2831]: I0515 12:26:39.773203 2831 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or 
directory May 15 12:26:39.773391 kubelet[2831]: E0515 12:26:39.773380 2831 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 15 12:26:39.773662 kubelet[2831]: I0515 12:26:39.773650 2831 reconciler.go:26] "Reconciler: start to sync state" May 15 12:26:39.773701 kubelet[2831]: I0515 12:26:39.773679 2831 desired_state_of_world_populator.go:146] "Desired state populator starts to run" May 15 12:26:39.774246 kubelet[2831]: W0515 12:26:39.774057 2831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.200.8.32:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.8.32:6443: connect: connection refused May 15 12:26:39.774246 kubelet[2831]: E0515 12:26:39.774095 2831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.200.8.32:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.200.8.32:6443: connect: connection refused" logger="UnhandledError" May 15 12:26:39.774790 kubelet[2831]: I0515 12:26:39.774775 2831 factory.go:221] Registration of the containerd container factory successfully May 15 12:26:39.780772 kubelet[2831]: I0515 12:26:39.780739 2831 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 15 12:26:39.781601 kubelet[2831]: I0515 12:26:39.781579 2831 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" May 15 12:26:39.781601 kubelet[2831]: I0515 12:26:39.781602 2831 status_manager.go:217] "Starting to sync pod status with apiserver" May 15 12:26:39.781673 kubelet[2831]: I0515 12:26:39.781615 2831 kubelet.go:2321] "Starting kubelet main sync loop" May 15 12:26:39.781673 kubelet[2831]: E0515 12:26:39.781644 2831 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 15 12:26:39.786771 kubelet[2831]: W0515 12:26:39.786743 2831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.200.8.32:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.8.32:6443: connect: connection refused May 15 12:26:39.786840 kubelet[2831]: E0515 12:26:39.786786 2831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.200.8.32:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.8.32:6443: connect: connection refused" logger="UnhandledError" May 15 12:26:39.803206 kubelet[2831]: I0515 12:26:39.803194 2831 cpu_manager.go:214] "Starting CPU manager" policy="none" May 15 12:26:39.803206 kubelet[2831]: I0515 12:26:39.803213 2831 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" May 15 12:26:39.803296 kubelet[2831]: I0515 12:26:39.803226 2831 state_mem.go:36] "Initialized new in-memory state store" May 15 12:26:39.872466 kubelet[2831]: E0515 12:26:39.872386 2831 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4334.0.0-a-81f65144c0\" not found" May 15 12:26:39.882717 kubelet[2831]: E0515 12:26:39.882671 2831 kubelet.go:2345] "Skipping pod synchronization" err="container runtime status check may not have completed yet" May 15 12:26:39.972893 kubelet[2831]: E0515 12:26:39.972861 2831 
kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4334.0.0-a-81f65144c0\" not found" May 15 12:26:39.973548 kubelet[2831]: E0515 12:26:39.973516 2831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.32:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4334.0.0-a-81f65144c0?timeout=10s\": dial tcp 10.200.8.32:6443: connect: connection refused" interval="400ms" May 15 12:26:40.073863 kubelet[2831]: E0515 12:26:40.073833 2831 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4334.0.0-a-81f65144c0\" not found" May 15 12:26:40.090455 kubelet[2831]: E0515 12:26:40.083101 2831 kubelet.go:2345] "Skipping pod synchronization" err="container runtime status check may not have completed yet" May 15 12:26:40.091676 kubelet[2831]: I0515 12:26:40.091659 2831 policy_none.go:49] "None policy: Start" May 15 12:26:40.092325 kubelet[2831]: I0515 12:26:40.092309 2831 memory_manager.go:170] "Starting memorymanager" policy="None" May 15 12:26:40.092393 kubelet[2831]: I0515 12:26:40.092331 2831 state_mem.go:35] "Initializing new in-memory state store" May 15 12:26:40.101610 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. May 15 12:26:40.113721 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. May 15 12:26:40.116456 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
May 15 12:26:40.123566 kubelet[2831]: I0515 12:26:40.123419 2831 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 15 12:26:40.123620 kubelet[2831]: I0515 12:26:40.123616 2831 eviction_manager.go:189] "Eviction manager: starting control loop" May 15 12:26:40.124030 kubelet[2831]: I0515 12:26:40.123624 2831 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 15 12:26:40.124030 kubelet[2831]: I0515 12:26:40.123931 2831 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 15 12:26:40.125667 kubelet[2831]: E0515 12:26:40.125649 2831 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4334.0.0-a-81f65144c0\" not found" May 15 12:26:40.225530 kubelet[2831]: I0515 12:26:40.225491 2831 kubelet_node_status.go:72] "Attempting to register node" node="ci-4334.0.0-a-81f65144c0" May 15 12:26:40.225850 kubelet[2831]: E0515 12:26:40.225817 2831 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.200.8.32:6443/api/v1/nodes\": dial tcp 10.200.8.32:6443: connect: connection refused" node="ci-4334.0.0-a-81f65144c0" May 15 12:26:40.374227 kubelet[2831]: E0515 12:26:40.374132 2831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.32:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4334.0.0-a-81f65144c0?timeout=10s\": dial tcp 10.200.8.32:6443: connect: connection refused" interval="800ms" May 15 12:26:40.427698 kubelet[2831]: I0515 12:26:40.427679 2831 kubelet_node_status.go:72] "Attempting to register node" node="ci-4334.0.0-a-81f65144c0" May 15 12:26:40.428030 kubelet[2831]: E0515 12:26:40.427997 2831 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.200.8.32:6443/api/v1/nodes\": dial tcp 10.200.8.32:6443: connect: connection refused" 
node="ci-4334.0.0-a-81f65144c0" May 15 12:26:40.491777 systemd[1]: Created slice kubepods-burstable-pode07fdda3436c42f8edc4b9caf48c4845.slice - libcontainer container kubepods-burstable-pode07fdda3436c42f8edc4b9caf48c4845.slice. May 15 12:26:40.504356 systemd[1]: Created slice kubepods-burstable-pod861855d9e736c1b3b0ac4d06a1b2019d.slice - libcontainer container kubepods-burstable-pod861855d9e736c1b3b0ac4d06a1b2019d.slice. May 15 12:26:40.507319 systemd[1]: Created slice kubepods-burstable-pod7f0de386ac9f29ada88fbb9b6e4a5c39.slice - libcontainer container kubepods-burstable-pod7f0de386ac9f29ada88fbb9b6e4a5c39.slice. May 15 12:26:40.578836 kubelet[2831]: I0515 12:26:40.578809 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/861855d9e736c1b3b0ac4d06a1b2019d-flexvolume-dir\") pod \"kube-controller-manager-ci-4334.0.0-a-81f65144c0\" (UID: \"861855d9e736c1b3b0ac4d06a1b2019d\") " pod="kube-system/kube-controller-manager-ci-4334.0.0-a-81f65144c0" May 15 12:26:40.578904 kubelet[2831]: I0515 12:26:40.578844 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/e07fdda3436c42f8edc4b9caf48c4845-ca-certs\") pod \"kube-apiserver-ci-4334.0.0-a-81f65144c0\" (UID: \"e07fdda3436c42f8edc4b9caf48c4845\") " pod="kube-system/kube-apiserver-ci-4334.0.0-a-81f65144c0" May 15 12:26:40.578904 kubelet[2831]: I0515 12:26:40.578857 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/e07fdda3436c42f8edc4b9caf48c4845-k8s-certs\") pod \"kube-apiserver-ci-4334.0.0-a-81f65144c0\" (UID: \"e07fdda3436c42f8edc4b9caf48c4845\") " pod="kube-system/kube-apiserver-ci-4334.0.0-a-81f65144c0" May 15 12:26:40.578904 kubelet[2831]: I0515 12:26:40.578872 2831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/e07fdda3436c42f8edc4b9caf48c4845-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4334.0.0-a-81f65144c0\" (UID: \"e07fdda3436c42f8edc4b9caf48c4845\") " pod="kube-system/kube-apiserver-ci-4334.0.0-a-81f65144c0" May 15 12:26:40.578904 kubelet[2831]: I0515 12:26:40.578889 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/861855d9e736c1b3b0ac4d06a1b2019d-ca-certs\") pod \"kube-controller-manager-ci-4334.0.0-a-81f65144c0\" (UID: \"861855d9e736c1b3b0ac4d06a1b2019d\") " pod="kube-system/kube-controller-manager-ci-4334.0.0-a-81f65144c0" May 15 12:26:40.579018 kubelet[2831]: I0515 12:26:40.578904 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/861855d9e736c1b3b0ac4d06a1b2019d-k8s-certs\") pod \"kube-controller-manager-ci-4334.0.0-a-81f65144c0\" (UID: \"861855d9e736c1b3b0ac4d06a1b2019d\") " pod="kube-system/kube-controller-manager-ci-4334.0.0-a-81f65144c0" May 15 12:26:40.579018 kubelet[2831]: I0515 12:26:40.578933 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/861855d9e736c1b3b0ac4d06a1b2019d-kubeconfig\") pod \"kube-controller-manager-ci-4334.0.0-a-81f65144c0\" (UID: \"861855d9e736c1b3b0ac4d06a1b2019d\") " pod="kube-system/kube-controller-manager-ci-4334.0.0-a-81f65144c0" May 15 12:26:40.579018 kubelet[2831]: I0515 12:26:40.578949 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/861855d9e736c1b3b0ac4d06a1b2019d-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4334.0.0-a-81f65144c0\" (UID: 
\"861855d9e736c1b3b0ac4d06a1b2019d\") " pod="kube-system/kube-controller-manager-ci-4334.0.0-a-81f65144c0" May 15 12:26:40.579018 kubelet[2831]: I0515 12:26:40.578964 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7f0de386ac9f29ada88fbb9b6e4a5c39-kubeconfig\") pod \"kube-scheduler-ci-4334.0.0-a-81f65144c0\" (UID: \"7f0de386ac9f29ada88fbb9b6e4a5c39\") " pod="kube-system/kube-scheduler-ci-4334.0.0-a-81f65144c0" May 15 12:26:40.702607 kubelet[2831]: W0515 12:26:40.702525 2831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.200.8.32:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.8.32:6443: connect: connection refused May 15 12:26:40.702607 kubelet[2831]: E0515 12:26:40.702575 2831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.200.8.32:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.8.32:6443: connect: connection refused" logger="UnhandledError" May 15 12:26:40.803374 containerd[1716]: time="2025-05-15T12:26:40.803320977Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4334.0.0-a-81f65144c0,Uid:e07fdda3436c42f8edc4b9caf48c4845,Namespace:kube-system,Attempt:0,}" May 15 12:26:40.806789 containerd[1716]: time="2025-05-15T12:26:40.806744424Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4334.0.0-a-81f65144c0,Uid:861855d9e736c1b3b0ac4d06a1b2019d,Namespace:kube-system,Attempt:0,}" May 15 12:26:40.809470 containerd[1716]: time="2025-05-15T12:26:40.809405022Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4334.0.0-a-81f65144c0,Uid:7f0de386ac9f29ada88fbb9b6e4a5c39,Namespace:kube-system,Attempt:0,}" May 15 
12:26:40.829650 kubelet[2831]: I0515 12:26:40.829634 2831 kubelet_node_status.go:72] "Attempting to register node" node="ci-4334.0.0-a-81f65144c0" May 15 12:26:40.829870 kubelet[2831]: E0515 12:26:40.829856 2831 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.200.8.32:6443/api/v1/nodes\": dial tcp 10.200.8.32:6443: connect: connection refused" node="ci-4334.0.0-a-81f65144c0" May 15 12:26:41.000996 kubelet[2831]: W0515 12:26:41.000944 2831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.200.8.32:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.200.8.32:6443: connect: connection refused May 15 12:26:41.001083 kubelet[2831]: E0515 12:26:41.001003 2831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.200.8.32:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.8.32:6443: connect: connection refused" logger="UnhandledError" May 15 12:26:41.102028 kubelet[2831]: W0515 12:26:41.101984 2831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.200.8.32:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4334.0.0-a-81f65144c0&limit=500&resourceVersion=0": dial tcp 10.200.8.32:6443: connect: connection refused May 15 12:26:41.135983 kubelet[2831]: E0515 12:26:41.102038 2831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.200.8.32:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4334.0.0-a-81f65144c0&limit=500&resourceVersion=0\": dial tcp 10.200.8.32:6443: connect: connection refused" logger="UnhandledError" May 15 12:26:41.135983 kubelet[2831]: W0515 12:26:41.119550 2831 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.200.8.32:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.8.32:6443: connect: connection refused May 15 12:26:41.135983 kubelet[2831]: E0515 12:26:41.119575 2831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.200.8.32:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.200.8.32:6443: connect: connection refused" logger="UnhandledError" May 15 12:26:41.175350 kubelet[2831]: E0515 12:26:41.175322 2831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.32:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4334.0.0-a-81f65144c0?timeout=10s\": dial tcp 10.200.8.32:6443: connect: connection refused" interval="1.6s" May 15 12:26:41.631801 kubelet[2831]: I0515 12:26:41.631759 2831 kubelet_node_status.go:72] "Attempting to register node" node="ci-4334.0.0-a-81f65144c0" May 15 12:26:41.632120 kubelet[2831]: E0515 12:26:41.632094 2831 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.200.8.32:6443/api/v1/nodes\": dial tcp 10.200.8.32:6443: connect: connection refused" node="ci-4334.0.0-a-81f65144c0" May 15 12:26:41.687931 containerd[1716]: time="2025-05-15T12:26:41.687804048Z" level=info msg="connecting to shim b4c23316b8ee16cf38b2d86ebb3ab7cf8d3f3cb96ad9bc939d270d2abaf82ce0" address="unix:///run/containerd/s/0a80a2d500da1e330c19db944f913c4f64591889f47e483d10486a89358fd129" namespace=k8s.io protocol=ttrpc version=3 May 15 12:26:41.710047 systemd[1]: Started cri-containerd-b4c23316b8ee16cf38b2d86ebb3ab7cf8d3f3cb96ad9bc939d270d2abaf82ce0.scope - libcontainer container b4c23316b8ee16cf38b2d86ebb3ab7cf8d3f3cb96ad9bc939d270d2abaf82ce0. 
May 15 12:26:41.793149 containerd[1716]: time="2025-05-15T12:26:41.793124865Z" level=info msg="connecting to shim 1a28b751e64105ca0b16a261c72ab0e22ce5010241cd0d838b719d831bc421d8" address="unix:///run/containerd/s/2d5475823158865dd43e5383ce0fcbae25284928e53f51690fd2360e4bcd4900" namespace=k8s.io protocol=ttrpc version=3 May 15 12:26:41.813061 systemd[1]: Started cri-containerd-1a28b751e64105ca0b16a261c72ab0e22ce5010241cd0d838b719d831bc421d8.scope - libcontainer container 1a28b751e64105ca0b16a261c72ab0e22ce5010241cd0d838b719d831bc421d8. May 15 12:26:41.835830 containerd[1716]: time="2025-05-15T12:26:41.835802391Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4334.0.0-a-81f65144c0,Uid:e07fdda3436c42f8edc4b9caf48c4845,Namespace:kube-system,Attempt:0,} returns sandbox id \"b4c23316b8ee16cf38b2d86ebb3ab7cf8d3f3cb96ad9bc939d270d2abaf82ce0\"" May 15 12:26:41.839098 containerd[1716]: time="2025-05-15T12:26:41.839053099Z" level=info msg="CreateContainer within sandbox \"b4c23316b8ee16cf38b2d86ebb3ab7cf8d3f3cb96ad9bc939d270d2abaf82ce0\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" May 15 12:26:41.888305 containerd[1716]: time="2025-05-15T12:26:41.888235539Z" level=info msg="connecting to shim c10b16b2465d5d32f4a3b462ef12bf32d7cab6d56ab5e2f6e26493701474de70" address="unix:///run/containerd/s/e736941213e1684f3f24c58a32486b9f0ea2b49bff1825d5c66a8ab604edda3b" namespace=k8s.io protocol=ttrpc version=3 May 15 12:26:41.892144 kubelet[2831]: E0515 12:26:41.892122 2831 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.200.8.32:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.200.8.32:6443: connect: connection refused" logger="UnhandledError" May 15 12:26:41.907056 systemd[1]: Started 
cri-containerd-c10b16b2465d5d32f4a3b462ef12bf32d7cab6d56ab5e2f6e26493701474de70.scope - libcontainer container c10b16b2465d5d32f4a3b462ef12bf32d7cab6d56ab5e2f6e26493701474de70. May 15 12:26:41.930757 containerd[1716]: time="2025-05-15T12:26:41.930691356Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4334.0.0-a-81f65144c0,Uid:861855d9e736c1b3b0ac4d06a1b2019d,Namespace:kube-system,Attempt:0,} returns sandbox id \"1a28b751e64105ca0b16a261c72ab0e22ce5010241cd0d838b719d831bc421d8\"" May 15 12:26:41.932879 containerd[1716]: time="2025-05-15T12:26:41.932852575Z" level=info msg="CreateContainer within sandbox \"1a28b751e64105ca0b16a261c72ab0e22ce5010241cd0d838b719d831bc421d8\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" May 15 12:26:42.135011 containerd[1716]: time="2025-05-15T12:26:42.134904643Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4334.0.0-a-81f65144c0,Uid:7f0de386ac9f29ada88fbb9b6e4a5c39,Namespace:kube-system,Attempt:0,} returns sandbox id \"c10b16b2465d5d32f4a3b462ef12bf32d7cab6d56ab5e2f6e26493701474de70\"" May 15 12:26:42.137311 containerd[1716]: time="2025-05-15T12:26:42.137289075Z" level=info msg="CreateContainer within sandbox \"c10b16b2465d5d32f4a3b462ef12bf32d7cab6d56ab5e2f6e26493701474de70\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" May 15 12:26:42.277693 containerd[1716]: time="2025-05-15T12:26:42.277651376Z" level=info msg="Container 683f5b08fe830711a77f3f0a0996890092c837715f1f6734be88e80710324e10: CDI devices from CRI Config.CDIDevices: []" May 15 12:26:42.390470 containerd[1716]: time="2025-05-15T12:26:42.390440297Z" level=info msg="Container 44117c3898153288a6404a45488113672b3118b283aaea260cd3f9e096248087: CDI devices from CRI Config.CDIDevices: []" May 15 12:26:42.689006 containerd[1716]: time="2025-05-15T12:26:42.688939722Z" level=info msg="CreateContainer within sandbox 
\"b4c23316b8ee16cf38b2d86ebb3ab7cf8d3f3cb96ad9bc939d270d2abaf82ce0\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"683f5b08fe830711a77f3f0a0996890092c837715f1f6734be88e80710324e10\"" May 15 12:26:42.689609 containerd[1716]: time="2025-05-15T12:26:42.689589873Z" level=info msg="StartContainer for \"683f5b08fe830711a77f3f0a0996890092c837715f1f6734be88e80710324e10\"" May 15 12:26:42.690517 containerd[1716]: time="2025-05-15T12:26:42.690483458Z" level=info msg="connecting to shim 683f5b08fe830711a77f3f0a0996890092c837715f1f6734be88e80710324e10" address="unix:///run/containerd/s/0a80a2d500da1e330c19db944f913c4f64591889f47e483d10486a89358fd129" protocol=ttrpc version=3 May 15 12:26:42.701267 containerd[1716]: time="2025-05-15T12:26:42.701176453Z" level=info msg="Container 8f3722b5db72c080a787ea60944f2d16c34987af1c60e2959208ad52b667fc05: CDI devices from CRI Config.CDIDevices: []" May 15 12:26:42.702127 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount161756593.mount: Deactivated successfully. May 15 12:26:42.714043 systemd[1]: Started cri-containerd-683f5b08fe830711a77f3f0a0996890092c837715f1f6734be88e80710324e10.scope - libcontainer container 683f5b08fe830711a77f3f0a0996890092c837715f1f6734be88e80710324e10. 
May 15 12:26:42.725371 kubelet[2831]: W0515 12:26:42.725312 2831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.200.8.32:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.8.32:6443: connect: connection refused May 15 12:26:42.725525 kubelet[2831]: E0515 12:26:42.725455 2831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.200.8.32:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.8.32:6443: connect: connection refused" logger="UnhandledError" May 15 12:26:42.776279 kubelet[2831]: E0515 12:26:42.776246 2831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.32:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4334.0.0-a-81f65144c0?timeout=10s\": dial tcp 10.200.8.32:6443: connect: connection refused" interval="3.2s" May 15 12:26:42.832724 containerd[1716]: time="2025-05-15T12:26:42.832682907Z" level=info msg="CreateContainer within sandbox \"1a28b751e64105ca0b16a261c72ab0e22ce5010241cd0d838b719d831bc421d8\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"44117c3898153288a6404a45488113672b3118b283aaea260cd3f9e096248087\"" May 15 12:26:42.833159 containerd[1716]: time="2025-05-15T12:26:42.833013273Z" level=info msg="StartContainer for \"683f5b08fe830711a77f3f0a0996890092c837715f1f6734be88e80710324e10\" returns successfully" May 15 12:26:42.833720 containerd[1716]: time="2025-05-15T12:26:42.833701105Z" level=info msg="StartContainer for \"44117c3898153288a6404a45488113672b3118b283aaea260cd3f9e096248087\"" May 15 12:26:42.877069 containerd[1716]: time="2025-05-15T12:26:42.876588450Z" level=info msg="connecting to shim 44117c3898153288a6404a45488113672b3118b283aaea260cd3f9e096248087" 
address="unix:///run/containerd/s/2d5475823158865dd43e5383ce0fcbae25284928e53f51690fd2360e4bcd4900" protocol=ttrpc version=3 May 15 12:26:42.879309 containerd[1716]: time="2025-05-15T12:26:42.879272497Z" level=info msg="CreateContainer within sandbox \"c10b16b2465d5d32f4a3b462ef12bf32d7cab6d56ab5e2f6e26493701474de70\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"8f3722b5db72c080a787ea60944f2d16c34987af1c60e2959208ad52b667fc05\"" May 15 12:26:42.880003 containerd[1716]: time="2025-05-15T12:26:42.879954041Z" level=info msg="StartContainer for \"8f3722b5db72c080a787ea60944f2d16c34987af1c60e2959208ad52b667fc05\"" May 15 12:26:42.881280 containerd[1716]: time="2025-05-15T12:26:42.881239343Z" level=info msg="connecting to shim 8f3722b5db72c080a787ea60944f2d16c34987af1c60e2959208ad52b667fc05" address="unix:///run/containerd/s/e736941213e1684f3f24c58a32486b9f0ea2b49bff1825d5c66a8ab604edda3b" protocol=ttrpc version=3 May 15 12:26:42.900061 systemd[1]: Started cri-containerd-44117c3898153288a6404a45488113672b3118b283aaea260cd3f9e096248087.scope - libcontainer container 44117c3898153288a6404a45488113672b3118b283aaea260cd3f9e096248087. May 15 12:26:42.902809 systemd[1]: Started cri-containerd-8f3722b5db72c080a787ea60944f2d16c34987af1c60e2959208ad52b667fc05.scope - libcontainer container 8f3722b5db72c080a787ea60944f2d16c34987af1c60e2959208ad52b667fc05. 
May 15 12:26:42.955159 containerd[1716]: time="2025-05-15T12:26:42.955062274Z" level=info msg="StartContainer for \"44117c3898153288a6404a45488113672b3118b283aaea260cd3f9e096248087\" returns successfully" May 15 12:26:42.990510 containerd[1716]: time="2025-05-15T12:26:42.990485554Z" level=info msg="StartContainer for \"8f3722b5db72c080a787ea60944f2d16c34987af1c60e2959208ad52b667fc05\" returns successfully" May 15 12:26:43.233897 kubelet[2831]: I0515 12:26:43.233877 2831 kubelet_node_status.go:72] "Attempting to register node" node="ci-4334.0.0-a-81f65144c0" May 15 12:26:44.465338 kubelet[2831]: E0515 12:26:44.465245 2831 event.go:359] "Server rejected event (will not retry!)" err="namespaces \"default\" not found" event="&Event{ObjectMeta:{ci-4334.0.0-a-81f65144c0.183fb3019179ad96 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4334.0.0-a-81f65144c0,UID:ci-4334.0.0-a-81f65144c0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4334.0.0-a-81f65144c0,},FirstTimestamp:2025-05-15 12:26:39.760051606 +0000 UTC m=+0.184079309,LastTimestamp:2025-05-15 12:26:39.760051606 +0000 UTC m=+0.184079309,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4334.0.0-a-81f65144c0,}" May 15 12:26:44.521755 kubelet[2831]: I0515 12:26:44.521727 2831 kubelet_node_status.go:75] "Successfully registered node" node="ci-4334.0.0-a-81f65144c0" May 15 12:26:44.759627 kubelet[2831]: I0515 12:26:44.759446 2831 apiserver.go:52] "Watching apiserver" May 15 12:26:44.774214 kubelet[2831]: I0515 12:26:44.774183 2831 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" May 15 12:26:44.849762 kubelet[2831]: E0515 12:26:44.849611 2831 kubelet.go:1915] "Failed creating a mirror pod for" err="pods 
\"kube-scheduler-ci-4334.0.0-a-81f65144c0\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4334.0.0-a-81f65144c0" May 15 12:26:44.849762 kubelet[2831]: E0515 12:26:44.849628 2831 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-ci-4334.0.0-a-81f65144c0\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4334.0.0-a-81f65144c0" May 15 12:26:44.849888 kubelet[2831]: E0515 12:26:44.849816 2831 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-4334.0.0-a-81f65144c0\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4334.0.0-a-81f65144c0" May 15 12:26:45.860722 kubelet[2831]: W0515 12:26:45.860689 2831 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 15 12:26:46.505794 systemd[1]: Reload requested from client PID 3095 ('systemctl') (unit session-9.scope)... May 15 12:26:46.505806 systemd[1]: Reloading... May 15 12:26:46.581009 zram_generator::config[3140]: No configuration found. May 15 12:26:46.651144 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 15 12:26:46.738684 systemd[1]: Reloading finished in 232 ms. May 15 12:26:46.769343 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:26:46.783592 systemd[1]: kubelet.service: Deactivated successfully. May 15 12:26:46.783789 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:26:46.783831 systemd[1]: kubelet.service: Consumed 458ms CPU time, 114.4M memory peak. 
May 15 12:26:46.785256 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:26:53.147972 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:26:53.154260 (kubelet)[3207]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 15 12:26:53.530410 kubelet[3207]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 15 12:26:53.530410 kubelet[3207]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. May 15 12:26:53.530410 kubelet[3207]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 15 12:26:53.530410 kubelet[3207]: I0515 12:26:53.201811 3207 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 15 12:26:53.530410 kubelet[3207]: I0515 12:26:53.216114 3207 server.go:486] "Kubelet version" kubeletVersion="v1.31.0" May 15 12:26:53.530410 kubelet[3207]: I0515 12:26:53.216131 3207 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 15 12:26:53.530410 kubelet[3207]: I0515 12:26:53.217085 3207 server.go:929] "Client rotation is on, will bootstrap in background" May 15 12:26:53.530410 kubelet[3207]: I0515 12:26:53.218439 3207 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
May 15 12:26:53.530410 kubelet[3207]: I0515 12:26:53.220055 3207 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 15 12:26:53.530410 kubelet[3207]: I0515 12:26:53.225479 3207 server.go:1426] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 15 12:26:53.530410 kubelet[3207]: I0515 12:26:53.230157 3207 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" May 15 12:26:53.530410 kubelet[3207]: I0515 12:26:53.230269 3207 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" May 15 12:26:53.530410 kubelet[3207]: I0515 12:26:53.230374 3207 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 15 12:26:53.531053 kubelet[3207]: I0515 12:26:53.230405 3207 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4334.0.0-a-81f65144c0","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},
{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} May 15 12:26:53.531053 kubelet[3207]: I0515 12:26:53.230579 3207 topology_manager.go:138] "Creating topology manager with none policy" May 15 12:26:53.531053 kubelet[3207]: I0515 12:26:53.230590 3207 container_manager_linux.go:300] "Creating device plugin manager" May 15 12:26:53.531053 kubelet[3207]: I0515 12:26:53.230630 3207 state_mem.go:36] "Initialized new in-memory state store" May 15 12:26:53.531053 kubelet[3207]: I0515 12:26:53.230722 3207 kubelet.go:408] "Attempting to sync node with API server" May 15 12:26:53.531274 kubelet[3207]: I0515 12:26:53.230732 3207 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" May 15 12:26:53.531274 kubelet[3207]: I0515 12:26:53.231108 3207 kubelet.go:314] "Adding apiserver pod source" May 15 12:26:53.531274 kubelet[3207]: I0515 12:26:53.231133 3207 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 15 12:26:53.531274 kubelet[3207]: I0515 12:26:53.234004 3207 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" May 15 12:26:53.531274 kubelet[3207]: I0515 12:26:53.234316 3207 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 15 12:26:53.531274 kubelet[3207]: I0515 12:26:53.234621 3207 server.go:1269] "Started kubelet" May 15 12:26:53.531274 kubelet[3207]: I0515 12:26:53.240543 3207 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 
15 12:26:53.531274 kubelet[3207]: I0515 12:26:53.244255 3207 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 May 15 12:26:53.531274 kubelet[3207]: I0515 12:26:53.247186 3207 server.go:460] "Adding debug handlers to kubelet server" May 15 12:26:53.531274 kubelet[3207]: I0515 12:26:53.251957 3207 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 15 12:26:53.531274 kubelet[3207]: I0515 12:26:53.268164 3207 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 15 12:26:53.531274 kubelet[3207]: I0515 12:26:53.269083 3207 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" May 15 12:26:53.531274 kubelet[3207]: I0515 12:26:53.269108 3207 status_manager.go:217] "Starting to sync pod status with apiserver" May 15 12:26:53.531274 kubelet[3207]: I0515 12:26:53.269123 3207 kubelet.go:2321] "Starting kubelet main sync loop" May 15 12:26:53.531274 kubelet[3207]: E0515 12:26:53.269153 3207 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 15 12:26:53.531274 kubelet[3207]: I0515 12:26:53.270177 3207 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 15 12:26:53.531274 kubelet[3207]: I0515 12:26:53.272571 3207 volume_manager.go:289] "Starting Kubelet Volume Manager" May 15 12:26:53.531274 kubelet[3207]: E0515 12:26:53.272684 3207 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4334.0.0-a-81f65144c0\" not found" May 15 12:26:53.531780 kubelet[3207]: I0515 12:26:53.273865 3207 desired_state_of_world_populator.go:146] "Desired state populator starts to run" May 15 12:26:53.531780 kubelet[3207]: I0515 12:26:53.273990 3207 reconciler.go:26] "Reconciler: start to sync state" May 15 12:26:53.531780 
kubelet[3207]: I0515 12:26:53.283849 3207 factory.go:221] Registration of the containerd container factory successfully May 15 12:26:53.531780 kubelet[3207]: I0515 12:26:53.283859 3207 factory.go:221] Registration of the systemd container factory successfully May 15 12:26:53.531780 kubelet[3207]: I0515 12:26:53.283908 3207 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 15 12:26:53.531780 kubelet[3207]: E0515 12:26:53.299594 3207 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 15 12:26:53.531780 kubelet[3207]: I0515 12:26:53.317701 3207 cpu_manager.go:214] "Starting CPU manager" policy="none" May 15 12:26:53.531780 kubelet[3207]: I0515 12:26:53.317707 3207 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" May 15 12:26:53.531780 kubelet[3207]: I0515 12:26:53.317719 3207 state_mem.go:36] "Initialized new in-memory state store" May 15 12:26:53.531780 kubelet[3207]: I0515 12:26:53.317804 3207 state_mem.go:88] "Updated default CPUSet" cpuSet="" May 15 12:26:53.531780 kubelet[3207]: I0515 12:26:53.317809 3207 state_mem.go:96] "Updated CPUSet assignments" assignments={} May 15 12:26:53.531780 kubelet[3207]: I0515 12:26:53.317820 3207 policy_none.go:49] "None policy: Start" May 15 12:26:53.531780 kubelet[3207]: I0515 12:26:53.318213 3207 memory_manager.go:170] "Starting memorymanager" policy="None" May 15 12:26:53.531780 kubelet[3207]: I0515 12:26:53.318228 3207 state_mem.go:35] "Initializing new in-memory state store" May 15 12:26:53.531780 kubelet[3207]: I0515 12:26:53.318323 3207 state_mem.go:75] "Updated machine memory state" May 15 12:26:53.531780 kubelet[3207]: I0515 12:26:53.321088 3207 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 
15 12:26:53.531780 kubelet[3207]: E0515 12:26:53.369861 3207 kubelet.go:2345] "Skipping pod synchronization" err="container runtime status check may not have completed yet" May 15 12:26:53.531780 kubelet[3207]: I0515 12:26:53.529397 3207 eviction_manager.go:189] "Eviction manager: starting control loop" May 15 12:26:53.533368 kubelet[3207]: I0515 12:26:53.529417 3207 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 15 12:26:53.533368 kubelet[3207]: I0515 12:26:53.530944 3207 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 15 12:26:53.534716 kubelet[3207]: I0515 12:26:53.534685 3207 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 15 12:26:53.538356 kubelet[3207]: I0515 12:26:53.537550 3207 kuberuntime_manager.go:1633] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" May 15 12:26:53.538650 containerd[1716]: time="2025-05-15T12:26:53.538585515Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." 
May 15 12:26:53.539858 kubelet[3207]: I0515 12:26:53.539841 3207 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" May 15 12:26:53.573904 kubelet[3207]: W0515 12:26:53.573890 3207 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 15 12:26:53.575271 kubelet[3207]: I0515 12:26:53.575043 3207 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/e07fdda3436c42f8edc4b9caf48c4845-k8s-certs\") pod \"kube-apiserver-ci-4334.0.0-a-81f65144c0\" (UID: \"e07fdda3436c42f8edc4b9caf48c4845\") " pod="kube-system/kube-apiserver-ci-4334.0.0-a-81f65144c0" May 15 12:26:53.575271 kubelet[3207]: I0515 12:26:53.575075 3207 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/861855d9e736c1b3b0ac4d06a1b2019d-ca-certs\") pod \"kube-controller-manager-ci-4334.0.0-a-81f65144c0\" (UID: \"861855d9e736c1b3b0ac4d06a1b2019d\") " pod="kube-system/kube-controller-manager-ci-4334.0.0-a-81f65144c0" May 15 12:26:53.575271 kubelet[3207]: I0515 12:26:53.575094 3207 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/861855d9e736c1b3b0ac4d06a1b2019d-k8s-certs\") pod \"kube-controller-manager-ci-4334.0.0-a-81f65144c0\" (UID: \"861855d9e736c1b3b0ac4d06a1b2019d\") " pod="kube-system/kube-controller-manager-ci-4334.0.0-a-81f65144c0" May 15 12:26:53.575271 kubelet[3207]: I0515 12:26:53.575112 3207 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/861855d9e736c1b3b0ac4d06a1b2019d-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4334.0.0-a-81f65144c0\" 
(UID: \"861855d9e736c1b3b0ac4d06a1b2019d\") " pod="kube-system/kube-controller-manager-ci-4334.0.0-a-81f65144c0" May 15 12:26:53.575271 kubelet[3207]: I0515 12:26:53.575129 3207 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7f0de386ac9f29ada88fbb9b6e4a5c39-kubeconfig\") pod \"kube-scheduler-ci-4334.0.0-a-81f65144c0\" (UID: \"7f0de386ac9f29ada88fbb9b6e4a5c39\") " pod="kube-system/kube-scheduler-ci-4334.0.0-a-81f65144c0" May 15 12:26:53.575435 kubelet[3207]: I0515 12:26:53.575143 3207 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/e07fdda3436c42f8edc4b9caf48c4845-ca-certs\") pod \"kube-apiserver-ci-4334.0.0-a-81f65144c0\" (UID: \"e07fdda3436c42f8edc4b9caf48c4845\") " pod="kube-system/kube-apiserver-ci-4334.0.0-a-81f65144c0" May 15 12:26:53.575435 kubelet[3207]: I0515 12:26:53.575159 3207 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/e07fdda3436c42f8edc4b9caf48c4845-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4334.0.0-a-81f65144c0\" (UID: \"e07fdda3436c42f8edc4b9caf48c4845\") " pod="kube-system/kube-apiserver-ci-4334.0.0-a-81f65144c0" May 15 12:26:53.575435 kubelet[3207]: I0515 12:26:53.575174 3207 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/861855d9e736c1b3b0ac4d06a1b2019d-flexvolume-dir\") pod \"kube-controller-manager-ci-4334.0.0-a-81f65144c0\" (UID: \"861855d9e736c1b3b0ac4d06a1b2019d\") " pod="kube-system/kube-controller-manager-ci-4334.0.0-a-81f65144c0" May 15 12:26:53.575435 kubelet[3207]: I0515 12:26:53.575191 3207 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" 
(UniqueName: \"kubernetes.io/host-path/861855d9e736c1b3b0ac4d06a1b2019d-kubeconfig\") pod \"kube-controller-manager-ci-4334.0.0-a-81f65144c0\" (UID: \"861855d9e736c1b3b0ac4d06a1b2019d\") " pod="kube-system/kube-controller-manager-ci-4334.0.0-a-81f65144c0" May 15 12:26:53.576949 kubelet[3207]: W0515 12:26:53.576879 3207 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 15 12:26:53.580057 kubelet[3207]: W0515 12:26:53.580037 3207 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 15 12:26:53.580123 kubelet[3207]: E0515 12:26:53.580087 3207 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-4334.0.0-a-81f65144c0\" already exists" pod="kube-system/kube-apiserver-ci-4334.0.0-a-81f65144c0" May 15 12:26:53.644500 kubelet[3207]: I0515 12:26:53.644176 3207 kubelet_node_status.go:72] "Attempting to register node" node="ci-4334.0.0-a-81f65144c0" May 15 12:26:53.652685 kubelet[3207]: I0515 12:26:53.652461 3207 kubelet_node_status.go:111] "Node was previously registered" node="ci-4334.0.0-a-81f65144c0" May 15 12:26:53.652685 kubelet[3207]: I0515 12:26:53.652512 3207 kubelet_node_status.go:75] "Successfully registered node" node="ci-4334.0.0-a-81f65144c0" May 15 12:26:54.232367 kubelet[3207]: I0515 12:26:54.232341 3207 apiserver.go:52] "Watching apiserver" May 15 12:26:54.244526 systemd[1]: Created slice kubepods-besteffort-podc0c387d7_69f6_4ce3_ad07_8725a31661cb.slice - libcontainer container kubepods-besteffort-podc0c387d7_69f6_4ce3_ad07_8725a31661cb.slice. 
May 15 12:26:54.274447 kubelet[3207]: I0515 12:26:54.274425 3207 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" May 15 12:26:54.275629 kubelet[3207]: I0515 12:26:54.275484 3207 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4334.0.0-a-81f65144c0" podStartSLOduration=1.27547154 podStartE2EDuration="1.27547154s" podCreationTimestamp="2025-05-15 12:26:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-15 12:26:54.263307211 +0000 UTC m=+1.105764422" watchObservedRunningTime="2025-05-15 12:26:54.27547154 +0000 UTC m=+1.117928748" May 15 12:26:54.279549 kubelet[3207]: I0515 12:26:54.279105 3207 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/c0c387d7-69f6-4ce3-ad07-8725a31661cb-xtables-lock\") pod \"kube-proxy-wt6s5\" (UID: \"c0c387d7-69f6-4ce3-ad07-8725a31661cb\") " pod="kube-system/kube-proxy-wt6s5" May 15 12:26:54.279549 kubelet[3207]: I0515 12:26:54.279137 3207 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c0c387d7-69f6-4ce3-ad07-8725a31661cb-lib-modules\") pod \"kube-proxy-wt6s5\" (UID: \"c0c387d7-69f6-4ce3-ad07-8725a31661cb\") " pod="kube-system/kube-proxy-wt6s5" May 15 12:26:54.279549 kubelet[3207]: I0515 12:26:54.279157 3207 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz2qq\" (UniqueName: \"kubernetes.io/projected/c0c387d7-69f6-4ce3-ad07-8725a31661cb-kube-api-access-lz2qq\") pod \"kube-proxy-wt6s5\" (UID: \"c0c387d7-69f6-4ce3-ad07-8725a31661cb\") " pod="kube-system/kube-proxy-wt6s5" May 15 12:26:54.279549 kubelet[3207]: I0515 12:26:54.279176 3207 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/c0c387d7-69f6-4ce3-ad07-8725a31661cb-kube-proxy\") pod \"kube-proxy-wt6s5\" (UID: \"c0c387d7-69f6-4ce3-ad07-8725a31661cb\") " pod="kube-system/kube-proxy-wt6s5" May 15 12:26:54.294736 kubelet[3207]: I0515 12:26:54.294696 3207 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4334.0.0-a-81f65144c0" podStartSLOduration=1.294684215 podStartE2EDuration="1.294684215s" podCreationTimestamp="2025-05-15 12:26:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-15 12:26:54.281960367 +0000 UTC m=+1.124417578" watchObservedRunningTime="2025-05-15 12:26:54.294684215 +0000 UTC m=+1.137141424" May 15 12:26:54.354939 kubelet[3207]: I0515 12:26:54.354510 3207 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4334.0.0-a-81f65144c0" podStartSLOduration=9.354496907 podStartE2EDuration="9.354496907s" podCreationTimestamp="2025-05-15 12:26:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-15 12:26:54.297267619 +0000 UTC m=+1.139724828" watchObservedRunningTime="2025-05-15 12:26:54.354496907 +0000 UTC m=+1.196954119" May 15 12:26:54.553498 containerd[1716]: time="2025-05-15T12:26:54.553429878Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-wt6s5,Uid:c0c387d7-69f6-4ce3-ad07-8725a31661cb,Namespace:kube-system,Attempt:0,}" May 15 12:26:54.849904 containerd[1716]: time="2025-05-15T12:26:54.849275158Z" level=info msg="connecting to shim 6b659de022fe05caedb9b3a29aa0d7bb432a7254d9b20feca62734c324e213fa" address="unix:///run/containerd/s/b43f7560c4f40c8e395e8d121d7a8bf45a6252507f68a1eb0e9a922c11ab7b50" namespace=k8s.io protocol=ttrpc version=3 May 
15 12:26:54.889067 systemd[1]: Started cri-containerd-6b659de022fe05caedb9b3a29aa0d7bb432a7254d9b20feca62734c324e213fa.scope - libcontainer container 6b659de022fe05caedb9b3a29aa0d7bb432a7254d9b20feca62734c324e213fa. May 15 12:26:55.015184 containerd[1716]: time="2025-05-15T12:26:55.015135792Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-wt6s5,Uid:c0c387d7-69f6-4ce3-ad07-8725a31661cb,Namespace:kube-system,Attempt:0,} returns sandbox id \"6b659de022fe05caedb9b3a29aa0d7bb432a7254d9b20feca62734c324e213fa\"" May 15 12:26:55.017841 containerd[1716]: time="2025-05-15T12:26:55.017816580Z" level=info msg="CreateContainer within sandbox \"6b659de022fe05caedb9b3a29aa0d7bb432a7254d9b20feca62734c324e213fa\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" May 15 12:26:55.149968 containerd[1716]: time="2025-05-15T12:26:55.148890749Z" level=info msg="Container 0c97717a4e0b7596c46848919a7fde570505171a2e6bac8fc380ea107d8700bf: CDI devices from CRI Config.CDIDevices: []" May 15 12:26:55.280594 systemd[1]: Created slice kubepods-besteffort-pod3f25bfb3_c1b5_4102_9cd5_981c21bb97f0.slice - libcontainer container kubepods-besteffort-pod3f25bfb3_c1b5_4102_9cd5_981c21bb97f0.slice. 
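The sandbox startup above follows the containerd v2 shim handshake: a `connecting to shim <id>` event carries the ttrpc socket address, then systemd starts a matching `cri-containerd-<id>.scope` unit. A minimal sketch of pulling the shim id and socket out of such a line (the regex and helper name are illustrative, not part of any containerd tooling):

```python
import re

# A containerd "connecting to shim" event, copied from the log above.
line = ('time="2025-05-15T12:26:54.849275158Z" level=info '
        'msg="connecting to shim 6b659de022fe05caedb9b3a29aa0d7bb432a7254d9b20feca62734c324e213fa" '
        'address="unix:///run/containerd/s/b43f7560c4f40c8e395e8d121d7a8bf45a6252507f68a1eb0e9a922c11ab7b50" '
        'namespace=k8s.io protocol=ttrpc version=3')

def parse_shim_event(line):
    """Extract the shim id and the ttrpc unix socket address from a containerd event."""
    m = re.search(r'msg="connecting to shim (?P<id>[0-9a-f]+)".*address="(?P<addr>unix://[^"]+)"', line)
    return m.groupdict() if m else None

ev = parse_shim_event(line)
```

The extracted id is exactly the name embedded in the `cri-containerd-*.scope` unit that systemd reports starting on the next line of the log.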
May 15 12:26:55.284094 kubelet[3207]: I0515 12:26:55.284071 3207 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/3f25bfb3-c1b5-4102-9cd5-981c21bb97f0-var-lib-calico\") pod \"tigera-operator-6f6897fdc5-h692m\" (UID: \"3f25bfb3-c1b5-4102-9cd5-981c21bb97f0\") " pod="tigera-operator/tigera-operator-6f6897fdc5-h692m" May 15 12:26:55.286939 kubelet[3207]: I0515 12:26:55.286043 3207 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2tq4\" (UniqueName: \"kubernetes.io/projected/3f25bfb3-c1b5-4102-9cd5-981c21bb97f0-kube-api-access-b2tq4\") pod \"tigera-operator-6f6897fdc5-h692m\" (UID: \"3f25bfb3-c1b5-4102-9cd5-981c21bb97f0\") " pod="tigera-operator/tigera-operator-6f6897fdc5-h692m" May 15 12:26:55.291461 containerd[1716]: time="2025-05-15T12:26:55.291428408Z" level=info msg="CreateContainer within sandbox \"6b659de022fe05caedb9b3a29aa0d7bb432a7254d9b20feca62734c324e213fa\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"0c97717a4e0b7596c46848919a7fde570505171a2e6bac8fc380ea107d8700bf\"" May 15 12:26:55.292715 containerd[1716]: time="2025-05-15T12:26:55.292377849Z" level=info msg="StartContainer for \"0c97717a4e0b7596c46848919a7fde570505171a2e6bac8fc380ea107d8700bf\"" May 15 12:26:55.295520 containerd[1716]: time="2025-05-15T12:26:55.295221051Z" level=info msg="connecting to shim 0c97717a4e0b7596c46848919a7fde570505171a2e6bac8fc380ea107d8700bf" address="unix:///run/containerd/s/b43f7560c4f40c8e395e8d121d7a8bf45a6252507f68a1eb0e9a922c11ab7b50" protocol=ttrpc version=3 May 15 12:26:55.324042 systemd[1]: Started cri-containerd-0c97717a4e0b7596c46848919a7fde570505171a2e6bac8fc380ea107d8700bf.scope - libcontainer container 0c97717a4e0b7596c46848919a7fde570505171a2e6bac8fc380ea107d8700bf. 
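Note that the kube-proxy container (`0c97717a…`) connects to the *same* shim socket as its pod sandbox (`6b659de0…`): with the v2 shim model there is one shim per pod, and every container in the pod multiplexes over it. A small sketch of grouping log events by socket to recover that pod structure (the event tuples are transcribed from the log above; the grouping helper is illustrative):

```python
# (kind, id, shim socket) tuples, both taken verbatim from the log above.
events = [
    ("sandbox",   "6b659de022fe05caedb9b3a29aa0d7bb432a7254d9b20feca62734c324e213fa",
     "unix:///run/containerd/s/b43f7560c4f40c8e395e8d121d7a8bf45a6252507f68a1eb0e9a922c11ab7b50"),
    ("container", "0c97717a4e0b7596c46848919a7fde570505171a2e6bac8fc380ea107d8700bf",
     "unix:///run/containerd/s/b43f7560c4f40c8e395e8d121d7a8bf45a6252507f68a1eb0e9a922c11ab7b50"),
]

def group_by_socket(events):
    """Group ids by shim socket: entries sharing a socket belong to the same pod."""
    pods = {}
    for kind, cid, sock in events:
        pods.setdefault(sock, []).append((kind, cid[:12]))
    return pods

pods = group_by_socket(events)
```

Applied to the two events above, this yields a single pod containing both the sandbox and the kube-proxy container, matching the shared `/run/containerd/s/b43f7560…` address in the log.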
May 15 12:26:55.491609 containerd[1716]: time="2025-05-15T12:26:55.491560637Z" level=info msg="StartContainer for \"0c97717a4e0b7596c46848919a7fde570505171a2e6bac8fc380ea107d8700bf\" returns successfully" May 15 12:26:55.588580 containerd[1716]: time="2025-05-15T12:26:55.588560281Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6f6897fdc5-h692m,Uid:3f25bfb3-c1b5-4102-9cd5-981c21bb97f0,Namespace:tigera-operator,Attempt:0,}" May 15 12:26:56.139931 containerd[1716]: time="2025-05-15T12:26:56.139535275Z" level=info msg="connecting to shim 84a43c4d7332065689abf7dae653fc772597ea18243ccbf1b434c8d103b9888d" address="unix:///run/containerd/s/9e4082fd549978a7eb10a3bc736a874808b438de179a6bb90d9870084e63bf7d" namespace=k8s.io protocol=ttrpc version=3 May 15 12:26:56.159075 systemd[1]: Started cri-containerd-84a43c4d7332065689abf7dae653fc772597ea18243ccbf1b434c8d103b9888d.scope - libcontainer container 84a43c4d7332065689abf7dae653fc772597ea18243ccbf1b434c8d103b9888d. May 15 12:26:56.192314 containerd[1716]: time="2025-05-15T12:26:56.192275138Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6f6897fdc5-h692m,Uid:3f25bfb3-c1b5-4102-9cd5-981c21bb97f0,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"84a43c4d7332065689abf7dae653fc772597ea18243ccbf1b434c8d103b9888d\"" May 15 12:26:56.193855 containerd[1716]: time="2025-05-15T12:26:56.193804464Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.7\"" May 15 12:26:56.332997 kubelet[3207]: I0515 12:26:56.332956 3207 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-wt6s5" podStartSLOduration=3.332939553 podStartE2EDuration="3.332939553s" podCreationTimestamp="2025-05-15 12:26:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-15 12:26:56.332429141 +0000 UTC m=+3.174886357" watchObservedRunningTime="2025-05-15 
12:26:56.332939553 +0000 UTC m=+3.175396765" May 15 12:26:58.046899 sudo[2209]: pam_unix(sudo:session): session closed for user root May 15 12:26:58.150755 sshd[2208]: Connection closed by 10.200.16.10 port 36626 May 15 12:26:58.151176 sshd-session[2206]: pam_unix(sshd:session): session closed for user core May 15 12:26:58.154178 systemd[1]: sshd@6-10.200.8.32:22-10.200.16.10:36626.service: Deactivated successfully. May 15 12:26:58.155815 systemd[1]: session-9.scope: Deactivated successfully. May 15 12:26:58.156021 systemd[1]: session-9.scope: Consumed 3.554s CPU time, 226.4M memory peak. May 15 12:26:58.157320 systemd-logind[1703]: Session 9 logged out. Waiting for processes to exit. May 15 12:26:58.158542 systemd-logind[1703]: Removed session 9. May 15 12:26:59.303270 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount664777327.mount: Deactivated successfully. May 15 12:27:00.177198 containerd[1716]: time="2025-05-15T12:27:00.177154853Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:27:00.179213 containerd[1716]: time="2025-05-15T12:27:00.179175789Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.7: active requests=0, bytes read=22002662" May 15 12:27:00.240041 containerd[1716]: time="2025-05-15T12:27:00.239977021Z" level=info msg="ImageCreate event name:\"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:27:00.243538 containerd[1716]: time="2025-05-15T12:27:00.243486365Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:a4a44422d8f2a14e0aaea2031ccb5580f2bf68218c9db444450c1888743305e9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:27:00.244021 containerd[1716]: time="2025-05-15T12:27:00.243928668Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.7\" with image id 
\"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\", repo tag \"quay.io/tigera/operator:v1.36.7\", repo digest \"quay.io/tigera/operator@sha256:a4a44422d8f2a14e0aaea2031ccb5580f2bf68218c9db444450c1888743305e9\", size \"21998657\" in 4.050098328s" May 15 12:27:00.244021 containerd[1716]: time="2025-05-15T12:27:00.243954859Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.7\" returns image reference \"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\"" May 15 12:27:00.245885 containerd[1716]: time="2025-05-15T12:27:00.245675946Z" level=info msg="CreateContainer within sandbox \"84a43c4d7332065689abf7dae653fc772597ea18243ccbf1b434c8d103b9888d\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" May 15 12:27:00.391393 containerd[1716]: time="2025-05-15T12:27:00.391293278Z" level=info msg="Container b97679f1b4575a35fc97237ba19277662b37d83ce1cb64234841e3f908013d71: CDI devices from CRI Config.CDIDevices: []" May 15 12:27:00.526239 containerd[1716]: time="2025-05-15T12:27:00.526210770Z" level=info msg="CreateContainer within sandbox \"84a43c4d7332065689abf7dae653fc772597ea18243ccbf1b434c8d103b9888d\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"b97679f1b4575a35fc97237ba19277662b37d83ce1cb64234841e3f908013d71\"" May 15 12:27:00.526597 containerd[1716]: time="2025-05-15T12:27:00.526558567Z" level=info msg="StartContainer for \"b97679f1b4575a35fc97237ba19277662b37d83ce1cb64234841e3f908013d71\"" May 15 12:27:00.527511 containerd[1716]: time="2025-05-15T12:27:00.527486746Z" level=info msg="connecting to shim b97679f1b4575a35fc97237ba19277662b37d83ce1cb64234841e3f908013d71" address="unix:///run/containerd/s/9e4082fd549978a7eb10a3bc736a874808b438de179a6bb90d9870084e63bf7d" protocol=ttrpc version=3 May 15 12:27:00.546060 systemd[1]: Started cri-containerd-b97679f1b4575a35fc97237ba19277662b37d83ce1cb64234841e3f908013d71.scope - libcontainer container 
b97679f1b4575a35fc97237ba19277662b37d83ce1cb64234841e3f908013d71. May 15 12:27:00.570627 containerd[1716]: time="2025-05-15T12:27:00.570604609Z" level=info msg="StartContainer for \"b97679f1b4575a35fc97237ba19277662b37d83ce1cb64234841e3f908013d71\" returns successfully" May 15 12:27:01.337158 kubelet[3207]: I0515 12:27:01.337103 3207 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6f6897fdc5-h692m" podStartSLOduration=2.285882838 podStartE2EDuration="6.337087029s" podCreationTimestamp="2025-05-15 12:26:55 +0000 UTC" firstStartedPulling="2025-05-15 12:26:56.193313625 +0000 UTC m=+3.035770829" lastFinishedPulling="2025-05-15 12:27:00.244517823 +0000 UTC m=+7.086975020" observedRunningTime="2025-05-15 12:27:01.336736873 +0000 UTC m=+8.179194084" watchObservedRunningTime="2025-05-15 12:27:01.337087029 +0000 UTC m=+8.179544238" May 15 12:27:03.487966 systemd[1]: Created slice kubepods-besteffort-pod14877444_15f8_4a2c_a797_c8814e763063.slice - libcontainer container kubepods-besteffort-pod14877444_15f8_4a2c_a797_c8814e763063.slice. 
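The `pod_startup_latency_tracker` record above illustrates the relationship between its fields: `podStartSLOduration` is the end-to-end duration minus the image-pull window (`lastFinishedPulling - firstStartedPulling`), which is why the tigera-operator pod reports 2.28s against a 6.33s E2E time while the host-image pods earlier report SLO equal to E2E (their pull timestamps are the zero value). A quick arithmetic check against the logged numbers, using the monotonic `m=+` offsets:

```python
# Monotonic offsets from the tigera-operator startup record above.
first_started_pulling = 3.035770829   # m=+ at firstStartedPulling
last_finished_pulling = 7.086975020   # m=+ at lastFinishedPulling
e2e_duration          = 6.337087029   # podStartE2EDuration, seconds

pull_window  = last_finished_pulling - first_started_pulling  # time spent pulling the image
slo_duration = e2e_duration - pull_window                     # what the tracker reports as SLO duration

# containerd's own pull summary: 21998657 bytes in 4.050098328 s
throughput = 21998657 / 4.050098328   # bytes per second, roughly 5.4 MB/s
```

The computed `slo_duration` reproduces the logged `podStartSLOduration=2.285882838` to the nanosecond, and the throughput figure puts the operator image pull at about 5.4 MB/s on this node.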
May 15 12:27:03.539029 kubelet[3207]: I0515 12:27:03.538999 3207 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jjjn\" (UniqueName: \"kubernetes.io/projected/14877444-15f8-4a2c-a797-c8814e763063-kube-api-access-9jjjn\") pod \"calico-typha-5dc5f58c57-k7mbf\" (UID: \"14877444-15f8-4a2c-a797-c8814e763063\") " pod="calico-system/calico-typha-5dc5f58c57-k7mbf" May 15 12:27:03.539318 kubelet[3207]: I0515 12:27:03.539037 3207 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14877444-15f8-4a2c-a797-c8814e763063-tigera-ca-bundle\") pod \"calico-typha-5dc5f58c57-k7mbf\" (UID: \"14877444-15f8-4a2c-a797-c8814e763063\") " pod="calico-system/calico-typha-5dc5f58c57-k7mbf" May 15 12:27:03.539318 kubelet[3207]: I0515 12:27:03.539053 3207 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/14877444-15f8-4a2c-a797-c8814e763063-typha-certs\") pod \"calico-typha-5dc5f58c57-k7mbf\" (UID: \"14877444-15f8-4a2c-a797-c8814e763063\") " pod="calico-system/calico-typha-5dc5f58c57-k7mbf" May 15 12:27:03.604769 systemd[1]: Created slice kubepods-besteffort-pod0c30f140_51f0_4568_9be8_fac34ef2092f.slice - libcontainer container kubepods-besteffort-pod0c30f140_51f0_4568_9be8_fac34ef2092f.slice. 
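Each `VerifyControllerAttachedVolume` line above encodes the volume plugin in its `UniqueName`: the prefix before the pod UID names the plugin kind (`host-path`, `projected`, `configmap`, `secret`). A sketch of recovering the plugin kind from these names (the parsing helper is illustrative, not a kubelet API):

```python
import re

# UniqueName values transcribed from the calico-typha reconciler lines above.
unique_names = [
    "kubernetes.io/projected/14877444-15f8-4a2c-a797-c8814e763063-kube-api-access-9jjjn",
    "kubernetes.io/configmap/14877444-15f8-4a2c-a797-c8814e763063-tigera-ca-bundle",
    "kubernetes.io/secret/14877444-15f8-4a2c-a797-c8814e763063-typha-certs",
]

def plugin_of(unique_name):
    """UniqueName is kubernetes.io/<plugin>/<pod-uid>-<volume-name>; return the plugin kind."""
    m = re.match(r"kubernetes\.io/([^/]+)/", unique_name)
    return m.group(1) if m else None

kinds = [plugin_of(u) for u in unique_names]
```

Run over the calico-typha volumes this gives `projected`, `configmap`, and `secret` — the service-account token, the CA bundle, and the typha TLS certs, respectively.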
May 15 12:27:03.639924 kubelet[3207]: I0515 12:27:03.639721 3207 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/0c30f140-51f0-4568-9be8-fac34ef2092f-node-certs\") pod \"calico-node-cl9bz\" (UID: \"0c30f140-51f0-4568-9be8-fac34ef2092f\") " pod="calico-system/calico-node-cl9bz" May 15 12:27:03.639924 kubelet[3207]: I0515 12:27:03.639750 3207 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/0c30f140-51f0-4568-9be8-fac34ef2092f-var-run-calico\") pod \"calico-node-cl9bz\" (UID: \"0c30f140-51f0-4568-9be8-fac34ef2092f\") " pod="calico-system/calico-node-cl9bz" May 15 12:27:03.639924 kubelet[3207]: I0515 12:27:03.639766 3207 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/0c30f140-51f0-4568-9be8-fac34ef2092f-cni-bin-dir\") pod \"calico-node-cl9bz\" (UID: \"0c30f140-51f0-4568-9be8-fac34ef2092f\") " pod="calico-system/calico-node-cl9bz" May 15 12:27:03.639924 kubelet[3207]: I0515 12:27:03.639781 3207 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c30f140-51f0-4568-9be8-fac34ef2092f-tigera-ca-bundle\") pod \"calico-node-cl9bz\" (UID: \"0c30f140-51f0-4568-9be8-fac34ef2092f\") " pod="calico-system/calico-node-cl9bz" May 15 12:27:03.639924 kubelet[3207]: I0515 12:27:03.639794 3207 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/0c30f140-51f0-4568-9be8-fac34ef2092f-var-lib-calico\") pod \"calico-node-cl9bz\" (UID: \"0c30f140-51f0-4568-9be8-fac34ef2092f\") " pod="calico-system/calico-node-cl9bz" May 15 12:27:03.640101 kubelet[3207]: I0515 12:27:03.639810 3207 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrlsx\" (UniqueName: \"kubernetes.io/projected/0c30f140-51f0-4568-9be8-fac34ef2092f-kube-api-access-qrlsx\") pod \"calico-node-cl9bz\" (UID: \"0c30f140-51f0-4568-9be8-fac34ef2092f\") " pod="calico-system/calico-node-cl9bz" May 15 12:27:03.640101 kubelet[3207]: I0515 12:27:03.639844 3207 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/0c30f140-51f0-4568-9be8-fac34ef2092f-cni-log-dir\") pod \"calico-node-cl9bz\" (UID: \"0c30f140-51f0-4568-9be8-fac34ef2092f\") " pod="calico-system/calico-node-cl9bz" May 15 12:27:03.640101 kubelet[3207]: I0515 12:27:03.639858 3207 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/0c30f140-51f0-4568-9be8-fac34ef2092f-flexvol-driver-host\") pod \"calico-node-cl9bz\" (UID: \"0c30f140-51f0-4568-9be8-fac34ef2092f\") " pod="calico-system/calico-node-cl9bz" May 15 12:27:03.640101 kubelet[3207]: I0515 12:27:03.639872 3207 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/0c30f140-51f0-4568-9be8-fac34ef2092f-policysync\") pod \"calico-node-cl9bz\" (UID: \"0c30f140-51f0-4568-9be8-fac34ef2092f\") " pod="calico-system/calico-node-cl9bz" May 15 12:27:03.640101 kubelet[3207]: I0515 12:27:03.639889 3207 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/0c30f140-51f0-4568-9be8-fac34ef2092f-cni-net-dir\") pod \"calico-node-cl9bz\" (UID: \"0c30f140-51f0-4568-9be8-fac34ef2092f\") " pod="calico-system/calico-node-cl9bz" May 15 12:27:03.640199 kubelet[3207]: I0515 12:27:03.639902 3207 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/0c30f140-51f0-4568-9be8-fac34ef2092f-xtables-lock\") pod \"calico-node-cl9bz\" (UID: \"0c30f140-51f0-4568-9be8-fac34ef2092f\") " pod="calico-system/calico-node-cl9bz" May 15 12:27:03.642032 kubelet[3207]: I0515 12:27:03.641991 3207 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0c30f140-51f0-4568-9be8-fac34ef2092f-lib-modules\") pod \"calico-node-cl9bz\" (UID: \"0c30f140-51f0-4568-9be8-fac34ef2092f\") " pod="calico-system/calico-node-cl9bz" May 15 12:27:03.716557 kubelet[3207]: E0515 12:27:03.716390 3207 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x9zg6" podUID="dcbe7eb9-f017-4039-b0fc-ef4f90e95554" May 15 12:27:03.742971 kubelet[3207]: I0515 12:27:03.742884 3207 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dcbe7eb9-f017-4039-b0fc-ef4f90e95554-kubelet-dir\") pod \"csi-node-driver-x9zg6\" (UID: \"dcbe7eb9-f017-4039-b0fc-ef4f90e95554\") " pod="calico-system/csi-node-driver-x9zg6" May 15 12:27:03.744265 kubelet[3207]: I0515 12:27:03.744139 3207 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t4rd\" (UniqueName: \"kubernetes.io/projected/dcbe7eb9-f017-4039-b0fc-ef4f90e95554-kube-api-access-2t4rd\") pod \"csi-node-driver-x9zg6\" (UID: \"dcbe7eb9-f017-4039-b0fc-ef4f90e95554\") " pod="calico-system/csi-node-driver-x9zg6" May 15 12:27:03.744444 kubelet[3207]: E0515 12:27:03.744413 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", 
error: unexpected end of JSON input May 15 12:27:03.744539 kubelet[3207]: W0515 12:27:03.744485 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.744632 kubelet[3207]: E0515 12:27:03.744615 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.744730 kubelet[3207]: E0515 12:27:03.744713 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.744730 kubelet[3207]: W0515 12:27:03.744720 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.744859 kubelet[3207]: E0515 12:27:03.744772 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.745107 kubelet[3207]: E0515 12:27:03.745051 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.745107 kubelet[3207]: W0515 12:27:03.745094 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.745387 kubelet[3207]: E0515 12:27:03.745339 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Error: unexpected end of JSON input" May 15 12:27:03.754744 kubelet[3207]: E0515 12:27:03.754724 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.754744 kubelet[3207]: W0515 12:27:03.754732 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.755018 kubelet[3207]: E0515 12:27:03.755002 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.755134 kubelet[3207]: E0515 12:27:03.755125 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.755271 kubelet[3207]: W0515 12:27:03.755184 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.755518 kubelet[3207]: E0515 12:27:03.755306 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:03.755666 kubelet[3207]: E0515 12:27:03.755658 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.755708 kubelet[3207]: W0515 12:27:03.755701 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.755817 kubelet[3207]: E0515 12:27:03.755813 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.755850 kubelet[3207]: W0515 12:27:03.755845 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.755995 kubelet[3207]: E0515 12:27:03.755990 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.756031 kubelet[3207]: W0515 12:27:03.756026 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.756172 kubelet[3207]: E0515 12:27:03.756130 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.756172 kubelet[3207]: W0515 12:27:03.756137 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.756492 kubelet[3207]: E0515 12:27:03.756478 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON 
input May 15 12:27:03.756536 kubelet[3207]: W0515 12:27:03.756492 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.756536 kubelet[3207]: E0515 12:27:03.756504 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.757045 kubelet[3207]: E0515 12:27:03.756942 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.757045 kubelet[3207]: E0515 12:27:03.756964 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.757045 kubelet[3207]: I0515 12:27:03.756968 3207 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/dcbe7eb9-f017-4039-b0fc-ef4f90e95554-varrun\") pod \"csi-node-driver-x9zg6\" (UID: \"dcbe7eb9-f017-4039-b0fc-ef4f90e95554\") " pod="calico-system/csi-node-driver-x9zg6" May 15 12:27:03.757045 kubelet[3207]: W0515 12:27:03.756971 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.757045 kubelet[3207]: E0515 12:27:03.756981 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.757045 kubelet[3207]: E0515 12:27:03.757008 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:03.757045 kubelet[3207]: E0515 12:27:03.757020 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.757045 kubelet[3207]: E0515 12:27:03.757031 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.757404 kubelet[3207]: E0515 12:27:03.757292 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.757404 kubelet[3207]: W0515 12:27:03.757301 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.757404 kubelet[3207]: E0515 12:27:03.757311 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:03.757404 kubelet[3207]: I0515 12:27:03.757327 3207 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/dcbe7eb9-f017-4039-b0fc-ef4f90e95554-socket-dir\") pod \"csi-node-driver-x9zg6\" (UID: \"dcbe7eb9-f017-4039-b0fc-ef4f90e95554\") " pod="calico-system/csi-node-driver-x9zg6" May 15 12:27:03.757587 kubelet[3207]: E0515 12:27:03.757534 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.757587 kubelet[3207]: W0515 12:27:03.757541 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.757587 kubelet[3207]: E0515 12:27:03.757552 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:03.757587 kubelet[3207]: I0515 12:27:03.757566 3207 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/dcbe7eb9-f017-4039-b0fc-ef4f90e95554-registration-dir\") pod \"csi-node-driver-x9zg6\" (UID: \"dcbe7eb9-f017-4039-b0fc-ef4f90e95554\") " pod="calico-system/csi-node-driver-x9zg6" May 15 12:27:03.757856 kubelet[3207]: E0515 12:27:03.757797 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.757856 kubelet[3207]: W0515 12:27:03.757805 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.757856 kubelet[3207]: E0515 12:27:03.757813 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.760274 kubelet[3207]: E0515 12:27:03.760259 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.760274 kubelet[3207]: W0515 12:27:03.760273 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.761082 kubelet[3207]: E0515 12:27:03.760287 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:03.761082 kubelet[3207]: E0515 12:27:03.760384 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.761082 kubelet[3207]: W0515 12:27:03.760389 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.761082 kubelet[3207]: E0515 12:27:03.760395 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.761082 kubelet[3207]: E0515 12:27:03.760462 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.761082 kubelet[3207]: W0515 12:27:03.760466 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.761082 kubelet[3207]: E0515 12:27:03.760471 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:03.761082 kubelet[3207]: E0515 12:27:03.760537 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.761082 kubelet[3207]: W0515 12:27:03.760541 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.761082 kubelet[3207]: E0515 12:27:03.760546 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.761303 kubelet[3207]: E0515 12:27:03.760630 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.761303 kubelet[3207]: W0515 12:27:03.760634 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.761303 kubelet[3207]: E0515 12:27:03.760639 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:03.761756 kubelet[3207]: E0515 12:27:03.761580 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.761756 kubelet[3207]: W0515 12:27:03.761591 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.761833 kubelet[3207]: E0515 12:27:03.761770 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.761833 kubelet[3207]: W0515 12:27:03.761776 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.761833 kubelet[3207]: E0515 12:27:03.761786 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.761950 kubelet[3207]: E0515 12:27:03.761926 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:03.762083 kubelet[3207]: E0515 12:27:03.762076 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.762131 kubelet[3207]: W0515 12:27:03.762120 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.762178 kubelet[3207]: E0515 12:27:03.762172 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.762411 kubelet[3207]: E0515 12:27:03.762340 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.762411 kubelet[3207]: W0515 12:27:03.762347 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.762411 kubelet[3207]: E0515 12:27:03.762355 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:03.763577 kubelet[3207]: E0515 12:27:03.763546 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.763577 kubelet[3207]: W0515 12:27:03.763564 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.763765 kubelet[3207]: E0515 12:27:03.763581 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.764131 kubelet[3207]: E0515 12:27:03.763860 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.764131 kubelet[3207]: W0515 12:27:03.763867 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.764131 kubelet[3207]: E0515 12:27:03.764024 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.764131 kubelet[3207]: W0515 12:27:03.764029 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.764131 kubelet[3207]: E0515 12:27:03.764116 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.764131 kubelet[3207]: W0515 12:27:03.764120 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: 
[init], error: executable file not found in $PATH, output: "" May 15 12:27:03.764270 kubelet[3207]: E0515 12:27:03.764184 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.764270 kubelet[3207]: W0515 12:27:03.764188 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.764270 kubelet[3207]: E0515 12:27:03.764195 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.764270 kubelet[3207]: E0515 12:27:03.764257 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.764270 kubelet[3207]: W0515 12:27:03.764261 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.764270 kubelet[3207]: E0515 12:27:03.764267 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.764376 kubelet[3207]: E0515 12:27:03.764280 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.764376 kubelet[3207]: E0515 12:27:03.764291 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:03.764376 kubelet[3207]: E0515 12:27:03.764299 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.779923 kubelet[3207]: E0515 12:27:03.779519 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.779923 kubelet[3207]: W0515 12:27:03.779535 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.779923 kubelet[3207]: E0515 12:27:03.779546 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.791112 containerd[1716]: time="2025-05-15T12:27:03.791030362Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5dc5f58c57-k7mbf,Uid:14877444-15f8-4a2c-a797-c8814e763063,Namespace:calico-system,Attempt:0,}" May 15 12:27:03.858178 kubelet[3207]: E0515 12:27:03.858164 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.858178 kubelet[3207]: W0515 12:27:03.858176 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.858317 kubelet[3207]: E0515 12:27:03.858187 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:03.858317 kubelet[3207]: E0515 12:27:03.858290 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.858317 kubelet[3207]: W0515 12:27:03.858295 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.858317 kubelet[3207]: E0515 12:27:03.858310 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.858420 kubelet[3207]: E0515 12:27:03.858405 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.858420 kubelet[3207]: W0515 12:27:03.858410 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.858420 kubelet[3207]: E0515 12:27:03.858416 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:03.858531 kubelet[3207]: E0515 12:27:03.858515 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.858531 kubelet[3207]: W0515 12:27:03.858520 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.858531 kubelet[3207]: E0515 12:27:03.858526 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.858638 kubelet[3207]: E0515 12:27:03.858630 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.858638 kubelet[3207]: W0515 12:27:03.858634 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.858638 kubelet[3207]: E0515 12:27:03.858640 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:03.858772 kubelet[3207]: E0515 12:27:03.858755 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.858772 kubelet[3207]: W0515 12:27:03.858765 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.858827 kubelet[3207]: E0515 12:27:03.858775 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.858874 kubelet[3207]: E0515 12:27:03.858865 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.858874 kubelet[3207]: W0515 12:27:03.858872 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.858955 kubelet[3207]: E0515 12:27:03.858878 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:03.859012 kubelet[3207]: E0515 12:27:03.858996 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.859012 kubelet[3207]: W0515 12:27:03.859005 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.859055 kubelet[3207]: E0515 12:27:03.859015 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.859131 kubelet[3207]: E0515 12:27:03.859121 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.859131 kubelet[3207]: W0515 12:27:03.859128 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.859189 kubelet[3207]: E0515 12:27:03.859136 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:03.859240 kubelet[3207]: E0515 12:27:03.859233 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.859240 kubelet[3207]: W0515 12:27:03.859239 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.859285 kubelet[3207]: E0515 12:27:03.859250 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.859349 kubelet[3207]: E0515 12:27:03.859340 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.859349 kubelet[3207]: W0515 12:27:03.859347 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.859349 kubelet[3207]: E0515 12:27:03.859357 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:03.859464 kubelet[3207]: E0515 12:27:03.859432 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.859464 kubelet[3207]: W0515 12:27:03.859436 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.859464 kubelet[3207]: E0515 12:27:03.859449 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.859564 kubelet[3207]: E0515 12:27:03.859555 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.859564 kubelet[3207]: W0515 12:27:03.859561 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.859637 kubelet[3207]: E0515 12:27:03.859569 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:03.859657 kubelet[3207]: E0515 12:27:03.859639 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.859657 kubelet[3207]: W0515 12:27:03.859643 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.859657 kubelet[3207]: E0515 12:27:03.859651 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.859780 kubelet[3207]: E0515 12:27:03.859714 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.859780 kubelet[3207]: W0515 12:27:03.859719 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.859780 kubelet[3207]: E0515 12:27:03.859724 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:03.859858 kubelet[3207]: E0515 12:27:03.859807 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.859858 kubelet[3207]: W0515 12:27:03.859811 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.859967 kubelet[3207]: E0515 12:27:03.859876 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.859967 kubelet[3207]: W0515 12:27:03.859879 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.859967 kubelet[3207]: E0515 12:27:03.859923 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.859967 kubelet[3207]: E0515 12:27:03.859948 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:03.859967 kubelet[3207]: E0515 12:27:03.859958 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.859967 kubelet[3207]: W0515 12:27:03.859962 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.860165 kubelet[3207]: E0515 12:27:03.859972 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.860165 kubelet[3207]: E0515 12:27:03.860068 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.860165 kubelet[3207]: W0515 12:27:03.860073 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.860165 kubelet[3207]: E0515 12:27:03.860081 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:03.860165 kubelet[3207]: E0515 12:27:03.860150 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.860165 kubelet[3207]: W0515 12:27:03.860154 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.860411 kubelet[3207]: E0515 12:27:03.860165 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.860411 kubelet[3207]: E0515 12:27:03.860243 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.860411 kubelet[3207]: W0515 12:27:03.860247 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.860411 kubelet[3207]: E0515 12:27:03.860262 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:03.860411 kubelet[3207]: E0515 12:27:03.860408 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.860558 kubelet[3207]: W0515 12:27:03.860415 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.860558 kubelet[3207]: E0515 12:27:03.860429 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.860558 kubelet[3207]: E0515 12:27:03.860505 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.860558 kubelet[3207]: W0515 12:27:03.860510 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.860558 kubelet[3207]: E0515 12:27:03.860516 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:03.860686 kubelet[3207]: E0515 12:27:03.860632 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.860686 kubelet[3207]: W0515 12:27:03.860636 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.860686 kubelet[3207]: E0515 12:27:03.860647 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.860793 kubelet[3207]: E0515 12:27:03.860782 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.860793 kubelet[3207]: W0515 12:27:03.860787 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.860850 kubelet[3207]: E0515 12:27:03.860793 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:03.910411 containerd[1716]: time="2025-05-15T12:27:03.910379395Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-cl9bz,Uid:0c30f140-51f0-4568-9be8-fac34ef2092f,Namespace:calico-system,Attempt:0,}" May 15 12:27:03.920645 kubelet[3207]: E0515 12:27:03.920600 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.920645 kubelet[3207]: W0515 12:27:03.920612 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.920645 kubelet[3207]: E0515 12:27:03.920623 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:04.238444 containerd[1716]: time="2025-05-15T12:27:04.238269900Z" level=info msg="connecting to shim bd2bd3d7fd86b56b0c5b28003df7185004844e172f189b1021fe5dd1a563aa49" address="unix:///run/containerd/s/355817190dbf43010091b9820e17bc4d47a99a08036e5589b3378a60c55abfc9" namespace=k8s.io protocol=ttrpc version=3 May 15 12:27:04.259050 systemd[1]: Started cri-containerd-bd2bd3d7fd86b56b0c5b28003df7185004844e172f189b1021fe5dd1a563aa49.scope - libcontainer container bd2bd3d7fd86b56b0c5b28003df7185004844e172f189b1021fe5dd1a563aa49. 
May 15 12:27:04.331890 containerd[1716]: time="2025-05-15T12:27:04.331850408Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5dc5f58c57-k7mbf,Uid:14877444-15f8-4a2c-a797-c8814e763063,Namespace:calico-system,Attempt:0,} returns sandbox id \"bd2bd3d7fd86b56b0c5b28003df7185004844e172f189b1021fe5dd1a563aa49\"" May 15 12:27:04.333001 containerd[1716]: time="2025-05-15T12:27:04.332980398Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.3\"" May 15 12:27:04.493093 containerd[1716]: time="2025-05-15T12:27:04.492845773Z" level=info msg="connecting to shim 8d00a366dca96e12befd8c725eae1308151badbed1ca5ac221c0142e34fc1d2d" address="unix:///run/containerd/s/992d9eb7e692e3b165af9814df3c66907176648619eaf21066a082cb2f80db45" namespace=k8s.io protocol=ttrpc version=3 May 15 12:27:04.513198 systemd[1]: Started cri-containerd-8d00a366dca96e12befd8c725eae1308151badbed1ca5ac221c0142e34fc1d2d.scope - libcontainer container 8d00a366dca96e12befd8c725eae1308151badbed1ca5ac221c0142e34fc1d2d. 
May 15 12:27:04.534593 containerd[1716]: time="2025-05-15T12:27:04.534566122Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-cl9bz,Uid:0c30f140-51f0-4568-9be8-fac34ef2092f,Namespace:calico-system,Attempt:0,} returns sandbox id \"8d00a366dca96e12befd8c725eae1308151badbed1ca5ac221c0142e34fc1d2d\"" May 15 12:27:05.273572 kubelet[3207]: E0515 12:27:05.273174 3207 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x9zg6" podUID="dcbe7eb9-f017-4039-b0fc-ef4f90e95554" May 15 12:27:07.270644 kubelet[3207]: E0515 12:27:07.269848 3207 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x9zg6" podUID="dcbe7eb9-f017-4039-b0fc-ef4f90e95554" May 15 12:27:08.477005 containerd[1716]: time="2025-05-15T12:27:08.476889781Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:27:08.538474 containerd[1716]: time="2025-05-15T12:27:08.538435851Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.3: active requests=0, bytes read=30426870" May 15 12:27:08.541764 containerd[1716]: time="2025-05-15T12:27:08.541719512Z" level=info msg="ImageCreate event name:\"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:27:08.587066 containerd[1716]: time="2025-05-15T12:27:08.587012569Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f5516aa6a78f00931d2625f3012dcf2c69d141ce41483b8d59c6ec6330a18620\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:27:08.587637 containerd[1716]: time="2025-05-15T12:27:08.587538559Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.3\" with image id \"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f5516aa6a78f00931d2625f3012dcf2c69d141ce41483b8d59c6ec6330a18620\", size \"31919484\" in 4.25451719s" May 15 12:27:08.587637 containerd[1716]: time="2025-05-15T12:27:08.587563391Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.3\" returns image reference \"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\"" May 15 12:27:08.588333 containerd[1716]: time="2025-05-15T12:27:08.588298599Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\"" May 15 12:27:08.598542 containerd[1716]: time="2025-05-15T12:27:08.598515738Z" level=info msg="CreateContainer within sandbox \"bd2bd3d7fd86b56b0c5b28003df7185004844e172f189b1021fe5dd1a563aa49\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" May 15 12:27:08.842251 containerd[1716]: time="2025-05-15T12:27:08.842175643Z" level=info msg="Container a3f7b9bca8f479ce2c5d4cea12fbf0b2e23d245079f5730326c44d8cde924476: CDI devices from CRI Config.CDIDevices: []" May 15 12:27:08.985849 containerd[1716]: time="2025-05-15T12:27:08.985824924Z" level=info msg="CreateContainer within sandbox \"bd2bd3d7fd86b56b0c5b28003df7185004844e172f189b1021fe5dd1a563aa49\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"a3f7b9bca8f479ce2c5d4cea12fbf0b2e23d245079f5730326c44d8cde924476\"" May 15 12:27:08.986213 containerd[1716]: time="2025-05-15T12:27:08.986192481Z" level=info msg="StartContainer for \"a3f7b9bca8f479ce2c5d4cea12fbf0b2e23d245079f5730326c44d8cde924476\"" May 15 12:27:08.988481 containerd[1716]: time="2025-05-15T12:27:08.988456417Z" level=info 
msg="connecting to shim a3f7b9bca8f479ce2c5d4cea12fbf0b2e23d245079f5730326c44d8cde924476" address="unix:///run/containerd/s/355817190dbf43010091b9820e17bc4d47a99a08036e5589b3378a60c55abfc9" protocol=ttrpc version=3 May 15 12:27:09.007814 systemd[1]: Started cri-containerd-a3f7b9bca8f479ce2c5d4cea12fbf0b2e23d245079f5730326c44d8cde924476.scope - libcontainer container a3f7b9bca8f479ce2c5d4cea12fbf0b2e23d245079f5730326c44d8cde924476. May 15 12:27:09.084954 containerd[1716]: time="2025-05-15T12:27:09.084898224Z" level=info msg="StartContainer for \"a3f7b9bca8f479ce2c5d4cea12fbf0b2e23d245079f5730326c44d8cde924476\" returns successfully" May 15 12:27:09.270006 kubelet[3207]: E0515 12:27:09.269960 3207 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x9zg6" podUID="dcbe7eb9-f017-4039-b0fc-ef4f90e95554" May 15 12:27:09.352434 kubelet[3207]: I0515 12:27:09.352361 3207 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-5dc5f58c57-k7mbf" podStartSLOduration=2.096851772 podStartE2EDuration="6.352339583s" podCreationTimestamp="2025-05-15 12:27:03 +0000 UTC" firstStartedPulling="2025-05-15 12:27:04.332678037 +0000 UTC m=+11.175135243" lastFinishedPulling="2025-05-15 12:27:08.58816585 +0000 UTC m=+15.430623054" observedRunningTime="2025-05-15 12:27:09.35149209 +0000 UTC m=+16.193949297" watchObservedRunningTime="2025-05-15 12:27:09.352339583 +0000 UTC m=+16.194796847" May 15 12:27:09.368983 kubelet[3207]: E0515 12:27:09.368944 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:09.368983 kubelet[3207]: W0515 12:27:09.368975 3207 driver-call.go:149] FlexVolume: driver call failed: executable: 
/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:09.369107 kubelet[3207]: E0515 12:27:09.368989 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:09.369107 kubelet[3207]: E0515 12:27:09.369087 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:09.369107 kubelet[3207]: W0515 12:27:09.369092 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:09.369107 kubelet[3207]: E0515 12:27:09.369099 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:09.369220 kubelet[3207]: E0515 12:27:09.369172 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:09.369220 kubelet[3207]: W0515 12:27:09.369177 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:09.369220 kubelet[3207]: E0515 12:27:09.369183 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:09.369295 kubelet[3207]: E0515 12:27:09.369257 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:09.369295 kubelet[3207]: W0515 12:27:09.369262 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:09.369295 kubelet[3207]: E0515 12:27:09.369267 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:09.369382 kubelet[3207]: E0515 12:27:09.369344 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:09.369382 kubelet[3207]: W0515 12:27:09.369348 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:09.369382 kubelet[3207]: E0515 12:27:09.369353 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:09.369461 kubelet[3207]: E0515 12:27:09.369419 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:09.369461 kubelet[3207]: W0515 12:27:09.369424 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:09.369461 kubelet[3207]: E0515 12:27:09.369429 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:09.369542 kubelet[3207]: E0515 12:27:09.369490 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:09.369542 kubelet[3207]: W0515 12:27:09.369494 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:09.369542 kubelet[3207]: E0515 12:27:09.369499 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:09.369620 kubelet[3207]: E0515 12:27:09.369563 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:09.369620 kubelet[3207]: W0515 12:27:09.369567 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:09.369620 kubelet[3207]: E0515 12:27:09.369572 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:09.369702 kubelet[3207]: E0515 12:27:09.369642 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:09.369702 kubelet[3207]: W0515 12:27:09.369647 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:09.369702 kubelet[3207]: E0515 12:27:09.369652 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:09.369779 kubelet[3207]: E0515 12:27:09.369731 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:09.369779 kubelet[3207]: W0515 12:27:09.369735 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:09.369779 kubelet[3207]: E0515 12:27:09.369740 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:09.369860 kubelet[3207]: E0515 12:27:09.369813 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:09.369860 kubelet[3207]: W0515 12:27:09.369816 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:09.369860 kubelet[3207]: E0515 12:27:09.369822 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:09.370007 kubelet[3207]: E0515 12:27:09.369894 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:09.370007 kubelet[3207]: W0515 12:27:09.369898 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:09.370007 kubelet[3207]: E0515 12:27:09.369904 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:09.370007 kubelet[3207]: E0515 12:27:09.369986 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:09.370007 kubelet[3207]: W0515 12:27:09.370001 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:09.370007 kubelet[3207]: E0515 12:27:09.370007 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:09.370183 kubelet[3207]: E0515 12:27:09.370075 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:09.370183 kubelet[3207]: W0515 12:27:09.370079 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:09.370183 kubelet[3207]: E0515 12:27:09.370084 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:09.370183 kubelet[3207]: E0515 12:27:09.370149 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:09.370183 kubelet[3207]: W0515 12:27:09.370153 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:09.370183 kubelet[3207]: E0515 12:27:09.370157 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:09.390481 kubelet[3207]: E0515 12:27:09.390459 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:09.390481 kubelet[3207]: W0515 12:27:09.390478 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:09.390656 kubelet[3207]: E0515 12:27:09.390491 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:09.390656 kubelet[3207]: E0515 12:27:09.390599 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:09.390656 kubelet[3207]: W0515 12:27:09.390604 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:09.390656 kubelet[3207]: E0515 12:27:09.390611 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:09.390746 kubelet[3207]: E0515 12:27:09.390704 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:09.390746 kubelet[3207]: W0515 12:27:09.390713 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:09.390746 kubelet[3207]: E0515 12:27:09.390729 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:09.390816 kubelet[3207]: E0515 12:27:09.390810 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:09.390836 kubelet[3207]: W0515 12:27:09.390817 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:09.390836 kubelet[3207]: E0515 12:27:09.390831 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:09.390905 kubelet[3207]: E0515 12:27:09.390899 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:09.390905 kubelet[3207]: W0515 12:27:09.390904 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:09.390968 kubelet[3207]: E0515 12:27:09.390928 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:09.391055 kubelet[3207]: E0515 12:27:09.391046 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:09.391055 kubelet[3207]: W0515 12:27:09.391053 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:09.391098 kubelet[3207]: E0515 12:27:09.391065 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:09.391244 kubelet[3207]: E0515 12:27:09.391188 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:09.391244 kubelet[3207]: W0515 12:27:09.391194 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:09.391244 kubelet[3207]: E0515 12:27:09.391201 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:09.391337 kubelet[3207]: E0515 12:27:09.391284 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:09.391337 kubelet[3207]: W0515 12:27:09.391288 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:09.391337 kubelet[3207]: E0515 12:27:09.391301 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:09.391431 kubelet[3207]: E0515 12:27:09.391381 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:09.391431 kubelet[3207]: W0515 12:27:09.391386 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:09.391431 kubelet[3207]: E0515 12:27:09.391394 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:09.391521 kubelet[3207]: E0515 12:27:09.391459 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:09.391521 kubelet[3207]: W0515 12:27:09.391463 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:09.391521 kubelet[3207]: E0515 12:27:09.391471 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:09.391624 kubelet[3207]: E0515 12:27:09.391530 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:09.391624 kubelet[3207]: W0515 12:27:09.391533 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:09.391624 kubelet[3207]: E0515 12:27:09.391541 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:09.391624 kubelet[3207]: E0515 12:27:09.391619 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:09.391624 kubelet[3207]: W0515 12:27:09.391623 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:09.391797 kubelet[3207]: E0515 12:27:09.391630 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:09.391797 kubelet[3207]: E0515 12:27:09.391784 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:09.391797 kubelet[3207]: W0515 12:27:09.391788 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:09.391870 kubelet[3207]: E0515 12:27:09.391796 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:09.391870 kubelet[3207]: E0515 12:27:09.391865 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:09.391870 kubelet[3207]: W0515 12:27:09.391869 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:09.391973 kubelet[3207]: E0515 12:27:09.391874 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:09.391973 kubelet[3207]: E0515 12:27:09.391961 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:09.391973 kubelet[3207]: W0515 12:27:09.391965 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:09.392042 kubelet[3207]: E0515 12:27:09.391976 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:09.392086 kubelet[3207]: E0515 12:27:09.392073 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:09.392086 kubelet[3207]: W0515 12:27:09.392081 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:09.392131 kubelet[3207]: E0515 12:27:09.392092 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:09.392262 kubelet[3207]: E0515 12:27:09.392222 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:09.392262 kubelet[3207]: W0515 12:27:09.392229 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:09.392262 kubelet[3207]: E0515 12:27:09.392236 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:09.392514 kubelet[3207]: E0515 12:27:09.392499 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:09.392514 kubelet[3207]: W0515 12:27:09.392513 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:09.392571 kubelet[3207]: E0515 12:27:09.392523 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:10.344186 kubelet[3207]: I0515 12:27:10.344167 3207 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 15 12:27:10.376598 kubelet[3207]: E0515 12:27:10.376581 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:10.376598 kubelet[3207]: W0515 12:27:10.376594 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:10.376598 kubelet[3207]: E0515 12:27:10.376607 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:10.376738 kubelet[3207]: E0515 12:27:10.376700 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:10.376738 kubelet[3207]: W0515 12:27:10.376704 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:10.376738 kubelet[3207]: E0515 12:27:10.376711 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:10.376804 kubelet[3207]: E0515 12:27:10.376789 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:10.376804 kubelet[3207]: W0515 12:27:10.376794 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:10.376804 kubelet[3207]: E0515 12:27:10.376799 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:10.376884 kubelet[3207]: E0515 12:27:10.376877 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:10.376907 kubelet[3207]: W0515 12:27:10.376883 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:10.376907 kubelet[3207]: E0515 12:27:10.376888 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:10.377006 kubelet[3207]: E0515 12:27:10.376981 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:10.377006 kubelet[3207]: W0515 12:27:10.377001 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:10.377058 kubelet[3207]: E0515 12:27:10.377007 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:10.377090 kubelet[3207]: E0515 12:27:10.377079 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:10.377090 kubelet[3207]: W0515 12:27:10.377086 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:10.377133 kubelet[3207]: E0515 12:27:10.377092 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:10.377164 kubelet[3207]: E0515 12:27:10.377159 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:10.377188 kubelet[3207]: W0515 12:27:10.377164 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:10.377188 kubelet[3207]: E0515 12:27:10.377169 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:10.377243 kubelet[3207]: E0515 12:27:10.377235 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:10.377243 kubelet[3207]: W0515 12:27:10.377241 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:10.377283 kubelet[3207]: E0515 12:27:10.377246 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:10.377340 kubelet[3207]: E0515 12:27:10.377330 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:10.377340 kubelet[3207]: W0515 12:27:10.377336 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:10.377388 kubelet[3207]: E0515 12:27:10.377341 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:10.377410 kubelet[3207]: E0515 12:27:10.377402 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:10.377410 kubelet[3207]: W0515 12:27:10.377407 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:10.377451 kubelet[3207]: E0515 12:27:10.377412 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:10.377473 kubelet[3207]: E0515 12:27:10.377469 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:10.377497 kubelet[3207]: W0515 12:27:10.377473 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:10.377497 kubelet[3207]: E0515 12:27:10.377478 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:10.377542 kubelet[3207]: E0515 12:27:10.377537 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:10.377563 kubelet[3207]: W0515 12:27:10.377543 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:10.377563 kubelet[3207]: E0515 12:27:10.377548 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:10.377640 kubelet[3207]: E0515 12:27:10.377632 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:10.377640 kubelet[3207]: W0515 12:27:10.377637 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:10.377694 kubelet[3207]: E0515 12:27:10.377643 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:10.377721 kubelet[3207]: E0515 12:27:10.377716 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:10.377743 kubelet[3207]: W0515 12:27:10.377720 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:10.377743 kubelet[3207]: E0515 12:27:10.377726 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:10.377798 kubelet[3207]: E0515 12:27:10.377794 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:10.377822 kubelet[3207]: W0515 12:27:10.377797 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:10.377822 kubelet[3207]: E0515 12:27:10.377803 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:10.398067 kubelet[3207]: E0515 12:27:10.398039 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:10.398067 kubelet[3207]: W0515 12:27:10.398063 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:10.398147 kubelet[3207]: E0515 12:27:10.398074 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:10.398184 kubelet[3207]: E0515 12:27:10.398178 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:10.398184 kubelet[3207]: W0515 12:27:10.398182 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:10.398230 kubelet[3207]: E0515 12:27:10.398189 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:10.398326 kubelet[3207]: E0515 12:27:10.398310 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:10.398326 kubelet[3207]: W0515 12:27:10.398318 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:10.398381 kubelet[3207]: E0515 12:27:10.398336 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:10.398512 kubelet[3207]: E0515 12:27:10.398491 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:10.398512 kubelet[3207]: W0515 12:27:10.398511 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:10.398561 kubelet[3207]: E0515 12:27:10.398524 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:10.398663 kubelet[3207]: E0515 12:27:10.398640 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:10.398663 kubelet[3207]: W0515 12:27:10.398660 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:10.398713 kubelet[3207]: E0515 12:27:10.398674 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:10.398793 kubelet[3207]: E0515 12:27:10.398775 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:10.398793 kubelet[3207]: W0515 12:27:10.398790 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:10.398851 kubelet[3207]: E0515 12:27:10.398799 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:10.398963 kubelet[3207]: E0515 12:27:10.398926 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:10.398963 kubelet[3207]: W0515 12:27:10.398933 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:10.398963 kubelet[3207]: E0515 12:27:10.398955 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:10.399163 kubelet[3207]: E0515 12:27:10.399141 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:10.399163 kubelet[3207]: W0515 12:27:10.399161 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:10.399216 kubelet[3207]: E0515 12:27:10.399170 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:10.399255 kubelet[3207]: E0515 12:27:10.399247 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:10.399277 kubelet[3207]: W0515 12:27:10.399255 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:10.399277 kubelet[3207]: E0515 12:27:10.399261 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:10.399373 kubelet[3207]: E0515 12:27:10.399356 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:10.399373 kubelet[3207]: W0515 12:27:10.399370 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:10.399419 kubelet[3207]: E0515 12:27:10.399379 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:10.399492 kubelet[3207]: E0515 12:27:10.399481 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:10.399492 kubelet[3207]: W0515 12:27:10.399488 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:10.399534 kubelet[3207]: E0515 12:27:10.399495 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:10.399638 kubelet[3207]: E0515 12:27:10.399626 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:10.399638 kubelet[3207]: W0515 12:27:10.399633 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:10.399699 kubelet[3207]: E0515 12:27:10.399644 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:10.399720 kubelet[3207]: E0515 12:27:10.399714 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:10.399720 kubelet[3207]: W0515 12:27:10.399718 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:10.399779 kubelet[3207]: E0515 12:27:10.399724 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:10.399809 kubelet[3207]: E0515 12:27:10.399788 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:10.399809 kubelet[3207]: W0515 12:27:10.399792 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:10.399809 kubelet[3207]: E0515 12:27:10.399800 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:10.399895 kubelet[3207]: E0515 12:27:10.399880 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:10.399895 kubelet[3207]: W0515 12:27:10.399884 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:10.399895 kubelet[3207]: E0515 12:27:10.399892 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:10.400251 kubelet[3207]: E0515 12:27:10.400085 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:10.400251 kubelet[3207]: W0515 12:27:10.400097 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:10.400251 kubelet[3207]: E0515 12:27:10.400111 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:10.400425 kubelet[3207]: E0515 12:27:10.400416 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:10.400456 kubelet[3207]: W0515 12:27:10.400451 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:10.400501 kubelet[3207]: E0515 12:27:10.400490 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:10.400676 kubelet[3207]: E0515 12:27:10.400654 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:10.400676 kubelet[3207]: W0515 12:27:10.400674 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:10.400734 kubelet[3207]: E0515 12:27:10.400681 3207 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:11.270972 kubelet[3207]: E0515 12:27:11.270317 3207 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x9zg6" podUID="dcbe7eb9-f017-4039-b0fc-ef4f90e95554" May 15 12:27:11.581628 containerd[1716]: time="2025-05-15T12:27:11.581551861Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:27:11.583965 containerd[1716]: time="2025-05-15T12:27:11.583928236Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3: active requests=0, bytes read=5366937" May 15 12:27:11.631038 containerd[1716]: time="2025-05-15T12:27:11.630992084Z" level=info msg="ImageCreate event name:\"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:27:11.679754 containerd[1716]: time="2025-05-15T12:27:11.679696415Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:eeaa2bb4f9b1aa61adde43ce6dea95eee89291f96963548e108d9a2dfbc5edd1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:27:11.680216 containerd[1716]: time="2025-05-15T12:27:11.680140048Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" with image id \"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:eeaa2bb4f9b1aa61adde43ce6dea95eee89291f96963548e108d9a2dfbc5edd1\", size \"6859519\" in 3.09181461s" May 15 12:27:11.680216 containerd[1716]: time="2025-05-15T12:27:11.680164633Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" returns image reference \"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\"" May 15 12:27:11.682074 containerd[1716]: time="2025-05-15T12:27:11.682039684Z" level=info msg="CreateContainer within sandbox \"8d00a366dca96e12befd8c725eae1308151badbed1ca5ac221c0142e34fc1d2d\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" May 15 12:27:11.840186 containerd[1716]: time="2025-05-15T12:27:11.840132135Z" level=info msg="Container ef46ba10c984489897c7d718e539debba6565d1ca4986f85ec3356a2af6323d3: CDI devices from CRI Config.CDIDevices: []" May 15 12:27:12.143629 containerd[1716]: time="2025-05-15T12:27:12.143544236Z" level=info msg="CreateContainer within sandbox \"8d00a366dca96e12befd8c725eae1308151badbed1ca5ac221c0142e34fc1d2d\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"ef46ba10c984489897c7d718e539debba6565d1ca4986f85ec3356a2af6323d3\"" May 15 12:27:12.144381 containerd[1716]: time="2025-05-15T12:27:12.143956454Z" level=info msg="StartContainer for \"ef46ba10c984489897c7d718e539debba6565d1ca4986f85ec3356a2af6323d3\"" May 15 12:27:12.145304 containerd[1716]: 
time="2025-05-15T12:27:12.145239909Z" level=info msg="connecting to shim ef46ba10c984489897c7d718e539debba6565d1ca4986f85ec3356a2af6323d3" address="unix:///run/containerd/s/992d9eb7e692e3b165af9814df3c66907176648619eaf21066a082cb2f80db45" protocol=ttrpc version=3 May 15 12:27:12.165078 systemd[1]: Started cri-containerd-ef46ba10c984489897c7d718e539debba6565d1ca4986f85ec3356a2af6323d3.scope - libcontainer container ef46ba10c984489897c7d718e539debba6565d1ca4986f85ec3356a2af6323d3. May 15 12:27:12.193768 containerd[1716]: time="2025-05-15T12:27:12.193734458Z" level=info msg="StartContainer for \"ef46ba10c984489897c7d718e539debba6565d1ca4986f85ec3356a2af6323d3\" returns successfully" May 15 12:27:12.196039 systemd[1]: cri-containerd-ef46ba10c984489897c7d718e539debba6565d1ca4986f85ec3356a2af6323d3.scope: Deactivated successfully. May 15 12:27:12.198545 containerd[1716]: time="2025-05-15T12:27:12.198515672Z" level=info msg="received exit event container_id:\"ef46ba10c984489897c7d718e539debba6565d1ca4986f85ec3356a2af6323d3\" id:\"ef46ba10c984489897c7d718e539debba6565d1ca4986f85ec3356a2af6323d3\" pid:3889 exited_at:{seconds:1747312032 nanos:198265208}" May 15 12:27:12.198621 containerd[1716]: time="2025-05-15T12:27:12.198607400Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ef46ba10c984489897c7d718e539debba6565d1ca4986f85ec3356a2af6323d3\" id:\"ef46ba10c984489897c7d718e539debba6565d1ca4986f85ec3356a2af6323d3\" pid:3889 exited_at:{seconds:1747312032 nanos:198265208}" May 15 12:27:12.212865 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ef46ba10c984489897c7d718e539debba6565d1ca4986f85ec3356a2af6323d3-rootfs.mount: Deactivated successfully. 
May 15 12:27:13.269699 kubelet[3207]: E0515 12:27:13.269606 3207 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x9zg6" podUID="dcbe7eb9-f017-4039-b0fc-ef4f90e95554" May 15 12:27:15.269938 kubelet[3207]: E0515 12:27:15.269725 3207 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x9zg6" podUID="dcbe7eb9-f017-4039-b0fc-ef4f90e95554" May 15 12:27:17.270876 kubelet[3207]: E0515 12:27:17.270090 3207 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x9zg6" podUID="dcbe7eb9-f017-4039-b0fc-ef4f90e95554" May 15 12:27:19.175999 kubelet[3207]: I0515 12:27:19.175823 3207 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 15 12:27:19.270564 kubelet[3207]: E0515 12:27:19.270446 3207 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x9zg6" podUID="dcbe7eb9-f017-4039-b0fc-ef4f90e95554" May 15 12:27:21.270945 kubelet[3207]: E0515 12:27:21.270181 3207 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" 
pod="calico-system/csi-node-driver-x9zg6" podUID="dcbe7eb9-f017-4039-b0fc-ef4f90e95554" May 15 12:27:22.199570 containerd[1716]: time="2025-05-15T12:27:22.199501939Z" level=error msg="failed to handle container TaskExit event container_id:\"ef46ba10c984489897c7d718e539debba6565d1ca4986f85ec3356a2af6323d3\" id:\"ef46ba10c984489897c7d718e539debba6565d1ca4986f85ec3356a2af6323d3\" pid:3889 exited_at:{seconds:1747312032 nanos:198265208}" error="failed to stop container: failed to delete task: context deadline exceeded" May 15 12:27:23.270946 kubelet[3207]: E0515 12:27:23.269962 3207 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x9zg6" podUID="dcbe7eb9-f017-4039-b0fc-ef4f90e95554" May 15 12:27:23.662751 containerd[1716]: time="2025-05-15T12:27:23.662630867Z" level=info msg="TaskExit event container_id:\"ef46ba10c984489897c7d718e539debba6565d1ca4986f85ec3356a2af6323d3\" id:\"ef46ba10c984489897c7d718e539debba6565d1ca4986f85ec3356a2af6323d3\" pid:3889 exited_at:{seconds:1747312032 nanos:198265208}" May 15 12:27:25.270356 kubelet[3207]: E0515 12:27:25.269547 3207 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x9zg6" podUID="dcbe7eb9-f017-4039-b0fc-ef4f90e95554" May 15 12:27:26.032151 containerd[1716]: time="2025-05-15T12:27:25.662851185Z" level=error msg="get state for ef46ba10c984489897c7d718e539debba6565d1ca4986f85ec3356a2af6323d3" error="context deadline exceeded" May 15 12:27:26.032151 containerd[1716]: time="2025-05-15T12:27:25.662883907Z" level=warning msg="unknown status" status=0 May 15 12:27:27.270015 kubelet[3207]: E0515 12:27:27.269705 
3207 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x9zg6" podUID="dcbe7eb9-f017-4039-b0fc-ef4f90e95554" May 15 12:27:27.664413 containerd[1716]: time="2025-05-15T12:27:27.664277620Z" level=error msg="get state for ef46ba10c984489897c7d718e539debba6565d1ca4986f85ec3356a2af6323d3" error="context deadline exceeded" May 15 12:27:27.664413 containerd[1716]: time="2025-05-15T12:27:27.664339047Z" level=warning msg="unknown status" status=0 May 15 12:27:29.270426 kubelet[3207]: E0515 12:27:29.270251 3207 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x9zg6" podUID="dcbe7eb9-f017-4039-b0fc-ef4f90e95554" May 15 12:27:29.665599 containerd[1716]: time="2025-05-15T12:27:29.665470301Z" level=error msg="get state for ef46ba10c984489897c7d718e539debba6565d1ca4986f85ec3356a2af6323d3" error="context deadline exceeded" May 15 12:27:29.665599 containerd[1716]: time="2025-05-15T12:27:29.665514766Z" level=warning msg="unknown status" status=0 May 15 12:27:31.271405 kubelet[3207]: E0515 12:27:31.270363 3207 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x9zg6" podUID="dcbe7eb9-f017-4039-b0fc-ef4f90e95554" May 15 12:27:32.728894 containerd[1716]: time="2025-05-15T12:27:32.728801687Z" level=error msg="ttrpc: received message on inactive stream" stream=31 May 15 12:27:32.728894 containerd[1716]: time="2025-05-15T12:27:32.728869971Z" level=error 
msg="ttrpc: received message on inactive stream" stream=37 May 15 12:27:32.728894 containerd[1716]: time="2025-05-15T12:27:32.728883232Z" level=error msg="ttrpc: received message on inactive stream" stream=39 May 15 12:27:32.728894 containerd[1716]: time="2025-05-15T12:27:32.728892319Z" level=error msg="ttrpc: received message on inactive stream" stream=41 May 15 12:27:32.730125 containerd[1716]: time="2025-05-15T12:27:32.730091163Z" level=info msg="Ensure that container ef46ba10c984489897c7d718e539debba6565d1ca4986f85ec3356a2af6323d3 in task-service has been cleanup successfully" May 15 12:27:33.270632 kubelet[3207]: E0515 12:27:33.269947 3207 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x9zg6" podUID="dcbe7eb9-f017-4039-b0fc-ef4f90e95554" May 15 12:27:33.383606 containerd[1716]: time="2025-05-15T12:27:33.383566457Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.3\"" May 15 12:27:35.270113 kubelet[3207]: E0515 12:27:35.270079 3207 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x9zg6" podUID="dcbe7eb9-f017-4039-b0fc-ef4f90e95554" May 15 12:27:37.269934 kubelet[3207]: E0515 12:27:37.269640 3207 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x9zg6" podUID="dcbe7eb9-f017-4039-b0fc-ef4f90e95554" May 15 12:27:39.270483 kubelet[3207]: E0515 12:27:39.270371 3207 pod_workers.go:1301] "Error syncing pod, skipping" 
err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x9zg6" podUID="dcbe7eb9-f017-4039-b0fc-ef4f90e95554" May 15 12:27:41.270620 kubelet[3207]: E0515 12:27:41.270256 3207 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x9zg6" podUID="dcbe7eb9-f017-4039-b0fc-ef4f90e95554" May 15 12:27:43.270131 kubelet[3207]: E0515 12:27:43.270097 3207 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x9zg6" podUID="dcbe7eb9-f017-4039-b0fc-ef4f90e95554" May 15 12:27:43.526305 containerd[1716]: time="2025-05-15T12:27:43.526220947Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:27:43.588092 containerd[1716]: time="2025-05-15T12:27:43.588062789Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.3: active requests=0, bytes read=97793683" May 15 12:27:43.590865 containerd[1716]: time="2025-05-15T12:27:43.590826237Z" level=info msg="ImageCreate event name:\"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:27:43.636448 containerd[1716]: time="2025-05-15T12:27:43.636385265Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:4505ec8f976470994b6a94295a4dabac0cb98375db050e959a22603e00ada90b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:27:43.637091 
containerd[1716]: time="2025-05-15T12:27:43.636927303Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.3\" with image id \"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:4505ec8f976470994b6a94295a4dabac0cb98375db050e959a22603e00ada90b\", size \"99286305\" in 10.25331184s" May 15 12:27:43.637091 containerd[1716]: time="2025-05-15T12:27:43.636952754Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.3\" returns image reference \"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\"" May 15 12:27:43.638735 containerd[1716]: time="2025-05-15T12:27:43.638712097Z" level=info msg="CreateContainer within sandbox \"8d00a366dca96e12befd8c725eae1308151badbed1ca5ac221c0142e34fc1d2d\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" May 15 12:27:43.783969 containerd[1716]: time="2025-05-15T12:27:43.781691546Z" level=info msg="Container aac873df109fdc5fb060dad3ad691a98a42ec5b0e3667e2a13cd18530bafa5bf: CDI devices from CRI Config.CDIDevices: []" May 15 12:27:43.940462 containerd[1716]: time="2025-05-15T12:27:43.940442847Z" level=info msg="CreateContainer within sandbox \"8d00a366dca96e12befd8c725eae1308151badbed1ca5ac221c0142e34fc1d2d\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"aac873df109fdc5fb060dad3ad691a98a42ec5b0e3667e2a13cd18530bafa5bf\"" May 15 12:27:43.941551 containerd[1716]: time="2025-05-15T12:27:43.940732468Z" level=info msg="StartContainer for \"aac873df109fdc5fb060dad3ad691a98a42ec5b0e3667e2a13cd18530bafa5bf\"" May 15 12:27:43.942208 containerd[1716]: time="2025-05-15T12:27:43.942182179Z" level=info msg="connecting to shim aac873df109fdc5fb060dad3ad691a98a42ec5b0e3667e2a13cd18530bafa5bf" address="unix:///run/containerd/s/992d9eb7e692e3b165af9814df3c66907176648619eaf21066a082cb2f80db45" protocol=ttrpc version=3 May 15 12:27:43.965074 systemd[1]: 
Started cri-containerd-aac873df109fdc5fb060dad3ad691a98a42ec5b0e3667e2a13cd18530bafa5bf.scope - libcontainer container aac873df109fdc5fb060dad3ad691a98a42ec5b0e3667e2a13cd18530bafa5bf. May 15 12:27:43.994504 containerd[1716]: time="2025-05-15T12:27:43.994337201Z" level=info msg="StartContainer for \"aac873df109fdc5fb060dad3ad691a98a42ec5b0e3667e2a13cd18530bafa5bf\" returns successfully" May 15 12:27:45.270279 kubelet[3207]: E0515 12:27:45.269366 3207 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x9zg6" podUID="dcbe7eb9-f017-4039-b0fc-ef4f90e95554" May 15 12:27:47.270285 kubelet[3207]: E0515 12:27:47.269556 3207 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x9zg6" podUID="dcbe7eb9-f017-4039-b0fc-ef4f90e95554" May 15 12:27:49.270729 kubelet[3207]: E0515 12:27:49.270348 3207 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x9zg6" podUID="dcbe7eb9-f017-4039-b0fc-ef4f90e95554" May 15 12:27:51.270652 kubelet[3207]: E0515 12:27:51.270207 3207 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x9zg6" podUID="dcbe7eb9-f017-4039-b0fc-ef4f90e95554" May 15 12:27:51.790698 containerd[1716]: time="2025-05-15T12:27:51.790662446Z" 
level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 15 12:27:51.792246 systemd[1]: cri-containerd-aac873df109fdc5fb060dad3ad691a98a42ec5b0e3667e2a13cd18530bafa5bf.scope: Deactivated successfully. May 15 12:27:51.792709 systemd[1]: cri-containerd-aac873df109fdc5fb060dad3ad691a98a42ec5b0e3667e2a13cd18530bafa5bf.scope: Consumed 332ms CPU time, 170.1M memory peak, 154M written to disk. May 15 12:27:51.794203 containerd[1716]: time="2025-05-15T12:27:51.794138828Z" level=info msg="received exit event container_id:\"aac873df109fdc5fb060dad3ad691a98a42ec5b0e3667e2a13cd18530bafa5bf\" id:\"aac873df109fdc5fb060dad3ad691a98a42ec5b0e3667e2a13cd18530bafa5bf\" pid:3953 exited_at:{seconds:1747312071 nanos:793857688}" May 15 12:27:51.794333 containerd[1716]: time="2025-05-15T12:27:51.794297105Z" level=info msg="TaskExit event in podsandbox handler container_id:\"aac873df109fdc5fb060dad3ad691a98a42ec5b0e3667e2a13cd18530bafa5bf\" id:\"aac873df109fdc5fb060dad3ad691a98a42ec5b0e3667e2a13cd18530bafa5bf\" pid:3953 exited_at:{seconds:1747312071 nanos:793857688}" May 15 12:27:51.811443 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-aac873df109fdc5fb060dad3ad691a98a42ec5b0e3667e2a13cd18530bafa5bf-rootfs.mount: Deactivated successfully. May 15 12:27:51.857456 kubelet[3207]: I0515 12:27:51.857439 3207 kubelet_node_status.go:488] "Fast updating node status as it just became ready" May 15 12:27:51.894408 systemd[1]: Created slice kubepods-burstable-podeb5df7ca_784b_4d32_8ab0_b2fcf0d8f4a3.slice - libcontainer container kubepods-burstable-podeb5df7ca_784b_4d32_8ab0_b2fcf0d8f4a3.slice. 
May 15 12:27:51.904029 systemd[1]: Created slice kubepods-burstable-pod879a1dfa_1019_4d50_8d7e_cf0e572f63f5.slice - libcontainer container kubepods-burstable-pod879a1dfa_1019_4d50_8d7e_cf0e572f63f5.slice. May 15 12:27:51.911852 systemd[1]: Created slice kubepods-besteffort-pod098a7a46_c8fb_45c8_bc2a_0777e39cd9d5.slice - libcontainer container kubepods-besteffort-pod098a7a46_c8fb_45c8_bc2a_0777e39cd9d5.slice. May 15 12:27:51.917172 systemd[1]: Created slice kubepods-besteffort-pod359bdd50_8681_48b5_afa4_914ea552e2b3.slice - libcontainer container kubepods-besteffort-pod359bdd50_8681_48b5_afa4_914ea552e2b3.slice. May 15 12:27:51.924101 systemd[1]: Created slice kubepods-besteffort-pod780b7c79_8417_4cc7_b89c_26a4ca3518c0.slice - libcontainer container kubepods-besteffort-pod780b7c79_8417_4cc7_b89c_26a4ca3518c0.slice. May 15 12:27:52.017683 kubelet[3207]: I0515 12:27:52.017639 3207 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjnjb\" (UniqueName: \"kubernetes.io/projected/eb5df7ca-784b-4d32-8ab0-b2fcf0d8f4a3-kube-api-access-cjnjb\") pod \"coredns-6f6b679f8f-lvfdw\" (UID: \"eb5df7ca-784b-4d32-8ab0-b2fcf0d8f4a3\") " pod="kube-system/coredns-6f6b679f8f-lvfdw" May 15 12:27:52.017797 kubelet[3207]: I0515 12:27:52.017719 3207 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bccmx\" (UniqueName: \"kubernetes.io/projected/780b7c79-8417-4cc7-b89c-26a4ca3518c0-kube-api-access-bccmx\") pod \"calico-apiserver-57b6bcfb55-pdhwm\" (UID: \"780b7c79-8417-4cc7-b89c-26a4ca3518c0\") " pod="calico-apiserver/calico-apiserver-57b6bcfb55-pdhwm" May 15 12:27:52.017797 kubelet[3207]: I0515 12:27:52.017751 3207 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxkcg\" (UniqueName: \"kubernetes.io/projected/879a1dfa-1019-4d50-8d7e-cf0e572f63f5-kube-api-access-cxkcg\") pod \"coredns-6f6b679f8f-dmc8v\" 
(UID: \"879a1dfa-1019-4d50-8d7e-cf0e572f63f5\") " pod="kube-system/coredns-6f6b679f8f-dmc8v" May 15 12:27:52.017797 kubelet[3207]: I0515 12:27:52.017773 3207 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/879a1dfa-1019-4d50-8d7e-cf0e572f63f5-config-volume\") pod \"coredns-6f6b679f8f-dmc8v\" (UID: \"879a1dfa-1019-4d50-8d7e-cf0e572f63f5\") " pod="kube-system/coredns-6f6b679f8f-dmc8v" May 15 12:27:52.017797 kubelet[3207]: I0515 12:27:52.017790 3207 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/780b7c79-8417-4cc7-b89c-26a4ca3518c0-calico-apiserver-certs\") pod \"calico-apiserver-57b6bcfb55-pdhwm\" (UID: \"780b7c79-8417-4cc7-b89c-26a4ca3518c0\") " pod="calico-apiserver/calico-apiserver-57b6bcfb55-pdhwm" May 15 12:27:52.017896 kubelet[3207]: I0515 12:27:52.017809 3207 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/098a7a46-c8fb-45c8-bc2a-0777e39cd9d5-calico-apiserver-certs\") pod \"calico-apiserver-57b6bcfb55-gpjjj\" (UID: \"098a7a46-c8fb-45c8-bc2a-0777e39cd9d5\") " pod="calico-apiserver/calico-apiserver-57b6bcfb55-gpjjj" May 15 12:27:52.017896 kubelet[3207]: I0515 12:27:52.017836 3207 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/359bdd50-8681-48b5-afa4-914ea552e2b3-tigera-ca-bundle\") pod \"calico-kube-controllers-6d87fb4d96-lvq8m\" (UID: \"359bdd50-8681-48b5-afa4-914ea552e2b3\") " pod="calico-system/calico-kube-controllers-6d87fb4d96-lvq8m" May 15 12:27:52.017896 kubelet[3207]: I0515 12:27:52.017852 3207 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hmsf\" (UniqueName: 
\"kubernetes.io/projected/359bdd50-8681-48b5-afa4-914ea552e2b3-kube-api-access-8hmsf\") pod \"calico-kube-controllers-6d87fb4d96-lvq8m\" (UID: \"359bdd50-8681-48b5-afa4-914ea552e2b3\") " pod="calico-system/calico-kube-controllers-6d87fb4d96-lvq8m" May 15 12:27:52.017896 kubelet[3207]: I0515 12:27:52.017870 3207 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5pfr\" (UniqueName: \"kubernetes.io/projected/098a7a46-c8fb-45c8-bc2a-0777e39cd9d5-kube-api-access-m5pfr\") pod \"calico-apiserver-57b6bcfb55-gpjjj\" (UID: \"098a7a46-c8fb-45c8-bc2a-0777e39cd9d5\") " pod="calico-apiserver/calico-apiserver-57b6bcfb55-gpjjj" May 15 12:27:52.018021 kubelet[3207]: I0515 12:27:52.017887 3207 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb5df7ca-784b-4d32-8ab0-b2fcf0d8f4a3-config-volume\") pod \"coredns-6f6b679f8f-lvfdw\" (UID: \"eb5df7ca-784b-4d32-8ab0-b2fcf0d8f4a3\") " pod="kube-system/coredns-6f6b679f8f-lvfdw" May 15 12:27:52.200416 containerd[1716]: time="2025-05-15T12:27:52.200066003Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-lvfdw,Uid:eb5df7ca-784b-4d32-8ab0-b2fcf0d8f4a3,Namespace:kube-system,Attempt:0,}" May 15 12:27:52.209602 containerd[1716]: time="2025-05-15T12:27:52.209575024Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-dmc8v,Uid:879a1dfa-1019-4d50-8d7e-cf0e572f63f5,Namespace:kube-system,Attempt:0,}" May 15 12:27:52.215254 containerd[1716]: time="2025-05-15T12:27:52.215235105Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57b6bcfb55-gpjjj,Uid:098a7a46-c8fb-45c8-bc2a-0777e39cd9d5,Namespace:calico-apiserver,Attempt:0,}" May 15 12:27:52.221899 containerd[1716]: time="2025-05-15T12:27:52.221872565Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-6d87fb4d96-lvq8m,Uid:359bdd50-8681-48b5-afa4-914ea552e2b3,Namespace:calico-system,Attempt:0,}" May 15 12:27:52.227339 containerd[1716]: time="2025-05-15T12:27:52.227319224Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57b6bcfb55-pdhwm,Uid:780b7c79-8417-4cc7-b89c-26a4ca3518c0,Namespace:calico-apiserver,Attempt:0,}" May 15 12:27:53.276184 systemd[1]: Created slice kubepods-besteffort-poddcbe7eb9_f017_4039_b0fc_ef4f90e95554.slice - libcontainer container kubepods-besteffort-poddcbe7eb9_f017_4039_b0fc_ef4f90e95554.slice. May 15 12:27:53.278469 containerd[1716]: time="2025-05-15T12:27:53.278440240Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-x9zg6,Uid:dcbe7eb9-f017-4039-b0fc-ef4f90e95554,Namespace:calico-system,Attempt:0,}" May 15 12:28:01.316554 containerd[1716]: time="2025-05-15T12:28:01.316502937Z" level=error msg="Failed to destroy network for sandbox \"a3c638e9fe03f02fd84b20901fc2634fdb73dab5797aaec4c75f91bfb6e81b4a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:01.318203 systemd[1]: run-netns-cni\x2dda62b19b\x2d52cc\x2d11b8\x2d9234\x2d2573216f0275.mount: Deactivated successfully. 
May 15 12:28:01.420568 containerd[1716]: time="2025-05-15T12:28:01.420531155Z" level=error msg="Failed to destroy network for sandbox \"925e21031c08b06fda9173a944b2d1ca0005e6a58a16e7de1c361345887373eb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:01.430708 containerd[1716]: time="2025-05-15T12:28:01.430556192Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.3\"" May 15 12:28:01.435172 containerd[1716]: time="2025-05-15T12:28:01.435130559Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-lvfdw,Uid:eb5df7ca-784b-4d32-8ab0-b2fcf0d8f4a3,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a3c638e9fe03f02fd84b20901fc2634fdb73dab5797aaec4c75f91bfb6e81b4a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:01.435485 kubelet[3207]: E0515 12:28:01.435444 3207 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a3c638e9fe03f02fd84b20901fc2634fdb73dab5797aaec4c75f91bfb6e81b4a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:01.436248 kubelet[3207]: E0515 12:28:01.435503 3207 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a3c638e9fe03f02fd84b20901fc2634fdb73dab5797aaec4c75f91bfb6e81b4a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-lvfdw" May 15 12:28:01.436248 kubelet[3207]: E0515 12:28:01.435530 3207 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a3c638e9fe03f02fd84b20901fc2634fdb73dab5797aaec4c75f91bfb6e81b4a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-lvfdw" May 15 12:28:01.436248 kubelet[3207]: E0515 12:28:01.435571 3207 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-lvfdw_kube-system(eb5df7ca-784b-4d32-8ab0-b2fcf0d8f4a3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-lvfdw_kube-system(eb5df7ca-784b-4d32-8ab0-b2fcf0d8f4a3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a3c638e9fe03f02fd84b20901fc2634fdb73dab5797aaec4c75f91bfb6e81b4a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-lvfdw" podUID="eb5df7ca-784b-4d32-8ab0-b2fcf0d8f4a3" May 15 12:28:01.470385 containerd[1716]: time="2025-05-15T12:28:01.470359448Z" level=error msg="Failed to destroy network for sandbox \"6bc9caa21003606ce0e31acb386240bb777caa2c3de334c2aa38a8358907a577\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:01.560664 containerd[1716]: time="2025-05-15T12:28:01.560631106Z" level=error msg="Failed to destroy network for sandbox \"2c160c82aebf8383700121cc0f4990c7459a3b46305a78344fc15af9ef4c45f0\"" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:01.614004 containerd[1716]: time="2025-05-15T12:28:01.613902687Z" level=error msg="Failed to destroy network for sandbox \"b4718a905acc0c6f09a400bfb947a7c2fb0878ce8dd7cbc47a9718da60b80187\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:01.658354 containerd[1716]: time="2025-05-15T12:28:01.658323341Z" level=error msg="Failed to destroy network for sandbox \"ca1b67c4bad48a6c367f41b8704ad71f77662b532137884ab4c3d1ea933abdf2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:01.676586 containerd[1716]: time="2025-05-15T12:28:01.676529027Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-dmc8v,Uid:879a1dfa-1019-4d50-8d7e-cf0e572f63f5,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"925e21031c08b06fda9173a944b2d1ca0005e6a58a16e7de1c361345887373eb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:01.676705 kubelet[3207]: E0515 12:28:01.676677 3207 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"925e21031c08b06fda9173a944b2d1ca0005e6a58a16e7de1c361345887373eb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:01.676746 kubelet[3207]: E0515 12:28:01.676724 3207 
kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"925e21031c08b06fda9173a944b2d1ca0005e6a58a16e7de1c361345887373eb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-dmc8v" May 15 12:28:01.676791 kubelet[3207]: E0515 12:28:01.676741 3207 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"925e21031c08b06fda9173a944b2d1ca0005e6a58a16e7de1c361345887373eb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-dmc8v" May 15 12:28:01.676791 kubelet[3207]: E0515 12:28:01.676775 3207 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-dmc8v_kube-system(879a1dfa-1019-4d50-8d7e-cf0e572f63f5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-dmc8v_kube-system(879a1dfa-1019-4d50-8d7e-cf0e572f63f5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"925e21031c08b06fda9173a944b2d1ca0005e6a58a16e7de1c361345887373eb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-dmc8v" podUID="879a1dfa-1019-4d50-8d7e-cf0e572f63f5" May 15 12:28:01.786134 containerd[1716]: time="2025-05-15T12:28:01.786103920Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6d87fb4d96-lvq8m,Uid:359bdd50-8681-48b5-afa4-914ea552e2b3,Namespace:calico-system,Attempt:0,} failed, error" error="rpc 
error: code = Unknown desc = failed to setup network for sandbox \"6bc9caa21003606ce0e31acb386240bb777caa2c3de334c2aa38a8358907a577\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:01.786327 kubelet[3207]: E0515 12:28:01.786299 3207 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6bc9caa21003606ce0e31acb386240bb777caa2c3de334c2aa38a8358907a577\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:01.786378 kubelet[3207]: E0515 12:28:01.786344 3207 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6bc9caa21003606ce0e31acb386240bb777caa2c3de334c2aa38a8358907a577\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6d87fb4d96-lvq8m" May 15 12:28:01.786378 kubelet[3207]: E0515 12:28:01.786364 3207 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6bc9caa21003606ce0e31acb386240bb777caa2c3de334c2aa38a8358907a577\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6d87fb4d96-lvq8m" May 15 12:28:01.786434 kubelet[3207]: E0515 12:28:01.786397 3207 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"calico-kube-controllers-6d87fb4d96-lvq8m_calico-system(359bdd50-8681-48b5-afa4-914ea552e2b3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6d87fb4d96-lvq8m_calico-system(359bdd50-8681-48b5-afa4-914ea552e2b3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6bc9caa21003606ce0e31acb386240bb777caa2c3de334c2aa38a8358907a577\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6d87fb4d96-lvq8m" podUID="359bdd50-8681-48b5-afa4-914ea552e2b3" May 15 12:28:01.788824 systemd[1]: run-netns-cni\x2d104518ba\x2db458\x2dd4f0\x2d0ef7\x2d542de57db062.mount: Deactivated successfully. May 15 12:28:01.788902 systemd[1]: run-netns-cni\x2d107581ad\x2d626c\x2d5282\x2d35a9\x2dca3f88914db1.mount: Deactivated successfully. May 15 12:28:01.788958 systemd[1]: run-netns-cni\x2d66327068\x2d09a4\x2dc367\x2d2be0\x2de74d63e0726d.mount: Deactivated successfully. May 15 12:28:01.789001 systemd[1]: run-netns-cni\x2db3577949\x2d32e0\x2d8f77\x2d6245\x2da45f809674de.mount: Deactivated successfully. May 15 12:28:01.789041 systemd[1]: run-netns-cni\x2deaa031bb\x2dfc90\x2d93fe\x2d5f06\x2d46e7dfae66d5.mount: Deactivated successfully. 
May 15 12:28:01.833516 containerd[1716]: time="2025-05-15T12:28:01.833456038Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57b6bcfb55-gpjjj,Uid:098a7a46-c8fb-45c8-bc2a-0777e39cd9d5,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2c160c82aebf8383700121cc0f4990c7459a3b46305a78344fc15af9ef4c45f0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:01.833643 kubelet[3207]: E0515 12:28:01.833606 3207 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2c160c82aebf8383700121cc0f4990c7459a3b46305a78344fc15af9ef4c45f0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:01.833699 kubelet[3207]: E0515 12:28:01.833651 3207 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2c160c82aebf8383700121cc0f4990c7459a3b46305a78344fc15af9ef4c45f0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-57b6bcfb55-gpjjj" May 15 12:28:01.833699 kubelet[3207]: E0515 12:28:01.833669 3207 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2c160c82aebf8383700121cc0f4990c7459a3b46305a78344fc15af9ef4c45f0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-57b6bcfb55-gpjjj" May 15 12:28:01.833760 kubelet[3207]: E0515 12:28:01.833715 3207 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-57b6bcfb55-gpjjj_calico-apiserver(098a7a46-c8fb-45c8-bc2a-0777e39cd9d5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-57b6bcfb55-gpjjj_calico-apiserver(098a7a46-c8fb-45c8-bc2a-0777e39cd9d5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2c160c82aebf8383700121cc0f4990c7459a3b46305a78344fc15af9ef4c45f0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-57b6bcfb55-gpjjj" podUID="098a7a46-c8fb-45c8-bc2a-0777e39cd9d5" May 15 12:28:01.879368 containerd[1716]: time="2025-05-15T12:28:01.879284779Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57b6bcfb55-pdhwm,Uid:780b7c79-8417-4cc7-b89c-26a4ca3518c0,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b4718a905acc0c6f09a400bfb947a7c2fb0878ce8dd7cbc47a9718da60b80187\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:01.879902 kubelet[3207]: E0515 12:28:01.879457 3207 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b4718a905acc0c6f09a400bfb947a7c2fb0878ce8dd7cbc47a9718da60b80187\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:01.879902 kubelet[3207]: E0515 12:28:01.879510 3207 
kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b4718a905acc0c6f09a400bfb947a7c2fb0878ce8dd7cbc47a9718da60b80187\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-57b6bcfb55-pdhwm" May 15 12:28:01.879902 kubelet[3207]: E0515 12:28:01.879527 3207 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b4718a905acc0c6f09a400bfb947a7c2fb0878ce8dd7cbc47a9718da60b80187\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-57b6bcfb55-pdhwm" May 15 12:28:01.880023 kubelet[3207]: E0515 12:28:01.879568 3207 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-57b6bcfb55-pdhwm_calico-apiserver(780b7c79-8417-4cc7-b89c-26a4ca3518c0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-57b6bcfb55-pdhwm_calico-apiserver(780b7c79-8417-4cc7-b89c-26a4ca3518c0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b4718a905acc0c6f09a400bfb947a7c2fb0878ce8dd7cbc47a9718da60b80187\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-57b6bcfb55-pdhwm" podUID="780b7c79-8417-4cc7-b89c-26a4ca3518c0" May 15 12:28:01.882474 containerd[1716]: time="2025-05-15T12:28:01.882418650Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:csi-node-driver-x9zg6,Uid:dcbe7eb9-f017-4039-b0fc-ef4f90e95554,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca1b67c4bad48a6c367f41b8704ad71f77662b532137884ab4c3d1ea933abdf2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:01.882689 kubelet[3207]: E0515 12:28:01.882581 3207 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca1b67c4bad48a6c367f41b8704ad71f77662b532137884ab4c3d1ea933abdf2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:01.882689 kubelet[3207]: E0515 12:28:01.882613 3207 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca1b67c4bad48a6c367f41b8704ad71f77662b532137884ab4c3d1ea933abdf2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-x9zg6" May 15 12:28:01.882689 kubelet[3207]: E0515 12:28:01.882627 3207 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca1b67c4bad48a6c367f41b8704ad71f77662b532137884ab4c3d1ea933abdf2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-x9zg6" May 15 12:28:01.882781 kubelet[3207]: E0515 12:28:01.882674 3207 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"CreatePodSandbox\" for \"csi-node-driver-x9zg6_calico-system(dcbe7eb9-f017-4039-b0fc-ef4f90e95554)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-x9zg6_calico-system(dcbe7eb9-f017-4039-b0fc-ef4f90e95554)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ca1b67c4bad48a6c367f41b8704ad71f77662b532137884ab4c3d1ea933abdf2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-x9zg6" podUID="dcbe7eb9-f017-4039-b0fc-ef4f90e95554" May 15 12:28:12.270030 containerd[1716]: time="2025-05-15T12:28:12.269991192Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-lvfdw,Uid:eb5df7ca-784b-4d32-8ab0-b2fcf0d8f4a3,Namespace:kube-system,Attempt:0,}" May 15 12:28:12.270431 containerd[1716]: time="2025-05-15T12:28:12.269991164Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6d87fb4d96-lvq8m,Uid:359bdd50-8681-48b5-afa4-914ea552e2b3,Namespace:calico-system,Attempt:0,}" May 15 12:28:13.271081 containerd[1716]: time="2025-05-15T12:28:13.270907535Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-dmc8v,Uid:879a1dfa-1019-4d50-8d7e-cf0e572f63f5,Namespace:kube-system,Attempt:0,}" May 15 12:28:13.781155 containerd[1716]: time="2025-05-15T12:28:13.781073100Z" level=error msg="Failed to destroy network for sandbox \"12eea0de4ecef5611868eb3dce273083bcf899b0f982ae21f49bb675d1bbd7ce\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:13.783973 systemd[1]: run-netns-cni\x2db36678d7\x2d31ef\x2d0563\x2d2e16\x2da56e28e1991d.mount: Deactivated successfully. 
May 15 12:28:13.786124 containerd[1716]: time="2025-05-15T12:28:13.784439559Z" level=error msg="Failed to destroy network for sandbox \"4c8a3668526099eb91b354379d62ffafcc373b207224518263b38b591d0dde3b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:13.790169 systemd[1]: run-netns-cni\x2de27f65b6\x2d78c7\x2d7002\x2d1d87\x2d2eb71e354541.mount: Deactivated successfully. May 15 12:28:13.814640 containerd[1716]: time="2025-05-15T12:28:13.814614267Z" level=error msg="Failed to destroy network for sandbox \"4ff38646b1eb7114b1831767c1b40fe5131fdbcf86d3528d5d063a2e770d052d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:13.880241 containerd[1716]: time="2025-05-15T12:28:13.880107909Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-lvfdw,Uid:eb5df7ca-784b-4d32-8ab0-b2fcf0d8f4a3,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"12eea0de4ecef5611868eb3dce273083bcf899b0f982ae21f49bb675d1bbd7ce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:13.880760 kubelet[3207]: E0515 12:28:13.880429 3207 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"12eea0de4ecef5611868eb3dce273083bcf899b0f982ae21f49bb675d1bbd7ce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:13.880760 kubelet[3207]: E0515 12:28:13.880663 3207 
kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"12eea0de4ecef5611868eb3dce273083bcf899b0f982ae21f49bb675d1bbd7ce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-lvfdw" May 15 12:28:13.880760 kubelet[3207]: E0515 12:28:13.880685 3207 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"12eea0de4ecef5611868eb3dce273083bcf899b0f982ae21f49bb675d1bbd7ce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-lvfdw" May 15 12:28:13.881522 kubelet[3207]: E0515 12:28:13.880731 3207 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-lvfdw_kube-system(eb5df7ca-784b-4d32-8ab0-b2fcf0d8f4a3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-lvfdw_kube-system(eb5df7ca-784b-4d32-8ab0-b2fcf0d8f4a3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"12eea0de4ecef5611868eb3dce273083bcf899b0f982ae21f49bb675d1bbd7ce\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-lvfdw" podUID="eb5df7ca-784b-4d32-8ab0-b2fcf0d8f4a3" May 15 12:28:13.925120 containerd[1716]: time="2025-05-15T12:28:13.925090442Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6d87fb4d96-lvq8m,Uid:359bdd50-8681-48b5-afa4-914ea552e2b3,Namespace:calico-system,Attempt:0,} failed, error" error="rpc 
error: code = Unknown desc = failed to setup network for sandbox \"4c8a3668526099eb91b354379d62ffafcc373b207224518263b38b591d0dde3b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:13.925441 kubelet[3207]: E0515 12:28:13.925418 3207 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4c8a3668526099eb91b354379d62ffafcc373b207224518263b38b591d0dde3b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:13.925631 kubelet[3207]: E0515 12:28:13.925587 3207 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4c8a3668526099eb91b354379d62ffafcc373b207224518263b38b591d0dde3b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6d87fb4d96-lvq8m" May 15 12:28:13.925631 kubelet[3207]: E0515 12:28:13.925608 3207 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4c8a3668526099eb91b354379d62ffafcc373b207224518263b38b591d0dde3b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6d87fb4d96-lvq8m" May 15 12:28:13.925888 kubelet[3207]: E0515 12:28:13.925715 3207 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"calico-kube-controllers-6d87fb4d96-lvq8m_calico-system(359bdd50-8681-48b5-afa4-914ea552e2b3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6d87fb4d96-lvq8m_calico-system(359bdd50-8681-48b5-afa4-914ea552e2b3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4c8a3668526099eb91b354379d62ffafcc373b207224518263b38b591d0dde3b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6d87fb4d96-lvq8m" podUID="359bdd50-8681-48b5-afa4-914ea552e2b3" May 15 12:28:13.987144 containerd[1716]: time="2025-05-15T12:28:13.987107656Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-dmc8v,Uid:879a1dfa-1019-4d50-8d7e-cf0e572f63f5,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4ff38646b1eb7114b1831767c1b40fe5131fdbcf86d3528d5d063a2e770d052d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:13.987670 kubelet[3207]: E0515 12:28:13.987648 3207 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4ff38646b1eb7114b1831767c1b40fe5131fdbcf86d3528d5d063a2e770d052d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:13.987831 kubelet[3207]: E0515 12:28:13.987747 3207 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4ff38646b1eb7114b1831767c1b40fe5131fdbcf86d3528d5d063a2e770d052d\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-dmc8v" May 15 12:28:13.987831 kubelet[3207]: E0515 12:28:13.987769 3207 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4ff38646b1eb7114b1831767c1b40fe5131fdbcf86d3528d5d063a2e770d052d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-dmc8v" May 15 12:28:13.988696 kubelet[3207]: E0515 12:28:13.988658 3207 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-dmc8v_kube-system(879a1dfa-1019-4d50-8d7e-cf0e572f63f5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-dmc8v_kube-system(879a1dfa-1019-4d50-8d7e-cf0e572f63f5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4ff38646b1eb7114b1831767c1b40fe5131fdbcf86d3528d5d063a2e770d052d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-dmc8v" podUID="879a1dfa-1019-4d50-8d7e-cf0e572f63f5" May 15 12:28:14.270756 containerd[1716]: time="2025-05-15T12:28:14.270580035Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57b6bcfb55-pdhwm,Uid:780b7c79-8417-4cc7-b89c-26a4ca3518c0,Namespace:calico-apiserver,Attempt:0,}" May 15 12:28:14.270824 containerd[1716]: time="2025-05-15T12:28:14.270805876Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-57b6bcfb55-gpjjj,Uid:098a7a46-c8fb-45c8-bc2a-0777e39cd9d5,Namespace:calico-apiserver,Attempt:0,}" May 15 12:28:14.391083 containerd[1716]: time="2025-05-15T12:28:14.391049986Z" level=error msg="Failed to destroy network for sandbox \"6ea4c777eacef8200eb01cd7c9ac6bdcfb460b2bdb3faee13aac481da97aa51d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:14.426988 containerd[1716]: time="2025-05-15T12:28:14.426952963Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57b6bcfb55-gpjjj,Uid:098a7a46-c8fb-45c8-bc2a-0777e39cd9d5,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6ea4c777eacef8200eb01cd7c9ac6bdcfb460b2bdb3faee13aac481da97aa51d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:14.427791 kubelet[3207]: E0515 12:28:14.427096 3207 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6ea4c777eacef8200eb01cd7c9ac6bdcfb460b2bdb3faee13aac481da97aa51d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:14.427791 kubelet[3207]: E0515 12:28:14.427133 3207 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6ea4c777eacef8200eb01cd7c9ac6bdcfb460b2bdb3faee13aac481da97aa51d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="calico-apiserver/calico-apiserver-57b6bcfb55-gpjjj" May 15 12:28:14.427791 kubelet[3207]: E0515 12:28:14.427153 3207 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6ea4c777eacef8200eb01cd7c9ac6bdcfb460b2bdb3faee13aac481da97aa51d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-57b6bcfb55-gpjjj" May 15 12:28:14.427890 kubelet[3207]: E0515 12:28:14.427184 3207 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-57b6bcfb55-gpjjj_calico-apiserver(098a7a46-c8fb-45c8-bc2a-0777e39cd9d5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-57b6bcfb55-gpjjj_calico-apiserver(098a7a46-c8fb-45c8-bc2a-0777e39cd9d5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6ea4c777eacef8200eb01cd7c9ac6bdcfb460b2bdb3faee13aac481da97aa51d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-57b6bcfb55-gpjjj" podUID="098a7a46-c8fb-45c8-bc2a-0777e39cd9d5" May 15 12:28:14.433675 containerd[1716]: time="2025-05-15T12:28:14.433647830Z" level=error msg="Failed to destroy network for sandbox \"14a102417bae9b3ad89cab8276ed2d172cead76a20abd6aec45ce4dc023c1196\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:14.472806 containerd[1716]: time="2025-05-15T12:28:14.472699477Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-57b6bcfb55-pdhwm,Uid:780b7c79-8417-4cc7-b89c-26a4ca3518c0,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"14a102417bae9b3ad89cab8276ed2d172cead76a20abd6aec45ce4dc023c1196\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:14.473458 kubelet[3207]: E0515 12:28:14.473092 3207 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"14a102417bae9b3ad89cab8276ed2d172cead76a20abd6aec45ce4dc023c1196\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:14.473458 kubelet[3207]: E0515 12:28:14.473142 3207 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"14a102417bae9b3ad89cab8276ed2d172cead76a20abd6aec45ce4dc023c1196\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-57b6bcfb55-pdhwm" May 15 12:28:14.473458 kubelet[3207]: E0515 12:28:14.473159 3207 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"14a102417bae9b3ad89cab8276ed2d172cead76a20abd6aec45ce4dc023c1196\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-57b6bcfb55-pdhwm" May 15 12:28:14.473596 kubelet[3207]: E0515 12:28:14.473199 3207 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-57b6bcfb55-pdhwm_calico-apiserver(780b7c79-8417-4cc7-b89c-26a4ca3518c0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-57b6bcfb55-pdhwm_calico-apiserver(780b7c79-8417-4cc7-b89c-26a4ca3518c0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"14a102417bae9b3ad89cab8276ed2d172cead76a20abd6aec45ce4dc023c1196\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-57b6bcfb55-pdhwm" podUID="780b7c79-8417-4cc7-b89c-26a4ca3518c0" May 15 12:28:14.536092 systemd[1]: run-netns-cni\x2d8ea85456\x2dea0d\x2d77d1\x2d8ec7\x2d531d89183bff.mount: Deactivated successfully. May 15 12:28:15.492340 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3078377776.mount: Deactivated successfully. 
May 15 12:28:15.680743 containerd[1716]: time="2025-05-15T12:28:15.680704746Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:28:15.726647 containerd[1716]: time="2025-05-15T12:28:15.726614828Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.3: active requests=0, bytes read=144068748" May 15 12:28:15.774280 containerd[1716]: time="2025-05-15T12:28:15.774145121Z" level=info msg="ImageCreate event name:\"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:28:15.836849 containerd[1716]: time="2025-05-15T12:28:15.836789360Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:750e267b4f8217e0ca9e4107228370190d1a2499b72112ad04370ab9b4553916\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:28:15.837370 containerd[1716]: time="2025-05-15T12:28:15.837276100Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.3\" with image id \"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:750e267b4f8217e0ca9e4107228370190d1a2499b72112ad04370ab9b4553916\", size \"144068610\" in 14.406576645s" May 15 12:28:15.837370 containerd[1716]: time="2025-05-15T12:28:15.837301989Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.3\" returns image reference \"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\"" May 15 12:28:15.845249 containerd[1716]: time="2025-05-15T12:28:15.844876533Z" level=info msg="CreateContainer within sandbox \"8d00a366dca96e12befd8c725eae1308151badbed1ca5ac221c0142e34fc1d2d\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" May 15 12:28:16.028980 containerd[1716]: time="2025-05-15T12:28:16.028840983Z" level=info msg="Container 
e2dc574ef2016e1c9f3cd9704b2529564c6fff5b1e2a976899f9cd0e318c6bd4: CDI devices from CRI Config.CDIDevices: []" May 15 12:28:16.287671 containerd[1716]: time="2025-05-15T12:28:16.287589146Z" level=info msg="CreateContainer within sandbox \"8d00a366dca96e12befd8c725eae1308151badbed1ca5ac221c0142e34fc1d2d\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"e2dc574ef2016e1c9f3cd9704b2529564c6fff5b1e2a976899f9cd0e318c6bd4\"" May 15 12:28:16.288150 containerd[1716]: time="2025-05-15T12:28:16.287941274Z" level=info msg="StartContainer for \"e2dc574ef2016e1c9f3cd9704b2529564c6fff5b1e2a976899f9cd0e318c6bd4\"" May 15 12:28:16.289351 containerd[1716]: time="2025-05-15T12:28:16.289321536Z" level=info msg="connecting to shim e2dc574ef2016e1c9f3cd9704b2529564c6fff5b1e2a976899f9cd0e318c6bd4" address="unix:///run/containerd/s/992d9eb7e692e3b165af9814df3c66907176648619eaf21066a082cb2f80db45" protocol=ttrpc version=3 May 15 12:28:16.309062 systemd[1]: Started cri-containerd-e2dc574ef2016e1c9f3cd9704b2529564c6fff5b1e2a976899f9cd0e318c6bd4.scope - libcontainer container e2dc574ef2016e1c9f3cd9704b2529564c6fff5b1e2a976899f9cd0e318c6bd4. May 15 12:28:16.339416 containerd[1716]: time="2025-05-15T12:28:16.339324114Z" level=info msg="StartContainer for \"e2dc574ef2016e1c9f3cd9704b2529564c6fff5b1e2a976899f9cd0e318c6bd4\" returns successfully" May 15 12:28:16.439575 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. May 15 12:28:16.439630 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
May 15 12:28:16.527091 containerd[1716]: time="2025-05-15T12:28:16.527067200Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e2dc574ef2016e1c9f3cd9704b2529564c6fff5b1e2a976899f9cd0e318c6bd4\" id:\"473db40da509a40f822507746167ca6719bfd1dc388f9e606c7286d582dc34e7\" pid:4392 exit_status:1 exited_at:{seconds:1747312096 nanos:526581257}" May 15 12:28:17.270784 containerd[1716]: time="2025-05-15T12:28:17.270738812Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-x9zg6,Uid:dcbe7eb9-f017-4039-b0fc-ef4f90e95554,Namespace:calico-system,Attempt:0,}" May 15 12:28:17.392541 systemd-networkd[1350]: cali6445deedfaa: Link UP May 15 12:28:17.393284 systemd-networkd[1350]: cali6445deedfaa: Gained carrier May 15 12:28:17.404155 containerd[1716]: 2025-05-15 12:28:17.291 [INFO][4423] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 15 12:28:17.404155 containerd[1716]: 2025-05-15 12:28:17.302 [INFO][4423] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4334.0.0--a--81f65144c0-k8s-csi--node--driver--x9zg6-eth0 csi-node-driver- calico-system dcbe7eb9-f017-4039-b0fc-ef4f90e95554 622 0 2025-05-15 12:27:03 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:5bcd8f69 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4334.0.0-a-81f65144c0 csi-node-driver-x9zg6 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali6445deedfaa [] []}} ContainerID="f44aaeb69d7b636757f02c6323fc4ead482e25e043643f8172cffa79a9a8ea41" Namespace="calico-system" Pod="csi-node-driver-x9zg6" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-csi--node--driver--x9zg6-" May 15 12:28:17.404155 containerd[1716]: 2025-05-15 12:28:17.302 [INFO][4423] cni-plugin/k8s.go 77: 
Extracted identifiers for CmdAddK8s ContainerID="f44aaeb69d7b636757f02c6323fc4ead482e25e043643f8172cffa79a9a8ea41" Namespace="calico-system" Pod="csi-node-driver-x9zg6" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-csi--node--driver--x9zg6-eth0" May 15 12:28:17.404155 containerd[1716]: 2025-05-15 12:28:17.321 [INFO][4435] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f44aaeb69d7b636757f02c6323fc4ead482e25e043643f8172cffa79a9a8ea41" HandleID="k8s-pod-network.f44aaeb69d7b636757f02c6323fc4ead482e25e043643f8172cffa79a9a8ea41" Workload="ci--4334.0.0--a--81f65144c0-k8s-csi--node--driver--x9zg6-eth0" May 15 12:28:17.404712 containerd[1716]: 2025-05-15 12:28:17.326 [INFO][4435] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f44aaeb69d7b636757f02c6323fc4ead482e25e043643f8172cffa79a9a8ea41" HandleID="k8s-pod-network.f44aaeb69d7b636757f02c6323fc4ead482e25e043643f8172cffa79a9a8ea41" Workload="ci--4334.0.0--a--81f65144c0-k8s-csi--node--driver--x9zg6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000308f80), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4334.0.0-a-81f65144c0", "pod":"csi-node-driver-x9zg6", "timestamp":"2025-05-15 12:28:17.321030071 +0000 UTC"}, Hostname:"ci-4334.0.0-a-81f65144c0", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 15 12:28:17.404712 containerd[1716]: 2025-05-15 12:28:17.326 [INFO][4435] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 15 12:28:17.404712 containerd[1716]: 2025-05-15 12:28:17.326 [INFO][4435] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 15 12:28:17.404712 containerd[1716]: 2025-05-15 12:28:17.327 [INFO][4435] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4334.0.0-a-81f65144c0' May 15 12:28:17.404712 containerd[1716]: 2025-05-15 12:28:17.328 [INFO][4435] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.f44aaeb69d7b636757f02c6323fc4ead482e25e043643f8172cffa79a9a8ea41" host="ci-4334.0.0-a-81f65144c0" May 15 12:28:17.404712 containerd[1716]: 2025-05-15 12:28:17.330 [INFO][4435] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4334.0.0-a-81f65144c0" May 15 12:28:17.404712 containerd[1716]: 2025-05-15 12:28:17.332 [INFO][4435] ipam/ipam.go 489: Trying affinity for 192.168.38.64/26 host="ci-4334.0.0-a-81f65144c0" May 15 12:28:17.404712 containerd[1716]: 2025-05-15 12:28:17.333 [INFO][4435] ipam/ipam.go 155: Attempting to load block cidr=192.168.38.64/26 host="ci-4334.0.0-a-81f65144c0" May 15 12:28:17.404712 containerd[1716]: 2025-05-15 12:28:17.335 [INFO][4435] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.38.64/26 host="ci-4334.0.0-a-81f65144c0" May 15 12:28:17.405582 containerd[1716]: 2025-05-15 12:28:17.335 [INFO][4435] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.38.64/26 handle="k8s-pod-network.f44aaeb69d7b636757f02c6323fc4ead482e25e043643f8172cffa79a9a8ea41" host="ci-4334.0.0-a-81f65144c0" May 15 12:28:17.405582 containerd[1716]: 2025-05-15 12:28:17.335 [INFO][4435] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.f44aaeb69d7b636757f02c6323fc4ead482e25e043643f8172cffa79a9a8ea41 May 15 12:28:17.405582 containerd[1716]: 2025-05-15 12:28:17.339 [INFO][4435] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.38.64/26 handle="k8s-pod-network.f44aaeb69d7b636757f02c6323fc4ead482e25e043643f8172cffa79a9a8ea41" host="ci-4334.0.0-a-81f65144c0" May 15 12:28:17.405582 containerd[1716]: 2025-05-15 12:28:17.344 [INFO][4435] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.38.65/26] block=192.168.38.64/26 handle="k8s-pod-network.f44aaeb69d7b636757f02c6323fc4ead482e25e043643f8172cffa79a9a8ea41" host="ci-4334.0.0-a-81f65144c0" May 15 12:28:17.405582 containerd[1716]: 2025-05-15 12:28:17.344 [INFO][4435] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.38.65/26] handle="k8s-pod-network.f44aaeb69d7b636757f02c6323fc4ead482e25e043643f8172cffa79a9a8ea41" host="ci-4334.0.0-a-81f65144c0" May 15 12:28:17.405582 containerd[1716]: 2025-05-15 12:28:17.344 [INFO][4435] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 15 12:28:17.405582 containerd[1716]: 2025-05-15 12:28:17.344 [INFO][4435] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.38.65/26] IPv6=[] ContainerID="f44aaeb69d7b636757f02c6323fc4ead482e25e043643f8172cffa79a9a8ea41" HandleID="k8s-pod-network.f44aaeb69d7b636757f02c6323fc4ead482e25e043643f8172cffa79a9a8ea41" Workload="ci--4334.0.0--a--81f65144c0-k8s-csi--node--driver--x9zg6-eth0" May 15 12:28:17.405816 kubelet[3207]: I0515 12:28:17.404867 3207 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-cl9bz" podStartSLOduration=3.101952674 podStartE2EDuration="1m14.404115023s" podCreationTimestamp="2025-05-15 12:27:03 +0000 UTC" firstStartedPulling="2025-05-15 12:27:04.535734605 +0000 UTC m=+11.378191805" lastFinishedPulling="2025-05-15 12:28:15.837896944 +0000 UTC m=+82.680354154" observedRunningTime="2025-05-15 12:28:16.488013849 +0000 UTC m=+83.330471063" watchObservedRunningTime="2025-05-15 12:28:17.404115023 +0000 UTC m=+84.246572233" May 15 12:28:17.406200 containerd[1716]: 2025-05-15 12:28:17.346 [INFO][4423] cni-plugin/k8s.go 386: Populated endpoint ContainerID="f44aaeb69d7b636757f02c6323fc4ead482e25e043643f8172cffa79a9a8ea41" Namespace="calico-system" Pod="csi-node-driver-x9zg6" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-csi--node--driver--x9zg6-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334.0.0--a--81f65144c0-k8s-csi--node--driver--x9zg6-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"dcbe7eb9-f017-4039-b0fc-ef4f90e95554", ResourceVersion:"622", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 27, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"5bcd8f69", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334.0.0-a-81f65144c0", ContainerID:"", Pod:"csi-node-driver-x9zg6", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.38.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali6445deedfaa", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:28:17.406278 containerd[1716]: 2025-05-15 12:28:17.346 [INFO][4423] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.38.65/32] ContainerID="f44aaeb69d7b636757f02c6323fc4ead482e25e043643f8172cffa79a9a8ea41" Namespace="calico-system" Pod="csi-node-driver-x9zg6" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-csi--node--driver--x9zg6-eth0" May 15 12:28:17.406278 containerd[1716]: 2025-05-15 12:28:17.346 [INFO][4423] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6445deedfaa 
ContainerID="f44aaeb69d7b636757f02c6323fc4ead482e25e043643f8172cffa79a9a8ea41" Namespace="calico-system" Pod="csi-node-driver-x9zg6" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-csi--node--driver--x9zg6-eth0" May 15 12:28:17.406278 containerd[1716]: 2025-05-15 12:28:17.391 [INFO][4423] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f44aaeb69d7b636757f02c6323fc4ead482e25e043643f8172cffa79a9a8ea41" Namespace="calico-system" Pod="csi-node-driver-x9zg6" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-csi--node--driver--x9zg6-eth0" May 15 12:28:17.406350 containerd[1716]: 2025-05-15 12:28:17.391 [INFO][4423] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="f44aaeb69d7b636757f02c6323fc4ead482e25e043643f8172cffa79a9a8ea41" Namespace="calico-system" Pod="csi-node-driver-x9zg6" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-csi--node--driver--x9zg6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334.0.0--a--81f65144c0-k8s-csi--node--driver--x9zg6-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"dcbe7eb9-f017-4039-b0fc-ef4f90e95554", ResourceVersion:"622", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 27, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"5bcd8f69", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", 
Node:"ci-4334.0.0-a-81f65144c0", ContainerID:"f44aaeb69d7b636757f02c6323fc4ead482e25e043643f8172cffa79a9a8ea41", Pod:"csi-node-driver-x9zg6", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.38.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali6445deedfaa", MAC:"aa:db:58:1b:5d:8b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:28:17.406400 containerd[1716]: 2025-05-15 12:28:17.402 [INFO][4423] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="f44aaeb69d7b636757f02c6323fc4ead482e25e043643f8172cffa79a9a8ea41" Namespace="calico-system" Pod="csi-node-driver-x9zg6" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-csi--node--driver--x9zg6-eth0" May 15 12:28:17.506234 containerd[1716]: time="2025-05-15T12:28:17.506205618Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e2dc574ef2016e1c9f3cd9704b2529564c6fff5b1e2a976899f9cd0e318c6bd4\" id:\"17e1a7427908216e5869e616c6c43f62ab7dc37685dfd8bfa7299af5535edc73\" pid:4465 exit_status:1 exited_at:{seconds:1747312097 nanos:505980786}" May 15 12:28:18.091412 containerd[1716]: time="2025-05-15T12:28:18.091340787Z" level=info msg="connecting to shim f44aaeb69d7b636757f02c6323fc4ead482e25e043643f8172cffa79a9a8ea41" address="unix:///run/containerd/s/edfd285cbb62dba457b148f68114dc339a42e2ed5340a2be42784f1340d92f06" namespace=k8s.io protocol=ttrpc version=3 May 15 12:28:18.114036 systemd[1]: Started cri-containerd-f44aaeb69d7b636757f02c6323fc4ead482e25e043643f8172cffa79a9a8ea41.scope - libcontainer container f44aaeb69d7b636757f02c6323fc4ead482e25e043643f8172cffa79a9a8ea41. 
May 15 12:28:18.134139 containerd[1716]: time="2025-05-15T12:28:18.134101154Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-x9zg6,Uid:dcbe7eb9-f017-4039-b0fc-ef4f90e95554,Namespace:calico-system,Attempt:0,} returns sandbox id \"f44aaeb69d7b636757f02c6323fc4ead482e25e043643f8172cffa79a9a8ea41\"" May 15 12:28:18.135297 containerd[1716]: time="2025-05-15T12:28:18.135244868Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.3\"" May 15 12:28:18.249970 systemd-networkd[1350]: vxlan.calico: Link UP May 15 12:28:18.249979 systemd-networkd[1350]: vxlan.calico: Gained carrier May 15 12:28:19.100000 systemd-networkd[1350]: cali6445deedfaa: Gained IPv6LL May 15 12:28:19.484036 systemd-networkd[1350]: vxlan.calico: Gained IPv6LL May 15 12:28:23.434729 containerd[1716]: time="2025-05-15T12:28:23.434687359Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:28:23.480651 containerd[1716]: time="2025-05-15T12:28:23.480590133Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.3: active requests=0, bytes read=7912898" May 15 12:28:23.483819 containerd[1716]: time="2025-05-15T12:28:23.483765918Z" level=info msg="ImageCreate event name:\"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:28:23.529547 containerd[1716]: time="2025-05-15T12:28:23.529488446Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:72455a36febc7c56ec8881007f4805caed5764026a0694e4f86a2503209b2d31\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:28:23.530128 containerd[1716]: time="2025-05-15T12:28:23.530055067Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.3\" with image id \"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.3\", 
repo digest \"ghcr.io/flatcar/calico/csi@sha256:72455a36febc7c56ec8881007f4805caed5764026a0694e4f86a2503209b2d31\", size \"9405520\" in 5.394753884s" May 15 12:28:23.530128 containerd[1716]: time="2025-05-15T12:28:23.530079421Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.3\" returns image reference \"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\"" May 15 12:28:23.531764 containerd[1716]: time="2025-05-15T12:28:23.531738092Z" level=info msg="CreateContainer within sandbox \"f44aaeb69d7b636757f02c6323fc4ead482e25e043643f8172cffa79a9a8ea41\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" May 15 12:28:23.682309 containerd[1716]: time="2025-05-15T12:28:23.682268123Z" level=info msg="Container 4fb559938e970037998fc2b7a595a6d68affcaa8bbdd5d306f5e30a59b67f7d3: CDI devices from CRI Config.CDIDevices: []" May 15 12:28:23.836207 containerd[1716]: time="2025-05-15T12:28:23.836184159Z" level=info msg="CreateContainer within sandbox \"f44aaeb69d7b636757f02c6323fc4ead482e25e043643f8172cffa79a9a8ea41\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"4fb559938e970037998fc2b7a595a6d68affcaa8bbdd5d306f5e30a59b67f7d3\"" May 15 12:28:23.836547 containerd[1716]: time="2025-05-15T12:28:23.836528358Z" level=info msg="StartContainer for \"4fb559938e970037998fc2b7a595a6d68affcaa8bbdd5d306f5e30a59b67f7d3\"" May 15 12:28:23.837474 containerd[1716]: time="2025-05-15T12:28:23.837453008Z" level=info msg="connecting to shim 4fb559938e970037998fc2b7a595a6d68affcaa8bbdd5d306f5e30a59b67f7d3" address="unix:///run/containerd/s/edfd285cbb62dba457b148f68114dc339a42e2ed5340a2be42784f1340d92f06" protocol=ttrpc version=3 May 15 12:28:23.858070 systemd[1]: Started cri-containerd-4fb559938e970037998fc2b7a595a6d68affcaa8bbdd5d306f5e30a59b67f7d3.scope - libcontainer container 4fb559938e970037998fc2b7a595a6d68affcaa8bbdd5d306f5e30a59b67f7d3. 
May 15 12:28:23.886144 containerd[1716]: time="2025-05-15T12:28:23.886127901Z" level=info msg="StartContainer for \"4fb559938e970037998fc2b7a595a6d68affcaa8bbdd5d306f5e30a59b67f7d3\" returns successfully" May 15 12:28:23.886922 containerd[1716]: time="2025-05-15T12:28:23.886889561Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\"" May 15 12:28:25.271499 containerd[1716]: time="2025-05-15T12:28:25.271118171Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57b6bcfb55-gpjjj,Uid:098a7a46-c8fb-45c8-bc2a-0777e39cd9d5,Namespace:calico-apiserver,Attempt:0,}" May 15 12:28:25.350983 systemd-networkd[1350]: caliddf8d8c93df: Link UP May 15 12:28:25.351154 systemd-networkd[1350]: caliddf8d8c93df: Gained carrier May 15 12:28:25.365138 containerd[1716]: 2025-05-15 12:28:25.302 [INFO][4755] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4334.0.0--a--81f65144c0-k8s-calico--apiserver--57b6bcfb55--gpjjj-eth0 calico-apiserver-57b6bcfb55- calico-apiserver 098a7a46-c8fb-45c8-bc2a-0777e39cd9d5 803 0 2025-05-15 12:27:03 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:57b6bcfb55 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4334.0.0-a-81f65144c0 calico-apiserver-57b6bcfb55-gpjjj eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] caliddf8d8c93df [] []}} ContainerID="b3b667f30e77cf39d8f7ee19b2917e4e28b247dc505a4c90ea3b0d7d55f00c12" Namespace="calico-apiserver" Pod="calico-apiserver-57b6bcfb55-gpjjj" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-calico--apiserver--57b6bcfb55--gpjjj-" May 15 12:28:25.365138 containerd[1716]: 2025-05-15 12:28:25.302 [INFO][4755] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s 
ContainerID="b3b667f30e77cf39d8f7ee19b2917e4e28b247dc505a4c90ea3b0d7d55f00c12" Namespace="calico-apiserver" Pod="calico-apiserver-57b6bcfb55-gpjjj" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-calico--apiserver--57b6bcfb55--gpjjj-eth0" May 15 12:28:25.365138 containerd[1716]: 2025-05-15 12:28:25.322 [INFO][4767] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b3b667f30e77cf39d8f7ee19b2917e4e28b247dc505a4c90ea3b0d7d55f00c12" HandleID="k8s-pod-network.b3b667f30e77cf39d8f7ee19b2917e4e28b247dc505a4c90ea3b0d7d55f00c12" Workload="ci--4334.0.0--a--81f65144c0-k8s-calico--apiserver--57b6bcfb55--gpjjj-eth0" May 15 12:28:25.365312 containerd[1716]: 2025-05-15 12:28:25.328 [INFO][4767] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b3b667f30e77cf39d8f7ee19b2917e4e28b247dc505a4c90ea3b0d7d55f00c12" HandleID="k8s-pod-network.b3b667f30e77cf39d8f7ee19b2917e4e28b247dc505a4c90ea3b0d7d55f00c12" Workload="ci--4334.0.0--a--81f65144c0-k8s-calico--apiserver--57b6bcfb55--gpjjj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000332d00), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4334.0.0-a-81f65144c0", "pod":"calico-apiserver-57b6bcfb55-gpjjj", "timestamp":"2025-05-15 12:28:25.322446467 +0000 UTC"}, Hostname:"ci-4334.0.0-a-81f65144c0", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 15 12:28:25.365312 containerd[1716]: 2025-05-15 12:28:25.328 [INFO][4767] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 15 12:28:25.365312 containerd[1716]: 2025-05-15 12:28:25.328 [INFO][4767] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 15 12:28:25.365312 containerd[1716]: 2025-05-15 12:28:25.328 [INFO][4767] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4334.0.0-a-81f65144c0' May 15 12:28:25.365312 containerd[1716]: 2025-05-15 12:28:25.329 [INFO][4767] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.b3b667f30e77cf39d8f7ee19b2917e4e28b247dc505a4c90ea3b0d7d55f00c12" host="ci-4334.0.0-a-81f65144c0" May 15 12:28:25.365312 containerd[1716]: 2025-05-15 12:28:25.332 [INFO][4767] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4334.0.0-a-81f65144c0" May 15 12:28:25.365312 containerd[1716]: 2025-05-15 12:28:25.334 [INFO][4767] ipam/ipam.go 489: Trying affinity for 192.168.38.64/26 host="ci-4334.0.0-a-81f65144c0" May 15 12:28:25.365312 containerd[1716]: 2025-05-15 12:28:25.335 [INFO][4767] ipam/ipam.go 155: Attempting to load block cidr=192.168.38.64/26 host="ci-4334.0.0-a-81f65144c0" May 15 12:28:25.365312 containerd[1716]: 2025-05-15 12:28:25.337 [INFO][4767] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.38.64/26 host="ci-4334.0.0-a-81f65144c0" May 15 12:28:25.365510 containerd[1716]: 2025-05-15 12:28:25.337 [INFO][4767] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.38.64/26 handle="k8s-pod-network.b3b667f30e77cf39d8f7ee19b2917e4e28b247dc505a4c90ea3b0d7d55f00c12" host="ci-4334.0.0-a-81f65144c0" May 15 12:28:25.365510 containerd[1716]: 2025-05-15 12:28:25.338 [INFO][4767] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.b3b667f30e77cf39d8f7ee19b2917e4e28b247dc505a4c90ea3b0d7d55f00c12 May 15 12:28:25.365510 containerd[1716]: 2025-05-15 12:28:25.343 [INFO][4767] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.38.64/26 handle="k8s-pod-network.b3b667f30e77cf39d8f7ee19b2917e4e28b247dc505a4c90ea3b0d7d55f00c12" host="ci-4334.0.0-a-81f65144c0" May 15 12:28:25.365510 containerd[1716]: 2025-05-15 12:28:25.347 [INFO][4767] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.38.66/26] block=192.168.38.64/26 handle="k8s-pod-network.b3b667f30e77cf39d8f7ee19b2917e4e28b247dc505a4c90ea3b0d7d55f00c12" host="ci-4334.0.0-a-81f65144c0" May 15 12:28:25.365510 containerd[1716]: 2025-05-15 12:28:25.347 [INFO][4767] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.38.66/26] handle="k8s-pod-network.b3b667f30e77cf39d8f7ee19b2917e4e28b247dc505a4c90ea3b0d7d55f00c12" host="ci-4334.0.0-a-81f65144c0" May 15 12:28:25.365510 containerd[1716]: 2025-05-15 12:28:25.347 [INFO][4767] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 15 12:28:25.365510 containerd[1716]: 2025-05-15 12:28:25.347 [INFO][4767] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.38.66/26] IPv6=[] ContainerID="b3b667f30e77cf39d8f7ee19b2917e4e28b247dc505a4c90ea3b0d7d55f00c12" HandleID="k8s-pod-network.b3b667f30e77cf39d8f7ee19b2917e4e28b247dc505a4c90ea3b0d7d55f00c12" Workload="ci--4334.0.0--a--81f65144c0-k8s-calico--apiserver--57b6bcfb55--gpjjj-eth0" May 15 12:28:25.365640 containerd[1716]: 2025-05-15 12:28:25.348 [INFO][4755] cni-plugin/k8s.go 386: Populated endpoint ContainerID="b3b667f30e77cf39d8f7ee19b2917e4e28b247dc505a4c90ea3b0d7d55f00c12" Namespace="calico-apiserver" Pod="calico-apiserver-57b6bcfb55-gpjjj" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-calico--apiserver--57b6bcfb55--gpjjj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334.0.0--a--81f65144c0-k8s-calico--apiserver--57b6bcfb55--gpjjj-eth0", GenerateName:"calico-apiserver-57b6bcfb55-", Namespace:"calico-apiserver", SelfLink:"", UID:"098a7a46-c8fb-45c8-bc2a-0777e39cd9d5", ResourceVersion:"803", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 27, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"57b6bcfb55", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334.0.0-a-81f65144c0", ContainerID:"", Pod:"calico-apiserver-57b6bcfb55-gpjjj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.38.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliddf8d8c93df", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:28:25.365692 containerd[1716]: 2025-05-15 12:28:25.348 [INFO][4755] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.38.66/32] ContainerID="b3b667f30e77cf39d8f7ee19b2917e4e28b247dc505a4c90ea3b0d7d55f00c12" Namespace="calico-apiserver" Pod="calico-apiserver-57b6bcfb55-gpjjj" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-calico--apiserver--57b6bcfb55--gpjjj-eth0" May 15 12:28:25.365692 containerd[1716]: 2025-05-15 12:28:25.348 [INFO][4755] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliddf8d8c93df ContainerID="b3b667f30e77cf39d8f7ee19b2917e4e28b247dc505a4c90ea3b0d7d55f00c12" Namespace="calico-apiserver" Pod="calico-apiserver-57b6bcfb55-gpjjj" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-calico--apiserver--57b6bcfb55--gpjjj-eth0" May 15 12:28:25.365692 containerd[1716]: 2025-05-15 12:28:25.351 [INFO][4755] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b3b667f30e77cf39d8f7ee19b2917e4e28b247dc505a4c90ea3b0d7d55f00c12" Namespace="calico-apiserver" Pod="calico-apiserver-57b6bcfb55-gpjjj" 
WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-calico--apiserver--57b6bcfb55--gpjjj-eth0" May 15 12:28:25.365751 containerd[1716]: 2025-05-15 12:28:25.352 [INFO][4755] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="b3b667f30e77cf39d8f7ee19b2917e4e28b247dc505a4c90ea3b0d7d55f00c12" Namespace="calico-apiserver" Pod="calico-apiserver-57b6bcfb55-gpjjj" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-calico--apiserver--57b6bcfb55--gpjjj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334.0.0--a--81f65144c0-k8s-calico--apiserver--57b6bcfb55--gpjjj-eth0", GenerateName:"calico-apiserver-57b6bcfb55-", Namespace:"calico-apiserver", SelfLink:"", UID:"098a7a46-c8fb-45c8-bc2a-0777e39cd9d5", ResourceVersion:"803", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 27, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"57b6bcfb55", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334.0.0-a-81f65144c0", ContainerID:"b3b667f30e77cf39d8f7ee19b2917e4e28b247dc505a4c90ea3b0d7d55f00c12", Pod:"calico-apiserver-57b6bcfb55-gpjjj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.38.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliddf8d8c93df", MAC:"66:67:bb:0f:72:02", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:28:25.365803 containerd[1716]: 2025-05-15 12:28:25.360 [INFO][4755] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="b3b667f30e77cf39d8f7ee19b2917e4e28b247dc505a4c90ea3b0d7d55f00c12" Namespace="calico-apiserver" Pod="calico-apiserver-57b6bcfb55-gpjjj" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-calico--apiserver--57b6bcfb55--gpjjj-eth0" May 15 12:28:25.645863 containerd[1716]: time="2025-05-15T12:28:25.645760625Z" level=info msg="connecting to shim b3b667f30e77cf39d8f7ee19b2917e4e28b247dc505a4c90ea3b0d7d55f00c12" address="unix:///run/containerd/s/23b68556a74f3a8f074d55288a886e5d631483f477d08f9f9f2108b079084337" namespace=k8s.io protocol=ttrpc version=3 May 15 12:28:25.670049 systemd[1]: Started cri-containerd-b3b667f30e77cf39d8f7ee19b2917e4e28b247dc505a4c90ea3b0d7d55f00c12.scope - libcontainer container b3b667f30e77cf39d8f7ee19b2917e4e28b247dc505a4c90ea3b0d7d55f00c12. May 15 12:28:25.705383 containerd[1716]: time="2025-05-15T12:28:25.705359057Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57b6bcfb55-gpjjj,Uid:098a7a46-c8fb-45c8-bc2a-0777e39cd9d5,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"b3b667f30e77cf39d8f7ee19b2917e4e28b247dc505a4c90ea3b0d7d55f00c12\"" May 15 12:28:26.270077 containerd[1716]: time="2025-05-15T12:28:26.270013304Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6d87fb4d96-lvq8m,Uid:359bdd50-8681-48b5-afa4-914ea552e2b3,Namespace:calico-system,Attempt:0,}" May 15 12:28:26.347845 systemd-networkd[1350]: cali0ed11ee3a45: Link UP May 15 12:28:26.348612 systemd-networkd[1350]: cali0ed11ee3a45: Gained carrier May 15 12:28:26.360307 containerd[1716]: 2025-05-15 12:28:26.299 [INFO][4836] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{ci--4334.0.0--a--81f65144c0-k8s-calico--kube--controllers--6d87fb4d96--lvq8m-eth0 calico-kube-controllers-6d87fb4d96- calico-system 359bdd50-8681-48b5-afa4-914ea552e2b3 801 0 2025-05-15 12:27:03 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:6d87fb4d96 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4334.0.0-a-81f65144c0 calico-kube-controllers-6d87fb4d96-lvq8m eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali0ed11ee3a45 [] []}} ContainerID="d166d16cc59d6c28accf8f6a5d9fec059da2761ba6c4ae91fe1d32b51813df91" Namespace="calico-system" Pod="calico-kube-controllers-6d87fb4d96-lvq8m" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-calico--kube--controllers--6d87fb4d96--lvq8m-" May 15 12:28:26.360307 containerd[1716]: 2025-05-15 12:28:26.299 [INFO][4836] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="d166d16cc59d6c28accf8f6a5d9fec059da2761ba6c4ae91fe1d32b51813df91" Namespace="calico-system" Pod="calico-kube-controllers-6d87fb4d96-lvq8m" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-calico--kube--controllers--6d87fb4d96--lvq8m-eth0" May 15 12:28:26.360307 containerd[1716]: 2025-05-15 12:28:26.318 [INFO][4848] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d166d16cc59d6c28accf8f6a5d9fec059da2761ba6c4ae91fe1d32b51813df91" HandleID="k8s-pod-network.d166d16cc59d6c28accf8f6a5d9fec059da2761ba6c4ae91fe1d32b51813df91" Workload="ci--4334.0.0--a--81f65144c0-k8s-calico--kube--controllers--6d87fb4d96--lvq8m-eth0" May 15 12:28:26.361002 containerd[1716]: 2025-05-15 12:28:26.323 [INFO][4848] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d166d16cc59d6c28accf8f6a5d9fec059da2761ba6c4ae91fe1d32b51813df91" 
HandleID="k8s-pod-network.d166d16cc59d6c28accf8f6a5d9fec059da2761ba6c4ae91fe1d32b51813df91" Workload="ci--4334.0.0--a--81f65144c0-k8s-calico--kube--controllers--6d87fb4d96--lvq8m-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002927e0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4334.0.0-a-81f65144c0", "pod":"calico-kube-controllers-6d87fb4d96-lvq8m", "timestamp":"2025-05-15 12:28:26.318357232 +0000 UTC"}, Hostname:"ci-4334.0.0-a-81f65144c0", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 15 12:28:26.361002 containerd[1716]: 2025-05-15 12:28:26.323 [INFO][4848] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 15 12:28:26.361002 containerd[1716]: 2025-05-15 12:28:26.323 [INFO][4848] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 15 12:28:26.361002 containerd[1716]: 2025-05-15 12:28:26.324 [INFO][4848] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4334.0.0-a-81f65144c0' May 15 12:28:26.361002 containerd[1716]: 2025-05-15 12:28:26.325 [INFO][4848] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.d166d16cc59d6c28accf8f6a5d9fec059da2761ba6c4ae91fe1d32b51813df91" host="ci-4334.0.0-a-81f65144c0" May 15 12:28:26.361002 containerd[1716]: 2025-05-15 12:28:26.327 [INFO][4848] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4334.0.0-a-81f65144c0" May 15 12:28:26.361002 containerd[1716]: 2025-05-15 12:28:26.330 [INFO][4848] ipam/ipam.go 489: Trying affinity for 192.168.38.64/26 host="ci-4334.0.0-a-81f65144c0" May 15 12:28:26.361002 containerd[1716]: 2025-05-15 12:28:26.331 [INFO][4848] ipam/ipam.go 155: Attempting to load block cidr=192.168.38.64/26 host="ci-4334.0.0-a-81f65144c0" May 15 12:28:26.361002 containerd[1716]: 2025-05-15 12:28:26.333 [INFO][4848] ipam/ipam.go 232: 
Affinity is confirmed and block has been loaded cidr=192.168.38.64/26 host="ci-4334.0.0-a-81f65144c0" May 15 12:28:26.361334 containerd[1716]: 2025-05-15 12:28:26.333 [INFO][4848] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.38.64/26 handle="k8s-pod-network.d166d16cc59d6c28accf8f6a5d9fec059da2761ba6c4ae91fe1d32b51813df91" host="ci-4334.0.0-a-81f65144c0" May 15 12:28:26.361334 containerd[1716]: 2025-05-15 12:28:26.334 [INFO][4848] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.d166d16cc59d6c28accf8f6a5d9fec059da2761ba6c4ae91fe1d32b51813df91 May 15 12:28:26.361334 containerd[1716]: 2025-05-15 12:28:26.341 [INFO][4848] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.38.64/26 handle="k8s-pod-network.d166d16cc59d6c28accf8f6a5d9fec059da2761ba6c4ae91fe1d32b51813df91" host="ci-4334.0.0-a-81f65144c0" May 15 12:28:26.361334 containerd[1716]: 2025-05-15 12:28:26.345 [INFO][4848] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.38.67/26] block=192.168.38.64/26 handle="k8s-pod-network.d166d16cc59d6c28accf8f6a5d9fec059da2761ba6c4ae91fe1d32b51813df91" host="ci-4334.0.0-a-81f65144c0" May 15 12:28:26.361334 containerd[1716]: 2025-05-15 12:28:26.345 [INFO][4848] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.38.67/26] handle="k8s-pod-network.d166d16cc59d6c28accf8f6a5d9fec059da2761ba6c4ae91fe1d32b51813df91" host="ci-4334.0.0-a-81f65144c0" May 15 12:28:26.361334 containerd[1716]: 2025-05-15 12:28:26.345 [INFO][4848] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 15 12:28:26.361334 containerd[1716]: 2025-05-15 12:28:26.345 [INFO][4848] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.38.67/26] IPv6=[] ContainerID="d166d16cc59d6c28accf8f6a5d9fec059da2761ba6c4ae91fe1d32b51813df91" HandleID="k8s-pod-network.d166d16cc59d6c28accf8f6a5d9fec059da2761ba6c4ae91fe1d32b51813df91" Workload="ci--4334.0.0--a--81f65144c0-k8s-calico--kube--controllers--6d87fb4d96--lvq8m-eth0" May 15 12:28:26.361678 containerd[1716]: 2025-05-15 12:28:26.346 [INFO][4836] cni-plugin/k8s.go 386: Populated endpoint ContainerID="d166d16cc59d6c28accf8f6a5d9fec059da2761ba6c4ae91fe1d32b51813df91" Namespace="calico-system" Pod="calico-kube-controllers-6d87fb4d96-lvq8m" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-calico--kube--controllers--6d87fb4d96--lvq8m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334.0.0--a--81f65144c0-k8s-calico--kube--controllers--6d87fb4d96--lvq8m-eth0", GenerateName:"calico-kube-controllers-6d87fb4d96-", Namespace:"calico-system", SelfLink:"", UID:"359bdd50-8681-48b5-afa4-914ea552e2b3", ResourceVersion:"801", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 27, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6d87fb4d96", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334.0.0-a-81f65144c0", ContainerID:"", Pod:"calico-kube-controllers-6d87fb4d96-lvq8m", Endpoint:"eth0", 
ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.38.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali0ed11ee3a45", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:28:26.361831 containerd[1716]: 2025-05-15 12:28:26.346 [INFO][4836] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.38.67/32] ContainerID="d166d16cc59d6c28accf8f6a5d9fec059da2761ba6c4ae91fe1d32b51813df91" Namespace="calico-system" Pod="calico-kube-controllers-6d87fb4d96-lvq8m" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-calico--kube--controllers--6d87fb4d96--lvq8m-eth0" May 15 12:28:26.361831 containerd[1716]: 2025-05-15 12:28:26.346 [INFO][4836] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0ed11ee3a45 ContainerID="d166d16cc59d6c28accf8f6a5d9fec059da2761ba6c4ae91fe1d32b51813df91" Namespace="calico-system" Pod="calico-kube-controllers-6d87fb4d96-lvq8m" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-calico--kube--controllers--6d87fb4d96--lvq8m-eth0" May 15 12:28:26.361831 containerd[1716]: 2025-05-15 12:28:26.347 [INFO][4836] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d166d16cc59d6c28accf8f6a5d9fec059da2761ba6c4ae91fe1d32b51813df91" Namespace="calico-system" Pod="calico-kube-controllers-6d87fb4d96-lvq8m" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-calico--kube--controllers--6d87fb4d96--lvq8m-eth0" May 15 12:28:26.362046 containerd[1716]: 2025-05-15 12:28:26.348 [INFO][4836] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="d166d16cc59d6c28accf8f6a5d9fec059da2761ba6c4ae91fe1d32b51813df91" Namespace="calico-system" Pod="calico-kube-controllers-6d87fb4d96-lvq8m" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-calico--kube--controllers--6d87fb4d96--lvq8m-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334.0.0--a--81f65144c0-k8s-calico--kube--controllers--6d87fb4d96--lvq8m-eth0", GenerateName:"calico-kube-controllers-6d87fb4d96-", Namespace:"calico-system", SelfLink:"", UID:"359bdd50-8681-48b5-afa4-914ea552e2b3", ResourceVersion:"801", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 27, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6d87fb4d96", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334.0.0-a-81f65144c0", ContainerID:"d166d16cc59d6c28accf8f6a5d9fec059da2761ba6c4ae91fe1d32b51813df91", Pod:"calico-kube-controllers-6d87fb4d96-lvq8m", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.38.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali0ed11ee3a45", MAC:"e2:0f:bf:25:e2:30", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:28:26.362209 containerd[1716]: 2025-05-15 12:28:26.357 [INFO][4836] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="d166d16cc59d6c28accf8f6a5d9fec059da2761ba6c4ae91fe1d32b51813df91" Namespace="calico-system" Pod="calico-kube-controllers-6d87fb4d96-lvq8m" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-calico--kube--controllers--6d87fb4d96--lvq8m-eth0" May 15 
12:28:26.633431 containerd[1716]: time="2025-05-15T12:28:26.633360816Z" level=info msg="connecting to shim d166d16cc59d6c28accf8f6a5d9fec059da2761ba6c4ae91fe1d32b51813df91" address="unix:///run/containerd/s/f757e7330b8065d47319d617027468c394194a051a202d1300123206e9380349" namespace=k8s.io protocol=ttrpc version=3 May 15 12:28:26.651047 systemd[1]: Started cri-containerd-d166d16cc59d6c28accf8f6a5d9fec059da2761ba6c4ae91fe1d32b51813df91.scope - libcontainer container d166d16cc59d6c28accf8f6a5d9fec059da2761ba6c4ae91fe1d32b51813df91. May 15 12:28:26.687225 containerd[1716]: time="2025-05-15T12:28:26.687199926Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6d87fb4d96-lvq8m,Uid:359bdd50-8681-48b5-afa4-914ea552e2b3,Namespace:calico-system,Attempt:0,} returns sandbox id \"d166d16cc59d6c28accf8f6a5d9fec059da2761ba6c4ae91fe1d32b51813df91\"" May 15 12:28:26.908062 systemd-networkd[1350]: caliddf8d8c93df: Gained IPv6LL May 15 12:28:27.676077 systemd-networkd[1350]: cali0ed11ee3a45: Gained IPv6LL May 15 12:28:28.270925 containerd[1716]: time="2025-05-15T12:28:28.270802986Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-lvfdw,Uid:eb5df7ca-784b-4d32-8ab0-b2fcf0d8f4a3,Namespace:kube-system,Attempt:0,}" May 15 12:28:28.270925 containerd[1716]: time="2025-05-15T12:28:28.270833343Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57b6bcfb55-pdhwm,Uid:780b7c79-8417-4cc7-b89c-26a4ca3518c0,Namespace:calico-apiserver,Attempt:0,}" May 15 12:28:28.271210 containerd[1716]: time="2025-05-15T12:28:28.271097556Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-dmc8v,Uid:879a1dfa-1019-4d50-8d7e-cf0e572f63f5,Namespace:kube-system,Attempt:0,}" May 15 12:28:28.459602 systemd-networkd[1350]: cali724699b3438: Link UP May 15 12:28:28.460481 systemd-networkd[1350]: cali724699b3438: Gained carrier May 15 12:28:28.472019 containerd[1716]: 2025-05-15 12:28:28.406 [INFO][4914] 
cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4334.0.0--a--81f65144c0-k8s-coredns--6f6b679f8f--lvfdw-eth0 coredns-6f6b679f8f- kube-system eb5df7ca-784b-4d32-8ab0-b2fcf0d8f4a3 795 0 2025-05-15 12:26:53 +0000 UTC map[k8s-app:kube-dns pod-template-hash:6f6b679f8f projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4334.0.0-a-81f65144c0 coredns-6f6b679f8f-lvfdw eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali724699b3438 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="e6ceb4b87bd4b9e61246f41fd92281c95ab19914da3a57470587b7bf70ac7146" Namespace="kube-system" Pod="coredns-6f6b679f8f-lvfdw" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-coredns--6f6b679f8f--lvfdw-" May 15 12:28:28.472019 containerd[1716]: 2025-05-15 12:28:28.406 [INFO][4914] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="e6ceb4b87bd4b9e61246f41fd92281c95ab19914da3a57470587b7bf70ac7146" Namespace="kube-system" Pod="coredns-6f6b679f8f-lvfdw" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-coredns--6f6b679f8f--lvfdw-eth0" May 15 12:28:28.472019 containerd[1716]: 2025-05-15 12:28:28.426 [INFO][4927] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e6ceb4b87bd4b9e61246f41fd92281c95ab19914da3a57470587b7bf70ac7146" HandleID="k8s-pod-network.e6ceb4b87bd4b9e61246f41fd92281c95ab19914da3a57470587b7bf70ac7146" Workload="ci--4334.0.0--a--81f65144c0-k8s-coredns--6f6b679f8f--lvfdw-eth0" May 15 12:28:28.472248 containerd[1716]: 2025-05-15 12:28:28.432 [INFO][4927] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e6ceb4b87bd4b9e61246f41fd92281c95ab19914da3a57470587b7bf70ac7146" HandleID="k8s-pod-network.e6ceb4b87bd4b9e61246f41fd92281c95ab19914da3a57470587b7bf70ac7146" Workload="ci--4334.0.0--a--81f65144c0-k8s-coredns--6f6b679f8f--lvfdw-eth0" 
assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00031b750), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4334.0.0-a-81f65144c0", "pod":"coredns-6f6b679f8f-lvfdw", "timestamp":"2025-05-15 12:28:28.426347144 +0000 UTC"}, Hostname:"ci-4334.0.0-a-81f65144c0", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 15 12:28:28.472248 containerd[1716]: 2025-05-15 12:28:28.432 [INFO][4927] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 15 12:28:28.472248 containerd[1716]: 2025-05-15 12:28:28.432 [INFO][4927] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 15 12:28:28.472248 containerd[1716]: 2025-05-15 12:28:28.433 [INFO][4927] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4334.0.0-a-81f65144c0' May 15 12:28:28.472248 containerd[1716]: 2025-05-15 12:28:28.434 [INFO][4927] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.e6ceb4b87bd4b9e61246f41fd92281c95ab19914da3a57470587b7bf70ac7146" host="ci-4334.0.0-a-81f65144c0" May 15 12:28:28.472248 containerd[1716]: 2025-05-15 12:28:28.437 [INFO][4927] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4334.0.0-a-81f65144c0" May 15 12:28:28.472248 containerd[1716]: 2025-05-15 12:28:28.440 [INFO][4927] ipam/ipam.go 489: Trying affinity for 192.168.38.64/26 host="ci-4334.0.0-a-81f65144c0" May 15 12:28:28.472248 containerd[1716]: 2025-05-15 12:28:28.441 [INFO][4927] ipam/ipam.go 155: Attempting to load block cidr=192.168.38.64/26 host="ci-4334.0.0-a-81f65144c0" May 15 12:28:28.472248 containerd[1716]: 2025-05-15 12:28:28.443 [INFO][4927] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.38.64/26 host="ci-4334.0.0-a-81f65144c0" May 15 12:28:28.472481 containerd[1716]: 2025-05-15 12:28:28.443 [INFO][4927] ipam/ipam.go 1180: 
Attempting to assign 1 addresses from block block=192.168.38.64/26 handle="k8s-pod-network.e6ceb4b87bd4b9e61246f41fd92281c95ab19914da3a57470587b7bf70ac7146" host="ci-4334.0.0-a-81f65144c0" May 15 12:28:28.472481 containerd[1716]: 2025-05-15 12:28:28.444 [INFO][4927] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.e6ceb4b87bd4b9e61246f41fd92281c95ab19914da3a57470587b7bf70ac7146 May 15 12:28:28.472481 containerd[1716]: 2025-05-15 12:28:28.449 [INFO][4927] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.38.64/26 handle="k8s-pod-network.e6ceb4b87bd4b9e61246f41fd92281c95ab19914da3a57470587b7bf70ac7146" host="ci-4334.0.0-a-81f65144c0" May 15 12:28:28.472481 containerd[1716]: 2025-05-15 12:28:28.456 [INFO][4927] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.38.68/26] block=192.168.38.64/26 handle="k8s-pod-network.e6ceb4b87bd4b9e61246f41fd92281c95ab19914da3a57470587b7bf70ac7146" host="ci-4334.0.0-a-81f65144c0" May 15 12:28:28.472481 containerd[1716]: 2025-05-15 12:28:28.456 [INFO][4927] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.38.68/26] handle="k8s-pod-network.e6ceb4b87bd4b9e61246f41fd92281c95ab19914da3a57470587b7bf70ac7146" host="ci-4334.0.0-a-81f65144c0" May 15 12:28:28.472481 containerd[1716]: 2025-05-15 12:28:28.456 [INFO][4927] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 15 12:28:28.472481 containerd[1716]: 2025-05-15 12:28:28.456 [INFO][4927] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.38.68/26] IPv6=[] ContainerID="e6ceb4b87bd4b9e61246f41fd92281c95ab19914da3a57470587b7bf70ac7146" HandleID="k8s-pod-network.e6ceb4b87bd4b9e61246f41fd92281c95ab19914da3a57470587b7bf70ac7146" Workload="ci--4334.0.0--a--81f65144c0-k8s-coredns--6f6b679f8f--lvfdw-eth0" May 15 12:28:28.472644 containerd[1716]: 2025-05-15 12:28:28.457 [INFO][4914] cni-plugin/k8s.go 386: Populated endpoint ContainerID="e6ceb4b87bd4b9e61246f41fd92281c95ab19914da3a57470587b7bf70ac7146" Namespace="kube-system" Pod="coredns-6f6b679f8f-lvfdw" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-coredns--6f6b679f8f--lvfdw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334.0.0--a--81f65144c0-k8s-coredns--6f6b679f8f--lvfdw-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"eb5df7ca-784b-4d32-8ab0-b2fcf0d8f4a3", ResourceVersion:"795", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 26, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334.0.0-a-81f65144c0", ContainerID:"", Pod:"coredns-6f6b679f8f-lvfdw", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.38.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"cali724699b3438", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:28:28.472644 containerd[1716]: 2025-05-15 12:28:28.457 [INFO][4914] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.38.68/32] ContainerID="e6ceb4b87bd4b9e61246f41fd92281c95ab19914da3a57470587b7bf70ac7146" Namespace="kube-system" Pod="coredns-6f6b679f8f-lvfdw" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-coredns--6f6b679f8f--lvfdw-eth0" May 15 12:28:28.472644 containerd[1716]: 2025-05-15 12:28:28.457 [INFO][4914] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali724699b3438 ContainerID="e6ceb4b87bd4b9e61246f41fd92281c95ab19914da3a57470587b7bf70ac7146" Namespace="kube-system" Pod="coredns-6f6b679f8f-lvfdw" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-coredns--6f6b679f8f--lvfdw-eth0" May 15 12:28:28.472644 containerd[1716]: 2025-05-15 12:28:28.460 [INFO][4914] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e6ceb4b87bd4b9e61246f41fd92281c95ab19914da3a57470587b7bf70ac7146" Namespace="kube-system" Pod="coredns-6f6b679f8f-lvfdw" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-coredns--6f6b679f8f--lvfdw-eth0" May 15 12:28:28.472644 containerd[1716]: 2025-05-15 12:28:28.461 [INFO][4914] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="e6ceb4b87bd4b9e61246f41fd92281c95ab19914da3a57470587b7bf70ac7146" Namespace="kube-system" Pod="coredns-6f6b679f8f-lvfdw" 
WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-coredns--6f6b679f8f--lvfdw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334.0.0--a--81f65144c0-k8s-coredns--6f6b679f8f--lvfdw-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"eb5df7ca-784b-4d32-8ab0-b2fcf0d8f4a3", ResourceVersion:"795", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 26, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334.0.0-a-81f65144c0", ContainerID:"e6ceb4b87bd4b9e61246f41fd92281c95ab19914da3a57470587b7bf70ac7146", Pod:"coredns-6f6b679f8f-lvfdw", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.38.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali724699b3438", MAC:"c6:a2:3e:35:5c:c8", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:28:28.472644 containerd[1716]: 2025-05-15 12:28:28.469 [INFO][4914] 
cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="e6ceb4b87bd4b9e61246f41fd92281c95ab19914da3a57470587b7bf70ac7146" Namespace="kube-system" Pod="coredns-6f6b679f8f-lvfdw" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-coredns--6f6b679f8f--lvfdw-eth0" May 15 12:28:28.593960 systemd-networkd[1350]: cali71637819eda: Link UP May 15 12:28:28.594998 systemd-networkd[1350]: cali71637819eda: Gained carrier May 15 12:28:28.608628 containerd[1716]: 2025-05-15 12:28:28.501 [INFO][4943] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4334.0.0--a--81f65144c0-k8s-calico--apiserver--57b6bcfb55--pdhwm-eth0 calico-apiserver-57b6bcfb55- calico-apiserver 780b7c79-8417-4cc7-b89c-26a4ca3518c0 802 0 2025-05-15 12:27:03 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:57b6bcfb55 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4334.0.0-a-81f65144c0 calico-apiserver-57b6bcfb55-pdhwm eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali71637819eda [] []}} ContainerID="9bff7a2971678588f9513ff87178816b9a849fe7a52d11f97f99fe09897dbc77" Namespace="calico-apiserver" Pod="calico-apiserver-57b6bcfb55-pdhwm" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-calico--apiserver--57b6bcfb55--pdhwm-" May 15 12:28:28.608628 containerd[1716]: 2025-05-15 12:28:28.523 [INFO][4943] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="9bff7a2971678588f9513ff87178816b9a849fe7a52d11f97f99fe09897dbc77" Namespace="calico-apiserver" Pod="calico-apiserver-57b6bcfb55-pdhwm" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-calico--apiserver--57b6bcfb55--pdhwm-eth0" May 15 12:28:28.608628 containerd[1716]: 2025-05-15 12:28:28.550 [INFO][4968] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 
IPv6=0 ContainerID="9bff7a2971678588f9513ff87178816b9a849fe7a52d11f97f99fe09897dbc77" HandleID="k8s-pod-network.9bff7a2971678588f9513ff87178816b9a849fe7a52d11f97f99fe09897dbc77" Workload="ci--4334.0.0--a--81f65144c0-k8s-calico--apiserver--57b6bcfb55--pdhwm-eth0" May 15 12:28:28.608628 containerd[1716]: 2025-05-15 12:28:28.557 [INFO][4968] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9bff7a2971678588f9513ff87178816b9a849fe7a52d11f97f99fe09897dbc77" HandleID="k8s-pod-network.9bff7a2971678588f9513ff87178816b9a849fe7a52d11f97f99fe09897dbc77" Workload="ci--4334.0.0--a--81f65144c0-k8s-calico--apiserver--57b6bcfb55--pdhwm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000050740), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4334.0.0-a-81f65144c0", "pod":"calico-apiserver-57b6bcfb55-pdhwm", "timestamp":"2025-05-15 12:28:28.550902097 +0000 UTC"}, Hostname:"ci-4334.0.0-a-81f65144c0", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 15 12:28:28.608628 containerd[1716]: 2025-05-15 12:28:28.557 [INFO][4968] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 15 12:28:28.608628 containerd[1716]: 2025-05-15 12:28:28.557 [INFO][4968] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 15 12:28:28.608628 containerd[1716]: 2025-05-15 12:28:28.557 [INFO][4968] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4334.0.0-a-81f65144c0' May 15 12:28:28.608628 containerd[1716]: 2025-05-15 12:28:28.559 [INFO][4968] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.9bff7a2971678588f9513ff87178816b9a849fe7a52d11f97f99fe09897dbc77" host="ci-4334.0.0-a-81f65144c0" May 15 12:28:28.608628 containerd[1716]: 2025-05-15 12:28:28.563 [INFO][4968] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4334.0.0-a-81f65144c0" May 15 12:28:28.608628 containerd[1716]: 2025-05-15 12:28:28.567 [INFO][4968] ipam/ipam.go 489: Trying affinity for 192.168.38.64/26 host="ci-4334.0.0-a-81f65144c0" May 15 12:28:28.608628 containerd[1716]: 2025-05-15 12:28:28.568 [INFO][4968] ipam/ipam.go 155: Attempting to load block cidr=192.168.38.64/26 host="ci-4334.0.0-a-81f65144c0" May 15 12:28:28.608628 containerd[1716]: 2025-05-15 12:28:28.570 [INFO][4968] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.38.64/26 host="ci-4334.0.0-a-81f65144c0" May 15 12:28:28.608628 containerd[1716]: 2025-05-15 12:28:28.570 [INFO][4968] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.38.64/26 handle="k8s-pod-network.9bff7a2971678588f9513ff87178816b9a849fe7a52d11f97f99fe09897dbc77" host="ci-4334.0.0-a-81f65144c0" May 15 12:28:28.608628 containerd[1716]: 2025-05-15 12:28:28.572 [INFO][4968] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.9bff7a2971678588f9513ff87178816b9a849fe7a52d11f97f99fe09897dbc77 May 15 12:28:28.608628 containerd[1716]: 2025-05-15 12:28:28.580 [INFO][4968] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.38.64/26 handle="k8s-pod-network.9bff7a2971678588f9513ff87178816b9a849fe7a52d11f97f99fe09897dbc77" host="ci-4334.0.0-a-81f65144c0" May 15 12:28:28.608628 containerd[1716]: 2025-05-15 12:28:28.591 [INFO][4968] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.38.69/26] block=192.168.38.64/26 handle="k8s-pod-network.9bff7a2971678588f9513ff87178816b9a849fe7a52d11f97f99fe09897dbc77" host="ci-4334.0.0-a-81f65144c0" May 15 12:28:28.608628 containerd[1716]: 2025-05-15 12:28:28.591 [INFO][4968] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.38.69/26] handle="k8s-pod-network.9bff7a2971678588f9513ff87178816b9a849fe7a52d11f97f99fe09897dbc77" host="ci-4334.0.0-a-81f65144c0" May 15 12:28:28.608628 containerd[1716]: 2025-05-15 12:28:28.591 [INFO][4968] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 15 12:28:28.608628 containerd[1716]: 2025-05-15 12:28:28.591 [INFO][4968] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.38.69/26] IPv6=[] ContainerID="9bff7a2971678588f9513ff87178816b9a849fe7a52d11f97f99fe09897dbc77" HandleID="k8s-pod-network.9bff7a2971678588f9513ff87178816b9a849fe7a52d11f97f99fe09897dbc77" Workload="ci--4334.0.0--a--81f65144c0-k8s-calico--apiserver--57b6bcfb55--pdhwm-eth0" May 15 12:28:28.609132 containerd[1716]: 2025-05-15 12:28:28.592 [INFO][4943] cni-plugin/k8s.go 386: Populated endpoint ContainerID="9bff7a2971678588f9513ff87178816b9a849fe7a52d11f97f99fe09897dbc77" Namespace="calico-apiserver" Pod="calico-apiserver-57b6bcfb55-pdhwm" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-calico--apiserver--57b6bcfb55--pdhwm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334.0.0--a--81f65144c0-k8s-calico--apiserver--57b6bcfb55--pdhwm-eth0", GenerateName:"calico-apiserver-57b6bcfb55-", Namespace:"calico-apiserver", SelfLink:"", UID:"780b7c79-8417-4cc7-b89c-26a4ca3518c0", ResourceVersion:"802", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 27, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"57b6bcfb55", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334.0.0-a-81f65144c0", ContainerID:"", Pod:"calico-apiserver-57b6bcfb55-pdhwm", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.38.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali71637819eda", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:28:28.609132 containerd[1716]: 2025-05-15 12:28:28.592 [INFO][4943] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.38.69/32] ContainerID="9bff7a2971678588f9513ff87178816b9a849fe7a52d11f97f99fe09897dbc77" Namespace="calico-apiserver" Pod="calico-apiserver-57b6bcfb55-pdhwm" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-calico--apiserver--57b6bcfb55--pdhwm-eth0" May 15 12:28:28.609132 containerd[1716]: 2025-05-15 12:28:28.592 [INFO][4943] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali71637819eda ContainerID="9bff7a2971678588f9513ff87178816b9a849fe7a52d11f97f99fe09897dbc77" Namespace="calico-apiserver" Pod="calico-apiserver-57b6bcfb55-pdhwm" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-calico--apiserver--57b6bcfb55--pdhwm-eth0" May 15 12:28:28.609132 containerd[1716]: 2025-05-15 12:28:28.595 [INFO][4943] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9bff7a2971678588f9513ff87178816b9a849fe7a52d11f97f99fe09897dbc77" Namespace="calico-apiserver" Pod="calico-apiserver-57b6bcfb55-pdhwm" 
WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-calico--apiserver--57b6bcfb55--pdhwm-eth0" May 15 12:28:28.609132 containerd[1716]: 2025-05-15 12:28:28.596 [INFO][4943] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="9bff7a2971678588f9513ff87178816b9a849fe7a52d11f97f99fe09897dbc77" Namespace="calico-apiserver" Pod="calico-apiserver-57b6bcfb55-pdhwm" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-calico--apiserver--57b6bcfb55--pdhwm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334.0.0--a--81f65144c0-k8s-calico--apiserver--57b6bcfb55--pdhwm-eth0", GenerateName:"calico-apiserver-57b6bcfb55-", Namespace:"calico-apiserver", SelfLink:"", UID:"780b7c79-8417-4cc7-b89c-26a4ca3518c0", ResourceVersion:"802", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 27, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"57b6bcfb55", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334.0.0-a-81f65144c0", ContainerID:"9bff7a2971678588f9513ff87178816b9a849fe7a52d11f97f99fe09897dbc77", Pod:"calico-apiserver-57b6bcfb55-pdhwm", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.38.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali71637819eda", MAC:"5a:d9:b9:b6:91:ad", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:28:28.609132 containerd[1716]: 2025-05-15 12:28:28.606 [INFO][4943] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="9bff7a2971678588f9513ff87178816b9a849fe7a52d11f97f99fe09897dbc77" Namespace="calico-apiserver" Pod="calico-apiserver-57b6bcfb55-pdhwm" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-calico--apiserver--57b6bcfb55--pdhwm-eth0" May 15 12:28:28.683369 systemd-networkd[1350]: calie3922411a31: Link UP May 15 12:28:28.683763 systemd-networkd[1350]: calie3922411a31: Gained carrier May 15 12:28:28.697698 containerd[1716]: 2025-05-15 12:28:28.554 [INFO][4958] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4334.0.0--a--81f65144c0-k8s-coredns--6f6b679f8f--dmc8v-eth0 coredns-6f6b679f8f- kube-system 879a1dfa-1019-4d50-8d7e-cf0e572f63f5 799 0 2025-05-15 12:26:53 +0000 UTC map[k8s-app:kube-dns pod-template-hash:6f6b679f8f projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4334.0.0-a-81f65144c0 coredns-6f6b679f8f-dmc8v eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calie3922411a31 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="83a51856a3083a906ea3a42505756438995433479fac3eb7ef4a0de954cd50bf" Namespace="kube-system" Pod="coredns-6f6b679f8f-dmc8v" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-coredns--6f6b679f8f--dmc8v-" May 15 12:28:28.697698 containerd[1716]: 2025-05-15 12:28:28.555 [INFO][4958] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="83a51856a3083a906ea3a42505756438995433479fac3eb7ef4a0de954cd50bf" Namespace="kube-system" Pod="coredns-6f6b679f8f-dmc8v" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-coredns--6f6b679f8f--dmc8v-eth0" May 15 12:28:28.697698 containerd[1716]: 2025-05-15 12:28:28.587 [INFO][4978] 
ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="83a51856a3083a906ea3a42505756438995433479fac3eb7ef4a0de954cd50bf" HandleID="k8s-pod-network.83a51856a3083a906ea3a42505756438995433479fac3eb7ef4a0de954cd50bf" Workload="ci--4334.0.0--a--81f65144c0-k8s-coredns--6f6b679f8f--dmc8v-eth0" May 15 12:28:28.697698 containerd[1716]: 2025-05-15 12:28:28.658 [INFO][4978] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="83a51856a3083a906ea3a42505756438995433479fac3eb7ef4a0de954cd50bf" HandleID="k8s-pod-network.83a51856a3083a906ea3a42505756438995433479fac3eb7ef4a0de954cd50bf" Workload="ci--4334.0.0--a--81f65144c0-k8s-coredns--6f6b679f8f--dmc8v-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000293110), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4334.0.0-a-81f65144c0", "pod":"coredns-6f6b679f8f-dmc8v", "timestamp":"2025-05-15 12:28:28.58727671 +0000 UTC"}, Hostname:"ci-4334.0.0-a-81f65144c0", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 15 12:28:28.697698 containerd[1716]: 2025-05-15 12:28:28.658 [INFO][4978] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 15 12:28:28.697698 containerd[1716]: 2025-05-15 12:28:28.658 [INFO][4978] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 15 12:28:28.697698 containerd[1716]: 2025-05-15 12:28:28.658 [INFO][4978] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4334.0.0-a-81f65144c0' May 15 12:28:28.697698 containerd[1716]: 2025-05-15 12:28:28.659 [INFO][4978] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.83a51856a3083a906ea3a42505756438995433479fac3eb7ef4a0de954cd50bf" host="ci-4334.0.0-a-81f65144c0" May 15 12:28:28.697698 containerd[1716]: 2025-05-15 12:28:28.661 [INFO][4978] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4334.0.0-a-81f65144c0" May 15 12:28:28.697698 containerd[1716]: 2025-05-15 12:28:28.666 [INFO][4978] ipam/ipam.go 489: Trying affinity for 192.168.38.64/26 host="ci-4334.0.0-a-81f65144c0" May 15 12:28:28.697698 containerd[1716]: 2025-05-15 12:28:28.667 [INFO][4978] ipam/ipam.go 155: Attempting to load block cidr=192.168.38.64/26 host="ci-4334.0.0-a-81f65144c0" May 15 12:28:28.697698 containerd[1716]: 2025-05-15 12:28:28.669 [INFO][4978] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.38.64/26 host="ci-4334.0.0-a-81f65144c0" May 15 12:28:28.697698 containerd[1716]: 2025-05-15 12:28:28.669 [INFO][4978] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.38.64/26 handle="k8s-pod-network.83a51856a3083a906ea3a42505756438995433479fac3eb7ef4a0de954cd50bf" host="ci-4334.0.0-a-81f65144c0" May 15 12:28:28.697698 containerd[1716]: 2025-05-15 12:28:28.670 [INFO][4978] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.83a51856a3083a906ea3a42505756438995433479fac3eb7ef4a0de954cd50bf May 15 12:28:28.697698 containerd[1716]: 2025-05-15 12:28:28.673 [INFO][4978] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.38.64/26 handle="k8s-pod-network.83a51856a3083a906ea3a42505756438995433479fac3eb7ef4a0de954cd50bf" host="ci-4334.0.0-a-81f65144c0" May 15 12:28:28.697698 containerd[1716]: 2025-05-15 12:28:28.680 [INFO][4978] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.38.70/26] block=192.168.38.64/26 handle="k8s-pod-network.83a51856a3083a906ea3a42505756438995433479fac3eb7ef4a0de954cd50bf" host="ci-4334.0.0-a-81f65144c0" May 15 12:28:28.697698 containerd[1716]: 2025-05-15 12:28:28.680 [INFO][4978] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.38.70/26] handle="k8s-pod-network.83a51856a3083a906ea3a42505756438995433479fac3eb7ef4a0de954cd50bf" host="ci-4334.0.0-a-81f65144c0" May 15 12:28:28.697698 containerd[1716]: 2025-05-15 12:28:28.680 [INFO][4978] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 15 12:28:28.697698 containerd[1716]: 2025-05-15 12:28:28.680 [INFO][4978] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.38.70/26] IPv6=[] ContainerID="83a51856a3083a906ea3a42505756438995433479fac3eb7ef4a0de954cd50bf" HandleID="k8s-pod-network.83a51856a3083a906ea3a42505756438995433479fac3eb7ef4a0de954cd50bf" Workload="ci--4334.0.0--a--81f65144c0-k8s-coredns--6f6b679f8f--dmc8v-eth0" May 15 12:28:28.699155 containerd[1716]: 2025-05-15 12:28:28.681 [INFO][4958] cni-plugin/k8s.go 386: Populated endpoint ContainerID="83a51856a3083a906ea3a42505756438995433479fac3eb7ef4a0de954cd50bf" Namespace="kube-system" Pod="coredns-6f6b679f8f-dmc8v" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-coredns--6f6b679f8f--dmc8v-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334.0.0--a--81f65144c0-k8s-coredns--6f6b679f8f--dmc8v-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"879a1dfa-1019-4d50-8d7e-cf0e572f63f5", ResourceVersion:"799", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 26, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334.0.0-a-81f65144c0", ContainerID:"", Pod:"coredns-6f6b679f8f-dmc8v", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.38.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie3922411a31", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:28:28.699155 containerd[1716]: 2025-05-15 12:28:28.681 [INFO][4958] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.38.70/32] ContainerID="83a51856a3083a906ea3a42505756438995433479fac3eb7ef4a0de954cd50bf" Namespace="kube-system" Pod="coredns-6f6b679f8f-dmc8v" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-coredns--6f6b679f8f--dmc8v-eth0" May 15 12:28:28.699155 containerd[1716]: 2025-05-15 12:28:28.681 [INFO][4958] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie3922411a31 ContainerID="83a51856a3083a906ea3a42505756438995433479fac3eb7ef4a0de954cd50bf" Namespace="kube-system" Pod="coredns-6f6b679f8f-dmc8v" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-coredns--6f6b679f8f--dmc8v-eth0" May 15 12:28:28.699155 containerd[1716]: 2025-05-15 12:28:28.683 [INFO][4958] cni-plugin/dataplane_linux.go 508: Disabling IPv4 
forwarding ContainerID="83a51856a3083a906ea3a42505756438995433479fac3eb7ef4a0de954cd50bf" Namespace="kube-system" Pod="coredns-6f6b679f8f-dmc8v" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-coredns--6f6b679f8f--dmc8v-eth0" May 15 12:28:28.699155 containerd[1716]: 2025-05-15 12:28:28.684 [INFO][4958] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="83a51856a3083a906ea3a42505756438995433479fac3eb7ef4a0de954cd50bf" Namespace="kube-system" Pod="coredns-6f6b679f8f-dmc8v" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-coredns--6f6b679f8f--dmc8v-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334.0.0--a--81f65144c0-k8s-coredns--6f6b679f8f--dmc8v-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"879a1dfa-1019-4d50-8d7e-cf0e572f63f5", ResourceVersion:"799", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 26, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334.0.0-a-81f65144c0", ContainerID:"83a51856a3083a906ea3a42505756438995433479fac3eb7ef4a0de954cd50bf", Pod:"coredns-6f6b679f8f-dmc8v", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.38.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie3922411a31", MAC:"ae:1d:39:ec:8e:af", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:28:28.699155 containerd[1716]: 2025-05-15 12:28:28.696 [INFO][4958] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="83a51856a3083a906ea3a42505756438995433479fac3eb7ef4a0de954cd50bf" Namespace="kube-system" Pod="coredns-6f6b679f8f-dmc8v" WorkloadEndpoint="ci--4334.0.0--a--81f65144c0-k8s-coredns--6f6b679f8f--dmc8v-eth0" May 15 12:28:29.389636 containerd[1716]: time="2025-05-15T12:28:29.389596554Z" level=info msg="connecting to shim 9bff7a2971678588f9513ff87178816b9a849fe7a52d11f97f99fe09897dbc77" address="unix:///run/containerd/s/0648982fc0643359f56ba731efbb7a0a54d2932162c1a3551dde7910e9987481" namespace=k8s.io protocol=ttrpc version=3 May 15 12:28:29.409067 systemd[1]: Started cri-containerd-9bff7a2971678588f9513ff87178816b9a849fe7a52d11f97f99fe09897dbc77.scope - libcontainer container 9bff7a2971678588f9513ff87178816b9a849fe7a52d11f97f99fe09897dbc77. May 15 12:28:29.438639 containerd[1716]: time="2025-05-15T12:28:29.438575368Z" level=info msg="connecting to shim e6ceb4b87bd4b9e61246f41fd92281c95ab19914da3a57470587b7bf70ac7146" address="unix:///run/containerd/s/fcb3498793bf572bc9498e1dad1bd87f65e695dde2ab420fa2fc61980984e32c" namespace=k8s.io protocol=ttrpc version=3 May 15 12:28:29.465022 systemd[1]: Started cri-containerd-e6ceb4b87bd4b9e61246f41fd92281c95ab19914da3a57470587b7bf70ac7146.scope - libcontainer container e6ceb4b87bd4b9e61246f41fd92281c95ab19914da3a57470587b7bf70ac7146. 
May 15 12:28:29.532092 systemd-networkd[1350]: cali724699b3438: Gained IPv6LL May 15 12:28:29.587375 containerd[1716]: time="2025-05-15T12:28:29.587352308Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57b6bcfb55-pdhwm,Uid:780b7c79-8417-4cc7-b89c-26a4ca3518c0,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"9bff7a2971678588f9513ff87178816b9a849fe7a52d11f97f99fe09897dbc77\"" May 15 12:28:29.684427 containerd[1716]: time="2025-05-15T12:28:29.684353225Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-lvfdw,Uid:eb5df7ca-784b-4d32-8ab0-b2fcf0d8f4a3,Namespace:kube-system,Attempt:0,} returns sandbox id \"e6ceb4b87bd4b9e61246f41fd92281c95ab19914da3a57470587b7bf70ac7146\"" May 15 12:28:29.686499 containerd[1716]: time="2025-05-15T12:28:29.686455805Z" level=info msg="CreateContainer within sandbox \"e6ceb4b87bd4b9e61246f41fd92281c95ab19914da3a57470587b7bf70ac7146\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 15 12:28:29.881883 containerd[1716]: time="2025-05-15T12:28:29.881854028Z" level=info msg="connecting to shim 83a51856a3083a906ea3a42505756438995433479fac3eb7ef4a0de954cd50bf" address="unix:///run/containerd/s/5ab5a03183393d476a3199503015499ea3853fb641b14863eb253225483b78d7" namespace=k8s.io protocol=ttrpc version=3 May 15 12:28:29.900033 systemd[1]: Started cri-containerd-83a51856a3083a906ea3a42505756438995433479fac3eb7ef4a0de954cd50bf.scope - libcontainer container 83a51856a3083a906ea3a42505756438995433479fac3eb7ef4a0de954cd50bf. 
May 15 12:28:29.915999 systemd-networkd[1350]: calie3922411a31: Gained IPv6LL May 15 12:28:30.126746 containerd[1716]: time="2025-05-15T12:28:30.126710933Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-dmc8v,Uid:879a1dfa-1019-4d50-8d7e-cf0e572f63f5,Namespace:kube-system,Attempt:0,} returns sandbox id \"83a51856a3083a906ea3a42505756438995433479fac3eb7ef4a0de954cd50bf\"" May 15 12:28:30.556037 systemd-networkd[1350]: cali71637819eda: Gained IPv6LL May 15 12:28:30.779699 containerd[1716]: time="2025-05-15T12:28:30.779664358Z" level=info msg="CreateContainer within sandbox \"83a51856a3083a906ea3a42505756438995433479fac3eb7ef4a0de954cd50bf\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 15 12:28:30.939141 containerd[1716]: time="2025-05-15T12:28:30.939109354Z" level=info msg="Container a3647fd1a4b0ff62f008c4c4d6804f7f5eb07230379d90e1b81b5dbe83804aef: CDI devices from CRI Config.CDIDevices: []" May 15 12:28:31.128660 containerd[1716]: time="2025-05-15T12:28:31.128580343Z" level=info msg="CreateContainer within sandbox \"e6ceb4b87bd4b9e61246f41fd92281c95ab19914da3a57470587b7bf70ac7146\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"a3647fd1a4b0ff62f008c4c4d6804f7f5eb07230379d90e1b81b5dbe83804aef\"" May 15 12:28:31.129151 containerd[1716]: time="2025-05-15T12:28:31.129134577Z" level=info msg="StartContainer for \"a3647fd1a4b0ff62f008c4c4d6804f7f5eb07230379d90e1b81b5dbe83804aef\"" May 15 12:28:31.130135 containerd[1716]: time="2025-05-15T12:28:31.130108294Z" level=info msg="connecting to shim a3647fd1a4b0ff62f008c4c4d6804f7f5eb07230379d90e1b81b5dbe83804aef" address="unix:///run/containerd/s/fcb3498793bf572bc9498e1dad1bd87f65e695dde2ab420fa2fc61980984e32c" protocol=ttrpc version=3 May 15 12:28:31.147030 systemd[1]: Started cri-containerd-a3647fd1a4b0ff62f008c4c4d6804f7f5eb07230379d90e1b81b5dbe83804aef.scope - libcontainer container a3647fd1a4b0ff62f008c4c4d6804f7f5eb07230379d90e1b81b5dbe83804aef. 
May 15 12:28:31.178570 containerd[1716]: time="2025-05-15T12:28:31.178332531Z" level=info msg="Container 4ae20b02679d5db6957bf7007eaa8220652aad4664726f5715949f55c0126cbe: CDI devices from CRI Config.CDIDevices: []" May 15 12:28:31.179136 containerd[1716]: time="2025-05-15T12:28:31.179113819Z" level=info msg="StartContainer for \"a3647fd1a4b0ff62f008c4c4d6804f7f5eb07230379d90e1b81b5dbe83804aef\" returns successfully" May 15 12:28:31.377727 containerd[1716]: time="2025-05-15T12:28:31.377708401Z" level=info msg="CreateContainer within sandbox \"83a51856a3083a906ea3a42505756438995433479fac3eb7ef4a0de954cd50bf\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"4ae20b02679d5db6957bf7007eaa8220652aad4664726f5715949f55c0126cbe\"" May 15 12:28:31.378088 containerd[1716]: time="2025-05-15T12:28:31.378073278Z" level=info msg="StartContainer for \"4ae20b02679d5db6957bf7007eaa8220652aad4664726f5715949f55c0126cbe\"" May 15 12:28:31.379156 containerd[1716]: time="2025-05-15T12:28:31.379012306Z" level=info msg="connecting to shim 4ae20b02679d5db6957bf7007eaa8220652aad4664726f5715949f55c0126cbe" address="unix:///run/containerd/s/5ab5a03183393d476a3199503015499ea3853fb641b14863eb253225483b78d7" protocol=ttrpc version=3 May 15 12:28:31.395056 systemd[1]: Started cri-containerd-4ae20b02679d5db6957bf7007eaa8220652aad4664726f5715949f55c0126cbe.scope - libcontainer container 4ae20b02679d5db6957bf7007eaa8220652aad4664726f5715949f55c0126cbe. 
May 15 12:28:31.424762 containerd[1716]: time="2025-05-15T12:28:31.424731664Z" level=info msg="StartContainer for \"4ae20b02679d5db6957bf7007eaa8220652aad4664726f5715949f55c0126cbe\" returns successfully" May 15 12:28:31.507448 kubelet[3207]: I0515 12:28:31.506878 3207 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-lvfdw" podStartSLOduration=98.506861216 podStartE2EDuration="1m38.506861216s" podCreationTimestamp="2025-05-15 12:26:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-15 12:28:31.506685428 +0000 UTC m=+98.349142635" watchObservedRunningTime="2025-05-15 12:28:31.506861216 +0000 UTC m=+98.349318425" May 15 12:28:31.521116 kubelet[3207]: I0515 12:28:31.521080 3207 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-dmc8v" podStartSLOduration=98.521069508 podStartE2EDuration="1m38.521069508s" podCreationTimestamp="2025-05-15 12:26:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-15 12:28:31.520516179 +0000 UTC m=+98.362973387" watchObservedRunningTime="2025-05-15 12:28:31.521069508 +0000 UTC m=+98.363526719" May 15 12:28:31.650940 containerd[1716]: time="2025-05-15T12:28:31.650803088Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e2dc574ef2016e1c9f3cd9704b2529564c6fff5b1e2a976899f9cd0e318c6bd4\" id:\"3001e39435fc1d94434dd9396fa99d5af258601939135703bc5d11fd041aa6ae\" pid:5237 exited_at:{seconds:1747312111 nanos:650437308}" May 15 12:28:37.180997 containerd[1716]: time="2025-05-15T12:28:37.180950646Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:28:37.184557 containerd[1716]: time="2025-05-15T12:28:37.184529945Z" 
level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3: active requests=0, bytes read=13991773" May 15 12:28:37.228221 containerd[1716]: time="2025-05-15T12:28:37.228151803Z" level=info msg="ImageCreate event name:\"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:28:37.274672 containerd[1716]: time="2025-05-15T12:28:37.274605711Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:3f15090a9bb45773d1fd019455ec3d3f3746f3287c35d8013e497b38d8237324\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:28:37.275282 containerd[1716]: time="2025-05-15T12:28:37.275191058Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" with image id \"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:3f15090a9bb45773d1fd019455ec3d3f3746f3287c35d8013e497b38d8237324\", size \"15484347\" in 13.388276699s" May 15 12:28:37.275282 containerd[1716]: time="2025-05-15T12:28:37.275220941Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" returns image reference \"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\"" May 15 12:28:37.276034 containerd[1716]: time="2025-05-15T12:28:37.275974961Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\"" May 15 12:28:37.277057 containerd[1716]: time="2025-05-15T12:28:37.277012538Z" level=info msg="CreateContainer within sandbox \"f44aaeb69d7b636757f02c6323fc4ead482e25e043643f8172cffa79a9a8ea41\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" May 15 12:28:37.482452 containerd[1716]: time="2025-05-15T12:28:37.482347724Z" level=info msg="Container 
14c6f2639a19e4350059dcad86d073fe4ffb896cba2439caa3a18bd23b9dbb6e: CDI devices from CRI Config.CDIDevices: []" May 15 12:28:37.581116 containerd[1716]: time="2025-05-15T12:28:37.581077946Z" level=info msg="CreateContainer within sandbox \"f44aaeb69d7b636757f02c6323fc4ead482e25e043643f8172cffa79a9a8ea41\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"14c6f2639a19e4350059dcad86d073fe4ffb896cba2439caa3a18bd23b9dbb6e\"" May 15 12:28:37.581566 containerd[1716]: time="2025-05-15T12:28:37.581541598Z" level=info msg="StartContainer for \"14c6f2639a19e4350059dcad86d073fe4ffb896cba2439caa3a18bd23b9dbb6e\"" May 15 12:28:37.583084 containerd[1716]: time="2025-05-15T12:28:37.583058587Z" level=info msg="connecting to shim 14c6f2639a19e4350059dcad86d073fe4ffb896cba2439caa3a18bd23b9dbb6e" address="unix:///run/containerd/s/edfd285cbb62dba457b148f68114dc339a42e2ed5340a2be42784f1340d92f06" protocol=ttrpc version=3 May 15 12:28:37.604107 systemd[1]: Started cri-containerd-14c6f2639a19e4350059dcad86d073fe4ffb896cba2439caa3a18bd23b9dbb6e.scope - libcontainer container 14c6f2639a19e4350059dcad86d073fe4ffb896cba2439caa3a18bd23b9dbb6e. May 15 12:28:37.634522 containerd[1716]: time="2025-05-15T12:28:37.634495371Z" level=info msg="StartContainer for \"14c6f2639a19e4350059dcad86d073fe4ffb896cba2439caa3a18bd23b9dbb6e\" returns successfully" May 15 12:28:38.604232 kubelet[3207]: I0515 12:28:38.604207 3207 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 May 15 12:28:38.604232 kubelet[3207]: I0515 12:28:38.604239 3207 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock May 15 12:28:44.338154 systemd[1]: Started sshd@7-10.200.8.32:22-10.200.16.10:49298.service - OpenSSH per-connection server daemon (10.200.16.10:49298). 
May 15 12:28:44.973207 sshd[5305]: Accepted publickey for core from 10.200.16.10 port 49298 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk May 15 12:28:44.974471 sshd-session[5305]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 12:28:44.978493 systemd-logind[1703]: New session 10 of user core. May 15 12:28:44.984047 systemd[1]: Started session-10.scope - Session 10 of User core. May 15 12:28:45.519978 sshd[5311]: Connection closed by 10.200.16.10 port 49298 May 15 12:28:45.520330 sshd-session[5305]: pam_unix(sshd:session): session closed for user core May 15 12:28:45.522974 systemd[1]: sshd@7-10.200.8.32:22-10.200.16.10:49298.service: Deactivated successfully. May 15 12:28:45.524581 systemd[1]: session-10.scope: Deactivated successfully. May 15 12:28:45.525633 systemd-logind[1703]: Session 10 logged out. Waiting for processes to exit. May 15 12:28:45.526679 systemd-logind[1703]: Removed session 10. May 15 12:28:50.652472 systemd[1]: Started sshd@8-10.200.8.32:22-10.200.16.10:49254.service - OpenSSH per-connection server daemon (10.200.16.10:49254). May 15 12:28:50.673920 containerd[1716]: time="2025-05-15T12:28:50.673856226Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:28:51.287226 sshd[5333]: Accepted publickey for core from 10.200.16.10 port 49254 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk May 15 12:28:51.288225 sshd-session[5333]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 12:28:51.292102 systemd-logind[1703]: New session 11 of user core. May 15 12:28:51.300064 systemd[1]: Started session-11.scope - Session 11 of User core. 
May 15 12:28:51.484201 containerd[1716]: time="2025-05-15T12:28:51.484149598Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=43021437" May 15 12:28:51.781504 sshd[5335]: Connection closed by 10.200.16.10 port 49254 May 15 12:28:51.781871 sshd-session[5333]: pam_unix(sshd:session): session closed for user core May 15 12:28:51.784552 systemd[1]: sshd@8-10.200.8.32:22-10.200.16.10:49254.service: Deactivated successfully. May 15 12:28:51.786210 systemd[1]: session-11.scope: Deactivated successfully. May 15 12:28:51.787025 systemd-logind[1703]: Session 11 logged out. Waiting for processes to exit. May 15 12:28:51.788226 systemd-logind[1703]: Removed session 11. May 15 12:28:53.083297 containerd[1716]: time="2025-05-15T12:28:53.083222546Z" level=info msg="ImageCreate event name:\"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:28:54.682405 containerd[1716]: time="2025-05-15T12:28:54.681317624Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:28:54.682405 containerd[1716]: time="2025-05-15T12:28:54.682162024Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"44514075\" in 17.406155055s" May 15 12:28:54.682405 containerd[1716]: time="2025-05-15T12:28:54.682189256Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\"" May 15 12:28:54.683973 
containerd[1716]: time="2025-05-15T12:28:54.683946816Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\"" May 15 12:28:54.684208 containerd[1716]: time="2025-05-15T12:28:54.684186947Z" level=info msg="CreateContainer within sandbox \"b3b667f30e77cf39d8f7ee19b2917e4e28b247dc505a4c90ea3b0d7d55f00c12\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 15 12:28:56.893603 systemd[1]: Started sshd@9-10.200.8.32:22-10.200.16.10:49270.service - OpenSSH per-connection server daemon (10.200.16.10:49270). May 15 12:28:57.530711 sshd[5354]: Accepted publickey for core from 10.200.16.10 port 49270 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk May 15 12:28:57.531754 sshd-session[5354]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 12:28:57.537292 systemd-logind[1703]: New session 12 of user core. May 15 12:28:57.541049 systemd[1]: Started session-12.scope - Session 12 of User core. May 15 12:28:58.021612 sshd[5356]: Connection closed by 10.200.16.10 port 49270 May 15 12:28:58.021959 sshd-session[5354]: pam_unix(sshd:session): session closed for user core May 15 12:28:58.024485 systemd[1]: sshd@9-10.200.8.32:22-10.200.16.10:49270.service: Deactivated successfully. May 15 12:28:58.026107 systemd[1]: session-12.scope: Deactivated successfully. May 15 12:28:58.026778 systemd-logind[1703]: Session 12 logged out. Waiting for processes to exit. May 15 12:28:58.027854 systemd-logind[1703]: Removed session 12. 
May 15 12:29:01.579053 containerd[1716]: time="2025-05-15T12:29:01.578995724Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e2dc574ef2016e1c9f3cd9704b2529564c6fff5b1e2a976899f9cd0e318c6bd4\" id:\"e07270dee09491e45b758f9f7b9ac627415f447f817f95351b7ca4485636f2e7\" pid:5387 exited_at:{seconds:1747312141 nanos:578706181}" May 15 12:29:03.134980 systemd[1]: Started sshd@10-10.200.8.32:22-10.200.16.10:41716.service - OpenSSH per-connection server daemon (10.200.16.10:41716). May 15 12:29:03.775232 sshd[5400]: Accepted publickey for core from 10.200.16.10 port 41716 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk May 15 12:29:03.776219 sshd-session[5400]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 12:29:03.780143 systemd-logind[1703]: New session 13 of user core. May 15 12:29:04.430794 sshd[5402]: Connection closed by 10.200.16.10 port 41716 May 15 12:29:04.297463 sshd-session[5400]: pam_unix(sshd:session): session closed for user core May 15 12:29:03.784059 systemd[1]: Started session-13.scope - Session 13 of User core. May 15 12:29:04.299622 systemd[1]: sshd@10-10.200.8.32:22-10.200.16.10:41716.service: Deactivated successfully. May 15 12:29:04.301290 systemd[1]: session-13.scope: Deactivated successfully. May 15 12:29:04.302802 systemd-logind[1703]: Session 13 logged out. Waiting for processes to exit. May 15 12:29:04.303607 systemd-logind[1703]: Removed session 13. 
May 15 12:29:05.428312 containerd[1716]: time="2025-05-15T12:29:05.428269520Z" level=info msg="Container 562ceb9f09aca4ea0122d95f648745003261890f61ed7e63a186e8ce17d7743b: CDI devices from CRI Config.CDIDevices: []" May 15 12:29:05.571054 containerd[1716]: time="2025-05-15T12:29:05.571019111Z" level=info msg="CreateContainer within sandbox \"b3b667f30e77cf39d8f7ee19b2917e4e28b247dc505a4c90ea3b0d7d55f00c12\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"562ceb9f09aca4ea0122d95f648745003261890f61ed7e63a186e8ce17d7743b\"" May 15 12:29:05.571442 containerd[1716]: time="2025-05-15T12:29:05.571408156Z" level=info msg="StartContainer for \"562ceb9f09aca4ea0122d95f648745003261890f61ed7e63a186e8ce17d7743b\"" May 15 12:29:05.572781 containerd[1716]: time="2025-05-15T12:29:05.572739845Z" level=info msg="connecting to shim 562ceb9f09aca4ea0122d95f648745003261890f61ed7e63a186e8ce17d7743b" address="unix:///run/containerd/s/23b68556a74f3a8f074d55288a886e5d631483f477d08f9f9f2108b079084337" protocol=ttrpc version=3 May 15 12:29:05.596035 systemd[1]: Started cri-containerd-562ceb9f09aca4ea0122d95f648745003261890f61ed7e63a186e8ce17d7743b.scope - libcontainer container 562ceb9f09aca4ea0122d95f648745003261890f61ed7e63a186e8ce17d7743b. 
May 15 12:29:05.635580 containerd[1716]: time="2025-05-15T12:29:05.635561140Z" level=info msg="StartContainer for \"562ceb9f09aca4ea0122d95f648745003261890f61ed7e63a186e8ce17d7743b\" returns successfully" May 15 12:29:06.578144 kubelet[3207]: I0515 12:29:06.578087 3207 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-x9zg6" podStartSLOduration=104.43727592 podStartE2EDuration="2m3.578069649s" podCreationTimestamp="2025-05-15 12:27:03 +0000 UTC" firstStartedPulling="2025-05-15 12:28:18.135057087 +0000 UTC m=+84.977514284" lastFinishedPulling="2025-05-15 12:28:37.275850808 +0000 UTC m=+104.118308013" observedRunningTime="2025-05-15 12:28:38.527082738 +0000 UTC m=+105.369539950" watchObservedRunningTime="2025-05-15 12:29:06.578069649 +0000 UTC m=+133.420526857" May 15 12:29:06.578545 kubelet[3207]: I0515 12:29:06.578262 3207 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-57b6bcfb55-gpjjj" podStartSLOduration=94.601577558 podStartE2EDuration="2m3.578257166s" podCreationTimestamp="2025-05-15 12:27:03 +0000 UTC" firstStartedPulling="2025-05-15 12:28:25.706270225 +0000 UTC m=+92.548727429" lastFinishedPulling="2025-05-15 12:28:54.682949826 +0000 UTC m=+121.525407037" observedRunningTime="2025-05-15 12:29:06.57654259 +0000 UTC m=+133.418999800" watchObservedRunningTime="2025-05-15 12:29:06.578257166 +0000 UTC m=+133.420714376" May 15 12:29:09.414099 systemd[1]: Started sshd@11-10.200.8.32:22-10.200.16.10:42206.service - OpenSSH per-connection server daemon (10.200.16.10:42206). May 15 12:29:10.052293 sshd[5453]: Accepted publickey for core from 10.200.16.10 port 42206 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk May 15 12:29:10.053213 sshd-session[5453]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 12:29:10.060088 systemd-logind[1703]: New session 14 of user core. 
May 15 12:29:10.066064 systemd[1]: Started session-14.scope - Session 14 of User core. May 15 12:29:10.547231 sshd[5458]: Connection closed by 10.200.16.10 port 42206 May 15 12:29:10.547638 sshd-session[5453]: pam_unix(sshd:session): session closed for user core May 15 12:29:10.550265 systemd[1]: sshd@11-10.200.8.32:22-10.200.16.10:42206.service: Deactivated successfully. May 15 12:29:10.551861 systemd[1]: session-14.scope: Deactivated successfully. May 15 12:29:10.552565 systemd-logind[1703]: Session 14 logged out. Waiting for processes to exit. May 15 12:29:10.553666 systemd-logind[1703]: Removed session 14. May 15 12:29:13.523902 containerd[1716]: time="2025-05-15T12:29:13.523844175Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:29:13.526056 containerd[1716]: time="2025-05-15T12:29:13.526031477Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.3: active requests=0, bytes read=34789138" May 15 12:29:13.571293 containerd[1716]: time="2025-05-15T12:29:13.571244535Z" level=info msg="ImageCreate event name:\"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:29:13.634394 containerd[1716]: time="2025-05-15T12:29:13.634357431Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:feaab0197035d474845e0f8137a99a78cab274f0a3cac4d5485cf9b1bdf9ffa9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:29:13.634830 containerd[1716]: time="2025-05-15T12:29:13.634801395Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" with image id \"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\", repo digest 
\"ghcr.io/flatcar/calico/kube-controllers@sha256:feaab0197035d474845e0f8137a99a78cab274f0a3cac4d5485cf9b1bdf9ffa9\", size \"36281728\" in 18.950721393s" May 15 12:29:13.634883 containerd[1716]: time="2025-05-15T12:29:13.634829147Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" returns image reference \"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\"" May 15 12:29:13.635655 containerd[1716]: time="2025-05-15T12:29:13.635607964Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\"" May 15 12:29:13.645932 containerd[1716]: time="2025-05-15T12:29:13.645767850Z" level=info msg="CreateContainer within sandbox \"d166d16cc59d6c28accf8f6a5d9fec059da2761ba6c4ae91fe1d32b51813df91\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" May 15 12:29:13.828359 containerd[1716]: time="2025-05-15T12:29:13.826832340Z" level=info msg="Container 8675fde546ffc70ad2100e0c07c53b23e5dd18b8e8ff4df5333b9e5a3f78821a: CDI devices from CRI Config.CDIDevices: []" May 15 12:29:13.938280 containerd[1716]: time="2025-05-15T12:29:13.938257639Z" level=info msg="CreateContainer within sandbox \"d166d16cc59d6c28accf8f6a5d9fec059da2761ba6c4ae91fe1d32b51813df91\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"8675fde546ffc70ad2100e0c07c53b23e5dd18b8e8ff4df5333b9e5a3f78821a\"" May 15 12:29:13.938642 containerd[1716]: time="2025-05-15T12:29:13.938622160Z" level=info msg="StartContainer for \"8675fde546ffc70ad2100e0c07c53b23e5dd18b8e8ff4df5333b9e5a3f78821a\"" May 15 12:29:13.939568 containerd[1716]: time="2025-05-15T12:29:13.939528444Z" level=info msg="connecting to shim 8675fde546ffc70ad2100e0c07c53b23e5dd18b8e8ff4df5333b9e5a3f78821a" address="unix:///run/containerd/s/f757e7330b8065d47319d617027468c394194a051a202d1300123206e9380349" protocol=ttrpc version=3 May 15 12:29:13.959055 systemd[1]: Started 
cri-containerd-8675fde546ffc70ad2100e0c07c53b23e5dd18b8e8ff4df5333b9e5a3f78821a.scope - libcontainer container 8675fde546ffc70ad2100e0c07c53b23e5dd18b8e8ff4df5333b9e5a3f78821a. May 15 12:29:13.998620 containerd[1716]: time="2025-05-15T12:29:13.998584534Z" level=info msg="StartContainer for \"8675fde546ffc70ad2100e0c07c53b23e5dd18b8e8ff4df5333b9e5a3f78821a\" returns successfully" May 15 12:29:14.520626 containerd[1716]: time="2025-05-15T12:29:14.520564697Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:29:14.584926 containerd[1716]: time="2025-05-15T12:29:14.584633106Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=77" May 15 12:29:14.587990 containerd[1716]: time="2025-05-15T12:29:14.587967667Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"44514075\" in 952.33541ms" May 15 12:29:14.588176 containerd[1716]: time="2025-05-15T12:29:14.588159811Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\"" May 15 12:29:14.590287 containerd[1716]: time="2025-05-15T12:29:14.590203519Z" level=info msg="CreateContainer within sandbox \"9bff7a2971678588f9513ff87178816b9a849fe7a52d11f97f99fe09897dbc77\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 15 12:29:14.623965 containerd[1716]: time="2025-05-15T12:29:14.623944502Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8675fde546ffc70ad2100e0c07c53b23e5dd18b8e8ff4df5333b9e5a3f78821a\" 
id:\"c4bdef536dab8d55709960aa786e4b3ad40e882bfab23fc99d35527b3ad6b08f\" pid:5521 exited_at:{seconds:1747312154 nanos:623622986}" May 15 12:29:14.636441 kubelet[3207]: I0515 12:29:14.636391 3207 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-6d87fb4d96-lvq8m" podStartSLOduration=84.689010391 podStartE2EDuration="2m11.636375686s" podCreationTimestamp="2025-05-15 12:27:03 +0000 UTC" firstStartedPulling="2025-05-15 12:28:26.688107141 +0000 UTC m=+93.530564343" lastFinishedPulling="2025-05-15 12:29:13.635472424 +0000 UTC m=+140.477929638" observedRunningTime="2025-05-15 12:29:14.596253733 +0000 UTC m=+141.438710943" watchObservedRunningTime="2025-05-15 12:29:14.636375686 +0000 UTC m=+141.478833001" May 15 12:29:14.776883 containerd[1716]: time="2025-05-15T12:29:14.774526603Z" level=info msg="Container 17c5dc9a248f875594b54639ca7d9640a15c993f738784ce6537f31e1e51916f: CDI devices from CRI Config.CDIDevices: []" May 15 12:29:14.933515 containerd[1716]: time="2025-05-15T12:29:14.933484913Z" level=info msg="CreateContainer within sandbox \"9bff7a2971678588f9513ff87178816b9a849fe7a52d11f97f99fe09897dbc77\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"17c5dc9a248f875594b54639ca7d9640a15c993f738784ce6537f31e1e51916f\"" May 15 12:29:14.934007 containerd[1716]: time="2025-05-15T12:29:14.933961638Z" level=info msg="StartContainer for \"17c5dc9a248f875594b54639ca7d9640a15c993f738784ce6537f31e1e51916f\"" May 15 12:29:14.935335 containerd[1716]: time="2025-05-15T12:29:14.935289053Z" level=info msg="connecting to shim 17c5dc9a248f875594b54639ca7d9640a15c993f738784ce6537f31e1e51916f" address="unix:///run/containerd/s/0648982fc0643359f56ba731efbb7a0a54d2932162c1a3551dde7910e9987481" protocol=ttrpc version=3 May 15 12:29:14.956070 systemd[1]: Started cri-containerd-17c5dc9a248f875594b54639ca7d9640a15c993f738784ce6537f31e1e51916f.scope - libcontainer container 
17c5dc9a248f875594b54639ca7d9640a15c993f738784ce6537f31e1e51916f. May 15 12:29:14.995262 containerd[1716]: time="2025-05-15T12:29:14.995208346Z" level=info msg="StartContainer for \"17c5dc9a248f875594b54639ca7d9640a15c993f738784ce6537f31e1e51916f\" returns successfully" May 15 12:29:15.670706 systemd[1]: Started sshd@12-10.200.8.32:22-10.200.16.10:42214.service - OpenSSH per-connection server daemon (10.200.16.10:42214). May 15 12:29:15.754603 kubelet[3207]: I0515 12:29:15.754531 3207 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-57b6bcfb55-pdhwm" podStartSLOduration=87.754155509 podStartE2EDuration="2m12.754515117s" podCreationTimestamp="2025-05-15 12:27:03 +0000 UTC" firstStartedPulling="2025-05-15 12:28:29.588343721 +0000 UTC m=+96.430800921" lastFinishedPulling="2025-05-15 12:29:14.588703329 +0000 UTC m=+141.431160529" observedRunningTime="2025-05-15 12:29:15.596572763 +0000 UTC m=+142.439029973" watchObservedRunningTime="2025-05-15 12:29:15.754515117 +0000 UTC m=+142.596972328" May 15 12:29:16.305688 sshd[5567]: Accepted publickey for core from 10.200.16.10 port 42214 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk May 15 12:29:16.306714 sshd-session[5567]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 12:29:16.310991 systemd-logind[1703]: New session 15 of user core. May 15 12:29:16.314048 systemd[1]: Started session-15.scope - Session 15 of User core. May 15 12:29:16.853654 sshd[5571]: Connection closed by 10.200.16.10 port 42214 May 15 12:29:16.854079 sshd-session[5567]: pam_unix(sshd:session): session closed for user core May 15 12:29:16.856709 systemd[1]: sshd@12-10.200.8.32:22-10.200.16.10:42214.service: Deactivated successfully. May 15 12:29:16.858431 systemd[1]: session-15.scope: Deactivated successfully. May 15 12:29:16.859476 systemd-logind[1703]: Session 15 logged out. Waiting for processes to exit. 
May 15 12:29:16.860583 systemd-logind[1703]: Removed session 15. May 15 12:29:17.177392 update_engine[1709]: I20250515 12:29:17.177294 1709 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs May 15 12:29:17.177392 update_engine[1709]: I20250515 12:29:17.177337 1709 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs May 15 12:29:17.177800 update_engine[1709]: I20250515 12:29:17.177461 1709 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs May 15 12:29:17.177800 update_engine[1709]: I20250515 12:29:17.177763 1709 omaha_request_params.cc:62] Current group set to developer May 15 12:29:17.178318 update_engine[1709]: I20250515 12:29:17.177858 1709 update_attempter.cc:499] Already updated boot flags. Skipping. May 15 12:29:17.178318 update_engine[1709]: I20250515 12:29:17.177865 1709 update_attempter.cc:643] Scheduling an action processor start. May 15 12:29:17.178318 update_engine[1709]: I20250515 12:29:17.178242 1709 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction May 15 12:29:17.178414 update_engine[1709]: I20250515 12:29:17.178321 1709 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs May 15 12:29:17.178414 update_engine[1709]: I20250515 12:29:17.178380 1709 omaha_request_action.cc:271] Posting an Omaha request to disabled May 15 12:29:17.178414 update_engine[1709]: I20250515 12:29:17.178385 1709 omaha_request_action.cc:272] Request: May 15 12:29:17.178414 update_engine[1709]: May 15 12:29:17.178414 update_engine[1709]: May 15 12:29:17.178414 update_engine[1709]: May 15 12:29:17.178414 update_engine[1709]: May 15 12:29:17.178414 update_engine[1709]: May 15 12:29:17.178414 update_engine[1709]: May 15 12:29:17.178414 update_engine[1709]: May 15 12:29:17.178414 update_engine[1709]: May 15 12:29:17.178414 update_engine[1709]: I20250515 12:29:17.178390 1709 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 15 
12:29:17.179376 locksmithd[1805]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 May 15 12:29:17.179994 update_engine[1709]: I20250515 12:29:17.179955 1709 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 15 12:29:17.180550 update_engine[1709]: I20250515 12:29:17.180508 1709 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. May 15 12:29:17.215232 update_engine[1709]: E20250515 12:29:17.215201 1709 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 15 12:29:17.215316 update_engine[1709]: I20250515 12:29:17.215273 1709 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 May 15 12:29:21.974273 systemd[1]: Started sshd@13-10.200.8.32:22-10.200.16.10:48208.service - OpenSSH per-connection server daemon (10.200.16.10:48208). May 15 12:29:22.255236 containerd[1716]: time="2025-05-15T12:29:22.255124825Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8675fde546ffc70ad2100e0c07c53b23e5dd18b8e8ff4df5333b9e5a3f78821a\" id:\"2e866b1eb142f013d61d5b0c1759d6ccde6977440dd8ad91670cadd05f18e66d\" pid:5606 exited_at:{seconds:1747312162 nanos:254287965}" May 15 12:29:22.617556 sshd[5591]: Accepted publickey for core from 10.200.16.10 port 48208 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk May 15 12:29:22.618635 sshd-session[5591]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 12:29:22.622785 systemd-logind[1703]: New session 16 of user core. May 15 12:29:22.630068 systemd[1]: Started session-16.scope - Session 16 of User core. May 15 12:29:23.111360 sshd[5615]: Connection closed by 10.200.16.10 port 48208 May 15 12:29:23.111793 sshd-session[5591]: pam_unix(sshd:session): session closed for user core May 15 12:29:23.114123 systemd[1]: sshd@13-10.200.8.32:22-10.200.16.10:48208.service: Deactivated successfully. 
May 15 12:29:23.115832 systemd[1]: session-16.scope: Deactivated successfully. May 15 12:29:23.117496 systemd-logind[1703]: Session 16 logged out. Waiting for processes to exit. May 15 12:29:23.118394 systemd-logind[1703]: Removed session 16. May 15 12:29:27.180595 update_engine[1709]: I20250515 12:29:27.180513 1709 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 15 12:29:27.181035 update_engine[1709]: I20250515 12:29:27.180801 1709 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 15 12:29:27.181151 update_engine[1709]: I20250515 12:29:27.181113 1709 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. May 15 12:29:27.217190 update_engine[1709]: E20250515 12:29:27.217145 1709 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 15 12:29:27.217302 update_engine[1709]: I20250515 12:29:27.217212 1709 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 May 15 12:29:28.230004 systemd[1]: Started sshd@14-10.200.8.32:22-10.200.16.10:48210.service - OpenSSH per-connection server daemon (10.200.16.10:48210). May 15 12:29:28.863222 sshd[5631]: Accepted publickey for core from 10.200.16.10 port 48210 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk May 15 12:29:28.864146 sshd-session[5631]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 12:29:28.868056 systemd-logind[1703]: New session 17 of user core. May 15 12:29:28.873064 systemd[1]: Started session-17.scope - Session 17 of User core. May 15 12:29:29.354396 sshd[5633]: Connection closed by 10.200.16.10 port 48210 May 15 12:29:29.354770 sshd-session[5631]: pam_unix(sshd:session): session closed for user core May 15 12:29:29.357310 systemd[1]: sshd@14-10.200.8.32:22-10.200.16.10:48210.service: Deactivated successfully. May 15 12:29:29.358891 systemd[1]: session-17.scope: Deactivated successfully. May 15 12:29:29.359564 systemd-logind[1703]: Session 17 logged out. 
Waiting for processes to exit. May 15 12:29:29.360721 systemd-logind[1703]: Removed session 17. May 15 12:29:31.579012 containerd[1716]: time="2025-05-15T12:29:31.578971935Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e2dc574ef2016e1c9f3cd9704b2529564c6fff5b1e2a976899f9cd0e318c6bd4\" id:\"1e1573c0bb929054393773a562520ca935505bf73284b0cd048e805dc76ac02b\" pid:5657 exited_at:{seconds:1747312171 nanos:578580737}" May 15 12:29:34.465737 systemd[1]: Started sshd@15-10.200.8.32:22-10.200.16.10:34922.service - OpenSSH per-connection server daemon (10.200.16.10:34922). May 15 12:29:35.103280 sshd[5670]: Accepted publickey for core from 10.200.16.10 port 34922 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk May 15 12:29:35.104424 sshd-session[5670]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 12:29:35.108521 systemd-logind[1703]: New session 18 of user core. May 15 12:29:35.110068 systemd[1]: Started session-18.scope - Session 18 of User core. May 15 12:29:35.597751 sshd[5672]: Connection closed by 10.200.16.10 port 34922 May 15 12:29:35.598113 sshd-session[5670]: pam_unix(sshd:session): session closed for user core May 15 12:29:35.600997 systemd-logind[1703]: Session 18 logged out. Waiting for processes to exit. May 15 12:29:35.601460 systemd[1]: sshd@15-10.200.8.32:22-10.200.16.10:34922.service: Deactivated successfully. May 15 12:29:35.604663 systemd[1]: session-18.scope: Deactivated successfully. May 15 12:29:35.609067 systemd-logind[1703]: Removed session 18. May 15 12:29:35.712132 systemd[1]: Started sshd@16-10.200.8.32:22-10.200.16.10:34930.service - OpenSSH per-connection server daemon (10.200.16.10:34930). 
May 15 12:29:36.345843 sshd[5685]: Accepted publickey for core from 10.200.16.10 port 34930 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk May 15 12:29:36.347735 sshd-session[5685]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 12:29:36.352569 systemd-logind[1703]: New session 19 of user core. May 15 12:29:36.359032 systemd[1]: Started session-19.scope - Session 19 of User core. May 15 12:29:36.878661 sshd[5687]: Connection closed by 10.200.16.10 port 34930 May 15 12:29:36.879081 sshd-session[5685]: pam_unix(sshd:session): session closed for user core May 15 12:29:36.881645 systemd[1]: sshd@16-10.200.8.32:22-10.200.16.10:34930.service: Deactivated successfully. May 15 12:29:36.883106 systemd[1]: session-19.scope: Deactivated successfully. May 15 12:29:36.883731 systemd-logind[1703]: Session 19 logged out. Waiting for processes to exit. May 15 12:29:36.885054 systemd-logind[1703]: Removed session 19. May 15 12:29:36.994349 systemd[1]: Started sshd@17-10.200.8.32:22-10.200.16.10:34946.service - OpenSSH per-connection server daemon (10.200.16.10:34946). May 15 12:29:37.177595 update_engine[1709]: I20250515 12:29:37.177238 1709 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 15 12:29:37.177595 update_engine[1709]: I20250515 12:29:37.177492 1709 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 15 12:29:37.177905 update_engine[1709]: I20250515 12:29:37.177760 1709 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
May 15 12:29:37.216933 update_engine[1709]: E20250515 12:29:37.214114 1709 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 15 12:29:37.216933 update_engine[1709]: I20250515 12:29:37.214179 1709 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 May 15 12:29:37.643068 sshd[5697]: Accepted publickey for core from 10.200.16.10 port 34946 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk May 15 12:29:37.644033 sshd-session[5697]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 12:29:37.648022 systemd-logind[1703]: New session 20 of user core. May 15 12:29:37.656068 systemd[1]: Started session-20.scope - Session 20 of User core. May 15 12:29:38.153030 sshd[5699]: Connection closed by 10.200.16.10 port 34946 May 15 12:29:38.153400 sshd-session[5697]: pam_unix(sshd:session): session closed for user core May 15 12:29:38.155965 systemd[1]: sshd@17-10.200.8.32:22-10.200.16.10:34946.service: Deactivated successfully. May 15 12:29:38.157602 systemd[1]: session-20.scope: Deactivated successfully. May 15 12:29:38.158283 systemd-logind[1703]: Session 20 logged out. Waiting for processes to exit. May 15 12:29:38.159450 systemd-logind[1703]: Removed session 20. May 15 12:29:43.273425 systemd[1]: Started sshd@18-10.200.8.32:22-10.200.16.10:57426.service - OpenSSH per-connection server daemon (10.200.16.10:57426). May 15 12:29:43.908097 sshd[5717]: Accepted publickey for core from 10.200.16.10 port 57426 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk May 15 12:29:43.909434 sshd-session[5717]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 12:29:43.912874 systemd-logind[1703]: New session 21 of user core. May 15 12:29:43.918071 systemd[1]: Started session-21.scope - Session 21 of User core. 
May 15 12:29:44.423645 sshd[5719]: Connection closed by 10.200.16.10 port 57426 May 15 12:29:44.424077 sshd-session[5717]: pam_unix(sshd:session): session closed for user core May 15 12:29:44.426235 systemd[1]: sshd@18-10.200.8.32:22-10.200.16.10:57426.service: Deactivated successfully. May 15 12:29:44.427886 systemd[1]: session-21.scope: Deactivated successfully. May 15 12:29:44.429524 systemd-logind[1703]: Session 21 logged out. Waiting for processes to exit. May 15 12:29:44.430337 systemd-logind[1703]: Removed session 21. May 15 12:29:47.178814 update_engine[1709]: I20250515 12:29:47.178759 1709 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 15 12:29:47.179193 update_engine[1709]: I20250515 12:29:47.178994 1709 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 15 12:29:47.179265 update_engine[1709]: I20250515 12:29:47.179229 1709 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. May 15 12:29:47.200871 update_engine[1709]: E20250515 12:29:47.200834 1709 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 15 12:29:47.200989 update_engine[1709]: I20250515 12:29:47.200891 1709 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded May 15 12:29:47.200989 update_engine[1709]: I20250515 12:29:47.200897 1709 omaha_request_action.cc:617] Omaha request response: May 15 12:29:47.200989 update_engine[1709]: E20250515 12:29:47.200975 1709 omaha_request_action.cc:636] Omaha request network transfer failed. May 15 12:29:47.201057 update_engine[1709]: I20250515 12:29:47.200992 1709 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. 
May 15 12:29:47.201057 update_engine[1709]: I20250515 12:29:47.200996 1709 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction May 15 12:29:47.201057 update_engine[1709]: I20250515 12:29:47.201000 1709 update_attempter.cc:306] Processing Done. May 15 12:29:47.201057 update_engine[1709]: E20250515 12:29:47.201014 1709 update_attempter.cc:619] Update failed. May 15 12:29:47.201057 update_engine[1709]: I20250515 12:29:47.201020 1709 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse May 15 12:29:47.201057 update_engine[1709]: I20250515 12:29:47.201025 1709 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) May 15 12:29:47.201057 update_engine[1709]: I20250515 12:29:47.201030 1709 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. May 15 12:29:47.201189 update_engine[1709]: I20250515 12:29:47.201104 1709 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction May 15 12:29:47.201189 update_engine[1709]: I20250515 12:29:47.201125 1709 omaha_request_action.cc:271] Posting an Omaha request to disabled May 15 12:29:47.201189 update_engine[1709]: I20250515 12:29:47.201129 1709 omaha_request_action.cc:272] Request: May 15 12:29:47.201189 update_engine[1709]: May 15 12:29:47.201189 update_engine[1709]: May 15 12:29:47.201189 update_engine[1709]: May 15 12:29:47.201189 update_engine[1709]: May 15 12:29:47.201189 update_engine[1709]: May 15 12:29:47.201189 update_engine[1709]: May 15 12:29:47.201189 update_engine[1709]: I20250515 12:29:47.201134 1709 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 15 12:29:47.201369 update_engine[1709]: I20250515 12:29:47.201253 1709 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 15 12:29:47.201459 update_engine[1709]: I20250515 12:29:47.201423 1709 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
May 15 12:29:47.201683 locksmithd[1805]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 May 15 12:29:47.224908 update_engine[1709]: E20250515 12:29:47.224872 1709 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 15 12:29:47.225008 update_engine[1709]: I20250515 12:29:47.224931 1709 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded May 15 12:29:47.225008 update_engine[1709]: I20250515 12:29:47.224938 1709 omaha_request_action.cc:617] Omaha request response: May 15 12:29:47.225008 update_engine[1709]: I20250515 12:29:47.224943 1709 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction May 15 12:29:47.225008 update_engine[1709]: I20250515 12:29:47.224947 1709 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction May 15 12:29:47.225008 update_engine[1709]: I20250515 12:29:47.224951 1709 update_attempter.cc:306] Processing Done. May 15 12:29:47.225008 update_engine[1709]: I20250515 12:29:47.224954 1709 update_attempter.cc:310] Error event sent. May 15 12:29:47.225008 update_engine[1709]: I20250515 12:29:47.224962 1709 update_check_scheduler.cc:74] Next update check in 47m0s May 15 12:29:47.225279 locksmithd[1805]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 May 15 12:29:49.536621 systemd[1]: Started sshd@19-10.200.8.32:22-10.200.16.10:39878.service - OpenSSH per-connection server daemon (10.200.16.10:39878). May 15 12:29:50.170528 sshd[5733]: Accepted publickey for core from 10.200.16.10 port 39878 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk May 15 12:29:50.171813 sshd-session[5733]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 12:29:50.175110 systemd-logind[1703]: New session 22 of user core. 
May 15 12:29:50.180039 systemd[1]: Started session-22.scope - Session 22 of User core. May 15 12:29:50.667465 sshd[5747]: Connection closed by 10.200.16.10 port 39878 May 15 12:29:50.667833 sshd-session[5733]: pam_unix(sshd:session): session closed for user core May 15 12:29:50.670034 systemd[1]: sshd@19-10.200.8.32:22-10.200.16.10:39878.service: Deactivated successfully. May 15 12:29:50.671639 systemd[1]: session-22.scope: Deactivated successfully. May 15 12:29:50.672755 systemd-logind[1703]: Session 22 logged out. Waiting for processes to exit. May 15 12:29:50.674082 systemd-logind[1703]: Removed session 22. May 15 12:29:52.254637 containerd[1716]: time="2025-05-15T12:29:52.254573704Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8675fde546ffc70ad2100e0c07c53b23e5dd18b8e8ff4df5333b9e5a3f78821a\" id:\"14496fd55d6c70feded8c84e92d4410d14c67ebf1df974ede2f96a7f9ecfdc73\" pid:5776 exited_at:{seconds:1747312192 nanos:254168741}" May 15 12:29:54.777796 containerd[1716]: time="2025-05-15T12:29:54.777716348Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8675fde546ffc70ad2100e0c07c53b23e5dd18b8e8ff4df5333b9e5a3f78821a\" id:\"7bc5310f33142a36bb0264c39ab458ffdef495a9809aeecc81dffbfd46e54bcf\" pid:5802 exited_at:{seconds:1747312194 nanos:777381374}" May 15 12:29:55.792638 systemd[1]: Started sshd@20-10.200.8.32:22-10.200.16.10:39890.service - OpenSSH per-connection server daemon (10.200.16.10:39890). May 15 12:29:56.427263 sshd[5814]: Accepted publickey for core from 10.200.16.10 port 39890 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk May 15 12:29:56.428477 sshd-session[5814]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 12:29:56.432684 systemd-logind[1703]: New session 23 of user core. May 15 12:29:56.439085 systemd[1]: Started session-23.scope - Session 23 of User core. 
May 15 12:29:56.921506 sshd[5816]: Connection closed by 10.200.16.10 port 39890 May 15 12:29:56.921965 sshd-session[5814]: pam_unix(sshd:session): session closed for user core May 15 12:29:56.923993 systemd[1]: sshd@20-10.200.8.32:22-10.200.16.10:39890.service: Deactivated successfully. May 15 12:29:56.925622 systemd[1]: session-23.scope: Deactivated successfully. May 15 12:29:56.927316 systemd-logind[1703]: Session 23 logged out. Waiting for processes to exit. May 15 12:29:56.928110 systemd-logind[1703]: Removed session 23. May 15 12:30:01.577136 containerd[1716]: time="2025-05-15T12:30:01.577023012Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e2dc574ef2016e1c9f3cd9704b2529564c6fff5b1e2a976899f9cd0e318c6bd4\" id:\"79971c7bb2730ae3bf5bc2bafe0a2075ffadb8c3aaa13efa890de89aaa8a0119\" pid:5843 exit_status:1 exited_at:{seconds:1747312201 nanos:576765492}" May 15 12:30:02.040101 systemd[1]: Started sshd@21-10.200.8.32:22-10.200.16.10:32912.service - OpenSSH per-connection server daemon (10.200.16.10:32912). May 15 12:30:02.673888 sshd[5855]: Accepted publickey for core from 10.200.16.10 port 32912 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk May 15 12:30:02.674857 sshd-session[5855]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 12:30:02.678612 systemd-logind[1703]: New session 24 of user core. May 15 12:30:02.683020 systemd[1]: Started session-24.scope - Session 24 of User core. May 15 12:30:03.184063 sshd[5857]: Connection closed by 10.200.16.10 port 32912 May 15 12:30:03.184452 sshd-session[5855]: pam_unix(sshd:session): session closed for user core May 15 12:30:03.187191 systemd[1]: sshd@21-10.200.8.32:22-10.200.16.10:32912.service: Deactivated successfully. May 15 12:30:03.188721 systemd[1]: session-24.scope: Deactivated successfully. May 15 12:30:03.189416 systemd-logind[1703]: Session 24 logged out. Waiting for processes to exit. 
May 15 12:30:03.190774 systemd-logind[1703]: Removed session 24.
May 15 12:30:08.300082 systemd[1]: Started sshd@22-10.200.8.32:22-10.200.16.10:32926.service - OpenSSH per-connection server daemon (10.200.16.10:32926).
May 15 12:30:08.938362 sshd[5869]: Accepted publickey for core from 10.200.16.10 port 32926 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:30:08.939411 sshd-session[5869]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:30:08.943309 systemd-logind[1703]: New session 25 of user core.
May 15 12:30:08.953041 systemd[1]: Started session-25.scope - Session 25 of User core.
May 15 12:30:09.446844 sshd[5871]: Connection closed by 10.200.16.10 port 32926
May 15 12:30:09.447240 sshd-session[5869]: pam_unix(sshd:session): session closed for user core
May 15 12:30:09.450017 systemd[1]: sshd@22-10.200.8.32:22-10.200.16.10:32926.service: Deactivated successfully.
May 15 12:30:09.451620 systemd[1]: session-25.scope: Deactivated successfully.
May 15 12:30:09.452305 systemd-logind[1703]: Session 25 logged out. Waiting for processes to exit.
May 15 12:30:09.453430 systemd-logind[1703]: Removed session 25.
May 15 12:30:14.560645 systemd[1]: Started sshd@23-10.200.8.32:22-10.200.16.10:55760.service - OpenSSH per-connection server daemon (10.200.16.10:55760).
May 15 12:30:15.193296 sshd[5883]: Accepted publickey for core from 10.200.16.10 port 55760 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:30:15.194253 sshd-session[5883]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:30:15.198418 systemd-logind[1703]: New session 26 of user core.
May 15 12:30:15.206098 systemd[1]: Started session-26.scope - Session 26 of User core.
May 15 12:30:15.686079 sshd[5885]: Connection closed by 10.200.16.10 port 55760
May 15 12:30:15.686484 sshd-session[5883]: pam_unix(sshd:session): session closed for user core
May 15 12:30:15.688702 systemd[1]: sshd@23-10.200.8.32:22-10.200.16.10:55760.service: Deactivated successfully.
May 15 12:30:15.690277 systemd[1]: session-26.scope: Deactivated successfully.
May 15 12:30:15.691425 systemd-logind[1703]: Session 26 logged out. Waiting for processes to exit.
May 15 12:30:15.692470 systemd-logind[1703]: Removed session 26.
May 15 12:30:20.801711 systemd[1]: Started sshd@24-10.200.8.32:22-10.200.16.10:44172.service - OpenSSH per-connection server daemon (10.200.16.10:44172).
May 15 12:30:21.433660 sshd[5897]: Accepted publickey for core from 10.200.16.10 port 44172 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:30:21.434868 sshd-session[5897]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:30:21.438908 systemd-logind[1703]: New session 27 of user core.
May 15 12:30:21.444043 systemd[1]: Started session-27.scope - Session 27 of User core.
May 15 12:30:21.924194 sshd[5899]: Connection closed by 10.200.16.10 port 44172
May 15 12:30:21.924459 sshd-session[5897]: pam_unix(sshd:session): session closed for user core
May 15 12:30:21.926528 systemd[1]: sshd@24-10.200.8.32:22-10.200.16.10:44172.service: Deactivated successfully.
May 15 12:30:21.928086 systemd[1]: session-27.scope: Deactivated successfully.
May 15 12:30:21.929362 systemd-logind[1703]: Session 27 logged out. Waiting for processes to exit.
May 15 12:30:21.930572 systemd-logind[1703]: Removed session 27.
May 15 12:30:22.255751 containerd[1716]: time="2025-05-15T12:30:22.255701116Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8675fde546ffc70ad2100e0c07c53b23e5dd18b8e8ff4df5333b9e5a3f78821a\" id:\"c6470c0d0f5a0ccb6a8cd5d990f34ac3e9dc91fdf2de324ee8a7233fdaf5fee0\" pid:5922 exited_at:{seconds:1747312222 nanos:255475356}"
May 15 12:30:27.046197 systemd[1]: Started sshd@25-10.200.8.32:22-10.200.16.10:44184.service - OpenSSH per-connection server daemon (10.200.16.10:44184).
May 15 12:30:27.684882 sshd[5934]: Accepted publickey for core from 10.200.16.10 port 44184 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:30:27.685980 sshd-session[5934]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:30:27.690099 systemd-logind[1703]: New session 28 of user core.
May 15 12:30:27.695059 systemd[1]: Started session-28.scope - Session 28 of User core.
May 15 12:30:28.176853 sshd[5936]: Connection closed by 10.200.16.10 port 44184
May 15 12:30:28.177258 sshd-session[5934]: pam_unix(sshd:session): session closed for user core
May 15 12:30:28.179393 systemd[1]: sshd@25-10.200.8.32:22-10.200.16.10:44184.service: Deactivated successfully.
May 15 12:30:28.181029 systemd[1]: session-28.scope: Deactivated successfully.
May 15 12:30:28.182178 systemd-logind[1703]: Session 28 logged out. Waiting for processes to exit.
May 15 12:30:28.183328 systemd-logind[1703]: Removed session 28.
May 15 12:30:28.288265 systemd[1]: Started sshd@26-10.200.8.32:22-10.200.16.10:44194.service - OpenSSH per-connection server daemon (10.200.16.10:44194).
May 15 12:30:28.926331 sshd[5948]: Accepted publickey for core from 10.200.16.10 port 44194 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:30:28.927244 sshd-session[5948]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:30:28.930963 systemd-logind[1703]: New session 29 of user core.
May 15 12:30:28.940023 systemd[1]: Started session-29.scope - Session 29 of User core.
May 15 12:30:29.499832 sshd[5950]: Connection closed by 10.200.16.10 port 44194
May 15 12:30:29.500344 sshd-session[5948]: pam_unix(sshd:session): session closed for user core
May 15 12:30:29.502867 systemd[1]: sshd@26-10.200.8.32:22-10.200.16.10:44194.service: Deactivated successfully.
May 15 12:30:29.504419 systemd[1]: session-29.scope: Deactivated successfully.
May 15 12:30:29.505112 systemd-logind[1703]: Session 29 logged out. Waiting for processes to exit.
May 15 12:30:29.506120 systemd-logind[1703]: Removed session 29.
May 15 12:30:29.612362 systemd[1]: Started sshd@27-10.200.8.32:22-10.200.16.10:35466.service - OpenSSH per-connection server daemon (10.200.16.10:35466).
May 15 12:30:30.244389 sshd[5960]: Accepted publickey for core from 10.200.16.10 port 35466 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:30:30.245397 sshd-session[5960]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:30:30.249425 systemd-logind[1703]: New session 30 of user core.
May 15 12:30:30.255073 systemd[1]: Started session-30.scope - Session 30 of User core.
May 15 12:30:31.605535 containerd[1716]: time="2025-05-15T12:30:31.605488980Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e2dc574ef2016e1c9f3cd9704b2529564c6fff5b1e2a976899f9cd0e318c6bd4\" id:\"8a2bd47bfca192bc3c6fd8a5278831a7ecd93d5006ba6ca0bd4ca547df8a2931\" pid:5983 exited_at:{seconds:1747312231 nanos:605132693}"
May 15 12:30:32.312043 sshd[5962]: Connection closed by 10.200.16.10 port 35466
May 15 12:30:32.312299 sshd-session[5960]: pam_unix(sshd:session): session closed for user core
May 15 12:30:32.315031 systemd[1]: sshd@27-10.200.8.32:22-10.200.16.10:35466.service: Deactivated successfully.
May 15 12:30:32.316686 systemd[1]: session-30.scope: Deactivated successfully.
May 15 12:30:32.316898 systemd[1]: session-30.scope: Consumed 369ms CPU time, 68.9M memory peak.
May 15 12:30:32.318436 systemd-logind[1703]: Session 30 logged out. Waiting for processes to exit.
May 15 12:30:32.319864 systemd-logind[1703]: Removed session 30.
May 15 12:30:32.428437 systemd[1]: Started sshd@28-10.200.8.32:22-10.200.16.10:35468.service - OpenSSH per-connection server daemon (10.200.16.10:35468).
May 15 12:30:33.062536 sshd[6002]: Accepted publickey for core from 10.200.16.10 port 35468 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:30:33.063519 sshd-session[6002]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:30:33.066969 systemd-logind[1703]: New session 31 of user core.
May 15 12:30:33.070030 systemd[1]: Started session-31.scope - Session 31 of User core.
May 15 12:30:33.641005 sshd[6004]: Connection closed by 10.200.16.10 port 35468
May 15 12:30:33.641373 sshd-session[6002]: pam_unix(sshd:session): session closed for user core
May 15 12:30:33.643980 systemd[1]: sshd@28-10.200.8.32:22-10.200.16.10:35468.service: Deactivated successfully.
May 15 12:30:33.645580 systemd[1]: session-31.scope: Deactivated successfully.
May 15 12:30:33.646312 systemd-logind[1703]: Session 31 logged out. Waiting for processes to exit.
May 15 12:30:33.647423 systemd-logind[1703]: Removed session 31.
May 15 12:30:33.752370 systemd[1]: Started sshd@29-10.200.8.32:22-10.200.16.10:35482.service - OpenSSH per-connection server daemon (10.200.16.10:35482).
May 15 12:30:34.385398 sshd[6014]: Accepted publickey for core from 10.200.16.10 port 35482 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:30:34.386847 sshd-session[6014]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:30:34.391222 systemd-logind[1703]: New session 32 of user core.
May 15 12:30:34.399174 systemd[1]: Started session-32.scope - Session 32 of User core.
May 15 12:30:34.892699 sshd[6016]: Connection closed by 10.200.16.10 port 35482
May 15 12:30:34.893139 sshd-session[6014]: pam_unix(sshd:session): session closed for user core
May 15 12:30:34.895696 systemd[1]: sshd@29-10.200.8.32:22-10.200.16.10:35482.service: Deactivated successfully.
May 15 12:30:34.897251 systemd[1]: session-32.scope: Deactivated successfully.
May 15 12:30:34.897935 systemd-logind[1703]: Session 32 logged out. Waiting for processes to exit.
May 15 12:30:34.899094 systemd-logind[1703]: Removed session 32.
May 15 12:30:40.010777 systemd[1]: Started sshd@30-10.200.8.32:22-10.200.16.10:38894.service - OpenSSH per-connection server daemon (10.200.16.10:38894).
May 15 12:30:40.655067 sshd[6028]: Accepted publickey for core from 10.200.16.10 port 38894 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:30:40.656262 sshd-session[6028]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:30:40.660629 systemd-logind[1703]: New session 33 of user core.
May 15 12:30:40.667034 systemd[1]: Started session-33.scope - Session 33 of User core.
May 15 12:30:41.163845 sshd[6030]: Connection closed by 10.200.16.10 port 38894
May 15 12:30:41.164271 sshd-session[6028]: pam_unix(sshd:session): session closed for user core
May 15 12:30:41.166890 systemd[1]: sshd@30-10.200.8.32:22-10.200.16.10:38894.service: Deactivated successfully.
May 15 12:30:41.168572 systemd[1]: session-33.scope: Deactivated successfully.
May 15 12:30:41.169333 systemd-logind[1703]: Session 33 logged out. Waiting for processes to exit.
May 15 12:30:41.170387 systemd-logind[1703]: Removed session 33.
May 15 12:30:46.277727 systemd[1]: Started sshd@31-10.200.8.32:22-10.200.16.10:38910.service - OpenSSH per-connection server daemon (10.200.16.10:38910).
May 15 12:30:46.916441 sshd[6042]: Accepted publickey for core from 10.200.16.10 port 38910 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:30:46.917479 sshd-session[6042]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:30:46.921974 systemd-logind[1703]: New session 34 of user core.
May 15 12:30:46.926091 systemd[1]: Started session-34.scope - Session 34 of User core.
May 15 12:30:47.409025 sshd[6044]: Connection closed by 10.200.16.10 port 38910
May 15 12:30:47.409450 sshd-session[6042]: pam_unix(sshd:session): session closed for user core
May 15 12:30:47.412134 systemd[1]: sshd@31-10.200.8.32:22-10.200.16.10:38910.service: Deactivated successfully.
May 15 12:30:47.413814 systemd[1]: session-34.scope: Deactivated successfully.
May 15 12:30:47.414469 systemd-logind[1703]: Session 34 logged out. Waiting for processes to exit.
May 15 12:30:47.415720 systemd-logind[1703]: Removed session 34.
May 15 12:30:52.253900 containerd[1716]: time="2025-05-15T12:30:52.253849898Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8675fde546ffc70ad2100e0c07c53b23e5dd18b8e8ff4df5333b9e5a3f78821a\" id:\"bba03f8e1b1dfa2d23b74eb07229f3bb6ade6f1837b6d9ca214268bd72f5cf1f\" pid:6067 exited_at:{seconds:1747312252 nanos:253375856}"
May 15 12:30:52.525579 systemd[1]: Started sshd@32-10.200.8.32:22-10.200.16.10:43052.service - OpenSSH per-connection server daemon (10.200.16.10:43052).
May 15 12:30:53.161513 sshd[6078]: Accepted publickey for core from 10.200.16.10 port 43052 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:30:53.162677 sshd-session[6078]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:30:53.166816 systemd-logind[1703]: New session 35 of user core.
May 15 12:30:53.173054 systemd[1]: Started session-35.scope - Session 35 of User core.
May 15 12:30:53.655095 sshd[6080]: Connection closed by 10.200.16.10 port 43052
May 15 12:30:53.655506 sshd-session[6078]: pam_unix(sshd:session): session closed for user core
May 15 12:30:53.658277 systemd[1]: sshd@32-10.200.8.32:22-10.200.16.10:43052.service: Deactivated successfully.
May 15 12:30:53.659980 systemd[1]: session-35.scope: Deactivated successfully.
May 15 12:30:53.660597 systemd-logind[1703]: Session 35 logged out. Waiting for processes to exit.
May 15 12:30:53.661716 systemd-logind[1703]: Removed session 35.
May 15 12:30:54.777890 containerd[1716]: time="2025-05-15T12:30:54.777855814Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8675fde546ffc70ad2100e0c07c53b23e5dd18b8e8ff4df5333b9e5a3f78821a\" id:\"cc42cc0751a510c2c36d6bb55ec223e230ea117cfd1957c9bb715fec79035c08\" pid:6105 exited_at:{seconds:1747312254 nanos:777594416}"
May 15 12:30:58.769796 systemd[1]: Started sshd@33-10.200.8.32:22-10.200.16.10:53262.service - OpenSSH per-connection server daemon (10.200.16.10:53262).
May 15 12:30:59.403698 sshd[6124]: Accepted publickey for core from 10.200.16.10 port 53262 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:30:59.404721 sshd-session[6124]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:30:59.408700 systemd-logind[1703]: New session 36 of user core.
May 15 12:30:59.413086 systemd[1]: Started session-36.scope - Session 36 of User core.
May 15 12:30:59.892517 sshd[6126]: Connection closed by 10.200.16.10 port 53262
May 15 12:30:59.892908 sshd-session[6124]: pam_unix(sshd:session): session closed for user core
May 15 12:30:59.895491 systemd[1]: sshd@33-10.200.8.32:22-10.200.16.10:53262.service: Deactivated successfully.
May 15 12:30:59.897184 systemd[1]: session-36.scope: Deactivated successfully.
May 15 12:30:59.897840 systemd-logind[1703]: Session 36 logged out. Waiting for processes to exit.
May 15 12:30:59.899122 systemd-logind[1703]: Removed session 36.
May 15 12:31:01.579455 containerd[1716]: time="2025-05-15T12:31:01.579358114Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e2dc574ef2016e1c9f3cd9704b2529564c6fff5b1e2a976899f9cd0e318c6bd4\" id:\"892990fb7d5bbfa2c9804bb025745c7a09e6ad2e97cfe503386e0a8e4320ce00\" pid:6149 exited_at:{seconds:1747312261 nanos:579112961}"
May 15 12:31:05.013042 systemd[1]: Started sshd@34-10.200.8.32:22-10.200.16.10:53276.service - OpenSSH per-connection server daemon (10.200.16.10:53276).
May 15 12:31:05.647072 sshd[6162]: Accepted publickey for core from 10.200.16.10 port 53276 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:31:05.648107 sshd-session[6162]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:31:05.652635 systemd-logind[1703]: New session 37 of user core.
May 15 12:31:05.660039 systemd[1]: Started session-37.scope - Session 37 of User core.
May 15 12:31:06.137887 sshd[6164]: Connection closed by 10.200.16.10 port 53276
May 15 12:31:06.138287 sshd-session[6162]: pam_unix(sshd:session): session closed for user core
May 15 12:31:06.141141 systemd[1]: sshd@34-10.200.8.32:22-10.200.16.10:53276.service: Deactivated successfully.
May 15 12:31:06.142745 systemd[1]: session-37.scope: Deactivated successfully.
May 15 12:31:06.143339 systemd-logind[1703]: Session 37 logged out. Waiting for processes to exit.
May 15 12:31:06.144381 systemd-logind[1703]: Removed session 37.
May 15 12:31:11.263123 systemd[1]: Started sshd@35-10.200.8.32:22-10.200.16.10:47578.service - OpenSSH per-connection server daemon (10.200.16.10:47578).
May 15 12:31:12.029692 sshd[6178]: Accepted publickey for core from 10.200.16.10 port 47578 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:31:12.030775 sshd-session[6178]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:31:12.034831 systemd-logind[1703]: New session 38 of user core.
May 15 12:31:12.044045 systemd[1]: Started session-38.scope - Session 38 of User core.
May 15 12:31:12.866015 sshd[6183]: Connection closed by 10.200.16.10 port 47578
May 15 12:31:12.866530 sshd-session[6178]: pam_unix(sshd:session): session closed for user core
May 15 12:31:12.869066 systemd[1]: sshd@35-10.200.8.32:22-10.200.16.10:47578.service: Deactivated successfully.
May 15 12:31:12.870859 systemd[1]: session-38.scope: Deactivated successfully.
May 15 12:31:12.872258 systemd-logind[1703]: Session 38 logged out. Waiting for processes to exit.
May 15 12:31:12.873288 systemd-logind[1703]: Removed session 38.
May 15 12:31:17.990966 systemd[1]: Started sshd@36-10.200.8.32:22-10.200.16.10:47586.service - OpenSSH per-connection server daemon (10.200.16.10:47586).
May 15 12:31:18.690071 sshd[6195]: Accepted publickey for core from 10.200.16.10 port 47586 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:31:18.691051 sshd-session[6195]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:31:18.694963 systemd-logind[1703]: New session 39 of user core.
May 15 12:31:18.700032 systemd[1]: Started session-39.scope - Session 39 of User core.
May 15 12:31:19.238144 sshd[6197]: Connection closed by 10.200.16.10 port 47586
May 15 12:31:19.238585 sshd-session[6195]: pam_unix(sshd:session): session closed for user core
May 15 12:31:19.240754 systemd[1]: sshd@36-10.200.8.32:22-10.200.16.10:47586.service: Deactivated successfully.
May 15 12:31:19.242978 systemd-logind[1703]: Session 39 logged out. Waiting for processes to exit.
May 15 12:31:19.243453 systemd[1]: session-39.scope: Deactivated successfully.
May 15 12:31:19.244746 systemd-logind[1703]: Removed session 39.
May 15 12:31:22.253718 containerd[1716]: time="2025-05-15T12:31:22.253657383Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8675fde546ffc70ad2100e0c07c53b23e5dd18b8e8ff4df5333b9e5a3f78821a\" id:\"3f6e9a7c15dcebc246c278b681b4fd5cc9e8a029b0b36a9036d8d237ab30e6ed\" pid:6220 exited_at:{seconds:1747312282 nanos:253449288}"
May 15 12:31:24.353762 systemd[1]: Started sshd@37-10.200.8.32:22-10.200.16.10:41754.service - OpenSSH per-connection server daemon (10.200.16.10:41754).
May 15 12:31:24.987117 sshd[6230]: Accepted publickey for core from 10.200.16.10 port 41754 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:31:24.988361 sshd-session[6230]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:31:24.992713 systemd-logind[1703]: New session 40 of user core.
May 15 12:31:24.997069 systemd[1]: Started session-40.scope - Session 40 of User core.
May 15 12:31:25.493250 sshd[6232]: Connection closed by 10.200.16.10 port 41754
May 15 12:31:25.493650 sshd-session[6230]: pam_unix(sshd:session): session closed for user core
May 15 12:31:25.496268 systemd[1]: sshd@37-10.200.8.32:22-10.200.16.10:41754.service: Deactivated successfully.
May 15 12:31:25.497849 systemd[1]: session-40.scope: Deactivated successfully.
May 15 12:31:25.498513 systemd-logind[1703]: Session 40 logged out. Waiting for processes to exit.
May 15 12:31:25.499565 systemd-logind[1703]: Removed session 40.
May 15 12:31:30.612966 systemd[1]: Started sshd@38-10.200.8.32:22-10.200.16.10:37144.service - OpenSSH per-connection server daemon (10.200.16.10:37144).
May 15 12:31:31.245856 sshd[6264]: Accepted publickey for core from 10.200.16.10 port 37144 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:31:31.246889 sshd-session[6264]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:31:31.250946 systemd-logind[1703]: New session 41 of user core.
May 15 12:31:31.255033 systemd[1]: Started session-41.scope - Session 41 of User core.
May 15 12:31:31.577348 containerd[1716]: time="2025-05-15T12:31:31.577273214Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e2dc574ef2016e1c9f3cd9704b2529564c6fff5b1e2a976899f9cd0e318c6bd4\" id:\"acfaf52e5f63ede40321c4ecefc8721f6d564276bc32ca080f6411cc5aa2e4a1\" pid:6279 exited_at:{seconds:1747312291 nanos:577003087}"
May 15 12:31:31.733377 sshd[6266]: Connection closed by 10.200.16.10 port 37144
May 15 12:31:31.733728 sshd-session[6264]: pam_unix(sshd:session): session closed for user core
May 15 12:31:31.736249 systemd[1]: sshd@38-10.200.8.32:22-10.200.16.10:37144.service: Deactivated successfully.
May 15 12:31:31.737836 systemd[1]: session-41.scope: Deactivated successfully.
May 15 12:31:31.738493 systemd-logind[1703]: Session 41 logged out. Waiting for processes to exit.
May 15 12:31:31.739569 systemd-logind[1703]: Removed session 41.
May 15 12:31:36.845688 systemd[1]: Started sshd@39-10.200.8.32:22-10.200.16.10:37160.service - OpenSSH per-connection server daemon (10.200.16.10:37160).
May 15 12:31:37.490507 sshd[6302]: Accepted publickey for core from 10.200.16.10 port 37160 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:31:37.491528 sshd-session[6302]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:31:37.495665 systemd-logind[1703]: New session 42 of user core.
May 15 12:31:37.501039 systemd[1]: Started session-42.scope - Session 42 of User core.
May 15 12:31:37.978849 sshd[6304]: Connection closed by 10.200.16.10 port 37160
May 15 12:31:37.979315 sshd-session[6302]: pam_unix(sshd:session): session closed for user core
May 15 12:31:37.982114 systemd[1]: sshd@39-10.200.8.32:22-10.200.16.10:37160.service: Deactivated successfully.
May 15 12:31:37.983793 systemd[1]: session-42.scope: Deactivated successfully.
May 15 12:31:37.984446 systemd-logind[1703]: Session 42 logged out. Waiting for processes to exit.
May 15 12:31:37.985489 systemd-logind[1703]: Removed session 42.
May 15 12:31:41.836175 containerd[1716]: time="2025-05-15T12:31:41.836087258Z" level=warning msg="container event discarded" container=b4c23316b8ee16cf38b2d86ebb3ab7cf8d3f3cb96ad9bc939d270d2abaf82ce0 type=CONTAINER_CREATED_EVENT
May 15 12:31:41.847362 containerd[1716]: time="2025-05-15T12:31:41.847334448Z" level=warning msg="container event discarded" container=b4c23316b8ee16cf38b2d86ebb3ab7cf8d3f3cb96ad9bc939d270d2abaf82ce0 type=CONTAINER_STARTED_EVENT
May 15 12:31:41.940925 containerd[1716]: time="2025-05-15T12:31:41.940847835Z" level=warning msg="container event discarded" container=1a28b751e64105ca0b16a261c72ab0e22ce5010241cd0d838b719d831bc421d8 type=CONTAINER_CREATED_EVENT
May 15 12:31:41.940925 containerd[1716]: time="2025-05-15T12:31:41.940887277Z" level=warning msg="container event discarded" container=1a28b751e64105ca0b16a261c72ab0e22ce5010241cd0d838b719d831bc421d8 type=CONTAINER_STARTED_EVENT
May 15 12:31:42.145385 containerd[1716]: time="2025-05-15T12:31:42.145258363Z" level=warning msg="container event discarded" container=c10b16b2465d5d32f4a3b462ef12bf32d7cab6d56ab5e2f6e26493701474de70 type=CONTAINER_CREATED_EVENT
May 15 12:31:42.145385 containerd[1716]: time="2025-05-15T12:31:42.145294115Z" level=warning msg="container event discarded" container=c10b16b2465d5d32f4a3b462ef12bf32d7cab6d56ab5e2f6e26493701474de70 type=CONTAINER_STARTED_EVENT
May 15 12:31:42.699019 containerd[1716]: time="2025-05-15T12:31:42.698975002Z" level=warning msg="container event discarded" container=683f5b08fe830711a77f3f0a0996890092c837715f1f6734be88e80710324e10 type=CONTAINER_CREATED_EVENT
May 15 12:31:42.841283 containerd[1716]: time="2025-05-15T12:31:42.841249425Z" level=warning msg="container event discarded" container=683f5b08fe830711a77f3f0a0996890092c837715f1f6734be88e80710324e10 type=CONTAINER_STARTED_EVENT
May 15 12:31:42.841283 containerd[1716]: time="2025-05-15T12:31:42.841276814Z" level=warning msg="container event discarded" container=44117c3898153288a6404a45488113672b3118b283aaea260cd3f9e096248087 type=CONTAINER_CREATED_EVENT
May 15 12:31:42.887599 containerd[1716]: time="2025-05-15T12:31:42.887543260Z" level=warning msg="container event discarded" container=8f3722b5db72c080a787ea60944f2d16c34987af1c60e2959208ad52b667fc05 type=CONTAINER_CREATED_EVENT
May 15 12:31:42.964805 containerd[1716]: time="2025-05-15T12:31:42.964773675Z" level=warning msg="container event discarded" container=44117c3898153288a6404a45488113672b3118b283aaea260cd3f9e096248087 type=CONTAINER_STARTED_EVENT
May 15 12:31:43.000009 containerd[1716]: time="2025-05-15T12:31:42.999965263Z" level=warning msg="container event discarded" container=8f3722b5db72c080a787ea60944f2d16c34987af1c60e2959208ad52b667fc05 type=CONTAINER_STARTED_EVENT
May 15 12:31:43.095944 systemd[1]: Started sshd@40-10.200.8.32:22-10.200.16.10:49010.service - OpenSSH per-connection server daemon (10.200.16.10:49010).
May 15 12:31:43.733100 sshd[6316]: Accepted publickey for core from 10.200.16.10 port 49010 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:31:43.734061 sshd-session[6316]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:31:43.738188 systemd-logind[1703]: New session 43 of user core.
May 15 12:31:43.744056 systemd[1]: Started session-43.scope - Session 43 of User core.
May 15 12:31:44.230440 sshd[6318]: Connection closed by 10.200.16.10 port 49010
May 15 12:31:44.230836 sshd-session[6316]: pam_unix(sshd:session): session closed for user core
May 15 12:31:44.233015 systemd[1]: sshd@40-10.200.8.32:22-10.200.16.10:49010.service: Deactivated successfully.
May 15 12:31:44.234554 systemd[1]: session-43.scope: Deactivated successfully.
May 15 12:31:44.235722 systemd-logind[1703]: Session 43 logged out. Waiting for processes to exit.
May 15 12:31:44.237213 systemd-logind[1703]: Removed session 43.
May 15 12:31:49.344084 systemd[1]: Started sshd@41-10.200.8.32:22-10.200.16.10:36852.service - OpenSSH per-connection server daemon (10.200.16.10:36852).
May 15 12:31:49.978519 sshd[6330]: Accepted publickey for core from 10.200.16.10 port 36852 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:31:49.979699 sshd-session[6330]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:31:49.983706 systemd-logind[1703]: New session 44 of user core.
May 15 12:31:49.988063 systemd[1]: Started session-44.scope - Session 44 of User core.
May 15 12:31:50.487762 sshd[6332]: Connection closed by 10.200.16.10 port 36852
May 15 12:31:50.488188 sshd-session[6330]: pam_unix(sshd:session): session closed for user core
May 15 12:31:50.491149 systemd[1]: sshd@41-10.200.8.32:22-10.200.16.10:36852.service: Deactivated successfully.
May 15 12:31:50.492829 systemd[1]: session-44.scope: Deactivated successfully.
May 15 12:31:50.493470 systemd-logind[1703]: Session 44 logged out. Waiting for processes to exit.
May 15 12:31:50.494605 systemd-logind[1703]: Removed session 44.
May 15 12:31:52.255944 containerd[1716]: time="2025-05-15T12:31:52.255892288Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8675fde546ffc70ad2100e0c07c53b23e5dd18b8e8ff4df5333b9e5a3f78821a\" id:\"8119fed5e144d346dd05877849fbdf3b4c6aa686730605541ad3785c2807d4d5\" pid:6355 exited_at:{seconds:1747312312 nanos:255671057}"
May 15 12:31:54.785566 containerd[1716]: time="2025-05-15T12:31:54.785509667Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8675fde546ffc70ad2100e0c07c53b23e5dd18b8e8ff4df5333b9e5a3f78821a\" id:\"bb52a26c0e95bb42c3b21c332a59aa2556dcfa423d35f7621635ad8ce11cf2f2\" pid:6379 exited_at:{seconds:1747312314 nanos:785304197}"
May 15 12:31:55.025440 containerd[1716]: time="2025-05-15T12:31:55.025365808Z" level=warning msg="container event discarded" container=6b659de022fe05caedb9b3a29aa0d7bb432a7254d9b20feca62734c324e213fa type=CONTAINER_CREATED_EVENT
May 15 12:31:55.025440 containerd[1716]: time="2025-05-15T12:31:55.025431918Z" level=warning msg="container event discarded" container=6b659de022fe05caedb9b3a29aa0d7bb432a7254d9b20feca62734c324e213fa type=CONTAINER_STARTED_EVENT
May 15 12:31:55.298665 containerd[1716]: time="2025-05-15T12:31:55.298616332Z" level=warning msg="container event discarded" container=0c97717a4e0b7596c46848919a7fde570505171a2e6bac8fc380ea107d8700bf type=CONTAINER_CREATED_EVENT
May 15 12:31:55.501056 containerd[1716]: time="2025-05-15T12:31:55.501008940Z" level=warning msg="container event discarded" container=0c97717a4e0b7596c46848919a7fde570505171a2e6bac8fc380ea107d8700bf type=CONTAINER_STARTED_EVENT
May 15 12:31:55.597023 systemd[1]: Started sshd@42-10.200.8.32:22-10.200.16.10:36856.service - OpenSSH per-connection server daemon (10.200.16.10:36856).
May 15 12:31:56.202422 containerd[1716]: time="2025-05-15T12:31:56.202357473Z" level=warning msg="container event discarded" container=84a43c4d7332065689abf7dae653fc772597ea18243ccbf1b434c8d103b9888d type=CONTAINER_CREATED_EVENT
May 15 12:31:56.202422 containerd[1716]: time="2025-05-15T12:31:56.202418295Z" level=warning msg="container event discarded" container=84a43c4d7332065689abf7dae653fc772597ea18243ccbf1b434c8d103b9888d type=CONTAINER_STARTED_EVENT
May 15 12:31:56.230644 sshd[6392]: Accepted publickey for core from 10.200.16.10 port 36856 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:31:56.231611 sshd-session[6392]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:31:56.235585 systemd-logind[1703]: New session 45 of user core.
May 15 12:31:56.240049 systemd[1]: Started session-45.scope - Session 45 of User core.
May 15 12:31:56.719654 sshd[6394]: Connection closed by 10.200.16.10 port 36856
May 15 12:31:56.720130 sshd-session[6392]: pam_unix(sshd:session): session closed for user core
May 15 12:31:56.722778 systemd[1]: sshd@42-10.200.8.32:22-10.200.16.10:36856.service: Deactivated successfully.
May 15 12:31:56.724320 systemd[1]: session-45.scope: Deactivated successfully.
May 15 12:31:56.725032 systemd-logind[1703]: Session 45 logged out. Waiting for processes to exit.
May 15 12:31:56.726091 systemd-logind[1703]: Removed session 45.
May 15 12:32:00.536779 containerd[1716]: time="2025-05-15T12:32:00.536722465Z" level=warning msg="container event discarded" container=b97679f1b4575a35fc97237ba19277662b37d83ce1cb64234841e3f908013d71 type=CONTAINER_CREATED_EVENT
May 15 12:32:00.579948 containerd[1716]: time="2025-05-15T12:32:00.579876794Z" level=warning msg="container event discarded" container=b97679f1b4575a35fc97237ba19277662b37d83ce1cb64234841e3f908013d71 type=CONTAINER_STARTED_EVENT
May 15 12:32:01.580740 containerd[1716]: time="2025-05-15T12:32:01.580703661Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e2dc574ef2016e1c9f3cd9704b2529564c6fff5b1e2a976899f9cd0e318c6bd4\" id:\"578d8589f1499803cef77bcd63c20f236d209d13d0cd394c41715904117eb496\" pid:6417 exited_at:{seconds:1747312321 nanos:580498093}"
May 15 12:32:01.836803 systemd[1]: Started sshd@43-10.200.8.32:22-10.200.16.10:59006.service - OpenSSH per-connection server daemon (10.200.16.10:59006).
May 15 12:32:02.475022 sshd[6431]: Accepted publickey for core from 10.200.16.10 port 59006 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:32:02.476003 sshd-session[6431]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:32:02.480427 systemd-logind[1703]: New session 46 of user core.
May 15 12:32:02.485052 systemd[1]: Started session-46.scope - Session 46 of User core.
May 15 12:32:02.978610 sshd[6434]: Connection closed by 10.200.16.10 port 59006
May 15 12:32:02.979076 sshd-session[6431]: pam_unix(sshd:session): session closed for user core
May 15 12:32:02.981790 systemd[1]: sshd@43-10.200.8.32:22-10.200.16.10:59006.service: Deactivated successfully.
May 15 12:32:02.983316 systemd[1]: session-46.scope: Deactivated successfully.
May 15 12:32:02.984012 systemd-logind[1703]: Session 46 logged out. Waiting for processes to exit.
May 15 12:32:02.985002 systemd-logind[1703]: Removed session 46.
May 15 12:32:04.342501 containerd[1716]: time="2025-05-15T12:32:04.342435123Z" level=warning msg="container event discarded" container=bd2bd3d7fd86b56b0c5b28003df7185004844e172f189b1021fe5dd1a563aa49 type=CONTAINER_CREATED_EVENT
May 15 12:32:04.342825 containerd[1716]: time="2025-05-15T12:32:04.342558943Z" level=warning msg="container event discarded" container=bd2bd3d7fd86b56b0c5b28003df7185004844e172f189b1021fe5dd1a563aa49 type=CONTAINER_STARTED_EVENT
May 15 12:32:04.545039 containerd[1716]: time="2025-05-15T12:32:04.544990055Z" level=warning msg="container event discarded" container=8d00a366dca96e12befd8c725eae1308151badbed1ca5ac221c0142e34fc1d2d type=CONTAINER_CREATED_EVENT
May 15 12:32:04.545039 containerd[1716]: time="2025-05-15T12:32:04.545029215Z" level=warning msg="container event discarded" container=8d00a366dca96e12befd8c725eae1308151badbed1ca5ac221c0142e34fc1d2d type=CONTAINER_STARTED_EVENT