May 15 12:23:38.931672 kernel: Linux version 6.12.20-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Thu May 15 10:42:41 -00 2025 May 15 12:23:38.931699 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlyprintk=ttyS0,115200 flatcar.first_boot=detected flatcar.oem.id=azure flatcar.autologin verity.usrhash=48287e633374b880fa618bd42bee102ae77c50831859c6cedd6ca9e1aec3dd5c May 15 12:23:38.931709 kernel: BIOS-provided physical RAM map: May 15 12:23:38.931716 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable May 15 12:23:38.931723 kernel: BIOS-e820: [mem 0x00000000000c0000-0x00000000000fffff] reserved May 15 12:23:38.931730 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000003ff40fff] usable May 15 12:23:38.931739 kernel: BIOS-e820: [mem 0x000000003ff41000-0x000000003ffc4fff] reserved May 15 12:23:38.931746 kernel: BIOS-e820: [mem 0x000000003ffc5000-0x000000003ffd1fff] usable May 15 12:23:38.931753 kernel: BIOS-e820: [mem 0x000000003ffd2000-0x000000003fffafff] ACPI data May 15 12:23:38.931760 kernel: BIOS-e820: [mem 0x000000003fffb000-0x000000003fffefff] ACPI NVS May 15 12:23:38.931767 kernel: BIOS-e820: [mem 0x000000003ffff000-0x000000003fffffff] usable May 15 12:23:38.931774 kernel: BIOS-e820: [mem 0x0000000100000000-0x00000002bfffffff] usable May 15 12:23:38.931781 kernel: printk: legacy bootconsole [earlyser0] enabled May 15 12:23:38.931788 kernel: NX (Execute Disable) protection: active May 15 12:23:38.931798 kernel: APIC: Static calls initialized May 15 12:23:38.931805 kernel: efi: EFI v2.7 by Microsoft May 15 12:23:38.931813 kernel: efi: ACPI=0x3fffa000 ACPI 2.0=0x3fffa014 SMBIOS=0x3ff85000 SMBIOS 3.0=0x3ff83000 MEMATTR=0x3ebb9a98 
RNG=0x3ffd2018 May 15 12:23:38.931820 kernel: random: crng init done May 15 12:23:38.931827 kernel: secureboot: Secure boot disabled May 15 12:23:38.931834 kernel: SMBIOS 3.1.0 present. May 15 12:23:38.931842 kernel: DMI: Microsoft Corporation Virtual Machine/Virtual Machine, BIOS Hyper-V UEFI Release v4.1 11/21/2024 May 15 12:23:38.931850 kernel: DMI: Memory slots populated: 2/2 May 15 12:23:38.931858 kernel: Hypervisor detected: Microsoft Hyper-V May 15 12:23:38.931866 kernel: Hyper-V: privilege flags low 0xae7f, high 0x3b8030, hints 0x9e4e24, misc 0xe0bed7b2 May 15 12:23:38.931873 kernel: Hyper-V: Nested features: 0x3e0101 May 15 12:23:38.931881 kernel: Hyper-V: LAPIC Timer Frequency: 0x30d40 May 15 12:23:38.931888 kernel: Hyper-V: Using hypercall for remote TLB flush May 15 12:23:38.931896 kernel: clocksource: hyperv_clocksource_tsc_page: mask: 0xffffffffffffffff max_cycles: 0x24e6a1710, max_idle_ns: 440795202120 ns May 15 12:23:38.931904 kernel: clocksource: hyperv_clocksource_msr: mask: 0xffffffffffffffff max_cycles: 0x24e6a1710, max_idle_ns: 440795202120 ns May 15 12:23:38.931912 kernel: tsc: Detected 2299.999 MHz processor May 15 12:23:38.931920 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved May 15 12:23:38.931928 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable May 15 12:23:38.931936 kernel: last_pfn = 0x2c0000 max_arch_pfn = 0x10000000000 May 15 12:23:38.931945 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs May 15 12:23:38.931954 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT May 15 12:23:38.931962 kernel: e820: update [mem 0x48000000-0xffffffff] usable ==> reserved May 15 12:23:38.931969 kernel: last_pfn = 0x40000 max_arch_pfn = 0x10000000000 May 15 12:23:38.931976 kernel: Using GB pages for direct mapping May 15 12:23:38.931983 kernel: ACPI: Early table checksum verification disabled May 15 12:23:38.931990 kernel: ACPI: RSDP 0x000000003FFFA014 000024 (v02 VRTUAL) May 
15 12:23:38.932002 kernel: ACPI: XSDT 0x000000003FFF90E8 00005C (v01 VRTUAL MICROSFT 00000001 MSFT 00000001) May 15 12:23:38.932009 kernel: ACPI: FACP 0x000000003FFF8000 000114 (v06 VRTUAL MICROSFT 00000001 MSFT 00000001) May 15 12:23:38.932017 kernel: ACPI: DSDT 0x000000003FFD6000 01E11C (v02 MSFTVM DSDT01 00000001 INTL 20230628) May 15 12:23:38.932024 kernel: ACPI: FACS 0x000000003FFFE000 000040 May 15 12:23:38.932031 kernel: ACPI: OEM0 0x000000003FFF7000 000064 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001) May 15 12:23:38.932038 kernel: ACPI: SPCR 0x000000003FFF6000 000050 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001) May 15 12:23:38.932047 kernel: ACPI: WAET 0x000000003FFF5000 000028 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001) May 15 12:23:38.932053 kernel: ACPI: APIC 0x000000003FFD5000 000052 (v05 HVLITE HVLITETB 00000000 MSHV 00000000) May 15 12:23:38.932060 kernel: ACPI: SRAT 0x000000003FFD4000 0000A0 (v03 HVLITE HVLITETB 00000000 MSHV 00000000) May 15 12:23:38.932067 kernel: ACPI: BGRT 0x000000003FFD3000 000038 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001) May 15 12:23:38.932075 kernel: ACPI: Reserving FACP table memory at [mem 0x3fff8000-0x3fff8113] May 15 12:23:38.932082 kernel: ACPI: Reserving DSDT table memory at [mem 0x3ffd6000-0x3fff411b] May 15 12:23:38.932089 kernel: ACPI: Reserving FACS table memory at [mem 0x3fffe000-0x3fffe03f] May 15 12:23:38.932096 kernel: ACPI: Reserving OEM0 table memory at [mem 0x3fff7000-0x3fff7063] May 15 12:23:38.932104 kernel: ACPI: Reserving SPCR table memory at [mem 0x3fff6000-0x3fff604f] May 15 12:23:38.932113 kernel: ACPI: Reserving WAET table memory at [mem 0x3fff5000-0x3fff5027] May 15 12:23:38.932120 kernel: ACPI: Reserving APIC table memory at [mem 0x3ffd5000-0x3ffd5051] May 15 12:23:38.932127 kernel: ACPI: Reserving SRAT table memory at [mem 0x3ffd4000-0x3ffd409f] May 15 12:23:38.932134 kernel: ACPI: Reserving BGRT table memory at [mem 0x3ffd3000-0x3ffd3037] May 15 12:23:38.932142 kernel: ACPI: SRAT: Node 0 PXM 0 
[mem 0x00000000-0x3fffffff] May 15 12:23:38.932149 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x2bfffffff] May 15 12:23:38.932156 kernel: NUMA: Node 0 [mem 0x00001000-0x3fffffff] + [mem 0x100000000-0x2bfffffff] -> [mem 0x00001000-0x2bfffffff] May 15 12:23:38.932164 kernel: NODE_DATA(0) allocated [mem 0x2bfff8dc0-0x2bfffffff] May 15 12:23:38.932192 kernel: Zone ranges: May 15 12:23:38.932202 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] May 15 12:23:38.932210 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff] May 15 12:23:38.932217 kernel: Normal [mem 0x0000000100000000-0x00000002bfffffff] May 15 12:23:38.932224 kernel: Device empty May 15 12:23:38.932231 kernel: Movable zone start for each node May 15 12:23:38.932239 kernel: Early memory node ranges May 15 12:23:38.932246 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff] May 15 12:23:38.932254 kernel: node 0: [mem 0x0000000000100000-0x000000003ff40fff] May 15 12:23:38.932261 kernel: node 0: [mem 0x000000003ffc5000-0x000000003ffd1fff] May 15 12:23:38.932269 kernel: node 0: [mem 0x000000003ffff000-0x000000003fffffff] May 15 12:23:38.932277 kernel: node 0: [mem 0x0000000100000000-0x00000002bfffffff] May 15 12:23:38.932284 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x00000002bfffffff] May 15 12:23:38.932291 kernel: On node 0, zone DMA: 1 pages in unavailable ranges May 15 12:23:38.932300 kernel: On node 0, zone DMA: 96 pages in unavailable ranges May 15 12:23:38.932307 kernel: On node 0, zone DMA32: 132 pages in unavailable ranges May 15 12:23:38.932314 kernel: On node 0, zone DMA32: 45 pages in unavailable ranges May 15 12:23:38.932321 kernel: ACPI: PM-Timer IO Port: 0x408 May 15 12:23:38.932329 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 May 15 12:23:38.932337 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) May 15 12:23:38.932344 kernel: ACPI: Using ACPI (MADT) for SMP configuration information May 15 12:23:38.932352 kernel: 
ACPI: SPCR: console: uart,io,0x3f8,115200 May 15 12:23:38.932359 kernel: TSC deadline timer available May 15 12:23:38.932366 kernel: CPU topo: Max. logical packages: 1 May 15 12:23:38.932373 kernel: CPU topo: Max. logical dies: 1 May 15 12:23:38.932380 kernel: CPU topo: Max. dies per package: 1 May 15 12:23:38.932387 kernel: CPU topo: Max. threads per core: 2 May 15 12:23:38.932394 kernel: CPU topo: Num. cores per package: 1 May 15 12:23:38.932403 kernel: CPU topo: Num. threads per package: 2 May 15 12:23:38.932410 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs May 15 12:23:38.932417 kernel: [mem 0x40000000-0xffffffff] available for PCI devices May 15 12:23:38.932424 kernel: Booting paravirtualized kernel on Hyper-V May 15 12:23:38.932432 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns May 15 12:23:38.932439 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1 May 15 12:23:38.932446 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576 May 15 12:23:38.932453 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152 May 15 12:23:38.932461 kernel: pcpu-alloc: [0] 0 1 May 15 12:23:38.932469 kernel: Hyper-V: PV spinlocks enabled May 15 12:23:38.932477 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear) May 15 12:23:38.932485 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlyprintk=ttyS0,115200 flatcar.first_boot=detected flatcar.oem.id=azure flatcar.autologin verity.usrhash=48287e633374b880fa618bd42bee102ae77c50831859c6cedd6ca9e1aec3dd5c May 15 12:23:38.932493 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. 
May 15 12:23:38.932500 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear) May 15 12:23:38.932507 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) May 15 12:23:38.932514 kernel: Fallback order for Node 0: 0 May 15 12:23:38.932521 kernel: Built 1 zonelists, mobility grouping on. Total pages: 2096878 May 15 12:23:38.932530 kernel: Policy zone: Normal May 15 12:23:38.932537 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off May 15 12:23:38.932544 kernel: software IO TLB: area num 2. May 15 12:23:38.932551 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 May 15 12:23:38.932558 kernel: ftrace: allocating 40065 entries in 157 pages May 15 12:23:38.932566 kernel: ftrace: allocated 157 pages with 5 groups May 15 12:23:38.932572 kernel: Dynamic Preempt: voluntary May 15 12:23:38.932579 kernel: rcu: Preemptible hierarchical RCU implementation. May 15 12:23:38.932587 kernel: rcu: RCU event tracing is enabled. May 15 12:23:38.932596 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. May 15 12:23:38.932608 kernel: Trampoline variant of Tasks RCU enabled. May 15 12:23:38.932616 kernel: Rude variant of Tasks RCU enabled. May 15 12:23:38.932625 kernel: Tracing variant of Tasks RCU enabled. May 15 12:23:38.932632 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. May 15 12:23:38.932640 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 May 15 12:23:38.932647 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. May 15 12:23:38.932655 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. May 15 12:23:38.932663 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. 
May 15 12:23:38.932671 kernel: Using NULL legacy PIC May 15 12:23:38.932679 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 0 May 15 12:23:38.932688 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. May 15 12:23:38.932696 kernel: Console: colour dummy device 80x25 May 15 12:23:38.932704 kernel: printk: legacy console [tty1] enabled May 15 12:23:38.932712 kernel: printk: legacy console [ttyS0] enabled May 15 12:23:38.932720 kernel: printk: legacy bootconsole [earlyser0] disabled May 15 12:23:38.932727 kernel: ACPI: Core revision 20240827 May 15 12:23:38.932736 kernel: Failed to register legacy timer interrupt May 15 12:23:38.932743 kernel: APIC: Switch to symmetric I/O mode setup May 15 12:23:38.932751 kernel: x2apic enabled May 15 12:23:38.932758 kernel: APIC: Switched APIC routing to: physical x2apic May 15 12:23:38.932766 kernel: Hyper-V: Host Build 10.0.26100.1221-1-0 May 15 12:23:38.932773 kernel: Hyper-V: enabling crash_kexec_post_notifiers May 15 12:23:38.932781 kernel: Hyper-V: Disabling IBT because of Hyper-V bug May 15 12:23:38.932789 kernel: Hyper-V: Using IPI hypercalls May 15 12:23:38.932797 kernel: APIC: send_IPI() replaced with hv_send_ipi() May 15 12:23:38.932806 kernel: APIC: send_IPI_mask() replaced with hv_send_ipi_mask() May 15 12:23:38.932813 kernel: APIC: send_IPI_mask_allbutself() replaced with hv_send_ipi_mask_allbutself() May 15 12:23:38.932821 kernel: APIC: send_IPI_allbutself() replaced with hv_send_ipi_allbutself() May 15 12:23:38.932829 kernel: APIC: send_IPI_all() replaced with hv_send_ipi_all() May 15 12:23:38.932836 kernel: APIC: send_IPI_self() replaced with hv_send_ipi_self() May 15 12:23:38.932844 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x2127345424d, max_idle_ns: 440795318347 ns May 15 12:23:38.932852 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 
4599.99 BogoMIPS (lpj=2299999) May 15 12:23:38.932860 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated May 15 12:23:38.932867 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0 May 15 12:23:38.932876 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0 May 15 12:23:38.932884 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization May 15 12:23:38.932891 kernel: Spectre V2 : Mitigation: Retpolines May 15 12:23:38.932899 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch May 15 12:23:38.932907 kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT May 15 12:23:38.932914 kernel: RETBleed: WARNING: Spectre v2 mitigation leaves CPU vulnerable to RETBleed attacks, data leaks possible! May 15 12:23:38.932922 kernel: RETBleed: Vulnerable May 15 12:23:38.932930 kernel: Speculative Store Bypass: Vulnerable May 15 12:23:38.932937 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' May 15 12:23:38.932945 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' May 15 12:23:38.932953 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' May 15 12:23:38.932961 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask' May 15 12:23:38.932969 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256' May 15 12:23:38.932976 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256' May 15 12:23:38.932984 kernel: x86/fpu: Supporting XSAVE feature 0x800: 'Control-flow User registers' May 15 12:23:38.932992 kernel: x86/fpu: Supporting XSAVE feature 0x20000: 'AMX Tile config' May 15 12:23:38.932999 kernel: x86/fpu: Supporting XSAVE feature 0x40000: 'AMX Tile data' May 15 12:23:38.933007 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 May 15 12:23:38.933014 kernel: x86/fpu: xstate_offset[5]: 832, xstate_sizes[5]: 64 May 15 12:23:38.933022 kernel: x86/fpu: xstate_offset[6]: 896, xstate_sizes[6]: 
512 May 15 12:23:38.933029 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 1024 May 15 12:23:38.933038 kernel: x86/fpu: xstate_offset[11]: 2432, xstate_sizes[11]: 16 May 15 12:23:38.933045 kernel: x86/fpu: xstate_offset[17]: 2496, xstate_sizes[17]: 64 May 15 12:23:38.933053 kernel: x86/fpu: xstate_offset[18]: 2560, xstate_sizes[18]: 8192 May 15 12:23:38.933060 kernel: x86/fpu: Enabled xstate features 0x608e7, context size is 10752 bytes, using 'compacted' format. May 15 12:23:38.933068 kernel: Freeing SMP alternatives memory: 32K May 15 12:23:38.933075 kernel: pid_max: default: 32768 minimum: 301 May 15 12:23:38.933083 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima May 15 12:23:38.933090 kernel: landlock: Up and running. May 15 12:23:38.933098 kernel: SELinux: Initializing. May 15 12:23:38.933105 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear) May 15 12:23:38.933112 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear) May 15 12:23:38.933119 kernel: smpboot: CPU0: Intel INTEL(R) XEON(R) PLATINUM 8573C (family: 0x6, model: 0xcf, stepping: 0x2) May 15 12:23:38.933127 kernel: Performance Events: unsupported p6 CPU model 207 no PMU driver, software events only. May 15 12:23:38.933135 kernel: signal: max sigframe size: 11952 May 15 12:23:38.933142 kernel: rcu: Hierarchical SRCU implementation. May 15 12:23:38.933149 kernel: rcu: Max phase no-delay instances is 400. May 15 12:23:38.933157 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level May 15 12:23:38.933164 kernel: NMI watchdog: Perf NMI watchdog permanently disabled May 15 12:23:38.933190 kernel: smp: Bringing up secondary CPUs ... May 15 12:23:38.933203 kernel: smpboot: x86: Booting SMP configuration: May 15 12:23:38.933210 kernel: .... 
node #0, CPUs: #1 May 15 12:23:38.933218 kernel: smp: Brought up 1 node, 2 CPUs May 15 12:23:38.933225 kernel: smpboot: Total of 2 processors activated (9199.99 BogoMIPS) May 15 12:23:38.933233 kernel: Memory: 8082316K/8387512K available (14336K kernel code, 2438K rwdata, 9944K rodata, 54416K init, 2544K bss, 299988K reserved, 0K cma-reserved) May 15 12:23:38.933240 kernel: devtmpfs: initialized May 15 12:23:38.933247 kernel: x86/mm: Memory block size: 128MB May 15 12:23:38.933253 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x3fffb000-0x3fffefff] (16384 bytes) May 15 12:23:38.933260 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns May 15 12:23:38.933268 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) May 15 12:23:38.933275 kernel: pinctrl core: initialized pinctrl subsystem May 15 12:23:38.933284 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family May 15 12:23:38.933291 kernel: audit: initializing netlink subsys (disabled) May 15 12:23:38.933299 kernel: audit: type=2000 audit(1747311816.030:1): state=initialized audit_enabled=0 res=1 May 15 12:23:38.933307 kernel: thermal_sys: Registered thermal governor 'step_wise' May 15 12:23:38.933314 kernel: thermal_sys: Registered thermal governor 'user_space' May 15 12:23:38.933322 kernel: cpuidle: using governor menu May 15 12:23:38.933329 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 May 15 12:23:38.933337 kernel: dca service started, version 1.12.1 May 15 12:23:38.933344 kernel: e820: reserve RAM buffer [mem 0x3ff41000-0x3fffffff] May 15 12:23:38.933352 kernel: e820: reserve RAM buffer [mem 0x3ffd2000-0x3fffffff] May 15 12:23:38.933359 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
May 15 12:23:38.933366 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages May 15 12:23:38.933373 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page May 15 12:23:38.933381 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages May 15 12:23:38.933388 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page May 15 12:23:38.933396 kernel: ACPI: Added _OSI(Module Device) May 15 12:23:38.933404 kernel: ACPI: Added _OSI(Processor Device) May 15 12:23:38.933411 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) May 15 12:23:38.933419 kernel: ACPI: Added _OSI(Processor Aggregator Device) May 15 12:23:38.933426 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded May 15 12:23:38.933432 kernel: ACPI: Interpreter enabled May 15 12:23:38.933438 kernel: ACPI: PM: (supports S0 S5) May 15 12:23:38.933445 kernel: ACPI: Using IOAPIC for interrupt routing May 15 12:23:38.933453 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug May 15 12:23:38.933460 kernel: PCI: Ignoring E820 reservations for host bridge windows May 15 12:23:38.933467 kernel: ACPI: Enabled 1 GPEs in block 00 to 0F May 15 12:23:38.933474 kernel: iommu: Default domain type: Translated May 15 12:23:38.933483 kernel: iommu: DMA domain TLB invalidation policy: lazy mode May 15 12:23:38.933490 kernel: efivars: Registered efivars operations May 15 12:23:38.933498 kernel: PCI: Using ACPI for IRQ routing May 15 12:23:38.933505 kernel: PCI: System does not support PCI May 15 12:23:38.933513 kernel: vgaarb: loaded May 15 12:23:38.933521 kernel: clocksource: Switched to clocksource tsc-early May 15 12:23:38.933529 kernel: VFS: Disk quotas dquot_6.6.0 May 15 12:23:38.933536 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) May 15 12:23:38.933543 kernel: pnp: PnP ACPI init May 15 12:23:38.933553 kernel: pnp: PnP ACPI: found 3 devices May 15 12:23:38.933561 kernel: clocksource: acpi_pm: mask: 
0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns May 15 12:23:38.933569 kernel: NET: Registered PF_INET protocol family May 15 12:23:38.933577 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear) May 15 12:23:38.933584 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear) May 15 12:23:38.933591 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) May 15 12:23:38.933597 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear) May 15 12:23:38.933605 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear) May 15 12:23:38.933612 kernel: TCP: Hash tables configured (established 65536 bind 65536) May 15 12:23:38.933621 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear) May 15 12:23:38.933628 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear) May 15 12:23:38.933636 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family May 15 12:23:38.933643 kernel: NET: Registered PF_XDP protocol family May 15 12:23:38.933650 kernel: PCI: CLS 0 bytes, default 64 May 15 12:23:38.933658 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) May 15 12:23:38.933666 kernel: software IO TLB: mapped [mem 0x000000003aa59000-0x000000003ea59000] (64MB) May 15 12:23:38.933674 kernel: RAPL PMU: API unit is 2^-32 Joules, 1 fixed counters, 10737418240 ms ovfl timer May 15 12:23:38.933681 kernel: RAPL PMU: hw unit of domain psys 2^-0 Joules May 15 12:23:38.933689 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x2127345424d, max_idle_ns: 440795318347 ns May 15 12:23:38.933697 kernel: clocksource: Switched to clocksource tsc May 15 12:23:38.933705 kernel: Initialise system trusted keyrings May 15 12:23:38.933712 kernel: workingset: timestamp_bits=39 max_order=21 bucket_order=0 May 15 12:23:38.933720 kernel: Key type asymmetric registered May 15 12:23:38.933727 kernel: Asymmetric key parser 
'x509' registered May 15 12:23:38.933734 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) May 15 12:23:38.933742 kernel: io scheduler mq-deadline registered May 15 12:23:38.933749 kernel: io scheduler kyber registered May 15 12:23:38.933758 kernel: io scheduler bfq registered May 15 12:23:38.933765 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 May 15 12:23:38.933773 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled May 15 12:23:38.933781 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A May 15 12:23:38.933789 kernel: 00:01: ttyS1 at I/O 0x2f8 (irq = 3, base_baud = 115200) is a 16550A May 15 12:23:38.933796 kernel: serial8250: ttyS2 at I/O 0x3e8 (irq = 4, base_baud = 115200) is a 16550A May 15 12:23:38.933804 kernel: i8042: PNP: No PS/2 controller found. May 15 12:23:38.933914 kernel: rtc_cmos 00:02: registered as rtc0 May 15 12:23:38.933980 kernel: rtc_cmos 00:02: setting system clock to 2025-05-15T12:23:38 UTC (1747311818) May 15 12:23:38.934040 kernel: rtc_cmos 00:02: alarms up to one month, 114 bytes nvram May 15 12:23:38.934050 kernel: intel_pstate: Intel P-state driver initializing May 15 12:23:38.934058 kernel: efifb: probing for efifb May 15 12:23:38.934066 kernel: efifb: framebuffer at 0x40000000, using 3072k, total 3072k May 15 12:23:38.934075 kernel: efifb: mode is 1024x768x32, linelength=4096, pages=1 May 15 12:23:38.934082 kernel: efifb: scrolling: redraw May 15 12:23:38.934090 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 May 15 12:23:38.934100 kernel: Console: switching to colour frame buffer device 128x48 May 15 12:23:38.934108 kernel: fb0: EFI VGA frame buffer device May 15 12:23:38.934115 kernel: pstore: Using crash dump compression: deflate May 15 12:23:38.934123 kernel: pstore: Registered efi_pstore as persistent store backend May 15 12:23:38.934130 kernel: NET: Registered PF_INET6 protocol family May 15 12:23:38.934138 kernel: Segment Routing with IPv6 May 
15 12:23:38.934146 kernel: In-situ OAM (IOAM) with IPv6 May 15 12:23:38.934155 kernel: NET: Registered PF_PACKET protocol family May 15 12:23:38.934163 kernel: Key type dns_resolver registered May 15 12:23:38.936219 kernel: IPI shorthand broadcast: enabled May 15 12:23:38.936235 kernel: sched_clock: Marking stable (2748003387, 85528032)->(3129951485, -296420066) May 15 12:23:38.936244 kernel: registered taskstats version 1 May 15 12:23:38.936253 kernel: Loading compiled-in X.509 certificates May 15 12:23:38.936262 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.20-flatcar: 05e05785144663be6df1db78301487421c4773b6' May 15 12:23:38.936274 kernel: Demotion targets for Node 0: null May 15 12:23:38.936281 kernel: Key type .fscrypt registered May 15 12:23:38.936290 kernel: Key type fscrypt-provisioning registered May 15 12:23:38.936298 kernel: ima: No TPM chip found, activating TPM-bypass! May 15 12:23:38.936310 kernel: ima: Allocated hash algorithm: sha1 May 15 12:23:38.936318 kernel: ima: No architecture policies found May 15 12:23:38.936327 kernel: clk: Disabling unused clocks May 15 12:23:38.936335 kernel: Warning: unable to open an initial console. May 15 12:23:38.936344 kernel: Freeing unused kernel image (initmem) memory: 54416K May 15 12:23:38.936353 kernel: Write protecting the kernel read-only data: 24576k May 15 12:23:38.936361 kernel: Freeing unused kernel image (rodata/data gap) memory: 296K May 15 12:23:38.936370 kernel: Run /init as init process May 15 12:23:38.936379 kernel: with arguments: May 15 12:23:38.936389 kernel: /init May 15 12:23:38.936397 kernel: with environment: May 15 12:23:38.936406 kernel: HOME=/ May 15 12:23:38.936414 kernel: TERM=linux May 15 12:23:38.936423 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a May 15 12:23:38.936434 systemd[1]: Successfully made /usr/ read-only. 
May 15 12:23:38.936446 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 15 12:23:38.936454 systemd[1]: Detected virtualization microsoft. May 15 12:23:38.936464 systemd[1]: Detected architecture x86-64. May 15 12:23:38.936472 systemd[1]: Running in initrd. May 15 12:23:38.936481 systemd[1]: No hostname configured, using default hostname. May 15 12:23:38.936490 systemd[1]: Hostname set to . May 15 12:23:38.936498 systemd[1]: Initializing machine ID from random generator. May 15 12:23:38.936507 systemd[1]: Queued start job for default target initrd.target. May 15 12:23:38.936515 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 15 12:23:38.936524 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 15 12:23:38.936537 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... May 15 12:23:38.936546 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... May 15 12:23:38.936555 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... May 15 12:23:38.936564 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... May 15 12:23:38.936575 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... May 15 12:23:38.936583 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... 
May 15 12:23:38.936592 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 15 12:23:38.936603 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. May 15 12:23:38.936612 systemd[1]: Reached target paths.target - Path Units. May 15 12:23:38.936622 systemd[1]: Reached target slices.target - Slice Units. May 15 12:23:38.936631 systemd[1]: Reached target swap.target - Swaps. May 15 12:23:38.936642 systemd[1]: Reached target timers.target - Timer Units. May 15 12:23:38.936653 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. May 15 12:23:38.936662 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 15 12:23:38.936672 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). May 15 12:23:38.936682 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. May 15 12:23:38.936692 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. May 15 12:23:38.936701 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. May 15 12:23:38.936711 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. May 15 12:23:38.936721 systemd[1]: Reached target sockets.target - Socket Units. May 15 12:23:38.936732 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... May 15 12:23:38.936742 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 15 12:23:38.936751 systemd[1]: Finished network-cleanup.service - Network Cleanup. May 15 12:23:38.936761 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). May 15 12:23:38.936772 systemd[1]: Starting systemd-fsck-usr.service... May 15 12:23:38.936781 systemd[1]: Starting systemd-journald.service - Journal Service... 
May 15 12:23:38.936790 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... May 15 12:23:38.936810 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 15 12:23:38.936843 systemd-journald[205]: Collecting audit messages is disabled. May 15 12:23:38.936864 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. May 15 12:23:38.936874 systemd-journald[205]: Journal started May 15 12:23:38.936896 systemd-journald[205]: Runtime Journal (/run/log/journal/8e074201baa64ad0acd284249e8c91ab) is 8M, max 159M, 151M free. May 15 12:23:38.945235 systemd[1]: Started systemd-journald.service - Journal Service. May 15 12:23:38.947370 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 15 12:23:38.950540 systemd-modules-load[207]: Inserted module 'overlay' May 15 12:23:38.950670 systemd[1]: Finished systemd-fsck-usr.service. May 15 12:23:38.957300 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... May 15 12:23:38.961912 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 15 12:23:38.979108 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 15 12:23:38.983957 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. May 15 12:23:38.983977 kernel: Bridge firewalling registered May 15 12:23:38.984115 systemd-modules-load[207]: Inserted module 'br_netfilter' May 15 12:23:38.985219 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 15 12:23:38.989031 systemd-tmpfiles[218]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. May 15 12:23:38.993946 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. 
May 15 12:23:38.995129 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
May 15 12:23:38.997749 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
May 15 12:23:39.002164 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
May 15 12:23:39.008550 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
May 15 12:23:39.015099 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
May 15 12:23:39.022500 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
May 15 12:23:39.032247 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
May 15 12:23:39.036440 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
May 15 12:23:39.040090 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
May 15 12:23:39.059732 dracut-cmdline[245]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlyprintk=ttyS0,115200 flatcar.first_boot=detected flatcar.oem.id=azure flatcar.autologin verity.usrhash=48287e633374b880fa618bd42bee102ae77c50831859c6cedd6ca9e1aec3dd5c
May 15 12:23:39.059748 systemd-resolved[239]: Positive Trust Anchors:
May 15 12:23:39.059757 systemd-resolved[239]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
May 15 12:23:39.059786 systemd-resolved[239]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
May 15 12:23:39.073572 systemd-resolved[239]: Defaulting to hostname 'linux'.
May 15 12:23:39.076673 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
May 15 12:23:39.101310 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
May 15 12:23:39.125186 kernel: SCSI subsystem initialized
May 15 12:23:39.132185 kernel: Loading iSCSI transport class v2.0-870.
May 15 12:23:39.139195 kernel: iscsi: registered transport (tcp)
May 15 12:23:39.155186 kernel: iscsi: registered transport (qla4xxx)
May 15 12:23:39.155219 kernel: QLogic iSCSI HBA Driver
May 15 12:23:39.166634 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
May 15 12:23:39.178847 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
May 15 12:23:39.184142 systemd[1]: Reached target network-pre.target - Preparation for Network.
May 15 12:23:39.211809 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
May 15 12:23:39.215346 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
May 15 12:23:39.262185 kernel: raid6: avx512x4 gen() 45325 MB/s
May 15 12:23:39.279182 kernel: raid6: avx512x2 gen() 44676 MB/s
May 15 12:23:39.296181 kernel: raid6: avx512x1 gen() 30062 MB/s
May 15 12:23:39.314180 kernel: raid6: avx2x4 gen() 41712 MB/s
May 15 12:23:39.331180 kernel: raid6: avx2x2 gen() 43628 MB/s
May 15 12:23:39.348779 kernel: raid6: avx2x1 gen() 30834 MB/s
May 15 12:23:39.348862 kernel: raid6: using algorithm avx512x4 gen() 45325 MB/s
May 15 12:23:39.366540 kernel: raid6: .... xor() 7907 MB/s, rmw enabled
May 15 12:23:39.366560 kernel: raid6: using avx512x2 recovery algorithm
May 15 12:23:39.383187 kernel: xor: automatically using best checksumming function avx
May 15 12:23:39.485186 kernel: Btrfs loaded, zoned=no, fsverity=no
May 15 12:23:39.488713 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
May 15 12:23:39.491286 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
May 15 12:23:39.509749 systemd-udevd[453]: Using default interface naming scheme 'v255'.
May 15 12:23:39.513279 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
May 15 12:23:39.519305 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
May 15 12:23:39.536286 dracut-pre-trigger[463]: rd.md=0: removing MD RAID activation
May 15 12:23:39.551701 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
May 15 12:23:39.552759 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
May 15 12:23:39.587113 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
May 15 12:23:39.589621 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
May 15 12:23:39.633184 kernel: cryptd: max_cpu_qlen set to 1000
May 15 12:23:39.649969 kernel: hv_vmbus: Vmbus version:5.3
May 15 12:23:39.656823 kernel: AES CTR mode by8 optimization enabled
May 15 12:23:39.660270 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
May 15 12:23:39.663299 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
May 15 12:23:39.673965 kernel: pps_core: LinuxPPS API ver. 1 registered
May 15 12:23:39.673998 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti
May 15 12:23:39.673622 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
May 15 12:23:39.679440 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 15 12:23:39.699028 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
May 15 12:23:39.699198 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
May 15 12:23:39.705315 kernel: hv_vmbus: registering driver hyperv_keyboard
May 15 12:23:39.705329 kernel: PTP clock support registered
May 15 12:23:39.712246 kernel: input: AT Translated Set 2 keyboard as /devices/LNXSYSTM:00/LNXSYBUS:00/ACPI0004:00/MSFT1000:00/d34b2567-b9b6-42b9-8778-0a4ec0b955bf/serio0/input/input0
May 15 12:23:39.716369 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 15 12:23:39.723621 kernel: hv_vmbus: registering driver hv_pci
May 15 12:23:39.728243 kernel: hv_vmbus: registering driver hv_storvsc
May 15 12:23:39.730190 kernel: scsi host0: storvsc_host_t
May 15 12:23:39.732932 kernel: hv_vmbus: registering driver hv_netvsc
May 15 12:23:39.732972 kernel: scsi 0:0:0:2: CD-ROM Msft Virtual DVD-ROM 1.0 PQ: 0 ANSI: 5
May 15 12:23:39.772188 kernel: hv_pci 7ad35d50-c05b-47ab-b3a0-56a9a845852b: PCI VMBus probing: Using version 0x10004
May 15 12:23:40.166071 kernel: hv_utils: Registering HyperV Utility Driver
May 15 12:23:40.166090 kernel: hv_vmbus: registering driver hv_utils
May 15 12:23:40.166103 kernel: hid: raw HID events driver (C) Jiri Kosina
May 15 12:23:40.166116 kernel: hv_pci 7ad35d50-c05b-47ab-b3a0-56a9a845852b: PCI host bridge to bus c05b:00
May 15 12:23:40.166241 kernel: hv_utils: Shutdown IC version 3.2
May 15 12:23:40.166259 kernel: hv_utils: Heartbeat IC version 3.0
May 15 12:23:40.166272 kernel: hv_utils: TimeSync IC version 4.0
May 15 12:23:40.166284 kernel: pci_bus c05b:00: root bus resource [mem 0xfc0000000-0xfc007ffff window]
May 15 12:23:40.166383 kernel: pci_bus c05b:00: No busn resource found for root bus, will use [bus 00-ff]
May 15 12:23:40.166458 kernel: hv_netvsc f8615163-0000-1000-2000-7c1e52fbc335 (unnamed net_device) (uninitialized): VF slot 1 added
May 15 12:23:40.166535 kernel: pci c05b:00:00.0: [1414:00a9] type 00 class 0x010802 PCIe Endpoint
May 15 12:23:40.166631 kernel: pci c05b:00:00.0: BAR 0 [mem 0xfc0000000-0xfc007ffff 64bit]
May 15 12:23:40.166713 kernel: hv_vmbus: registering driver hid_hyperv
May 15 12:23:40.166727 kernel: input: Microsoft Vmbus HID-compliant Mouse as /devices/0006:045E:0621.0001/input/input1
May 15 12:23:40.166738 kernel: hid-hyperv 0006:045E:0621.0001: input: VIRTUAL HID v0.01 Mouse [Microsoft Vmbus HID-compliant Mouse] on
May 15 12:23:40.166818 kernel: pci c05b:00:00.0: 32.000 Gb/s available PCIe bandwidth, limited by 2.5 GT/s PCIe x16 link at c05b:00:00.0 (capable of 1024.000 Gb/s with 64.0 GT/s PCIe x16 link)
May 15 12:23:40.166902 kernel: sr 0:0:0:2: [sr0] scsi-1 drive
May 15 12:23:40.166989 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
May 15 12:23:40.166999 kernel: pci_bus c05b:00: busn_res: [bus 00-ff] end is updated to 00
May 15 12:23:40.167072 kernel: pci c05b:00:00.0: BAR 0 [mem 0xfc0000000-0xfc007ffff 64bit]: assigned
May 15 12:23:40.167154 kernel: sr 0:0:0:2: Attached scsi CD-ROM sr0
May 15 12:23:40.123635 systemd-resolved[239]: Clock change detected. Flushing caches.
May 15 12:23:40.161887 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
May 15 12:23:40.182487 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#96 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
May 15 12:23:40.182656 kernel: nvme nvme0: pci function c05b:00:00.0
May 15 12:23:40.185779 kernel: nvme c05b:00:00.0: enabling device (0000 -> 0002)
May 15 12:23:40.471248 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#138 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
May 15 12:23:40.471356 kernel: nvme nvme0: 2/0/0 default/read/poll queues
May 15 12:23:40.471461 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9
May 15 12:23:40.947194 kernel: nvme nvme0: using unchecked data buffer
May 15 12:23:41.159485 kernel: hv_pci 00000001-7870-47b5-b203-907d12ca697e: PCI VMBus probing: Using version 0x10004
May 15 12:23:41.213316 kernel: hv_pci 00000001-7870-47b5-b203-907d12ca697e: PCI host bridge to bus 7870:00
May 15 12:23:41.213432 kernel: pci_bus 7870:00: root bus resource [mem 0xfc2000000-0xfc4007fff window]
May 15 12:23:41.213533 kernel: pci_bus 7870:00: No busn resource found for root bus, will use [bus 00-ff]
May 15 12:23:41.213618 kernel: pci 7870:00:00.0: [1414:00ba] type 00 class 0x020000 PCIe Endpoint
May 15 12:23:41.213758 kernel: pci 7870:00:00.0: BAR 0 [mem 0xfc2000000-0xfc3ffffff 64bit pref]
May 15 12:23:41.213864 kernel: pci 7870:00:00.0: BAR 4 [mem 0xfc4000000-0xfc4007fff 64bit pref]
May 15 12:23:41.213934 kernel: pci 7870:00:00.0: enabling Extended Tags
May 15 12:23:41.214097 kernel: pci_bus 7870:00: busn_res: [bus 00-ff] end is updated to 00
May 15 12:23:41.214185 kernel: pci 7870:00:00.0: BAR 0 [mem 0xfc2000000-0xfc3ffffff 64bit pref]: assigned
May 15 12:23:41.214286 kernel: pci 7870:00:00.0: BAR 4 [mem 0xfc4000000-0xfc4007fff 64bit pref]: assigned
May 15 12:23:41.214376 kernel: mana 7870:00:00.0: enabling device (0000 -> 0002)
May 15 12:23:41.214463 kernel: mana 7870:00:00.0: Microsoft Azure Network Adapter protocol version: 0.1.1
May 15 12:23:41.214546 kernel: hv_netvsc f8615163-0000-1000-2000-7c1e52fbc335 eth0: VF registering: eth1
May 15 12:23:41.214622 kernel: mana 7870:00:00.0 eth1: joined to eth0
May 15 12:23:41.214708 kernel: mana 7870:00:00.0 enP30832s1: renamed from eth1
May 15 12:23:41.287760 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - MSFT NVMe Accelerator v1.0 OEM.
May 15 12:23:41.325431 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - MSFT NVMe Accelerator v1.0 EFI-SYSTEM.
May 15 12:23:41.336610 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - MSFT NVMe Accelerator v1.0 ROOT.
May 15 12:23:41.389760 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - MSFT NVMe Accelerator v1.0 USR-A.
May 15 12:23:41.395250 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - MSFT NVMe Accelerator v1.0 USR-A.
May 15 12:23:41.395495 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
May 15 12:23:41.401653 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
May 15 12:23:41.403641 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
May 15 12:23:41.407221 systemd[1]: Reached target remote-fs.target - Remote File Systems.
May 15 12:23:41.409355 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
May 15 12:23:41.418278 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
May 15 12:23:41.433281 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
May 15 12:23:41.437273 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9
May 15 12:23:42.451199 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9
May 15 12:23:42.452062 disk-uuid[672]: The operation has completed successfully.
May 15 12:23:42.497318 systemd[1]: disk-uuid.service: Deactivated successfully.
May 15 12:23:42.497397 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
May 15 12:23:42.524516 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
May 15 12:23:42.537060 sh[714]: Success
May 15 12:23:42.560370 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
May 15 12:23:42.560422 kernel: device-mapper: uevent: version 1.0.3
May 15 12:23:42.561634 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
May 15 12:23:42.570187 kernel: device-mapper: verity: sha256 using shash "sha256-ni"
May 15 12:23:42.992066 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
May 15 12:23:42.995682 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
May 15 12:23:43.008065 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
May 15 12:23:43.019200 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay'
May 15 12:23:43.019260 kernel: BTRFS: device fsid 2d504097-db49-4d66-a0d5-eeb665b21004 devid 1 transid 41 /dev/mapper/usr (254:0) scanned by mount (727)
May 15 12:23:43.022406 kernel: BTRFS info (device dm-0): first mount of filesystem 2d504097-db49-4d66-a0d5-eeb665b21004
May 15 12:23:43.024184 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
May 15 12:23:43.025560 kernel: BTRFS info (device dm-0): using free-space-tree
May 15 12:23:43.729948 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
May 15 12:23:43.734290 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
May 15 12:23:43.737794 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
May 15 12:23:43.740130 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
May 15 12:23:43.756774 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
May 15 12:23:43.776237 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/nvme0n1p6 (259:5) scanned by mount (750)
May 15 12:23:43.780046 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem afd0c70c-d15e-448c-8325-f96e3c3ed3a5
May 15 12:23:43.780081 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-intel) checksum algorithm
May 15 12:23:43.781581 kernel: BTRFS info (device nvme0n1p6): using free-space-tree
May 15 12:23:43.806752 systemd[1]: Finished ignition-setup.service - Ignition (setup).
May 15 12:23:43.810680 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem afd0c70c-d15e-448c-8325-f96e3c3ed3a5
May 15 12:23:43.811192 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
May 15 12:23:43.829279 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
May 15 12:23:43.832283 systemd[1]: Starting systemd-networkd.service - Network Configuration...
May 15 12:23:43.852345 systemd-networkd[896]: lo: Link UP
May 15 12:23:43.852352 systemd-networkd[896]: lo: Gained carrier
May 15 12:23:43.853693 systemd-networkd[896]: Enumeration completed
May 15 12:23:43.860075 kernel: mana 7870:00:00.0 enP30832s1: Configured vPort 0 PD 18 DB 16
May 15 12:23:43.861376 kernel: mana 7870:00:00.0 enP30832s1: Configured steering vPort 0 entries 64
May 15 12:23:43.854019 systemd-networkd[896]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
May 15 12:23:43.867225 kernel: hv_netvsc f8615163-0000-1000-2000-7c1e52fbc335 eth0: Data path switched to VF: enP30832s1
May 15 12:23:43.854022 systemd-networkd[896]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
May 15 12:23:43.854329 systemd[1]: Started systemd-networkd.service - Network Configuration.
May 15 12:23:43.859288 systemd[1]: Reached target network.target - Network.
May 15 12:23:43.863846 systemd-networkd[896]: enP30832s1: Link UP
May 15 12:23:43.863904 systemd-networkd[896]: eth0: Link UP
May 15 12:23:43.863986 systemd-networkd[896]: eth0: Gained carrier
May 15 12:23:43.863993 systemd-networkd[896]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
May 15 12:23:43.873311 systemd-networkd[896]: enP30832s1: Gained carrier
May 15 12:23:43.879199 systemd-networkd[896]: eth0: DHCPv4 address 10.200.8.35/24, gateway 10.200.8.1 acquired from 168.63.129.16
May 15 12:23:45.288365 systemd-networkd[896]: eth0: Gained IPv6LL
May 15 12:23:45.672284 systemd-networkd[896]: enP30832s1: Gained IPv6LL
May 15 12:23:46.887965 ignition[865]: Ignition 2.21.0
May 15 12:23:46.887976 ignition[865]: Stage: fetch-offline
May 15 12:23:46.889548 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
May 15 12:23:46.888046 ignition[865]: no configs at "/usr/lib/ignition/base.d"
May 15 12:23:46.888051 ignition[865]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
May 15 12:23:46.895059 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
May 15 12:23:46.888119 ignition[865]: parsed url from cmdline: ""
May 15 12:23:46.888122 ignition[865]: no config URL provided
May 15 12:23:46.888125 ignition[865]: reading system config file "/usr/lib/ignition/user.ign"
May 15 12:23:46.888130 ignition[865]: no config at "/usr/lib/ignition/user.ign"
May 15 12:23:46.888133 ignition[865]: failed to fetch config: resource requires networking
May 15 12:23:46.888364 ignition[865]: Ignition finished successfully
May 15 12:23:46.916440 ignition[905]: Ignition 2.21.0
May 15 12:23:46.916449 ignition[905]: Stage: fetch
May 15 12:23:46.916611 ignition[905]: no configs at "/usr/lib/ignition/base.d"
May 15 12:23:46.916617 ignition[905]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
May 15 12:23:46.916680 ignition[905]: parsed url from cmdline: ""
May 15 12:23:46.916682 ignition[905]: no config URL provided
May 15 12:23:46.916686 ignition[905]: reading system config file "/usr/lib/ignition/user.ign"
May 15 12:23:46.916690 ignition[905]: no config at "/usr/lib/ignition/user.ign"
May 15 12:23:46.916719 ignition[905]: GET http://169.254.169.254/metadata/instance/compute/userData?api-version=2021-01-01&format=text: attempt #1
May 15 12:23:47.025106 ignition[905]: GET result: OK
May 15 12:23:47.025202 ignition[905]: config has been read from IMDS userdata
May 15 12:23:47.025236 ignition[905]: parsing config with SHA512: e934d7abee13e68af1bd6e6d99f6bfdf7e84b3b0d67e94bf1d57525f1142ed33c442771ca8ca885c056e5b8501b1a7cd472b2ba4f82d0ead693adb297d7dc7f3
May 15 12:23:47.031666 unknown[905]: fetched base config from "system"
May 15 12:23:47.031673 unknown[905]: fetched base config from "system"
May 15 12:23:47.031953 ignition[905]: fetch: fetch complete
May 15 12:23:47.031678 unknown[905]: fetched user config from "azure"
May 15 12:23:47.031956 ignition[905]: fetch: fetch passed
May 15 12:23:47.033871 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
May 15 12:23:47.031986 ignition[905]: Ignition finished successfully
May 15 12:23:47.036455 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
May 15 12:23:47.061475 ignition[912]: Ignition 2.21.0
May 15 12:23:47.061485 ignition[912]: Stage: kargs
May 15 12:23:47.063299 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
May 15 12:23:47.061627 ignition[912]: no configs at "/usr/lib/ignition/base.d"
May 15 12:23:47.066594 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
May 15 12:23:47.061634 ignition[912]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
May 15 12:23:47.062241 ignition[912]: kargs: kargs passed
May 15 12:23:47.062269 ignition[912]: Ignition finished successfully
May 15 12:23:47.091829 ignition[918]: Ignition 2.21.0
May 15 12:23:47.091838 ignition[918]: Stage: disks
May 15 12:23:47.092002 ignition[918]: no configs at "/usr/lib/ignition/base.d"
May 15 12:23:47.093964 systemd[1]: Finished ignition-disks.service - Ignition (disks).
May 15 12:23:47.092008 ignition[918]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
May 15 12:23:47.092968 ignition[918]: disks: disks passed
May 15 12:23:47.093006 ignition[918]: Ignition finished successfully
May 15 12:23:47.100773 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
May 15 12:23:47.106255 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
May 15 12:23:47.106488 systemd[1]: Reached target local-fs.target - Local File Systems.
May 15 12:23:47.112216 systemd[1]: Reached target sysinit.target - System Initialization.
May 15 12:23:47.113332 systemd[1]: Reached target basic.target - Basic System.
May 15 12:23:47.117812 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
May 15 12:23:47.241756 systemd-fsck[926]: ROOT: clean, 15/7326000 files, 477845/7359488 blocks
May 15 12:23:47.245465 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
May 15 12:23:47.250306 systemd[1]: Mounting sysroot.mount - /sysroot...
May 15 12:23:47.622185 kernel: EXT4-fs (nvme0n1p9): mounted filesystem f7dea4bd-2644-4592-b85b-330f322c4d2b r/w with ordered data mode. Quota mode: none.
May 15 12:23:47.622796 systemd[1]: Mounted sysroot.mount - /sysroot.
May 15 12:23:47.624471 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
May 15 12:23:47.652982 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
May 15 12:23:47.656428 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
May 15 12:23:47.678284 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
May 15 12:23:47.680203 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
May 15 12:23:47.699511 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/nvme0n1p6 (259:5) scanned by mount (936)
May 15 12:23:47.699533 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem afd0c70c-d15e-448c-8325-f96e3c3ed3a5
May 15 12:23:47.699546 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-intel) checksum algorithm
May 15 12:23:47.699556 kernel: BTRFS info (device nvme0n1p6): using free-space-tree
May 15 12:23:47.680232 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
May 15 12:23:47.684846 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
May 15 12:23:47.699226 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
May 15 12:23:47.706726 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
May 15 12:23:48.716484 coreos-metadata[938]: May 15 12:23:48.716 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1
May 15 12:23:48.719548 coreos-metadata[938]: May 15 12:23:48.719 INFO Fetch successful
May 15 12:23:48.719548 coreos-metadata[938]: May 15 12:23:48.719 INFO Fetching http://169.254.169.254/metadata/instance/compute/name?api-version=2017-08-01&format=text: Attempt #1
May 15 12:23:48.732110 coreos-metadata[938]: May 15 12:23:48.732 INFO Fetch successful
May 15 12:23:48.735669 coreos-metadata[938]: May 15 12:23:48.735 INFO wrote hostname ci-4334.0.0-a-9b1bbdffc7 to /sysroot/etc/hostname
May 15 12:23:48.738357 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
May 15 12:23:49.163375 initrd-setup-root[966]: cut: /sysroot/etc/passwd: No such file or directory
May 15 12:23:49.277277 initrd-setup-root[973]: cut: /sysroot/etc/group: No such file or directory
May 15 12:23:49.282377 initrd-setup-root[980]: cut: /sysroot/etc/shadow: No such file or directory
May 15 12:23:49.286716 initrd-setup-root[987]: cut: /sysroot/etc/gshadow: No such file or directory
May 15 12:23:50.423454 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
May 15 12:23:50.428273 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
May 15 12:23:50.441287 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
May 15 12:23:50.448472 systemd[1]: sysroot-oem.mount: Deactivated successfully.
May 15 12:23:50.452425 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem afd0c70c-d15e-448c-8325-f96e3c3ed3a5
May 15 12:23:50.471815 ignition[1054]: INFO : Ignition 2.21.0
May 15 12:23:50.471815 ignition[1054]: INFO : Stage: mount
May 15 12:23:50.474203 ignition[1054]: INFO : no configs at "/usr/lib/ignition/base.d"
May 15 12:23:50.474203 ignition[1054]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
May 15 12:23:50.474203 ignition[1054]: INFO : mount: mount passed
May 15 12:23:50.474203 ignition[1054]: INFO : Ignition finished successfully
May 15 12:23:50.476755 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
May 15 12:23:50.483342 systemd[1]: Finished ignition-mount.service - Ignition (mount).
May 15 12:23:50.486705 systemd[1]: Starting ignition-files.service - Ignition (files)...
May 15 12:23:50.498655 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
May 15 12:23:50.513188 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/nvme0n1p6 (259:5) scanned by mount (1066)
May 15 12:23:50.515675 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem afd0c70c-d15e-448c-8325-f96e3c3ed3a5
May 15 12:23:50.515699 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-intel) checksum algorithm
May 15 12:23:50.515710 kernel: BTRFS info (device nvme0n1p6): using free-space-tree
May 15 12:23:50.520593 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
May 15 12:23:50.554334 ignition[1083]: INFO : Ignition 2.21.0
May 15 12:23:50.554334 ignition[1083]: INFO : Stage: files
May 15 12:23:50.554334 ignition[1083]: INFO : no configs at "/usr/lib/ignition/base.d"
May 15 12:23:50.554334 ignition[1083]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
May 15 12:23:50.554334 ignition[1083]: DEBUG : files: compiled without relabeling support, skipping
May 15 12:23:50.610951 ignition[1083]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
May 15 12:23:50.610951 ignition[1083]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
May 15 12:23:50.681263 ignition[1083]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
May 15 12:23:50.684514 ignition[1083]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
May 15 12:23:50.684514 ignition[1083]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
May 15 12:23:50.681612 unknown[1083]: wrote ssh authorized keys file for user: core
May 15 12:23:50.819361 ignition[1083]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
May 15 12:23:50.824594 ignition[1083]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1
May 15 12:23:50.871861 ignition[1083]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
May 15 12:23:51.152433 ignition[1083]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
May 15 12:23:51.154620 ignition[1083]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
May 15 12:23:51.154620 ignition[1083]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
May 15 12:23:51.154620 ignition[1083]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
May 15 12:23:51.154620 ignition[1083]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
May 15 12:23:51.154620 ignition[1083]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
May 15 12:23:51.154620 ignition[1083]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
May 15 12:23:51.154620 ignition[1083]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
May 15 12:23:51.154620 ignition[1083]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
May 15 12:23:51.181243 ignition[1083]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
May 15 12:23:51.181243 ignition[1083]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
May 15 12:23:51.181243 ignition[1083]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw"
May 15 12:23:51.181243 ignition[1083]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw"
May 15 12:23:51.181243 ignition[1083]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw"
May 15 12:23:51.181243 ignition[1083]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.30.1-x86-64.raw: attempt #1
May 15 12:23:51.751836 ignition[1083]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
May 15 12:23:52.868265 ignition[1083]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw"
May 15 12:23:52.868265 ignition[1083]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
May 15 12:23:53.193816 ignition[1083]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
May 15 12:23:54.825493 ignition[1083]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
May 15 12:23:54.830442 ignition[1083]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
May 15 12:23:54.830442 ignition[1083]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
May 15 12:23:54.830442 ignition[1083]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
May 15 12:23:54.830442 ignition[1083]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
May 15 12:23:54.830442 ignition[1083]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
May 15 12:23:54.830442 ignition[1083]: INFO : files: files passed
May 15 12:23:54.830442 ignition[1083]: INFO : Ignition finished successfully
May 15 12:23:54.827746 systemd[1]: Finished ignition-files.service - Ignition (files).
May 15 12:23:54.846371 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
May 15 12:23:54.850907 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
May 15 12:23:54.862055 systemd[1]: ignition-quench.service: Deactivated successfully.
May 15 12:23:54.862128 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
May 15 12:23:55.389293 initrd-setup-root-after-ignition[1112]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
May 15 12:23:55.389293 initrd-setup-root-after-ignition[1112]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
May 15 12:23:55.395328 initrd-setup-root-after-ignition[1116]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
May 15 12:23:55.394426 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
May 15 12:23:55.398562 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
May 15 12:23:55.404665 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
May 15 12:23:55.437421 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
May 15 12:23:55.437495 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
May 15 12:23:55.440403 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
May 15 12:23:55.443239 systemd[1]: Reached target initrd.target - Initrd Default Target.
May 15 12:23:55.446617 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
May 15 12:23:55.447515 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
May 15 12:23:55.466504 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
May 15 12:23:55.469592 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
May 15 12:23:55.488002 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
May 15 12:23:55.488154 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
May 15 12:23:55.488373 systemd[1]: Stopped target timers.target - Timer Units.
May 15 12:23:55.488656 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
May 15 12:23:55.488756 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
May 15 12:23:55.497691 systemd[1]: Stopped target initrd.target - Initrd Default Target.
May 15 12:23:55.504432 systemd[1]: Stopped target basic.target - Basic System.
May 15 12:23:55.505502 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
May 15 12:23:55.505812 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
May 15 12:23:55.506097 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
May 15 12:23:55.506448 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
May 15 12:23:55.506706 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
May 15 12:23:55.507235 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
May 15 12:23:55.515055 systemd[1]: Stopped target sysinit.target - System Initialization.
May 15 12:23:55.519320 systemd[1]: Stopped target local-fs.target - Local File Systems.
May 15 12:23:55.521919 systemd[1]: Stopped target swap.target - Swaps.
May 15 12:23:55.523529 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
May 15 12:23:55.523638 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
May 15 12:23:55.524167 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
May 15 12:23:55.524456 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 15 12:23:55.524731 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
May 15 12:23:55.525179 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 15 12:23:55.546305 systemd[1]: dracut-initqueue.service: Deactivated successfully.
May 15 12:23:55.546448 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
May 15 12:23:55.550435 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
May 15 12:23:55.550548 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
May 15 12:23:55.554346 systemd[1]: ignition-files.service: Deactivated successfully.
May 15 12:23:55.554461 systemd[1]: Stopped ignition-files.service - Ignition (files).
May 15 12:23:55.557286 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
May 15 12:23:55.557393 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
May 15 12:23:55.560782 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
May 15 12:23:55.572380 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
May 15 12:23:55.576504 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
May 15 12:23:55.576645 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
May 15 12:23:55.583187 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
May 15 12:23:55.585360 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
May 15 12:23:55.595105 systemd[1]: initrd-cleanup.service: Deactivated successfully.
May 15 12:23:55.597043 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
May 15 12:23:55.598710 ignition[1136]: INFO : Ignition 2.21.0
May 15 12:23:55.603247 ignition[1136]: INFO : Stage: umount
May 15 12:23:55.603247 ignition[1136]: INFO : no configs at "/usr/lib/ignition/base.d"
May 15 12:23:55.603247 ignition[1136]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
May 15 12:23:55.603247 ignition[1136]: INFO : umount: umount passed
May 15 12:23:55.603247 ignition[1136]: INFO : Ignition finished successfully
May 15 12:23:55.602638 systemd[1]: sysroot-boot.mount: Deactivated successfully.
May 15 12:23:55.602969 systemd[1]: ignition-mount.service: Deactivated successfully.
May 15 12:23:55.603021 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
May 15 12:23:55.610595 systemd[1]: ignition-disks.service: Deactivated successfully.
May 15 12:23:55.610655 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
May 15 12:23:55.611008 systemd[1]: ignition-kargs.service: Deactivated successfully.
May 15 12:23:55.611035 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
May 15 12:23:55.611339 systemd[1]: ignition-fetch.service: Deactivated successfully.
May 15 12:23:55.611371 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
May 15 12:23:55.611500 systemd[1]: Stopped target network.target - Network.
May 15 12:23:55.611523 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
May 15 12:23:55.611548 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
May 15 12:23:55.611807 systemd[1]: Stopped target paths.target - Path Units.
May 15 12:23:55.611826 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
May 15 12:23:55.613301 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 15 12:23:55.619137 systemd[1]: Stopped target slices.target - Slice Units.
May 15 12:23:55.622240 systemd[1]: Stopped target sockets.target - Socket Units.
May 15 12:23:55.623640 systemd[1]: iscsid.socket: Deactivated successfully.
May 15 12:23:55.623674 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
May 15 12:23:55.623860 systemd[1]: iscsiuio.socket: Deactivated successfully.
May 15 12:23:55.623884 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
May 15 12:23:55.624361 systemd[1]: ignition-setup.service: Deactivated successfully.
May 15 12:23:55.624401 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
May 15 12:23:55.624827 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
May 15 12:23:55.624854 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
May 15 12:23:55.625161 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
May 15 12:23:55.625728 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
May 15 12:23:55.628733 systemd[1]: systemd-resolved.service: Deactivated successfully.
May 15 12:23:55.628805 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
May 15 12:23:55.647047 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
May 15 12:23:55.647217 systemd[1]: systemd-networkd.service: Deactivated successfully.
May 15 12:23:55.647290 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
May 15 12:23:55.650541 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
May 15 12:23:55.651117 systemd[1]: Stopped target network-pre.target - Preparation for Network.
May 15 12:23:55.669039 systemd[1]: systemd-networkd.socket: Deactivated successfully.
May 15 12:23:55.669072 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
May 15 12:23:55.689797 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
May 15 12:23:55.691699 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
May 15 12:23:55.691752 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
May 15 12:23:55.694593 systemd[1]: systemd-sysctl.service: Deactivated successfully.
May 15 12:23:55.694635 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
May 15 12:23:55.698968 systemd[1]: systemd-modules-load.service: Deactivated successfully.
May 15 12:23:55.699533 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
May 15 12:23:55.704382 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
May 15 12:23:55.704424 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
May 15 12:23:55.709165 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
May 15 12:23:55.711799 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
May 15 12:23:55.730365 kernel: hv_netvsc f8615163-0000-1000-2000-7c1e52fbc335 eth0: Data path switched from VF: enP30832s1
May 15 12:23:55.732134 kernel: mana 7870:00:00.0 enP30832s1: Configured steering vPort 0 entries 64
May 15 12:23:55.711853 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
May 15 12:23:55.723463 systemd[1]: systemd-udevd.service: Deactivated successfully.
May 15 12:23:55.723606 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
May 15 12:23:55.729092 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
May 15 12:23:55.729152 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
May 15 12:23:55.732046 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
May 15 12:23:55.732078 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
May 15 12:23:55.741343 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
May 15 12:23:55.741387 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
May 15 12:23:55.751051 systemd[1]: dracut-cmdline.service: Deactivated successfully.
May 15 12:23:55.751103 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
May 15 12:23:55.754222 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
May 15 12:23:55.754262 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
May 15 12:23:55.760278 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
May 15 12:23:55.762456 systemd[1]: systemd-network-generator.service: Deactivated successfully.
May 15 12:23:55.762510 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
May 15 12:23:55.762842 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
May 15 12:23:55.762876 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
May 15 12:23:55.763551 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
May 15 12:23:55.763586 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
May 15 12:23:55.763889 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
May 15 12:23:55.763921 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
May 15 12:23:55.764435 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
May 15 12:23:55.764460 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
May 15 12:23:55.765728 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
May 15 12:23:55.765766 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully.
May 15 12:23:55.765792 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
May 15 12:23:55.765819 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
May 15 12:23:55.766034 systemd[1]: network-cleanup.service: Deactivated successfully.
May 15 12:23:55.773431 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
May 15 12:23:55.776257 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
May 15 12:23:55.776335 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
May 15 12:23:56.438744 systemd[1]: sysroot-boot.service: Deactivated successfully.
May 15 12:23:56.438873 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
May 15 12:23:56.440681 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
May 15 12:23:56.441626 systemd[1]: initrd-setup-root.service: Deactivated successfully.
May 15 12:23:56.441681 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
May 15 12:23:56.445046 systemd[1]: Starting initrd-switch-root.service - Switch Root...
May 15 12:23:56.457512 systemd[1]: Switching root.
May 15 12:23:56.502809 systemd-journald[205]: Journal stopped
May 15 12:24:02.008909 systemd-journald[205]: Received SIGTERM from PID 1 (systemd).
May 15 12:24:02.008942 kernel: SELinux: policy capability network_peer_controls=1
May 15 12:24:02.008953 kernel: SELinux: policy capability open_perms=1
May 15 12:24:02.008961 kernel: SELinux: policy capability extended_socket_class=1
May 15 12:24:02.008968 kernel: SELinux: policy capability always_check_network=0
May 15 12:24:02.008975 kernel: SELinux: policy capability cgroup_seclabel=1
May 15 12:24:02.008985 kernel: SELinux: policy capability nnp_nosuid_transition=1
May 15 12:24:02.008992 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
May 15 12:24:02.008999 kernel: SELinux: policy capability ioctl_skip_cloexec=0
May 15 12:24:02.009007 kernel: SELinux: policy capability userspace_initial_context=0
May 15 12:24:02.009014 kernel: audit: type=1403 audit(1747311839.933:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
May 15 12:24:02.009024 systemd[1]: Successfully loaded SELinux policy in 57.539ms.
May 15 12:24:02.009033 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 6.013ms.
May 15 12:24:02.009045 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
May 15 12:24:02.009055 systemd[1]: Detected virtualization microsoft.
May 15 12:24:02.009063 systemd[1]: Detected architecture x86-64.
May 15 12:24:02.009071 systemd[1]: Detected first boot.
May 15 12:24:02.009079 systemd[1]: Hostname set to .
May 15 12:24:02.009089 systemd[1]: Initializing machine ID from random generator.
May 15 12:24:02.009098 zram_generator::config[1180]: No configuration found.
May 15 12:24:02.009107 kernel: Guest personality initialized and is inactive
May 15 12:24:02.009115 kernel: VMCI host device registered (name=vmci, major=10, minor=124)
May 15 12:24:02.009122 kernel: Initialized host personality
May 15 12:24:02.009130 kernel: NET: Registered PF_VSOCK protocol family
May 15 12:24:02.009138 systemd[1]: Populated /etc with preset unit settings.
May 15 12:24:02.009148 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
May 15 12:24:02.009158 systemd[1]: initrd-switch-root.service: Deactivated successfully.
May 15 12:24:02.009167 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
May 15 12:24:02.010813 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
May 15 12:24:02.010828 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
May 15 12:24:02.010838 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
May 15 12:24:02.010847 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
May 15 12:24:02.010859 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
May 15 12:24:02.010868 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
May 15 12:24:02.010878 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
May 15 12:24:02.010887 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
May 15 12:24:02.010896 systemd[1]: Created slice user.slice - User and Session Slice.
May 15 12:24:02.010905 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 15 12:24:02.010914 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 15 12:24:02.010923 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
May 15 12:24:02.010935 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
May 15 12:24:02.010946 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
May 15 12:24:02.010955 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
May 15 12:24:02.010964 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
May 15 12:24:02.010974 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 15 12:24:02.010983 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
May 15 12:24:02.010992 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
May 15 12:24:02.011001 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
May 15 12:24:02.011012 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
May 15 12:24:02.011021 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
May 15 12:24:02.011031 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
May 15 12:24:02.011040 systemd[1]: Reached target remote-fs.target - Remote File Systems.
May 15 12:24:02.011049 systemd[1]: Reached target slices.target - Slice Units.
May 15 12:24:02.011058 systemd[1]: Reached target swap.target - Swaps.
May 15 12:24:02.011067 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
May 15 12:24:02.011076 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
May 15 12:24:02.011087 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
May 15 12:24:02.011096 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
May 15 12:24:02.011105 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
May 15 12:24:02.011114 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
May 15 12:24:02.011123 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
May 15 12:24:02.011134 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
May 15 12:24:02.011143 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
May 15 12:24:02.011152 systemd[1]: Mounting media.mount - External Media Directory...
May 15 12:24:02.011162 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 15 12:24:02.011200 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
May 15 12:24:02.011210 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
May 15 12:24:02.011219 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
May 15 12:24:02.011228 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
May 15 12:24:02.011240 systemd[1]: Reached target machines.target - Containers.
May 15 12:24:02.011249 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
May 15 12:24:02.011258 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 15 12:24:02.011268 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
May 15 12:24:02.011277 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
May 15 12:24:02.011286 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
May 15 12:24:02.011295 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
May 15 12:24:02.011304 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
May 15 12:24:02.011315 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
May 15 12:24:02.011324 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
May 15 12:24:02.011334 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
May 15 12:24:02.011343 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
May 15 12:24:02.011352 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
May 15 12:24:02.011361 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
May 15 12:24:02.011370 systemd[1]: Stopped systemd-fsck-usr.service.
May 15 12:24:02.011380 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 15 12:24:02.011391 systemd[1]: Starting systemd-journald.service - Journal Service...
May 15 12:24:02.011401 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
May 15 12:24:02.011410 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
May 15 12:24:02.011419 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
May 15 12:24:02.011449 systemd-journald[1263]: Collecting audit messages is disabled.
May 15 12:24:02.011471 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
May 15 12:24:02.011481 systemd-journald[1263]: Journal started
May 15 12:24:02.011501 systemd-journald[1263]: Runtime Journal (/run/log/journal/0cd5c904dcee460ab17d2645bf57fbea) is 8M, max 159M, 151M free.
May 15 12:24:02.017215 kernel: loop: module loaded
May 15 12:24:01.658278 systemd[1]: Queued start job for default target multi-user.target.
May 15 12:24:01.665532 systemd[1]: Unnecessary job was removed for dev-nvme0n1p6.device - /dev/nvme0n1p6.
May 15 12:24:01.665830 systemd[1]: systemd-journald.service: Deactivated successfully.
May 15 12:24:02.021189 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
May 15 12:24:02.026267 systemd[1]: verity-setup.service: Deactivated successfully.
May 15 12:24:02.026307 systemd[1]: Stopped verity-setup.service.
May 15 12:24:02.032191 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 15 12:24:02.037318 systemd[1]: Started systemd-journald.service - Journal Service.
May 15 12:24:02.038750 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
May 15 12:24:02.040748 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
May 15 12:24:02.045356 systemd[1]: Mounted media.mount - External Media Directory.
May 15 12:24:02.048282 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
May 15 12:24:02.051275 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
May 15 12:24:02.053262 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
May 15 12:24:02.054476 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
May 15 12:24:02.057390 systemd[1]: modprobe@configfs.service: Deactivated successfully.
May 15 12:24:02.057503 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
May 15 12:24:02.060435 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
May 15 12:24:02.060532 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
May 15 12:24:02.063356 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
May 15 12:24:02.063457 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
May 15 12:24:02.066418 systemd[1]: modprobe@loop.service: Deactivated successfully.
May 15 12:24:02.066526 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
May 15 12:24:02.069381 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
May 15 12:24:02.070900 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
May 15 12:24:02.079157 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
May 15 12:24:02.082246 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
May 15 12:24:02.082282 systemd[1]: Reached target local-fs.target - Local File Systems.
May 15 12:24:02.085281 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
May 15 12:24:02.092274 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
May 15 12:24:02.104183 kernel: fuse: init (API version 7.41)
May 15 12:24:02.143265 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 15 12:24:02.192256 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
May 15 12:24:02.197040 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
May 15 12:24:02.199573 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
May 15 12:24:02.203331 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
May 15 12:24:02.205479 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
May 15 12:24:02.208291 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
May 15 12:24:02.213308 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
May 15 12:24:02.219286 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
May 15 12:24:02.222889 systemd[1]: modprobe@fuse.service: Deactivated successfully.
May 15 12:24:02.223040 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
May 15 12:24:02.225645 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
May 15 12:24:02.228871 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
May 15 12:24:02.233765 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
May 15 12:24:02.237903 systemd[1]: Reached target network-pre.target - Preparation for Network.
May 15 12:24:02.243257 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
May 15 12:24:02.252763 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
May 15 12:24:02.255001 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
May 15 12:24:02.546198 systemd-journald[1263]: Time spent on flushing to /var/log/journal/0cd5c904dcee460ab17d2645bf57fbea is 100.110ms for 984 entries.
May 15 12:24:02.546198 systemd-journald[1263]: System Journal (/var/log/journal/0cd5c904dcee460ab17d2645bf57fbea) is 8M, max 2.6G, 2.6G free.
May 15 12:24:05.886356 systemd-journald[1263]: Received client request to flush runtime journal.
May 15 12:24:05.886424 kernel: ACPI: bus type drm_connector registered
May 15 12:24:05.886448 kernel: loop0: detected capacity change from 0 to 146240
May 15 12:24:05.886464 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
May 15 12:24:05.886480 kernel: loop1: detected capacity change from 0 to 210664
May 15 12:24:02.657186 systemd[1]: modprobe@drm.service: Deactivated successfully.
May 15 12:24:02.657339 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
May 15 12:24:02.685035 systemd-tmpfiles[1309]: ACLs are not supported, ignoring.
May 15 12:24:02.685047 systemd-tmpfiles[1309]: ACLs are not supported, ignoring.
May 15 12:24:02.688156 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
May 15 12:24:02.935316 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
May 15 12:24:03.089794 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
May 15 12:24:03.095419 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
May 15 12:24:03.100295 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
May 15 12:24:03.142429 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
May 15 12:24:03.151291 systemd[1]: Starting systemd-sysusers.service - Create System Users...
May 15 12:24:04.482308 systemd[1]: Finished systemd-sysusers.service - Create System Users.
May 15 12:24:04.485719 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
May 15 12:24:04.508523 systemd-tmpfiles[1336]: ACLs are not supported, ignoring.
May 15 12:24:04.508530 systemd-tmpfiles[1336]: ACLs are not supported, ignoring.
May 15 12:24:04.510757 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
May 15 12:24:05.887929 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
May 15 12:24:07.283210 kernel: loop2: detected capacity change from 0 to 113872
May 15 12:24:08.847584 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
May 15 12:24:08.851524 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
May 15 12:24:08.880512 systemd-udevd[1345]: Using default interface naming scheme 'v255'.
May 15 12:24:09.289046 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
May 15 12:24:09.289745 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
May 15 12:24:09.296201 kernel: loop3: detected capacity change from 0 to 28536
May 15 12:24:09.345964 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
May 15 12:24:09.354293 systemd[1]: Starting systemd-networkd.service - Network Configuration...
May 15 12:24:09.391802 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
May 15 12:24:09.435787 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
May 15 12:24:09.507629 systemd[1]: Started systemd-userdbd.service - User Database Manager.
May 15 12:24:09.539188 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#112 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
May 15 12:24:09.791292 kernel: mousedev: PS/2 mouse device common for all mice
May 15 12:24:09.950467 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 15 12:24:09.991301 kernel: hv_vmbus: registering driver hv_balloon
May 15 12:24:09.992136 kernel: hv_balloon: Using Dynamic Memory protocol version 2.0
May 15 12:24:09.996858 systemd-networkd[1361]: lo: Link UP
May 15 12:24:09.996864 systemd-networkd[1361]: lo: Gained carrier
May 15 12:24:09.998143 systemd-networkd[1361]: Enumeration completed
May 15 12:24:09.998226 systemd[1]: Started systemd-networkd.service - Network Configuration.
May 15 12:24:09.999292 systemd-networkd[1361]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
May 15 12:24:09.999300 systemd-networkd[1361]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
May 15 12:24:09.999631 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
May 15 12:24:10.003276 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
May 15 12:24:10.007189 kernel: mana 7870:00:00.0 enP30832s1: Configured vPort 0 PD 18 DB 16
May 15 12:24:10.013258 kernel: mana 7870:00:00.0 enP30832s1: Configured steering vPort 0 entries 64
May 15 12:24:10.013447 kernel: hv_netvsc f8615163-0000-1000-2000-7c1e52fbc335 eth0: Data path switched to VF: enP30832s1
May 15 12:24:10.014005 systemd-networkd[1361]: enP30832s1: Link UP
May 15 12:24:10.014067 systemd-networkd[1361]: eth0: Link UP
May 15 12:24:10.014070 systemd-networkd[1361]: eth0: Gained carrier
May 15 12:24:10.014083 systemd-networkd[1361]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
May 15 12:24:10.025348 systemd-networkd[1361]: enP30832s1: Gained carrier
May 15 12:24:10.038220 systemd-networkd[1361]: eth0: DHCPv4 address 10.200.8.35/24, gateway 10.200.8.1 acquired from 168.63.129.16
May 15 12:24:10.043195 kernel: hv_vmbus: registering driver hyperv_fb
May 15 12:24:10.047209 kernel: hyperv_fb: Synthvid Version major 3, minor 5
May 15 12:24:10.051187 kernel: hyperv_fb: Screen resolution: 1024x768, Color depth: 32, Frame buffer size: 8388608
May 15 12:24:10.053756 kernel: Console: switching to colour dummy device 80x25
May 15 12:24:10.058022 kernel: Console: switching to colour frame buffer device 128x48
May 15 12:24:10.067424 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
May 15 12:24:10.067566 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
May 15 12:24:10.070771 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
May 15 12:24:10.071937 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 15 12:24:10.143349 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
May 15 12:24:10.404189 kernel: kvm_intel: Using Hyper-V Enlightened VMCS
May 15 12:24:10.534409 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - MSFT NVMe Accelerator v1.0 OEM.
May 15 12:24:10.538578 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
May 15 12:24:10.591884 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
May 15 12:24:10.657190 kernel: loop4: detected capacity change from 0 to 146240
May 15 12:24:10.671208 kernel: loop5: detected capacity change from 0 to 210664
May 15 12:24:11.464392 systemd-networkd[1361]: enP30832s1: Gained IPv6LL
May 15 12:24:11.535208 kernel: loop6: detected capacity change from 0 to 113872
May 15 12:24:11.548201 kernel: loop7: detected capacity change from 0 to 28536
May 15 12:24:11.584397 (sd-merge)[1445]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-azure'.
May 15 12:24:11.584741 (sd-merge)[1445]: Merged extensions into '/usr'.
May 15 12:24:11.589021 systemd[1]: Reload requested from client PID 1308 ('systemd-sysext') (unit systemd-sysext.service)...
May 15 12:24:11.589035 systemd[1]: Reloading...
May 15 12:24:11.648194 zram_generator::config[1479]: No configuration found.
May 15 12:24:11.723754 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 15 12:24:11.804762 systemd[1]: Reloading finished in 215 ms.
May 15 12:24:11.833543 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
May 15 12:24:11.836470 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
May 15 12:24:11.844994 systemd[1]: Starting ensure-sysext.service...
May 15 12:24:11.847306 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
May 15 12:24:11.857115 systemd[1]: Reload requested from client PID 1536 ('systemctl') (unit ensure-sysext.service)...
May 15 12:24:11.857131 systemd[1]: Reloading...
May 15 12:24:11.865560 systemd-tmpfiles[1537]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
May 15 12:24:11.865779 systemd-tmpfiles[1537]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
May 15 12:24:11.866008 systemd-tmpfiles[1537]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
May 15 12:24:11.866245 systemd-tmpfiles[1537]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
May 15 12:24:11.866701 systemd-tmpfiles[1537]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
May 15 12:24:11.866861 systemd-tmpfiles[1537]: ACLs are not supported, ignoring.
May 15 12:24:11.866901 systemd-tmpfiles[1537]: ACLs are not supported, ignoring.
May 15 12:24:11.902266 zram_generator::config[1567]: No configuration found.
May 15 12:24:11.976274 systemd-networkd[1361]: eth0: Gained IPv6LL
May 15 12:24:11.986996 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 15 12:24:12.061202 systemd[1]: Reloading finished in 203 ms.
May 15 12:24:12.078906 systemd-tmpfiles[1537]: Detected autofs mount point /boot during canonicalization of boot.
May 15 12:24:12.078914 systemd-tmpfiles[1537]: Skipping /boot
May 15 12:24:12.081775 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
May 15 12:24:12.088921 systemd-tmpfiles[1537]: Detected autofs mount point /boot during canonicalization of boot.
May 15 12:24:12.089002 systemd-tmpfiles[1537]: Skipping /boot
May 15 12:24:12.148494 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
May 15 12:24:12.150953 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 15 12:24:12.151909 systemd[1]: Starting audit-rules.service - Load Audit Rules...
May 15 12:24:12.189994 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
May 15 12:24:12.192311 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 15 12:24:12.204432 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
May 15 12:24:12.208054 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
May 15 12:24:12.213383 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
May 15 12:24:12.215288 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 15 12:24:12.215408 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 15 12:24:12.216835 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
May 15 12:24:12.230199 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
May 15 12:24:12.234212 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
May 15 12:24:12.236811 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 15 12:24:12.238783 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
May 15 12:24:12.239026 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
May 15 12:24:12.241852 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
May 15 12:24:12.241990 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
May 15 12:24:12.246641 systemd[1]: modprobe@loop.service: Deactivated successfully.
May 15 12:24:12.246768 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
May 15 12:24:12.253105 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 15 12:24:12.253559 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 15 12:24:12.254872 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
May 15 12:24:12.259382 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
May 15 12:24:12.265255 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
May 15 12:24:12.268364 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
May 15 12:24:12.271352 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 15 12:24:12.271460 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 15 12:24:12.271593 systemd[1]: Reached target time-set.target - System Time Set.
May 15 12:24:12.273332 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 15 12:24:12.275855 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
May 15 12:24:12.277375 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
May 15 12:24:12.279990 systemd[1]: modprobe@drm.service: Deactivated successfully.
May 15 12:24:12.280143 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
May 15 12:24:12.283715 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
May 15 12:24:12.283865 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
May 15 12:24:12.286898 systemd[1]: modprobe@loop.service: Deactivated successfully.
May 15 12:24:12.287076 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
May 15 12:24:12.293027 systemd[1]: Finished ensure-sysext.service.
May 15 12:24:12.299146 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
May 15 12:24:12.303861 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
May 15 12:24:12.303914 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
May 15 12:24:12.547113 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
May 15 12:24:12.585109 systemd-resolved[1635]: Positive Trust Anchors:
May 15 12:24:12.585120 systemd-resolved[1635]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
May 15 12:24:12.585152 systemd-resolved[1635]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
May 15 12:24:12.632417 systemd-resolved[1635]: Using system hostname 'ci-4334.0.0-a-9b1bbdffc7'.
May 15 12:24:12.633733 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
May 15 12:24:12.635075 systemd[1]: Reached target network.target - Network.
May 15 12:24:12.636257 systemd[1]: Reached target network-online.target - Network is Online.
May 15 12:24:12.637439 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
May 15 12:24:12.840642 augenrules[1671]: No rules
May 15 12:24:12.841395 systemd[1]: audit-rules.service: Deactivated successfully.
May 15 12:24:12.841563 systemd[1]: Finished audit-rules.service - Load Audit Rules.
May 15 12:24:13.229303 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
May 15 12:24:13.232395 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
May 15 12:24:16.741494 ldconfig[1296]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
May 15 12:24:16.752690 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
May 15 12:24:16.755469 systemd[1]: Starting systemd-update-done.service - Update is Completed...
May 15 12:24:16.774389 systemd[1]: Finished systemd-update-done.service - Update is Completed.
May 15 12:24:16.775885 systemd[1]: Reached target sysinit.target - System Initialization.
May 15 12:24:16.779386 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
May 15 12:24:16.782240 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
May 15 12:24:16.785212 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer.
May 15 12:24:16.786789 systemd[1]: Started logrotate.timer - Daily rotation of log files.
May 15 12:24:16.789263 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
May 15 12:24:16.790841 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
May 15 12:24:16.792438 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
May 15 12:24:16.792460 systemd[1]: Reached target paths.target - Path Units.
May 15 12:24:16.793525 systemd[1]: Reached target timers.target - Timer Units.
May 15 12:24:16.944377 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
May 15 12:24:16.947944 systemd[1]: Starting docker.socket - Docker Socket for the API...
May 15 12:24:16.951033 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
May 15 12:24:16.952740 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
May 15 12:24:16.956249 systemd[1]: Reached target ssh-access.target - SSH Access Available.
May 15 12:24:16.965559 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
May 15 12:24:16.967166 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
May 15 12:24:16.971645 systemd[1]: Listening on docker.socket - Docker Socket for the API.
May 15 12:24:16.974845 systemd[1]: Reached target sockets.target - Socket Units.
May 15 12:24:16.976124 systemd[1]: Reached target basic.target - Basic System.
May 15 12:24:16.978258 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
May 15 12:24:16.978279 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
May 15 12:24:16.980331 systemd[1]: Starting chronyd.service - NTP client/server...
May 15 12:24:16.983819 systemd[1]: Starting containerd.service - containerd container runtime...
May 15 12:24:16.988935 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
May 15 12:24:16.993374 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
May 15 12:24:16.997273 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
May 15 12:24:17.003005 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
May 15 12:24:17.007993 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
May 15 12:24:17.010419 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
May 15 12:24:17.011601 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh...
May 15 12:24:17.016313 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 15 12:24:17.022585 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
May 15 12:24:17.026374 jq[1691]: false
May 15 12:24:17.025563 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
May 15 12:24:17.030620 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
May 15 12:24:17.039381 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
May 15 12:24:17.042394 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
May 15 12:24:17.052283 systemd[1]: Starting systemd-logind.service - User Login Management...
May 15 12:24:17.054950 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
May 15 12:24:17.055339 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
May 15 12:24:17.058918 extend-filesystems[1692]: Found loop4
May 15 12:24:17.059238 systemd[1]: Starting update-engine.service - Update Engine...
May 15 12:24:17.063316 extend-filesystems[1692]: Found loop5
May 15 12:24:17.063316 extend-filesystems[1692]: Found loop6
May 15 12:24:17.063316 extend-filesystems[1692]: Found loop7
May 15 12:24:17.063316 extend-filesystems[1692]: Found sr0
May 15 12:24:17.063316 extend-filesystems[1692]: Found nvme0n1
May 15 12:24:17.063316 extend-filesystems[1692]: Found nvme0n1p1
May 15 12:24:17.063316 extend-filesystems[1692]: Found nvme0n1p2
May 15 12:24:17.063316 extend-filesystems[1692]: Found nvme0n1p3
May 15 12:24:17.063316 extend-filesystems[1692]: Found usr
May 15 12:24:17.063316 extend-filesystems[1692]: Found nvme0n1p4
May 15 12:24:17.063316 extend-filesystems[1692]: Found nvme0n1p6
May 15 12:24:17.063316 extend-filesystems[1692]: Found nvme0n1p7
May 15 12:24:17.063316 extend-filesystems[1692]: Found nvme0n1p9
May 15 12:24:17.063316 extend-filesystems[1692]: Checking size of /dev/nvme0n1p9
May 15 12:24:17.067733 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
May 15 12:24:17.098117 oslogin_cache_refresh[1693]: Refreshing passwd entry cache
May 15 12:24:17.110616 google_oslogin_nss_cache[1693]: oslogin_cache_refresh[1693]: Refreshing passwd entry cache
May 15 12:24:17.081276 (chronyd)[1683]: chronyd.service: Referenced but unset environment variable evaluates to an empty string: OPTIONS
May 15 12:24:17.088468 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
May 15 12:24:17.090955 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
May 15 12:24:17.091105 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
May 15 12:24:17.093593 systemd[1]: motdgen.service: Deactivated successfully.
May 15 12:24:17.093745 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
May 15 12:24:17.101513 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
May 15 12:24:17.102226 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
May 15 12:24:17.112806 chronyd[1726]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +ASYNCDNS +NTS +SECHASH +IPV6 -DEBUG)
May 15 12:24:17.116727 chronyd[1726]: Timezone right/UTC failed leap second check, ignoring
May 15 12:24:17.117453 jq[1711]: true
May 15 12:24:17.116845 chronyd[1726]: Loaded seccomp filter (level 2)
May 15 12:24:17.117411 systemd[1]: Started chronyd.service - NTP client/server.
May 15 12:24:17.121935 google_oslogin_nss_cache[1693]: oslogin_cache_refresh[1693]: Failure getting users, quitting
May 15 12:24:17.121935 google_oslogin_nss_cache[1693]: oslogin_cache_refresh[1693]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
May 15 12:24:17.121935 google_oslogin_nss_cache[1693]: oslogin_cache_refresh[1693]: Refreshing group entry cache
May 15 12:24:17.119577 oslogin_cache_refresh[1693]: Failure getting users, quitting
May 15 12:24:17.119591 oslogin_cache_refresh[1693]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
May 15 12:24:17.119623 oslogin_cache_refresh[1693]: Refreshing group entry cache
May 15 12:24:17.124818 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
May 15 12:24:17.130348 google_oslogin_nss_cache[1693]: oslogin_cache_refresh[1693]: Failure getting groups, quitting
May 15 12:24:17.130414 oslogin_cache_refresh[1693]: Failure getting groups, quitting
May 15 12:24:17.130455 google_oslogin_nss_cache[1693]: oslogin_cache_refresh[1693]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
May 15 12:24:17.130481 oslogin_cache_refresh[1693]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
May 15 12:24:17.133198 extend-filesystems[1692]: Old size kept for /dev/nvme0n1p9
May 15 12:24:17.133947 systemd[1]: extend-filesystems.service: Deactivated successfully.
May 15 12:24:17.134108 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
May 15 12:24:17.136316 systemd[1]: google-oslogin-cache.service: Deactivated successfully.
May 15 12:24:17.136471 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh.
May 15 12:24:17.148437 (ntainerd)[1734]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
May 15 12:24:17.156690 jq[1732]: true
May 15 12:24:17.640137 tar[1718]: linux-amd64/helm
May 15 12:24:17.706160 update_engine[1704]: I20250515 12:24:17.706084 1704 main.cc:92] Flatcar Update Engine starting
May 15 12:24:17.733580 systemd-logind[1702]: New seat seat0.
May 15 12:24:17.738929 systemd-logind[1702]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
May 15 12:24:17.739074 systemd[1]: Started systemd-logind.service - User Login Management.
May 15 12:24:17.812833 dbus-daemon[1686]: [system] SELinux support is enabled
May 15 12:24:17.812951 systemd[1]: Started dbus.service - D-Bus System Message Bus.
May 15 12:24:17.816966 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
May 15 12:24:17.816988 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
May 15 12:24:17.822286 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
May 15 12:24:17.825536 update_engine[1704]: I20250515 12:24:17.822321 1704 update_check_scheduler.cc:74] Next update check in 8m53s
May 15 12:24:17.822336 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
May 15 12:24:17.827604 systemd[1]: Started update-engine.service - Update Engine.
May 15 12:24:17.831388 dbus-daemon[1686]: [system] Successfully activated service 'org.freedesktop.systemd1'
May 15 12:24:17.835744 systemd[1]: Started locksmithd.service - Cluster reboot manager.
May 15 12:24:18.038271 coreos-metadata[1685]: May 15 12:24:18.038 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1
May 15 12:24:18.081351 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
May 15 12:24:18.439436 coreos-metadata[1685]: May 15 12:24:18.043 INFO Fetch successful
May 15 12:24:18.439436 coreos-metadata[1685]: May 15 12:24:18.043 INFO Fetching http://168.63.129.16/machine/?comp=goalstate: Attempt #1
May 15 12:24:18.439436 coreos-metadata[1685]: May 15 12:24:18.047 INFO Fetch successful
May 15 12:24:18.439436 coreos-metadata[1685]: May 15 12:24:18.047 INFO Fetching http://168.63.129.16/machine/7bfee1d0-d9dc-47bc-9ac4-cbfbd51a4d55/3e0d24e3%2D9c41%2D4c2a%2Dbaae%2D68fdde544e53.%5Fci%2D4334.0.0%2Da%2D9b1bbdffc7?comp=config&type=sharedConfig&incarnation=1: Attempt #1
May 15 12:24:18.439436 coreos-metadata[1685]: May 15 12:24:18.049 INFO Fetch successful
May 15 12:24:18.439436 coreos-metadata[1685]: May 15 12:24:18.049 INFO Fetching http://169.254.169.254/metadata/instance/compute/vmSize?api-version=2017-08-01&format=text: Attempt #1
May 15 12:24:18.439436 coreos-metadata[1685]: May 15 12:24:18.059 INFO Fetch successful
May 15 12:24:18.084472 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
May 15 12:24:18.600349 tar[1718]: linux-amd64/LICENSE
May 15 12:24:18.600349 tar[1718]: linux-amd64/README.md
May 15 12:24:18.611383 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
May 15 12:24:18.641259 sshd_keygen[1730]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
May 15 12:24:18.660201 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
May 15 12:24:18.664932 systemd[1]: Starting issuegen.service - Generate /run/issue...
May 15 12:24:18.669325 systemd[1]: Starting waagent.service - Microsoft Azure Linux Agent...
May 15 12:24:18.681932 systemd[1]: issuegen.service: Deactivated successfully.
May 15 12:24:18.682154 systemd[1]: Finished issuegen.service - Generate /run/issue.
May 15 12:24:18.686551 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
May 15 12:24:18.793397 locksmithd[1785]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
May 15 12:24:18.842408 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
May 15 12:24:18.844757 systemd[1]: Started getty@tty1.service - Getty on tty1.
May 15 12:24:18.848443 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
May 15 12:24:18.851025 systemd[1]: Reached target getty.target - Login Prompts.
May 15 12:24:19.041550 systemd[1]: Started waagent.service - Microsoft Azure Linux Agent.
May 15 12:24:19.091343 bash[1771]: Updated "/home/core/.ssh/authorized_keys"
May 15 12:24:19.091843 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
May 15 12:24:19.095890 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
May 15 12:24:19.345870 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 15 12:24:19.349002 (kubelet)[1832]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 15 12:24:19.852120 kubelet[1832]: E0515 12:24:19.852088 1832 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 15 12:24:19.853608 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 15 12:24:19.853723 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 15 12:24:19.853993 systemd[1]: kubelet.service: Consumed 787ms CPU time, 241.1M memory peak.
May 15 12:24:20.192581 containerd[1734]: time="2025-05-15T12:24:20Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
May 15 12:24:20.194348 containerd[1734]: time="2025-05-15T12:24:20.193223578Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4
May 15 12:24:20.199949 containerd[1734]: time="2025-05-15T12:24:20.199908980Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="10.307µs"
May 15 12:24:20.199949 containerd[1734]: time="2025-05-15T12:24:20.199947165Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
May 15 12:24:20.200021 containerd[1734]: time="2025-05-15T12:24:20.199963807Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
May 15 12:24:20.200098 containerd[1734]: time="2025-05-15T12:24:20.200083815Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
May 15 12:24:20.200119 containerd[1734]: time="2025-05-15T12:24:20.200102593Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
May 15 12:24:20.200140 containerd[1734]: time="2025-05-15T12:24:20.200120709Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
May 15 12:24:20.200699 containerd[1734]: time="2025-05-15T12:24:20.200164182Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
May 15 12:24:20.200699 containerd[1734]: time="2025-05-15T12:24:20.200197925Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
May 15 12:24:20.200699 containerd[1734]: time="2025-05-15T12:24:20.200397664Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
May 15 12:24:20.200699 containerd[1734]: time="2025-05-15T12:24:20.200409328Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
May 15 12:24:20.200699 containerd[1734]: time="2025-05-15T12:24:20.200418260Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
May 15 12:24:20.200699 containerd[1734]: time="2025-05-15T12:24:20.200425935Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
May 15 12:24:20.200699 containerd[1734]: time="2025-05-15T12:24:20.200483655Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
May 15 12:24:20.200699 containerd[1734]: time="2025-05-15T12:24:20.200645089Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
May 15 12:24:20.200699 containerd[1734]: time="2025-05-15T12:24:20.200666697Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
May 15 12:24:20.200699 containerd[1734]: time="2025-05-15T12:24:20.200675109Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
May 15 12:24:20.200955 containerd[1734]: time="2025-05-15T12:24:20.200943047Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
May 15 12:24:20.201307 containerd[1734]: time="2025-05-15T12:24:20.201272492Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
May 15 12:24:20.201366 containerd[1734]: time="2025-05-15T12:24:20.201353937Z" level=info msg="metadata content store policy set" policy=shared
May 15 12:24:20.278926 containerd[1734]: time="2025-05-15T12:24:20.278900240Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
May 15 12:24:20.279020 containerd[1734]: time="2025-05-15T12:24:20.279010799Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
May 15 12:24:20.279121 containerd[1734]: time="2025-05-15T12:24:20.279104016Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
May 15 12:24:20.279146 containerd[1734]: time="2025-05-15T12:24:20.279134091Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
May 15 12:24:20.279168 containerd[1734]: time="2025-05-15T12:24:20.279146522Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
May 15 12:24:20.279168 containerd[1734]: time="2025-05-15T12:24:20.279156688Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
May 15 12:24:20.279227 containerd[1734]: time="2025-05-15T12:24:20.279166659Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
May 15 12:24:20.279227 containerd[1734]: time="2025-05-15T12:24:20.279206013Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
May 15 12:24:20.279227 containerd[1734]: time="2025-05-15T12:24:20.279216625Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
May 15 12:24:20.279227 containerd[1734]: time="2025-05-15T12:24:20.279225436Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
May 15 12:24:20.279294 containerd[1734]: time="2025-05-15T12:24:20.279238349Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
May 15 12:24:20.279294 containerd[1734]: time="2025-05-15T12:24:20.279249555Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
May 15 12:24:20.279417 containerd[1734]: time="2025-05-15T12:24:20.279368190Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
May 15 12:24:20.279417 containerd[1734]: time="2025-05-15T12:24:20.279390851Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
May 15 12:24:20.279558 containerd[1734]: time="2025-05-15T12:24:20.279404945Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
May 15 12:24:20.279558 containerd[1734]: time="2025-05-15T12:24:20.279495068Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
May 15 12:24:20.279558 containerd[1734]: time="2025-05-15T12:24:20.279505530Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
May 15 12:24:20.279558 containerd[1734]: time="2025-05-15T12:24:20.279515877Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
May 15 12:24:20.279558 containerd[1734]: time="2025-05-15T12:24:20.279530107Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
May 15 12:24:20.279558 containerd[1734]: time="2025-05-15T12:24:20.279541095Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
May 15 12:24:20.279777 containerd[1734]: time="2025-05-15T12:24:20.279712510Z" level=info msg="loading plugin" 
id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 May 15 12:24:20.279777 containerd[1734]: time="2025-05-15T12:24:20.279727167Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 May 15 12:24:20.279777 containerd[1734]: time="2025-05-15T12:24:20.279742412Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 May 15 12:24:20.279860 containerd[1734]: time="2025-05-15T12:24:20.279846458Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" May 15 12:24:20.279908 containerd[1734]: time="2025-05-15T12:24:20.279901783Z" level=info msg="Start snapshots syncer" May 15 12:24:20.279966 containerd[1734]: time="2025-05-15T12:24:20.279958961Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 May 15 12:24:20.280464 containerd[1734]: time="2025-05-15T12:24:20.280420656Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" May 15 12:24:20.280696 containerd[1734]: time="2025-05-15T12:24:20.280671199Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 May 15 12:24:20.280785 containerd[1734]: time="2025-05-15T12:24:20.280773350Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 May 15 12:24:20.280885 containerd[1734]: time="2025-05-15T12:24:20.280872888Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 May 15 12:24:20.280912 containerd[1734]: time="2025-05-15T12:24:20.280901554Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 May 15 12:24:20.280930 containerd[1734]: time="2025-05-15T12:24:20.280916757Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 May 15 12:24:20.280948 containerd[1734]: time="2025-05-15T12:24:20.280930086Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 May 15 12:24:20.280966 containerd[1734]: time="2025-05-15T12:24:20.280945544Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 May 15 12:24:20.280966 containerd[1734]: time="2025-05-15T12:24:20.280958717Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 May 15 12:24:20.281005 containerd[1734]: time="2025-05-15T12:24:20.280968933Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 May 15 12:24:20.281005 containerd[1734]: time="2025-05-15T12:24:20.280995802Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 May 15 12:24:20.281040 containerd[1734]: time="2025-05-15T12:24:20.281010737Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 May 15 12:24:20.281040 containerd[1734]: time="2025-05-15T12:24:20.281024203Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 May 15 12:24:20.281076 containerd[1734]: time="2025-05-15T12:24:20.281059889Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 15 12:24:20.281095 containerd[1734]: time="2025-05-15T12:24:20.281072662Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 15 12:24:20.281095 containerd[1734]: time="2025-05-15T12:24:20.281084533Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 15 12:24:20.281137 containerd[1734]: time="2025-05-15T12:24:20.281096893Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 15 12:24:20.281137 containerd[1734]: time="2025-05-15T12:24:20.281104767Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 May 15 12:24:20.281137 containerd[1734]: time="2025-05-15T12:24:20.281115909Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 May 15 12:24:20.281137 containerd[1734]: time="2025-05-15T12:24:20.281127760Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 May 15 12:24:20.281226 containerd[1734]: time="2025-05-15T12:24:20.281142358Z" level=info msg="runtime interface created" May 15 12:24:20.281226 containerd[1734]: time="2025-05-15T12:24:20.281150047Z" level=info msg="created NRI interface" May 15 12:24:20.281226 containerd[1734]: time="2025-05-15T12:24:20.281157374Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 May 15 12:24:20.281226 containerd[1734]: time="2025-05-15T12:24:20.281190751Z" level=info msg="Connect containerd service" May 15 12:24:20.281291 containerd[1734]: time="2025-05-15T12:24:20.281227249Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" May 15 12:24:20.281908 
containerd[1734]: time="2025-05-15T12:24:20.281848503Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 15 12:24:21.537999 containerd[1734]: time="2025-05-15T12:24:21.537958770Z" level=info msg="Start subscribing containerd event" May 15 12:24:21.538308 containerd[1734]: time="2025-05-15T12:24:21.538212398Z" level=info msg="Start recovering state" May 15 12:24:21.538346 containerd[1734]: time="2025-05-15T12:24:21.538311635Z" level=info msg="Start event monitor" May 15 12:24:21.538346 containerd[1734]: time="2025-05-15T12:24:21.538326117Z" level=info msg="Start cni network conf syncer for default" May 15 12:24:21.538346 containerd[1734]: time="2025-05-15T12:24:21.538332869Z" level=info msg="Start streaming server" May 15 12:24:21.538346 containerd[1734]: time="2025-05-15T12:24:21.538341299Z" level=info msg="Registered namespace \"k8s.io\" with NRI" May 15 12:24:21.538420 containerd[1734]: time="2025-05-15T12:24:21.538348386Z" level=info msg="runtime interface starting up..." May 15 12:24:21.538420 containerd[1734]: time="2025-05-15T12:24:21.538354914Z" level=info msg="starting plugins..." May 15 12:24:21.538420 containerd[1734]: time="2025-05-15T12:24:21.538366022Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" May 15 12:24:21.538474 containerd[1734]: time="2025-05-15T12:24:21.538145393Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc May 15 12:24:21.538474 containerd[1734]: time="2025-05-15T12:24:21.538468530Z" level=info msg=serving... address=/run/containerd/containerd.sock May 15 12:24:21.538857 containerd[1734]: time="2025-05-15T12:24:21.538536723Z" level=info msg="containerd successfully booted in 1.346445s" May 15 12:24:21.538579 systemd[1]: Started containerd.service - containerd container runtime. 
May 15 12:24:21.540501 systemd[1]: Reached target multi-user.target - Multi-User System. May 15 12:24:21.542316 systemd[1]: Startup finished in 2.880s (kernel) + 20.831s (initrd) + 21.665s (userspace) = 45.377s. May 15 12:24:21.799231 login[1814]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) May 15 12:24:21.802129 login[1815]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) May 15 12:24:21.809372 systemd[1]: Created slice user-500.slice - User Slice of UID 500. May 15 12:24:21.812097 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... May 15 12:24:21.818415 systemd-logind[1702]: New session 1 of user core. May 15 12:24:21.821987 systemd-logind[1702]: New session 2 of user core. May 15 12:24:21.831100 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. May 15 12:24:21.832649 systemd[1]: Starting user@500.service - User Manager for UID 500... May 15 12:24:21.841100 (systemd)[1861]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) May 15 12:24:21.842755 systemd-logind[1702]: New session c1 of user core. 
May 15 12:24:21.935189 waagent[1819]: 2025-05-15T12:24:21.934312Z INFO Daemon Daemon Azure Linux Agent Version: 2.12.0.4 May 15 12:24:21.935520 waagent[1819]: 2025-05-15T12:24:21.935482Z INFO Daemon Daemon OS: flatcar 4334.0.0 May 15 12:24:21.936479 waagent[1819]: 2025-05-15T12:24:21.936438Z INFO Daemon Daemon Python: 3.11.12 May 15 12:24:21.940239 waagent[1819]: 2025-05-15T12:24:21.938997Z INFO Daemon Daemon Run daemon May 15 12:24:21.940785 waagent[1819]: 2025-05-15T12:24:21.940752Z INFO Daemon Daemon No RDMA handler exists for distro='Flatcar Container Linux by Kinvolk' version='4334.0.0' May 15 12:24:21.944358 waagent[1819]: 2025-05-15T12:24:21.944280Z INFO Daemon Daemon Using waagent for provisioning May 15 12:24:21.946522 waagent[1819]: 2025-05-15T12:24:21.946486Z INFO Daemon Daemon Activate resource disk May 15 12:24:21.948018 waagent[1819]: 2025-05-15T12:24:21.947991Z INFO Daemon Daemon Searching gen1 prefix 00000000-0001 or gen2 f8b3781a-1e82-4818-a1c3-63d806ec15bb May 15 12:24:21.952334 waagent[1819]: 2025-05-15T12:24:21.952302Z INFO Daemon Daemon Found device: None May 15 12:24:21.954012 waagent[1819]: 2025-05-15T12:24:21.953979Z ERROR Daemon Daemon Failed to mount resource disk [ResourceDiskError] unable to detect disk topology May 15 12:24:21.957184 waagent[1819]: 2025-05-15T12:24:21.957141Z ERROR Daemon Daemon Event: name=WALinuxAgent, op=ActivateResourceDisk, message=[ResourceDiskError] unable to detect disk topology, duration=0 May 15 12:24:21.961430 waagent[1819]: 2025-05-15T12:24:21.961399Z INFO Daemon Daemon Clean protocol and wireserver endpoint May 15 12:24:21.963337 waagent[1819]: 2025-05-15T12:24:21.963311Z INFO Daemon Daemon Running default provisioning handler May 15 12:24:21.971210 waagent[1819]: 2025-05-15T12:24:21.970477Z INFO Daemon Daemon Unable to get cloud-init enabled status from systemctl: Command '['systemctl', 'is-enabled', 'cloud-init-local.service']' returned non-zero exit status 4. 
May 15 12:24:21.971732 waagent[1819]: 2025-05-15T12:24:21.971701Z INFO Daemon Daemon Unable to get cloud-init enabled status from service: [Errno 2] No such file or directory: 'service' May 15 12:24:21.972040 waagent[1819]: 2025-05-15T12:24:21.972021Z INFO Daemon Daemon cloud-init is enabled: False May 15 12:24:21.972327 waagent[1819]: 2025-05-15T12:24:21.972312Z INFO Daemon Daemon Copying ovf-env.xml May 15 12:24:21.994945 systemd[1861]: Queued start job for default target default.target. May 15 12:24:22.001754 systemd[1861]: Created slice app.slice - User Application Slice. May 15 12:24:22.001780 systemd[1861]: Reached target paths.target - Paths. May 15 12:24:22.001805 systemd[1861]: Reached target timers.target - Timers. May 15 12:24:22.002597 systemd[1861]: Starting dbus.socket - D-Bus User Message Bus Socket... May 15 12:24:22.009428 systemd[1861]: Listening on dbus.socket - D-Bus User Message Bus Socket. May 15 12:24:22.009472 systemd[1861]: Reached target sockets.target - Sockets. May 15 12:24:22.009502 systemd[1861]: Reached target basic.target - Basic System. May 15 12:24:22.009532 systemd[1861]: Reached target default.target - Main User Target. May 15 12:24:22.009551 systemd[1861]: Startup finished in 162ms. May 15 12:24:22.009687 systemd[1]: Started user@500.service - User Manager for UID 500. May 15 12:24:22.020283 systemd[1]: Started session-1.scope - Session 1 of User core. May 15 12:24:22.021027 systemd[1]: Started session-2.scope - Session 2 of User core. May 15 12:24:22.154549 waagent[1819]: 2025-05-15T12:24:22.151528Z INFO Daemon Daemon Successfully mounted dvd May 15 12:24:22.161862 systemd[1]: mnt-cdrom-secure.mount: Deactivated successfully. 
May 15 12:24:22.163194 waagent[1819]: 2025-05-15T12:24:22.163139Z INFO Daemon Daemon Detect protocol endpoint May 15 12:24:22.170590 waagent[1819]: 2025-05-15T12:24:22.163313Z INFO Daemon Daemon Clean protocol and wireserver endpoint May 15 12:24:22.170590 waagent[1819]: 2025-05-15T12:24:22.163553Z INFO Daemon Daemon WireServer endpoint is not found. Rerun dhcp handler May 15 12:24:22.170590 waagent[1819]: 2025-05-15T12:24:22.163797Z INFO Daemon Daemon Test for route to 168.63.129.16 May 15 12:24:22.170590 waagent[1819]: 2025-05-15T12:24:22.163922Z INFO Daemon Daemon Route to 168.63.129.16 exists May 15 12:24:22.170590 waagent[1819]: 2025-05-15T12:24:22.164119Z INFO Daemon Daemon Wire server endpoint:168.63.129.16 May 15 12:24:22.176226 waagent[1819]: 2025-05-15T12:24:22.176195Z INFO Daemon Daemon Fabric preferred wire protocol version:2015-04-05 May 15 12:24:22.177656 waagent[1819]: 2025-05-15T12:24:22.176434Z INFO Daemon Daemon Wire protocol version:2012-11-30 May 15 12:24:22.177656 waagent[1819]: 2025-05-15T12:24:22.176552Z INFO Daemon Daemon Server preferred version:2015-04-05 May 15 12:24:22.223268 waagent[1819]: 2025-05-15T12:24:22.223227Z INFO Daemon Daemon Initializing goal state during protocol detection May 15 12:24:22.223681 waagent[1819]: 2025-05-15T12:24:22.223382Z INFO Daemon Daemon Forcing an update of the goal state. 
May 15 12:24:22.229600 waagent[1819]: 2025-05-15T12:24:22.229574Z INFO Daemon Fetched a new incarnation for the WireServer goal state [incarnation 1] May 15 12:24:22.241386 waagent[1819]: 2025-05-15T12:24:22.241358Z INFO Daemon Daemon HostGAPlugin version: 1.0.8.164 May 15 12:24:22.242115 waagent[1819]: 2025-05-15T12:24:22.241794Z INFO Daemon May 15 12:24:22.242115 waagent[1819]: 2025-05-15T12:24:22.241925Z INFO Daemon Fetched new vmSettings [HostGAPlugin correlation ID: f1f4993e-665d-4f91-bb90-9780c2632cf3 eTag: 6365235488815560214 source: Fabric] May 15 12:24:22.242115 waagent[1819]: 2025-05-15T12:24:22.242140Z INFO Daemon The vmSettings originated via Fabric; will ignore them. May 15 12:24:22.242115 waagent[1819]: 2025-05-15T12:24:22.242573Z INFO Daemon May 15 12:24:22.242115 waagent[1819]: 2025-05-15T12:24:22.242740Z INFO Daemon Fetching full goal state from the WireServer [incarnation 1] May 15 12:24:22.250307 waagent[1819]: 2025-05-15T12:24:22.250275Z INFO Daemon Daemon Downloading artifacts profile blob May 15 12:24:22.329569 waagent[1819]: 2025-05-15T12:24:22.329528Z INFO Daemon Downloaded certificate {'thumbprint': 'C36DAA638ED8BF81B9D42C2929C346D9B33DC132', 'hasPrivateKey': True} May 15 12:24:22.332194 waagent[1819]: 2025-05-15T12:24:22.329905Z INFO Daemon Fetch goal state completed May 15 12:24:22.336854 waagent[1819]: 2025-05-15T12:24:22.336806Z INFO Daemon Daemon Starting provisioning May 15 12:24:22.337090 waagent[1819]: 2025-05-15T12:24:22.336935Z INFO Daemon Daemon Handle ovf-env.xml. 
May 15 12:24:22.337090 waagent[1819]: 2025-05-15T12:24:22.337091Z INFO Daemon Daemon Set hostname [ci-4334.0.0-a-9b1bbdffc7] May 15 12:24:22.828364 waagent[1819]: 2025-05-15T12:24:22.828297Z INFO Daemon Daemon Publish hostname [ci-4334.0.0-a-9b1bbdffc7] May 15 12:24:22.831247 waagent[1819]: 2025-05-15T12:24:22.831210Z INFO Daemon Daemon Examine /proc/net/route for primary interface May 15 12:24:22.832774 waagent[1819]: 2025-05-15T12:24:22.832744Z INFO Daemon Daemon Primary interface is [eth0] May 15 12:24:22.839396 systemd-networkd[1361]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 15 12:24:22.839403 systemd-networkd[1361]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 15 12:24:22.839428 systemd-networkd[1361]: eth0: DHCP lease lost May 15 12:24:22.840221 waagent[1819]: 2025-05-15T12:24:22.840150Z INFO Daemon Daemon Create user account if not exists May 15 12:24:22.840902 waagent[1819]: 2025-05-15T12:24:22.840380Z INFO Daemon Daemon User core already exists, skip useradd May 15 12:24:22.840902 waagent[1819]: 2025-05-15T12:24:22.840570Z INFO Daemon Daemon Configure sudoer May 15 12:24:22.858210 systemd-networkd[1361]: eth0: DHCPv4 address 10.200.8.35/24, gateway 10.200.8.1 acquired from 168.63.129.16 May 15 12:24:22.891374 waagent[1819]: 2025-05-15T12:24:22.891322Z INFO Daemon Daemon Configure sshd May 15 12:24:22.896092 waagent[1819]: 2025-05-15T12:24:22.896016Z INFO Daemon Daemon Added a configuration snippet disabling SSH password-based authentication methods. It also configures SSH client probing to keep connections alive. May 15 12:24:22.898150 waagent[1819]: 2025-05-15T12:24:22.896459Z INFO Daemon Daemon Deploy ssh public key. 
May 15 12:24:23.991011 waagent[1819]: 2025-05-15T12:24:23.990938Z INFO Daemon Daemon Provisioning complete May 15 12:24:24.003845 waagent[1819]: 2025-05-15T12:24:24.003816Z INFO Daemon Daemon RDMA capabilities are not enabled, skipping May 15 12:24:24.004615 waagent[1819]: 2025-05-15T12:24:24.004001Z INFO Daemon Daemon End of log to /dev/console. The agent will now check for updates and then will process extensions. May 15 12:24:24.004615 waagent[1819]: 2025-05-15T12:24:24.004168Z INFO Daemon Daemon Installed Agent WALinuxAgent-2.12.0.4 is the most current agent May 15 12:24:24.092371 waagent[1914]: 2025-05-15T12:24:24.092318Z INFO ExtHandler ExtHandler Azure Linux Agent (Goal State Agent version 2.12.0.4) May 15 12:24:24.092584 waagent[1914]: 2025-05-15T12:24:24.092400Z INFO ExtHandler ExtHandler OS: flatcar 4334.0.0 May 15 12:24:24.092584 waagent[1914]: 2025-05-15T12:24:24.092436Z INFO ExtHandler ExtHandler Python: 3.11.12 May 15 12:24:24.092584 waagent[1914]: 2025-05-15T12:24:24.092472Z INFO ExtHandler ExtHandler CPU Arch: x86_64 May 15 12:24:24.492982 waagent[1914]: 2025-05-15T12:24:24.492938Z INFO ExtHandler ExtHandler Distro: flatcar-4334.0.0; OSUtil: FlatcarUtil; AgentService: waagent; Python: 3.11.12; Arch: x86_64; systemd: True; LISDrivers: Absent; logrotate: logrotate 3.22.0; May 15 12:24:24.493117 waagent[1914]: 2025-05-15T12:24:24.493094Z INFO ExtHandler ExtHandler WireServer endpoint 168.63.129.16 read from file May 15 12:24:24.493184 waagent[1914]: 2025-05-15T12:24:24.493143Z INFO ExtHandler ExtHandler Wire server endpoint:168.63.129.16 May 15 12:24:24.500913 waagent[1914]: 2025-05-15T12:24:24.500865Z INFO ExtHandler Fetched a new incarnation for the WireServer goal state [incarnation 1] May 15 12:24:24.513575 waagent[1914]: 2025-05-15T12:24:24.513548Z INFO ExtHandler ExtHandler HostGAPlugin version: 1.0.8.164 May 15 12:24:24.513873 waagent[1914]: 2025-05-15T12:24:24.513849Z INFO ExtHandler May 15 12:24:24.513910 waagent[1914]: 
2025-05-15T12:24:24.513895Z INFO ExtHandler Fetched new vmSettings [HostGAPlugin correlation ID: 437433df-628f-4b50-b143-9cda9714c9d3 eTag: 6365235488815560214 source: Fabric] May 15 12:24:24.514091 waagent[1914]: 2025-05-15T12:24:24.514069Z INFO ExtHandler The vmSettings originated via Fabric; will ignore them. May 15 12:24:24.514390 waagent[1914]: 2025-05-15T12:24:24.514367Z INFO ExtHandler May 15 12:24:24.514422 waagent[1914]: 2025-05-15T12:24:24.514403Z INFO ExtHandler Fetching full goal state from the WireServer [incarnation 1] May 15 12:24:24.524467 waagent[1914]: 2025-05-15T12:24:24.524444Z INFO ExtHandler ExtHandler Downloading artifacts profile blob May 15 12:24:24.744269 waagent[1914]: 2025-05-15T12:24:24.744141Z INFO ExtHandler Downloaded certificate {'thumbprint': 'C36DAA638ED8BF81B9D42C2929C346D9B33DC132', 'hasPrivateKey': True} May 15 12:24:24.744688 waagent[1914]: 2025-05-15T12:24:24.744650Z INFO ExtHandler Fetch goal state completed May 15 12:24:24.762907 waagent[1914]: 2025-05-15T12:24:24.762867Z INFO ExtHandler ExtHandler OpenSSL version: OpenSSL 3.3.3 11 Feb 2025 (Library: OpenSSL 3.3.3 11 Feb 2025) May 15 12:24:24.766659 waagent[1914]: 2025-05-15T12:24:24.766617Z INFO ExtHandler ExtHandler WALinuxAgent-2.12.0.4 running as process 1914 May 15 12:24:24.766759 waagent[1914]: 2025-05-15T12:24:24.766738Z INFO ExtHandler ExtHandler ******** AutoUpdate.Enabled is set to False, not processing the operation ******** May 15 12:24:24.766972 waagent[1914]: 2025-05-15T12:24:24.766950Z INFO ExtHandler ExtHandler ******** AutoUpdate.UpdateToLatestVersion is set to False, not processing the operation ******** May 15 12:24:24.767847 waagent[1914]: 2025-05-15T12:24:24.767818Z INFO ExtHandler ExtHandler [CGI] Cgroup monitoring is not supported on ['flatcar', '4334.0.0', '', 'Flatcar Container Linux by Kinvolk'] May 15 12:24:24.768091 waagent[1914]: 2025-05-15T12:24:24.768068Z INFO ExtHandler ExtHandler [CGI] Agent will reset the quotas in case distro: ['flatcar', 
'4334.0.0', '', 'Flatcar Container Linux by Kinvolk'] went from supported to unsupported May 15 12:24:24.768956 waagent[1914]: 2025-05-15T12:24:24.768185Z INFO ExtHandler ExtHandler [CGI] Agent cgroups enabled: False May 15 12:24:24.768956 waagent[1914]: 2025-05-15T12:24:24.768622Z INFO ExtHandler ExtHandler Starting setup for Persistent firewall rules May 15 12:24:24.886597 waagent[1914]: 2025-05-15T12:24:24.886567Z INFO ExtHandler ExtHandler Firewalld service not running/unavailable, trying to set up waagent-network-setup.service May 15 12:24:24.886752 waagent[1914]: 2025-05-15T12:24:24.886727Z INFO ExtHandler ExtHandler Successfully updated the Binary file /var/lib/waagent/waagent-network-setup.py for firewall setup May 15 12:24:24.891707 waagent[1914]: 2025-05-15T12:24:24.891681Z INFO ExtHandler ExtHandler Service: waagent-network-setup.service not enabled. Adding it now May 15 12:24:24.896311 systemd[1]: Reload requested from client PID 1929 ('systemctl') (unit waagent.service)... May 15 12:24:24.896321 systemd[1]: Reloading... May 15 12:24:24.973188 zram_generator::config[1973]: No configuration found. May 15 12:24:25.043705 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 15 12:24:25.124550 systemd[1]: Reloading finished in 227 ms. May 15 12:24:25.136432 waagent[1914]: 2025-05-15T12:24:25.135904Z INFO ExtHandler ExtHandler Successfully added and enabled the waagent-network-setup.service May 15 12:24:25.136432 waagent[1914]: 2025-05-15T12:24:25.135984Z INFO ExtHandler ExtHandler Persistent firewall rules setup successfully May 15 12:24:25.898393 waagent[1914]: 2025-05-15T12:24:25.898334Z INFO ExtHandler ExtHandler DROP rule is not available which implies no firewall rules are set yet. Environment thread will set it up. 
May 15 12:24:25.898627 waagent[1914]: 2025-05-15T12:24:25.898603Z INFO ExtHandler ExtHandler Checking if log collection is allowed at this time [False]. All three conditions must be met: 1. configuration enabled [True], 2. cgroups v1 enabled [False] OR cgroups v2 is in use and v2 resource limiting configuration enabled [False], 3. python supported: [True] May 15 12:24:25.899257 waagent[1914]: 2025-05-15T12:24:25.899160Z INFO ExtHandler ExtHandler Starting env monitor service. May 15 12:24:25.899317 waagent[1914]: 2025-05-15T12:24:25.899278Z INFO MonitorHandler ExtHandler WireServer endpoint 168.63.129.16 read from file May 15 12:24:25.899347 waagent[1914]: 2025-05-15T12:24:25.899337Z INFO MonitorHandler ExtHandler Wire server endpoint:168.63.129.16 May 15 12:24:25.899511 waagent[1914]: 2025-05-15T12:24:25.899492Z INFO MonitorHandler ExtHandler Monitor.NetworkConfigurationChanges is disabled. May 15 12:24:25.899931 waagent[1914]: 2025-05-15T12:24:25.899889Z INFO ExtHandler ExtHandler Start SendTelemetryHandler service. 
May 15 12:24:25.900002 waagent[1914]: 2025-05-15T12:24:25.899979Z INFO MonitorHandler ExtHandler Routing table from /proc/net/route: May 15 12:24:25.900002 waagent[1914]: Iface Destination Gateway Flags RefCnt Use Metric Mask MTU Window IRTT May 15 12:24:25.900002 waagent[1914]: eth0 00000000 0108C80A 0003 0 0 1024 00000000 0 0 0 May 15 12:24:25.900002 waagent[1914]: eth0 0008C80A 00000000 0001 0 0 1024 00FFFFFF 0 0 0 May 15 12:24:25.900002 waagent[1914]: eth0 0108C80A 00000000 0005 0 0 1024 FFFFFFFF 0 0 0 May 15 12:24:25.900002 waagent[1914]: eth0 10813FA8 0108C80A 0007 0 0 1024 FFFFFFFF 0 0 0 May 15 12:24:25.900002 waagent[1914]: eth0 FEA9FEA9 0108C80A 0007 0 0 1024 FFFFFFFF 0 0 0 May 15 12:24:25.900457 waagent[1914]: 2025-05-15T12:24:25.900414Z INFO SendTelemetryHandler ExtHandler Successfully started the SendTelemetryHandler thread May 15 12:24:25.900517 waagent[1914]: 2025-05-15T12:24:25.900495Z INFO ExtHandler ExtHandler Start Extension Telemetry service. May 15 12:24:25.900894 waagent[1914]: 2025-05-15T12:24:25.900839Z INFO TelemetryEventsCollector ExtHandler Extension Telemetry pipeline enabled: True May 15 12:24:25.900937 waagent[1914]: 2025-05-15T12:24:25.900911Z INFO EnvHandler ExtHandler WireServer endpoint 168.63.129.16 read from file May 15 12:24:25.900979 waagent[1914]: 2025-05-15T12:24:25.900961Z INFO EnvHandler ExtHandler Wire server endpoint:168.63.129.16 May 15 12:24:25.901009 waagent[1914]: 2025-05-15T12:24:25.900994Z INFO ExtHandler ExtHandler Goal State Period: 6 sec. This indicates how often the agent checks for new goal states and reports status. 
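The hex columns in the MonitorHandler routing-table dump above are little-endian IPv4 addresses, which is the raw format `/proc/net/route` exposes. A minimal Python sketch for decoding them (the `route_hex_to_ip` helper name is illustrative, not part of waagent):

```python
import socket
import struct

def route_hex_to_ip(hex_addr: str) -> str:
    """Decode a little-endian hex field from /proc/net/route into dotted-quad form."""
    return socket.inet_ntoa(struct.pack("<I", int(hex_addr, 16)))

# Gateway 0108C80A from the eth0 default route in the dump above:
print(route_hex_to_ip("0108C80A"))  # → 10.200.8.1
# Destination 10813FA8 is the Azure wire server noted earlier in the log:
print(route_hex_to_ip("10813FA8"))  # → 168.63.129.16
```

Decoded this way, the table matches the DHCP lease logged earlier (10.200.8.35/24 via gateway 10.200.8.1) plus host routes to the wire server 168.63.129.16 and the link-local metadata address 169.254.169.254 (FEA9FEA9).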
May 15 12:24:25.901199 waagent[1914]: 2025-05-15T12:24:25.901149Z INFO EnvHandler ExtHandler Configure routes May 15 12:24:25.901654 waagent[1914]: 2025-05-15T12:24:25.901637Z INFO EnvHandler ExtHandler Gateway:None May 15 12:24:25.901749 waagent[1914]: 2025-05-15T12:24:25.901721Z INFO EnvHandler ExtHandler Routes:None May 15 12:24:25.901816 waagent[1914]: 2025-05-15T12:24:25.901798Z INFO TelemetryEventsCollector ExtHandler Successfully started the TelemetryEventsCollector thread May 15 12:24:25.909216 waagent[1914]: 2025-05-15T12:24:25.908874Z INFO ExtHandler ExtHandler May 15 12:24:25.909216 waagent[1914]: 2025-05-15T12:24:25.908915Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState started [incarnation_1 channel: WireServer source: Fabric activity: 1675addf-4cc4-49e5-9024-a2677e99e9cc correlation e0049508-c03b-42f1-b066-9cbc54aff988 created: 2025-05-15T12:23:16.571312Z] May 15 12:24:25.909216 waagent[1914]: 2025-05-15T12:24:25.909104Z INFO ExtHandler ExtHandler No extension handlers found, not processing anything. May 15 12:24:25.909491 waagent[1914]: 2025-05-15T12:24:25.909468Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState completed [incarnation_1 0 ms] May 15 12:24:25.943293 waagent[1914]: 2025-05-15T12:24:25.943130Z WARNING ExtHandler ExtHandler Failed to get firewall packets: 'iptables -w -t security -L OUTPUT --zero OUTPUT -nxv' failed: 2 (iptables v1.8.11 (nf_tables): Illegal option `--numeric' with this command May 15 12:24:25.943293 waagent[1914]: Try `iptables -h' or 'iptables --help' for more information.) 
May 15 12:24:25.943548 waagent[1914]: 2025-05-15T12:24:25.943466Z INFO ExtHandler ExtHandler [HEARTBEAT] Agent WALinuxAgent-2.12.0.4 is running as the goal state agent [DEBUG HeartbeatCounter: 0;HeartbeatId: 1B97C952-59E5-4FD3-BF55-AF80149D724A;DroppedPackets: -1;UpdateGSErrors: 0;AutoUpdate: 0;UpdateMode: SelfUpdate;] May 15 12:24:25.952463 waagent[1914]: 2025-05-15T12:24:25.952423Z INFO MonitorHandler ExtHandler Network interfaces: May 15 12:24:25.952463 waagent[1914]: Executing ['ip', '-a', '-o', 'link']: May 15 12:24:25.952463 waagent[1914]: 1: lo: mtu 65536 qdisc noqueue state UNKNOWN mode DEFAULT group default qlen 1000\ link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 May 15 12:24:25.952463 waagent[1914]: 2: eth0: mtu 1500 qdisc mq state UP mode DEFAULT group default qlen 1000\ link/ether 7c:1e:52:fb:c3:35 brd ff:ff:ff:ff:ff:ff\ alias Network Device May 15 12:24:25.952463 waagent[1914]: 3: enP30832s1: mtu 1500 qdisc mq master eth0 state UP mode DEFAULT group default qlen 1000\ link/ether 7c:1e:52:fb:c3:35 brd ff:ff:ff:ff:ff:ff\ altname enP30832p0s0 May 15 12:24:25.952463 waagent[1914]: Executing ['ip', '-4', '-a', '-o', 'address']: May 15 12:24:25.952463 waagent[1914]: 1: lo inet 127.0.0.1/8 scope host lo\ valid_lft forever preferred_lft forever May 15 12:24:25.952463 waagent[1914]: 2: eth0 inet 10.200.8.35/24 metric 1024 brd 10.200.8.255 scope global eth0\ valid_lft forever preferred_lft forever May 15 12:24:25.952463 waagent[1914]: Executing ['ip', '-6', '-a', '-o', 'address']: May 15 12:24:25.952463 waagent[1914]: 1: lo inet6 ::1/128 scope host noprefixroute \ valid_lft forever preferred_lft forever May 15 12:24:25.952463 waagent[1914]: 2: eth0 inet6 fe80::7e1e:52ff:fefb:c335/64 scope link proto kernel_ll \ valid_lft forever preferred_lft forever May 15 12:24:25.952463 waagent[1914]: 3: enP30832s1 inet6 fe80::7e1e:52ff:fefb:c335/64 scope link proto kernel_ll \ valid_lft forever preferred_lft forever May 15 12:24:26.047044 waagent[1914]: 
2025-05-15T12:24:26.047002Z INFO EnvHandler ExtHandler Created firewall rules for the Azure Fabric: May 15 12:24:26.047044 waagent[1914]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes) May 15 12:24:26.047044 waagent[1914]: pkts bytes target prot opt in out source destination May 15 12:24:26.047044 waagent[1914]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes) May 15 12:24:26.047044 waagent[1914]: pkts bytes target prot opt in out source destination May 15 12:24:26.047044 waagent[1914]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes) May 15 12:24:26.047044 waagent[1914]: pkts bytes target prot opt in out source destination May 15 12:24:26.047044 waagent[1914]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53 May 15 12:24:26.047044 waagent[1914]: 3 535 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0 May 15 12:24:26.047044 waagent[1914]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW May 15 12:24:26.049436 waagent[1914]: 2025-05-15T12:24:26.049395Z INFO EnvHandler ExtHandler Current Firewall rules: May 15 12:24:26.049436 waagent[1914]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes) May 15 12:24:26.049436 waagent[1914]: pkts bytes target prot opt in out source destination May 15 12:24:26.049436 waagent[1914]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes) May 15 12:24:26.049436 waagent[1914]: pkts bytes target prot opt in out source destination May 15 12:24:26.049436 waagent[1914]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes) May 15 12:24:26.049436 waagent[1914]: pkts bytes target prot opt in out source destination May 15 12:24:26.049436 waagent[1914]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53 May 15 12:24:26.049436 waagent[1914]: 4 587 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0 May 15 12:24:26.049436 waagent[1914]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW May 15 12:24:29.895104 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. 
May 15 12:24:29.896825 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:24:35.271155 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:24:35.279356 (kubelet)[2066]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:24:35.317251 kubelet[2066]: E0515 12:24:35.317204 2066 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:24:35.319927 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:24:35.320026 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:24:35.320292 systemd[1]: kubelet.service: Consumed 122ms CPU time, 96.5M memory peak. May 15 12:24:36.793047 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. May 15 12:24:36.794184 systemd[1]: Started sshd@0-10.200.8.35:22-10.200.16.10:59578.service - OpenSSH per-connection server daemon (10.200.16.10:59578). May 15 12:24:37.459049 sshd[2076]: Accepted publickey for core from 10.200.16.10 port 59578 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk May 15 12:24:37.460336 sshd-session[2076]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 12:24:37.464654 systemd-logind[1702]: New session 3 of user core. May 15 12:24:37.471304 systemd[1]: Started session-3.scope - Session 3 of User core. May 15 12:24:38.024127 systemd[1]: Started sshd@1-10.200.8.35:22-10.200.16.10:59584.service - OpenSSH per-connection server daemon (10.200.16.10:59584). 
May 15 12:24:38.653861 sshd[2081]: Accepted publickey for core from 10.200.16.10 port 59584 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk May 15 12:24:38.655061 sshd-session[2081]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 12:24:38.659478 systemd-logind[1702]: New session 4 of user core. May 15 12:24:38.669301 systemd[1]: Started session-4.scope - Session 4 of User core. May 15 12:24:39.101536 sshd[2083]: Connection closed by 10.200.16.10 port 59584 May 15 12:24:39.102310 sshd-session[2081]: pam_unix(sshd:session): session closed for user core May 15 12:24:39.105508 systemd[1]: sshd@1-10.200.8.35:22-10.200.16.10:59584.service: Deactivated successfully. May 15 12:24:39.106754 systemd[1]: session-4.scope: Deactivated successfully. May 15 12:24:39.107339 systemd-logind[1702]: Session 4 logged out. Waiting for processes to exit. May 15 12:24:39.108329 systemd-logind[1702]: Removed session 4. May 15 12:24:39.216039 systemd[1]: Started sshd@2-10.200.8.35:22-10.200.16.10:56656.service - OpenSSH per-connection server daemon (10.200.16.10:56656). May 15 12:24:39.850324 sshd[2089]: Accepted publickey for core from 10.200.16.10 port 56656 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk May 15 12:24:39.851493 sshd-session[2089]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 12:24:39.855662 systemd-logind[1702]: New session 5 of user core. May 15 12:24:39.861308 systemd[1]: Started session-5.scope - Session 5 of User core. May 15 12:24:40.296181 sshd[2091]: Connection closed by 10.200.16.10 port 56656 May 15 12:24:40.296666 sshd-session[2089]: pam_unix(sshd:session): session closed for user core May 15 12:24:40.299664 systemd[1]: sshd@2-10.200.8.35:22-10.200.16.10:56656.service: Deactivated successfully. May 15 12:24:40.300927 systemd[1]: session-5.scope: Deactivated successfully. May 15 12:24:40.301504 systemd-logind[1702]: Session 5 logged out. 
Waiting for processes to exit. May 15 12:24:40.302478 systemd-logind[1702]: Removed session 5. May 15 12:24:40.411976 systemd[1]: Started sshd@3-10.200.8.35:22-10.200.16.10:56668.service - OpenSSH per-connection server daemon (10.200.16.10:56668). May 15 12:24:40.900342 chronyd[1726]: Selected source PHC0 May 15 12:24:41.046698 sshd[2100]: Accepted publickey for core from 10.200.16.10 port 56668 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk May 15 12:24:41.047848 sshd-session[2100]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 12:24:41.051816 systemd-logind[1702]: New session 6 of user core. May 15 12:24:41.058284 systemd[1]: Started session-6.scope - Session 6 of User core. May 15 12:24:41.501882 sshd[2102]: Connection closed by 10.200.16.10 port 56668 May 15 12:24:41.501776 sshd-session[2100]: pam_unix(sshd:session): session closed for user core May 15 12:24:41.504601 systemd[1]: sshd@3-10.200.8.35:22-10.200.16.10:56668.service: Deactivated successfully. May 15 12:24:41.505842 systemd[1]: session-6.scope: Deactivated successfully. May 15 12:24:41.506442 systemd-logind[1702]: Session 6 logged out. Waiting for processes to exit. May 15 12:24:41.507388 systemd-logind[1702]: Removed session 6. May 15 12:24:41.616009 systemd[1]: Started sshd@4-10.200.8.35:22-10.200.16.10:56672.service - OpenSSH per-connection server daemon (10.200.16.10:56672). May 15 12:24:42.254479 sshd[2108]: Accepted publickey for core from 10.200.16.10 port 56672 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk May 15 12:24:42.255726 sshd-session[2108]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 12:24:42.259932 systemd-logind[1702]: New session 7 of user core. May 15 12:24:42.266271 systemd[1]: Started session-7.scope - Session 7 of User core. 
May 15 12:24:42.687517 sudo[2111]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 May 15 12:24:42.687706 sudo[2111]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 15 12:24:42.697069 sudo[2111]: pam_unix(sudo:session): session closed for user root May 15 12:24:42.800242 sshd[2110]: Connection closed by 10.200.16.10 port 56672 May 15 12:24:42.800938 sshd-session[2108]: pam_unix(sshd:session): session closed for user core May 15 12:24:42.803766 systemd[1]: sshd@4-10.200.8.35:22-10.200.16.10:56672.service: Deactivated successfully. May 15 12:24:42.805091 systemd[1]: session-7.scope: Deactivated successfully. May 15 12:24:42.806517 systemd-logind[1702]: Session 7 logged out. Waiting for processes to exit. May 15 12:24:42.807204 systemd-logind[1702]: Removed session 7. May 15 12:24:42.910953 systemd[1]: Started sshd@5-10.200.8.35:22-10.200.16.10:56676.service - OpenSSH per-connection server daemon (10.200.16.10:56676). May 15 12:24:43.546045 sshd[2117]: Accepted publickey for core from 10.200.16.10 port 56676 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk May 15 12:24:43.547251 sshd-session[2117]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 12:24:43.551369 systemd-logind[1702]: New session 8 of user core. May 15 12:24:43.562314 systemd[1]: Started session-8.scope - Session 8 of User core. 
May 15 12:24:43.893025 sudo[2121]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules May 15 12:24:43.893327 sudo[2121]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 15 12:24:43.898970 sudo[2121]: pam_unix(sudo:session): session closed for user root May 15 12:24:43.902292 sudo[2120]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules May 15 12:24:43.902473 sudo[2120]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 15 12:24:43.908945 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 15 12:24:43.938951 augenrules[2143]: No rules May 15 12:24:43.939378 systemd[1]: audit-rules.service: Deactivated successfully. May 15 12:24:43.939543 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 15 12:24:43.940396 sudo[2120]: pam_unix(sudo:session): session closed for user root May 15 12:24:44.042741 sshd[2119]: Connection closed by 10.200.16.10 port 56676 May 15 12:24:44.043202 sshd-session[2117]: pam_unix(sshd:session): session closed for user core May 15 12:24:44.046034 systemd[1]: sshd@5-10.200.8.35:22-10.200.16.10:56676.service: Deactivated successfully. May 15 12:24:44.047157 systemd[1]: session-8.scope: Deactivated successfully. May 15 12:24:44.047771 systemd-logind[1702]: Session 8 logged out. Waiting for processes to exit. May 15 12:24:44.048613 systemd-logind[1702]: Removed session 8. May 15 12:24:44.158828 systemd[1]: Started sshd@6-10.200.8.35:22-10.200.16.10:56680.service - OpenSSH per-connection server daemon (10.200.16.10:56680). 
May 15 12:24:44.803113 sshd[2152]: Accepted publickey for core from 10.200.16.10 port 56680 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk May 15 12:24:44.804304 sshd-session[2152]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 12:24:44.808392 systemd-logind[1702]: New session 9 of user core. May 15 12:24:44.814291 systemd[1]: Started session-9.scope - Session 9 of User core. May 15 12:24:45.150005 sudo[2155]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh May 15 12:24:45.150209 sudo[2155]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 15 12:24:45.395015 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. May 15 12:24:45.396607 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:24:48.217332 systemd[1]: Starting docker.service - Docker Application Container Engine... May 15 12:24:48.228474 (dockerd)[2176]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU May 15 12:24:49.135041 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:24:49.143351 (kubelet)[2186]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:24:49.178011 kubelet[2186]: E0515 12:24:49.177983 2186 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:24:49.179449 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:24:49.179556 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
May 15 12:24:49.179842 systemd[1]: kubelet.service: Consumed 120ms CPU time, 95.5M memory peak. May 15 12:24:49.722670 dockerd[2176]: time="2025-05-15T12:24:49.722629147Z" level=info msg="Starting up" May 15 12:24:49.723429 dockerd[2176]: time="2025-05-15T12:24:49.723402609Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" May 15 12:24:49.759773 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport4167916614-merged.mount: Deactivated successfully. May 15 12:24:49.806748 dockerd[2176]: time="2025-05-15T12:24:49.806722194Z" level=info msg="Loading containers: start." May 15 12:24:49.820184 kernel: Initializing XFRM netlink socket May 15 12:24:49.999902 systemd-networkd[1361]: docker0: Link UP May 15 12:24:50.013527 dockerd[2176]: time="2025-05-15T12:24:50.013500350Z" level=info msg="Loading containers: done." May 15 12:24:50.031380 dockerd[2176]: time="2025-05-15T12:24:50.031354796Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 May 15 12:24:50.031474 dockerd[2176]: time="2025-05-15T12:24:50.031407583Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1 May 15 12:24:50.031498 dockerd[2176]: time="2025-05-15T12:24:50.031482058Z" level=info msg="Initializing buildkit" May 15 12:24:50.280795 dockerd[2176]: time="2025-05-15T12:24:50.280707655Z" level=info msg="Completed buildkit initialization" May 15 12:24:50.286525 dockerd[2176]: time="2025-05-15T12:24:50.286482702Z" level=info msg="Daemon has completed initialization" May 15 12:24:50.286595 dockerd[2176]: time="2025-05-15T12:24:50.286535443Z" level=info msg="API listen on /run/docker.sock" May 15 12:24:50.286683 systemd[1]: Started docker.service - Docker Application Container Engine. 
May 15 12:24:53.516367 containerd[1734]: time="2025-05-15T12:24:53.516313195Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.12\"" May 15 12:24:55.000713 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2688133761.mount: Deactivated successfully. May 15 12:24:58.102405 kernel: hv_balloon: Max. dynamic memory size: 8192 MB May 15 12:24:59.394895 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. May 15 12:24:59.396189 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:25:02.856081 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:25:02.864412 (kubelet)[2419]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:25:02.897643 kubelet[2419]: E0515 12:25:02.897609 2419 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:25:02.899083 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:25:02.899208 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:25:02.899485 systemd[1]: kubelet.service: Consumed 111ms CPU time, 93.7M memory peak. May 15 12:25:03.178612 update_engine[1704]: I20250515 12:25:03.178550 1704 update_attempter.cc:509] Updating boot flags... 
May 15 12:25:10.831050 containerd[1734]: time="2025-05-15T12:25:10.830995862Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.30.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:25:10.877876 containerd[1734]: time="2025-05-15T12:25:10.877834856Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.30.12: active requests=0, bytes read=32674881" May 15 12:25:10.922963 containerd[1734]: time="2025-05-15T12:25:10.922906097Z" level=info msg="ImageCreate event name:\"sha256:e113c59aa22f0650435e2a3ed64aadb01e87f3d2835aa3825fe078cd39699bfb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:25:10.927037 containerd[1734]: time="2025-05-15T12:25:10.926997234Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:4878682f7a044274d42399a6316ef452c5411aafd4ad99cc57de7235ca490e4e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:25:10.927800 containerd[1734]: time="2025-05-15T12:25:10.927623448Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.30.12\" with image id \"sha256:e113c59aa22f0650435e2a3ed64aadb01e87f3d2835aa3825fe078cd39699bfb\", repo tag \"registry.k8s.io/kube-apiserver:v1.30.12\", repo digest \"registry.k8s.io/kube-apiserver@sha256:4878682f7a044274d42399a6316ef452c5411aafd4ad99cc57de7235ca490e4e\", size \"32671673\" in 17.411257427s" May 15 12:25:10.927800 containerd[1734]: time="2025-05-15T12:25:10.927654884Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.12\" returns image reference \"sha256:e113c59aa22f0650435e2a3ed64aadb01e87f3d2835aa3825fe078cd39699bfb\"" May 15 12:25:10.941555 containerd[1734]: time="2025-05-15T12:25:10.941533280Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.12\"" May 15 12:25:13.145064 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. 
May 15 12:25:13.146787 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:25:17.198875 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:25:17.206396 (kubelet)[2528]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:25:17.239223 kubelet[2528]: E0515 12:25:17.239196 2528 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:25:17.240509 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:25:17.240617 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:25:17.240930 systemd[1]: kubelet.service: Consumed 113ms CPU time, 95.7M memory peak. 
May 15 12:25:24.260369 containerd[1734]: time="2025-05-15T12:25:24.260327748Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.30.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:25:24.262363 containerd[1734]: time="2025-05-15T12:25:24.262330906Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.30.12: active requests=0, bytes read=29617542" May 15 12:25:24.272939 containerd[1734]: time="2025-05-15T12:25:24.272904120Z" level=info msg="ImageCreate event name:\"sha256:70742b7b7d90a618a1fa06d89248dbe2c291c19d7f75f4ad60a69d0454dbbac8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:25:24.276403 containerd[1734]: time="2025-05-15T12:25:24.276366111Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:3a36711d0409d565b370a18d0c19339e93d4f1b1f2b3fd382eb31c714c463b74\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:25:24.276993 containerd[1734]: time="2025-05-15T12:25:24.276826895Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.30.12\" with image id \"sha256:70742b7b7d90a618a1fa06d89248dbe2c291c19d7f75f4ad60a69d0454dbbac8\", repo tag \"registry.k8s.io/kube-controller-manager:v1.30.12\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:3a36711d0409d565b370a18d0c19339e93d4f1b1f2b3fd382eb31c714c463b74\", size \"31105907\" in 13.335264918s" May 15 12:25:24.276993 containerd[1734]: time="2025-05-15T12:25:24.276853011Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.12\" returns image reference \"sha256:70742b7b7d90a618a1fa06d89248dbe2c291c19d7f75f4ad60a69d0454dbbac8\"" May 15 12:25:24.290199 containerd[1734]: time="2025-05-15T12:25:24.290161157Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.12\"" May 15 12:25:27.395072 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. 
May 15 12:25:27.396812 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:25:32.850094 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:25:32.870364 (kubelet)[2555]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:25:32.902658 kubelet[2555]: E0515 12:25:32.902630 2555 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:25:32.903935 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:25:32.904065 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:25:32.904413 systemd[1]: kubelet.service: Consumed 114ms CPU time, 95.9M memory peak. 
May 15 12:25:33.591718 containerd[1734]: time="2025-05-15T12:25:33.591681544Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.30.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:25:33.594568 containerd[1734]: time="2025-05-15T12:25:33.594536450Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.30.12: active requests=0, bytes read=17903690" May 15 12:25:33.600256 containerd[1734]: time="2025-05-15T12:25:33.600222379Z" level=info msg="ImageCreate event name:\"sha256:c0b91cfea9f9a1c09fc5d056f3a015e52604fd0d63671ff5bf31e642402ef05d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:25:33.603539 containerd[1734]: time="2025-05-15T12:25:33.603492227Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:521c843d01025be7d4e246ddee8cde74556eb9813c606d6db9f0f03236f6d029\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:25:33.604138 containerd[1734]: time="2025-05-15T12:25:33.603985250Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.30.12\" with image id \"sha256:c0b91cfea9f9a1c09fc5d056f3a015e52604fd0d63671ff5bf31e642402ef05d\", repo tag \"registry.k8s.io/kube-scheduler:v1.30.12\", repo digest \"registry.k8s.io/kube-scheduler@sha256:521c843d01025be7d4e246ddee8cde74556eb9813c606d6db9f0f03236f6d029\", size \"19392073\" in 9.31378233s" May 15 12:25:33.604138 containerd[1734]: time="2025-05-15T12:25:33.604010451Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.12\" returns image reference \"sha256:c0b91cfea9f9a1c09fc5d056f3a015e52604fd0d63671ff5bf31e642402ef05d\"" May 15 12:25:33.617631 containerd[1734]: time="2025-05-15T12:25:33.617611915Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.12\"" May 15 12:25:39.279424 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount645153615.mount: Deactivated successfully. 
May 15 12:25:40.682433 containerd[1734]: time="2025-05-15T12:25:40.682372365Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.30.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:25:40.697380 containerd[1734]: time="2025-05-15T12:25:40.697334558Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.30.12: active requests=0, bytes read=29185825" May 15 12:25:40.729433 containerd[1734]: time="2025-05-15T12:25:40.729375658Z" level=info msg="ImageCreate event name:\"sha256:c9356fea5d151501039907c3ba870272461396117eabc74063632616f4e31b2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:25:40.732660 containerd[1734]: time="2025-05-15T12:25:40.732624490Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:ea8c7d5392acf6b0c11ebba78301e1a6c2dc6abcd7544102ed578e49d1c82f15\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:25:40.733223 containerd[1734]: time="2025-05-15T12:25:40.732944788Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.30.12\" with image id \"sha256:c9356fea5d151501039907c3ba870272461396117eabc74063632616f4e31b2b\", repo tag \"registry.k8s.io/kube-proxy:v1.30.12\", repo digest \"registry.k8s.io/kube-proxy@sha256:ea8c7d5392acf6b0c11ebba78301e1a6c2dc6abcd7544102ed578e49d1c82f15\", size \"29184836\" in 7.115290886s" May 15 12:25:40.733223 containerd[1734]: time="2025-05-15T12:25:40.732972158Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.12\" returns image reference \"sha256:c9356fea5d151501039907c3ba870272461396117eabc74063632616f4e31b2b\"" May 15 12:25:40.746626 containerd[1734]: time="2025-05-15T12:25:40.746603601Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\"" May 15 12:25:43.145106 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 6. May 15 12:25:43.147209 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
May 15 12:25:47.648218 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:25:47.653473 (kubelet)[2599]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:25:47.686757 kubelet[2599]: E0515 12:25:47.686712 2599 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:25:47.688143 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:25:47.688292 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:25:47.688620 systemd[1]: kubelet.service: Consumed 112ms CPU time, 96M memory peak. May 15 12:25:50.112556 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2901785923.mount: Deactivated successfully. 
May 15 12:25:57.774785 containerd[1734]: time="2025-05-15T12:25:57.774743680Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:25:57.782320 containerd[1734]: time="2025-05-15T12:25:57.782287812Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=18185769" May 15 12:25:57.829145 containerd[1734]: time="2025-05-15T12:25:57.829117140Z" level=info msg="ImageCreate event name:\"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:25:57.832531 containerd[1734]: time="2025-05-15T12:25:57.832487652Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:25:57.833184 containerd[1734]: time="2025-05-15T12:25:57.833112796Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"18182961\" in 17.086460278s" May 15 12:25:57.833184 containerd[1734]: time="2025-05-15T12:25:57.833143450Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\"" May 15 12:25:57.846741 containerd[1734]: time="2025-05-15T12:25:57.846704061Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\"" May 15 12:25:57.879301 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 7. May 15 12:25:57.880646 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
May 15 12:26:00.738954 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:26:00.744366 (kubelet)[2666]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:26:00.775963 kubelet[2666]: E0515 12:26:00.775933 2666 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:26:00.776877 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:26:00.776981 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:26:00.777301 systemd[1]: kubelet.service: Consumed 112ms CPU time, 95.7M memory peak. May 15 12:26:03.391418 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount263651260.mount: Deactivated successfully. 
May 15 12:26:03.634331 containerd[1734]: time="2025-05-15T12:26:03.634269420Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:26:03.636516 containerd[1734]: time="2025-05-15T12:26:03.636471855Z" level=info msg="stop pulling image registry.k8s.io/pause:3.9: active requests=0, bytes read=322298" May 15 12:26:03.683654 containerd[1734]: time="2025-05-15T12:26:03.683597700Z" level=info msg="ImageCreate event name:\"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:26:03.729822 containerd[1734]: time="2025-05-15T12:26:03.729750308Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:26:03.730716 containerd[1734]: time="2025-05-15T12:26:03.730538322Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.9\" with image id \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\", repo tag \"registry.k8s.io/pause:3.9\", repo digest \"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\", size \"321520\" in 5.883791245s" May 15 12:26:03.730716 containerd[1734]: time="2025-05-15T12:26:03.730575718Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\"" May 15 12:26:03.745076 containerd[1734]: time="2025-05-15T12:26:03.745055118Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\"" May 15 12:26:06.431103 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount35897085.mount: Deactivated successfully. May 15 12:26:10.895217 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 8. 
May 15 12:26:10.896948 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:26:13.857774 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:26:13.863403 (kubelet)[2706]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:26:13.894883 kubelet[2706]: E0515 12:26:13.894858 2706 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:26:13.896181 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:26:13.896293 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:26:13.896640 systemd[1]: kubelet.service: Consumed 112ms CPU time, 94.3M memory peak. 
May 15 12:26:16.908692 containerd[1734]: time="2025-05-15T12:26:16.908651085Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.12-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:26:16.911107 containerd[1734]: time="2025-05-15T12:26:16.911074063Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.12-0: active requests=0, bytes read=57238579" May 15 12:26:16.913635 containerd[1734]: time="2025-05-15T12:26:16.913601529Z" level=info msg="ImageCreate event name:\"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:26:16.917355 containerd[1734]: time="2025-05-15T12:26:16.917306532Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:26:16.917890 containerd[1734]: time="2025-05-15T12:26:16.917869247Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.12-0\" with image id \"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\", repo tag \"registry.k8s.io/etcd:3.5.12-0\", repo digest \"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\", size \"57236178\" in 13.172789986s" May 15 12:26:16.917928 containerd[1734]: time="2025-05-15T12:26:16.917899354Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\" returns image reference \"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\"" May 15 12:26:19.113896 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:26:19.114369 systemd[1]: kubelet.service: Consumed 112ms CPU time, 94.3M memory peak. May 15 12:26:19.116099 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:26:19.131129 systemd[1]: Reload requested from client PID 2835 ('systemctl') (unit session-9.scope)... 
May 15 12:26:19.131140 systemd[1]: Reloading... May 15 12:26:19.220197 zram_generator::config[2890]: No configuration found. May 15 12:26:19.301027 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 15 12:26:19.379509 systemd[1]: Reloading finished in 248 ms. May 15 12:26:20.173373 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM May 15 12:26:20.173471 systemd[1]: kubelet.service: Failed with result 'signal'. May 15 12:26:20.173760 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:26:20.176267 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:26:25.450087 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:26:25.456515 (kubelet)[2948]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 15 12:26:25.489141 kubelet[2948]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 15 12:26:25.489141 kubelet[2948]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. May 15 12:26:25.489141 kubelet[2948]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
May 15 12:26:25.489399 kubelet[2948]: I0515 12:26:25.489189 2948 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 15 12:26:25.608643 kubelet[2948]: I0515 12:26:25.608623 2948 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" May 15 12:26:25.608643 kubelet[2948]: I0515 12:26:25.608641 2948 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 15 12:26:25.608820 kubelet[2948]: I0515 12:26:25.608792 2948 server.go:927] "Client rotation is on, will bootstrap in background" May 15 12:26:25.626759 kubelet[2948]: I0515 12:26:25.626348 2948 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 15 12:26:25.627165 kubelet[2948]: E0515 12:26:25.627144 2948 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://10.200.8.35:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 10.200.8.35:6443: connect: connection refused May 15 12:26:25.637306 kubelet[2948]: I0515 12:26:25.637289 2948 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" May 15 12:26:25.638569 kubelet[2948]: I0515 12:26:25.638546 2948 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 15 12:26:25.638705 kubelet[2948]: I0515 12:26:25.638568 2948 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4334.0.0-a-9b1bbdffc7","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} May 15 12:26:25.639278 kubelet[2948]: I0515 12:26:25.639266 2948 topology_manager.go:138] "Creating topology manager with none policy" May 
15 12:26:25.639313 kubelet[2948]: I0515 12:26:25.639280 2948 container_manager_linux.go:301] "Creating device plugin manager" May 15 12:26:25.639374 kubelet[2948]: I0515 12:26:25.639365 2948 state_mem.go:36] "Initialized new in-memory state store" May 15 12:26:25.640091 kubelet[2948]: I0515 12:26:25.640080 2948 kubelet.go:400] "Attempting to sync node with API server" May 15 12:26:25.640119 kubelet[2948]: I0515 12:26:25.640095 2948 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" May 15 12:26:25.640119 kubelet[2948]: I0515 12:26:25.640115 2948 kubelet.go:312] "Adding apiserver pod source" May 15 12:26:25.640162 kubelet[2948]: I0515 12:26:25.640130 2948 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 15 12:26:25.645817 kubelet[2948]: W0515 12:26:25.645777 2948 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.200.8.35:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4334.0.0-a-9b1bbdffc7&limit=500&resourceVersion=0": dial tcp 10.200.8.35:6443: connect: connection refused May 15 12:26:25.645817 kubelet[2948]: E0515 12:26:25.645817 2948 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://10.200.8.35:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4334.0.0-a-9b1bbdffc7&limit=500&resourceVersion=0": dial tcp 10.200.8.35:6443: connect: connection refused May 15 12:26:25.645921 kubelet[2948]: I0515 12:26:25.645881 2948 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" May 15 12:26:25.646663 kubelet[2948]: W0515 12:26:25.646606 2948 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.200.8.35:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.200.8.35:6443: connect: connection refused May 15 12:26:25.646663 kubelet[2948]: E0515 12:26:25.646645 2948 
reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.200.8.35:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.200.8.35:6443: connect: connection refused May 15 12:26:25.647286 kubelet[2948]: I0515 12:26:25.647276 2948 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 15 12:26:25.647330 kubelet[2948]: W0515 12:26:25.647319 2948 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. May 15 12:26:25.647854 kubelet[2948]: I0515 12:26:25.647703 2948 server.go:1264] "Started kubelet" May 15 12:26:25.647854 kubelet[2948]: I0515 12:26:25.647774 2948 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 May 15 12:26:25.648684 kubelet[2948]: I0515 12:26:25.648474 2948 server.go:455] "Adding debug handlers to kubelet server" May 15 12:26:25.650625 kubelet[2948]: I0515 12:26:25.650192 2948 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 15 12:26:25.651770 kubelet[2948]: I0515 12:26:25.651728 2948 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 15 12:26:25.651907 kubelet[2948]: I0515 12:26:25.651894 2948 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 15 12:26:25.652124 kubelet[2948]: E0515 12:26:25.652047 2948 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.200.8.35:6443/api/v1/namespaces/default/events\": dial tcp 10.200.8.35:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4334.0.0-a-9b1bbdffc7.183fb2fe48501551 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4334.0.0-a-9b1bbdffc7,UID:ci-4334.0.0-a-9b1bbdffc7,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:ci-4334.0.0-a-9b1bbdffc7,},FirstTimestamp:2025-05-15 12:26:25.647686993 +0000 UTC m=+0.188149031,LastTimestamp:2025-05-15 12:26:25.647686993 +0000 UTC m=+0.188149031,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4334.0.0-a-9b1bbdffc7,}" May 15 12:26:25.655247 kubelet[2948]: I0515 12:26:25.654862 2948 volume_manager.go:291] "Starting Kubelet Volume Manager" May 15 12:26:25.655909 kubelet[2948]: E0515 12:26:25.655879 2948 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.35:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4334.0.0-a-9b1bbdffc7?timeout=10s\": dial tcp 10.200.8.35:6443: connect: connection refused" interval="200ms" May 15 12:26:25.656375 kubelet[2948]: I0515 12:26:25.656363 2948 factory.go:221] Registration of the systemd container factory successfully May 15 12:26:25.656507 kubelet[2948]: I0515 12:26:25.656495 2948 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 15 12:26:25.657267 kubelet[2948]: I0515 12:26:25.657254 2948 desired_state_of_world_populator.go:149] "Desired state populator starts to run" May 15 12:26:25.657328 kubelet[2948]: I0515 12:26:25.657299 2948 reconciler.go:26] "Reconciler: start to sync state" May 15 12:26:25.658168 kubelet[2948]: E0515 12:26:25.658154 2948 kubelet.go:1467] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 15 12:26:25.658406 kubelet[2948]: I0515 12:26:25.658398 2948 factory.go:221] Registration of the containerd container factory successfully May 15 12:26:25.665164 kubelet[2948]: I0515 12:26:25.665135 2948 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" May 15 12:26:25.665917 kubelet[2948]: I0515 12:26:25.665893 2948 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" May 15 12:26:25.665917 kubelet[2948]: I0515 12:26:25.665915 2948 status_manager.go:217] "Starting to sync pod status with apiserver" May 15 12:26:25.665991 kubelet[2948]: I0515 12:26:25.665929 2948 kubelet.go:2337] "Starting kubelet main sync loop" May 15 12:26:25.665991 kubelet[2948]: E0515 12:26:25.665957 2948 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 15 12:26:25.670920 kubelet[2948]: W0515 12:26:25.670889 2948 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.200.8.35:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.8.35:6443: connect: connection refused May 15 12:26:25.670920 kubelet[2948]: E0515 12:26:25.670923 2948 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://10.200.8.35:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.8.35:6443: connect: connection refused May 15 12:26:25.671146 kubelet[2948]: W0515 12:26:25.671092 2948 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.200.8.35:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.8.35:6443: connect: connection refused May 15 12:26:25.671146 kubelet[2948]: E0515 12:26:25.671132 2948 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://10.200.8.35:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.8.35:6443: connect: connection refused May 15 12:26:25.679803 kubelet[2948]: I0515 12:26:25.679732 2948 
cpu_manager.go:214] "Starting CPU manager" policy="none" May 15 12:26:25.679803 kubelet[2948]: I0515 12:26:25.679745 2948 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" May 15 12:26:25.679803 kubelet[2948]: I0515 12:26:25.679760 2948 state_mem.go:36] "Initialized new in-memory state store" May 15 12:26:25.756414 kubelet[2948]: I0515 12:26:25.756287 2948 kubelet_node_status.go:73] "Attempting to register node" node="ci-4334.0.0-a-9b1bbdffc7" May 15 12:26:25.756538 kubelet[2948]: E0515 12:26:25.756513 2948 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.200.8.35:6443/api/v1/nodes\": dial tcp 10.200.8.35:6443: connect: connection refused" node="ci-4334.0.0-a-9b1bbdffc7" May 15 12:26:25.766809 kubelet[2948]: E0515 12:26:25.766779 2948 kubelet.go:2361] "Skipping pod synchronization" err="container runtime status check may not have completed yet" May 15 12:26:25.857420 kubelet[2948]: E0515 12:26:25.857381 2948 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.35:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4334.0.0-a-9b1bbdffc7?timeout=10s\": dial tcp 10.200.8.35:6443: connect: connection refused" interval="400ms" May 15 12:26:25.958854 kubelet[2948]: I0515 12:26:25.958835 2948 kubelet_node_status.go:73] "Attempting to register node" node="ci-4334.0.0-a-9b1bbdffc7" May 15 12:26:25.959143 kubelet[2948]: E0515 12:26:25.959115 2948 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.200.8.35:6443/api/v1/nodes\": dial tcp 10.200.8.35:6443: connect: connection refused" node="ci-4334.0.0-a-9b1bbdffc7" May 15 12:26:25.967524 kubelet[2948]: E0515 12:26:25.967499 2948 kubelet.go:2361] "Skipping pod synchronization" err="container runtime status check may not have completed yet" May 15 12:26:26.023039 kubelet[2948]: I0515 12:26:26.022938 2948 policy_none.go:49] "None policy: Start" May 15 12:26:26.023954 kubelet[2948]: I0515 
12:26:26.023847 2948 memory_manager.go:170] "Starting memorymanager" policy="None" May 15 12:26:26.023954 kubelet[2948]: I0515 12:26:26.023892 2948 state_mem.go:35] "Initializing new in-memory state store" May 15 12:26:26.031824 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. May 15 12:26:26.044156 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. May 15 12:26:26.046495 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. May 15 12:26:26.056656 kubelet[2948]: I0515 12:26:26.056637 2948 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 15 12:26:26.056797 kubelet[2948]: I0515 12:26:26.056776 2948 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 15 12:26:26.056865 kubelet[2948]: I0515 12:26:26.056856 2948 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 15 12:26:26.058159 kubelet[2948]: E0515 12:26:26.058083 2948 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4334.0.0-a-9b1bbdffc7\" not found" May 15 12:26:26.258621 kubelet[2948]: E0515 12:26:26.258567 2948 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.35:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4334.0.0-a-9b1bbdffc7?timeout=10s\": dial tcp 10.200.8.35:6443: connect: connection refused" interval="800ms" May 15 12:26:26.361286 kubelet[2948]: I0515 12:26:26.361207 2948 kubelet_node_status.go:73] "Attempting to register node" node="ci-4334.0.0-a-9b1bbdffc7" May 15 12:26:26.361637 kubelet[2948]: E0515 12:26:26.361602 2948 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.200.8.35:6443/api/v1/nodes\": dial tcp 10.200.8.35:6443: connect: connection refused" 
node="ci-4334.0.0-a-9b1bbdffc7" May 15 12:26:26.367723 kubelet[2948]: I0515 12:26:26.367702 2948 topology_manager.go:215] "Topology Admit Handler" podUID="edc8b565b56544f631a75dfe2f535e88" podNamespace="kube-system" podName="kube-controller-manager-ci-4334.0.0-a-9b1bbdffc7" May 15 12:26:26.368846 kubelet[2948]: I0515 12:26:26.368826 2948 topology_manager.go:215] "Topology Admit Handler" podUID="6e1cd5b086316ecabbcf11059c23db04" podNamespace="kube-system" podName="kube-scheduler-ci-4334.0.0-a-9b1bbdffc7" May 15 12:26:26.370021 kubelet[2948]: I0515 12:26:26.369840 2948 topology_manager.go:215] "Topology Admit Handler" podUID="14bc05342636079e576e84a577241a3c" podNamespace="kube-system" podName="kube-apiserver-ci-4334.0.0-a-9b1bbdffc7" May 15 12:26:26.374402 systemd[1]: Created slice kubepods-burstable-podedc8b565b56544f631a75dfe2f535e88.slice - libcontainer container kubepods-burstable-podedc8b565b56544f631a75dfe2f535e88.slice. May 15 12:26:26.389644 systemd[1]: Created slice kubepods-burstable-pod6e1cd5b086316ecabbcf11059c23db04.slice - libcontainer container kubepods-burstable-pod6e1cd5b086316ecabbcf11059c23db04.slice. May 15 12:26:26.392689 systemd[1]: Created slice kubepods-burstable-pod14bc05342636079e576e84a577241a3c.slice - libcontainer container kubepods-burstable-pod14bc05342636079e576e84a577241a3c.slice. 
May 15 12:26:26.460356 kubelet[2948]: I0515 12:26:26.460283 2948 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/14bc05342636079e576e84a577241a3c-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4334.0.0-a-9b1bbdffc7\" (UID: \"14bc05342636079e576e84a577241a3c\") " pod="kube-system/kube-apiserver-ci-4334.0.0-a-9b1bbdffc7" May 15 12:26:26.460356 kubelet[2948]: I0515 12:26:26.460321 2948 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/edc8b565b56544f631a75dfe2f535e88-flexvolume-dir\") pod \"kube-controller-manager-ci-4334.0.0-a-9b1bbdffc7\" (UID: \"edc8b565b56544f631a75dfe2f535e88\") " pod="kube-system/kube-controller-manager-ci-4334.0.0-a-9b1bbdffc7" May 15 12:26:26.460356 kubelet[2948]: I0515 12:26:26.460336 2948 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/edc8b565b56544f631a75dfe2f535e88-k8s-certs\") pod \"kube-controller-manager-ci-4334.0.0-a-9b1bbdffc7\" (UID: \"edc8b565b56544f631a75dfe2f535e88\") " pod="kube-system/kube-controller-manager-ci-4334.0.0-a-9b1bbdffc7" May 15 12:26:26.460356 kubelet[2948]: I0515 12:26:26.460352 2948 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/edc8b565b56544f631a75dfe2f535e88-kubeconfig\") pod \"kube-controller-manager-ci-4334.0.0-a-9b1bbdffc7\" (UID: \"edc8b565b56544f631a75dfe2f535e88\") " pod="kube-system/kube-controller-manager-ci-4334.0.0-a-9b1bbdffc7" May 15 12:26:26.460531 kubelet[2948]: I0515 12:26:26.460366 2948 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: 
\"kubernetes.io/host-path/edc8b565b56544f631a75dfe2f535e88-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4334.0.0-a-9b1bbdffc7\" (UID: \"edc8b565b56544f631a75dfe2f535e88\") " pod="kube-system/kube-controller-manager-ci-4334.0.0-a-9b1bbdffc7" May 15 12:26:26.460531 kubelet[2948]: I0515 12:26:26.460381 2948 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6e1cd5b086316ecabbcf11059c23db04-kubeconfig\") pod \"kube-scheduler-ci-4334.0.0-a-9b1bbdffc7\" (UID: \"6e1cd5b086316ecabbcf11059c23db04\") " pod="kube-system/kube-scheduler-ci-4334.0.0-a-9b1bbdffc7" May 15 12:26:26.460531 kubelet[2948]: I0515 12:26:26.460394 2948 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/14bc05342636079e576e84a577241a3c-ca-certs\") pod \"kube-apiserver-ci-4334.0.0-a-9b1bbdffc7\" (UID: \"14bc05342636079e576e84a577241a3c\") " pod="kube-system/kube-apiserver-ci-4334.0.0-a-9b1bbdffc7" May 15 12:26:26.460531 kubelet[2948]: I0515 12:26:26.460410 2948 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/14bc05342636079e576e84a577241a3c-k8s-certs\") pod \"kube-apiserver-ci-4334.0.0-a-9b1bbdffc7\" (UID: \"14bc05342636079e576e84a577241a3c\") " pod="kube-system/kube-apiserver-ci-4334.0.0-a-9b1bbdffc7" May 15 12:26:26.460531 kubelet[2948]: I0515 12:26:26.460423 2948 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/edc8b565b56544f631a75dfe2f535e88-ca-certs\") pod \"kube-controller-manager-ci-4334.0.0-a-9b1bbdffc7\" (UID: \"edc8b565b56544f631a75dfe2f535e88\") " pod="kube-system/kube-controller-manager-ci-4334.0.0-a-9b1bbdffc7" May 15 12:26:26.548811 kubelet[2948]: W0515 12:26:26.548744 2948 
reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.200.8.35:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.8.35:6443: connect: connection refused May 15 12:26:26.548811 kubelet[2948]: E0515 12:26:26.548812 2948 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://10.200.8.35:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.8.35:6443: connect: connection refused May 15 12:26:26.687937 containerd[1734]: time="2025-05-15T12:26:26.687890126Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4334.0.0-a-9b1bbdffc7,Uid:edc8b565b56544f631a75dfe2f535e88,Namespace:kube-system,Attempt:0,}" May 15 12:26:26.692343 containerd[1734]: time="2025-05-15T12:26:26.692320552Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4334.0.0-a-9b1bbdffc7,Uid:6e1cd5b086316ecabbcf11059c23db04,Namespace:kube-system,Attempt:0,}" May 15 12:26:26.694804 containerd[1734]: time="2025-05-15T12:26:26.694767412Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4334.0.0-a-9b1bbdffc7,Uid:14bc05342636079e576e84a577241a3c,Namespace:kube-system,Attempt:0,}" May 15 12:26:26.755132 kubelet[2948]: W0515 12:26:26.755084 2948 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.200.8.35:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4334.0.0-a-9b1bbdffc7&limit=500&resourceVersion=0": dial tcp 10.200.8.35:6443: connect: connection refused May 15 12:26:26.755232 kubelet[2948]: E0515 12:26:26.755136 2948 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://10.200.8.35:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4334.0.0-a-9b1bbdffc7&limit=500&resourceVersion=0": dial tcp 
10.200.8.35:6443: connect: connection refused May 15 12:26:26.962287 kubelet[2948]: W0515 12:26:26.962202 2948 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.200.8.35:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.8.35:6443: connect: connection refused May 15 12:26:26.962287 kubelet[2948]: E0515 12:26:26.962256 2948 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://10.200.8.35:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.8.35:6443: connect: connection refused May 15 12:26:27.033885 kubelet[2948]: W0515 12:26:27.033838 2948 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.200.8.35:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.200.8.35:6443: connect: connection refused May 15 12:26:27.033965 kubelet[2948]: E0515 12:26:27.033891 2948 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.200.8.35:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.200.8.35:6443: connect: connection refused May 15 12:26:27.059372 kubelet[2948]: E0515 12:26:27.059347 2948 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.35:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4334.0.0-a-9b1bbdffc7?timeout=10s\": dial tcp 10.200.8.35:6443: connect: connection refused" interval="1.6s" May 15 12:26:27.152042 kubelet[2948]: E0515 12:26:27.151976 2948 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.200.8.35:6443/api/v1/namespaces/default/events\": dial tcp 10.200.8.35:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4334.0.0-a-9b1bbdffc7.183fb2fe48501551 default 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4334.0.0-a-9b1bbdffc7,UID:ci-4334.0.0-a-9b1bbdffc7,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4334.0.0-a-9b1bbdffc7,},FirstTimestamp:2025-05-15 12:26:25.647686993 +0000 UTC m=+0.188149031,LastTimestamp:2025-05-15 12:26:25.647686993 +0000 UTC m=+0.188149031,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4334.0.0-a-9b1bbdffc7,}" May 15 12:26:27.163203 kubelet[2948]: I0515 12:26:27.163167 2948 kubelet_node_status.go:73] "Attempting to register node" node="ci-4334.0.0-a-9b1bbdffc7" May 15 12:26:27.163448 kubelet[2948]: E0515 12:26:27.163433 2948 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.200.8.35:6443/api/v1/nodes\": dial tcp 10.200.8.35:6443: connect: connection refused" node="ci-4334.0.0-a-9b1bbdffc7" May 15 12:26:28.078878 kubelet[2948]: E0515 12:26:27.716193 2948 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://10.200.8.35:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 10.200.8.35:6443: connect: connection refused May 15 12:26:28.457271 kubelet[2948]: W0515 12:26:28.457140 2948 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.200.8.35:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4334.0.0-a-9b1bbdffc7&limit=500&resourceVersion=0": dial tcp 10.200.8.35:6443: connect: connection refused May 15 12:26:28.474344 kubelet[2948]: E0515 12:26:28.457278 2948 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
"https://10.200.8.35:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4334.0.0-a-9b1bbdffc7&limit=500&resourceVersion=0": dial tcp 10.200.8.35:6443: connect: connection refused May 15 12:26:28.660031 kubelet[2948]: E0515 12:26:28.659980 2948 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.35:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4334.0.0-a-9b1bbdffc7?timeout=10s\": dial tcp 10.200.8.35:6443: connect: connection refused" interval="3.2s" May 15 12:26:28.692629 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount87319999.mount: Deactivated successfully. May 15 12:26:28.717697 kubelet[2948]: W0515 12:26:28.717605 2948 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.200.8.35:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.8.35:6443: connect: connection refused May 15 12:26:28.717697 kubelet[2948]: E0515 12:26:28.717651 2948 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://10.200.8.35:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.8.35:6443: connect: connection refused May 15 12:26:28.764960 kubelet[2948]: I0515 12:26:28.764940 2948 kubelet_node_status.go:73] "Attempting to register node" node="ci-4334.0.0-a-9b1bbdffc7" May 15 12:26:28.775724 kubelet[2948]: E0515 12:26:28.765208 2948 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.200.8.35:6443/api/v1/nodes\": dial tcp 10.200.8.35:6443: connect: connection refused" node="ci-4334.0.0-a-9b1bbdffc7" May 15 12:26:29.031077 containerd[1734]: time="2025-05-15T12:26:29.030997872Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 15 
12:26:29.077826 containerd[1734]: time="2025-05-15T12:26:29.077788823Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321146" May 15 12:26:29.124823 containerd[1734]: time="2025-05-15T12:26:29.124788050Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 15 12:26:29.219913 containerd[1734]: time="2025-05-15T12:26:29.219854732Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 15 12:26:29.281599 containerd[1734]: time="2025-05-15T12:26:29.281375583Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" May 15 12:26:29.300470 containerd[1734]: time="2025-05-15T12:26:29.300397155Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 15 12:26:29.329389 containerd[1734]: time="2025-05-15T12:26:29.329061901Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" May 15 12:26:29.375216 containerd[1734]: time="2025-05-15T12:26:29.375125271Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 15 12:26:29.376191 containerd[1734]: time="2025-05-15T12:26:29.375940304Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest 
\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 2.671792948s" May 15 12:26:29.376735 containerd[1734]: time="2025-05-15T12:26:29.376706747Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 2.670375192s" May 15 12:26:29.485624 containerd[1734]: time="2025-05-15T12:26:29.485593311Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 2.791478791s" May 15 12:26:30.082886 kubelet[2948]: W0515 12:26:30.082853 2948 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.200.8.35:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.8.35:6443: connect: connection refused May 15 12:26:30.480973 kubelet[2948]: E0515 12:26:30.082894 2948 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://10.200.8.35:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.8.35:6443: connect: connection refused May 15 12:26:30.480973 kubelet[2948]: W0515 12:26:30.208986 2948 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.200.8.35:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.200.8.35:6443: connect: connection refused May 15 12:26:30.480973 kubelet[2948]: E0515 12:26:30.209028 2948 reflector.go:150] 
k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.200.8.35:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.200.8.35:6443: connect: connection refused May 15 12:26:31.630707 containerd[1734]: time="2025-05-15T12:26:31.630660345Z" level=info msg="connecting to shim fe3b5393091ee576d32f3c456ad0b0f8c3351f3f0b276f6e0a88c687423cb30a" address="unix:///run/containerd/s/06005ec0ac0cc52fd372c37db68d1736387427e29bbe4907bcdab27d16f478ac" namespace=k8s.io protocol=ttrpc version=3 May 15 12:26:31.685883 containerd[1734]: time="2025-05-15T12:26:31.685831248Z" level=info msg="connecting to shim 0bd366c51f612378b249eb726925139573e5e1aa4c4ffcfa7fd42e7367a14d71" address="unix:///run/containerd/s/033459c24d244e6f392dc7141db5c301862a55715ce479f231a9948f4e37dce1" namespace=k8s.io protocol=ttrpc version=3 May 15 12:26:31.694338 systemd[1]: Started cri-containerd-fe3b5393091ee576d32f3c456ad0b0f8c3351f3f0b276f6e0a88c687423cb30a.scope - libcontainer container fe3b5393091ee576d32f3c456ad0b0f8c3351f3f0b276f6e0a88c687423cb30a. May 15 12:26:31.703256 systemd[1]: Started cri-containerd-0bd366c51f612378b249eb726925139573e5e1aa4c4ffcfa7fd42e7367a14d71.scope - libcontainer container 0bd366c51f612378b249eb726925139573e5e1aa4c4ffcfa7fd42e7367a14d71. 
May 15 12:26:31.747756 containerd[1734]: time="2025-05-15T12:26:31.747664308Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4334.0.0-a-9b1bbdffc7,Uid:14bc05342636079e576e84a577241a3c,Namespace:kube-system,Attempt:0,} returns sandbox id \"fe3b5393091ee576d32f3c456ad0b0f8c3351f3f0b276f6e0a88c687423cb30a\"" May 15 12:26:31.751700 containerd[1734]: time="2025-05-15T12:26:31.751334084Z" level=info msg="connecting to shim c0e8f7b363b0759157c313e0bcdb91d417ea700e4ad91a8928099692154e9531" address="unix:///run/containerd/s/03b363c18ac6cfb155fbb9ce094dea3b83cf8ccb82ad1a88c3769e219e230525" namespace=k8s.io protocol=ttrpc version=3 May 15 12:26:31.754041 containerd[1734]: time="2025-05-15T12:26:31.754014921Z" level=info msg="CreateContainer within sandbox \"fe3b5393091ee576d32f3c456ad0b0f8c3351f3f0b276f6e0a88c687423cb30a\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" May 15 12:26:31.768292 systemd[1]: Started cri-containerd-c0e8f7b363b0759157c313e0bcdb91d417ea700e4ad91a8928099692154e9531.scope - libcontainer container c0e8f7b363b0759157c313e0bcdb91d417ea700e4ad91a8928099692154e9531. 
May 15 12:26:31.781831 containerd[1734]: time="2025-05-15T12:26:31.781809995Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4334.0.0-a-9b1bbdffc7,Uid:6e1cd5b086316ecabbcf11059c23db04,Namespace:kube-system,Attempt:0,} returns sandbox id \"0bd366c51f612378b249eb726925139573e5e1aa4c4ffcfa7fd42e7367a14d71\"" May 15 12:26:31.783620 containerd[1734]: time="2025-05-15T12:26:31.783602051Z" level=info msg="CreateContainer within sandbox \"0bd366c51f612378b249eb726925139573e5e1aa4c4ffcfa7fd42e7367a14d71\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" May 15 12:26:31.841963 kubelet[2948]: E0515 12:26:31.841942 2948 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://10.200.8.35:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 10.200.8.35:6443: connect: connection refused May 15 12:26:31.860485 kubelet[2948]: E0515 12:26:31.860456 2948 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.35:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4334.0.0-a-9b1bbdffc7?timeout=10s\": dial tcp 10.200.8.35:6443: connect: connection refused" interval="6.4s" May 15 12:26:31.877177 containerd[1734]: time="2025-05-15T12:26:31.877153767Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4334.0.0-a-9b1bbdffc7,Uid:edc8b565b56544f631a75dfe2f535e88,Namespace:kube-system,Attempt:0,} returns sandbox id \"c0e8f7b363b0759157c313e0bcdb91d417ea700e4ad91a8928099692154e9531\"" May 15 12:26:31.878977 containerd[1734]: time="2025-05-15T12:26:31.878955126Z" level=info msg="CreateContainer within sandbox \"c0e8f7b363b0759157c313e0bcdb91d417ea700e4ad91a8928099692154e9531\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" May 15 12:26:31.927497 kubelet[2948]: W0515 
12:26:31.927468 2948 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.200.8.35:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4334.0.0-a-9b1bbdffc7&limit=500&resourceVersion=0": dial tcp 10.200.8.35:6443: connect: connection refused May 15 12:26:31.927557 kubelet[2948]: E0515 12:26:31.927507 2948 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://10.200.8.35:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4334.0.0-a-9b1bbdffc7&limit=500&resourceVersion=0": dial tcp 10.200.8.35:6443: connect: connection refused May 15 12:26:31.966998 kubelet[2948]: I0515 12:26:31.966970 2948 kubelet_node_status.go:73] "Attempting to register node" node="ci-4334.0.0-a-9b1bbdffc7" May 15 12:26:31.967249 kubelet[2948]: E0515 12:26:31.967227 2948 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.200.8.35:6443/api/v1/nodes\": dial tcp 10.200.8.35:6443: connect: connection refused" node="ci-4334.0.0-a-9b1bbdffc7" May 15 12:26:32.272401 containerd[1734]: time="2025-05-15T12:26:32.272318243Z" level=info msg="Container 89840f2815328e0f0000d050449d9d28b646ed9f17378e82b68b085852060e07: CDI devices from CRI Config.CDIDevices: []" May 15 12:26:32.274587 containerd[1734]: time="2025-05-15T12:26:32.274537298Z" level=info msg="Container 6fb03d3ad558c05de6e7df7af2b7343d3f1b4b24f9c6fadcb2f3c42dc4196d8c: CDI devices from CRI Config.CDIDevices: []" May 15 12:26:32.430773 containerd[1734]: time="2025-05-15T12:26:32.430726472Z" level=info msg="Container a8824afbdeff9232c5f692445a308d6293067ca4db723734293ae0441cdd3edc: CDI devices from CRI Config.CDIDevices: []" May 15 12:26:32.575032 containerd[1734]: time="2025-05-15T12:26:32.574930372Z" level=info msg="CreateContainer within sandbox \"fe3b5393091ee576d32f3c456ad0b0f8c3351f3f0b276f6e0a88c687423cb30a\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id 
\"89840f2815328e0f0000d050449d9d28b646ed9f17378e82b68b085852060e07\"" May 15 12:26:32.575670 containerd[1734]: time="2025-05-15T12:26:32.575644523Z" level=info msg="StartContainer for \"89840f2815328e0f0000d050449d9d28b646ed9f17378e82b68b085852060e07\"" May 15 12:26:32.576726 containerd[1734]: time="2025-05-15T12:26:32.576685822Z" level=info msg="connecting to shim 89840f2815328e0f0000d050449d9d28b646ed9f17378e82b68b085852060e07" address="unix:///run/containerd/s/06005ec0ac0cc52fd372c37db68d1736387427e29bbe4907bcdab27d16f478ac" protocol=ttrpc version=3 May 15 12:26:32.595320 systemd[1]: Started cri-containerd-89840f2815328e0f0000d050449d9d28b646ed9f17378e82b68b085852060e07.scope - libcontainer container 89840f2815328e0f0000d050449d9d28b646ed9f17378e82b68b085852060e07. May 15 12:26:32.686433 containerd[1734]: time="2025-05-15T12:26:32.686397680Z" level=info msg="StartContainer for \"89840f2815328e0f0000d050449d9d28b646ed9f17378e82b68b085852060e07\" returns successfully" May 15 12:26:32.731378 containerd[1734]: time="2025-05-15T12:26:32.731319224Z" level=info msg="CreateContainer within sandbox \"c0e8f7b363b0759157c313e0bcdb91d417ea700e4ad91a8928099692154e9531\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"a8824afbdeff9232c5f692445a308d6293067ca4db723734293ae0441cdd3edc\"" May 15 12:26:32.731912 containerd[1734]: time="2025-05-15T12:26:32.731898520Z" level=info msg="CreateContainer within sandbox \"0bd366c51f612378b249eb726925139573e5e1aa4c4ffcfa7fd42e7367a14d71\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"6fb03d3ad558c05de6e7df7af2b7343d3f1b4b24f9c6fadcb2f3c42dc4196d8c\"" May 15 12:26:32.732225 containerd[1734]: time="2025-05-15T12:26:32.732188281Z" level=info msg="StartContainer for \"a8824afbdeff9232c5f692445a308d6293067ca4db723734293ae0441cdd3edc\"" May 15 12:26:32.733057 containerd[1734]: time="2025-05-15T12:26:32.733040478Z" level=info msg="connecting to shim 
a8824afbdeff9232c5f692445a308d6293067ca4db723734293ae0441cdd3edc" address="unix:///run/containerd/s/03b363c18ac6cfb155fbb9ce094dea3b83cf8ccb82ad1a88c3769e219e230525" protocol=ttrpc version=3 May 15 12:26:32.733515 containerd[1734]: time="2025-05-15T12:26:32.733441810Z" level=info msg="StartContainer for \"6fb03d3ad558c05de6e7df7af2b7343d3f1b4b24f9c6fadcb2f3c42dc4196d8c\"" May 15 12:26:32.737293 containerd[1734]: time="2025-05-15T12:26:32.734480678Z" level=info msg="connecting to shim 6fb03d3ad558c05de6e7df7af2b7343d3f1b4b24f9c6fadcb2f3c42dc4196d8c" address="unix:///run/containerd/s/033459c24d244e6f392dc7141db5c301862a55715ce479f231a9948f4e37dce1" protocol=ttrpc version=3 May 15 12:26:32.762404 systemd[1]: Started cri-containerd-6fb03d3ad558c05de6e7df7af2b7343d3f1b4b24f9c6fadcb2f3c42dc4196d8c.scope - libcontainer container 6fb03d3ad558c05de6e7df7af2b7343d3f1b4b24f9c6fadcb2f3c42dc4196d8c. May 15 12:26:32.770320 systemd[1]: Started cri-containerd-a8824afbdeff9232c5f692445a308d6293067ca4db723734293ae0441cdd3edc.scope - libcontainer container a8824afbdeff9232c5f692445a308d6293067ca4db723734293ae0441cdd3edc. 
May 15 12:26:32.830496 containerd[1734]: time="2025-05-15T12:26:32.830405782Z" level=info msg="StartContainer for \"6fb03d3ad558c05de6e7df7af2b7343d3f1b4b24f9c6fadcb2f3c42dc4196d8c\" returns successfully" May 15 12:26:32.854359 containerd[1734]: time="2025-05-15T12:26:32.854339219Z" level=info msg="StartContainer for \"a8824afbdeff9232c5f692445a308d6293067ca4db723734293ae0441cdd3edc\" returns successfully" May 15 12:26:34.625971 kubelet[2948]: E0515 12:26:34.625897 2948 csi_plugin.go:308] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "ci-4334.0.0-a-9b1bbdffc7" not found May 15 12:26:35.093053 kubelet[2948]: E0515 12:26:35.093010 2948 csi_plugin.go:308] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "ci-4334.0.0-a-9b1bbdffc7" not found May 15 12:26:35.572002 kubelet[2948]: E0515 12:26:35.571972 2948 csi_plugin.go:308] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "ci-4334.0.0-a-9b1bbdffc7" not found May 15 12:26:36.058493 kubelet[2948]: E0515 12:26:36.058206 2948 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4334.0.0-a-9b1bbdffc7\" not found" May 15 12:26:36.523530 kubelet[2948]: E0515 12:26:36.523483 2948 csi_plugin.go:308] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "ci-4334.0.0-a-9b1bbdffc7" not found May 15 12:26:38.264004 kubelet[2948]: E0515 12:26:38.263966 2948 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4334.0.0-a-9b1bbdffc7\" not found" node="ci-4334.0.0-a-9b1bbdffc7" May 15 12:26:38.369917 kubelet[2948]: I0515 12:26:38.369693 2948 kubelet_node_status.go:73] "Attempting to register node" node="ci-4334.0.0-a-9b1bbdffc7" May 15 12:26:38.375626 
kubelet[2948]: I0515 12:26:38.375573 2948 kubelet_node_status.go:76] "Successfully registered node" node="ci-4334.0.0-a-9b1bbdffc7" May 15 12:26:38.380405 kubelet[2948]: E0515 12:26:38.380376 2948 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-4334.0.0-a-9b1bbdffc7\" not found" May 15 12:26:38.480903 kubelet[2948]: E0515 12:26:38.480875 2948 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-4334.0.0-a-9b1bbdffc7\" not found" May 15 12:26:38.581339 kubelet[2948]: E0515 12:26:38.581118 2948 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-4334.0.0-a-9b1bbdffc7\" not found" May 15 12:26:38.682085 kubelet[2948]: E0515 12:26:38.682061 2948 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-4334.0.0-a-9b1bbdffc7\" not found" May 15 12:26:38.782466 kubelet[2948]: E0515 12:26:38.782432 2948 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-4334.0.0-a-9b1bbdffc7\" not found" May 15 12:26:38.831522 systemd[1]: Reload requested from client PID 3219 ('systemctl') (unit session-9.scope)... May 15 12:26:38.831537 systemd[1]: Reloading... May 15 12:26:38.884428 kubelet[2948]: E0515 12:26:38.884406 2948 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-4334.0.0-a-9b1bbdffc7\" not found" May 15 12:26:38.909192 zram_generator::config[3260]: No configuration found. May 15 12:26:38.984911 kubelet[2948]: E0515 12:26:38.984889 2948 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-4334.0.0-a-9b1bbdffc7\" not found" May 15 12:26:38.990437 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 15 12:26:39.079945 systemd[1]: Reloading finished in 248 ms. 
May 15 12:26:39.085421 kubelet[2948]: E0515 12:26:39.085370 2948 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-4334.0.0-a-9b1bbdffc7\" not found" May 15 12:26:39.098520 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:26:39.108044 systemd[1]: kubelet.service: Deactivated successfully. May 15 12:26:39.108282 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:26:39.108324 systemd[1]: kubelet.service: Consumed 461ms CPU time, 111.6M memory peak. May 15 12:26:39.109556 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:26:41.356915 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:26:41.361124 (kubelet)[3331]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 15 12:26:41.396218 kubelet[3331]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 15 12:26:41.396218 kubelet[3331]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. May 15 12:26:41.396218 kubelet[3331]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
May 15 12:26:41.396443 kubelet[3331]: I0515 12:26:41.396274 3331 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 15 12:26:41.399991 kubelet[3331]: I0515 12:26:41.399973 3331 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" May 15 12:26:41.399991 kubelet[3331]: I0515 12:26:41.399988 3331 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 15 12:26:41.400141 kubelet[3331]: I0515 12:26:41.400131 3331 server.go:927] "Client rotation is on, will bootstrap in background" May 15 12:26:41.400969 kubelet[3331]: I0515 12:26:41.400955 3331 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". May 15 12:26:41.401935 kubelet[3331]: I0515 12:26:41.401832 3331 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 15 12:26:41.407187 kubelet[3331]: I0515 12:26:41.407158 3331 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" May 15 12:26:41.407456 kubelet[3331]: I0515 12:26:41.407434 3331 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 15 12:26:41.407605 kubelet[3331]: I0515 12:26:41.407494 3331 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4334.0.0-a-9b1bbdffc7","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} May 15 12:26:41.407699 kubelet[3331]: I0515 12:26:41.407694 3331 topology_manager.go:138] "Creating topology manager with none policy" May 
15 12:26:41.407725 kubelet[3331]: I0515 12:26:41.407722 3331 container_manager_linux.go:301] "Creating device plugin manager" May 15 12:26:41.407773 kubelet[3331]: I0515 12:26:41.407770 3331 state_mem.go:36] "Initialized new in-memory state store" May 15 12:26:41.407861 kubelet[3331]: I0515 12:26:41.407856 3331 kubelet.go:400] "Attempting to sync node with API server" May 15 12:26:41.407888 kubelet[3331]: I0515 12:26:41.407880 3331 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" May 15 12:26:41.407914 kubelet[3331]: I0515 12:26:41.407902 3331 kubelet.go:312] "Adding apiserver pod source" May 15 12:26:41.407935 kubelet[3331]: I0515 12:26:41.407926 3331 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 15 12:26:41.408907 kubelet[3331]: I0515 12:26:41.408891 3331 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" May 15 12:26:41.409036 kubelet[3331]: I0515 12:26:41.409027 3331 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 15 12:26:41.409378 kubelet[3331]: I0515 12:26:41.409368 3331 server.go:1264] "Started kubelet" May 15 12:26:41.413573 kubelet[3331]: I0515 12:26:41.413485 3331 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 15 12:26:41.415249 kubelet[3331]: I0515 12:26:41.414562 3331 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 May 15 12:26:41.416044 kubelet[3331]: I0515 12:26:41.416032 3331 server.go:455] "Adding debug handlers to kubelet server" May 15 12:26:41.419006 kubelet[3331]: I0515 12:26:41.418009 3331 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 15 12:26:41.419237 kubelet[3331]: I0515 12:26:41.419225 3331 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 15 12:26:41.424467 kubelet[3331]: I0515 12:26:41.424343 3331 
volume_manager.go:291] "Starting Kubelet Volume Manager" May 15 12:26:41.424758 kubelet[3331]: I0515 12:26:41.424748 3331 reconciler.go:26] "Reconciler: start to sync state" May 15 12:26:41.427599 kubelet[3331]: I0515 12:26:41.427587 3331 desired_state_of_world_populator.go:149] "Desired state populator starts to run" May 15 12:26:41.428636 kubelet[3331]: I0515 12:26:41.428390 3331 factory.go:221] Registration of the systemd container factory successfully May 15 12:26:41.428636 kubelet[3331]: I0515 12:26:41.428462 3331 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 15 12:26:41.429869 kubelet[3331]: I0515 12:26:41.429853 3331 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 15 12:26:41.431526 kubelet[3331]: I0515 12:26:41.431267 3331 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" May 15 12:26:41.431526 kubelet[3331]: I0515 12:26:41.431288 3331 status_manager.go:217] "Starting to sync pod status with apiserver" May 15 12:26:41.431526 kubelet[3331]: I0515 12:26:41.431301 3331 kubelet.go:2337] "Starting kubelet main sync loop" May 15 12:26:41.431526 kubelet[3331]: E0515 12:26:41.431357 3331 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 15 12:26:41.433503 kubelet[3331]: I0515 12:26:41.433490 3331 factory.go:221] Registration of the containerd container factory successfully May 15 12:26:41.451999 kubelet[3331]: E0515 12:26:41.451984 3331 kubelet.go:1467] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 15 12:26:41.482259 kubelet[3331]: I0515 12:26:41.482243 3331 cpu_manager.go:214] "Starting CPU manager" policy="none" May 15 12:26:41.482259 kubelet[3331]: I0515 12:26:41.482254 3331 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" May 15 12:26:41.482354 kubelet[3331]: I0515 12:26:41.482280 3331 state_mem.go:36] "Initialized new in-memory state store" May 15 12:26:41.482399 kubelet[3331]: I0515 12:26:41.482386 3331 state_mem.go:88] "Updated default CPUSet" cpuSet="" May 15 12:26:41.482421 kubelet[3331]: I0515 12:26:41.482399 3331 state_mem.go:96] "Updated CPUSet assignments" assignments={} May 15 12:26:41.482421 kubelet[3331]: I0515 12:26:41.482413 3331 policy_none.go:49] "None policy: Start" May 15 12:26:41.482889 kubelet[3331]: I0515 12:26:41.482861 3331 memory_manager.go:170] "Starting memorymanager" policy="None" May 15 12:26:41.482960 kubelet[3331]: I0515 12:26:41.482895 3331 state_mem.go:35] "Initializing new in-memory state store" May 15 12:26:41.483041 kubelet[3331]: I0515 12:26:41.483030 3331 state_mem.go:75] "Updated machine memory state" May 15 12:26:41.486001 kubelet[3331]: I0515 12:26:41.485961 3331 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 15 12:26:41.526162 kubelet[3331]: I0515 12:26:41.526146 3331 kubelet_node_status.go:73] "Attempting to register node" node="ci-4334.0.0-a-9b1bbdffc7" May 15 12:26:41.531940 kubelet[3331]: E0515 12:26:41.531921 3331 kubelet.go:2361] "Skipping pod synchronization" err="container runtime status check may not have completed yet" May 15 12:26:41.670799 kubelet[3331]: I0515 12:26:41.670760 3331 kubelet_node_status.go:112] "Node was previously registered" node="ci-4334.0.0-a-9b1bbdffc7" May 15 12:26:41.732025 kubelet[3331]: E0515 12:26:41.732006 3331 kubelet.go:2361] "Skipping pod synchronization" err="container runtime status check 
may not have completed yet" May 15 12:26:42.070949 kubelet[3331]: I0515 12:26:42.069893 3331 kubelet_node_status.go:76] "Successfully registered node" node="ci-4334.0.0-a-9b1bbdffc7" May 15 12:26:42.070949 kubelet[3331]: I0515 12:26:42.069974 3331 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 15 12:26:42.071149 kubelet[3331]: I0515 12:26:42.071136 3331 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 15 12:26:42.132690 kubelet[3331]: I0515 12:26:42.132667 3331 topology_manager.go:215] "Topology Admit Handler" podUID="14bc05342636079e576e84a577241a3c" podNamespace="kube-system" podName="kube-apiserver-ci-4334.0.0-a-9b1bbdffc7" May 15 12:26:42.132767 kubelet[3331]: I0515 12:26:42.132743 3331 topology_manager.go:215] "Topology Admit Handler" podUID="edc8b565b56544f631a75dfe2f535e88" podNamespace="kube-system" podName="kube-controller-manager-ci-4334.0.0-a-9b1bbdffc7" May 15 12:26:42.132805 kubelet[3331]: I0515 12:26:42.132796 3331 topology_manager.go:215] "Topology Admit Handler" podUID="6e1cd5b086316ecabbcf11059c23db04" podNamespace="kube-system" podName="kube-scheduler-ci-4334.0.0-a-9b1bbdffc7" May 15 12:26:42.142880 kubelet[3331]: W0515 12:26:42.142861 3331 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 15 12:26:42.145967 kubelet[3331]: W0515 12:26:42.145796 3331 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 15 12:26:42.145967 kubelet[3331]: W0515 12:26:42.145835 3331 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 15 12:26:42.329606 kubelet[3331]: I0515 12:26:42.329493 3331 reconciler_common.go:247] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/edc8b565b56544f631a75dfe2f535e88-k8s-certs\") pod \"kube-controller-manager-ci-4334.0.0-a-9b1bbdffc7\" (UID: \"edc8b565b56544f631a75dfe2f535e88\") " pod="kube-system/kube-controller-manager-ci-4334.0.0-a-9b1bbdffc7" May 15 12:26:42.329606 kubelet[3331]: I0515 12:26:42.329551 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/edc8b565b56544f631a75dfe2f535e88-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4334.0.0-a-9b1bbdffc7\" (UID: \"edc8b565b56544f631a75dfe2f535e88\") " pod="kube-system/kube-controller-manager-ci-4334.0.0-a-9b1bbdffc7" May 15 12:26:42.329606 kubelet[3331]: I0515 12:26:42.329581 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6e1cd5b086316ecabbcf11059c23db04-kubeconfig\") pod \"kube-scheduler-ci-4334.0.0-a-9b1bbdffc7\" (UID: \"6e1cd5b086316ecabbcf11059c23db04\") " pod="kube-system/kube-scheduler-ci-4334.0.0-a-9b1bbdffc7" May 15 12:26:42.329606 kubelet[3331]: I0515 12:26:42.329603 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/14bc05342636079e576e84a577241a3c-ca-certs\") pod \"kube-apiserver-ci-4334.0.0-a-9b1bbdffc7\" (UID: \"14bc05342636079e576e84a577241a3c\") " pod="kube-system/kube-apiserver-ci-4334.0.0-a-9b1bbdffc7" May 15 12:26:42.329737 kubelet[3331]: I0515 12:26:42.329621 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/14bc05342636079e576e84a577241a3c-k8s-certs\") pod \"kube-apiserver-ci-4334.0.0-a-9b1bbdffc7\" (UID: \"14bc05342636079e576e84a577241a3c\") " 
pod="kube-system/kube-apiserver-ci-4334.0.0-a-9b1bbdffc7" May 15 12:26:42.329737 kubelet[3331]: I0515 12:26:42.329641 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/14bc05342636079e576e84a577241a3c-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4334.0.0-a-9b1bbdffc7\" (UID: \"14bc05342636079e576e84a577241a3c\") " pod="kube-system/kube-apiserver-ci-4334.0.0-a-9b1bbdffc7" May 15 12:26:42.329737 kubelet[3331]: I0515 12:26:42.329658 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/edc8b565b56544f631a75dfe2f535e88-ca-certs\") pod \"kube-controller-manager-ci-4334.0.0-a-9b1bbdffc7\" (UID: \"edc8b565b56544f631a75dfe2f535e88\") " pod="kube-system/kube-controller-manager-ci-4334.0.0-a-9b1bbdffc7" May 15 12:26:42.329737 kubelet[3331]: I0515 12:26:42.329680 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/edc8b565b56544f631a75dfe2f535e88-flexvolume-dir\") pod \"kube-controller-manager-ci-4334.0.0-a-9b1bbdffc7\" (UID: \"edc8b565b56544f631a75dfe2f535e88\") " pod="kube-system/kube-controller-manager-ci-4334.0.0-a-9b1bbdffc7" May 15 12:26:42.329737 kubelet[3331]: I0515 12:26:42.329695 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/edc8b565b56544f631a75dfe2f535e88-kubeconfig\") pod \"kube-controller-manager-ci-4334.0.0-a-9b1bbdffc7\" (UID: \"edc8b565b56544f631a75dfe2f535e88\") " pod="kube-system/kube-controller-manager-ci-4334.0.0-a-9b1bbdffc7" May 15 12:26:42.408548 kubelet[3331]: I0515 12:26:42.408534 3331 apiserver.go:52] "Watching apiserver" May 15 12:26:42.428355 kubelet[3331]: I0515 12:26:42.428314 3331 
desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" May 15 12:26:42.462871 kubelet[3331]: I0515 12:26:42.462749 3331 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4334.0.0-a-9b1bbdffc7" podStartSLOduration=0.462722522 podStartE2EDuration="462.722522ms" podCreationTimestamp="2025-05-15 12:26:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-15 12:26:42.462523289 +0000 UTC m=+1.097856732" watchObservedRunningTime="2025-05-15 12:26:42.462722522 +0000 UTC m=+1.098055970" May 15 12:26:42.521824 kubelet[3331]: I0515 12:26:42.521671 3331 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4334.0.0-a-9b1bbdffc7" podStartSLOduration=0.521658136 podStartE2EDuration="521.658136ms" podCreationTimestamp="2025-05-15 12:26:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-15 12:26:42.52163068 +0000 UTC m=+1.156964120" watchObservedRunningTime="2025-05-15 12:26:42.521658136 +0000 UTC m=+1.156991577" May 15 12:26:42.521824 kubelet[3331]: I0515 12:26:42.521743 3331 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4334.0.0-a-9b1bbdffc7" podStartSLOduration=0.521738333 podStartE2EDuration="521.738333ms" podCreationTimestamp="2025-05-15 12:26:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-15 12:26:42.469890518 +0000 UTC m=+1.105223962" watchObservedRunningTime="2025-05-15 12:26:42.521738333 +0000 UTC m=+1.157071776" May 15 12:26:48.526120 sudo[2155]: pam_unix(sudo:session): session closed for user root May 15 12:26:48.632684 sshd[2154]: Connection closed by 10.200.16.10 port 56680 May 15 
12:26:48.632938 sshd-session[2152]: pam_unix(sshd:session): session closed for user core May 15 12:26:48.636283 systemd[1]: sshd@6-10.200.8.35:22-10.200.16.10:56680.service: Deactivated successfully. May 15 12:26:48.637917 systemd[1]: session-9.scope: Deactivated successfully. May 15 12:26:48.638079 systemd[1]: session-9.scope: Consumed 3.072s CPU time, 248.1M memory peak. May 15 12:26:48.639195 systemd-logind[1702]: Session 9 logged out. Waiting for processes to exit. May 15 12:26:48.640383 systemd-logind[1702]: Removed session 9. May 15 12:26:52.092508 kubelet[3331]: I0515 12:26:52.092356 3331 kuberuntime_manager.go:1523] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" May 15 12:26:52.093394 containerd[1734]: time="2025-05-15T12:26:52.093235161Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." May 15 12:26:52.094383 kubelet[3331]: I0515 12:26:52.094367 3331 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" May 15 12:26:52.498775 kubelet[3331]: I0515 12:26:52.498747 3331 topology_manager.go:215] "Topology Admit Handler" podUID="a95e43b4-181d-442d-814c-93b9147b47c3" podNamespace="kube-system" podName="kube-proxy-z49b5" May 15 12:26:52.505076 systemd[1]: Created slice kubepods-besteffort-poda95e43b4_181d_442d_814c_93b9147b47c3.slice - libcontainer container kubepods-besteffort-poda95e43b4_181d_442d_814c_93b9147b47c3.slice. 
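The `metadata.name: this is used in the Pod's hostname ... [must not contain dots]` warnings at 12:26:42 fire because the static pod names embed the node name `ci-4334.0.0-a-9b1bbdffc7`, which contains dots and therefore is not a valid RFC 1123 DNS label. A rough sketch of that label rule, assuming the standard Kubernetes validation pattern; the helper name and the dot-free variant are illustrative, only the node name comes from the log:

```shell
# Sketch of the RFC 1123 DNS-label check behind the kubelet warning:
# lowercase alphanumerics and '-', must start/end alphanumeric, <= 63 chars.
is_dns1123_label() {
  [ "${#1}" -le 63 ] && printf '%s' "$1" | grep -Eq '^[a-z0-9]([-a-z0-9]*[a-z0-9])?$'
}

# The node name from the log contains dots, so the check fails:
is_dns1123_label "ci-4334.0.0-a-9b1bbdffc7" || echo "dots: rejected"
# A hypothetical all-hyphen variant would pass:
is_dns1123_label "ci-4334-0-0-a-9b1bbdffc7" && echo "hyphens only: accepted"
```

The kubelet still admits the pods (the messages are warnings, not errors); the name is merely flagged because it can surface in the pod's hostname.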
May 15 12:26:52.596924 kubelet[3331]: I0515 12:26:52.596903 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/a95e43b4-181d-442d-814c-93b9147b47c3-kube-proxy\") pod \"kube-proxy-z49b5\" (UID: \"a95e43b4-181d-442d-814c-93b9147b47c3\") " pod="kube-system/kube-proxy-z49b5" May 15 12:26:52.597025 kubelet[3331]: I0515 12:26:52.596928 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctv7c\" (UniqueName: \"kubernetes.io/projected/a95e43b4-181d-442d-814c-93b9147b47c3-kube-api-access-ctv7c\") pod \"kube-proxy-z49b5\" (UID: \"a95e43b4-181d-442d-814c-93b9147b47c3\") " pod="kube-system/kube-proxy-z49b5" May 15 12:26:52.597025 kubelet[3331]: I0515 12:26:52.596949 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/a95e43b4-181d-442d-814c-93b9147b47c3-xtables-lock\") pod \"kube-proxy-z49b5\" (UID: \"a95e43b4-181d-442d-814c-93b9147b47c3\") " pod="kube-system/kube-proxy-z49b5" May 15 12:26:52.597025 kubelet[3331]: I0515 12:26:52.596965 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a95e43b4-181d-442d-814c-93b9147b47c3-lib-modules\") pod \"kube-proxy-z49b5\" (UID: \"a95e43b4-181d-442d-814c-93b9147b47c3\") " pod="kube-system/kube-proxy-z49b5" May 15 12:26:52.813368 containerd[1734]: time="2025-05-15T12:26:52.813290937Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-z49b5,Uid:a95e43b4-181d-442d-814c-93b9147b47c3,Namespace:kube-system,Attempt:0,}" May 15 12:26:52.848494 containerd[1734]: time="2025-05-15T12:26:52.848434911Z" level=info msg="connecting to shim dfbb34f9ab749c337203534452a02b287fd016f07b92428b5f14caf0441a4768" 
address="unix:///run/containerd/s/5bd96df5356bc2136f0fa14971aca9f9f414bb149e71f3dba996c540008f22df" namespace=k8s.io protocol=ttrpc version=3 May 15 12:26:52.867437 systemd[1]: Started cri-containerd-dfbb34f9ab749c337203534452a02b287fd016f07b92428b5f14caf0441a4768.scope - libcontainer container dfbb34f9ab749c337203534452a02b287fd016f07b92428b5f14caf0441a4768. May 15 12:26:52.891348 containerd[1734]: time="2025-05-15T12:26:52.891329659Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-z49b5,Uid:a95e43b4-181d-442d-814c-93b9147b47c3,Namespace:kube-system,Attempt:0,} returns sandbox id \"dfbb34f9ab749c337203534452a02b287fd016f07b92428b5f14caf0441a4768\"" May 15 12:26:52.893710 containerd[1734]: time="2025-05-15T12:26:52.893684795Z" level=info msg="CreateContainer within sandbox \"dfbb34f9ab749c337203534452a02b287fd016f07b92428b5f14caf0441a4768\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" May 15 12:26:52.913869 containerd[1734]: time="2025-05-15T12:26:52.913103639Z" level=info msg="Container 0602a709a6f21c4ea91452368c9ff17cf0c705791a0162979e6bb79474a8568c: CDI devices from CRI Config.CDIDevices: []" May 15 12:26:52.925255 containerd[1734]: time="2025-05-15T12:26:52.925231632Z" level=info msg="CreateContainer within sandbox \"dfbb34f9ab749c337203534452a02b287fd016f07b92428b5f14caf0441a4768\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"0602a709a6f21c4ea91452368c9ff17cf0c705791a0162979e6bb79474a8568c\"" May 15 12:26:52.925585 containerd[1734]: time="2025-05-15T12:26:52.925566857Z" level=info msg="StartContainer for \"0602a709a6f21c4ea91452368c9ff17cf0c705791a0162979e6bb79474a8568c\"" May 15 12:26:52.926385 containerd[1734]: time="2025-05-15T12:26:52.926362765Z" level=info msg="connecting to shim 0602a709a6f21c4ea91452368c9ff17cf0c705791a0162979e6bb79474a8568c" address="unix:///run/containerd/s/5bd96df5356bc2136f0fa14971aca9f9f414bb149e71f3dba996c540008f22df" protocol=ttrpc version=3 May 15 
12:26:52.939279 systemd[1]: Started cri-containerd-0602a709a6f21c4ea91452368c9ff17cf0c705791a0162979e6bb79474a8568c.scope - libcontainer container 0602a709a6f21c4ea91452368c9ff17cf0c705791a0162979e6bb79474a8568c. May 15 12:26:52.963334 containerd[1734]: time="2025-05-15T12:26:52.963302228Z" level=info msg="StartContainer for \"0602a709a6f21c4ea91452368c9ff17cf0c705791a0162979e6bb79474a8568c\" returns successfully" May 15 12:26:53.127296 kubelet[3331]: I0515 12:26:53.126864 3331 topology_manager.go:215] "Topology Admit Handler" podUID="e1ac0940-80ff-4deb-8067-60c2d5baace8" podNamespace="tigera-operator" podName="tigera-operator-797db67f8-9qtfl" May 15 12:26:53.135017 systemd[1]: Created slice kubepods-besteffort-pode1ac0940_80ff_4deb_8067_60c2d5baace8.slice - libcontainer container kubepods-besteffort-pode1ac0940_80ff_4deb_8067_60c2d5baace8.slice. May 15 12:26:53.200868 kubelet[3331]: I0515 12:26:53.200820 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/e1ac0940-80ff-4deb-8067-60c2d5baace8-var-lib-calico\") pod \"tigera-operator-797db67f8-9qtfl\" (UID: \"e1ac0940-80ff-4deb-8067-60c2d5baace8\") " pod="tigera-operator/tigera-operator-797db67f8-9qtfl" May 15 12:26:53.200868 kubelet[3331]: I0515 12:26:53.200857 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd68h\" (UniqueName: \"kubernetes.io/projected/e1ac0940-80ff-4deb-8067-60c2d5baace8-kube-api-access-vd68h\") pod \"tigera-operator-797db67f8-9qtfl\" (UID: \"e1ac0940-80ff-4deb-8067-60c2d5baace8\") " pod="tigera-operator/tigera-operator-797db67f8-9qtfl" May 15 12:26:53.437420 containerd[1734]: time="2025-05-15T12:26:53.437395501Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-797db67f8-9qtfl,Uid:e1ac0940-80ff-4deb-8067-60c2d5baace8,Namespace:tigera-operator,Attempt:0,}" May 15 12:26:53.473881 containerd[1734]: 
time="2025-05-15T12:26:53.473466818Z" level=info msg="connecting to shim 39c87e7d99f6d4f529e055a45ea39e17ad732c7988d1574f97a5b475d78a9f79" address="unix:///run/containerd/s/004c2f65898010df526bb7f69aaf0dcc07b045e1a967210f818e60046a777a52" namespace=k8s.io protocol=ttrpc version=3 May 15 12:26:53.496918 kubelet[3331]: I0515 12:26:53.496877 3331 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-z49b5" podStartSLOduration=1.49686126 podStartE2EDuration="1.49686126s" podCreationTimestamp="2025-05-15 12:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-15 12:26:53.496699628 +0000 UTC m=+12.132033068" watchObservedRunningTime="2025-05-15 12:26:53.49686126 +0000 UTC m=+12.132194701" May 15 12:26:53.497490 systemd[1]: Started cri-containerd-39c87e7d99f6d4f529e055a45ea39e17ad732c7988d1574f97a5b475d78a9f79.scope - libcontainer container 39c87e7d99f6d4f529e055a45ea39e17ad732c7988d1574f97a5b475d78a9f79. May 15 12:26:53.536667 containerd[1734]: time="2025-05-15T12:26:53.536649716Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-797db67f8-9qtfl,Uid:e1ac0940-80ff-4deb-8067-60c2d5baace8,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"39c87e7d99f6d4f529e055a45ea39e17ad732c7988d1574f97a5b475d78a9f79\"" May 15 12:26:53.537975 containerd[1734]: time="2025-05-15T12:26:53.537943007Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.7\"" May 15 12:26:54.982720 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3904833696.mount: Deactivated successfully. 
May 15 12:26:55.774339 containerd[1734]: time="2025-05-15T12:26:55.774307401Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:26:55.776162 containerd[1734]: time="2025-05-15T12:26:55.776136781Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.7: active requests=0, bytes read=22002662" May 15 12:26:55.778261 containerd[1734]: time="2025-05-15T12:26:55.778226342Z" level=info msg="ImageCreate event name:\"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:26:55.782461 containerd[1734]: time="2025-05-15T12:26:55.782414535Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:a4a44422d8f2a14e0aaea2031ccb5580f2bf68218c9db444450c1888743305e9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:26:55.783065 containerd[1734]: time="2025-05-15T12:26:55.782790764Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.7\" with image id \"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\", repo tag \"quay.io/tigera/operator:v1.36.7\", repo digest \"quay.io/tigera/operator@sha256:a4a44422d8f2a14e0aaea2031ccb5580f2bf68218c9db444450c1888743305e9\", size \"21998657\" in 2.24482309s" May 15 12:26:55.783065 containerd[1734]: time="2025-05-15T12:26:55.782815939Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.7\" returns image reference \"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\"" May 15 12:26:55.784434 containerd[1734]: time="2025-05-15T12:26:55.784414978Z" level=info msg="CreateContainer within sandbox \"39c87e7d99f6d4f529e055a45ea39e17ad732c7988d1574f97a5b475d78a9f79\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" May 15 12:26:55.800610 containerd[1734]: time="2025-05-15T12:26:55.800589033Z" level=info msg="Container 
5e9f8d497a4c77c2c8f467a2a112801c111c138e741546ec0ce2afe8c4cacb89: CDI devices from CRI Config.CDIDevices: []" May 15 12:26:55.811647 containerd[1734]: time="2025-05-15T12:26:55.811624173Z" level=info msg="CreateContainer within sandbox \"39c87e7d99f6d4f529e055a45ea39e17ad732c7988d1574f97a5b475d78a9f79\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"5e9f8d497a4c77c2c8f467a2a112801c111c138e741546ec0ce2afe8c4cacb89\"" May 15 12:26:55.811954 containerd[1734]: time="2025-05-15T12:26:55.811935432Z" level=info msg="StartContainer for \"5e9f8d497a4c77c2c8f467a2a112801c111c138e741546ec0ce2afe8c4cacb89\"" May 15 12:26:55.812717 containerd[1734]: time="2025-05-15T12:26:55.812670886Z" level=info msg="connecting to shim 5e9f8d497a4c77c2c8f467a2a112801c111c138e741546ec0ce2afe8c4cacb89" address="unix:///run/containerd/s/004c2f65898010df526bb7f69aaf0dcc07b045e1a967210f818e60046a777a52" protocol=ttrpc version=3 May 15 12:26:55.828296 systemd[1]: Started cri-containerd-5e9f8d497a4c77c2c8f467a2a112801c111c138e741546ec0ce2afe8c4cacb89.scope - libcontainer container 5e9f8d497a4c77c2c8f467a2a112801c111c138e741546ec0ce2afe8c4cacb89. 
May 15 12:26:55.849955 containerd[1734]: time="2025-05-15T12:26:55.849888372Z" level=info msg="StartContainer for \"5e9f8d497a4c77c2c8f467a2a112801c111c138e741546ec0ce2afe8c4cacb89\" returns successfully" May 15 12:26:56.500844 kubelet[3331]: I0515 12:26:56.500804 3331 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-797db67f8-9qtfl" podStartSLOduration=2.254965971 podStartE2EDuration="4.500789696s" podCreationTimestamp="2025-05-15 12:26:52 +0000 UTC" firstStartedPulling="2025-05-15 12:26:53.537474879 +0000 UTC m=+12.172808313" lastFinishedPulling="2025-05-15 12:26:55.78329861 +0000 UTC m=+14.418632038" observedRunningTime="2025-05-15 12:26:56.500548789 +0000 UTC m=+15.135882249" watchObservedRunningTime="2025-05-15 12:26:56.500789696 +0000 UTC m=+15.136123138" May 15 12:26:59.813823 kubelet[3331]: I0515 12:26:59.813168 3331 topology_manager.go:215] "Topology Admit Handler" podUID="a28ba243-cea9-4d25-b6fc-ca52a3819ed6" podNamespace="calico-system" podName="calico-typha-8db6559cd-s256c" May 15 12:26:59.822692 systemd[1]: Created slice kubepods-besteffort-poda28ba243_cea9_4d25_b6fc_ca52a3819ed6.slice - libcontainer container kubepods-besteffort-poda28ba243_cea9_4d25_b6fc_ca52a3819ed6.slice. 
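The `pod_startup_latency_tracker` entry for tigera-operator just above can be cross-checked by hand: podStartE2EDuration is observedRunningTime minus podCreationTimestamp (12:26:56.500789696 − 12:26:52 = 4.500789696s), and podStartSLOduration additionally excludes the image-pull window, i.e. lastFinishedPulling minus firstStartedPulling taken from the monotonic `m=+` offsets. A small sketch using only the values recorded in the log, with awk doing the float arithmetic:

```shell
# Monotonic (m=+) offsets copied from the tigera-operator tracker entry:
# firstStartedPulling m=+12.172808313, lastFinishedPulling m=+14.418632038,
# end-to-end duration 4.500789696s (observedRunningTime - creation).
pull=$(awk 'BEGIN { printf "%.9f", 14.418632038 - 12.172808313 }')
slo=$(awk 'BEGIN { printf "%.9f", 4.500789696 - (14.418632038 - 12.172808313) }')
echo "pull=${pull}s podStartSLOduration=${slo}s"
```

This reproduces the logged values: a 2.245823725s pull and podStartSLOduration=2.254965971s.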
May 15 12:26:59.843732 kubelet[3331]: I0515 12:26:59.843697 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbfbb\" (UniqueName: \"kubernetes.io/projected/a28ba243-cea9-4d25-b6fc-ca52a3819ed6-kube-api-access-zbfbb\") pod \"calico-typha-8db6559cd-s256c\" (UID: \"a28ba243-cea9-4d25-b6fc-ca52a3819ed6\") " pod="calico-system/calico-typha-8db6559cd-s256c" May 15 12:26:59.843821 kubelet[3331]: I0515 12:26:59.843745 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/a28ba243-cea9-4d25-b6fc-ca52a3819ed6-typha-certs\") pod \"calico-typha-8db6559cd-s256c\" (UID: \"a28ba243-cea9-4d25-b6fc-ca52a3819ed6\") " pod="calico-system/calico-typha-8db6559cd-s256c" May 15 12:26:59.843821 kubelet[3331]: I0515 12:26:59.843767 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a28ba243-cea9-4d25-b6fc-ca52a3819ed6-tigera-ca-bundle\") pod \"calico-typha-8db6559cd-s256c\" (UID: \"a28ba243-cea9-4d25-b6fc-ca52a3819ed6\") " pod="calico-system/calico-typha-8db6559cd-s256c" May 15 12:27:00.127645 containerd[1734]: time="2025-05-15T12:27:00.127558952Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-8db6559cd-s256c,Uid:a28ba243-cea9-4d25-b6fc-ca52a3819ed6,Namespace:calico-system,Attempt:0,}" May 15 12:27:00.165798 containerd[1734]: time="2025-05-15T12:27:00.165759728Z" level=info msg="connecting to shim fb75360b9e2c61758d88df4c05fa45696cc612c10de6005a4571ae0eb5905bfd" address="unix:///run/containerd/s/11a48d8d1736d052f343238f2eb354970817af4a9e71f4f0656a5c125699d4d4" namespace=k8s.io protocol=ttrpc version=3 May 15 12:27:00.194302 systemd[1]: Started cri-containerd-fb75360b9e2c61758d88df4c05fa45696cc612c10de6005a4571ae0eb5905bfd.scope - libcontainer container 
fb75360b9e2c61758d88df4c05fa45696cc612c10de6005a4571ae0eb5905bfd. May 15 12:27:00.227090 containerd[1734]: time="2025-05-15T12:27:00.227032073Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-8db6559cd-s256c,Uid:a28ba243-cea9-4d25-b6fc-ca52a3819ed6,Namespace:calico-system,Attempt:0,} returns sandbox id \"fb75360b9e2c61758d88df4c05fa45696cc612c10de6005a4571ae0eb5905bfd\"" May 15 12:27:00.228367 containerd[1734]: time="2025-05-15T12:27:00.228287877Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.3\"" May 15 12:27:00.372945 kubelet[3331]: I0515 12:27:00.372920 3331 topology_manager.go:215] "Topology Admit Handler" podUID="358caab1-faf4-41bf-a17e-7eb09b4eaabd" podNamespace="calico-system" podName="calico-node-j67jc" May 15 12:27:00.381068 systemd[1]: Created slice kubepods-besteffort-pod358caab1_faf4_41bf_a17e_7eb09b4eaabd.slice - libcontainer container kubepods-besteffort-pod358caab1_faf4_41bf_a17e_7eb09b4eaabd.slice. May 15 12:27:00.447101 kubelet[3331]: I0515 12:27:00.447069 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/358caab1-faf4-41bf-a17e-7eb09b4eaabd-node-certs\") pod \"calico-node-j67jc\" (UID: \"358caab1-faf4-41bf-a17e-7eb09b4eaabd\") " pod="calico-system/calico-node-j67jc" May 15 12:27:00.447182 kubelet[3331]: I0515 12:27:00.447106 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/358caab1-faf4-41bf-a17e-7eb09b4eaabd-cni-log-dir\") pod \"calico-node-j67jc\" (UID: \"358caab1-faf4-41bf-a17e-7eb09b4eaabd\") " pod="calico-system/calico-node-j67jc" May 15 12:27:00.447182 kubelet[3331]: I0515 12:27:00.447122 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: 
\"kubernetes.io/host-path/358caab1-faf4-41bf-a17e-7eb09b4eaabd-flexvol-driver-host\") pod \"calico-node-j67jc\" (UID: \"358caab1-faf4-41bf-a17e-7eb09b4eaabd\") " pod="calico-system/calico-node-j67jc" May 15 12:27:00.447182 kubelet[3331]: I0515 12:27:00.447140 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/358caab1-faf4-41bf-a17e-7eb09b4eaabd-var-lib-calico\") pod \"calico-node-j67jc\" (UID: \"358caab1-faf4-41bf-a17e-7eb09b4eaabd\") " pod="calico-system/calico-node-j67jc" May 15 12:27:00.447182 kubelet[3331]: I0515 12:27:00.447155 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/358caab1-faf4-41bf-a17e-7eb09b4eaabd-tigera-ca-bundle\") pod \"calico-node-j67jc\" (UID: \"358caab1-faf4-41bf-a17e-7eb09b4eaabd\") " pod="calico-system/calico-node-j67jc" May 15 12:27:00.447278 kubelet[3331]: I0515 12:27:00.447181 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxqbx\" (UniqueName: \"kubernetes.io/projected/358caab1-faf4-41bf-a17e-7eb09b4eaabd-kube-api-access-bxqbx\") pod \"calico-node-j67jc\" (UID: \"358caab1-faf4-41bf-a17e-7eb09b4eaabd\") " pod="calico-system/calico-node-j67jc" May 15 12:27:00.447278 kubelet[3331]: I0515 12:27:00.447199 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/358caab1-faf4-41bf-a17e-7eb09b4eaabd-xtables-lock\") pod \"calico-node-j67jc\" (UID: \"358caab1-faf4-41bf-a17e-7eb09b4eaabd\") " pod="calico-system/calico-node-j67jc" May 15 12:27:00.447278 kubelet[3331]: I0515 12:27:00.447215 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: 
\"kubernetes.io/host-path/358caab1-faf4-41bf-a17e-7eb09b4eaabd-var-run-calico\") pod \"calico-node-j67jc\" (UID: \"358caab1-faf4-41bf-a17e-7eb09b4eaabd\") " pod="calico-system/calico-node-j67jc" May 15 12:27:00.447278 kubelet[3331]: I0515 12:27:00.447230 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/358caab1-faf4-41bf-a17e-7eb09b4eaabd-policysync\") pod \"calico-node-j67jc\" (UID: \"358caab1-faf4-41bf-a17e-7eb09b4eaabd\") " pod="calico-system/calico-node-j67jc" May 15 12:27:00.447278 kubelet[3331]: I0515 12:27:00.447246 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/358caab1-faf4-41bf-a17e-7eb09b4eaabd-cni-bin-dir\") pod \"calico-node-j67jc\" (UID: \"358caab1-faf4-41bf-a17e-7eb09b4eaabd\") " pod="calico-system/calico-node-j67jc" May 15 12:27:00.447372 kubelet[3331]: I0515 12:27:00.447262 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/358caab1-faf4-41bf-a17e-7eb09b4eaabd-lib-modules\") pod \"calico-node-j67jc\" (UID: \"358caab1-faf4-41bf-a17e-7eb09b4eaabd\") " pod="calico-system/calico-node-j67jc" May 15 12:27:00.447372 kubelet[3331]: I0515 12:27:00.447280 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/358caab1-faf4-41bf-a17e-7eb09b4eaabd-cni-net-dir\") pod \"calico-node-j67jc\" (UID: \"358caab1-faf4-41bf-a17e-7eb09b4eaabd\") " pod="calico-system/calico-node-j67jc" May 15 12:27:00.549476 kubelet[3331]: E0515 12:27:00.549455 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.550031 kubelet[3331]: W0515 12:27:00.549924 3331 driver-call.go:149] FlexVolume: driver 
call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.550031 kubelet[3331]: E0515 12:27:00.549954 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:00.554252 kubelet[3331]: E0515 12:27:00.554231 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.554252 kubelet[3331]: W0515 12:27:00.554252 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.554347 kubelet[3331]: E0515 12:27:00.554267 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:00.648634 kubelet[3331]: E0515 12:27:00.648619 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.648703 kubelet[3331]: W0515 12:27:00.648632 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.648703 kubelet[3331]: E0515 12:27:00.648649 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:00.671634 kubelet[3331]: I0515 12:27:00.670516 3331 topology_manager.go:215] "Topology Admit Handler" podUID="55ce30e6-eb73-4f69-aef9-927a3bcb6662" podNamespace="calico-system" podName="csi-node-driver-fthxd" May 15 12:27:00.671634 kubelet[3331]: E0515 12:27:00.670769 3331 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fthxd" podUID="55ce30e6-eb73-4f69-aef9-927a3bcb6662" May 15 12:27:00.682093 kubelet[3331]: E0515 12:27:00.682020 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.682093 kubelet[3331]: W0515 12:27:00.682035 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.682093 kubelet[3331]: E0515 12:27:00.682054 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:00.684597 containerd[1734]: time="2025-05-15T12:27:00.684571711Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-j67jc,Uid:358caab1-faf4-41bf-a17e-7eb09b4eaabd,Namespace:calico-system,Attempt:0,}" May 15 12:27:00.727244 containerd[1734]: time="2025-05-15T12:27:00.727141848Z" level=info msg="connecting to shim 586431069cbf84e50bf569b8f9a1266e7bd3d1cec927c1c2a0ffa0e1cd2c2278" address="unix:///run/containerd/s/d9d0270194fbe9d61146192750bea91b220ecf8b8fb1a869b075ad9752bd7591" namespace=k8s.io protocol=ttrpc version=3 May 15 12:27:00.745052 kubelet[3331]: E0515 12:27:00.745029 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.745115 kubelet[3331]: W0515 12:27:00.745095 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.745115 kubelet[3331]: E0515 12:27:00.745112 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:00.745261 kubelet[3331]: E0515 12:27:00.745239 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.745261 kubelet[3331]: W0515 12:27:00.745260 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.745337 kubelet[3331]: E0515 12:27:00.745267 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:00.745367 kubelet[3331]: E0515 12:27:00.745346 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.745367 kubelet[3331]: W0515 12:27:00.745351 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.745367 kubelet[3331]: E0515 12:27:00.745357 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:00.745444 kubelet[3331]: E0515 12:27:00.745432 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.745444 kubelet[3331]: W0515 12:27:00.745436 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.745444 kubelet[3331]: E0515 12:27:00.745442 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:00.745523 kubelet[3331]: E0515 12:27:00.745516 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.745523 kubelet[3331]: W0515 12:27:00.745520 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.745582 kubelet[3331]: E0515 12:27:00.745525 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:00.745620 kubelet[3331]: E0515 12:27:00.745611 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.745641 kubelet[3331]: W0515 12:27:00.745619 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.745641 kubelet[3331]: E0515 12:27:00.745627 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:00.745719 kubelet[3331]: E0515 12:27:00.745711 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.745719 kubelet[3331]: W0515 12:27:00.745719 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.745761 kubelet[3331]: E0515 12:27:00.745724 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:00.745844 kubelet[3331]: E0515 12:27:00.745836 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.745865 kubelet[3331]: W0515 12:27:00.745842 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.745865 kubelet[3331]: E0515 12:27:00.745855 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:00.745938 kubelet[3331]: E0515 12:27:00.745933 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.746015 kubelet[3331]: W0515 12:27:00.745972 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.746015 kubelet[3331]: E0515 12:27:00.745977 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:00.746063 kubelet[3331]: E0515 12:27:00.746055 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.746083 kubelet[3331]: W0515 12:27:00.746064 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.746083 kubelet[3331]: E0515 12:27:00.746070 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:00.746206 kubelet[3331]: E0515 12:27:00.746152 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.746206 kubelet[3331]: W0515 12:27:00.746185 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.746206 kubelet[3331]: E0515 12:27:00.746192 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:00.746724 kubelet[3331]: E0515 12:27:00.746358 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.746724 kubelet[3331]: W0515 12:27:00.746366 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.746724 kubelet[3331]: E0515 12:27:00.746372 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:00.746724 kubelet[3331]: E0515 12:27:00.746508 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.746724 kubelet[3331]: W0515 12:27:00.746513 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.746724 kubelet[3331]: E0515 12:27:00.746521 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:00.746724 kubelet[3331]: E0515 12:27:00.746620 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.746724 kubelet[3331]: W0515 12:27:00.746626 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.746724 kubelet[3331]: E0515 12:27:00.746632 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:00.746724 kubelet[3331]: E0515 12:27:00.746726 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.746365 systemd[1]: Started cri-containerd-586431069cbf84e50bf569b8f9a1266e7bd3d1cec927c1c2a0ffa0e1cd2c2278.scope - libcontainer container 586431069cbf84e50bf569b8f9a1266e7bd3d1cec927c1c2a0ffa0e1cd2c2278. 
May 15 12:27:00.747024 kubelet[3331]: W0515 12:27:00.746731 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.747024 kubelet[3331]: E0515 12:27:00.746736 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:00.747024 kubelet[3331]: E0515 12:27:00.746817 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.747024 kubelet[3331]: W0515 12:27:00.746822 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.747024 kubelet[3331]: E0515 12:27:00.746827 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:00.747024 kubelet[3331]: E0515 12:27:00.746915 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.747024 kubelet[3331]: W0515 12:27:00.746919 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.747024 kubelet[3331]: E0515 12:27:00.746924 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:00.747024 kubelet[3331]: E0515 12:27:00.746998 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.747024 kubelet[3331]: W0515 12:27:00.747002 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.747252 kubelet[3331]: E0515 12:27:00.747008 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:00.747252 kubelet[3331]: E0515 12:27:00.747085 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.747252 kubelet[3331]: W0515 12:27:00.747089 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.747252 kubelet[3331]: E0515 12:27:00.747093 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:00.747252 kubelet[3331]: E0515 12:27:00.747166 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.747252 kubelet[3331]: W0515 12:27:00.747192 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.747252 kubelet[3331]: E0515 12:27:00.747197 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:00.749867 kubelet[3331]: E0515 12:27:00.749852 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.749867 kubelet[3331]: W0515 12:27:00.749868 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.749968 kubelet[3331]: E0515 12:27:00.749882 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:00.749968 kubelet[3331]: I0515 12:27:00.749918 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/55ce30e6-eb73-4f69-aef9-927a3bcb6662-varrun\") pod \"csi-node-driver-fthxd\" (UID: \"55ce30e6-eb73-4f69-aef9-927a3bcb6662\") " pod="calico-system/csi-node-driver-fthxd" May 15 12:27:00.750209 kubelet[3331]: E0515 12:27:00.750064 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.750209 kubelet[3331]: W0515 12:27:00.750073 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.750322 kubelet[3331]: E0515 12:27:00.750312 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.750441 kubelet[3331]: E0515 12:27:00.750115 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:00.750441 kubelet[3331]: I0515 12:27:00.750348 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/55ce30e6-eb73-4f69-aef9-927a3bcb6662-socket-dir\") pod \"csi-node-driver-fthxd\" (UID: \"55ce30e6-eb73-4f69-aef9-927a3bcb6662\") " pod="calico-system/csi-node-driver-fthxd" May 15 12:27:00.750441 kubelet[3331]: W0515 12:27:00.750378 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.750441 kubelet[3331]: E0515 12:27:00.750389 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:00.750538 kubelet[3331]: E0515 12:27:00.750492 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.750538 kubelet[3331]: W0515 12:27:00.750527 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.750580 kubelet[3331]: E0515 12:27:00.750544 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:00.750743 kubelet[3331]: E0515 12:27:00.750641 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.750743 kubelet[3331]: W0515 12:27:00.750647 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.750743 kubelet[3331]: E0515 12:27:00.750730 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:00.750912 kubelet[3331]: I0515 12:27:00.750745 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/55ce30e6-eb73-4f69-aef9-927a3bcb6662-registration-dir\") pod \"csi-node-driver-fthxd\" (UID: \"55ce30e6-eb73-4f69-aef9-927a3bcb6662\") " pod="calico-system/csi-node-driver-fthxd" May 15 12:27:00.750912 kubelet[3331]: E0515 12:27:00.750786 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.750912 kubelet[3331]: W0515 12:27:00.750791 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.750912 kubelet[3331]: E0515 12:27:00.750800 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:00.750912 kubelet[3331]: E0515 12:27:00.750872 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.750912 kubelet[3331]: W0515 12:27:00.750876 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.750912 kubelet[3331]: E0515 12:27:00.750882 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:00.751252 kubelet[3331]: E0515 12:27:00.750969 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.751252 kubelet[3331]: W0515 12:27:00.750973 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.751252 kubelet[3331]: E0515 12:27:00.751048 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:00.751252 kubelet[3331]: E0515 12:27:00.751079 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.751252 kubelet[3331]: W0515 12:27:00.751084 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.751252 kubelet[3331]: E0515 12:27:00.751090 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:00.751252 kubelet[3331]: I0515 12:27:00.751142 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbbp5\" (UniqueName: \"kubernetes.io/projected/55ce30e6-eb73-4f69-aef9-927a3bcb6662-kube-api-access-tbbp5\") pod \"csi-node-driver-fthxd\" (UID: \"55ce30e6-eb73-4f69-aef9-927a3bcb6662\") " pod="calico-system/csi-node-driver-fthxd" May 15 12:27:00.751252 kubelet[3331]: E0515 12:27:00.751178 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.751252 kubelet[3331]: W0515 12:27:00.751182 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.751735 kubelet[3331]: E0515 12:27:00.751187 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:00.751735 kubelet[3331]: E0515 12:27:00.751301 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.751735 kubelet[3331]: W0515 12:27:00.751305 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.751735 kubelet[3331]: E0515 12:27:00.751315 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:00.751735 kubelet[3331]: E0515 12:27:00.751394 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.751735 kubelet[3331]: W0515 12:27:00.751408 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.751735 kubelet[3331]: E0515 12:27:00.751419 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:00.751735 kubelet[3331]: E0515 12:27:00.751497 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.751735 kubelet[3331]: W0515 12:27:00.751501 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.751735 kubelet[3331]: E0515 12:27:00.751507 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:00.751970 kubelet[3331]: I0515 12:27:00.751523 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/55ce30e6-eb73-4f69-aef9-927a3bcb6662-kubelet-dir\") pod \"csi-node-driver-fthxd\" (UID: \"55ce30e6-eb73-4f69-aef9-927a3bcb6662\") " pod="calico-system/csi-node-driver-fthxd" May 15 12:27:00.751970 kubelet[3331]: E0515 12:27:00.751628 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.751970 kubelet[3331]: W0515 12:27:00.751633 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.751970 kubelet[3331]: E0515 12:27:00.751640 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:00.751970 kubelet[3331]: E0515 12:27:00.751722 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.751970 kubelet[3331]: W0515 12:27:00.751726 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.751970 kubelet[3331]: E0515 12:27:00.751732 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:00.774334 containerd[1734]: time="2025-05-15T12:27:00.774314433Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-j67jc,Uid:358caab1-faf4-41bf-a17e-7eb09b4eaabd,Namespace:calico-system,Attempt:0,} returns sandbox id \"586431069cbf84e50bf569b8f9a1266e7bd3d1cec927c1c2a0ffa0e1cd2c2278\"" May 15 12:27:00.852664 kubelet[3331]: E0515 12:27:00.852623 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.852664 kubelet[3331]: W0515 12:27:00.852635 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.852664 kubelet[3331]: E0515 12:27:00.852646 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:00.852937 kubelet[3331]: E0515 12:27:00.852822 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.852937 kubelet[3331]: W0515 12:27:00.852830 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.852937 kubelet[3331]: E0515 12:27:00.852842 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:00.853011 kubelet[3331]: E0515 12:27:00.852945 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.853011 kubelet[3331]: W0515 12:27:00.852950 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.853011 kubelet[3331]: E0515 12:27:00.852980 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:00.853108 kubelet[3331]: E0515 12:27:00.853094 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.853108 kubelet[3331]: W0515 12:27:00.853103 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.853155 kubelet[3331]: E0515 12:27:00.853114 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:00.853285 kubelet[3331]: E0515 12:27:00.853259 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.853285 kubelet[3331]: W0515 12:27:00.853282 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.853342 kubelet[3331]: E0515 12:27:00.853294 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:00.853413 kubelet[3331]: E0515 12:27:00.853401 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.853413 kubelet[3331]: W0515 12:27:00.853409 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.853484 kubelet[3331]: E0515 12:27:00.853416 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:00.853522 kubelet[3331]: E0515 12:27:00.853485 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.853522 kubelet[3331]: W0515 12:27:00.853490 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.853522 kubelet[3331]: E0515 12:27:00.853495 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:00.853611 kubelet[3331]: E0515 12:27:00.853592 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.853611 kubelet[3331]: W0515 12:27:00.853597 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.853611 kubelet[3331]: E0515 12:27:00.853606 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:00.853705 kubelet[3331]: E0515 12:27:00.853680 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.853705 kubelet[3331]: W0515 12:27:00.853684 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.853774 kubelet[3331]: E0515 12:27:00.853714 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:00.853806 kubelet[3331]: E0515 12:27:00.853782 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.853806 kubelet[3331]: W0515 12:27:00.853786 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.853806 kubelet[3331]: E0515 12:27:00.853794 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:00.853914 kubelet[3331]: E0515 12:27:00.853872 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.853914 kubelet[3331]: W0515 12:27:00.853876 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.853914 kubelet[3331]: E0515 12:27:00.853884 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:00.854000 kubelet[3331]: E0515 12:27:00.853970 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.854000 kubelet[3331]: W0515 12:27:00.853974 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.854000 kubelet[3331]: E0515 12:27:00.853981 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:00.854087 kubelet[3331]: E0515 12:27:00.854047 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.854087 kubelet[3331]: W0515 12:27:00.854050 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.854087 kubelet[3331]: E0515 12:27:00.854062 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:00.854162 kubelet[3331]: E0515 12:27:00.854137 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.854162 kubelet[3331]: W0515 12:27:00.854140 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.854162 kubelet[3331]: E0515 12:27:00.854149 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:00.854282 kubelet[3331]: E0515 12:27:00.854225 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.854282 kubelet[3331]: W0515 12:27:00.854229 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.854323 kubelet[3331]: E0515 12:27:00.854297 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:00.854398 kubelet[3331]: E0515 12:27:00.854370 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.854398 kubelet[3331]: W0515 12:27:00.854393 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.854520 kubelet[3331]: E0515 12:27:00.854461 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:00.854520 kubelet[3331]: E0515 12:27:00.854471 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.854520 kubelet[3331]: W0515 12:27:00.854476 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.854520 kubelet[3331]: E0515 12:27:00.854495 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:00.854646 kubelet[3331]: E0515 12:27:00.854553 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.854646 kubelet[3331]: W0515 12:27:00.854557 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.854646 kubelet[3331]: E0515 12:27:00.854566 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:00.854719 kubelet[3331]: E0515 12:27:00.854654 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.854719 kubelet[3331]: W0515 12:27:00.854658 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.854719 kubelet[3331]: E0515 12:27:00.854667 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:00.854801 kubelet[3331]: E0515 12:27:00.854745 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.854801 kubelet[3331]: W0515 12:27:00.854750 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.854801 kubelet[3331]: E0515 12:27:00.854755 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:00.854864 kubelet[3331]: E0515 12:27:00.854833 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.854864 kubelet[3331]: W0515 12:27:00.854837 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.854864 kubelet[3331]: E0515 12:27:00.854846 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:00.854963 kubelet[3331]: E0515 12:27:00.854953 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.854963 kubelet[3331]: W0515 12:27:00.854961 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.855005 kubelet[3331]: E0515 12:27:00.854980 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:00.855212 kubelet[3331]: E0515 12:27:00.855152 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.855212 kubelet[3331]: W0515 12:27:00.855160 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.855340 kubelet[3331]: E0515 12:27:00.855167 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:00.855426 kubelet[3331]: E0515 12:27:00.855419 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.855467 kubelet[3331]: W0515 12:27:00.855460 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.855501 kubelet[3331]: E0515 12:27:00.855496 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:00.855688 kubelet[3331]: E0515 12:27:00.855653 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.855688 kubelet[3331]: W0515 12:27:00.855661 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.855688 kubelet[3331]: E0515 12:27:00.855668 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:00.955671 kubelet[3331]: E0515 12:27:00.955623 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.955671 kubelet[3331]: W0515 12:27:00.955636 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.955671 kubelet[3331]: E0515 12:27:00.955649 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:00.975838 kubelet[3331]: E0515 12:27:00.975778 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:00.975838 kubelet[3331]: W0515 12:27:00.975790 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:00.975838 kubelet[3331]: E0515 12:27:00.975803 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:02.432359 kubelet[3331]: E0515 12:27:02.432276 3331 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fthxd" podUID="55ce30e6-eb73-4f69-aef9-927a3bcb6662" May 15 12:27:03.349868 containerd[1734]: time="2025-05-15T12:27:03.349834152Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:27:03.351721 containerd[1734]: time="2025-05-15T12:27:03.351685043Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.3: active requests=0, bytes read=30426870" May 15 12:27:03.353816 containerd[1734]: time="2025-05-15T12:27:03.353778517Z" level=info msg="ImageCreate event name:\"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:27:03.356764 containerd[1734]: time="2025-05-15T12:27:03.356721870Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f5516aa6a78f00931d2625f3012dcf2c69d141ce41483b8d59c6ec6330a18620\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:27:03.357104 containerd[1734]: time="2025-05-15T12:27:03.357012969Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.3\" with image id \"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f5516aa6a78f00931d2625f3012dcf2c69d141ce41483b8d59c6ec6330a18620\", size \"31919484\" in 3.128698805s" May 15 12:27:03.357104 containerd[1734]: time="2025-05-15T12:27:03.357038369Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.3\" returns image reference 
\"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\"" May 15 12:27:03.358121 containerd[1734]: time="2025-05-15T12:27:03.358056797Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\"" May 15 12:27:03.369125 containerd[1734]: time="2025-05-15T12:27:03.369105059Z" level=info msg="CreateContainer within sandbox \"fb75360b9e2c61758d88df4c05fa45696cc612c10de6005a4571ae0eb5905bfd\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" May 15 12:27:03.387101 containerd[1734]: time="2025-05-15T12:27:03.386087708Z" level=info msg="Container c9f32792864b0164d295293b779b0ac86cfd1a5837b466a6ca46f721b271f2ff: CDI devices from CRI Config.CDIDevices: []" May 15 12:27:03.401245 containerd[1734]: time="2025-05-15T12:27:03.401221620Z" level=info msg="CreateContainer within sandbox \"fb75360b9e2c61758d88df4c05fa45696cc612c10de6005a4571ae0eb5905bfd\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"c9f32792864b0164d295293b779b0ac86cfd1a5837b466a6ca46f721b271f2ff\"" May 15 12:27:03.401717 containerd[1734]: time="2025-05-15T12:27:03.401517437Z" level=info msg="StartContainer for \"c9f32792864b0164d295293b779b0ac86cfd1a5837b466a6ca46f721b271f2ff\"" May 15 12:27:03.402557 containerd[1734]: time="2025-05-15T12:27:03.402517293Z" level=info msg="connecting to shim c9f32792864b0164d295293b779b0ac86cfd1a5837b466a6ca46f721b271f2ff" address="unix:///run/containerd/s/11a48d8d1736d052f343238f2eb354970817af4a9e71f4f0656a5c125699d4d4" protocol=ttrpc version=3 May 15 12:27:03.421303 systemd[1]: Started cri-containerd-c9f32792864b0164d295293b779b0ac86cfd1a5837b466a6ca46f721b271f2ff.scope - libcontainer container c9f32792864b0164d295293b779b0ac86cfd1a5837b466a6ca46f721b271f2ff. 
May 15 12:27:03.462029 containerd[1734]: time="2025-05-15T12:27:03.462005728Z" level=info msg="StartContainer for \"c9f32792864b0164d295293b779b0ac86cfd1a5837b466a6ca46f721b271f2ff\" returns successfully" May 15 12:27:03.565041 kubelet[3331]: E0515 12:27:03.565012 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.565041 kubelet[3331]: W0515 12:27:03.565038 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.565316 kubelet[3331]: E0515 12:27:03.565053 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.565316 kubelet[3331]: E0515 12:27:03.565148 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.565316 kubelet[3331]: W0515 12:27:03.565153 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.565316 kubelet[3331]: E0515 12:27:03.565160 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:03.565316 kubelet[3331]: E0515 12:27:03.565254 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.565316 kubelet[3331]: W0515 12:27:03.565258 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.565316 kubelet[3331]: E0515 12:27:03.565264 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.565464 kubelet[3331]: E0515 12:27:03.565336 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.565464 kubelet[3331]: W0515 12:27:03.565341 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.565464 kubelet[3331]: E0515 12:27:03.565346 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:03.565464 kubelet[3331]: E0515 12:27:03.565431 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.565464 kubelet[3331]: W0515 12:27:03.565435 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.565464 kubelet[3331]: E0515 12:27:03.565440 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.565579 kubelet[3331]: E0515 12:27:03.565520 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.565579 kubelet[3331]: W0515 12:27:03.565525 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.565579 kubelet[3331]: E0515 12:27:03.565530 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:03.565635 kubelet[3331]: E0515 12:27:03.565597 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.565635 kubelet[3331]: W0515 12:27:03.565601 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.565635 kubelet[3331]: E0515 12:27:03.565605 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.565694 kubelet[3331]: E0515 12:27:03.565672 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.565694 kubelet[3331]: W0515 12:27:03.565676 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.565694 kubelet[3331]: E0515 12:27:03.565681 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:03.565765 kubelet[3331]: E0515 12:27:03.565750 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.565765 kubelet[3331]: W0515 12:27:03.565759 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.565810 kubelet[3331]: E0515 12:27:03.565764 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.565847 kubelet[3331]: E0515 12:27:03.565829 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.565847 kubelet[3331]: W0515 12:27:03.565832 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.565847 kubelet[3331]: E0515 12:27:03.565837 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:03.565913 kubelet[3331]: E0515 12:27:03.565900 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.565913 kubelet[3331]: W0515 12:27:03.565904 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.565913 kubelet[3331]: E0515 12:27:03.565909 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.565981 kubelet[3331]: E0515 12:27:03.565971 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.565981 kubelet[3331]: W0515 12:27:03.565975 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.565981 kubelet[3331]: E0515 12:27:03.565980 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:03.566057 kubelet[3331]: E0515 12:27:03.566048 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.566057 kubelet[3331]: W0515 12:27:03.566054 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.566094 kubelet[3331]: E0515 12:27:03.566078 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.566199 kubelet[3331]: E0515 12:27:03.566164 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.566199 kubelet[3331]: W0515 12:27:03.566198 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.566253 kubelet[3331]: E0515 12:27:03.566215 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:03.566316 kubelet[3331]: E0515 12:27:03.566291 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.566316 kubelet[3331]: W0515 12:27:03.566312 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.566377 kubelet[3331]: E0515 12:27:03.566318 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.571546 kubelet[3331]: E0515 12:27:03.571533 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.571546 kubelet[3331]: W0515 12:27:03.571544 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.571624 kubelet[3331]: E0515 12:27:03.571554 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:03.571688 kubelet[3331]: E0515 12:27:03.571679 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.571688 kubelet[3331]: W0515 12:27:03.571687 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.571741 kubelet[3331]: E0515 12:27:03.571694 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.571821 kubelet[3331]: E0515 12:27:03.571783 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.571821 kubelet[3331]: W0515 12:27:03.571791 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.571821 kubelet[3331]: E0515 12:27:03.571797 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:03.571935 kubelet[3331]: E0515 12:27:03.571885 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.571935 kubelet[3331]: W0515 12:27:03.571891 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.571935 kubelet[3331]: E0515 12:27:03.571901 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.572007 kubelet[3331]: E0515 12:27:03.571971 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.572007 kubelet[3331]: W0515 12:27:03.571975 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.572007 kubelet[3331]: E0515 12:27:03.571980 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:03.572085 kubelet[3331]: E0515 12:27:03.572056 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.572085 kubelet[3331]: W0515 12:27:03.572060 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.572085 kubelet[3331]: E0515 12:27:03.572069 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.572166 kubelet[3331]: E0515 12:27:03.572156 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.572166 kubelet[3331]: W0515 12:27:03.572161 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.572237 kubelet[3331]: E0515 12:27:03.572229 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:03.572355 kubelet[3331]: E0515 12:27:03.572274 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.572355 kubelet[3331]: W0515 12:27:03.572280 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.572355 kubelet[3331]: E0515 12:27:03.572286 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.572426 kubelet[3331]: E0515 12:27:03.572422 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.572449 kubelet[3331]: W0515 12:27:03.572427 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.572449 kubelet[3331]: E0515 12:27:03.572436 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:03.572570 kubelet[3331]: E0515 12:27:03.572545 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.572570 kubelet[3331]: W0515 12:27:03.572566 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.572656 kubelet[3331]: E0515 12:27:03.572574 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.572656 kubelet[3331]: E0515 12:27:03.572651 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.572656 kubelet[3331]: W0515 12:27:03.572655 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.572752 kubelet[3331]: E0515 12:27:03.572664 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:03.572752 kubelet[3331]: E0515 12:27:03.572750 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.572860 kubelet[3331]: W0515 12:27:03.572755 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.572860 kubelet[3331]: E0515 12:27:03.572763 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.573037 kubelet[3331]: E0515 12:27:03.573008 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.573037 kubelet[3331]: W0515 12:27:03.573034 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.573115 kubelet[3331]: E0515 12:27:03.573050 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:03.573145 kubelet[3331]: E0515 12:27:03.573137 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.573145 kubelet[3331]: W0515 12:27:03.573144 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.573208 kubelet[3331]: E0515 12:27:03.573152 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.573267 kubelet[3331]: E0515 12:27:03.573242 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.573267 kubelet[3331]: W0515 12:27:03.573264 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.573311 kubelet[3331]: E0515 12:27:03.573270 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:03.573367 kubelet[3331]: E0515 12:27:03.573356 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.573367 kubelet[3331]: W0515 12:27:03.573363 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.573413 kubelet[3331]: E0515 12:27:03.573377 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:03.573522 kubelet[3331]: E0515 12:27:03.573514 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.573548 kubelet[3331]: W0515 12:27:03.573531 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.573548 kubelet[3331]: E0515 12:27:03.573539 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:03.573780 kubelet[3331]: E0515 12:27:03.573759 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:03.573780 kubelet[3331]: W0515 12:27:03.573778 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:03.573818 kubelet[3331]: E0515 12:27:03.573784 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:04.432166 kubelet[3331]: E0515 12:27:04.432098 3331 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fthxd" podUID="55ce30e6-eb73-4f69-aef9-927a3bcb6662" May 15 12:27:04.522050 kubelet[3331]: I0515 12:27:04.521643 3331 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-8db6559cd-s256c" podStartSLOduration=2.391920083 podStartE2EDuration="5.52162782s" podCreationTimestamp="2025-05-15 12:26:59 +0000 UTC" firstStartedPulling="2025-05-15 12:27:00.227916715 +0000 UTC m=+18.863250148" lastFinishedPulling="2025-05-15 12:27:03.357624455 +0000 UTC m=+21.992957885" observedRunningTime="2025-05-15 12:27:04.266791634 +0000 UTC m=+22.902125075" watchObservedRunningTime="2025-05-15 12:27:04.52162782 +0000 UTC m=+23.156961285" May 15 12:27:04.571874 kubelet[3331]: E0515 12:27:04.571813 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:04.571874 kubelet[3331]: W0515 12:27:04.571829 3331 driver-call.go:149] FlexVolume: driver 
call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:04.571874 kubelet[3331]: E0515 12:27:04.571843 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:04.572406 kubelet[3331]: E0515 12:27:04.572190 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:04.572406 kubelet[3331]: W0515 12:27:04.572199 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:04.572406 kubelet[3331]: E0515 12:27:04.572209 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:04.572406 kubelet[3331]: E0515 12:27:04.572300 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:04.572406 kubelet[3331]: W0515 12:27:04.572305 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:04.572406 kubelet[3331]: E0515 12:27:04.572311 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:04.572406 kubelet[3331]: E0515 12:27:04.572392 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:04.572406 kubelet[3331]: W0515 12:27:04.572396 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:04.572406 kubelet[3331]: E0515 12:27:04.572402 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:04.572649 kubelet[3331]: E0515 12:27:04.572513 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:04.572649 kubelet[3331]: W0515 12:27:04.572517 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:04.572649 kubelet[3331]: E0515 12:27:04.572523 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:04.572649 kubelet[3331]: E0515 12:27:04.572596 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:04.572649 kubelet[3331]: W0515 12:27:04.572600 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:04.572649 kubelet[3331]: E0515 12:27:04.572614 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:04.572845 kubelet[3331]: E0515 12:27:04.572676 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:04.572845 kubelet[3331]: W0515 12:27:04.572680 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:04.572845 kubelet[3331]: E0515 12:27:04.572694 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:04.572845 kubelet[3331]: E0515 12:27:04.572764 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:04.572845 kubelet[3331]: W0515 12:27:04.572768 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:04.572845 kubelet[3331]: E0515 12:27:04.572773 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:04.572845 kubelet[3331]: E0515 12:27:04.572847 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:04.573065 kubelet[3331]: W0515 12:27:04.572851 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:04.573065 kubelet[3331]: E0515 12:27:04.572856 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:04.573065 kubelet[3331]: E0515 12:27:04.572927 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:04.573065 kubelet[3331]: W0515 12:27:04.572931 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:04.573065 kubelet[3331]: E0515 12:27:04.572935 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:04.573065 kubelet[3331]: E0515 12:27:04.573007 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:04.573065 kubelet[3331]: W0515 12:27:04.573011 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:04.573065 kubelet[3331]: E0515 12:27:04.573016 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:04.573383 kubelet[3331]: E0515 12:27:04.573086 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:04.573383 kubelet[3331]: W0515 12:27:04.573090 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:04.573383 kubelet[3331]: E0515 12:27:04.573095 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:04.573383 kubelet[3331]: E0515 12:27:04.573167 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:04.573383 kubelet[3331]: W0515 12:27:04.573186 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:04.573383 kubelet[3331]: E0515 12:27:04.573191 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:04.573383 kubelet[3331]: E0515 12:27:04.573272 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:04.573383 kubelet[3331]: W0515 12:27:04.573275 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:04.573383 kubelet[3331]: E0515 12:27:04.573281 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:04.573383 kubelet[3331]: E0515 12:27:04.573353 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:04.573583 kubelet[3331]: W0515 12:27:04.573356 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:04.573583 kubelet[3331]: E0515 12:27:04.573361 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:04.578642 kubelet[3331]: E0515 12:27:04.578624 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:04.578642 kubelet[3331]: W0515 12:27:04.578638 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:04.578757 kubelet[3331]: E0515 12:27:04.578652 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:04.578820 kubelet[3331]: E0515 12:27:04.578811 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:04.578820 kubelet[3331]: W0515 12:27:04.578818 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:04.578878 kubelet[3331]: E0515 12:27:04.578834 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:04.578964 kubelet[3331]: E0515 12:27:04.578950 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:04.578964 kubelet[3331]: W0515 12:27:04.578961 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:04.579035 kubelet[3331]: E0515 12:27:04.578971 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:04.579095 kubelet[3331]: E0515 12:27:04.579083 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:04.579095 kubelet[3331]: W0515 12:27:04.579092 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:04.579186 kubelet[3331]: E0515 12:27:04.579103 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:04.579233 kubelet[3331]: E0515 12:27:04.579215 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:04.579233 kubelet[3331]: W0515 12:27:04.579220 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:04.579233 kubelet[3331]: E0515 12:27:04.579229 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:04.579338 kubelet[3331]: E0515 12:27:04.579314 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:04.579338 kubelet[3331]: W0515 12:27:04.579319 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:04.579338 kubelet[3331]: E0515 12:27:04.579328 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:04.579472 kubelet[3331]: E0515 12:27:04.579428 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:04.579472 kubelet[3331]: W0515 12:27:04.579459 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:04.579524 kubelet[3331]: E0515 12:27:04.579471 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:04.579823 kubelet[3331]: E0515 12:27:04.579797 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:04.579823 kubelet[3331]: W0515 12:27:04.579821 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:04.579889 kubelet[3331]: E0515 12:27:04.579833 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:04.579967 kubelet[3331]: E0515 12:27:04.579941 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:04.579967 kubelet[3331]: W0515 12:27:04.579963 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:04.580018 kubelet[3331]: E0515 12:27:04.579976 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:04.580137 kubelet[3331]: E0515 12:27:04.580110 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:04.580137 kubelet[3331]: W0515 12:27:04.580133 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:04.580232 kubelet[3331]: E0515 12:27:04.580143 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:04.580320 kubelet[3331]: E0515 12:27:04.580246 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:04.580320 kubelet[3331]: W0515 12:27:04.580250 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:04.580320 kubelet[3331]: E0515 12:27:04.580265 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:04.580405 kubelet[3331]: E0515 12:27:04.580356 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:04.580405 kubelet[3331]: W0515 12:27:04.580360 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:04.580405 kubelet[3331]: E0515 12:27:04.580368 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:04.580498 kubelet[3331]: E0515 12:27:04.580445 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:04.580498 kubelet[3331]: W0515 12:27:04.580449 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:04.580498 kubelet[3331]: E0515 12:27:04.580455 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:04.580573 kubelet[3331]: E0515 12:27:04.580560 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:04.580573 kubelet[3331]: W0515 12:27:04.580565 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:04.580621 kubelet[3331]: E0515 12:27:04.580578 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:04.580711 kubelet[3331]: E0515 12:27:04.580684 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:04.580711 kubelet[3331]: W0515 12:27:04.580706 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:04.580784 kubelet[3331]: E0515 12:27:04.580719 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:04.580859 kubelet[3331]: E0515 12:27:04.580850 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:04.580887 kubelet[3331]: W0515 12:27:04.580860 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:04.580887 kubelet[3331]: E0515 12:27:04.580868 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:04.581093 kubelet[3331]: E0515 12:27:04.581083 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:04.581093 kubelet[3331]: W0515 12:27:04.581093 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:04.581163 kubelet[3331]: E0515 12:27:04.581100 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:04.581276 kubelet[3331]: E0515 12:27:04.581246 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:04.581276 kubelet[3331]: W0515 12:27:04.581270 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:04.581327 kubelet[3331]: E0515 12:27:04.581280 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:05.580464 kubelet[3331]: E0515 12:27:05.580439 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:05.580788 kubelet[3331]: W0515 12:27:05.580726 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:05.580788 kubelet[3331]: E0515 12:27:05.580750 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:05.580913 kubelet[3331]: E0515 12:27:05.580888 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:05.580913 kubelet[3331]: W0515 12:27:05.580910 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:05.580974 kubelet[3331]: E0515 12:27:05.580931 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:05.581046 kubelet[3331]: E0515 12:27:05.581038 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:05.581046 kubelet[3331]: W0515 12:27:05.581045 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:05.581097 kubelet[3331]: E0515 12:27:05.581052 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:05.581134 kubelet[3331]: E0515 12:27:05.581128 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:05.581162 kubelet[3331]: W0515 12:27:05.581134 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:05.581162 kubelet[3331]: E0515 12:27:05.581140 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:05.581247 kubelet[3331]: E0515 12:27:05.581233 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:05.581247 kubelet[3331]: W0515 12:27:05.581238 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:05.581247 kubelet[3331]: E0515 12:27:05.581244 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:05.581328 kubelet[3331]: E0515 12:27:05.581311 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:05.581328 kubelet[3331]: W0515 12:27:05.581315 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:05.581328 kubelet[3331]: E0515 12:27:05.581321 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:05.581424 kubelet[3331]: E0515 12:27:05.581382 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:05.581424 kubelet[3331]: W0515 12:27:05.581386 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:05.581424 kubelet[3331]: E0515 12:27:05.581391 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:05.581503 kubelet[3331]: E0515 12:27:05.581458 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:05.581503 kubelet[3331]: W0515 12:27:05.581463 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:05.581503 kubelet[3331]: E0515 12:27:05.581468 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:05.581585 kubelet[3331]: E0515 12:27:05.581539 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:05.581585 kubelet[3331]: W0515 12:27:05.581543 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:05.581585 kubelet[3331]: E0515 12:27:05.581549 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:05.581661 kubelet[3331]: E0515 12:27:05.581611 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:05.581661 kubelet[3331]: W0515 12:27:05.581615 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:05.581661 kubelet[3331]: E0515 12:27:05.581620 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:05.581742 kubelet[3331]: E0515 12:27:05.581681 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:05.581742 kubelet[3331]: W0515 12:27:05.581685 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:05.581742 kubelet[3331]: E0515 12:27:05.581690 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:05.581831 kubelet[3331]: E0515 12:27:05.581754 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:05.581831 kubelet[3331]: W0515 12:27:05.581758 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:05.581831 kubelet[3331]: E0515 12:27:05.581764 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:05.581831 kubelet[3331]: E0515 12:27:05.581828 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:05.581937 kubelet[3331]: W0515 12:27:05.581832 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:05.581937 kubelet[3331]: E0515 12:27:05.581837 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:05.581937 kubelet[3331]: E0515 12:27:05.581903 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:05.581937 kubelet[3331]: W0515 12:27:05.581907 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:05.581937 kubelet[3331]: E0515 12:27:05.581912 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:05.582079 kubelet[3331]: E0515 12:27:05.581972 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:05.582079 kubelet[3331]: W0515 12:27:05.581976 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:05.582079 kubelet[3331]: E0515 12:27:05.581981 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:05.586299 kubelet[3331]: E0515 12:27:05.586280 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:05.586299 kubelet[3331]: W0515 12:27:05.586294 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:05.586407 kubelet[3331]: E0515 12:27:05.586308 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:05.586453 kubelet[3331]: E0515 12:27:05.586446 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:05.586453 kubelet[3331]: W0515 12:27:05.586452 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:05.586500 kubelet[3331]: E0515 12:27:05.586466 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:05.586602 kubelet[3331]: E0515 12:27:05.586591 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:05.586602 kubelet[3331]: W0515 12:27:05.586597 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:05.586650 kubelet[3331]: E0515 12:27:05.586605 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:05.586739 kubelet[3331]: E0515 12:27:05.586714 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:05.586739 kubelet[3331]: W0515 12:27:05.586736 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:05.586790 kubelet[3331]: E0515 12:27:05.586745 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:05.586836 kubelet[3331]: E0515 12:27:05.586827 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:05.586836 kubelet[3331]: W0515 12:27:05.586835 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:05.586879 kubelet[3331]: E0515 12:27:05.586840 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:05.586958 kubelet[3331]: E0515 12:27:05.586939 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:05.586958 kubelet[3331]: W0515 12:27:05.586951 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:05.587008 kubelet[3331]: E0515 12:27:05.586960 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:05.587085 kubelet[3331]: E0515 12:27:05.587065 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:05.587085 kubelet[3331]: W0515 12:27:05.587083 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:05.587135 kubelet[3331]: E0515 12:27:05.587090 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:05.587274 kubelet[3331]: E0515 12:27:05.587245 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:05.587274 kubelet[3331]: W0515 12:27:05.587252 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:05.587274 kubelet[3331]: E0515 12:27:05.587261 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:05.587386 kubelet[3331]: E0515 12:27:05.587334 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:05.587386 kubelet[3331]: W0515 12:27:05.587338 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:05.587386 kubelet[3331]: E0515 12:27:05.587347 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:05.587492 kubelet[3331]: E0515 12:27:05.587411 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:05.587492 kubelet[3331]: W0515 12:27:05.587415 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:05.587492 kubelet[3331]: E0515 12:27:05.587423 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:05.587566 kubelet[3331]: E0515 12:27:05.587501 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:05.587566 kubelet[3331]: W0515 12:27:05.587506 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:05.587566 kubelet[3331]: E0515 12:27:05.587514 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:05.587687 kubelet[3331]: E0515 12:27:05.587664 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:05.587687 kubelet[3331]: W0515 12:27:05.587670 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:05.587687 kubelet[3331]: E0515 12:27:05.587678 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:05.587754 kubelet[3331]: E0515 12:27:05.587747 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:05.587754 kubelet[3331]: W0515 12:27:05.587752 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:05.587800 kubelet[3331]: E0515 12:27:05.587762 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:05.587869 kubelet[3331]: E0515 12:27:05.587845 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:05.587869 kubelet[3331]: W0515 12:27:05.587866 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:05.587918 kubelet[3331]: E0515 12:27:05.587874 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:05.587975 kubelet[3331]: E0515 12:27:05.587965 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:05.587975 kubelet[3331]: W0515 12:27:05.587972 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:05.588022 kubelet[3331]: E0515 12:27:05.587978 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:05.588301 kubelet[3331]: E0515 12:27:05.588184 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:05.588301 kubelet[3331]: W0515 12:27:05.588207 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:05.588301 kubelet[3331]: E0515 12:27:05.588221 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:05.588518 kubelet[3331]: E0515 12:27:05.588476 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:05.588518 kubelet[3331]: W0515 12:27:05.588483 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:05.588518 kubelet[3331]: E0515 12:27:05.588491 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:05.588729 kubelet[3331]: E0515 12:27:05.588706 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:05.588729 kubelet[3331]: W0515 12:27:05.588727 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:05.588774 kubelet[3331]: E0515 12:27:05.588736 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:06.432233 kubelet[3331]: E0515 12:27:06.432207 3331 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fthxd" podUID="55ce30e6-eb73-4f69-aef9-927a3bcb6662" May 15 12:27:06.588700 kubelet[3331]: E0515 12:27:06.588682 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:06.588700 kubelet[3331]: W0515 12:27:06.588696 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:06.589007 kubelet[3331]: E0515 12:27:06.588710 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:06.589007 kubelet[3331]: E0515 12:27:06.588800 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:06.589007 kubelet[3331]: W0515 12:27:06.588804 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:06.589007 kubelet[3331]: E0515 12:27:06.588810 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:06.589007 kubelet[3331]: E0515 12:27:06.588889 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:06.589007 kubelet[3331]: W0515 12:27:06.588893 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:06.589007 kubelet[3331]: E0515 12:27:06.588898 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:06.589007 kubelet[3331]: E0515 12:27:06.588993 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:06.589007 kubelet[3331]: W0515 12:27:06.588997 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:06.589007 kubelet[3331]: E0515 12:27:06.589003 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:06.589244 kubelet[3331]: E0515 12:27:06.589080 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:06.589244 kubelet[3331]: W0515 12:27:06.589084 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:06.589244 kubelet[3331]: E0515 12:27:06.589089 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:06.589244 kubelet[3331]: E0515 12:27:06.589153 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:06.589244 kubelet[3331]: W0515 12:27:06.589157 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:06.589244 kubelet[3331]: E0515 12:27:06.589161 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:06.589244 kubelet[3331]: E0515 12:27:06.589245 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:06.589399 kubelet[3331]: W0515 12:27:06.589249 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:06.589399 kubelet[3331]: E0515 12:27:06.589254 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:06.589399 kubelet[3331]: E0515 12:27:06.589320 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:06.589399 kubelet[3331]: W0515 12:27:06.589324 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:06.589399 kubelet[3331]: E0515 12:27:06.589328 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:06.589399 kubelet[3331]: E0515 12:27:06.589401 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:06.589527 kubelet[3331]: W0515 12:27:06.589405 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:06.589527 kubelet[3331]: E0515 12:27:06.589411 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:06.589527 kubelet[3331]: E0515 12:27:06.589473 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:06.589527 kubelet[3331]: W0515 12:27:06.589477 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:06.589527 kubelet[3331]: E0515 12:27:06.589482 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:06.589635 kubelet[3331]: E0515 12:27:06.589545 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:06.589635 kubelet[3331]: W0515 12:27:06.589548 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:06.589635 kubelet[3331]: E0515 12:27:06.589553 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:06.589635 kubelet[3331]: E0515 12:27:06.589615 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:06.589635 kubelet[3331]: W0515 12:27:06.589619 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:06.589635 kubelet[3331]: E0515 12:27:06.589624 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:06.589764 kubelet[3331]: E0515 12:27:06.589692 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:06.589764 kubelet[3331]: W0515 12:27:06.589695 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:06.589764 kubelet[3331]: E0515 12:27:06.589700 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:06.589832 kubelet[3331]: E0515 12:27:06.589765 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:06.589832 kubelet[3331]: W0515 12:27:06.589769 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:06.589832 kubelet[3331]: E0515 12:27:06.589774 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:06.589909 kubelet[3331]: E0515 12:27:06.589838 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:06.589909 kubelet[3331]: W0515 12:27:06.589842 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:06.589909 kubelet[3331]: E0515 12:27:06.589847 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:06.593619 kubelet[3331]: E0515 12:27:06.593518 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:06.593619 kubelet[3331]: W0515 12:27:06.593534 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:06.593619 kubelet[3331]: E0515 12:27:06.593545 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:06.594148 kubelet[3331]: E0515 12:27:06.594037 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:06.594148 kubelet[3331]: W0515 12:27:06.594050 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:06.594148 kubelet[3331]: E0515 12:27:06.594063 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:06.594489 kubelet[3331]: E0515 12:27:06.594244 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:06.594489 kubelet[3331]: W0515 12:27:06.594250 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:06.594489 kubelet[3331]: E0515 12:27:06.594259 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:06.594722 kubelet[3331]: E0515 12:27:06.594635 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:06.594722 kubelet[3331]: W0515 12:27:06.594645 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:06.594722 kubelet[3331]: E0515 12:27:06.594660 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:27:06.594918 kubelet[3331]: E0515 12:27:06.594864 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:06.594918 kubelet[3331]: W0515 12:27:06.594870 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:06.594918 kubelet[3331]: E0515 12:27:06.594879 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:27:06.595099 kubelet[3331]: E0515 12:27:06.595027 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:27:06.595099 kubelet[3331]: W0515 12:27:06.595033 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:27:06.595099 kubelet[3331]: E0515 12:27:06.595044 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
May 15 12:27:06.595357 kubelet[3331]: E0515 12:27:06.595217 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 15 12:27:06.595357 kubelet[3331]: W0515 12:27:06.595223 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 15 12:27:06.595357 kubelet[3331]: E0515 12:27:06.595234 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 15 12:27:06.595498 kubelet[3331]: E0515 12:27:06.595492 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 15 12:27:06.595531 kubelet[3331]: W0515 12:27:06.595526 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 15 12:27:06.595574 kubelet[3331]: E0515 12:27:06.595567 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 15 12:27:06.595729 kubelet[3331]: E0515 12:27:06.595692 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 15 12:27:06.595729 kubelet[3331]: W0515 12:27:06.595698 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 15 12:27:06.595729 kubelet[3331]: E0515 12:27:06.595708 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 15 12:27:06.595899 kubelet[3331]: E0515 12:27:06.595867 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 15 12:27:06.595899 kubelet[3331]: W0515 12:27:06.595872 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 15 12:27:06.595899 kubelet[3331]: E0515 12:27:06.595880 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 15 12:27:06.596076 kubelet[3331]: E0515 12:27:06.596025 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 15 12:27:06.596076 kubelet[3331]: W0515 12:27:06.596030 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 15 12:27:06.596076 kubelet[3331]: E0515 12:27:06.596037 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 15 12:27:06.596415 kubelet[3331]: E0515 12:27:06.596243 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 15 12:27:06.596415 kubelet[3331]: W0515 12:27:06.596251 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 15 12:27:06.596415 kubelet[3331]: E0515 12:27:06.596264 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 15 12:27:06.596599 kubelet[3331]: E0515 12:27:06.596589 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 15 12:27:06.596637 kubelet[3331]: W0515 12:27:06.596631 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 15 12:27:06.596676 kubelet[3331]: E0515 12:27:06.596669 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 15 12:27:06.596856 kubelet[3331]: E0515 12:27:06.596838 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 15 12:27:06.596931 kubelet[3331]: W0515 12:27:06.596897 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 15 12:27:06.596931 kubelet[3331]: E0515 12:27:06.596913 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 15 12:27:06.597121 kubelet[3331]: E0515 12:27:06.597106 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 15 12:27:06.597121 kubelet[3331]: W0515 12:27:06.597113 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 15 12:27:06.597400 kubelet[3331]: E0515 12:27:06.597349 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 15 12:27:06.597400 kubelet[3331]: W0515 12:27:06.597357 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 15 12:27:06.597400 kubelet[3331]: E0515 12:27:06.597365 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 15 12:27:06.597758 kubelet[3331]: E0515 12:27:06.597679 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 15 12:27:06.597942 kubelet[3331]: E0515 12:27:06.597933 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 15 12:27:06.598069 kubelet[3331]: W0515 12:27:06.597995 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 15 12:27:06.598069 kubelet[3331]: E0515 12:27:06.598006 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 15 12:27:06.598503 kubelet[3331]: E0515 12:27:06.598464 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 15 12:27:06.598503 kubelet[3331]: W0515 12:27:06.598473 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 15 12:27:06.598503 kubelet[3331]: E0515 12:27:06.598482 3331 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 15 12:27:06.704770 containerd[1734]: time="2025-05-15T12:27:06.704699577Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 15 12:27:06.708002 containerd[1734]: time="2025-05-15T12:27:06.707975082Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3: active requests=0, bytes read=5366937"
May 15 12:27:06.710359 containerd[1734]: time="2025-05-15T12:27:06.710309043Z" level=info msg="ImageCreate event name:\"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 15 12:27:06.730977 containerd[1734]: time="2025-05-15T12:27:06.730943979Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:eeaa2bb4f9b1aa61adde43ce6dea95eee89291f96963548e108d9a2dfbc5edd1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 15 12:27:06.731429 containerd[1734]: time="2025-05-15T12:27:06.731331403Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" with image id \"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:eeaa2bb4f9b1aa61adde43ce6dea95eee89291f96963548e108d9a2dfbc5edd1\", size \"6859519\" in 3.373247331s"
May 15 12:27:06.731429 containerd[1734]: time="2025-05-15T12:27:06.731357509Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" returns image reference \"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\""
May 15 12:27:06.733210 containerd[1734]: time="2025-05-15T12:27:06.733160408Z" level=info msg="CreateContainer within sandbox \"586431069cbf84e50bf569b8f9a1266e7bd3d1cec927c1c2a0ffa0e1cd2c2278\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}"
May 15 12:27:06.921042 containerd[1734]: time="2025-05-15T12:27:06.921016350Z" level=info msg="Container fce7805e691711b9971246be5822f531007036317749633587a5e005a4c03c07: CDI devices from CRI Config.CDIDevices: []"
May 15 12:27:06.928133 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3257739335.mount: Deactivated successfully.
May 15 12:27:07.036033 containerd[1734]: time="2025-05-15T12:27:07.035963288Z" level=info msg="CreateContainer within sandbox \"586431069cbf84e50bf569b8f9a1266e7bd3d1cec927c1c2a0ffa0e1cd2c2278\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"fce7805e691711b9971246be5822f531007036317749633587a5e005a4c03c07\""
May 15 12:27:07.036423 containerd[1734]: time="2025-05-15T12:27:07.036395911Z" level=info msg="StartContainer for \"fce7805e691711b9971246be5822f531007036317749633587a5e005a4c03c07\""
May 15 12:27:07.037707 containerd[1734]: time="2025-05-15T12:27:07.037668444Z" level=info msg="connecting to shim fce7805e691711b9971246be5822f531007036317749633587a5e005a4c03c07" address="unix:///run/containerd/s/d9d0270194fbe9d61146192750bea91b220ecf8b8fb1a869b075ad9752bd7591" protocol=ttrpc version=3
May 15 12:27:07.058296 systemd[1]: Started cri-containerd-fce7805e691711b9971246be5822f531007036317749633587a5e005a4c03c07.scope - libcontainer container fce7805e691711b9971246be5822f531007036317749633587a5e005a4c03c07.
May 15 12:27:07.086587 containerd[1734]: time="2025-05-15T12:27:07.086562976Z" level=info msg="StartContainer for \"fce7805e691711b9971246be5822f531007036317749633587a5e005a4c03c07\" returns successfully"
May 15 12:27:07.090358 systemd[1]: cri-containerd-fce7805e691711b9971246be5822f531007036317749633587a5e005a4c03c07.scope: Deactivated successfully.
May 15 12:27:07.092251 containerd[1734]: time="2025-05-15T12:27:07.092146351Z" level=info msg="received exit event container_id:\"fce7805e691711b9971246be5822f531007036317749633587a5e005a4c03c07\" id:\"fce7805e691711b9971246be5822f531007036317749633587a5e005a4c03c07\" pid:4082 exited_at:{seconds:1747312027 nanos:91797155}"
May 15 12:27:07.092251 containerd[1734]: time="2025-05-15T12:27:07.092230470Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fce7805e691711b9971246be5822f531007036317749633587a5e005a4c03c07\" id:\"fce7805e691711b9971246be5822f531007036317749633587a5e005a4c03c07\" pid:4082 exited_at:{seconds:1747312027 nanos:91797155}"
May 15 12:27:07.107791 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-fce7805e691711b9971246be5822f531007036317749633587a5e005a4c03c07-rootfs.mount: Deactivated successfully.
May 15 12:27:08.431764 kubelet[3331]: E0515 12:27:08.431706 3331 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fthxd" podUID="55ce30e6-eb73-4f69-aef9-927a3bcb6662"
May 15 12:27:10.431714 kubelet[3331]: E0515 12:27:10.431684 3331 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fthxd" podUID="55ce30e6-eb73-4f69-aef9-927a3bcb6662"
May 15 12:27:12.431943 kubelet[3331]: E0515 12:27:12.431897 3331 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fthxd" podUID="55ce30e6-eb73-4f69-aef9-927a3bcb6662"
May 15 12:27:14.432074 kubelet[3331]: E0515 12:27:14.432016 3331 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fthxd" podUID="55ce30e6-eb73-4f69-aef9-927a3bcb6662"
May 15 12:27:16.431564 kubelet[3331]: E0515 12:27:16.431511 3331 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fthxd" podUID="55ce30e6-eb73-4f69-aef9-927a3bcb6662"
May 15 12:27:17.092785 containerd[1734]: time="2025-05-15T12:27:17.092740811Z" level=error msg="failed to handle container TaskExit event container_id:\"fce7805e691711b9971246be5822f531007036317749633587a5e005a4c03c07\" id:\"fce7805e691711b9971246be5822f531007036317749633587a5e005a4c03c07\" pid:4082 exited_at:{seconds:1747312027 nanos:91797155}" error="failed to stop container: failed to delete task: context deadline exceeded"
May 15 12:27:18.431535 kubelet[3331]: E0515 12:27:18.431501 3331 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fthxd" podUID="55ce30e6-eb73-4f69-aef9-927a3bcb6662"
May 15 12:27:18.539472 containerd[1734]: time="2025-05-15T12:27:18.539419781Z" level=info msg="TaskExit event container_id:\"fce7805e691711b9971246be5822f531007036317749633587a5e005a4c03c07\" id:\"fce7805e691711b9971246be5822f531007036317749633587a5e005a4c03c07\" pid:4082 exited_at:{seconds:1747312027 nanos:91797155}"
May 15 12:27:20.432113 kubelet[3331]: E0515 12:27:20.432070 3331 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fthxd" podUID="55ce30e6-eb73-4f69-aef9-927a3bcb6662"
May 15 12:27:21.273641 containerd[1734]: time="2025-05-15T12:27:20.539776496Z" level=error msg="get state for fce7805e691711b9971246be5822f531007036317749633587a5e005a4c03c07" error="context deadline exceeded"
May 15 12:27:21.273641 containerd[1734]: time="2025-05-15T12:27:20.539801445Z" level=warning msg="unknown status" status=0
May 15 12:27:21.738144 containerd[1734]: time="2025-05-15T12:27:21.738067111Z" level=error msg="ttrpc: received message on inactive stream" stream=35
May 15 12:27:21.738277 containerd[1734]: time="2025-05-15T12:27:21.738134332Z" level=error msg="ttrpc: received message on inactive stream" stream=31
May 15 12:27:21.739784 containerd[1734]: time="2025-05-15T12:27:21.739750379Z" level=info msg="Ensure that container fce7805e691711b9971246be5822f531007036317749633587a5e005a4c03c07 in task-service has been cleanup successfully"
May 15 12:27:22.432264 kubelet[3331]: E0515 12:27:22.432238 3331 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fthxd" podUID="55ce30e6-eb73-4f69-aef9-927a3bcb6662"
May 15 12:27:22.537424 containerd[1734]: time="2025-05-15T12:27:22.537392899Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.3\""
May 15 12:27:24.432623 kubelet[3331]: E0515 12:27:24.432523 3331 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fthxd" podUID="55ce30e6-eb73-4f69-aef9-927a3bcb6662"
May 15 12:27:26.431580 kubelet[3331]: E0515 12:27:26.431525 3331 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fthxd" podUID="55ce30e6-eb73-4f69-aef9-927a3bcb6662"
May 15 12:27:28.431529 kubelet[3331]: E0515 12:27:28.431489 3331 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fthxd" podUID="55ce30e6-eb73-4f69-aef9-927a3bcb6662"
May 15 12:27:30.432087 kubelet[3331]: E0515 12:27:30.432053 3331 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fthxd" podUID="55ce30e6-eb73-4f69-aef9-927a3bcb6662"
May 15 12:27:32.432436 kubelet[3331]: E0515 12:27:32.432360 3331 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fthxd" podUID="55ce30e6-eb73-4f69-aef9-927a3bcb6662"
May 15 12:27:34.432665 kubelet[3331]: E0515 12:27:34.432619 3331 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fthxd" podUID="55ce30e6-eb73-4f69-aef9-927a3bcb6662"
May 15 12:27:35.118986 containerd[1734]: time="2025-05-15T12:27:35.118945097Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 15 12:27:35.122380 containerd[1734]: time="2025-05-15T12:27:35.122348367Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.3: active requests=0, bytes read=97793683"
May 15 12:27:35.124885 containerd[1734]: time="2025-05-15T12:27:35.124854571Z" level=info msg="ImageCreate event name:\"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 15 12:27:35.128035 containerd[1734]: time="2025-05-15T12:27:35.127998139Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:4505ec8f976470994b6a94295a4dabac0cb98375db050e959a22603e00ada90b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 15 12:27:35.128769 containerd[1734]: time="2025-05-15T12:27:35.128353829Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.3\" with image id \"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:4505ec8f976470994b6a94295a4dabac0cb98375db050e959a22603e00ada90b\", size \"99286305\" in 12.590928114s"
May 15 12:27:35.128769 containerd[1734]: time="2025-05-15T12:27:35.128378994Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.3\" returns image reference \"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\""
May 15 12:27:35.130092 containerd[1734]: time="2025-05-15T12:27:35.130061947Z" level=info msg="CreateContainer within sandbox \"586431069cbf84e50bf569b8f9a1266e7bd3d1cec927c1c2a0ffa0e1cd2c2278\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
May 15 12:27:35.144158 containerd[1734]: time="2025-05-15T12:27:35.144123645Z" level=info msg="Container dabed820d95ecd4eaf8b92b9815ec8f68e13118a3bda36c913d88e1ff779af5e: CDI devices from CRI Config.CDIDevices: []"
May 15 12:27:35.161628 containerd[1734]: time="2025-05-15T12:27:35.161600362Z" level=info msg="CreateContainer within sandbox \"586431069cbf84e50bf569b8f9a1266e7bd3d1cec927c1c2a0ffa0e1cd2c2278\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"dabed820d95ecd4eaf8b92b9815ec8f68e13118a3bda36c913d88e1ff779af5e\""
May 15 12:27:35.161944 containerd[1734]: time="2025-05-15T12:27:35.161928767Z" level=info msg="StartContainer for \"dabed820d95ecd4eaf8b92b9815ec8f68e13118a3bda36c913d88e1ff779af5e\""
May 15 12:27:35.162983 containerd[1734]: time="2025-05-15T12:27:35.162949725Z" level=info msg="connecting to shim dabed820d95ecd4eaf8b92b9815ec8f68e13118a3bda36c913d88e1ff779af5e" address="unix:///run/containerd/s/d9d0270194fbe9d61146192750bea91b220ecf8b8fb1a869b075ad9752bd7591" protocol=ttrpc version=3
May 15 12:27:35.179302 systemd[1]: Started cri-containerd-dabed820d95ecd4eaf8b92b9815ec8f68e13118a3bda36c913d88e1ff779af5e.scope - libcontainer container dabed820d95ecd4eaf8b92b9815ec8f68e13118a3bda36c913d88e1ff779af5e.
May 15 12:27:35.207789 containerd[1734]: time="2025-05-15T12:27:35.207766118Z" level=info msg="StartContainer for \"dabed820d95ecd4eaf8b92b9815ec8f68e13118a3bda36c913d88e1ff779af5e\" returns successfully"
May 15 12:27:36.431934 kubelet[3331]: E0515 12:27:36.431897 3331 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fthxd" podUID="55ce30e6-eb73-4f69-aef9-927a3bcb6662"
May 15 12:27:38.432428 kubelet[3331]: E0515 12:27:38.432372 3331 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fthxd" podUID="55ce30e6-eb73-4f69-aef9-927a3bcb6662"
May 15 12:27:40.432240 kubelet[3331]: E0515 12:27:40.432203 3331 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fthxd" podUID="55ce30e6-eb73-4f69-aef9-927a3bcb6662"
May 15 12:27:41.240084 containerd[1734]: time="2025-05-15T12:27:41.240044452Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
May 15 12:27:41.241785 systemd[1]: cri-containerd-dabed820d95ecd4eaf8b92b9815ec8f68e13118a3bda36c913d88e1ff779af5e.scope: Deactivated successfully.
May 15 12:27:41.242022 systemd[1]: cri-containerd-dabed820d95ecd4eaf8b92b9815ec8f68e13118a3bda36c913d88e1ff779af5e.scope: Consumed 343ms CPU time, 174.6M memory peak, 154M written to disk.
May 15 12:27:41.244077 containerd[1734]: time="2025-05-15T12:27:41.243976214Z" level=info msg="received exit event container_id:\"dabed820d95ecd4eaf8b92b9815ec8f68e13118a3bda36c913d88e1ff779af5e\" id:\"dabed820d95ecd4eaf8b92b9815ec8f68e13118a3bda36c913d88e1ff779af5e\" pid:4144 exited_at:{seconds:1747312061 nanos:243792136}"
May 15 12:27:41.244077 containerd[1734]: time="2025-05-15T12:27:41.244049318Z" level=info msg="TaskExit event in podsandbox handler container_id:\"dabed820d95ecd4eaf8b92b9815ec8f68e13118a3bda36c913d88e1ff779af5e\" id:\"dabed820d95ecd4eaf8b92b9815ec8f68e13118a3bda36c913d88e1ff779af5e\" pid:4144 exited_at:{seconds:1747312061 nanos:243792136}"
May 15 12:27:41.260308 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-dabed820d95ecd4eaf8b92b9815ec8f68e13118a3bda36c913d88e1ff779af5e-rootfs.mount: Deactivated successfully.
May 15 12:27:41.330070 kubelet[3331]: I0515 12:27:41.330003 3331 kubelet_node_status.go:497] "Fast updating node status as it just became ready"
May 15 12:27:41.415795 kubelet[3331]: I0515 12:27:41.415668 3331 topology_manager.go:215] "Topology Admit Handler" podUID="6ec4fd21-5481-4869-8926-5a6b5b6153c9" podNamespace="kube-system" podName="coredns-7db6d8ff4d-s4cpx"
May 15 12:27:41.420857 kubelet[3331]: I0515 12:27:41.420418 3331 topology_manager.go:215] "Topology Admit Handler" podUID="d7dd18c0-69dd-4209-a984-7325dce5eb79" podNamespace="calico-system" podName="calico-kube-controllers-75f9d647bb-zfbp5"
May 15 12:27:41.420857 kubelet[3331]: I0515 12:27:41.420543 3331 topology_manager.go:215] "Topology Admit Handler" podUID="baa3369f-3b5a-4c16-a95f-d55a24190750" podNamespace="kube-system" podName="coredns-7db6d8ff4d-pkr4m"
May 15 12:27:41.427746 systemd[1]: Created slice kubepods-burstable-pod6ec4fd21_5481_4869_8926_5a6b5b6153c9.slice - libcontainer container kubepods-burstable-pod6ec4fd21_5481_4869_8926_5a6b5b6153c9.slice.
May 15 12:27:41.429106 kubelet[3331]: I0515 12:27:41.428974 3331 topology_manager.go:215] "Topology Admit Handler" podUID="1d54623e-7c9b-4f0a-8922-6bb47d7e26af" podNamespace="calico-apiserver" podName="calico-apiserver-b7f8fc4d6-nk2q9"
May 15 12:27:41.429265 kubelet[3331]: I0515 12:27:41.429202 3331 topology_manager.go:215] "Topology Admit Handler" podUID="dea655ce-5eab-4e67-a276-be0e9b39cc85" podNamespace="calico-apiserver" podName="calico-apiserver-b7f8fc4d6-6ch4b"
May 15 12:27:41.429874 kubelet[3331]: I0515 12:27:41.429660 3331 topology_manager.go:215] "Topology Admit Handler" podUID="5526c02b-5980-4359-8781-c8829963a4f3" podNamespace="calico-apiserver" podName="calico-apiserver-84478f496d-j8zht"
May 15 12:27:41.445675 systemd[1]: Created slice kubepods-burstable-podbaa3369f_3b5a_4c16_a95f_d55a24190750.slice - libcontainer container kubepods-burstable-podbaa3369f_3b5a_4c16_a95f_d55a24190750.slice.
May 15 12:27:41.451547 systemd[1]: Created slice kubepods-besteffort-podd7dd18c0_69dd_4209_a984_7325dce5eb79.slice - libcontainer container kubepods-besteffort-podd7dd18c0_69dd_4209_a984_7325dce5eb79.slice.
May 15 12:27:41.456848 systemd[1]: Created slice kubepods-besteffort-pod1d54623e_7c9b_4f0a_8922_6bb47d7e26af.slice - libcontainer container kubepods-besteffort-pod1d54623e_7c9b_4f0a_8922_6bb47d7e26af.slice.
May 15 12:27:41.461127 systemd[1]: Created slice kubepods-besteffort-poddea655ce_5eab_4e67_a276_be0e9b39cc85.slice - libcontainer container kubepods-besteffort-poddea655ce_5eab_4e67_a276_be0e9b39cc85.slice.
May 15 12:27:41.464804 systemd[1]: Created slice kubepods-besteffort-pod5526c02b_5980_4359_8781_c8829963a4f3.slice - libcontainer container kubepods-besteffort-pod5526c02b_5980_4359_8781_c8829963a4f3.slice.
May 15 12:27:41.591839 kubelet[3331]: I0515 12:27:41.591768 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7dd18c0-69dd-4209-a984-7325dce5eb79-tigera-ca-bundle\") pod \"calico-kube-controllers-75f9d647bb-zfbp5\" (UID: \"d7dd18c0-69dd-4209-a984-7325dce5eb79\") " pod="calico-system/calico-kube-controllers-75f9d647bb-zfbp5"
May 15 12:27:41.591839 kubelet[3331]: I0515 12:27:41.591800 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpjll\" (UniqueName: \"kubernetes.io/projected/d7dd18c0-69dd-4209-a984-7325dce5eb79-kube-api-access-dpjll\") pod \"calico-kube-controllers-75f9d647bb-zfbp5\" (UID: \"d7dd18c0-69dd-4209-a984-7325dce5eb79\") " pod="calico-system/calico-kube-controllers-75f9d647bb-zfbp5"
May 15 12:27:41.591839 kubelet[3331]: I0515 12:27:41.591821 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltrfz\" (UniqueName: \"kubernetes.io/projected/5526c02b-5980-4359-8781-c8829963a4f3-kube-api-access-ltrfz\") pod \"calico-apiserver-84478f496d-j8zht\" (UID: \"5526c02b-5980-4359-8781-c8829963a4f3\") " pod="calico-apiserver/calico-apiserver-84478f496d-j8zht"
May 15 12:27:41.591839 kubelet[3331]: I0515 12:27:41.591840 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ec4fd21-5481-4869-8926-5a6b5b6153c9-config-volume\") pod \"coredns-7db6d8ff4d-s4cpx\" (UID: \"6ec4fd21-5481-4869-8926-5a6b5b6153c9\") " pod="kube-system/coredns-7db6d8ff4d-s4cpx"
May 15 12:27:41.592143 kubelet[3331]: I0515 12:27:41.591855 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn2jj\" (UniqueName: \"kubernetes.io/projected/6ec4fd21-5481-4869-8926-5a6b5b6153c9-kube-api-access-cn2jj\") pod \"coredns-7db6d8ff4d-s4cpx\" (UID: \"6ec4fd21-5481-4869-8926-5a6b5b6153c9\") " pod="kube-system/coredns-7db6d8ff4d-s4cpx"
May 15 12:27:41.592143 kubelet[3331]: I0515 12:27:41.591872 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/baa3369f-3b5a-4c16-a95f-d55a24190750-config-volume\") pod \"coredns-7db6d8ff4d-pkr4m\" (UID: \"baa3369f-3b5a-4c16-a95f-d55a24190750\") " pod="kube-system/coredns-7db6d8ff4d-pkr4m"
May 15 12:27:41.592143 kubelet[3331]: I0515 12:27:41.591887 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/5526c02b-5980-4359-8781-c8829963a4f3-calico-apiserver-certs\") pod \"calico-apiserver-84478f496d-j8zht\" (UID: \"5526c02b-5980-4359-8781-c8829963a4f3\") " pod="calico-apiserver/calico-apiserver-84478f496d-j8zht"
May 15 12:27:41.592143 kubelet[3331]: I0515 12:27:41.591905 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/1d54623e-7c9b-4f0a-8922-6bb47d7e26af-calico-apiserver-certs\") pod \"calico-apiserver-b7f8fc4d6-nk2q9\" (UID: \"1d54623e-7c9b-4f0a-8922-6bb47d7e26af\") " pod="calico-apiserver/calico-apiserver-b7f8fc4d6-nk2q9"
May 15 12:27:41.592143 kubelet[3331]: I0515 12:27:41.591923 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g8g5\" (UniqueName: \"kubernetes.io/projected/dea655ce-5eab-4e67-a276-be0e9b39cc85-kube-api-access-6g8g5\") pod \"calico-apiserver-b7f8fc4d6-6ch4b\" (UID: \"dea655ce-5eab-4e67-a276-be0e9b39cc85\") " pod="calico-apiserver/calico-apiserver-b7f8fc4d6-6ch4b"
May 15 12:27:41.592266 kubelet[3331]: I0515 12:27:41.591939 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z85x\" (UniqueName: \"kubernetes.io/projected/baa3369f-3b5a-4c16-a95f-d55a24190750-kube-api-access-7z85x\") pod \"coredns-7db6d8ff4d-pkr4m\" (UID: \"baa3369f-3b5a-4c16-a95f-d55a24190750\") " pod="kube-system/coredns-7db6d8ff4d-pkr4m"
May 15 12:27:41.592266 kubelet[3331]: I0515 12:27:41.591956 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvldk\" (UniqueName: \"kubernetes.io/projected/1d54623e-7c9b-4f0a-8922-6bb47d7e26af-kube-api-access-dvldk\") pod \"calico-apiserver-b7f8fc4d6-nk2q9\" (UID: \"1d54623e-7c9b-4f0a-8922-6bb47d7e26af\") " pod="calico-apiserver/calico-apiserver-b7f8fc4d6-nk2q9"
May 15 12:27:41.592266 kubelet[3331]: I0515 12:27:41.591972 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/dea655ce-5eab-4e67-a276-be0e9b39cc85-calico-apiserver-certs\") pod \"calico-apiserver-b7f8fc4d6-6ch4b\" (UID: \"dea655ce-5eab-4e67-a276-be0e9b39cc85\") " pod="calico-apiserver/calico-apiserver-b7f8fc4d6-6ch4b"
May 15 12:27:41.742371 containerd[1734]: time="2025-05-15T12:27:41.742347769Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-s4cpx,Uid:6ec4fd21-5481-4869-8926-5a6b5b6153c9,Namespace:kube-system,Attempt:0,}"
May 15 12:27:41.749847 containerd[1734]: time="2025-05-15T12:27:41.749822256Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-pkr4m,Uid:baa3369f-3b5a-4c16-a95f-d55a24190750,Namespace:kube-system,Attempt:0,}"
May 15 12:27:41.754533 containerd[1734]: time="2025-05-15T12:27:41.754511473Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-75f9d647bb-zfbp5,Uid:d7dd18c0-69dd-4209-a984-7325dce5eb79,Namespace:calico-system,Attempt:0,}"
May 15 12:27:41.759256 containerd[1734]: time="2025-05-15T12:27:41.759231028Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b7f8fc4d6-nk2q9,Uid:1d54623e-7c9b-4f0a-8922-6bb47d7e26af,Namespace:calico-apiserver,Attempt:0,}"
May 15 12:27:41.763190 containerd[1734]: time="2025-05-15T12:27:41.763148567Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b7f8fc4d6-6ch4b,Uid:dea655ce-5eab-4e67-a276-be0e9b39cc85,Namespace:calico-apiserver,Attempt:0,}"
May 15 12:27:41.766857 containerd[1734]: time="2025-05-15T12:27:41.766822403Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-84478f496d-j8zht,Uid:5526c02b-5980-4359-8781-c8829963a4f3,Namespace:calico-apiserver,Attempt:0,}"
May 15 12:27:42.436273 systemd[1]: Created slice kubepods-besteffort-pod55ce30e6_eb73_4f69_aef9_927a3bcb6662.slice - libcontainer container kubepods-besteffort-pod55ce30e6_eb73_4f69_aef9_927a3bcb6662.slice.
May 15 12:27:42.438047 containerd[1734]: time="2025-05-15T12:27:42.438025531Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fthxd,Uid:55ce30e6-eb73-4f69-aef9-927a3bcb6662,Namespace:calico-system,Attempt:0,}"
May 15 12:27:49.911609 containerd[1734]: time="2025-05-15T12:27:49.911565980Z" level=error msg="Failed to destroy network for sandbox \"f893373bea2859b93f485343ae09cdd293604c00b80ffa87e197b6cee3b27be7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 15 12:27:50.002647 containerd[1734]: time="2025-05-15T12:27:50.002566222Z" level=error msg="Failed to destroy network for sandbox \"5d739454dc136c309a7cc7572fcc71b526d658b721781dc520faa7d7f14f7d0f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 15 12:27:50.019523 containerd[1734]: time="2025-05-15T12:27:50.019488143Z" level=error msg="Failed to destroy network for sandbox \"c98f97f94a8852a7a562d06d59d7d26580780a86ce6cf0044df0a9697a2e479f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 15 12:27:50.058432 containerd[1734]: time="2025-05-15T12:27:50.058405746Z" level=error msg="Failed to destroy network for sandbox \"70f3f68a76cd1ca0792a13022285f8849ebba3870a8b55e373ff7129ac5e3036\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 15 12:27:50.149863 containerd[1734]: time="2025-05-15T12:27:50.149823879Z" level=error msg="Failed to destroy network for sandbox \"9d93dcd121e15b57e133ad7ce56e4dcac02e57a5571aad5954ae9a8a6aa9b1cd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 15 12:27:50.200990 containerd[1734]: time="2025-05-15T12:27:50.200751759Z" level=error msg="Failed to destroy network for sandbox \"c2678696189e117a970d81b8bba9c5063db91804d4d70f58c9d70847170aa0ce\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 15 12:27:50.249227 containerd[1734]: time="2025-05-15T12:27:50.249186222Z" level=error msg="Failed to destroy network for sandbox \"f660d2603738786c5be4aa6c0f51f27121031ecfb8bc377dcffef4216452f885\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 15 12:27:50.280013 containerd[1734]: time="2025-05-15T12:27:50.279958300Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-s4cpx,Uid:6ec4fd21-5481-4869-8926-5a6b5b6153c9,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f893373bea2859b93f485343ae09cdd293604c00b80ffa87e197b6cee3b27be7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 15 12:27:50.281560 kubelet[3331]: E0515 12:27:50.280261 3331 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f893373bea2859b93f485343ae09cdd293604c00b80ffa87e197b6cee3b27be7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that
the calico/node container is running and has mounted /var/lib/calico/" May 15 12:27:50.281560 kubelet[3331]: E0515 12:27:50.280329 3331 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f893373bea2859b93f485343ae09cdd293604c00b80ffa87e197b6cee3b27be7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-s4cpx" May 15 12:27:50.281560 kubelet[3331]: E0515 12:27:50.280348 3331 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f893373bea2859b93f485343ae09cdd293604c00b80ffa87e197b6cee3b27be7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-s4cpx" May 15 12:27:50.281891 kubelet[3331]: E0515 12:27:50.280392 3331 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-s4cpx_kube-system(6ec4fd21-5481-4869-8926-5a6b5b6153c9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-s4cpx_kube-system(6ec4fd21-5481-4869-8926-5a6b5b6153c9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f893373bea2859b93f485343ae09cdd293604c00b80ffa87e197b6cee3b27be7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-s4cpx" podUID="6ec4fd21-5481-4869-8926-5a6b5b6153c9" May 15 12:27:50.327584 containerd[1734]: time="2025-05-15T12:27:50.327544441Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-7db6d8ff4d-pkr4m,Uid:baa3369f-3b5a-4c16-a95f-d55a24190750,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5d739454dc136c309a7cc7572fcc71b526d658b721781dc520faa7d7f14f7d0f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:27:50.327824 kubelet[3331]: E0515 12:27:50.327793 3331 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5d739454dc136c309a7cc7572fcc71b526d658b721781dc520faa7d7f14f7d0f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:27:50.327886 kubelet[3331]: E0515 12:27:50.327840 3331 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5d739454dc136c309a7cc7572fcc71b526d658b721781dc520faa7d7f14f7d0f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-pkr4m" May 15 12:27:50.327886 kubelet[3331]: E0515 12:27:50.327859 3331 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5d739454dc136c309a7cc7572fcc71b526d658b721781dc520faa7d7f14f7d0f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-pkr4m" May 15 12:27:50.327939 kubelet[3331]: E0515 12:27:50.327899 3331 pod_workers.go:1298] "Error syncing 
pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-pkr4m_kube-system(baa3369f-3b5a-4c16-a95f-d55a24190750)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-pkr4m_kube-system(baa3369f-3b5a-4c16-a95f-d55a24190750)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5d739454dc136c309a7cc7572fcc71b526d658b721781dc520faa7d7f14f7d0f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-pkr4m" podUID="baa3369f-3b5a-4c16-a95f-d55a24190750" May 15 12:27:50.375213 containerd[1734]: time="2025-05-15T12:27:50.375147006Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-75f9d647bb-zfbp5,Uid:d7dd18c0-69dd-4209-a984-7325dce5eb79,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c98f97f94a8852a7a562d06d59d7d26580780a86ce6cf0044df0a9697a2e479f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:27:50.375344 kubelet[3331]: E0515 12:27:50.375308 3331 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c98f97f94a8852a7a562d06d59d7d26580780a86ce6cf0044df0a9697a2e479f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:27:50.375390 kubelet[3331]: E0515 12:27:50.375345 3331 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"c98f97f94a8852a7a562d06d59d7d26580780a86ce6cf0044df0a9697a2e479f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-75f9d647bb-zfbp5" May 15 12:27:50.375390 kubelet[3331]: E0515 12:27:50.375377 3331 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c98f97f94a8852a7a562d06d59d7d26580780a86ce6cf0044df0a9697a2e479f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-75f9d647bb-zfbp5" May 15 12:27:50.375443 kubelet[3331]: E0515 12:27:50.375408 3331 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-75f9d647bb-zfbp5_calico-system(d7dd18c0-69dd-4209-a984-7325dce5eb79)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-75f9d647bb-zfbp5_calico-system(d7dd18c0-69dd-4209-a984-7325dce5eb79)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c98f97f94a8852a7a562d06d59d7d26580780a86ce6cf0044df0a9697a2e479f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-75f9d647bb-zfbp5" podUID="d7dd18c0-69dd-4209-a984-7325dce5eb79" May 15 12:27:50.421227 containerd[1734]: time="2025-05-15T12:27:50.421181574Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b7f8fc4d6-nk2q9,Uid:1d54623e-7c9b-4f0a-8922-6bb47d7e26af,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network 
for sandbox \"70f3f68a76cd1ca0792a13022285f8849ebba3870a8b55e373ff7129ac5e3036\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:27:50.421386 kubelet[3331]: E0515 12:27:50.421350 3331 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"70f3f68a76cd1ca0792a13022285f8849ebba3870a8b55e373ff7129ac5e3036\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:27:50.421447 kubelet[3331]: E0515 12:27:50.421402 3331 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"70f3f68a76cd1ca0792a13022285f8849ebba3870a8b55e373ff7129ac5e3036\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b7f8fc4d6-nk2q9" May 15 12:27:50.421447 kubelet[3331]: E0515 12:27:50.421419 3331 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"70f3f68a76cd1ca0792a13022285f8849ebba3870a8b55e373ff7129ac5e3036\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b7f8fc4d6-nk2q9" May 15 12:27:50.421511 kubelet[3331]: E0515 12:27:50.421449 3331 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-b7f8fc4d6-nk2q9_calico-apiserver(1d54623e-7c9b-4f0a-8922-6bb47d7e26af)\" with CreatePodSandboxError: 
\"Failed to create sandbox for pod \\\"calico-apiserver-b7f8fc4d6-nk2q9_calico-apiserver(1d54623e-7c9b-4f0a-8922-6bb47d7e26af)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"70f3f68a76cd1ca0792a13022285f8849ebba3870a8b55e373ff7129ac5e3036\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-b7f8fc4d6-nk2q9" podUID="1d54623e-7c9b-4f0a-8922-6bb47d7e26af" May 15 12:27:50.423232 containerd[1734]: time="2025-05-15T12:27:50.423193912Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b7f8fc4d6-6ch4b,Uid:dea655ce-5eab-4e67-a276-be0e9b39cc85,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9d93dcd121e15b57e133ad7ce56e4dcac02e57a5571aad5954ae9a8a6aa9b1cd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:27:50.423386 kubelet[3331]: E0515 12:27:50.423363 3331 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9d93dcd121e15b57e133ad7ce56e4dcac02e57a5571aad5954ae9a8a6aa9b1cd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:27:50.423441 kubelet[3331]: E0515 12:27:50.423399 3331 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9d93dcd121e15b57e133ad7ce56e4dcac02e57a5571aad5954ae9a8a6aa9b1cd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b7f8fc4d6-6ch4b" May 15 12:27:50.423441 kubelet[3331]: E0515 12:27:50.423415 3331 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9d93dcd121e15b57e133ad7ce56e4dcac02e57a5571aad5954ae9a8a6aa9b1cd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b7f8fc4d6-6ch4b" May 15 12:27:50.423494 kubelet[3331]: E0515 12:27:50.423444 3331 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-b7f8fc4d6-6ch4b_calico-apiserver(dea655ce-5eab-4e67-a276-be0e9b39cc85)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-b7f8fc4d6-6ch4b_calico-apiserver(dea655ce-5eab-4e67-a276-be0e9b39cc85)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9d93dcd121e15b57e133ad7ce56e4dcac02e57a5571aad5954ae9a8a6aa9b1cd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-b7f8fc4d6-6ch4b" podUID="dea655ce-5eab-4e67-a276-be0e9b39cc85" May 15 12:27:50.471997 containerd[1734]: time="2025-05-15T12:27:50.471915571Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-84478f496d-j8zht,Uid:5526c02b-5980-4359-8781-c8829963a4f3,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c2678696189e117a970d81b8bba9c5063db91804d4d70f58c9d70847170aa0ce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container 
is running and has mounted /var/lib/calico/" May 15 12:27:50.472337 kubelet[3331]: E0515 12:27:50.472062 3331 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c2678696189e117a970d81b8bba9c5063db91804d4d70f58c9d70847170aa0ce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:27:50.472337 kubelet[3331]: E0515 12:27:50.472095 3331 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c2678696189e117a970d81b8bba9c5063db91804d4d70f58c9d70847170aa0ce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-84478f496d-j8zht" May 15 12:27:50.472337 kubelet[3331]: E0515 12:27:50.472117 3331 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c2678696189e117a970d81b8bba9c5063db91804d4d70f58c9d70847170aa0ce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-84478f496d-j8zht" May 15 12:27:50.472438 kubelet[3331]: E0515 12:27:50.472148 3331 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-84478f496d-j8zht_calico-apiserver(5526c02b-5980-4359-8781-c8829963a4f3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-84478f496d-j8zht_calico-apiserver(5526c02b-5980-4359-8781-c8829963a4f3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"c2678696189e117a970d81b8bba9c5063db91804d4d70f58c9d70847170aa0ce\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-84478f496d-j8zht" podUID="5526c02b-5980-4359-8781-c8829963a4f3" May 15 12:27:50.515360 containerd[1734]: time="2025-05-15T12:27:50.515323383Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fthxd,Uid:55ce30e6-eb73-4f69-aef9-927a3bcb6662,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f660d2603738786c5be4aa6c0f51f27121031ecfb8bc377dcffef4216452f885\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:27:50.515510 kubelet[3331]: E0515 12:27:50.515469 3331 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f660d2603738786c5be4aa6c0f51f27121031ecfb8bc377dcffef4216452f885\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:27:50.515510 kubelet[3331]: E0515 12:27:50.515504 3331 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f660d2603738786c5be4aa6c0f51f27121031ecfb8bc377dcffef4216452f885\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fthxd" May 15 12:27:50.515578 kubelet[3331]: E0515 12:27:50.515530 3331 kuberuntime_manager.go:1166] "CreatePodSandbox for pod 
failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f660d2603738786c5be4aa6c0f51f27121031ecfb8bc377dcffef4216452f885\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fthxd" May 15 12:27:50.515578 kubelet[3331]: E0515 12:27:50.515563 3331 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-fthxd_calico-system(55ce30e6-eb73-4f69-aef9-927a3bcb6662)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-fthxd_calico-system(55ce30e6-eb73-4f69-aef9-927a3bcb6662)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f660d2603738786c5be4aa6c0f51f27121031ecfb8bc377dcffef4216452f885\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-fthxd" podUID="55ce30e6-eb73-4f69-aef9-927a3bcb6662" May 15 12:27:50.580889 containerd[1734]: time="2025-05-15T12:27:50.580770443Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.3\"" May 15 12:27:50.823911 systemd[1]: run-netns-cni\x2d0eacc5cc\x2dcf5c\x2d7cd2\x2db16f\x2df53ed9b340b6.mount: Deactivated successfully. May 15 12:27:50.824393 systemd[1]: run-netns-cni\x2d660a7894\x2d7d6d\x2ddaed\x2db038\x2dfddd2dcb6ceb.mount: Deactivated successfully. May 15 12:27:50.824496 systemd[1]: run-netns-cni\x2daa1aa5ee\x2ddf11\x2dd3ef\x2d96e1\x2dbd5669031357.mount: Deactivated successfully. May 15 12:27:50.824581 systemd[1]: run-netns-cni\x2d6eabb65a\x2da05e\x2d9f35\x2d5e0f\x2d17bae8cf3923.mount: Deactivated successfully. May 15 12:27:50.824668 systemd[1]: run-netns-cni\x2dd62df651\x2df1ce\x2d8e91\x2d7641\x2d415bec6cfb6f.mount: Deactivated successfully. 
May 15 12:27:50.824750 systemd[1]: run-netns-cni\x2d94b416da\x2d1a09\x2da021\x2dd4d0\x2d8c943f951601.mount: Deactivated successfully. May 15 12:27:50.824825 systemd[1]: run-netns-cni\x2d177ba72c\x2d6785\x2d6d48\x2de8c1\x2ddeb899bd9929.mount: Deactivated successfully. May 15 12:28:01.369962 containerd[1734]: time="2025-05-15T12:28:01.369687685Z" level=info msg="StopContainer for \"c9f32792864b0164d295293b779b0ac86cfd1a5837b466a6ca46f721b271f2ff\" with timeout 300 (s)" May 15 12:28:01.528318 containerd[1734]: time="2025-05-15T12:28:01.371740084Z" level=info msg="Stop container \"c9f32792864b0164d295293b779b0ac86cfd1a5837b466a6ca46f721b271f2ff\" with signal terminated" May 15 12:28:01.528318 containerd[1734]: time="2025-05-15T12:28:01.388355804Z" level=info msg="received exit event container_id:\"c9f32792864b0164d295293b779b0ac86cfd1a5837b466a6ca46f721b271f2ff\" id:\"c9f32792864b0164d295293b779b0ac86cfd1a5837b466a6ca46f721b271f2ff\" pid:3902 exit_status:1 exited_at:{seconds:1747312081 nanos:388008498}" May 15 12:28:01.528318 containerd[1734]: time="2025-05-15T12:28:01.388903447Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c9f32792864b0164d295293b779b0ac86cfd1a5837b466a6ca46f721b271f2ff\" id:\"c9f32792864b0164d295293b779b0ac86cfd1a5837b466a6ca46f721b271f2ff\" pid:3902 exit_status:1 exited_at:{seconds:1747312081 nanos:388008498}" May 15 12:28:01.528318 containerd[1734]: time="2025-05-15T12:28:01.432621141Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-s4cpx,Uid:6ec4fd21-5481-4869-8926-5a6b5b6153c9,Namespace:kube-system,Attempt:0,}" May 15 12:28:01.528318 containerd[1734]: time="2025-05-15T12:28:01.432647164Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-75f9d647bb-zfbp5,Uid:d7dd18c0-69dd-4209-a984-7325dce5eb79,Namespace:calico-system,Attempt:0,}" May 15 12:28:01.528318 containerd[1734]: time="2025-05-15T12:28:01.432778404Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:csi-node-driver-fthxd,Uid:55ce30e6-eb73-4f69-aef9-927a3bcb6662,Namespace:calico-system,Attempt:0,}" May 15 12:28:01.386783 systemd[1]: cri-containerd-c9f32792864b0164d295293b779b0ac86cfd1a5837b466a6ca46f721b271f2ff.scope: Deactivated successfully. May 15 12:28:01.405086 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-c9f32792864b0164d295293b779b0ac86cfd1a5837b466a6ca46f721b271f2ff-rootfs.mount: Deactivated successfully. May 15 12:28:02.432293 containerd[1734]: time="2025-05-15T12:28:02.432094706Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-pkr4m,Uid:baa3369f-3b5a-4c16-a95f-d55a24190750,Namespace:kube-system,Attempt:0,}" May 15 12:28:02.432293 containerd[1734]: time="2025-05-15T12:28:02.432205499Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b7f8fc4d6-nk2q9,Uid:1d54623e-7c9b-4f0a-8922-6bb47d7e26af,Namespace:calico-apiserver,Attempt:0,}" May 15 12:28:02.432293 containerd[1734]: time="2025-05-15T12:28:02.432098210Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b7f8fc4d6-6ch4b,Uid:dea655ce-5eab-4e67-a276-be0e9b39cc85,Namespace:calico-apiserver,Attempt:0,}" May 15 12:28:04.074971 containerd[1734]: time="2025-05-15T12:28:04.074844350Z" level=info msg="StopContainer for \"c9f32792864b0164d295293b779b0ac86cfd1a5837b466a6ca46f721b271f2ff\" returns successfully" May 15 12:28:04.076677 containerd[1734]: time="2025-05-15T12:28:04.076394363Z" level=info msg="StopPodSandbox for \"fb75360b9e2c61758d88df4c05fa45696cc612c10de6005a4571ae0eb5905bfd\"" May 15 12:28:04.076677 containerd[1734]: time="2025-05-15T12:28:04.076454902Z" level=info msg="Container to stop \"c9f32792864b0164d295293b779b0ac86cfd1a5837b466a6ca46f721b271f2ff\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" May 15 12:28:04.082855 systemd[1]: cri-containerd-fb75360b9e2c61758d88df4c05fa45696cc612c10de6005a4571ae0eb5905bfd.scope: Deactivated successfully. 
May 15 12:28:04.086139 containerd[1734]: time="2025-05-15T12:28:04.086028564Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fb75360b9e2c61758d88df4c05fa45696cc612c10de6005a4571ae0eb5905bfd\" id:\"fb75360b9e2c61758d88df4c05fa45696cc612c10de6005a4571ae0eb5905bfd\" pid:3732 exit_status:137 exited_at:{seconds:1747312084 nanos:83797158}" May 15 12:28:04.222799 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-fb75360b9e2c61758d88df4c05fa45696cc612c10de6005a4571ae0eb5905bfd-rootfs.mount: Deactivated successfully. May 15 12:28:04.628652 containerd[1734]: time="2025-05-15T12:28:04.628566244Z" level=info msg="shim disconnected" id=fb75360b9e2c61758d88df4c05fa45696cc612c10de6005a4571ae0eb5905bfd namespace=k8s.io May 15 12:28:04.628652 containerd[1734]: time="2025-05-15T12:28:04.628597237Z" level=warning msg="cleaning up after shim disconnected" id=fb75360b9e2c61758d88df4c05fa45696cc612c10de6005a4571ae0eb5905bfd namespace=k8s.io May 15 12:28:04.628652 containerd[1734]: time="2025-05-15T12:28:04.628604063Z" level=info msg="cleaning up dead shim" namespace=k8s.io May 15 12:28:04.638765 containerd[1734]: time="2025-05-15T12:28:04.638288271Z" level=info msg="received exit event sandbox_id:\"fb75360b9e2c61758d88df4c05fa45696cc612c10de6005a4571ae0eb5905bfd\" exit_status:137 exited_at:{seconds:1747312084 nanos:83797158}" May 15 12:28:04.640597 containerd[1734]: time="2025-05-15T12:28:04.638840358Z" level=info msg="TearDown network for sandbox \"fb75360b9e2c61758d88df4c05fa45696cc612c10de6005a4571ae0eb5905bfd\" successfully" May 15 12:28:04.640597 containerd[1734]: time="2025-05-15T12:28:04.638860034Z" level=info msg="StopPodSandbox for \"fb75360b9e2c61758d88df4c05fa45696cc612c10de6005a4571ae0eb5905bfd\" returns successfully" May 15 12:28:04.640344 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-fb75360b9e2c61758d88df4c05fa45696cc612c10de6005a4571ae0eb5905bfd-shm.mount: Deactivated successfully. 
May 15 12:28:04.661384 kubelet[3331]: I0515 12:28:04.661138 3331 topology_manager.go:215] "Topology Admit Handler" podUID="6d220139-04b4-4ee6-92bf-c337bb582952" podNamespace="calico-system" podName="calico-typha-6f8d86f6d-xjrdd" May 15 12:28:04.662546 kubelet[3331]: E0515 12:28:04.662261 3331 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="a28ba243-cea9-4d25-b6fc-ca52a3819ed6" containerName="calico-typha" May 15 12:28:04.662546 kubelet[3331]: I0515 12:28:04.662307 3331 memory_manager.go:354] "RemoveStaleState removing state" podUID="a28ba243-cea9-4d25-b6fc-ca52a3819ed6" containerName="calico-typha" May 15 12:28:04.670914 systemd[1]: Created slice kubepods-besteffort-pod6d220139_04b4_4ee6_92bf_c337bb582952.slice - libcontainer container kubepods-besteffort-pod6d220139_04b4_4ee6_92bf_c337bb582952.slice. May 15 12:28:04.721152 containerd[1734]: time="2025-05-15T12:28:04.721124688Z" level=error msg="Failed to destroy network for sandbox \"5aeef91fb992dc465c4f76251b83abfba81557f53e54a659dfbeb878de192e9d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:04.722827 systemd[1]: run-netns-cni\x2d3fa48439\x2d5eb2\x2d2085\x2dd684\x2db712e8d5135e.mount: Deactivated successfully. 
May 15 12:28:04.790742 kubelet[3331]: I0515 12:28:04.790284 3331 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a28ba243-cea9-4d25-b6fc-ca52a3819ed6-tigera-ca-bundle\") pod \"a28ba243-cea9-4d25-b6fc-ca52a3819ed6\" (UID: \"a28ba243-cea9-4d25-b6fc-ca52a3819ed6\") " May 15 12:28:04.790742 kubelet[3331]: I0515 12:28:04.790319 3331 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbfbb\" (UniqueName: \"kubernetes.io/projected/a28ba243-cea9-4d25-b6fc-ca52a3819ed6-kube-api-access-zbfbb\") pod \"a28ba243-cea9-4d25-b6fc-ca52a3819ed6\" (UID: \"a28ba243-cea9-4d25-b6fc-ca52a3819ed6\") " May 15 12:28:04.791267 kubelet[3331]: I0515 12:28:04.791230 3331 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/a28ba243-cea9-4d25-b6fc-ca52a3819ed6-typha-certs\") pod \"a28ba243-cea9-4d25-b6fc-ca52a3819ed6\" (UID: \"a28ba243-cea9-4d25-b6fc-ca52a3819ed6\") " May 15 12:28:04.791328 kubelet[3331]: I0515 12:28:04.791314 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d220139-04b4-4ee6-92bf-c337bb582952-tigera-ca-bundle\") pod \"calico-typha-6f8d86f6d-xjrdd\" (UID: \"6d220139-04b4-4ee6-92bf-c337bb582952\") " pod="calico-system/calico-typha-6f8d86f6d-xjrdd" May 15 12:28:04.791355 kubelet[3331]: I0515 12:28:04.791344 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/6d220139-04b4-4ee6-92bf-c337bb582952-typha-certs\") pod \"calico-typha-6f8d86f6d-xjrdd\" (UID: \"6d220139-04b4-4ee6-92bf-c337bb582952\") " pod="calico-system/calico-typha-6f8d86f6d-xjrdd" May 15 12:28:04.791382 kubelet[3331]: I0515 12:28:04.791363 3331 reconciler_common.go:247] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp8k7\" (UniqueName: \"kubernetes.io/projected/6d220139-04b4-4ee6-92bf-c337bb582952-kube-api-access-fp8k7\") pod \"calico-typha-6f8d86f6d-xjrdd\" (UID: \"6d220139-04b4-4ee6-92bf-c337bb582952\") " pod="calico-system/calico-typha-6f8d86f6d-xjrdd" May 15 12:28:04.796193 systemd[1]: var-lib-kubelet-pods-a28ba243\x2dcea9\x2d4d25\x2db6fc\x2dca52a3819ed6-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dzbfbb.mount: Deactivated successfully. May 15 12:28:04.799698 kubelet[3331]: I0515 12:28:04.799662 3331 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a28ba243-cea9-4d25-b6fc-ca52a3819ed6-kube-api-access-zbfbb" (OuterVolumeSpecName: "kube-api-access-zbfbb") pod "a28ba243-cea9-4d25-b6fc-ca52a3819ed6" (UID: "a28ba243-cea9-4d25-b6fc-ca52a3819ed6"). InnerVolumeSpecName "kube-api-access-zbfbb". PluginName "kubernetes.io/projected", VolumeGidValue "" May 15 12:28:04.799846 systemd[1]: var-lib-kubelet-pods-a28ba243\x2dcea9\x2d4d25\x2db6fc\x2dca52a3819ed6-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dtypha-1.mount: Deactivated successfully. May 15 12:28:04.800225 systemd[1]: var-lib-kubelet-pods-a28ba243\x2dcea9\x2d4d25\x2db6fc\x2dca52a3819ed6-volumes-kubernetes.io\x7esecret-typha\x2dcerts.mount: Deactivated successfully. May 15 12:28:04.802096 kubelet[3331]: I0515 12:28:04.802071 3331 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a28ba243-cea9-4d25-b6fc-ca52a3819ed6-typha-certs" (OuterVolumeSpecName: "typha-certs") pod "a28ba243-cea9-4d25-b6fc-ca52a3819ed6" (UID: "a28ba243-cea9-4d25-b6fc-ca52a3819ed6"). InnerVolumeSpecName "typha-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" May 15 12:28:04.802567 kubelet[3331]: I0515 12:28:04.802550 3331 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a28ba243-cea9-4d25-b6fc-ca52a3819ed6-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "a28ba243-cea9-4d25-b6fc-ca52a3819ed6" (UID: "a28ba243-cea9-4d25-b6fc-ca52a3819ed6"). InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" May 15 12:28:04.812844 containerd[1734]: time="2025-05-15T12:28:04.812818007Z" level=error msg="Failed to destroy network for sandbox \"8aba31e25690b6c195906e33462313212bb576542aaf7646f57893058403bdf3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:04.858422 containerd[1734]: time="2025-05-15T12:28:04.858346236Z" level=error msg="Failed to destroy network for sandbox \"34ab3f46666aee93b61ca3803b133e71b09f94d7020cdb60042d4cc5bedb99b5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:04.860946 containerd[1734]: time="2025-05-15T12:28:04.860912399Z" level=error msg="Failed to destroy network for sandbox \"724d0b1a4dbd22fbfdb6a4fc1aa0e9137bf53a18a0b3b4bb0a5e1dc934afe042\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:04.892768 kubelet[3331]: I0515 12:28:04.892443 3331 reconciler_common.go:289] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a28ba243-cea9-4d25-b6fc-ca52a3819ed6-tigera-ca-bundle\") on node \"ci-4334.0.0-a-9b1bbdffc7\" DevicePath \"\"" May 15 12:28:04.892768 kubelet[3331]: 
I0515 12:28:04.892466 3331 reconciler_common.go:289] "Volume detached for volume \"kube-api-access-zbfbb\" (UniqueName: \"kubernetes.io/projected/a28ba243-cea9-4d25-b6fc-ca52a3819ed6-kube-api-access-zbfbb\") on node \"ci-4334.0.0-a-9b1bbdffc7\" DevicePath \"\"" May 15 12:28:04.892768 kubelet[3331]: I0515 12:28:04.892477 3331 reconciler_common.go:289] "Volume detached for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/a28ba243-cea9-4d25-b6fc-ca52a3819ed6-typha-certs\") on node \"ci-4334.0.0-a-9b1bbdffc7\" DevicePath \"\"" May 15 12:28:04.898581 containerd[1734]: time="2025-05-15T12:28:04.898557117Z" level=error msg="Failed to destroy network for sandbox \"3f662fe5e4cf71f6d09e7a918aeeb53921a1155d9c42356d8fa09248e5f21ca8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:04.961292 containerd[1734]: time="2025-05-15T12:28:04.961263908Z" level=error msg="Failed to destroy network for sandbox \"cc5f9c4850f47daae6ba444d9037c52f3633e95d7287278cf9ea9e68c82b749e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:04.976499 containerd[1734]: time="2025-05-15T12:28:04.976472281Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6f8d86f6d-xjrdd,Uid:6d220139-04b4-4ee6-92bf-c337bb582952,Namespace:calico-system,Attempt:0,}" May 15 12:28:04.978582 containerd[1734]: time="2025-05-15T12:28:04.978553508Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-75f9d647bb-zfbp5,Uid:d7dd18c0-69dd-4209-a984-7325dce5eb79,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5aeef91fb992dc465c4f76251b83abfba81557f53e54a659dfbeb878de192e9d\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:04.978810 kubelet[3331]: E0515 12:28:04.978784 3331 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5aeef91fb992dc465c4f76251b83abfba81557f53e54a659dfbeb878de192e9d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:04.978863 kubelet[3331]: E0515 12:28:04.978830 3331 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5aeef91fb992dc465c4f76251b83abfba81557f53e54a659dfbeb878de192e9d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-75f9d647bb-zfbp5" May 15 12:28:05.027180 containerd[1734]: time="2025-05-15T12:28:05.027133246Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-s4cpx,Uid:6ec4fd21-5481-4869-8926-5a6b5b6153c9,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8aba31e25690b6c195906e33462313212bb576542aaf7646f57893058403bdf3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:05.027361 kubelet[3331]: E0515 12:28:05.027338 3331 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8aba31e25690b6c195906e33462313212bb576542aaf7646f57893058403bdf3\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:05.027434 kubelet[3331]: E0515 12:28:05.027422 3331 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8aba31e25690b6c195906e33462313212bb576542aaf7646f57893058403bdf3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-s4cpx" May 15 12:28:05.027460 kubelet[3331]: E0515 12:28:05.027443 3331 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8aba31e25690b6c195906e33462313212bb576542aaf7646f57893058403bdf3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-s4cpx" May 15 12:28:05.027572 kubelet[3331]: E0515 12:28:05.027478 3331 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-s4cpx_kube-system(6ec4fd21-5481-4869-8926-5a6b5b6153c9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-s4cpx_kube-system(6ec4fd21-5481-4869-8926-5a6b5b6153c9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8aba31e25690b6c195906e33462313212bb576542aaf7646f57893058403bdf3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-s4cpx" podUID="6ec4fd21-5481-4869-8926-5a6b5b6153c9" May 15 12:28:05.074686 containerd[1734]: 
time="2025-05-15T12:28:05.074624439Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fthxd,Uid:55ce30e6-eb73-4f69-aef9-927a3bcb6662,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"34ab3f46666aee93b61ca3803b133e71b09f94d7020cdb60042d4cc5bedb99b5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:05.075184 kubelet[3331]: E0515 12:28:05.075042 3331 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"34ab3f46666aee93b61ca3803b133e71b09f94d7020cdb60042d4cc5bedb99b5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:05.075184 kubelet[3331]: E0515 12:28:05.075079 3331 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"34ab3f46666aee93b61ca3803b133e71b09f94d7020cdb60042d4cc5bedb99b5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fthxd" May 15 12:28:05.075184 kubelet[3331]: E0515 12:28:05.075096 3331 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"34ab3f46666aee93b61ca3803b133e71b09f94d7020cdb60042d4cc5bedb99b5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fthxd" May 15 12:28:05.075293 
kubelet[3331]: E0515 12:28:05.075132 3331 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-fthxd_calico-system(55ce30e6-eb73-4f69-aef9-927a3bcb6662)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-fthxd_calico-system(55ce30e6-eb73-4f69-aef9-927a3bcb6662)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"34ab3f46666aee93b61ca3803b133e71b09f94d7020cdb60042d4cc5bedb99b5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-fthxd" podUID="55ce30e6-eb73-4f69-aef9-927a3bcb6662" May 15 12:28:05.123526 containerd[1734]: time="2025-05-15T12:28:05.123363643Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-pkr4m,Uid:baa3369f-3b5a-4c16-a95f-d55a24190750,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"724d0b1a4dbd22fbfdb6a4fc1aa0e9137bf53a18a0b3b4bb0a5e1dc934afe042\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:05.124083 kubelet[3331]: E0515 12:28:05.123996 3331 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"724d0b1a4dbd22fbfdb6a4fc1aa0e9137bf53a18a0b3b4bb0a5e1dc934afe042\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:05.124695 kubelet[3331]: E0515 12:28:05.124113 3331 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for 
sandbox \"724d0b1a4dbd22fbfdb6a4fc1aa0e9137bf53a18a0b3b4bb0a5e1dc934afe042\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-pkr4m" May 15 12:28:05.124695 kubelet[3331]: E0515 12:28:05.124133 3331 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"724d0b1a4dbd22fbfdb6a4fc1aa0e9137bf53a18a0b3b4bb0a5e1dc934afe042\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-pkr4m" May 15 12:28:05.124695 kubelet[3331]: E0515 12:28:05.124183 3331 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-pkr4m_kube-system(baa3369f-3b5a-4c16-a95f-d55a24190750)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-pkr4m_kube-system(baa3369f-3b5a-4c16-a95f-d55a24190750)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"724d0b1a4dbd22fbfdb6a4fc1aa0e9137bf53a18a0b3b4bb0a5e1dc934afe042\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-pkr4m" podUID="baa3369f-3b5a-4c16-a95f-d55a24190750" May 15 12:28:05.169397 containerd[1734]: time="2025-05-15T12:28:05.169334402Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b7f8fc4d6-nk2q9,Uid:1d54623e-7c9b-4f0a-8922-6bb47d7e26af,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3f662fe5e4cf71f6d09e7a918aeeb53921a1155d9c42356d8fa09248e5f21ca8\": 
plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:05.169898 kubelet[3331]: E0515 12:28:05.169877 3331 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3f662fe5e4cf71f6d09e7a918aeeb53921a1155d9c42356d8fa09248e5f21ca8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:05.170015 kubelet[3331]: E0515 12:28:05.170002 3331 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3f662fe5e4cf71f6d09e7a918aeeb53921a1155d9c42356d8fa09248e5f21ca8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b7f8fc4d6-nk2q9" May 15 12:28:05.170100 kubelet[3331]: E0515 12:28:05.170078 3331 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3f662fe5e4cf71f6d09e7a918aeeb53921a1155d9c42356d8fa09248e5f21ca8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b7f8fc4d6-nk2q9" May 15 12:28:05.170356 kubelet[3331]: E0515 12:28:05.170141 3331 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-b7f8fc4d6-nk2q9_calico-apiserver(1d54623e-7c9b-4f0a-8922-6bb47d7e26af)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-apiserver-b7f8fc4d6-nk2q9_calico-apiserver(1d54623e-7c9b-4f0a-8922-6bb47d7e26af)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3f662fe5e4cf71f6d09e7a918aeeb53921a1155d9c42356d8fa09248e5f21ca8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-b7f8fc4d6-nk2q9" podUID="1d54623e-7c9b-4f0a-8922-6bb47d7e26af" May 15 12:28:05.172658 containerd[1734]: time="2025-05-15T12:28:05.172555169Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b7f8fc4d6-6ch4b,Uid:dea655ce-5eab-4e67-a276-be0e9b39cc85,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"cc5f9c4850f47daae6ba444d9037c52f3633e95d7287278cf9ea9e68c82b749e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:05.173109 kubelet[3331]: E0515 12:28:05.173060 3331 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cc5f9c4850f47daae6ba444d9037c52f3633e95d7287278cf9ea9e68c82b749e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:05.173244 kubelet[3331]: E0515 12:28:05.173194 3331 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cc5f9c4850f47daae6ba444d9037c52f3633e95d7287278cf9ea9e68c82b749e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="calico-apiserver/calico-apiserver-b7f8fc4d6-6ch4b" May 15 12:28:05.173244 kubelet[3331]: E0515 12:28:05.173217 3331 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cc5f9c4850f47daae6ba444d9037c52f3633e95d7287278cf9ea9e68c82b749e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b7f8fc4d6-6ch4b" May 15 12:28:05.173300 kubelet[3331]: E0515 12:28:05.173264 3331 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-b7f8fc4d6-6ch4b_calico-apiserver(dea655ce-5eab-4e67-a276-be0e9b39cc85)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-b7f8fc4d6-6ch4b_calico-apiserver(dea655ce-5eab-4e67-a276-be0e9b39cc85)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cc5f9c4850f47daae6ba444d9037c52f3633e95d7287278cf9ea9e68c82b749e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-b7f8fc4d6-6ch4b" podUID="dea655ce-5eab-4e67-a276-be0e9b39cc85" May 15 12:28:05.225323 systemd[1]: run-netns-cni\x2df90d4376\x2d25f8\x2d9aac\x2dbf83\x2d258f55a262cb.mount: Deactivated successfully. May 15 12:28:05.225584 systemd[1]: run-netns-cni\x2d9c91a358\x2d0e52\x2d2eb6\x2d104d\x2d761e9e4cb4a6.mount: Deactivated successfully. May 15 12:28:05.225670 systemd[1]: run-netns-cni\x2d960cc88c\x2dc239\x2d41c1\x2dcc6c\x2d45357ec1fa33.mount: Deactivated successfully. May 15 12:28:05.225750 systemd[1]: run-netns-cni\x2de2bd5e17\x2d1860\x2dff16\x2de4b2\x2d2668a464f6fb.mount: Deactivated successfully. 
May 15 12:28:05.225828 systemd[1]: run-netns-cni\x2ddf8d1e30\x2d3d98\x2df764\x2d7a50\x2dee95660b40b7.mount: Deactivated successfully. May 15 12:28:05.434472 containerd[1734]: time="2025-05-15T12:28:05.433537735Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-84478f496d-j8zht,Uid:5526c02b-5980-4359-8781-c8829963a4f3,Namespace:calico-apiserver,Attempt:0,}" May 15 12:28:05.444122 systemd[1]: Removed slice kubepods-besteffort-poda28ba243_cea9_4d25_b6fc_ca52a3819ed6.slice - libcontainer container kubepods-besteffort-poda28ba243_cea9_4d25_b6fc_ca52a3819ed6.slice. May 15 12:28:05.595158 containerd[1734]: time="2025-05-15T12:28:05.595110433Z" level=info msg="connecting to shim 3e6ba816957190f9d45f9694ee3ccb1c067c19d1c8b43bf2b46000eaeb557d77" address="unix:///run/containerd/s/82807ac3534eec8244679d910ab280e9bd84e5c51748d427cc4dd1b3c9895821" namespace=k8s.io protocol=ttrpc version=3 May 15 12:28:05.612787 kubelet[3331]: I0515 12:28:05.612563 3331 scope.go:117] "RemoveContainer" containerID="c9f32792864b0164d295293b779b0ac86cfd1a5837b466a6ca46f721b271f2ff" May 15 12:28:05.620524 containerd[1734]: time="2025-05-15T12:28:05.620439118Z" level=info msg="RemoveContainer for \"c9f32792864b0164d295293b779b0ac86cfd1a5837b466a6ca46f721b271f2ff\"" May 15 12:28:05.657435 systemd[1]: Started cri-containerd-3e6ba816957190f9d45f9694ee3ccb1c067c19d1c8b43bf2b46000eaeb557d77.scope - libcontainer container 3e6ba816957190f9d45f9694ee3ccb1c067c19d1c8b43bf2b46000eaeb557d77. 
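Every sandbox failure above carries the same root cause: the Calico CNI plugin cannot `stat /var/lib/calico/nodename`, the file that the calico/node container writes (and bind-mounts) once it is up. The following is a small diagnostic sketch, not part of the log: it reproduces that check so the error text above can be triggered and inspected by hand. The path and hint text are taken from the error messages themselves; the function name is illustrative.

```python
import os

# Path reported in the repeated CNI errors above; calico/node writes the
# node's name here after it starts and mounts /var/lib/calico/.
NODENAME_FILE = "/var/lib/calico/nodename"

def check_calico_nodename(path=NODENAME_FILE):
    """Return the node name calico/node recorded, or None (with a hint)
    when the file is missing -- the condition behind the errors above."""
    if os.path.isfile(path):
        with open(path) as f:
            return f.read().strip()
    print(f"stat {path}: no such file or directory: "
          "check that the calico/node container is running "
          "and has mounted /var/lib/calico/")
    return None
```

Until calico/node creates that file, every CNI add/delete fails exactly as logged, which is why the kubelet keeps retrying the same pods.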
May 15 12:28:05.720193 containerd[1734]: time="2025-05-15T12:28:05.718302158Z" level=error msg="Failed to destroy network for sandbox \"a15795849ab6f428912a5b64fc9e0a07442dd30ad79c1ffd3c1b7698ee28b7e3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:05.720028 systemd[1]: run-netns-cni\x2d7c26f561\x2dceb5\x2dc71d\x2d69de\x2dd87e7e433ae5.mount: Deactivated successfully. May 15 12:28:05.725592 containerd[1734]: time="2025-05-15T12:28:05.725550449Z" level=info msg="RemoveContainer for \"c9f32792864b0164d295293b779b0ac86cfd1a5837b466a6ca46f721b271f2ff\" returns successfully" May 15 12:28:05.771534 containerd[1734]: time="2025-05-15T12:28:05.771500231Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-84478f496d-j8zht,Uid:5526c02b-5980-4359-8781-c8829963a4f3,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a15795849ab6f428912a5b64fc9e0a07442dd30ad79c1ffd3c1b7698ee28b7e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:05.771769 kubelet[3331]: E0515 12:28:05.771725 3331 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a15795849ab6f428912a5b64fc9e0a07442dd30ad79c1ffd3c1b7698ee28b7e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:05.772112 kubelet[3331]: E0515 12:28:05.772016 3331 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"a15795849ab6f428912a5b64fc9e0a07442dd30ad79c1ffd3c1b7698ee28b7e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-84478f496d-j8zht" May 15 12:28:05.772112 kubelet[3331]: E0515 12:28:05.772040 3331 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a15795849ab6f428912a5b64fc9e0a07442dd30ad79c1ffd3c1b7698ee28b7e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-84478f496d-j8zht" May 15 12:28:05.772112 kubelet[3331]: E0515 12:28:05.772077 3331 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-84478f496d-j8zht_calico-apiserver(5526c02b-5980-4359-8781-c8829963a4f3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-84478f496d-j8zht_calico-apiserver(5526c02b-5980-4359-8781-c8829963a4f3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a15795849ab6f428912a5b64fc9e0a07442dd30ad79c1ffd3c1b7698ee28b7e3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-84478f496d-j8zht" podUID="5526c02b-5980-4359-8781-c8829963a4f3" May 15 12:28:05.796799 kubelet[3331]: I0515 12:28:05.796783 3331 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpjll\" (UniqueName: \"kubernetes.io/projected/d7dd18c0-69dd-4209-a984-7325dce5eb79-kube-api-access-dpjll\") pod \"d7dd18c0-69dd-4209-a984-7325dce5eb79\" (UID: 
\"d7dd18c0-69dd-4209-a984-7325dce5eb79\") " May 15 12:28:05.796929 kubelet[3331]: I0515 12:28:05.796902 3331 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7dd18c0-69dd-4209-a984-7325dce5eb79-tigera-ca-bundle\") pod \"d7dd18c0-69dd-4209-a984-7325dce5eb79\" (UID: \"d7dd18c0-69dd-4209-a984-7325dce5eb79\") " May 15 12:28:05.797741 kubelet[3331]: I0515 12:28:05.797709 3331 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7dd18c0-69dd-4209-a984-7325dce5eb79-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "d7dd18c0-69dd-4209-a984-7325dce5eb79" (UID: "d7dd18c0-69dd-4209-a984-7325dce5eb79"). InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" May 15 12:28:05.798920 kubelet[3331]: I0515 12:28:05.798803 3331 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7dd18c0-69dd-4209-a984-7325dce5eb79-kube-api-access-dpjll" (OuterVolumeSpecName: "kube-api-access-dpjll") pod "d7dd18c0-69dd-4209-a984-7325dce5eb79" (UID: "d7dd18c0-69dd-4209-a984-7325dce5eb79"). InnerVolumeSpecName "kube-api-access-dpjll". 
PluginName "kubernetes.io/projected", VolumeGidValue "" May 15 12:28:05.817907 containerd[1734]: time="2025-05-15T12:28:05.817884859Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6f8d86f6d-xjrdd,Uid:6d220139-04b4-4ee6-92bf-c337bb582952,Namespace:calico-system,Attempt:0,} returns sandbox id \"3e6ba816957190f9d45f9694ee3ccb1c067c19d1c8b43bf2b46000eaeb557d77\"" May 15 12:28:05.825190 containerd[1734]: time="2025-05-15T12:28:05.824623250Z" level=info msg="CreateContainer within sandbox \"3e6ba816957190f9d45f9694ee3ccb1c067c19d1c8b43bf2b46000eaeb557d77\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" May 15 12:28:05.897350 kubelet[3331]: I0515 12:28:05.897333 3331 reconciler_common.go:289] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7dd18c0-69dd-4209-a984-7325dce5eb79-tigera-ca-bundle\") on node \"ci-4334.0.0-a-9b1bbdffc7\" DevicePath \"\"" May 15 12:28:05.897514 kubelet[3331]: I0515 12:28:05.897504 3331 reconciler_common.go:289] "Volume detached for volume \"kube-api-access-dpjll\" (UniqueName: \"kubernetes.io/projected/d7dd18c0-69dd-4209-a984-7325dce5eb79-kube-api-access-dpjll\") on node \"ci-4334.0.0-a-9b1bbdffc7\" DevicePath \"\"" May 15 12:28:05.977167 containerd[1734]: time="2025-05-15T12:28:05.977109693Z" level=info msg="Container 98464fcdb05f4ccfba9f4c0c3e7de03f72d25ca0928983aa1cc9d2f87b349263: CDI devices from CRI Config.CDIDevices: []" May 15 12:28:06.182887 containerd[1734]: time="2025-05-15T12:28:06.182860437Z" level=info msg="CreateContainer within sandbox \"3e6ba816957190f9d45f9694ee3ccb1c067c19d1c8b43bf2b46000eaeb557d77\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"98464fcdb05f4ccfba9f4c0c3e7de03f72d25ca0928983aa1cc9d2f87b349263\"" May 15 12:28:06.183617 containerd[1734]: time="2025-05-15T12:28:06.183599229Z" level=info msg="StartContainer for \"98464fcdb05f4ccfba9f4c0c3e7de03f72d25ca0928983aa1cc9d2f87b349263\"" May 15 12:28:06.184919 
containerd[1734]: time="2025-05-15T12:28:06.184885535Z" level=info msg="connecting to shim 98464fcdb05f4ccfba9f4c0c3e7de03f72d25ca0928983aa1cc9d2f87b349263" address="unix:///run/containerd/s/82807ac3534eec8244679d910ab280e9bd84e5c51748d427cc4dd1b3c9895821" protocol=ttrpc version=3 May 15 12:28:06.204329 systemd[1]: Started cri-containerd-98464fcdb05f4ccfba9f4c0c3e7de03f72d25ca0928983aa1cc9d2f87b349263.scope - libcontainer container 98464fcdb05f4ccfba9f4c0c3e7de03f72d25ca0928983aa1cc9d2f87b349263. May 15 12:28:06.226022 systemd[1]: var-lib-kubelet-pods-d7dd18c0\x2d69dd\x2d4209\x2da984\x2d7325dce5eb79-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2ddpjll.mount: Deactivated successfully. May 15 12:28:06.265322 containerd[1734]: time="2025-05-15T12:28:06.265179490Z" level=info msg="StartContainer for \"98464fcdb05f4ccfba9f4c0c3e7de03f72d25ca0928983aa1cc9d2f87b349263\" returns successfully" May 15 12:28:06.624942 systemd[1]: Removed slice kubepods-besteffort-podd7dd18c0_69dd_4209_a984_7325dce5eb79.slice - libcontainer container kubepods-besteffort-podd7dd18c0_69dd_4209_a984_7325dce5eb79.slice. 
May 15 12:28:06.722373 kubelet[3331]: I0515 12:28:06.722330 3331 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-6f8d86f6d-xjrdd" podStartSLOduration=5.722313096 podStartE2EDuration="5.722313096s" podCreationTimestamp="2025-05-15 12:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-15 12:28:06.675286713 +0000 UTC m=+85.310620158" watchObservedRunningTime="2025-05-15 12:28:06.722313096 +0000 UTC m=+85.357646532" May 15 12:28:06.972701 kubelet[3331]: I0515 12:28:06.972639 3331 topology_manager.go:215] "Topology Admit Handler" podUID="f65466fd-ea14-4a19-aabd-5cdb3b868978" podNamespace="calico-system" podName="calico-kube-controllers-5547fc8788-7mx2l" May 15 12:28:06.982634 systemd[1]: Created slice kubepods-besteffort-podf65466fd_ea14_4a19_aabd_5cdb3b868978.slice - libcontainer container kubepods-besteffort-podf65466fd_ea14_4a19_aabd_5cdb3b868978.slice. May 15 12:28:07.060623 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount49817000.mount: Deactivated successfully. 
May 15 12:28:07.104624 kubelet[3331]: I0515 12:28:07.104578 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frlsx\" (UniqueName: \"kubernetes.io/projected/f65466fd-ea14-4a19-aabd-5cdb3b868978-kube-api-access-frlsx\") pod \"calico-kube-controllers-5547fc8788-7mx2l\" (UID: \"f65466fd-ea14-4a19-aabd-5cdb3b868978\") " pod="calico-system/calico-kube-controllers-5547fc8788-7mx2l" May 15 12:28:07.104708 kubelet[3331]: I0515 12:28:07.104634 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f65466fd-ea14-4a19-aabd-5cdb3b868978-tigera-ca-bundle\") pod \"calico-kube-controllers-5547fc8788-7mx2l\" (UID: \"f65466fd-ea14-4a19-aabd-5cdb3b868978\") " pod="calico-system/calico-kube-controllers-5547fc8788-7mx2l" May 15 12:28:07.285895 containerd[1734]: time="2025-05-15T12:28:07.285820987Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5547fc8788-7mx2l,Uid:f65466fd-ea14-4a19-aabd-5cdb3b868978,Namespace:calico-system,Attempt:0,}" May 15 12:28:07.371002 containerd[1734]: time="2025-05-15T12:28:07.370826432Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:28:07.418234 containerd[1734]: time="2025-05-15T12:28:07.418202869Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.3: active requests=0, bytes read=144068748" May 15 12:28:07.434559 kubelet[3331]: I0515 12:28:07.434520 3331 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a28ba243-cea9-4d25-b6fc-ca52a3819ed6" path="/var/lib/kubelet/pods/a28ba243-cea9-4d25-b6fc-ca52a3819ed6/volumes" May 15 12:28:07.434831 kubelet[3331]: I0515 12:28:07.434807 3331 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7dd18c0-69dd-4209-a984-7325dce5eb79" 
path="/var/lib/kubelet/pods/d7dd18c0-69dd-4209-a984-7325dce5eb79/volumes" May 15 12:28:07.480605 containerd[1734]: time="2025-05-15T12:28:07.480566488Z" level=info msg="ImageCreate event name:\"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:28:07.512084 containerd[1734]: time="2025-05-15T12:28:07.512038007Z" level=error msg="Failed to destroy network for sandbox \"6567879f19a45d22bf933f2a9fff8e0ac9692581f12202a4499192dead52a244\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:07.513897 systemd[1]: run-netns-cni\x2d6bd330ff\x2d3786\x2d897a\x2de0e8\x2d2c940d0206e0.mount: Deactivated successfully. May 15 12:28:07.528309 containerd[1734]: time="2025-05-15T12:28:07.528258402Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:750e267b4f8217e0ca9e4107228370190d1a2499b72112ad04370ab9b4553916\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:28:07.528707 containerd[1734]: time="2025-05-15T12:28:07.528588874Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.3\" with image id \"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:750e267b4f8217e0ca9e4107228370190d1a2499b72112ad04370ab9b4553916\", size \"144068610\" in 16.947529808s" May 15 12:28:07.528707 containerd[1734]: time="2025-05-15T12:28:07.528614396Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.3\" returns image reference \"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\"" May 15 12:28:07.539018 containerd[1734]: time="2025-05-15T12:28:07.538071080Z" level=info msg="CreateContainer within sandbox 
\"586431069cbf84e50bf569b8f9a1266e7bd3d1cec927c1c2a0ffa0e1cd2c2278\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" May 15 12:28:07.575468 containerd[1734]: time="2025-05-15T12:28:07.575423134Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5547fc8788-7mx2l,Uid:f65466fd-ea14-4a19-aabd-5cdb3b868978,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6567879f19a45d22bf933f2a9fff8e0ac9692581f12202a4499192dead52a244\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:07.575629 kubelet[3331]: E0515 12:28:07.575591 3331 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6567879f19a45d22bf933f2a9fff8e0ac9692581f12202a4499192dead52a244\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:07.575675 kubelet[3331]: E0515 12:28:07.575646 3331 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6567879f19a45d22bf933f2a9fff8e0ac9692581f12202a4499192dead52a244\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5547fc8788-7mx2l" May 15 12:28:07.575675 kubelet[3331]: E0515 12:28:07.575665 3331 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6567879f19a45d22bf933f2a9fff8e0ac9692581f12202a4499192dead52a244\": plugin type=\"calico\" failed (add): 
stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5547fc8788-7mx2l" May 15 12:28:07.575747 kubelet[3331]: E0515 12:28:07.575708 3331 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5547fc8788-7mx2l_calico-system(f65466fd-ea14-4a19-aabd-5cdb3b868978)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5547fc8788-7mx2l_calico-system(f65466fd-ea14-4a19-aabd-5cdb3b868978)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6567879f19a45d22bf933f2a9fff8e0ac9692581f12202a4499192dead52a244\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5547fc8788-7mx2l" podUID="f65466fd-ea14-4a19-aabd-5cdb3b868978" May 15 12:28:07.782737 containerd[1734]: time="2025-05-15T12:28:07.782711816Z" level=info msg="Container 20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d: CDI devices from CRI Config.CDIDevices: []" May 15 12:28:07.929658 containerd[1734]: time="2025-05-15T12:28:07.929635771Z" level=info msg="CreateContainer within sandbox \"586431069cbf84e50bf569b8f9a1266e7bd3d1cec927c1c2a0ffa0e1cd2c2278\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d\"" May 15 12:28:07.930040 containerd[1734]: time="2025-05-15T12:28:07.930017658Z" level=info msg="StartContainer for \"20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d\"" May 15 12:28:07.931372 containerd[1734]: time="2025-05-15T12:28:07.931345755Z" level=info msg="connecting to shim 20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d" 
address="unix:///run/containerd/s/d9d0270194fbe9d61146192750bea91b220ecf8b8fb1a869b075ad9752bd7591" protocol=ttrpc version=3 May 15 12:28:07.945349 systemd[1]: Started cri-containerd-20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d.scope - libcontainer container 20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d. May 15 12:28:07.977915 containerd[1734]: time="2025-05-15T12:28:07.977861204Z" level=info msg="StartContainer for \"20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d\" returns successfully" May 15 12:28:08.561192 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. May 15 12:28:08.561283 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. May 15 12:28:08.578704 systemd[1]: cri-containerd-20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d.scope: Deactivated successfully. May 15 12:28:08.581063 containerd[1734]: time="2025-05-15T12:28:08.581008410Z" level=info msg="received exit event container_id:\"20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d\" id:\"20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d\" pid:4817 exit_status:1 exited_at:{seconds:1747312088 nanos:580789655}" May 15 12:28:08.581063 containerd[1734]: time="2025-05-15T12:28:08.581043891Z" level=info msg="TaskExit event in podsandbox handler container_id:\"20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d\" id:\"20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d\" pid:4817 exit_status:1 exited_at:{seconds:1747312088 nanos:580789655}" May 15 12:28:08.598442 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d-rootfs.mount: Deactivated successfully. 
May 15 12:28:08.641528 kubelet[3331]: I0515 12:28:08.641477 3331 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-j67jc" podStartSLOduration=1.887387449 podStartE2EDuration="1m8.641457809s" podCreationTimestamp="2025-05-15 12:27:00 +0000 UTC" firstStartedPulling="2025-05-15 12:27:00.775145884 +0000 UTC m=+19.410479322" lastFinishedPulling="2025-05-15 12:28:07.529216246 +0000 UTC m=+86.164549682" observedRunningTime="2025-05-15 12:28:08.641077975 +0000 UTC m=+87.276411441" watchObservedRunningTime="2025-05-15 12:28:08.641457809 +0000 UTC m=+87.276791318" May 15 12:28:10.625430 containerd[1734]: time="2025-05-15T12:28:10.625024450Z" level=error msg="get state for 20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d" error="context deadline exceeded" May 15 12:28:10.625430 containerd[1734]: time="2025-05-15T12:28:10.625056510Z" level=warning msg="unknown status" status=0 May 15 12:28:11.216953 containerd[1734]: time="2025-05-15T12:28:10.625469691Z" level=error msg="get state for 20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d" error="context deadline exceeded" May 15 12:28:11.216953 containerd[1734]: time="2025-05-15T12:28:10.625489085Z" level=warning msg="unknown status" status=0 May 15 12:28:13.623865 containerd[1734]: time="2025-05-15T12:28:13.623801099Z" level=info msg="StopContainer for \"20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d\" with timeout 2 (s)" May 15 12:28:15.624562 containerd[1734]: time="2025-05-15T12:28:15.624495528Z" level=error msg="get state for 20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d" error="context deadline exceeded" May 15 12:28:15.624562 containerd[1734]: time="2025-05-15T12:28:15.624551859Z" level=warning msg="unknown status" status=0 May 15 12:28:15.625090 containerd[1734]: time="2025-05-15T12:28:15.624585722Z" level=info msg="Stop container \"20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d\" with 
signal terminated" May 15 12:28:17.278473 containerd[1734]: time="2025-05-15T12:28:16.433243803Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b7f8fc4d6-nk2q9,Uid:1d54623e-7c9b-4f0a-8922-6bb47d7e26af,Namespace:calico-apiserver,Attempt:0,}" May 15 12:28:17.278473 containerd[1734]: time="2025-05-15T12:28:16.433243839Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fthxd,Uid:55ce30e6-eb73-4f69-aef9-927a3bcb6662,Namespace:calico-system,Attempt:0,}" May 15 12:28:17.432735 containerd[1734]: time="2025-05-15T12:28:17.432691937Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-pkr4m,Uid:baa3369f-3b5a-4c16-a95f-d55a24190750,Namespace:kube-system,Attempt:0,}" May 15 12:28:18.432587 containerd[1734]: time="2025-05-15T12:28:18.432487058Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-84478f496d-j8zht,Uid:5526c02b-5980-4359-8781-c8829963a4f3,Namespace:calico-apiserver,Attempt:0,}" May 15 12:28:18.433105 containerd[1734]: time="2025-05-15T12:28:18.432487022Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-s4cpx,Uid:6ec4fd21-5481-4869-8926-5a6b5b6153c9,Namespace:kube-system,Attempt:0,}" May 15 12:28:18.582221 containerd[1734]: time="2025-05-15T12:28:18.582149823Z" level=error msg="failed to handle container TaskExit event container_id:\"20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d\" id:\"20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d\" pid:4817 exit_status:1 exited_at:{seconds:1747312088 nanos:580789655}" error="failed to stop container: failed to delete task: context deadline exceeded" May 15 12:28:20.433006 containerd[1734]: time="2025-05-15T12:28:20.432949839Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b7f8fc4d6-6ch4b,Uid:dea655ce-5eab-4e67-a276-be0e9b39cc85,Namespace:calico-apiserver,Attempt:0,}" May 15 12:28:20.539184 containerd[1734]: 
time="2025-05-15T12:28:20.539143635Z" level=info msg="TaskExit event container_id:\"20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d\" id:\"20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d\" pid:4817 exit_status:1 exited_at:{seconds:1747312088 nanos:580789655}" May 15 12:28:21.432665 containerd[1734]: time="2025-05-15T12:28:21.432455115Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5547fc8788-7mx2l,Uid:f65466fd-ea14-4a19-aabd-5cdb3b868978,Namespace:calico-system,Attempt:0,}" May 15 12:28:22.540337 containerd[1734]: time="2025-05-15T12:28:22.540289754Z" level=error msg="get state for 20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d" error="context deadline exceeded" May 15 12:28:22.540337 containerd[1734]: time="2025-05-15T12:28:22.540322069Z" level=warning msg="unknown status" status=0 May 15 12:28:24.541616 containerd[1734]: time="2025-05-15T12:28:24.541567560Z" level=error msg="get state for 20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d" error="context deadline exceeded" May 15 12:28:24.541616 containerd[1734]: time="2025-05-15T12:28:24.541607555Z" level=warning msg="unknown status" status=0 May 15 12:28:26.542570 containerd[1734]: time="2025-05-15T12:28:26.542506552Z" level=error msg="get state for 20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d" error="context deadline exceeded" May 15 12:28:26.542570 containerd[1734]: time="2025-05-15T12:28:26.542562039Z" level=warning msg="unknown status" status=0 May 15 12:28:27.996986 systemd[1]: Started sshd@7-10.200.8.35:22-10.200.16.10:57154.service - OpenSSH per-connection server daemon (10.200.16.10:57154). 
May 15 12:28:28.076349 containerd[1734]: time="2025-05-15T12:28:28.076217513Z" level=error msg="ttrpc: received message on inactive stream" stream=31 May 15 12:28:28.076349 containerd[1734]: time="2025-05-15T12:28:28.076281484Z" level=error msg="ttrpc: received message on inactive stream" stream=33 May 15 12:28:28.076349 containerd[1734]: time="2025-05-15T12:28:28.076289699Z" level=error msg="ttrpc: received message on inactive stream" stream=35 May 15 12:28:28.076349 containerd[1734]: time="2025-05-15T12:28:28.076299451Z" level=error msg="ttrpc: received message on inactive stream" stream=43 May 15 12:28:28.076349 containerd[1734]: time="2025-05-15T12:28:28.076307655Z" level=error msg="ttrpc: received message on inactive stream" stream=47 May 15 12:28:28.076349 containerd[1734]: time="2025-05-15T12:28:28.076315167Z" level=error msg="ttrpc: received message on inactive stream" stream=51 May 15 12:28:28.076349 containerd[1734]: time="2025-05-15T12:28:28.076321005Z" level=error msg="ttrpc: received message on inactive stream" stream=53 May 15 12:28:28.077974 containerd[1734]: time="2025-05-15T12:28:28.077858422Z" level=info msg="Ensure that container 20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d in task-service has been cleanup successfully" May 15 12:28:28.078183 containerd[1734]: time="2025-05-15T12:28:28.078148474Z" level=error msg="ExecSync for \"20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d\" failed" error="rpc error: code = Unknown desc = failed to exec in container: failed to create exec \"b0dc1424f4cde1e4102fdc0ee0501aa342a46fb50b0a57452863a80c10435c3e\": cannot exec in a deleted state" May 15 12:28:28.078341 containerd[1734]: time="2025-05-15T12:28:28.078149009Z" level=error msg="ExecSync for \"20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d\" failed" error="rpc error: code = Unknown desc = failed to exec in container: failed to create exec 
\"fec0629356aa470dbd63b9c03cc24f082dfa3860e7273449411d17b2650101d3\": cannot exec in a deleted state" May 15 12:28:28.078435 kubelet[3331]: E0515 12:28:28.078397 3331 remote_runtime.go:496] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = failed to exec in container: failed to create exec \"b0dc1424f4cde1e4102fdc0ee0501aa342a46fb50b0a57452863a80c10435c3e\": cannot exec in a deleted state" containerID="20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d" cmd=["/bin/calico-node","-shutdown"] May 15 12:28:28.078857 kubelet[3331]: E0515 12:28:28.078473 3331 kuberuntime_container.go:662] "PreStop hook failed" err="rpc error: code = Unknown desc = failed to exec in container: failed to create exec \"b0dc1424f4cde1e4102fdc0ee0501aa342a46fb50b0a57452863a80c10435c3e\": cannot exec in a deleted state" pod="calico-system/calico-node-j67jc" podUID="358caab1-faf4-41bf-a17e-7eb09b4eaabd" containerName="calico-node" containerID="containerd://20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d" May 15 12:28:28.078857 kubelet[3331]: E0515 12:28:28.078415 3331 remote_runtime.go:496] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = failed to exec in container: failed to create exec \"fec0629356aa470dbd63b9c03cc24f082dfa3860e7273449411d17b2650101d3\": cannot exec in a deleted state" containerID="20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d" cmd=["/bin/calico-node","-bird-ready","-felix-ready"] May 15 12:28:28.156993 containerd[1734]: time="2025-05-15T12:28:28.156957484Z" level=error msg="Failed to destroy network for sandbox \"a683d5f5dbc9ad8b500026e8957b9d81f0c1107ea63dc655d0d9095c9d262634\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:28.158944 systemd[1]: 
run-netns-cni\x2d64e55ff3\x2d144a\x2d42e9\x2dc2fb\x2db60a4cf87a74.mount: Deactivated successfully. May 15 12:28:28.170857 containerd[1734]: time="2025-05-15T12:28:28.170040991Z" level=error msg="ExecSync for \"20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d\" failed" error="rpc error: code = Unknown desc = failed to exec in container: container is in CONTAINER_EXITED state" May 15 12:28:28.170857 containerd[1734]: time="2025-05-15T12:28:28.170795629Z" level=error msg="ExecSync for \"20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d\" failed" error="rpc error: code = Unknown desc = failed to exec in container: container is in CONTAINER_EXITED state" May 15 12:28:28.170965 kubelet[3331]: E0515 12:28:28.170249 3331 remote_runtime.go:496] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = failed to exec in container: container is in CONTAINER_EXITED state" containerID="20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d" cmd=["/bin/calico-node","-bird-ready","-felix-ready"] May 15 12:28:28.171023 kubelet[3331]: E0515 12:28:28.170882 3331 remote_runtime.go:496] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = failed to exec in container: container is in CONTAINER_EXITED state" containerID="20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d" cmd=["/bin/calico-node","-bird-ready","-felix-ready"] May 15 12:28:28.171258 containerd[1734]: time="2025-05-15T12:28:28.171241275Z" level=info msg="StopContainer for \"20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d\" returns successfully" May 15 12:28:28.176024 containerd[1734]: time="2025-05-15T12:28:28.175998171Z" level=info msg="StopPodSandbox for \"586431069cbf84e50bf569b8f9a1266e7bd3d1cec927c1c2a0ffa0e1cd2c2278\"" May 15 12:28:28.176096 containerd[1734]: time="2025-05-15T12:28:28.176082416Z" level=info msg="Container to stop 
\"dabed820d95ecd4eaf8b92b9815ec8f68e13118a3bda36c913d88e1ff779af5e\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" May 15 12:28:28.176121 containerd[1734]: time="2025-05-15T12:28:28.176109653Z" level=info msg="Container to stop \"20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" May 15 12:28:28.176150 containerd[1734]: time="2025-05-15T12:28:28.176119821Z" level=info msg="Container to stop \"fce7805e691711b9971246be5822f531007036317749633587a5e005a4c03c07\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" May 15 12:28:28.177601 containerd[1734]: time="2025-05-15T12:28:28.177571296Z" level=error msg="ExecSync for \"20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d\" failed" error="rpc error: code = Unknown desc = failed to exec in container: container is in CONTAINER_EXITED state" May 15 12:28:28.177752 kubelet[3331]: E0515 12:28:28.177730 3331 remote_runtime.go:496] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = failed to exec in container: container is in CONTAINER_EXITED state" containerID="20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d" cmd=["/bin/calico-node","-bird-ready","-felix-ready"] May 15 12:28:28.178018 containerd[1734]: time="2025-05-15T12:28:28.177994781Z" level=error msg="ExecSync for \"20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d\" failed" error="rpc error: code = Unknown desc = failed to exec in container: container is in CONTAINER_EXITED state" May 15 12:28:28.178205 kubelet[3331]: E0515 12:28:28.178112 3331 remote_runtime.go:496] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = failed to exec in container: container is in CONTAINER_EXITED state" containerID="20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d" cmd=["/bin/calico-node","-bird-ready","-felix-ready"] May 15 
12:28:28.178952 containerd[1734]: time="2025-05-15T12:28:28.178927214Z" level=error msg="ExecSync for \"20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d\" failed" error="rpc error: code = Unknown desc = failed to exec in container: container is in CONTAINER_EXITED state" May 15 12:28:28.179566 kubelet[3331]: E0515 12:28:28.179535 3331 remote_runtime.go:496] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = failed to exec in container: container is in CONTAINER_EXITED state" containerID="20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d" cmd=["/bin/calico-node","-bird-ready","-felix-ready"] May 15 12:28:28.185834 systemd[1]: cri-containerd-586431069cbf84e50bf569b8f9a1266e7bd3d1cec927c1c2a0ffa0e1cd2c2278.scope: Deactivated successfully. May 15 12:28:28.188034 containerd[1734]: time="2025-05-15T12:28:28.188007006Z" level=info msg="TaskExit event in podsandbox handler container_id:\"586431069cbf84e50bf569b8f9a1266e7bd3d1cec927c1c2a0ffa0e1cd2c2278\" id:\"586431069cbf84e50bf569b8f9a1266e7bd3d1cec927c1c2a0ffa0e1cd2c2278\" pid:3817 exit_status:137 exited_at:{seconds:1747312108 nanos:187381650}" May 15 12:28:28.203736 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-586431069cbf84e50bf569b8f9a1266e7bd3d1cec927c1c2a0ffa0e1cd2c2278-rootfs.mount: Deactivated successfully. May 15 12:28:28.975382 sshd[4867]: Accepted publickey for core from 10.200.16.10 port 57154 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk May 15 12:28:28.975717 sshd-session[4867]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 12:28:28.980096 systemd-logind[1702]: New session 10 of user core. May 15 12:28:28.987301 systemd[1]: Started session-10.scope - Session 10 of User core. 
May 15 12:28:29.991753 sshd[4926]: Connection closed by 10.200.16.10 port 57154 May 15 12:28:29.992232 sshd-session[4867]: pam_unix(sshd:session): session closed for user core May 15 12:28:29.994933 systemd[1]: sshd@7-10.200.8.35:22-10.200.16.10:57154.service: Deactivated successfully. May 15 12:28:29.996477 systemd[1]: session-10.scope: Deactivated successfully. May 15 12:28:29.997195 systemd-logind[1702]: Session 10 logged out. Waiting for processes to exit. May 15 12:28:29.998262 systemd-logind[1702]: Removed session 10. May 15 12:28:30.023008 containerd[1734]: time="2025-05-15T12:28:30.022927211Z" level=info msg="received exit event sandbox_id:\"586431069cbf84e50bf569b8f9a1266e7bd3d1cec927c1c2a0ffa0e1cd2c2278\" exit_status:137 exited_at:{seconds:1747312108 nanos:187381650}" May 15 12:28:30.023414 containerd[1734]: time="2025-05-15T12:28:30.023303597Z" level=info msg="shim disconnected" id=586431069cbf84e50bf569b8f9a1266e7bd3d1cec927c1c2a0ffa0e1cd2c2278 namespace=k8s.io May 15 12:28:30.023414 containerd[1734]: time="2025-05-15T12:28:30.023324995Z" level=warning msg="cleaning up after shim disconnected" id=586431069cbf84e50bf569b8f9a1266e7bd3d1cec927c1c2a0ffa0e1cd2c2278 namespace=k8s.io May 15 12:28:30.023414 containerd[1734]: time="2025-05-15T12:28:30.023345219Z" level=info msg="cleaning up dead shim" namespace=k8s.io May 15 12:28:30.024861 containerd[1734]: time="2025-05-15T12:28:30.024836290Z" level=info msg="TearDown network for sandbox \"586431069cbf84e50bf569b8f9a1266e7bd3d1cec927c1c2a0ffa0e1cd2c2278\" successfully" May 15 12:28:30.024953 containerd[1734]: time="2025-05-15T12:28:30.024942142Z" level=info msg="StopPodSandbox for \"586431069cbf84e50bf569b8f9a1266e7bd3d1cec927c1c2a0ffa0e1cd2c2278\" returns successfully" May 15 12:28:30.029796 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-586431069cbf84e50bf569b8f9a1266e7bd3d1cec927c1c2a0ffa0e1cd2c2278-shm.mount: Deactivated successfully. 
May 15 12:28:30.052501 containerd[1734]: time="2025-05-15T12:28:30.052466967Z" level=error msg="Failed to destroy network for sandbox \"ea0efc282360faa0bd490542e2757445d42da4cc4c934c7330127c07d142dd8b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:30.054554 systemd[1]: run-netns-cni\x2df0aaed15\x2dbf7f\x2dbf4f\x2d3bbc\x2d4c106d754f09.mount: Deactivated successfully. May 15 12:28:30.075765 kubelet[3331]: I0515 12:28:30.075729 3331 topology_manager.go:215] "Topology Admit Handler" podUID="719e0812-a2ea-4a10-9491-fb67edb907c9" podNamespace="calico-system" podName="calico-node-fgrtv" May 15 12:28:30.116980 kubelet[3331]: E0515 12:28:30.075996 3331 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="358caab1-faf4-41bf-a17e-7eb09b4eaabd" containerName="flexvol-driver" May 15 12:28:30.116980 kubelet[3331]: E0515 12:28:30.076009 3331 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="358caab1-faf4-41bf-a17e-7eb09b4eaabd" containerName="calico-node" May 15 12:28:30.116980 kubelet[3331]: E0515 12:28:30.076016 3331 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="358caab1-faf4-41bf-a17e-7eb09b4eaabd" containerName="install-cni" May 15 12:28:30.116980 kubelet[3331]: I0515 12:28:30.076045 3331 memory_manager.go:354] "RemoveStaleState removing state" podUID="358caab1-faf4-41bf-a17e-7eb09b4eaabd" containerName="calico-node" May 15 12:28:30.082756 systemd[1]: Created slice kubepods-besteffort-pod719e0812_a2ea_4a10_9491_fb67edb907c9.slice - libcontainer container kubepods-besteffort-pod719e0812_a2ea_4a10_9491_fb67edb907c9.slice. 
May 15 12:28:30.150745 containerd[1734]: time="2025-05-15T12:28:30.150714613Z" level=error msg="Failed to destroy network for sandbox \"e3dff631a39f09f04e3c0ddeb2fcaa66c4e8d2765d4880a253ac19928f76ef9d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:30.152405 systemd[1]: run-netns-cni\x2db6675dab\x2d583b\x2d60d9\x2d255e\x2df22f136fa35f.mount: Deactivated successfully. May 15 12:28:30.204360 kubelet[3331]: I0515 12:28:30.204274 3331 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/358caab1-faf4-41bf-a17e-7eb09b4eaabd-xtables-lock\") pod \"358caab1-faf4-41bf-a17e-7eb09b4eaabd\" (UID: \"358caab1-faf4-41bf-a17e-7eb09b4eaabd\") " May 15 12:28:30.204360 kubelet[3331]: I0515 12:28:30.204318 3331 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/358caab1-faf4-41bf-a17e-7eb09b4eaabd-cni-net-dir\") pod \"358caab1-faf4-41bf-a17e-7eb09b4eaabd\" (UID: \"358caab1-faf4-41bf-a17e-7eb09b4eaabd\") " May 15 12:28:30.204360 kubelet[3331]: I0515 12:28:30.204321 3331 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/358caab1-faf4-41bf-a17e-7eb09b4eaabd-xtables-lock" (OuterVolumeSpecName: "xtables-lock") pod "358caab1-faf4-41bf-a17e-7eb09b4eaabd" (UID: "358caab1-faf4-41bf-a17e-7eb09b4eaabd"). InnerVolumeSpecName "xtables-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" May 15 12:28:30.204360 kubelet[3331]: I0515 12:28:30.204334 3331 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/358caab1-faf4-41bf-a17e-7eb09b4eaabd-cni-net-dir" (OuterVolumeSpecName: "cni-net-dir") pod "358caab1-faf4-41bf-a17e-7eb09b4eaabd" (UID: "358caab1-faf4-41bf-a17e-7eb09b4eaabd"). 
InnerVolumeSpecName "cni-net-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" May 15 12:28:30.204503 kubelet[3331]: I0515 12:28:30.204453 3331 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/358caab1-faf4-41bf-a17e-7eb09b4eaabd-tigera-ca-bundle\") pod \"358caab1-faf4-41bf-a17e-7eb09b4eaabd\" (UID: \"358caab1-faf4-41bf-a17e-7eb09b4eaabd\") " May 15 12:28:30.204652 kubelet[3331]: I0515 12:28:30.204581 3331 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxqbx\" (UniqueName: \"kubernetes.io/projected/358caab1-faf4-41bf-a17e-7eb09b4eaabd-kube-api-access-bxqbx\") pod \"358caab1-faf4-41bf-a17e-7eb09b4eaabd\" (UID: \"358caab1-faf4-41bf-a17e-7eb09b4eaabd\") " May 15 12:28:30.204652 kubelet[3331]: I0515 12:28:30.204598 3331 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/358caab1-faf4-41bf-a17e-7eb09b4eaabd-var-lib-calico\") pod \"358caab1-faf4-41bf-a17e-7eb09b4eaabd\" (UID: \"358caab1-faf4-41bf-a17e-7eb09b4eaabd\") " May 15 12:28:30.204652 kubelet[3331]: I0515 12:28:30.204615 3331 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/358caab1-faf4-41bf-a17e-7eb09b4eaabd-node-certs\") pod \"358caab1-faf4-41bf-a17e-7eb09b4eaabd\" (UID: \"358caab1-faf4-41bf-a17e-7eb09b4eaabd\") " May 15 12:28:30.204652 kubelet[3331]: I0515 12:28:30.204629 3331 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/358caab1-faf4-41bf-a17e-7eb09b4eaabd-cni-log-dir\") pod \"358caab1-faf4-41bf-a17e-7eb09b4eaabd\" (UID: \"358caab1-faf4-41bf-a17e-7eb09b4eaabd\") " May 15 12:28:30.204652 kubelet[3331]: I0515 12:28:30.204642 3331 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume 
\"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/358caab1-faf4-41bf-a17e-7eb09b4eaabd-flexvol-driver-host\") pod \"358caab1-faf4-41bf-a17e-7eb09b4eaabd\" (UID: \"358caab1-faf4-41bf-a17e-7eb09b4eaabd\") " May 15 12:28:30.204761 kubelet[3331]: I0515 12:28:30.204656 3331 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/358caab1-faf4-41bf-a17e-7eb09b4eaabd-lib-modules\") pod \"358caab1-faf4-41bf-a17e-7eb09b4eaabd\" (UID: \"358caab1-faf4-41bf-a17e-7eb09b4eaabd\") " May 15 12:28:30.204761 kubelet[3331]: I0515 12:28:30.204670 3331 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/358caab1-faf4-41bf-a17e-7eb09b4eaabd-policysync\") pod \"358caab1-faf4-41bf-a17e-7eb09b4eaabd\" (UID: \"358caab1-faf4-41bf-a17e-7eb09b4eaabd\") " May 15 12:28:30.204761 kubelet[3331]: I0515 12:28:30.204686 3331 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/358caab1-faf4-41bf-a17e-7eb09b4eaabd-var-run-calico\") pod \"358caab1-faf4-41bf-a17e-7eb09b4eaabd\" (UID: \"358caab1-faf4-41bf-a17e-7eb09b4eaabd\") " May 15 12:28:30.204761 kubelet[3331]: I0515 12:28:30.204700 3331 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/358caab1-faf4-41bf-a17e-7eb09b4eaabd-cni-bin-dir\") pod \"358caab1-faf4-41bf-a17e-7eb09b4eaabd\" (UID: \"358caab1-faf4-41bf-a17e-7eb09b4eaabd\") " May 15 12:28:30.204761 kubelet[3331]: I0515 12:28:30.204748 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/719e0812-a2ea-4a10-9491-fb67edb907c9-lib-modules\") pod \"calico-node-fgrtv\" (UID: \"719e0812-a2ea-4a10-9491-fb67edb907c9\") " pod="calico-system/calico-node-fgrtv" May 15 
12:28:30.204858 kubelet[3331]: I0515 12:28:30.204767 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/719e0812-a2ea-4a10-9491-fb67edb907c9-var-run-calico\") pod \"calico-node-fgrtv\" (UID: \"719e0812-a2ea-4a10-9491-fb67edb907c9\") " pod="calico-system/calico-node-fgrtv" May 15 12:28:30.204858 kubelet[3331]: I0515 12:28:30.204785 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/719e0812-a2ea-4a10-9491-fb67edb907c9-policysync\") pod \"calico-node-fgrtv\" (UID: \"719e0812-a2ea-4a10-9491-fb67edb907c9\") " pod="calico-system/calico-node-fgrtv" May 15 12:28:30.204858 kubelet[3331]: I0515 12:28:30.204801 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/719e0812-a2ea-4a10-9491-fb67edb907c9-cni-bin-dir\") pod \"calico-node-fgrtv\" (UID: \"719e0812-a2ea-4a10-9491-fb67edb907c9\") " pod="calico-system/calico-node-fgrtv" May 15 12:28:30.204858 kubelet[3331]: I0515 12:28:30.204816 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/719e0812-a2ea-4a10-9491-fb67edb907c9-node-certs\") pod \"calico-node-fgrtv\" (UID: \"719e0812-a2ea-4a10-9491-fb67edb907c9\") " pod="calico-system/calico-node-fgrtv" May 15 12:28:30.204858 kubelet[3331]: I0515 12:28:30.204833 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/719e0812-a2ea-4a10-9491-fb67edb907c9-cni-log-dir\") pod \"calico-node-fgrtv\" (UID: \"719e0812-a2ea-4a10-9491-fb67edb907c9\") " pod="calico-system/calico-node-fgrtv" May 15 12:28:30.204961 kubelet[3331]: I0515 12:28:30.204849 3331 reconciler_common.go:247] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffl9x\" (UniqueName: \"kubernetes.io/projected/719e0812-a2ea-4a10-9491-fb67edb907c9-kube-api-access-ffl9x\") pod \"calico-node-fgrtv\" (UID: \"719e0812-a2ea-4a10-9491-fb67edb907c9\") " pod="calico-system/calico-node-fgrtv" May 15 12:28:30.204961 kubelet[3331]: I0515 12:28:30.204866 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/719e0812-a2ea-4a10-9491-fb67edb907c9-xtables-lock\") pod \"calico-node-fgrtv\" (UID: \"719e0812-a2ea-4a10-9491-fb67edb907c9\") " pod="calico-system/calico-node-fgrtv" May 15 12:28:30.204961 kubelet[3331]: I0515 12:28:30.204880 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/719e0812-a2ea-4a10-9491-fb67edb907c9-cni-net-dir\") pod \"calico-node-fgrtv\" (UID: \"719e0812-a2ea-4a10-9491-fb67edb907c9\") " pod="calico-system/calico-node-fgrtv" May 15 12:28:30.204961 kubelet[3331]: I0515 12:28:30.204896 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/719e0812-a2ea-4a10-9491-fb67edb907c9-tigera-ca-bundle\") pod \"calico-node-fgrtv\" (UID: \"719e0812-a2ea-4a10-9491-fb67edb907c9\") " pod="calico-system/calico-node-fgrtv" May 15 12:28:30.204961 kubelet[3331]: I0515 12:28:30.204911 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/719e0812-a2ea-4a10-9491-fb67edb907c9-var-lib-calico\") pod \"calico-node-fgrtv\" (UID: \"719e0812-a2ea-4a10-9491-fb67edb907c9\") " pod="calico-system/calico-node-fgrtv" May 15 12:28:30.205064 kubelet[3331]: I0515 12:28:30.204925 3331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/719e0812-a2ea-4a10-9491-fb67edb907c9-flexvol-driver-host\") pod \"calico-node-fgrtv\" (UID: \"719e0812-a2ea-4a10-9491-fb67edb907c9\") " pod="calico-system/calico-node-fgrtv" May 15 12:28:30.205064 kubelet[3331]: I0515 12:28:30.204941 3331 reconciler_common.go:289] "Volume detached for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/358caab1-faf4-41bf-a17e-7eb09b4eaabd-xtables-lock\") on node \"ci-4334.0.0-a-9b1bbdffc7\" DevicePath \"\"" May 15 12:28:30.205064 kubelet[3331]: I0515 12:28:30.204950 3331 reconciler_common.go:289] "Volume detached for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/358caab1-faf4-41bf-a17e-7eb09b4eaabd-cni-net-dir\") on node \"ci-4334.0.0-a-9b1bbdffc7\" DevicePath \"\"" May 15 12:28:30.206192 kubelet[3331]: I0515 12:28:30.205506 3331 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/358caab1-faf4-41bf-a17e-7eb09b4eaabd-var-lib-calico" (OuterVolumeSpecName: "var-lib-calico") pod "358caab1-faf4-41bf-a17e-7eb09b4eaabd" (UID: "358caab1-faf4-41bf-a17e-7eb09b4eaabd"). InnerVolumeSpecName "var-lib-calico". PluginName "kubernetes.io/host-path", VolumeGidValue "" May 15 12:28:30.208485 systemd[1]: var-lib-kubelet-pods-358caab1\x2dfaf4\x2d41bf\x2da17e\x2d7eb09b4eaabd-volumes-kubernetes.io\x7esecret-node\x2dcerts.mount: Deactivated successfully. May 15 12:28:30.209909 kubelet[3331]: I0515 12:28:30.209660 3331 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/358caab1-faf4-41bf-a17e-7eb09b4eaabd-policysync" (OuterVolumeSpecName: "policysync") pod "358caab1-faf4-41bf-a17e-7eb09b4eaabd" (UID: "358caab1-faf4-41bf-a17e-7eb09b4eaabd"). InnerVolumeSpecName "policysync". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" May 15 12:28:30.209909 kubelet[3331]: I0515 12:28:30.209688 3331 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/358caab1-faf4-41bf-a17e-7eb09b4eaabd-cni-log-dir" (OuterVolumeSpecName: "cni-log-dir") pod "358caab1-faf4-41bf-a17e-7eb09b4eaabd" (UID: "358caab1-faf4-41bf-a17e-7eb09b4eaabd"). InnerVolumeSpecName "cni-log-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" May 15 12:28:30.209909 kubelet[3331]: I0515 12:28:30.209703 3331 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/358caab1-faf4-41bf-a17e-7eb09b4eaabd-flexvol-driver-host" (OuterVolumeSpecName: "flexvol-driver-host") pod "358caab1-faf4-41bf-a17e-7eb09b4eaabd" (UID: "358caab1-faf4-41bf-a17e-7eb09b4eaabd"). InnerVolumeSpecName "flexvol-driver-host". PluginName "kubernetes.io/host-path", VolumeGidValue "" May 15 12:28:30.209909 kubelet[3331]: I0515 12:28:30.209716 3331 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/358caab1-faf4-41bf-a17e-7eb09b4eaabd-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "358caab1-faf4-41bf-a17e-7eb09b4eaabd" (UID: "358caab1-faf4-41bf-a17e-7eb09b4eaabd"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" May 15 12:28:30.209909 kubelet[3331]: I0515 12:28:30.209729 3331 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/358caab1-faf4-41bf-a17e-7eb09b4eaabd-cni-bin-dir" (OuterVolumeSpecName: "cni-bin-dir") pod "358caab1-faf4-41bf-a17e-7eb09b4eaabd" (UID: "358caab1-faf4-41bf-a17e-7eb09b4eaabd"). InnerVolumeSpecName "cni-bin-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" May 15 12:28:30.210063 kubelet[3331]: I0515 12:28:30.209740 3331 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/358caab1-faf4-41bf-a17e-7eb09b4eaabd-var-run-calico" (OuterVolumeSpecName: "var-run-calico") pod "358caab1-faf4-41bf-a17e-7eb09b4eaabd" (UID: "358caab1-faf4-41bf-a17e-7eb09b4eaabd"). InnerVolumeSpecName "var-run-calico". PluginName "kubernetes.io/host-path", VolumeGidValue "" May 15 12:28:30.210518 kubelet[3331]: I0515 12:28:30.210476 3331 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/358caab1-faf4-41bf-a17e-7eb09b4eaabd-kube-api-access-bxqbx" (OuterVolumeSpecName: "kube-api-access-bxqbx") pod "358caab1-faf4-41bf-a17e-7eb09b4eaabd" (UID: "358caab1-faf4-41bf-a17e-7eb09b4eaabd"). InnerVolumeSpecName "kube-api-access-bxqbx". PluginName "kubernetes.io/projected", VolumeGidValue "" May 15 12:28:30.210623 kubelet[3331]: I0515 12:28:30.210612 3331 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/358caab1-faf4-41bf-a17e-7eb09b4eaabd-node-certs" (OuterVolumeSpecName: "node-certs") pod "358caab1-faf4-41bf-a17e-7eb09b4eaabd" (UID: "358caab1-faf4-41bf-a17e-7eb09b4eaabd"). InnerVolumeSpecName "node-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" May 15 12:28:30.210903 kubelet[3331]: I0515 12:28:30.210880 3331 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/358caab1-faf4-41bf-a17e-7eb09b4eaabd-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "358caab1-faf4-41bf-a17e-7eb09b4eaabd" (UID: "358caab1-faf4-41bf-a17e-7eb09b4eaabd"). InnerVolumeSpecName "tigera-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" May 15 12:28:30.257206 containerd[1734]: time="2025-05-15T12:28:30.257102459Z" level=error msg="Failed to destroy network for sandbox \"f498e4ab4d2e4c0756532a629e4c6c98b200e4653db33a29af77642b7c1c88c9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:30.306571 kubelet[3331]: I0515 12:28:30.306551 3331 reconciler_common.go:289] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/358caab1-faf4-41bf-a17e-7eb09b4eaabd-lib-modules\") on node \"ci-4334.0.0-a-9b1bbdffc7\" DevicePath \"\"" May 15 12:28:30.307371 kubelet[3331]: I0515 12:28:30.307290 3331 reconciler_common.go:289] "Volume detached for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/358caab1-faf4-41bf-a17e-7eb09b4eaabd-cni-bin-dir\") on node \"ci-4334.0.0-a-9b1bbdffc7\" DevicePath \"\"" May 15 12:28:30.307371 kubelet[3331]: I0515 12:28:30.307308 3331 reconciler_common.go:289] "Volume detached for volume \"kube-api-access-bxqbx\" (UniqueName: \"kubernetes.io/projected/358caab1-faf4-41bf-a17e-7eb09b4eaabd-kube-api-access-bxqbx\") on node \"ci-4334.0.0-a-9b1bbdffc7\" DevicePath \"\"" May 15 12:28:30.307371 kubelet[3331]: I0515 12:28:30.307319 3331 reconciler_common.go:289] "Volume detached for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/358caab1-faf4-41bf-a17e-7eb09b4eaabd-node-certs\") on node \"ci-4334.0.0-a-9b1bbdffc7\" DevicePath \"\"" May 15 12:28:30.307371 kubelet[3331]: I0515 12:28:30.307329 3331 reconciler_common.go:289] "Volume detached for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/358caab1-faf4-41bf-a17e-7eb09b4eaabd-cni-log-dir\") on node \"ci-4334.0.0-a-9b1bbdffc7\" DevicePath \"\"" May 15 12:28:30.307371 kubelet[3331]: I0515 12:28:30.307350 3331 reconciler_common.go:289] "Volume detached for volume 
\"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/358caab1-faf4-41bf-a17e-7eb09b4eaabd-flexvol-driver-host\") on node \"ci-4334.0.0-a-9b1bbdffc7\" DevicePath \"\"" May 15 12:28:30.307371 kubelet[3331]: I0515 12:28:30.307358 3331 reconciler_common.go:289] "Volume detached for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/358caab1-faf4-41bf-a17e-7eb09b4eaabd-policysync\") on node \"ci-4334.0.0-a-9b1bbdffc7\" DevicePath \"\"" May 15 12:28:30.307371 kubelet[3331]: I0515 12:28:30.307366 3331 reconciler_common.go:289] "Volume detached for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/358caab1-faf4-41bf-a17e-7eb09b4eaabd-var-run-calico\") on node \"ci-4334.0.0-a-9b1bbdffc7\" DevicePath \"\"" May 15 12:28:30.307371 kubelet[3331]: I0515 12:28:30.307372 3331 reconciler_common.go:289] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/358caab1-faf4-41bf-a17e-7eb09b4eaabd-tigera-ca-bundle\") on node \"ci-4334.0.0-a-9b1bbdffc7\" DevicePath \"\"" May 15 12:28:30.307581 kubelet[3331]: I0515 12:28:30.307380 3331 reconciler_common.go:289] "Volume detached for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/358caab1-faf4-41bf-a17e-7eb09b4eaabd-var-lib-calico\") on node \"ci-4334.0.0-a-9b1bbdffc7\" DevicePath \"\"" May 15 12:28:30.311553 containerd[1734]: time="2025-05-15T12:28:30.311503619Z" level=error msg="Failed to destroy network for sandbox \"9b290dc74134d1f3d30167bdae7ea8f8ebf0f8c7f15727bf8bcce92efee945ee\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:30.356195 containerd[1734]: time="2025-05-15T12:28:30.356159632Z" level=error msg="Failed to destroy network for sandbox \"12b9ac93f2e3e47ca1df11bbc0f1d18864fb38e7a688e23fa29727d62e9d28e9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such 
file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:30.370187 containerd[1734]: time="2025-05-15T12:28:30.370095393Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b7f8fc4d6-nk2q9,Uid:1d54623e-7c9b-4f0a-8922-6bb47d7e26af,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a683d5f5dbc9ad8b500026e8957b9d81f0c1107ea63dc655d0d9095c9d262634\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:30.370319 kubelet[3331]: E0515 12:28:30.370295 3331 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a683d5f5dbc9ad8b500026e8957b9d81f0c1107ea63dc655d0d9095c9d262634\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:30.370370 kubelet[3331]: E0515 12:28:30.370353 3331 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a683d5f5dbc9ad8b500026e8957b9d81f0c1107ea63dc655d0d9095c9d262634\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b7f8fc4d6-nk2q9" May 15 12:28:30.370416 kubelet[3331]: E0515 12:28:30.370376 3331 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a683d5f5dbc9ad8b500026e8957b9d81f0c1107ea63dc655d0d9095c9d262634\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such 
file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b7f8fc4d6-nk2q9" May 15 12:28:30.370445 kubelet[3331]: E0515 12:28:30.370427 3331 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-b7f8fc4d6-nk2q9_calico-apiserver(1d54623e-7c9b-4f0a-8922-6bb47d7e26af)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-b7f8fc4d6-nk2q9_calico-apiserver(1d54623e-7c9b-4f0a-8922-6bb47d7e26af)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a683d5f5dbc9ad8b500026e8957b9d81f0c1107ea63dc655d0d9095c9d262634\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-b7f8fc4d6-nk2q9" podUID="1d54623e-7c9b-4f0a-8922-6bb47d7e26af" May 15 12:28:30.419420 containerd[1734]: time="2025-05-15T12:28:30.419267784Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fthxd,Uid:55ce30e6-eb73-4f69-aef9-927a3bcb6662,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ea0efc282360faa0bd490542e2757445d42da4cc4c934c7330127c07d142dd8b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:30.419617 kubelet[3331]: E0515 12:28:30.419593 3331 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ea0efc282360faa0bd490542e2757445d42da4cc4c934c7330127c07d142dd8b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" May 15 12:28:30.419737 kubelet[3331]: E0515 12:28:30.419720 3331 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ea0efc282360faa0bd490542e2757445d42da4cc4c934c7330127c07d142dd8b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fthxd" May 15 12:28:30.419884 kubelet[3331]: E0515 12:28:30.419820 3331 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ea0efc282360faa0bd490542e2757445d42da4cc4c934c7330127c07d142dd8b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fthxd" May 15 12:28:30.419942 kubelet[3331]: E0515 12:28:30.419865 3331 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-fthxd_calico-system(55ce30e6-eb73-4f69-aef9-927a3bcb6662)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-fthxd_calico-system(55ce30e6-eb73-4f69-aef9-927a3bcb6662)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ea0efc282360faa0bd490542e2757445d42da4cc4c934c7330127c07d142dd8b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-fthxd" podUID="55ce30e6-eb73-4f69-aef9-927a3bcb6662" May 15 12:28:30.420621 containerd[1734]: time="2025-05-15T12:28:30.420602918Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-node-fgrtv,Uid:719e0812-a2ea-4a10-9491-fb67edb907c9,Namespace:calico-system,Attempt:0,}" May 15 12:28:30.447920 containerd[1734]: time="2025-05-15T12:28:30.447896694Z" level=error msg="Failed to destroy network for sandbox \"14c2088cae758eb9d1ea19d282caebd01dd23341247316963ecb69c61c88fa20\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:30.466041 containerd[1734]: time="2025-05-15T12:28:30.465979445Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5547fc8788-7mx2l,Uid:f65466fd-ea14-4a19-aabd-5cdb3b868978,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e3dff631a39f09f04e3c0ddeb2fcaa66c4e8d2765d4880a253ac19928f76ef9d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:30.466151 kubelet[3331]: E0515 12:28:30.466126 3331 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e3dff631a39f09f04e3c0ddeb2fcaa66c4e8d2765d4880a253ac19928f76ef9d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:30.466207 kubelet[3331]: E0515 12:28:30.466168 3331 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e3dff631a39f09f04e3c0ddeb2fcaa66c4e8d2765d4880a253ac19928f76ef9d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="calico-system/calico-kube-controllers-5547fc8788-7mx2l" May 15 12:28:30.466232 kubelet[3331]: E0515 12:28:30.466201 3331 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e3dff631a39f09f04e3c0ddeb2fcaa66c4e8d2765d4880a253ac19928f76ef9d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5547fc8788-7mx2l" May 15 12:28:30.466254 kubelet[3331]: E0515 12:28:30.466235 3331 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5547fc8788-7mx2l_calico-system(f65466fd-ea14-4a19-aabd-5cdb3b868978)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5547fc8788-7mx2l_calico-system(f65466fd-ea14-4a19-aabd-5cdb3b868978)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e3dff631a39f09f04e3c0ddeb2fcaa66c4e8d2765d4880a253ac19928f76ef9d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5547fc8788-7mx2l" podUID="f65466fd-ea14-4a19-aabd-5cdb3b868978" May 15 12:28:30.528464 containerd[1734]: time="2025-05-15T12:28:30.528398614Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-pkr4m,Uid:baa3369f-3b5a-4c16-a95f-d55a24190750,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f498e4ab4d2e4c0756532a629e4c6c98b200e4653db33a29af77642b7c1c88c9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" May 15 12:28:30.528959 kubelet[3331]: E0515 12:28:30.528556 3331 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f498e4ab4d2e4c0756532a629e4c6c98b200e4653db33a29af77642b7c1c88c9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:30.528959 kubelet[3331]: E0515 12:28:30.528608 3331 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f498e4ab4d2e4c0756532a629e4c6c98b200e4653db33a29af77642b7c1c88c9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-pkr4m" May 15 12:28:30.528959 kubelet[3331]: E0515 12:28:30.528629 3331 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f498e4ab4d2e4c0756532a629e4c6c98b200e4653db33a29af77642b7c1c88c9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-pkr4m" May 15 12:28:30.529126 kubelet[3331]: E0515 12:28:30.528745 3331 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-pkr4m_kube-system(baa3369f-3b5a-4c16-a95f-d55a24190750)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-pkr4m_kube-system(baa3369f-3b5a-4c16-a95f-d55a24190750)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f498e4ab4d2e4c0756532a629e4c6c98b200e4653db33a29af77642b7c1c88c9\\\": plugin 
type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-pkr4m" podUID="baa3369f-3b5a-4c16-a95f-d55a24190750" May 15 12:28:30.532111 containerd[1734]: time="2025-05-15T12:28:30.532048375Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-84478f496d-j8zht,Uid:5526c02b-5980-4359-8781-c8829963a4f3,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9b290dc74134d1f3d30167bdae7ea8f8ebf0f8c7f15727bf8bcce92efee945ee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:30.532233 kubelet[3331]: E0515 12:28:30.532188 3331 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9b290dc74134d1f3d30167bdae7ea8f8ebf0f8c7f15727bf8bcce92efee945ee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:30.532233 kubelet[3331]: E0515 12:28:30.532223 3331 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9b290dc74134d1f3d30167bdae7ea8f8ebf0f8c7f15727bf8bcce92efee945ee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-84478f496d-j8zht" May 15 12:28:30.532305 kubelet[3331]: E0515 12:28:30.532241 3331 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"9b290dc74134d1f3d30167bdae7ea8f8ebf0f8c7f15727bf8bcce92efee945ee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-84478f496d-j8zht" May 15 12:28:30.532305 kubelet[3331]: E0515 12:28:30.532272 3331 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-84478f496d-j8zht_calico-apiserver(5526c02b-5980-4359-8781-c8829963a4f3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-84478f496d-j8zht_calico-apiserver(5526c02b-5980-4359-8781-c8829963a4f3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9b290dc74134d1f3d30167bdae7ea8f8ebf0f8c7f15727bf8bcce92efee945ee\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-84478f496d-j8zht" podUID="5526c02b-5980-4359-8781-c8829963a4f3" May 15 12:28:30.575512 containerd[1734]: time="2025-05-15T12:28:30.575466057Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-s4cpx,Uid:6ec4fd21-5481-4869-8926-5a6b5b6153c9,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"12b9ac93f2e3e47ca1df11bbc0f1d18864fb38e7a688e23fa29727d62e9d28e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:30.575644 kubelet[3331]: E0515 12:28:30.575615 3331 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"12b9ac93f2e3e47ca1df11bbc0f1d18864fb38e7a688e23fa29727d62e9d28e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:30.575700 kubelet[3331]: E0515 12:28:30.575657 3331 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"12b9ac93f2e3e47ca1df11bbc0f1d18864fb38e7a688e23fa29727d62e9d28e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-s4cpx" May 15 12:28:30.575700 kubelet[3331]: E0515 12:28:30.575676 3331 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"12b9ac93f2e3e47ca1df11bbc0f1d18864fb38e7a688e23fa29727d62e9d28e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-s4cpx" May 15 12:28:30.575768 kubelet[3331]: E0515 12:28:30.575714 3331 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-s4cpx_kube-system(6ec4fd21-5481-4869-8926-5a6b5b6153c9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-s4cpx_kube-system(6ec4fd21-5481-4869-8926-5a6b5b6153c9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"12b9ac93f2e3e47ca1df11bbc0f1d18864fb38e7a688e23fa29727d62e9d28e9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-s4cpx" 
podUID="6ec4fd21-5481-4869-8926-5a6b5b6153c9" May 15 12:28:30.624599 containerd[1734]: time="2025-05-15T12:28:30.624531200Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b7f8fc4d6-6ch4b,Uid:dea655ce-5eab-4e67-a276-be0e9b39cc85,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"14c2088cae758eb9d1ea19d282caebd01dd23341247316963ecb69c61c88fa20\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:30.624714 kubelet[3331]: E0515 12:28:30.624677 3331 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"14c2088cae758eb9d1ea19d282caebd01dd23341247316963ecb69c61c88fa20\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:28:30.624753 kubelet[3331]: E0515 12:28:30.624722 3331 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"14c2088cae758eb9d1ea19d282caebd01dd23341247316963ecb69c61c88fa20\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b7f8fc4d6-6ch4b" May 15 12:28:30.624753 kubelet[3331]: E0515 12:28:30.624739 3331 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"14c2088cae758eb9d1ea19d282caebd01dd23341247316963ecb69c61c88fa20\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b7f8fc4d6-6ch4b" May 15 12:28:30.624816 kubelet[3331]: E0515 12:28:30.624789 3331 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-b7f8fc4d6-6ch4b_calico-apiserver(dea655ce-5eab-4e67-a276-be0e9b39cc85)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-b7f8fc4d6-6ch4b_calico-apiserver(dea655ce-5eab-4e67-a276-be0e9b39cc85)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"14c2088cae758eb9d1ea19d282caebd01dd23341247316963ecb69c61c88fa20\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-b7f8fc4d6-6ch4b" podUID="dea655ce-5eab-4e67-a276-be0e9b39cc85" May 15 12:28:30.665887 kubelet[3331]: I0515 12:28:30.665812 3331 scope.go:117] "RemoveContainer" containerID="20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d" May 15 12:28:30.669372 containerd[1734]: time="2025-05-15T12:28:30.669349794Z" level=info msg="RemoveContainer for \"20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d\"" May 15 12:28:30.670559 systemd[1]: Removed slice kubepods-besteffort-pod358caab1_faf4_41bf_a17e_7eb09b4eaabd.slice - libcontainer container kubepods-besteffort-pod358caab1_faf4_41bf_a17e_7eb09b4eaabd.slice. May 15 12:28:30.670724 systemd[1]: kubepods-besteffort-pod358caab1_faf4_41bf_a17e_7eb09b4eaabd.slice: Consumed 419ms CPU time, 188.3M memory peak, 160.4M written to disk. May 15 12:28:30.771568 systemd[1]: run-netns-cni\x2dfb6f28bb\x2db23b\x2dcfd5\x2db1d1\x2db8301e391842.mount: Deactivated successfully. May 15 12:28:30.771718 systemd[1]: run-netns-cni\x2dec0e934b\x2d23a7\x2da4e2\x2d7da8\x2d528473ce5865.mount: Deactivated successfully. 
May 15 12:28:30.771809 systemd[1]: run-netns-cni\x2d8d7ee559\x2df064\x2d09b0\x2d85e5\x2d30bc2ad6944c.mount: Deactivated successfully. May 15 12:28:30.771894 systemd[1]: run-netns-cni\x2d627b7959\x2dd9dc\x2d816b\x2daeca\x2d914408fcfcfa.mount: Deactivated successfully. May 15 12:28:30.771982 systemd[1]: var-lib-kubelet-pods-358caab1\x2dfaf4\x2d41bf\x2da17e\x2d7eb09b4eaabd-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dnode-1.mount: Deactivated successfully. May 15 12:28:30.772077 systemd[1]: var-lib-kubelet-pods-358caab1\x2dfaf4\x2d41bf\x2da17e\x2d7eb09b4eaabd-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dbxqbx.mount: Deactivated successfully. May 15 12:28:30.876880 containerd[1734]: time="2025-05-15T12:28:30.876817373Z" level=info msg="RemoveContainer for \"20f8afd5617e583c53e75228fe387b57827b3cba25588e6bb18309f60bc8f34d\" returns successfully" May 15 12:28:30.877127 kubelet[3331]: I0515 12:28:30.877037 3331 scope.go:117] "RemoveContainer" containerID="dabed820d95ecd4eaf8b92b9815ec8f68e13118a3bda36c913d88e1ff779af5e" May 15 12:28:30.879235 containerd[1734]: time="2025-05-15T12:28:30.879189969Z" level=info msg="RemoveContainer for \"dabed820d95ecd4eaf8b92b9815ec8f68e13118a3bda36c913d88e1ff779af5e\"" May 15 12:28:31.067747 containerd[1734]: time="2025-05-15T12:28:31.067706205Z" level=info msg="RemoveContainer for \"dabed820d95ecd4eaf8b92b9815ec8f68e13118a3bda36c913d88e1ff779af5e\" returns successfully" May 15 12:28:31.068153 kubelet[3331]: I0515 12:28:31.068004 3331 scope.go:117] "RemoveContainer" containerID="fce7805e691711b9971246be5822f531007036317749633587a5e005a4c03c07" May 15 12:28:31.070015 containerd[1734]: time="2025-05-15T12:28:31.069993355Z" level=info msg="RemoveContainer for \"fce7805e691711b9971246be5822f531007036317749633587a5e005a4c03c07\"" May 15 12:28:31.186442 containerd[1734]: time="2025-05-15T12:28:31.186408302Z" level=info msg="connecting to shim 888d25aa3b2ee90a5f72facc0c4d492bbce66d38f3fd03fb48f53e0bb036a55a" 
address="unix:///run/containerd/s/73f237ca4eea3704223cf40dd309daec80c7d4b38d6c43169ee1f25b7c79d200" namespace=k8s.io protocol=ttrpc version=3 May 15 12:28:31.210366 systemd[1]: Started cri-containerd-888d25aa3b2ee90a5f72facc0c4d492bbce66d38f3fd03fb48f53e0bb036a55a.scope - libcontainer container 888d25aa3b2ee90a5f72facc0c4d492bbce66d38f3fd03fb48f53e0bb036a55a. May 15 12:28:31.226103 containerd[1734]: time="2025-05-15T12:28:31.226071850Z" level=info msg="RemoveContainer for \"fce7805e691711b9971246be5822f531007036317749633587a5e005a4c03c07\" returns successfully" May 15 12:28:31.272279 containerd[1734]: time="2025-05-15T12:28:31.272243379Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-fgrtv,Uid:719e0812-a2ea-4a10-9491-fb67edb907c9,Namespace:calico-system,Attempt:0,} returns sandbox id \"888d25aa3b2ee90a5f72facc0c4d492bbce66d38f3fd03fb48f53e0bb036a55a\"" May 15 12:28:31.274226 containerd[1734]: time="2025-05-15T12:28:31.274201454Z" level=info msg="CreateContainer within sandbox \"888d25aa3b2ee90a5f72facc0c4d492bbce66d38f3fd03fb48f53e0bb036a55a\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" May 15 12:28:31.416555 containerd[1734]: time="2025-05-15T12:28:31.415008005Z" level=info msg="Container 33a6cb901c1c0eddaa5a0de6c49f2d752802c46ae2334f6c628e0fe09ca37a45: CDI devices from CRI Config.CDIDevices: []" May 15 12:28:31.433550 kubelet[3331]: I0515 12:28:31.433529 3331 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="358caab1-faf4-41bf-a17e-7eb09b4eaabd" path="/var/lib/kubelet/pods/358caab1-faf4-41bf-a17e-7eb09b4eaabd/volumes" May 15 12:28:31.575552 containerd[1734]: time="2025-05-15T12:28:31.575449550Z" level=info msg="CreateContainer within sandbox \"888d25aa3b2ee90a5f72facc0c4d492bbce66d38f3fd03fb48f53e0bb036a55a\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"33a6cb901c1c0eddaa5a0de6c49f2d752802c46ae2334f6c628e0fe09ca37a45\"" May 15 12:28:31.576028 containerd[1734]: 
time="2025-05-15T12:28:31.576005428Z" level=info msg="StartContainer for \"33a6cb901c1c0eddaa5a0de6c49f2d752802c46ae2334f6c628e0fe09ca37a45\"" May 15 12:28:31.577306 containerd[1734]: time="2025-05-15T12:28:31.577257447Z" level=info msg="connecting to shim 33a6cb901c1c0eddaa5a0de6c49f2d752802c46ae2334f6c628e0fe09ca37a45" address="unix:///run/containerd/s/73f237ca4eea3704223cf40dd309daec80c7d4b38d6c43169ee1f25b7c79d200" protocol=ttrpc version=3 May 15 12:28:31.595314 systemd[1]: Started cri-containerd-33a6cb901c1c0eddaa5a0de6c49f2d752802c46ae2334f6c628e0fe09ca37a45.scope - libcontainer container 33a6cb901c1c0eddaa5a0de6c49f2d752802c46ae2334f6c628e0fe09ca37a45. May 15 12:28:31.625027 containerd[1734]: time="2025-05-15T12:28:31.624996167Z" level=info msg="StartContainer for \"33a6cb901c1c0eddaa5a0de6c49f2d752802c46ae2334f6c628e0fe09ca37a45\" returns successfully" May 15 12:28:31.629951 systemd[1]: cri-containerd-33a6cb901c1c0eddaa5a0de6c49f2d752802c46ae2334f6c628e0fe09ca37a45.scope: Deactivated successfully. May 15 12:28:31.630249 systemd[1]: cri-containerd-33a6cb901c1c0eddaa5a0de6c49f2d752802c46ae2334f6c628e0fe09ca37a45.scope: Consumed 22ms CPU time, 8.1M memory peak, 6.3M written to disk. 
May 15 12:28:31.632396 containerd[1734]: time="2025-05-15T12:28:31.632371268Z" level=info msg="TaskExit event in podsandbox handler container_id:\"33a6cb901c1c0eddaa5a0de6c49f2d752802c46ae2334f6c628e0fe09ca37a45\" id:\"33a6cb901c1c0eddaa5a0de6c49f2d752802c46ae2334f6c628e0fe09ca37a45\" pid:5190 exited_at:{seconds:1747312111 nanos:632164367}" May 15 12:28:31.632470 containerd[1734]: time="2025-05-15T12:28:31.632403470Z" level=info msg="received exit event container_id:\"33a6cb901c1c0eddaa5a0de6c49f2d752802c46ae2334f6c628e0fe09ca37a45\" id:\"33a6cb901c1c0eddaa5a0de6c49f2d752802c46ae2334f6c628e0fe09ca37a45\" pid:5190 exited_at:{seconds:1747312111 nanos:632164367}" May 15 12:28:34.677168 containerd[1734]: time="2025-05-15T12:28:34.676936015Z" level=info msg="CreateContainer within sandbox \"888d25aa3b2ee90a5f72facc0c4d492bbce66d38f3fd03fb48f53e0bb036a55a\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" May 15 12:28:34.869460 containerd[1734]: time="2025-05-15T12:28:34.869420744Z" level=info msg="Container d26c251c4af8fc853c5fa1481340882969d5c7c74dc685f0941b2c609536c23e: CDI devices from CRI Config.CDIDevices: []" May 15 12:28:34.985846 containerd[1734]: time="2025-05-15T12:28:34.985743962Z" level=info msg="CreateContainer within sandbox \"888d25aa3b2ee90a5f72facc0c4d492bbce66d38f3fd03fb48f53e0bb036a55a\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"d26c251c4af8fc853c5fa1481340882969d5c7c74dc685f0941b2c609536c23e\"" May 15 12:28:34.986487 containerd[1734]: time="2025-05-15T12:28:34.986266057Z" level=info msg="StartContainer for \"d26c251c4af8fc853c5fa1481340882969d5c7c74dc685f0941b2c609536c23e\"" May 15 12:28:34.987597 containerd[1734]: time="2025-05-15T12:28:34.987559598Z" level=info msg="connecting to shim d26c251c4af8fc853c5fa1481340882969d5c7c74dc685f0941b2c609536c23e" address="unix:///run/containerd/s/73f237ca4eea3704223cf40dd309daec80c7d4b38d6c43169ee1f25b7c79d200" protocol=ttrpc version=3 May 15 12:28:35.007318 
systemd[1]: Started cri-containerd-d26c251c4af8fc853c5fa1481340882969d5c7c74dc685f0941b2c609536c23e.scope - libcontainer container d26c251c4af8fc853c5fa1481340882969d5c7c74dc685f0941b2c609536c23e. May 15 12:28:35.035575 containerd[1734]: time="2025-05-15T12:28:35.035517633Z" level=info msg="StartContainer for \"d26c251c4af8fc853c5fa1481340882969d5c7c74dc685f0941b2c609536c23e\" returns successfully" May 15 12:28:35.105406 systemd[1]: Started sshd@8-10.200.8.35:22-10.200.16.10:34944.service - OpenSSH per-connection server daemon (10.200.16.10:34944). May 15 12:28:35.305304 systemd[1]: cri-containerd-d26c251c4af8fc853c5fa1481340882969d5c7c74dc685f0941b2c609536c23e.scope: Deactivated successfully. May 15 12:28:35.307318 containerd[1734]: time="2025-05-15T12:28:35.307292933Z" level=info msg="received exit event container_id:\"d26c251c4af8fc853c5fa1481340882969d5c7c74dc685f0941b2c609536c23e\" id:\"d26c251c4af8fc853c5fa1481340882969d5c7c74dc685f0941b2c609536c23e\" pid:5239 exited_at:{seconds:1747312115 nanos:307098873}" May 15 12:28:35.307448 containerd[1734]: time="2025-05-15T12:28:35.307434423Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d26c251c4af8fc853c5fa1481340882969d5c7c74dc685f0941b2c609536c23e\" id:\"d26c251c4af8fc853c5fa1481340882969d5c7c74dc685f0941b2c609536c23e\" pid:5239 exited_at:{seconds:1747312115 nanos:307098873}" May 15 12:28:35.321289 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d26c251c4af8fc853c5fa1481340882969d5c7c74dc685f0941b2c609536c23e-rootfs.mount: Deactivated successfully. May 15 12:28:35.744926 sshd[5256]: Accepted publickey for core from 10.200.16.10 port 34944 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk May 15 12:28:35.745849 sshd-session[5256]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 12:28:35.749507 systemd-logind[1702]: New session 11 of user core. May 15 12:28:35.752308 systemd[1]: Started session-11.scope - Session 11 of User core. 
May 15 12:28:36.240480 sshd[5272]: Connection closed by 10.200.16.10 port 34944 May 15 12:28:36.278367 sshd-session[5256]: pam_unix(sshd:session): session closed for user core May 15 12:28:36.280943 systemd[1]: sshd@8-10.200.8.35:22-10.200.16.10:34944.service: Deactivated successfully. May 15 12:28:36.282709 systemd[1]: session-11.scope: Deactivated successfully. May 15 12:28:36.284025 systemd-logind[1702]: Session 11 logged out. Waiting for processes to exit. May 15 12:28:36.285408 systemd-logind[1702]: Removed session 11. May 15 12:28:36.696085 containerd[1734]: time="2025-05-15T12:28:36.695530522Z" level=info msg="CreateContainer within sandbox \"888d25aa3b2ee90a5f72facc0c4d492bbce66d38f3fd03fb48f53e0bb036a55a\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" May 15 12:28:36.819192 containerd[1734]: time="2025-05-15T12:28:36.817130844Z" level=info msg="Container 6f3538d4d98068af62053dba94b18585d28518dccba039cd134b61c00ee8af7d: CDI devices from CRI Config.CDIDevices: []" May 15 12:28:36.980086 containerd[1734]: time="2025-05-15T12:28:36.980020610Z" level=info msg="CreateContainer within sandbox \"888d25aa3b2ee90a5f72facc0c4d492bbce66d38f3fd03fb48f53e0bb036a55a\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"6f3538d4d98068af62053dba94b18585d28518dccba039cd134b61c00ee8af7d\"" May 15 12:28:36.980748 containerd[1734]: time="2025-05-15T12:28:36.980560257Z" level=info msg="StartContainer for \"6f3538d4d98068af62053dba94b18585d28518dccba039cd134b61c00ee8af7d\"" May 15 12:28:36.981862 containerd[1734]: time="2025-05-15T12:28:36.981821426Z" level=info msg="connecting to shim 6f3538d4d98068af62053dba94b18585d28518dccba039cd134b61c00ee8af7d" address="unix:///run/containerd/s/73f237ca4eea3704223cf40dd309daec80c7d4b38d6c43169ee1f25b7c79d200" protocol=ttrpc version=3 May 15 12:28:37.001314 systemd[1]: Started cri-containerd-6f3538d4d98068af62053dba94b18585d28518dccba039cd134b61c00ee8af7d.scope - libcontainer container 
6f3538d4d98068af62053dba94b18585d28518dccba039cd134b61c00ee8af7d. May 15 12:28:37.030793 containerd[1734]: time="2025-05-15T12:28:37.030767200Z" level=info msg="StartContainer for \"6f3538d4d98068af62053dba94b18585d28518dccba039cd134b61c00ee8af7d\" returns successfully" May 15 12:28:37.737327 containerd[1734]: time="2025-05-15T12:28:37.737293912Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6f3538d4d98068af62053dba94b18585d28518dccba039cd134b61c00ee8af7d\" id:\"a94c7ffba138343fac978263bf7fa1638c02fc83fac796fc1b93daef41f2c6bb\" pid:5349 exit_status:1 exited_at:{seconds:1747312117 nanos:736968799}" May 15 12:28:38.739225 containerd[1734]: time="2025-05-15T12:28:38.739181162Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6f3538d4d98068af62053dba94b18585d28518dccba039cd134b61c00ee8af7d\" id:\"e11d0571ceaffb331df8f1ebc1d824c96a9a5e17d622de6f139920aff21b0c23\" pid:5497 exit_status:1 exited_at:{seconds:1747312118 nanos:738987935}" May 15 12:28:38.814699 systemd-networkd[1361]: vxlan.calico: Link UP May 15 12:28:38.814708 systemd-networkd[1361]: vxlan.calico: Gained carrier May 15 12:28:40.392304 systemd-networkd[1361]: vxlan.calico: Gained IPv6LL May 15 12:28:41.354162 systemd[1]: Started sshd@9-10.200.8.35:22-10.200.16.10:45666.service - OpenSSH per-connection server daemon (10.200.16.10:45666). 
May 15 12:28:41.432569 containerd[1734]: time="2025-05-15T12:28:41.432513950Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5547fc8788-7mx2l,Uid:f65466fd-ea14-4a19-aabd-5cdb3b868978,Namespace:calico-system,Attempt:0,}" May 15 12:28:41.447285 containerd[1734]: time="2025-05-15T12:28:41.447259010Z" level=info msg="StopPodSandbox for \"586431069cbf84e50bf569b8f9a1266e7bd3d1cec927c1c2a0ffa0e1cd2c2278\"" May 15 12:28:41.447406 containerd[1734]: time="2025-05-15T12:28:41.447391288Z" level=info msg="TearDown network for sandbox \"586431069cbf84e50bf569b8f9a1266e7bd3d1cec927c1c2a0ffa0e1cd2c2278\" successfully" May 15 12:28:41.447461 containerd[1734]: time="2025-05-15T12:28:41.447405272Z" level=info msg="StopPodSandbox for \"586431069cbf84e50bf569b8f9a1266e7bd3d1cec927c1c2a0ffa0e1cd2c2278\" returns successfully" May 15 12:28:41.447964 containerd[1734]: time="2025-05-15T12:28:41.447903103Z" level=info msg="RemovePodSandbox for \"586431069cbf84e50bf569b8f9a1266e7bd3d1cec927c1c2a0ffa0e1cd2c2278\"" May 15 12:28:41.447964 containerd[1734]: time="2025-05-15T12:28:41.447928967Z" level=info msg="Forcibly stopping sandbox \"586431069cbf84e50bf569b8f9a1266e7bd3d1cec927c1c2a0ffa0e1cd2c2278\"" May 15 12:28:41.448185 containerd[1734]: time="2025-05-15T12:28:41.448115521Z" level=info msg="TearDown network for sandbox \"586431069cbf84e50bf569b8f9a1266e7bd3d1cec927c1c2a0ffa0e1cd2c2278\" successfully" May 15 12:28:41.449072 containerd[1734]: time="2025-05-15T12:28:41.449058028Z" level=info msg="Ensure that sandbox 586431069cbf84e50bf569b8f9a1266e7bd3d1cec927c1c2a0ffa0e1cd2c2278 in task-service has been cleanup successfully" May 15 12:28:41.566242 systemd-networkd[1361]: calia33d8da3e73: Link UP May 15 12:28:41.567002 systemd-networkd[1361]: calia33d8da3e73: Gained carrier May 15 12:28:41.578602 kubelet[3331]: I0515 12:28:41.578559 3331 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-fgrtv" 
podStartSLOduration=11.578541576 podStartE2EDuration="11.578541576s" podCreationTimestamp="2025-05-15 12:28:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-15 12:28:37.724700263 +0000 UTC m=+116.360033708" watchObservedRunningTime="2025-05-15 12:28:41.578541576 +0000 UTC m=+120.213875085" May 15 12:28:41.579736 containerd[1734]: 2025-05-15 12:28:41.513 [INFO][5582] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4334.0.0--a--9b1bbdffc7-k8s-calico--kube--controllers--5547fc8788--7mx2l-eth0 calico-kube-controllers-5547fc8788- calico-system f65466fd-ea14-4a19-aabd-5cdb3b868978 958 0 2025-05-15 12:28:06 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:5547fc8788 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4334.0.0-a-9b1bbdffc7 calico-kube-controllers-5547fc8788-7mx2l eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calia33d8da3e73 [] []}} ContainerID="3b5233077cbe73a610dfc4e28881a4e0d8ba920446a9ac5f0e3ac0168dba007f" Namespace="calico-system" Pod="calico-kube-controllers-5547fc8788-7mx2l" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--kube--controllers--5547fc8788--7mx2l-" May 15 12:28:41.579736 containerd[1734]: 2025-05-15 12:28:41.513 [INFO][5582] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="3b5233077cbe73a610dfc4e28881a4e0d8ba920446a9ac5f0e3ac0168dba007f" Namespace="calico-system" Pod="calico-kube-controllers-5547fc8788-7mx2l" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--kube--controllers--5547fc8788--7mx2l-eth0" May 15 12:28:41.579736 containerd[1734]: 2025-05-15 12:28:41.536 [INFO][5595] ipam/ipam_plugin.go 225: Calico CNI IPAM request 
count IPv4=1 IPv6=0 ContainerID="3b5233077cbe73a610dfc4e28881a4e0d8ba920446a9ac5f0e3ac0168dba007f" HandleID="k8s-pod-network.3b5233077cbe73a610dfc4e28881a4e0d8ba920446a9ac5f0e3ac0168dba007f" Workload="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--kube--controllers--5547fc8788--7mx2l-eth0" May 15 12:28:41.579897 containerd[1734]: 2025-05-15 12:28:41.542 [INFO][5595] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="3b5233077cbe73a610dfc4e28881a4e0d8ba920446a9ac5f0e3ac0168dba007f" HandleID="k8s-pod-network.3b5233077cbe73a610dfc4e28881a4e0d8ba920446a9ac5f0e3ac0168dba007f" Workload="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--kube--controllers--5547fc8788--7mx2l-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00030f730), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4334.0.0-a-9b1bbdffc7", "pod":"calico-kube-controllers-5547fc8788-7mx2l", "timestamp":"2025-05-15 12:28:41.536117227 +0000 UTC"}, Hostname:"ci-4334.0.0-a-9b1bbdffc7", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 15 12:28:41.579897 containerd[1734]: 2025-05-15 12:28:41.542 [INFO][5595] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 15 12:28:41.579897 containerd[1734]: 2025-05-15 12:28:41.542 [INFO][5595] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 15 12:28:41.579897 containerd[1734]: 2025-05-15 12:28:41.543 [INFO][5595] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4334.0.0-a-9b1bbdffc7' May 15 12:28:41.579897 containerd[1734]: 2025-05-15 12:28:41.544 [INFO][5595] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.3b5233077cbe73a610dfc4e28881a4e0d8ba920446a9ac5f0e3ac0168dba007f" host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:41.579897 containerd[1734]: 2025-05-15 12:28:41.546 [INFO][5595] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:41.579897 containerd[1734]: 2025-05-15 12:28:41.549 [INFO][5595] ipam/ipam.go 489: Trying affinity for 192.168.41.128/26 host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:41.579897 containerd[1734]: 2025-05-15 12:28:41.551 [INFO][5595] ipam/ipam.go 155: Attempting to load block cidr=192.168.41.128/26 host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:41.579897 containerd[1734]: 2025-05-15 12:28:41.552 [INFO][5595] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.41.128/26 host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:41.580122 containerd[1734]: 2025-05-15 12:28:41.552 [INFO][5595] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.41.128/26 handle="k8s-pod-network.3b5233077cbe73a610dfc4e28881a4e0d8ba920446a9ac5f0e3ac0168dba007f" host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:41.580122 containerd[1734]: 2025-05-15 12:28:41.553 [INFO][5595] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.3b5233077cbe73a610dfc4e28881a4e0d8ba920446a9ac5f0e3ac0168dba007f May 15 12:28:41.580122 containerd[1734]: 2025-05-15 12:28:41.558 [INFO][5595] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.41.128/26 handle="k8s-pod-network.3b5233077cbe73a610dfc4e28881a4e0d8ba920446a9ac5f0e3ac0168dba007f" host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:41.580122 containerd[1734]: 2025-05-15 12:28:41.562 [INFO][5595] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.41.129/26] block=192.168.41.128/26 handle="k8s-pod-network.3b5233077cbe73a610dfc4e28881a4e0d8ba920446a9ac5f0e3ac0168dba007f" host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:41.580122 containerd[1734]: 2025-05-15 12:28:41.562 [INFO][5595] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.41.129/26] handle="k8s-pod-network.3b5233077cbe73a610dfc4e28881a4e0d8ba920446a9ac5f0e3ac0168dba007f" host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:41.580122 containerd[1734]: 2025-05-15 12:28:41.562 [INFO][5595] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 15 12:28:41.580122 containerd[1734]: 2025-05-15 12:28:41.562 [INFO][5595] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.41.129/26] IPv6=[] ContainerID="3b5233077cbe73a610dfc4e28881a4e0d8ba920446a9ac5f0e3ac0168dba007f" HandleID="k8s-pod-network.3b5233077cbe73a610dfc4e28881a4e0d8ba920446a9ac5f0e3ac0168dba007f" Workload="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--kube--controllers--5547fc8788--7mx2l-eth0" May 15 12:28:41.580279 containerd[1734]: 2025-05-15 12:28:41.563 [INFO][5582] cni-plugin/k8s.go 386: Populated endpoint ContainerID="3b5233077cbe73a610dfc4e28881a4e0d8ba920446a9ac5f0e3ac0168dba007f" Namespace="calico-system" Pod="calico-kube-controllers-5547fc8788-7mx2l" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--kube--controllers--5547fc8788--7mx2l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334.0.0--a--9b1bbdffc7-k8s-calico--kube--controllers--5547fc8788--7mx2l-eth0", GenerateName:"calico-kube-controllers-5547fc8788-", Namespace:"calico-system", SelfLink:"", UID:"f65466fd-ea14-4a19-aabd-5cdb3b868978", ResourceVersion:"958", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 28, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5547fc8788", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334.0.0-a-9b1bbdffc7", ContainerID:"", Pod:"calico-kube-controllers-5547fc8788-7mx2l", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.41.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calia33d8da3e73", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:28:41.580334 containerd[1734]: 2025-05-15 12:28:41.564 [INFO][5582] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.41.129/32] ContainerID="3b5233077cbe73a610dfc4e28881a4e0d8ba920446a9ac5f0e3ac0168dba007f" Namespace="calico-system" Pod="calico-kube-controllers-5547fc8788-7mx2l" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--kube--controllers--5547fc8788--7mx2l-eth0" May 15 12:28:41.580334 containerd[1734]: 2025-05-15 12:28:41.564 [INFO][5582] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia33d8da3e73 ContainerID="3b5233077cbe73a610dfc4e28881a4e0d8ba920446a9ac5f0e3ac0168dba007f" Namespace="calico-system" Pod="calico-kube-controllers-5547fc8788-7mx2l" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--kube--controllers--5547fc8788--7mx2l-eth0" May 15 12:28:41.580334 containerd[1734]: 2025-05-15 12:28:41.567 [INFO][5582] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="3b5233077cbe73a610dfc4e28881a4e0d8ba920446a9ac5f0e3ac0168dba007f" Namespace="calico-system" Pod="calico-kube-controllers-5547fc8788-7mx2l" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--kube--controllers--5547fc8788--7mx2l-eth0" May 15 12:28:41.580394 containerd[1734]: 2025-05-15 12:28:41.567 [INFO][5582] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="3b5233077cbe73a610dfc4e28881a4e0d8ba920446a9ac5f0e3ac0168dba007f" Namespace="calico-system" Pod="calico-kube-controllers-5547fc8788-7mx2l" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--kube--controllers--5547fc8788--7mx2l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334.0.0--a--9b1bbdffc7-k8s-calico--kube--controllers--5547fc8788--7mx2l-eth0", GenerateName:"calico-kube-controllers-5547fc8788-", Namespace:"calico-system", SelfLink:"", UID:"f65466fd-ea14-4a19-aabd-5cdb3b868978", ResourceVersion:"958", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 28, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5547fc8788", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334.0.0-a-9b1bbdffc7", ContainerID:"3b5233077cbe73a610dfc4e28881a4e0d8ba920446a9ac5f0e3ac0168dba007f", Pod:"calico-kube-controllers-5547fc8788-7mx2l", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.41.129/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calia33d8da3e73", MAC:"02:d6:72:e1:fa:42", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:28:41.580440 containerd[1734]: 2025-05-15 12:28:41.576 [INFO][5582] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="3b5233077cbe73a610dfc4e28881a4e0d8ba920446a9ac5f0e3ac0168dba007f" Namespace="calico-system" Pod="calico-kube-controllers-5547fc8788-7mx2l" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--kube--controllers--5547fc8788--7mx2l-eth0" May 15 12:28:41.628620 containerd[1734]: time="2025-05-15T12:28:41.628490933Z" level=info msg="RemovePodSandbox \"586431069cbf84e50bf569b8f9a1266e7bd3d1cec927c1c2a0ffa0e1cd2c2278\" returns successfully" May 15 12:28:41.628885 containerd[1734]: time="2025-05-15T12:28:41.628864833Z" level=info msg="StopPodSandbox for \"fb75360b9e2c61758d88df4c05fa45696cc612c10de6005a4571ae0eb5905bfd\"" May 15 12:28:41.629072 containerd[1734]: time="2025-05-15T12:28:41.629052177Z" level=info msg="TearDown network for sandbox \"fb75360b9e2c61758d88df4c05fa45696cc612c10de6005a4571ae0eb5905bfd\" successfully" May 15 12:28:41.629105 containerd[1734]: time="2025-05-15T12:28:41.629079847Z" level=info msg="StopPodSandbox for \"fb75360b9e2c61758d88df4c05fa45696cc612c10de6005a4571ae0eb5905bfd\" returns successfully" May 15 12:28:41.629490 containerd[1734]: time="2025-05-15T12:28:41.629395970Z" level=info msg="RemovePodSandbox for \"fb75360b9e2c61758d88df4c05fa45696cc612c10de6005a4571ae0eb5905bfd\"" May 15 12:28:41.629490 containerd[1734]: time="2025-05-15T12:28:41.629454480Z" level=info msg="Forcibly stopping sandbox \"fb75360b9e2c61758d88df4c05fa45696cc612c10de6005a4571ae0eb5905bfd\"" May 15 12:28:41.629565 containerd[1734]: time="2025-05-15T12:28:41.629532864Z" level=info msg="TearDown network for sandbox 
\"fb75360b9e2c61758d88df4c05fa45696cc612c10de6005a4571ae0eb5905bfd\" successfully" May 15 12:28:41.630423 containerd[1734]: time="2025-05-15T12:28:41.630403694Z" level=info msg="Ensure that sandbox fb75360b9e2c61758d88df4c05fa45696cc612c10de6005a4571ae0eb5905bfd in task-service has been cleanup successfully" May 15 12:28:41.867218 containerd[1734]: time="2025-05-15T12:28:41.867127601Z" level=info msg="RemovePodSandbox \"fb75360b9e2c61758d88df4c05fa45696cc612c10de6005a4571ae0eb5905bfd\" returns successfully" May 15 12:28:41.992125 sshd[5578]: Accepted publickey for core from 10.200.16.10 port 45666 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk May 15 12:28:41.993039 sshd-session[5578]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 12:28:41.996902 systemd-logind[1702]: New session 12 of user core. May 15 12:28:42.001315 systemd[1]: Started session-12.scope - Session 12 of User core. May 15 12:28:42.177411 containerd[1734]: time="2025-05-15T12:28:42.177379677Z" level=info msg="connecting to shim 3b5233077cbe73a610dfc4e28881a4e0d8ba920446a9ac5f0e3ac0168dba007f" address="unix:///run/containerd/s/e14c98647b9723ca25ee59bb6a2b3153607f528c3dc7cdb998d5e212f3d6c401" namespace=k8s.io protocol=ttrpc version=3 May 15 12:28:42.199283 systemd[1]: Started cri-containerd-3b5233077cbe73a610dfc4e28881a4e0d8ba920446a9ac5f0e3ac0168dba007f.scope - libcontainer container 3b5233077cbe73a610dfc4e28881a4e0d8ba920446a9ac5f0e3ac0168dba007f. 
May 15 12:28:42.232741 containerd[1734]: time="2025-05-15T12:28:42.232719436Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5547fc8788-7mx2l,Uid:f65466fd-ea14-4a19-aabd-5cdb3b868978,Namespace:calico-system,Attempt:0,} returns sandbox id \"3b5233077cbe73a610dfc4e28881a4e0d8ba920446a9ac5f0e3ac0168dba007f\""
May 15 12:28:42.234139 containerd[1734]: time="2025-05-15T12:28:42.233835912Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\""
May 15 12:28:42.432370 containerd[1734]: time="2025-05-15T12:28:42.432341526Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-pkr4m,Uid:baa3369f-3b5a-4c16-a95f-d55a24190750,Namespace:kube-system,Attempt:0,}"
May 15 12:28:42.432573 containerd[1734]: time="2025-05-15T12:28:42.432345995Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-s4cpx,Uid:6ec4fd21-5481-4869-8926-5a6b5b6153c9,Namespace:kube-system,Attempt:0,}"
May 15 12:28:42.485947 sshd[5613]: Connection closed by 10.200.16.10 port 45666
May 15 12:28:42.486335 sshd-session[5578]: pam_unix(sshd:session): session closed for user core
May 15 12:28:42.488798 systemd[1]: sshd@9-10.200.8.35:22-10.200.16.10:45666.service: Deactivated successfully.
May 15 12:28:42.490259 systemd[1]: session-12.scope: Deactivated successfully.
May 15 12:28:42.490840 systemd-logind[1702]: Session 12 logged out. Waiting for processes to exit.
May 15 12:28:42.491902 systemd-logind[1702]: Removed session 12.
May 15 12:28:42.588240 systemd-networkd[1361]: cali7175ecdbc97: Link UP May 15 12:28:42.588861 systemd-networkd[1361]: cali7175ecdbc97: Gained carrier May 15 12:28:42.600549 containerd[1734]: 2025-05-15 12:28:42.538 [INFO][5671] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4334.0.0--a--9b1bbdffc7-k8s-coredns--7db6d8ff4d--pkr4m-eth0 coredns-7db6d8ff4d- kube-system baa3369f-3b5a-4c16-a95f-d55a24190750 820 0 2025-05-15 12:26:52 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4334.0.0-a-9b1bbdffc7 coredns-7db6d8ff4d-pkr4m eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali7175ecdbc97 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="9a9d1595de8ab76d96881d0e59a97b548ad133480bb590adbaac464a9639ea9f" Namespace="kube-system" Pod="coredns-7db6d8ff4d-pkr4m" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-coredns--7db6d8ff4d--pkr4m-" May 15 12:28:42.600549 containerd[1734]: 2025-05-15 12:28:42.539 [INFO][5671] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="9a9d1595de8ab76d96881d0e59a97b548ad133480bb590adbaac464a9639ea9f" Namespace="kube-system" Pod="coredns-7db6d8ff4d-pkr4m" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-coredns--7db6d8ff4d--pkr4m-eth0" May 15 12:28:42.600549 containerd[1734]: 2025-05-15 12:28:42.557 [INFO][5684] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9a9d1595de8ab76d96881d0e59a97b548ad133480bb590adbaac464a9639ea9f" HandleID="k8s-pod-network.9a9d1595de8ab76d96881d0e59a97b548ad133480bb590adbaac464a9639ea9f" Workload="ci--4334.0.0--a--9b1bbdffc7-k8s-coredns--7db6d8ff4d--pkr4m-eth0" May 15 12:28:42.600870 containerd[1734]: 2025-05-15 12:28:42.563 [INFO][5684] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="9a9d1595de8ab76d96881d0e59a97b548ad133480bb590adbaac464a9639ea9f" HandleID="k8s-pod-network.9a9d1595de8ab76d96881d0e59a97b548ad133480bb590adbaac464a9639ea9f" Workload="ci--4334.0.0--a--9b1bbdffc7-k8s-coredns--7db6d8ff4d--pkr4m-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000295d30), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4334.0.0-a-9b1bbdffc7", "pod":"coredns-7db6d8ff4d-pkr4m", "timestamp":"2025-05-15 12:28:42.557911887 +0000 UTC"}, Hostname:"ci-4334.0.0-a-9b1bbdffc7", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 15 12:28:42.600870 containerd[1734]: 2025-05-15 12:28:42.563 [INFO][5684] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 15 12:28:42.600870 containerd[1734]: 2025-05-15 12:28:42.563 [INFO][5684] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 15 12:28:42.600870 containerd[1734]: 2025-05-15 12:28:42.563 [INFO][5684] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4334.0.0-a-9b1bbdffc7' May 15 12:28:42.600870 containerd[1734]: 2025-05-15 12:28:42.564 [INFO][5684] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.9a9d1595de8ab76d96881d0e59a97b548ad133480bb590adbaac464a9639ea9f" host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:42.600870 containerd[1734]: 2025-05-15 12:28:42.567 [INFO][5684] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:42.600870 containerd[1734]: 2025-05-15 12:28:42.570 [INFO][5684] ipam/ipam.go 489: Trying affinity for 192.168.41.128/26 host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:42.600870 containerd[1734]: 2025-05-15 12:28:42.571 [INFO][5684] ipam/ipam.go 155: Attempting to load block cidr=192.168.41.128/26 host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:42.600870 containerd[1734]: 2025-05-15 12:28:42.573 [INFO][5684] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.41.128/26 host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:42.601069 containerd[1734]: 2025-05-15 12:28:42.573 [INFO][5684] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.41.128/26 handle="k8s-pod-network.9a9d1595de8ab76d96881d0e59a97b548ad133480bb590adbaac464a9639ea9f" host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:42.601069 containerd[1734]: 2025-05-15 12:28:42.574 [INFO][5684] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.9a9d1595de8ab76d96881d0e59a97b548ad133480bb590adbaac464a9639ea9f May 15 12:28:42.601069 containerd[1734]: 2025-05-15 12:28:42.578 [INFO][5684] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.41.128/26 handle="k8s-pod-network.9a9d1595de8ab76d96881d0e59a97b548ad133480bb590adbaac464a9639ea9f" host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:42.601069 containerd[1734]: 2025-05-15 12:28:42.585 [INFO][5684] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.41.130/26] block=192.168.41.128/26 handle="k8s-pod-network.9a9d1595de8ab76d96881d0e59a97b548ad133480bb590adbaac464a9639ea9f" host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:42.601069 containerd[1734]: 2025-05-15 12:28:42.585 [INFO][5684] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.41.130/26] handle="k8s-pod-network.9a9d1595de8ab76d96881d0e59a97b548ad133480bb590adbaac464a9639ea9f" host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:42.601069 containerd[1734]: 2025-05-15 12:28:42.585 [INFO][5684] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 15 12:28:42.601069 containerd[1734]: 2025-05-15 12:28:42.585 [INFO][5684] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.41.130/26] IPv6=[] ContainerID="9a9d1595de8ab76d96881d0e59a97b548ad133480bb590adbaac464a9639ea9f" HandleID="k8s-pod-network.9a9d1595de8ab76d96881d0e59a97b548ad133480bb590adbaac464a9639ea9f" Workload="ci--4334.0.0--a--9b1bbdffc7-k8s-coredns--7db6d8ff4d--pkr4m-eth0" May 15 12:28:42.601686 containerd[1734]: 2025-05-15 12:28:42.586 [INFO][5671] cni-plugin/k8s.go 386: Populated endpoint ContainerID="9a9d1595de8ab76d96881d0e59a97b548ad133480bb590adbaac464a9639ea9f" Namespace="kube-system" Pod="coredns-7db6d8ff4d-pkr4m" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-coredns--7db6d8ff4d--pkr4m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334.0.0--a--9b1bbdffc7-k8s-coredns--7db6d8ff4d--pkr4m-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"baa3369f-3b5a-4c16-a95f-d55a24190750", ResourceVersion:"820", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 26, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334.0.0-a-9b1bbdffc7", ContainerID:"", Pod:"coredns-7db6d8ff4d-pkr4m", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.41.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali7175ecdbc97", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:28:42.601686 containerd[1734]: 2025-05-15 12:28:42.586 [INFO][5671] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.41.130/32] ContainerID="9a9d1595de8ab76d96881d0e59a97b548ad133480bb590adbaac464a9639ea9f" Namespace="kube-system" Pod="coredns-7db6d8ff4d-pkr4m" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-coredns--7db6d8ff4d--pkr4m-eth0" May 15 12:28:42.601686 containerd[1734]: 2025-05-15 12:28:42.586 [INFO][5671] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7175ecdbc97 ContainerID="9a9d1595de8ab76d96881d0e59a97b548ad133480bb590adbaac464a9639ea9f" Namespace="kube-system" Pod="coredns-7db6d8ff4d-pkr4m" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-coredns--7db6d8ff4d--pkr4m-eth0" May 15 12:28:42.601686 containerd[1734]: 2025-05-15 12:28:42.588 [INFO][5671] cni-plugin/dataplane_linux.go 508: Disabling IPv4 
forwarding ContainerID="9a9d1595de8ab76d96881d0e59a97b548ad133480bb590adbaac464a9639ea9f" Namespace="kube-system" Pod="coredns-7db6d8ff4d-pkr4m" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-coredns--7db6d8ff4d--pkr4m-eth0" May 15 12:28:42.601686 containerd[1734]: 2025-05-15 12:28:42.589 [INFO][5671] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="9a9d1595de8ab76d96881d0e59a97b548ad133480bb590adbaac464a9639ea9f" Namespace="kube-system" Pod="coredns-7db6d8ff4d-pkr4m" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-coredns--7db6d8ff4d--pkr4m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334.0.0--a--9b1bbdffc7-k8s-coredns--7db6d8ff4d--pkr4m-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"baa3369f-3b5a-4c16-a95f-d55a24190750", ResourceVersion:"820", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 26, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334.0.0-a-9b1bbdffc7", ContainerID:"9a9d1595de8ab76d96881d0e59a97b548ad133480bb590adbaac464a9639ea9f", Pod:"coredns-7db6d8ff4d-pkr4m", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.41.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali7175ecdbc97", MAC:"da:e6:5a:c5:75:32", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:28:42.601686 containerd[1734]: 2025-05-15 12:28:42.598 [INFO][5671] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="9a9d1595de8ab76d96881d0e59a97b548ad133480bb590adbaac464a9639ea9f" Namespace="kube-system" Pod="coredns-7db6d8ff4d-pkr4m" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-coredns--7db6d8ff4d--pkr4m-eth0" May 15 12:28:42.723785 systemd-networkd[1361]: cali029b4dc67bf: Link UP May 15 12:28:42.723998 systemd-networkd[1361]: cali029b4dc67bf: Gained carrier May 15 12:28:42.737498 containerd[1734]: 2025-05-15 12:28:42.643 [INFO][5703] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4334.0.0--a--9b1bbdffc7-k8s-coredns--7db6d8ff4d--s4cpx-eth0 coredns-7db6d8ff4d- kube-system 6ec4fd21-5481-4869-8926-5a6b5b6153c9 811 0 2025-05-15 12:26:52 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4334.0.0-a-9b1bbdffc7 coredns-7db6d8ff4d-s4cpx eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali029b4dc67bf [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="42f659d551572d8f55c22425ae54a7e58d86e4db4390c67df9af668f15550d3a" Namespace="kube-system" Pod="coredns-7db6d8ff4d-s4cpx" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-coredns--7db6d8ff4d--s4cpx-" May 15 12:28:42.737498 containerd[1734]: 
2025-05-15 12:28:42.643 [INFO][5703] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="42f659d551572d8f55c22425ae54a7e58d86e4db4390c67df9af668f15550d3a" Namespace="kube-system" Pod="coredns-7db6d8ff4d-s4cpx" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-coredns--7db6d8ff4d--s4cpx-eth0" May 15 12:28:42.737498 containerd[1734]: 2025-05-15 12:28:42.660 [INFO][5715] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="42f659d551572d8f55c22425ae54a7e58d86e4db4390c67df9af668f15550d3a" HandleID="k8s-pod-network.42f659d551572d8f55c22425ae54a7e58d86e4db4390c67df9af668f15550d3a" Workload="ci--4334.0.0--a--9b1bbdffc7-k8s-coredns--7db6d8ff4d--s4cpx-eth0" May 15 12:28:42.737498 containerd[1734]: 2025-05-15 12:28:42.666 [INFO][5715] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="42f659d551572d8f55c22425ae54a7e58d86e4db4390c67df9af668f15550d3a" HandleID="k8s-pod-network.42f659d551572d8f55c22425ae54a7e58d86e4db4390c67df9af668f15550d3a" Workload="ci--4334.0.0--a--9b1bbdffc7-k8s-coredns--7db6d8ff4d--s4cpx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00031b430), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4334.0.0-a-9b1bbdffc7", "pod":"coredns-7db6d8ff4d-s4cpx", "timestamp":"2025-05-15 12:28:42.660276531 +0000 UTC"}, Hostname:"ci-4334.0.0-a-9b1bbdffc7", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 15 12:28:42.737498 containerd[1734]: 2025-05-15 12:28:42.666 [INFO][5715] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 15 12:28:42.737498 containerd[1734]: 2025-05-15 12:28:42.666 [INFO][5715] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 15 12:28:42.737498 containerd[1734]: 2025-05-15 12:28:42.666 [INFO][5715] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4334.0.0-a-9b1bbdffc7' May 15 12:28:42.737498 containerd[1734]: 2025-05-15 12:28:42.667 [INFO][5715] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.42f659d551572d8f55c22425ae54a7e58d86e4db4390c67df9af668f15550d3a" host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:42.737498 containerd[1734]: 2025-05-15 12:28:42.669 [INFO][5715] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:42.737498 containerd[1734]: 2025-05-15 12:28:42.672 [INFO][5715] ipam/ipam.go 489: Trying affinity for 192.168.41.128/26 host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:42.737498 containerd[1734]: 2025-05-15 12:28:42.673 [INFO][5715] ipam/ipam.go 155: Attempting to load block cidr=192.168.41.128/26 host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:42.737498 containerd[1734]: 2025-05-15 12:28:42.674 [INFO][5715] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.41.128/26 host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:42.737498 containerd[1734]: 2025-05-15 12:28:42.674 [INFO][5715] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.41.128/26 handle="k8s-pod-network.42f659d551572d8f55c22425ae54a7e58d86e4db4390c67df9af668f15550d3a" host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:42.737498 containerd[1734]: 2025-05-15 12:28:42.675 [INFO][5715] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.42f659d551572d8f55c22425ae54a7e58d86e4db4390c67df9af668f15550d3a May 15 12:28:42.737498 containerd[1734]: 2025-05-15 12:28:42.716 [INFO][5715] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.41.128/26 handle="k8s-pod-network.42f659d551572d8f55c22425ae54a7e58d86e4db4390c67df9af668f15550d3a" host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:42.737498 containerd[1734]: 2025-05-15 12:28:42.720 [INFO][5715] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.41.131/26] block=192.168.41.128/26 handle="k8s-pod-network.42f659d551572d8f55c22425ae54a7e58d86e4db4390c67df9af668f15550d3a" host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:42.737498 containerd[1734]: 2025-05-15 12:28:42.720 [INFO][5715] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.41.131/26] handle="k8s-pod-network.42f659d551572d8f55c22425ae54a7e58d86e4db4390c67df9af668f15550d3a" host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:42.737498 containerd[1734]: 2025-05-15 12:28:42.720 [INFO][5715] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 15 12:28:42.737498 containerd[1734]: 2025-05-15 12:28:42.720 [INFO][5715] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.41.131/26] IPv6=[] ContainerID="42f659d551572d8f55c22425ae54a7e58d86e4db4390c67df9af668f15550d3a" HandleID="k8s-pod-network.42f659d551572d8f55c22425ae54a7e58d86e4db4390c67df9af668f15550d3a" Workload="ci--4334.0.0--a--9b1bbdffc7-k8s-coredns--7db6d8ff4d--s4cpx-eth0" May 15 12:28:42.738878 containerd[1734]: 2025-05-15 12:28:42.721 [INFO][5703] cni-plugin/k8s.go 386: Populated endpoint ContainerID="42f659d551572d8f55c22425ae54a7e58d86e4db4390c67df9af668f15550d3a" Namespace="kube-system" Pod="coredns-7db6d8ff4d-s4cpx" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-coredns--7db6d8ff4d--s4cpx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334.0.0--a--9b1bbdffc7-k8s-coredns--7db6d8ff4d--s4cpx-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"6ec4fd21-5481-4869-8926-5a6b5b6153c9", ResourceVersion:"811", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 26, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334.0.0-a-9b1bbdffc7", ContainerID:"", Pod:"coredns-7db6d8ff4d-s4cpx", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.41.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali029b4dc67bf", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:28:42.738878 containerd[1734]: 2025-05-15 12:28:42.722 [INFO][5703] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.41.131/32] ContainerID="42f659d551572d8f55c22425ae54a7e58d86e4db4390c67df9af668f15550d3a" Namespace="kube-system" Pod="coredns-7db6d8ff4d-s4cpx" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-coredns--7db6d8ff4d--s4cpx-eth0" May 15 12:28:42.738878 containerd[1734]: 2025-05-15 12:28:42.722 [INFO][5703] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali029b4dc67bf ContainerID="42f659d551572d8f55c22425ae54a7e58d86e4db4390c67df9af668f15550d3a" Namespace="kube-system" Pod="coredns-7db6d8ff4d-s4cpx" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-coredns--7db6d8ff4d--s4cpx-eth0" May 15 12:28:42.738878 containerd[1734]: 2025-05-15 12:28:42.724 [INFO][5703] cni-plugin/dataplane_linux.go 508: Disabling IPv4 
forwarding ContainerID="42f659d551572d8f55c22425ae54a7e58d86e4db4390c67df9af668f15550d3a" Namespace="kube-system" Pod="coredns-7db6d8ff4d-s4cpx" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-coredns--7db6d8ff4d--s4cpx-eth0" May 15 12:28:42.738878 containerd[1734]: 2025-05-15 12:28:42.725 [INFO][5703] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="42f659d551572d8f55c22425ae54a7e58d86e4db4390c67df9af668f15550d3a" Namespace="kube-system" Pod="coredns-7db6d8ff4d-s4cpx" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-coredns--7db6d8ff4d--s4cpx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334.0.0--a--9b1bbdffc7-k8s-coredns--7db6d8ff4d--s4cpx-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"6ec4fd21-5481-4869-8926-5a6b5b6153c9", ResourceVersion:"811", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 26, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334.0.0-a-9b1bbdffc7", ContainerID:"42f659d551572d8f55c22425ae54a7e58d86e4db4390c67df9af668f15550d3a", Pod:"coredns-7db6d8ff4d-s4cpx", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.41.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali029b4dc67bf", MAC:"5a:06:cc:8d:ba:db", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:28:42.738878 containerd[1734]: 2025-05-15 12:28:42.735 [INFO][5703] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="42f659d551572d8f55c22425ae54a7e58d86e4db4390c67df9af668f15550d3a" Namespace="kube-system" Pod="coredns-7db6d8ff4d-s4cpx" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-coredns--7db6d8ff4d--s4cpx-eth0" May 15 12:28:43.183812 containerd[1734]: time="2025-05-15T12:28:43.183773475Z" level=info msg="connecting to shim 9a9d1595de8ab76d96881d0e59a97b548ad133480bb590adbaac464a9639ea9f" address="unix:///run/containerd/s/9a0a6703e292d8b5423d3ef730339e699085234306ef4099489cd2ecdde717ac" namespace=k8s.io protocol=ttrpc version=3 May 15 12:28:43.202328 systemd[1]: Started cri-containerd-9a9d1595de8ab76d96881d0e59a97b548ad133480bb590adbaac464a9639ea9f.scope - libcontainer container 9a9d1595de8ab76d96881d0e59a97b548ad133480bb590adbaac464a9639ea9f. 
May 15 12:28:43.320003 containerd[1734]: time="2025-05-15T12:28:43.319977821Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-pkr4m,Uid:baa3369f-3b5a-4c16-a95f-d55a24190750,Namespace:kube-system,Attempt:0,} returns sandbox id \"9a9d1595de8ab76d96881d0e59a97b548ad133480bb590adbaac464a9639ea9f\""
May 15 12:28:43.322341 containerd[1734]: time="2025-05-15T12:28:43.322317897Z" level=info msg="CreateContainer within sandbox \"9a9d1595de8ab76d96881d0e59a97b548ad133480bb590adbaac464a9639ea9f\" for container &ContainerMetadata{Name:coredns,Attempt:0,}"
May 15 12:28:43.400335 systemd-networkd[1361]: calia33d8da3e73: Gained IPv6LL
May 15 12:28:43.432291 containerd[1734]: time="2025-05-15T12:28:43.432270324Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-84478f496d-j8zht,Uid:5526c02b-5980-4359-8781-c8829963a4f3,Namespace:calico-apiserver,Attempt:0,}"
May 15 12:28:43.485569 containerd[1734]: time="2025-05-15T12:28:43.485335273Z" level=info msg="connecting to shim 42f659d551572d8f55c22425ae54a7e58d86e4db4390c67df9af668f15550d3a" address="unix:///run/containerd/s/29ff19c5e730b16952fbcac017fb64ed25684db4549a50934039f7a7260f66cb" namespace=k8s.io protocol=ttrpc version=3
May 15 12:28:43.505462 systemd[1]: Started cri-containerd-42f659d551572d8f55c22425ae54a7e58d86e4db4390c67df9af668f15550d3a.scope - libcontainer container 42f659d551572d8f55c22425ae54a7e58d86e4db4390c67df9af668f15550d3a.
May 15 12:28:43.779026 containerd[1734]: time="2025-05-15T12:28:43.778942204Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-s4cpx,Uid:6ec4fd21-5481-4869-8926-5a6b5b6153c9,Namespace:kube-system,Attempt:0,} returns sandbox id \"42f659d551572d8f55c22425ae54a7e58d86e4db4390c67df9af668f15550d3a\"" May 15 12:28:43.782239 containerd[1734]: time="2025-05-15T12:28:43.782162501Z" level=info msg="CreateContainer within sandbox \"42f659d551572d8f55c22425ae54a7e58d86e4db4390c67df9af668f15550d3a\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 15 12:28:43.797865 systemd-networkd[1361]: cali02c7a9c392f: Link UP May 15 12:28:43.798977 systemd-networkd[1361]: cali02c7a9c392f: Gained carrier May 15 12:28:43.826821 containerd[1734]: time="2025-05-15T12:28:43.826802167Z" level=info msg="Container eb58a4b229d7009c3ac5ef2a88d0993dd58e636e0bfd431ab1bb05cfe6b5fdfa: CDI devices from CRI Config.CDIDevices: []" May 15 12:28:43.848266 systemd-networkd[1361]: cali029b4dc67bf: Gained IPv6LL May 15 12:28:43.921358 containerd[1734]: 2025-05-15 12:28:43.747 [INFO][5829] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--84478f496d--j8zht-eth0 calico-apiserver-84478f496d- calico-apiserver 5526c02b-5980-4359-8781-c8829963a4f3 814 0 2025-05-15 12:27:01 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:84478f496d projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4334.0.0-a-9b1bbdffc7 calico-apiserver-84478f496d-j8zht eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali02c7a9c392f [] []}} ContainerID="71ad1db5bdb98ad5fd0d6251136e1ad0f839dfb27698dc0f2bbf41bdbf5b2e12" Namespace="calico-apiserver" Pod="calico-apiserver-84478f496d-j8zht" 
WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--84478f496d--j8zht-" May 15 12:28:43.921358 containerd[1734]: 2025-05-15 12:28:43.747 [INFO][5829] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="71ad1db5bdb98ad5fd0d6251136e1ad0f839dfb27698dc0f2bbf41bdbf5b2e12" Namespace="calico-apiserver" Pod="calico-apiserver-84478f496d-j8zht" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--84478f496d--j8zht-eth0" May 15 12:28:43.921358 containerd[1734]: 2025-05-15 12:28:43.767 [INFO][5842] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="71ad1db5bdb98ad5fd0d6251136e1ad0f839dfb27698dc0f2bbf41bdbf5b2e12" HandleID="k8s-pod-network.71ad1db5bdb98ad5fd0d6251136e1ad0f839dfb27698dc0f2bbf41bdbf5b2e12" Workload="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--84478f496d--j8zht-eth0" May 15 12:28:43.921358 containerd[1734]: 2025-05-15 12:28:43.772 [INFO][5842] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="71ad1db5bdb98ad5fd0d6251136e1ad0f839dfb27698dc0f2bbf41bdbf5b2e12" HandleID="k8s-pod-network.71ad1db5bdb98ad5fd0d6251136e1ad0f839dfb27698dc0f2bbf41bdbf5b2e12" Workload="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--84478f496d--j8zht-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000290b20), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4334.0.0-a-9b1bbdffc7", "pod":"calico-apiserver-84478f496d-j8zht", "timestamp":"2025-05-15 12:28:43.767183278 +0000 UTC"}, Hostname:"ci-4334.0.0-a-9b1bbdffc7", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 15 12:28:43.921358 containerd[1734]: 2025-05-15 12:28:43.772 [INFO][5842] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
May 15 12:28:43.921358 containerd[1734]: 2025-05-15 12:28:43.772 [INFO][5842] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 15 12:28:43.921358 containerd[1734]: 2025-05-15 12:28:43.772 [INFO][5842] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4334.0.0-a-9b1bbdffc7' May 15 12:28:43.921358 containerd[1734]: 2025-05-15 12:28:43.773 [INFO][5842] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.71ad1db5bdb98ad5fd0d6251136e1ad0f839dfb27698dc0f2bbf41bdbf5b2e12" host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:43.921358 containerd[1734]: 2025-05-15 12:28:43.777 [INFO][5842] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:43.921358 containerd[1734]: 2025-05-15 12:28:43.780 [INFO][5842] ipam/ipam.go 489: Trying affinity for 192.168.41.128/26 host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:43.921358 containerd[1734]: 2025-05-15 12:28:43.782 [INFO][5842] ipam/ipam.go 155: Attempting to load block cidr=192.168.41.128/26 host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:43.921358 containerd[1734]: 2025-05-15 12:28:43.784 [INFO][5842] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.41.128/26 host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:43.921358 containerd[1734]: 2025-05-15 12:28:43.784 [INFO][5842] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.41.128/26 handle="k8s-pod-network.71ad1db5bdb98ad5fd0d6251136e1ad0f839dfb27698dc0f2bbf41bdbf5b2e12" host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:43.921358 containerd[1734]: 2025-05-15 12:28:43.785 [INFO][5842] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.71ad1db5bdb98ad5fd0d6251136e1ad0f839dfb27698dc0f2bbf41bdbf5b2e12 May 15 12:28:43.921358 containerd[1734]: 2025-05-15 12:28:43.790 [INFO][5842] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.41.128/26 handle="k8s-pod-network.71ad1db5bdb98ad5fd0d6251136e1ad0f839dfb27698dc0f2bbf41bdbf5b2e12" 
host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:43.921358 containerd[1734]: 2025-05-15 12:28:43.794 [INFO][5842] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.41.132/26] block=192.168.41.128/26 handle="k8s-pod-network.71ad1db5bdb98ad5fd0d6251136e1ad0f839dfb27698dc0f2bbf41bdbf5b2e12" host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:43.921358 containerd[1734]: 2025-05-15 12:28:43.794 [INFO][5842] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.41.132/26] handle="k8s-pod-network.71ad1db5bdb98ad5fd0d6251136e1ad0f839dfb27698dc0f2bbf41bdbf5b2e12" host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:43.921358 containerd[1734]: 2025-05-15 12:28:43.794 [INFO][5842] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 15 12:28:43.921358 containerd[1734]: 2025-05-15 12:28:43.794 [INFO][5842] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.41.132/26] IPv6=[] ContainerID="71ad1db5bdb98ad5fd0d6251136e1ad0f839dfb27698dc0f2bbf41bdbf5b2e12" HandleID="k8s-pod-network.71ad1db5bdb98ad5fd0d6251136e1ad0f839dfb27698dc0f2bbf41bdbf5b2e12" Workload="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--84478f496d--j8zht-eth0" May 15 12:28:43.922141 containerd[1734]: 2025-05-15 12:28:43.796 [INFO][5829] cni-plugin/k8s.go 386: Populated endpoint ContainerID="71ad1db5bdb98ad5fd0d6251136e1ad0f839dfb27698dc0f2bbf41bdbf5b2e12" Namespace="calico-apiserver" Pod="calico-apiserver-84478f496d-j8zht" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--84478f496d--j8zht-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--84478f496d--j8zht-eth0", GenerateName:"calico-apiserver-84478f496d-", Namespace:"calico-apiserver", SelfLink:"", UID:"5526c02b-5980-4359-8781-c8829963a4f3", ResourceVersion:"814", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 27, 1, 0, time.Local), DeletionTimestamp:, 
DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"84478f496d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334.0.0-a-9b1bbdffc7", ContainerID:"", Pod:"calico-apiserver-84478f496d-j8zht", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.41.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali02c7a9c392f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:28:43.922141 containerd[1734]: 2025-05-15 12:28:43.796 [INFO][5829] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.41.132/32] ContainerID="71ad1db5bdb98ad5fd0d6251136e1ad0f839dfb27698dc0f2bbf41bdbf5b2e12" Namespace="calico-apiserver" Pod="calico-apiserver-84478f496d-j8zht" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--84478f496d--j8zht-eth0" May 15 12:28:43.922141 containerd[1734]: 2025-05-15 12:28:43.796 [INFO][5829] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali02c7a9c392f ContainerID="71ad1db5bdb98ad5fd0d6251136e1ad0f839dfb27698dc0f2bbf41bdbf5b2e12" Namespace="calico-apiserver" Pod="calico-apiserver-84478f496d-j8zht" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--84478f496d--j8zht-eth0" May 15 12:28:43.922141 containerd[1734]: 2025-05-15 12:28:43.798 [INFO][5829] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="71ad1db5bdb98ad5fd0d6251136e1ad0f839dfb27698dc0f2bbf41bdbf5b2e12" Namespace="calico-apiserver" Pod="calico-apiserver-84478f496d-j8zht" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--84478f496d--j8zht-eth0" May 15 12:28:43.922141 containerd[1734]: 2025-05-15 12:28:43.799 [INFO][5829] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="71ad1db5bdb98ad5fd0d6251136e1ad0f839dfb27698dc0f2bbf41bdbf5b2e12" Namespace="calico-apiserver" Pod="calico-apiserver-84478f496d-j8zht" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--84478f496d--j8zht-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--84478f496d--j8zht-eth0", GenerateName:"calico-apiserver-84478f496d-", Namespace:"calico-apiserver", SelfLink:"", UID:"5526c02b-5980-4359-8781-c8829963a4f3", ResourceVersion:"814", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 27, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"84478f496d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334.0.0-a-9b1bbdffc7", ContainerID:"71ad1db5bdb98ad5fd0d6251136e1ad0f839dfb27698dc0f2bbf41bdbf5b2e12", Pod:"calico-apiserver-84478f496d-j8zht", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.41.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali02c7a9c392f", MAC:"e2:d6:8b:67:3e:ab", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:28:43.922141 containerd[1734]: 2025-05-15 12:28:43.919 [INFO][5829] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="71ad1db5bdb98ad5fd0d6251136e1ad0f839dfb27698dc0f2bbf41bdbf5b2e12" Namespace="calico-apiserver" Pod="calico-apiserver-84478f496d-j8zht" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--84478f496d--j8zht-eth0" May 15 12:28:44.170945 containerd[1734]: time="2025-05-15T12:28:44.170922906Z" level=info msg="CreateContainer within sandbox \"9a9d1595de8ab76d96881d0e59a97b548ad133480bb590adbaac464a9639ea9f\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"eb58a4b229d7009c3ac5ef2a88d0993dd58e636e0bfd431ab1bb05cfe6b5fdfa\"" May 15 12:28:44.171315 containerd[1734]: time="2025-05-15T12:28:44.171295259Z" level=info msg="StartContainer for \"eb58a4b229d7009c3ac5ef2a88d0993dd58e636e0bfd431ab1bb05cfe6b5fdfa\"" May 15 12:28:44.172265 containerd[1734]: time="2025-05-15T12:28:44.172228757Z" level=info msg="connecting to shim eb58a4b229d7009c3ac5ef2a88d0993dd58e636e0bfd431ab1bb05cfe6b5fdfa" address="unix:///run/containerd/s/9a0a6703e292d8b5423d3ef730339e699085234306ef4099489cd2ecdde717ac" protocol=ttrpc version=3 May 15 12:28:44.220927 containerd[1734]: time="2025-05-15T12:28:44.220879338Z" level=info msg="Container 4be3730f13554601ba9da639b988f30141a878ff4e5d25207eae48e6932389f3: CDI devices from CRI Config.CDIDevices: []" May 15 12:28:44.232345 systemd[1]: Started cri-containerd-eb58a4b229d7009c3ac5ef2a88d0993dd58e636e0bfd431ab1bb05cfe6b5fdfa.scope - libcontainer container eb58a4b229d7009c3ac5ef2a88d0993dd58e636e0bfd431ab1bb05cfe6b5fdfa. 
May 15 12:28:44.329992 containerd[1734]: time="2025-05-15T12:28:44.329969734Z" level=info msg="StartContainer for \"eb58a4b229d7009c3ac5ef2a88d0993dd58e636e0bfd431ab1bb05cfe6b5fdfa\" returns successfully" May 15 12:28:44.471709 containerd[1734]: time="2025-05-15T12:28:44.432227910Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b7f8fc4d6-nk2q9,Uid:1d54623e-7c9b-4f0a-8922-6bb47d7e26af,Namespace:calico-apiserver,Attempt:0,}" May 15 12:28:44.473805 containerd[1734]: time="2025-05-15T12:28:44.473781484Z" level=info msg="CreateContainer within sandbox \"42f659d551572d8f55c22425ae54a7e58d86e4db4390c67df9af668f15550d3a\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"4be3730f13554601ba9da639b988f30141a878ff4e5d25207eae48e6932389f3\"" May 15 12:28:44.474222 containerd[1734]: time="2025-05-15T12:28:44.474195736Z" level=info msg="StartContainer for \"4be3730f13554601ba9da639b988f30141a878ff4e5d25207eae48e6932389f3\"" May 15 12:28:44.475062 containerd[1734]: time="2025-05-15T12:28:44.475014128Z" level=info msg="connecting to shim 4be3730f13554601ba9da639b988f30141a878ff4e5d25207eae48e6932389f3" address="unix:///run/containerd/s/29ff19c5e730b16952fbcac017fb64ed25684db4549a50934039f7a7260f66cb" protocol=ttrpc version=3 May 15 12:28:44.495320 systemd[1]: Started cri-containerd-4be3730f13554601ba9da639b988f30141a878ff4e5d25207eae48e6932389f3.scope - libcontainer container 4be3730f13554601ba9da639b988f30141a878ff4e5d25207eae48e6932389f3. 
May 15 12:28:44.567964 containerd[1734]: time="2025-05-15T12:28:44.567905390Z" level=info msg="StartContainer for \"4be3730f13554601ba9da639b988f30141a878ff4e5d25207eae48e6932389f3\" returns successfully" May 15 12:28:44.616276 systemd-networkd[1361]: cali7175ecdbc97: Gained IPv6LL May 15 12:28:44.687006 containerd[1734]: time="2025-05-15T12:28:44.686961378Z" level=info msg="connecting to shim 71ad1db5bdb98ad5fd0d6251136e1ad0f839dfb27698dc0f2bbf41bdbf5b2e12" address="unix:///run/containerd/s/9ea6969a1b82b5458667673d3b57b90e34ae72430673e9cb65458b7eac0506e5" namespace=k8s.io protocol=ttrpc version=3 May 15 12:28:44.711349 systemd[1]: Started cri-containerd-71ad1db5bdb98ad5fd0d6251136e1ad0f839dfb27698dc0f2bbf41bdbf5b2e12.scope - libcontainer container 71ad1db5bdb98ad5fd0d6251136e1ad0f839dfb27698dc0f2bbf41bdbf5b2e12. May 15 12:28:44.779137 containerd[1734]: time="2025-05-15T12:28:44.779004143Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-84478f496d-j8zht,Uid:5526c02b-5980-4359-8781-c8829963a4f3,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"71ad1db5bdb98ad5fd0d6251136e1ad0f839dfb27698dc0f2bbf41bdbf5b2e12\"" May 15 12:28:44.823244 kubelet[3331]: I0515 12:28:44.823191 3331 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-s4cpx" podStartSLOduration=112.823076383 podStartE2EDuration="1m52.823076383s" podCreationTimestamp="2025-05-15 12:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-15 12:28:44.716006692 +0000 UTC m=+123.351340152" watchObservedRunningTime="2025-05-15 12:28:44.823076383 +0000 UTC m=+123.458409830" May 15 12:28:44.869870 kubelet[3331]: I0515 12:28:44.869454 3331 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-pkr4m" podStartSLOduration=112.86944166 podStartE2EDuration="1m52.86944166s" 
podCreationTimestamp="2025-05-15 12:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-15 12:28:44.869286712 +0000 UTC m=+123.504620153" watchObservedRunningTime="2025-05-15 12:28:44.86944166 +0000 UTC m=+123.504775103" May 15 12:28:45.121631 systemd-networkd[1361]: cali392e1acd4dd: Link UP May 15 12:28:45.122596 systemd-networkd[1361]: cali392e1acd4dd: Gained carrier May 15 12:28:45.129460 systemd-networkd[1361]: cali02c7a9c392f: Gained IPv6LL May 15 12:28:45.133823 containerd[1734]: 2025-05-15 12:28:44.823 [INFO][5973] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--nk2q9-eth0 calico-apiserver-b7f8fc4d6- calico-apiserver 1d54623e-7c9b-4f0a-8922-6bb47d7e26af 819 0 2025-05-15 12:27:00 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:b7f8fc4d6 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4334.0.0-a-9b1bbdffc7 calico-apiserver-b7f8fc4d6-nk2q9 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali392e1acd4dd [] []}} ContainerID="93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" Namespace="calico-apiserver" Pod="calico-apiserver-b7f8fc4d6-nk2q9" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--nk2q9-" May 15 12:28:45.133823 containerd[1734]: 2025-05-15 12:28:44.823 [INFO][5973] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" Namespace="calico-apiserver" Pod="calico-apiserver-b7f8fc4d6-nk2q9" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--nk2q9-eth0" May 15 12:28:45.133823 
containerd[1734]: 2025-05-15 12:28:44.980 [INFO][5996] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" HandleID="k8s-pod-network.93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" Workload="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--nk2q9-eth0" May 15 12:28:45.133823 containerd[1734]: 2025-05-15 12:28:45.026 [INFO][5996] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" HandleID="k8s-pod-network.93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" Workload="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--nk2q9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003133c0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4334.0.0-a-9b1bbdffc7", "pod":"calico-apiserver-b7f8fc4d6-nk2q9", "timestamp":"2025-05-15 12:28:44.979999139 +0000 UTC"}, Hostname:"ci-4334.0.0-a-9b1bbdffc7", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 15 12:28:45.133823 containerd[1734]: 2025-05-15 12:28:45.026 [INFO][5996] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 15 12:28:45.133823 containerd[1734]: 2025-05-15 12:28:45.026 [INFO][5996] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 15 12:28:45.133823 containerd[1734]: 2025-05-15 12:28:45.026 [INFO][5996] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4334.0.0-a-9b1bbdffc7' May 15 12:28:45.133823 containerd[1734]: 2025-05-15 12:28:45.028 [INFO][5996] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:45.133823 containerd[1734]: 2025-05-15 12:28:45.030 [INFO][5996] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:45.133823 containerd[1734]: 2025-05-15 12:28:45.033 [INFO][5996] ipam/ipam.go 489: Trying affinity for 192.168.41.128/26 host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:45.133823 containerd[1734]: 2025-05-15 12:28:45.034 [INFO][5996] ipam/ipam.go 155: Attempting to load block cidr=192.168.41.128/26 host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:45.133823 containerd[1734]: 2025-05-15 12:28:45.036 [INFO][5996] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.41.128/26 host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:45.133823 containerd[1734]: 2025-05-15 12:28:45.036 [INFO][5996] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.41.128/26 handle="k8s-pod-network.93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:45.133823 containerd[1734]: 2025-05-15 12:28:45.037 [INFO][5996] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525 May 15 12:28:45.133823 containerd[1734]: 2025-05-15 12:28:45.041 [INFO][5996] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.41.128/26 handle="k8s-pod-network.93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:45.133823 containerd[1734]: 2025-05-15 12:28:45.118 [INFO][5996] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.41.133/26] block=192.168.41.128/26 handle="k8s-pod-network.93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:45.133823 containerd[1734]: 2025-05-15 12:28:45.118 [INFO][5996] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.41.133/26] handle="k8s-pod-network.93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:45.133823 containerd[1734]: 2025-05-15 12:28:45.118 [INFO][5996] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 15 12:28:45.133823 containerd[1734]: 2025-05-15 12:28:45.118 [INFO][5996] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.41.133/26] IPv6=[] ContainerID="93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" HandleID="k8s-pod-network.93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" Workload="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--nk2q9-eth0" May 15 12:28:45.134795 containerd[1734]: 2025-05-15 12:28:45.119 [INFO][5973] cni-plugin/k8s.go 386: Populated endpoint ContainerID="93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" Namespace="calico-apiserver" Pod="calico-apiserver-b7f8fc4d6-nk2q9" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--nk2q9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--nk2q9-eth0", GenerateName:"calico-apiserver-b7f8fc4d6-", Namespace:"calico-apiserver", SelfLink:"", UID:"1d54623e-7c9b-4f0a-8922-6bb47d7e26af", ResourceVersion:"819", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 27, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"b7f8fc4d6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334.0.0-a-9b1bbdffc7", ContainerID:"", Pod:"calico-apiserver-b7f8fc4d6-nk2q9", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.41.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali392e1acd4dd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:28:45.134795 containerd[1734]: 2025-05-15 12:28:45.119 [INFO][5973] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.41.133/32] ContainerID="93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" Namespace="calico-apiserver" Pod="calico-apiserver-b7f8fc4d6-nk2q9" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--nk2q9-eth0" May 15 12:28:45.134795 containerd[1734]: 2025-05-15 12:28:45.119 [INFO][5973] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali392e1acd4dd ContainerID="93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" Namespace="calico-apiserver" Pod="calico-apiserver-b7f8fc4d6-nk2q9" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--nk2q9-eth0" May 15 12:28:45.134795 containerd[1734]: 2025-05-15 12:28:45.122 [INFO][5973] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" Namespace="calico-apiserver" Pod="calico-apiserver-b7f8fc4d6-nk2q9" 
WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--nk2q9-eth0" May 15 12:28:45.134795 containerd[1734]: 2025-05-15 12:28:45.123 [INFO][5973] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" Namespace="calico-apiserver" Pod="calico-apiserver-b7f8fc4d6-nk2q9" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--nk2q9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--nk2q9-eth0", GenerateName:"calico-apiserver-b7f8fc4d6-", Namespace:"calico-apiserver", SelfLink:"", UID:"1d54623e-7c9b-4f0a-8922-6bb47d7e26af", ResourceVersion:"819", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 27, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"b7f8fc4d6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334.0.0-a-9b1bbdffc7", ContainerID:"93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525", Pod:"calico-apiserver-b7f8fc4d6-nk2q9", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.41.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali392e1acd4dd", MAC:"56:74:6e:d4:f0:ab", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:28:45.134795 containerd[1734]: 2025-05-15 12:28:45.132 [INFO][5973] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" Namespace="calico-apiserver" Pod="calico-apiserver-b7f8fc4d6-nk2q9" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--nk2q9-eth0" May 15 12:28:45.433304 containerd[1734]: time="2025-05-15T12:28:45.433005747Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fthxd,Uid:55ce30e6-eb73-4f69-aef9-927a3bcb6662,Namespace:calico-system,Attempt:0,}" May 15 12:28:45.433304 containerd[1734]: time="2025-05-15T12:28:45.433165953Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b7f8fc4d6-6ch4b,Uid:dea655ce-5eab-4e67-a276-be0e9b39cc85,Namespace:calico-apiserver,Attempt:0,}" May 15 12:28:45.527550 containerd[1734]: time="2025-05-15T12:28:45.527490232Z" level=info msg="connecting to shim 93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" address="unix:///run/containerd/s/0b2afe43d8f0f6dda0ff08f5fceb0cd01eb53f657c2242d75c7da13398c0ebea" namespace=k8s.io protocol=ttrpc version=3 May 15 12:28:45.550325 systemd[1]: Started cri-containerd-93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525.scope - libcontainer container 93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525. 
May 15 12:28:45.652283 systemd-networkd[1361]: calie19ca3bb5bb: Link UP May 15 12:28:45.653073 systemd-networkd[1361]: calie19ca3bb5bb: Gained carrier May 15 12:28:45.663723 containerd[1734]: 2025-05-15 12:28:45.602 [INFO][6061] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4334.0.0--a--9b1bbdffc7-k8s-csi--node--driver--fthxd-eth0 csi-node-driver- calico-system 55ce30e6-eb73-4f69-aef9-927a3bcb6662 635 0 2025-05-15 12:27:00 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:55b7b4b9d k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4334.0.0-a-9b1bbdffc7 csi-node-driver-fthxd eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calie19ca3bb5bb [] []}} ContainerID="c274e41b43f14e904b98f8cc0378eaa1801754a53280774dde37e4416dcbc034" Namespace="calico-system" Pod="csi-node-driver-fthxd" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-csi--node--driver--fthxd-" May 15 12:28:45.663723 containerd[1734]: 2025-05-15 12:28:45.602 [INFO][6061] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="c274e41b43f14e904b98f8cc0378eaa1801754a53280774dde37e4416dcbc034" Namespace="calico-system" Pod="csi-node-driver-fthxd" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-csi--node--driver--fthxd-eth0" May 15 12:28:45.663723 containerd[1734]: 2025-05-15 12:28:45.621 [INFO][6074] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c274e41b43f14e904b98f8cc0378eaa1801754a53280774dde37e4416dcbc034" HandleID="k8s-pod-network.c274e41b43f14e904b98f8cc0378eaa1801754a53280774dde37e4416dcbc034" Workload="ci--4334.0.0--a--9b1bbdffc7-k8s-csi--node--driver--fthxd-eth0" May 15 12:28:45.663723 containerd[1734]: 2025-05-15 12:28:45.629 [INFO][6074] ipam/ipam_plugin.go 
265: Auto assigning IP ContainerID="c274e41b43f14e904b98f8cc0378eaa1801754a53280774dde37e4416dcbc034" HandleID="k8s-pod-network.c274e41b43f14e904b98f8cc0378eaa1801754a53280774dde37e4416dcbc034" Workload="ci--4334.0.0--a--9b1bbdffc7-k8s-csi--node--driver--fthxd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003bce20), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4334.0.0-a-9b1bbdffc7", "pod":"csi-node-driver-fthxd", "timestamp":"2025-05-15 12:28:45.621849035 +0000 UTC"}, Hostname:"ci-4334.0.0-a-9b1bbdffc7", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 15 12:28:45.663723 containerd[1734]: 2025-05-15 12:28:45.630 [INFO][6074] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 15 12:28:45.663723 containerd[1734]: 2025-05-15 12:28:45.630 [INFO][6074] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 15 12:28:45.663723 containerd[1734]: 2025-05-15 12:28:45.630 [INFO][6074] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4334.0.0-a-9b1bbdffc7' May 15 12:28:45.663723 containerd[1734]: 2025-05-15 12:28:45.631 [INFO][6074] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.c274e41b43f14e904b98f8cc0378eaa1801754a53280774dde37e4416dcbc034" host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:45.663723 containerd[1734]: 2025-05-15 12:28:45.633 [INFO][6074] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:45.663723 containerd[1734]: 2025-05-15 12:28:45.636 [INFO][6074] ipam/ipam.go 489: Trying affinity for 192.168.41.128/26 host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:45.663723 containerd[1734]: 2025-05-15 12:28:45.637 [INFO][6074] ipam/ipam.go 155: Attempting to load block cidr=192.168.41.128/26 host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:45.663723 containerd[1734]: 2025-05-15 12:28:45.638 [INFO][6074] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.41.128/26 host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:45.663723 containerd[1734]: 2025-05-15 12:28:45.638 [INFO][6074] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.41.128/26 handle="k8s-pod-network.c274e41b43f14e904b98f8cc0378eaa1801754a53280774dde37e4416dcbc034" host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:45.663723 containerd[1734]: 2025-05-15 12:28:45.639 [INFO][6074] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.c274e41b43f14e904b98f8cc0378eaa1801754a53280774dde37e4416dcbc034 May 15 12:28:45.663723 containerd[1734]: 2025-05-15 12:28:45.643 [INFO][6074] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.41.128/26 handle="k8s-pod-network.c274e41b43f14e904b98f8cc0378eaa1801754a53280774dde37e4416dcbc034" host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:45.663723 containerd[1734]: 2025-05-15 12:28:45.649 [INFO][6074] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.41.134/26] block=192.168.41.128/26 handle="k8s-pod-network.c274e41b43f14e904b98f8cc0378eaa1801754a53280774dde37e4416dcbc034" host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:45.663723 containerd[1734]: 2025-05-15 12:28:45.649 [INFO][6074] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.41.134/26] handle="k8s-pod-network.c274e41b43f14e904b98f8cc0378eaa1801754a53280774dde37e4416dcbc034" host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:45.663723 containerd[1734]: 2025-05-15 12:28:45.649 [INFO][6074] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 15 12:28:45.663723 containerd[1734]: 2025-05-15 12:28:45.649 [INFO][6074] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.41.134/26] IPv6=[] ContainerID="c274e41b43f14e904b98f8cc0378eaa1801754a53280774dde37e4416dcbc034" HandleID="k8s-pod-network.c274e41b43f14e904b98f8cc0378eaa1801754a53280774dde37e4416dcbc034" Workload="ci--4334.0.0--a--9b1bbdffc7-k8s-csi--node--driver--fthxd-eth0" May 15 12:28:45.664260 containerd[1734]: 2025-05-15 12:28:45.650 [INFO][6061] cni-plugin/k8s.go 386: Populated endpoint ContainerID="c274e41b43f14e904b98f8cc0378eaa1801754a53280774dde37e4416dcbc034" Namespace="calico-system" Pod="csi-node-driver-fthxd" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-csi--node--driver--fthxd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334.0.0--a--9b1bbdffc7-k8s-csi--node--driver--fthxd-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"55ce30e6-eb73-4f69-aef9-927a3bcb6662", ResourceVersion:"635", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 27, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"55b7b4b9d", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334.0.0-a-9b1bbdffc7", ContainerID:"", Pod:"csi-node-driver-fthxd", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.41.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calie19ca3bb5bb", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:28:45.664260 containerd[1734]: 2025-05-15 12:28:45.650 [INFO][6061] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.41.134/32] ContainerID="c274e41b43f14e904b98f8cc0378eaa1801754a53280774dde37e4416dcbc034" Namespace="calico-system" Pod="csi-node-driver-fthxd" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-csi--node--driver--fthxd-eth0" May 15 12:28:45.664260 containerd[1734]: 2025-05-15 12:28:45.650 [INFO][6061] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie19ca3bb5bb ContainerID="c274e41b43f14e904b98f8cc0378eaa1801754a53280774dde37e4416dcbc034" Namespace="calico-system" Pod="csi-node-driver-fthxd" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-csi--node--driver--fthxd-eth0" May 15 12:28:45.664260 containerd[1734]: 2025-05-15 12:28:45.652 [INFO][6061] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c274e41b43f14e904b98f8cc0378eaa1801754a53280774dde37e4416dcbc034" Namespace="calico-system" Pod="csi-node-driver-fthxd" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-csi--node--driver--fthxd-eth0" May 15 12:28:45.664260 containerd[1734]: 2025-05-15 12:28:45.652 
[INFO][6061] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="c274e41b43f14e904b98f8cc0378eaa1801754a53280774dde37e4416dcbc034" Namespace="calico-system" Pod="csi-node-driver-fthxd" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-csi--node--driver--fthxd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334.0.0--a--9b1bbdffc7-k8s-csi--node--driver--fthxd-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"55ce30e6-eb73-4f69-aef9-927a3bcb6662", ResourceVersion:"635", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 27, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"55b7b4b9d", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334.0.0-a-9b1bbdffc7", ContainerID:"c274e41b43f14e904b98f8cc0378eaa1801754a53280774dde37e4416dcbc034", Pod:"csi-node-driver-fthxd", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.41.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calie19ca3bb5bb", MAC:"d2:46:03:3e:4f:86", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:28:45.664260 containerd[1734]: 2025-05-15 12:28:45.661 [INFO][6061] cni-plugin/k8s.go 500: Wrote updated endpoint to 
datastore ContainerID="c274e41b43f14e904b98f8cc0378eaa1801754a53280774dde37e4416dcbc034" Namespace="calico-system" Pod="csi-node-driver-fthxd" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-csi--node--driver--fthxd-eth0" May 15 12:28:45.674691 containerd[1734]: time="2025-05-15T12:28:45.674653559Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b7f8fc4d6-nk2q9,Uid:1d54623e-7c9b-4f0a-8922-6bb47d7e26af,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525\"" May 15 12:28:45.769971 systemd-networkd[1361]: cali45ba0ebc4b2: Link UP May 15 12:28:45.771296 systemd-networkd[1361]: cali45ba0ebc4b2: Gained carrier May 15 12:28:46.030278 containerd[1734]: time="2025-05-15T12:28:46.030210495Z" level=info msg="connecting to shim c274e41b43f14e904b98f8cc0378eaa1801754a53280774dde37e4416dcbc034" address="unix:///run/containerd/s/194a212d6993460147836338e92322d3da6b755cf092f352c13968992de7bf91" namespace=k8s.io protocol=ttrpc version=3 May 15 12:28:46.047324 systemd[1]: Started cri-containerd-c274e41b43f14e904b98f8cc0378eaa1801754a53280774dde37e4416dcbc034.scope - libcontainer container c274e41b43f14e904b98f8cc0378eaa1801754a53280774dde37e4416dcbc034. 
May 15 12:28:46.069281 containerd[1734]: time="2025-05-15T12:28:46.069255252Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fthxd,Uid:55ce30e6-eb73-4f69-aef9-927a3bcb6662,Namespace:calico-system,Attempt:0,} returns sandbox id \"c274e41b43f14e904b98f8cc0378eaa1801754a53280774dde37e4416dcbc034\"" May 15 12:28:46.134443 containerd[1734]: 2025-05-15 12:28:45.695 [INFO][6094] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--6ch4b-eth0 calico-apiserver-b7f8fc4d6- calico-apiserver dea655ce-5eab-4e67-a276-be0e9b39cc85 817 0 2025-05-15 12:27:00 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:b7f8fc4d6 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4334.0.0-a-9b1bbdffc7 calico-apiserver-b7f8fc4d6-6ch4b eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali45ba0ebc4b2 [] []}} ContainerID="4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" Namespace="calico-apiserver" Pod="calico-apiserver-b7f8fc4d6-6ch4b" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--6ch4b-" May 15 12:28:46.134443 containerd[1734]: 2025-05-15 12:28:45.695 [INFO][6094] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" Namespace="calico-apiserver" Pod="calico-apiserver-b7f8fc4d6-6ch4b" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--6ch4b-eth0" May 15 12:28:46.134443 containerd[1734]: 2025-05-15 12:28:45.714 [INFO][6105] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" 
HandleID="k8s-pod-network.4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" Workload="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--6ch4b-eth0" May 15 12:28:46.134443 containerd[1734]: 2025-05-15 12:28:45.719 [INFO][6105] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" HandleID="k8s-pod-network.4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" Workload="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--6ch4b-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000458af0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4334.0.0-a-9b1bbdffc7", "pod":"calico-apiserver-b7f8fc4d6-6ch4b", "timestamp":"2025-05-15 12:28:45.714473505 +0000 UTC"}, Hostname:"ci-4334.0.0-a-9b1bbdffc7", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 15 12:28:46.134443 containerd[1734]: 2025-05-15 12:28:45.719 [INFO][6105] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 15 12:28:46.134443 containerd[1734]: 2025-05-15 12:28:45.719 [INFO][6105] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 15 12:28:46.134443 containerd[1734]: 2025-05-15 12:28:45.719 [INFO][6105] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4334.0.0-a-9b1bbdffc7' May 15 12:28:46.134443 containerd[1734]: 2025-05-15 12:28:45.720 [INFO][6105] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:46.134443 containerd[1734]: 2025-05-15 12:28:45.725 [INFO][6105] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:46.134443 containerd[1734]: 2025-05-15 12:28:45.727 [INFO][6105] ipam/ipam.go 489: Trying affinity for 192.168.41.128/26 host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:46.134443 containerd[1734]: 2025-05-15 12:28:45.728 [INFO][6105] ipam/ipam.go 155: Attempting to load block cidr=192.168.41.128/26 host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:46.134443 containerd[1734]: 2025-05-15 12:28:45.730 [INFO][6105] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.41.128/26 host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:46.134443 containerd[1734]: 2025-05-15 12:28:45.730 [INFO][6105] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.41.128/26 handle="k8s-pod-network.4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:46.134443 containerd[1734]: 2025-05-15 12:28:45.733 [INFO][6105] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b May 15 12:28:46.134443 containerd[1734]: 2025-05-15 12:28:45.738 [INFO][6105] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.41.128/26 handle="k8s-pod-network.4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:46.134443 containerd[1734]: 2025-05-15 12:28:45.766 [INFO][6105] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.41.135/26] block=192.168.41.128/26 handle="k8s-pod-network.4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:46.134443 containerd[1734]: 2025-05-15 12:28:45.766 [INFO][6105] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.41.135/26] handle="k8s-pod-network.4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" host="ci-4334.0.0-a-9b1bbdffc7" May 15 12:28:46.134443 containerd[1734]: 2025-05-15 12:28:45.766 [INFO][6105] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 15 12:28:46.134443 containerd[1734]: 2025-05-15 12:28:45.766 [INFO][6105] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.41.135/26] IPv6=[] ContainerID="4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" HandleID="k8s-pod-network.4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" Workload="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--6ch4b-eth0" May 15 12:28:46.134943 containerd[1734]: 2025-05-15 12:28:45.767 [INFO][6094] cni-plugin/k8s.go 386: Populated endpoint ContainerID="4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" Namespace="calico-apiserver" Pod="calico-apiserver-b7f8fc4d6-6ch4b" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--6ch4b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--6ch4b-eth0", GenerateName:"calico-apiserver-b7f8fc4d6-", Namespace:"calico-apiserver", SelfLink:"", UID:"dea655ce-5eab-4e67-a276-be0e9b39cc85", ResourceVersion:"817", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 27, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"b7f8fc4d6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334.0.0-a-9b1bbdffc7", ContainerID:"", Pod:"calico-apiserver-b7f8fc4d6-6ch4b", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.41.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali45ba0ebc4b2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:28:46.134943 containerd[1734]: 2025-05-15 12:28:45.767 [INFO][6094] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.41.135/32] ContainerID="4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" Namespace="calico-apiserver" Pod="calico-apiserver-b7f8fc4d6-6ch4b" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--6ch4b-eth0" May 15 12:28:46.134943 containerd[1734]: 2025-05-15 12:28:45.767 [INFO][6094] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali45ba0ebc4b2 ContainerID="4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" Namespace="calico-apiserver" Pod="calico-apiserver-b7f8fc4d6-6ch4b" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--6ch4b-eth0" May 15 12:28:46.134943 containerd[1734]: 2025-05-15 12:28:45.771 [INFO][6094] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" Namespace="calico-apiserver" Pod="calico-apiserver-b7f8fc4d6-6ch4b" 
WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--6ch4b-eth0" May 15 12:28:46.134943 containerd[1734]: 2025-05-15 12:28:45.771 [INFO][6094] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" Namespace="calico-apiserver" Pod="calico-apiserver-b7f8fc4d6-6ch4b" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--6ch4b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--6ch4b-eth0", GenerateName:"calico-apiserver-b7f8fc4d6-", Namespace:"calico-apiserver", SelfLink:"", UID:"dea655ce-5eab-4e67-a276-be0e9b39cc85", ResourceVersion:"817", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 27, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"b7f8fc4d6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334.0.0-a-9b1bbdffc7", ContainerID:"4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b", Pod:"calico-apiserver-b7f8fc4d6-6ch4b", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.41.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali45ba0ebc4b2", MAC:"6a:bc:73:be:1b:c4", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:28:46.134943 containerd[1734]: 2025-05-15 12:28:46.132 [INFO][6094] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" Namespace="calico-apiserver" Pod="calico-apiserver-b7f8fc4d6-6ch4b" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--6ch4b-eth0" May 15 12:28:46.424816 containerd[1734]: time="2025-05-15T12:28:46.424788230Z" level=info msg="connecting to shim 4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" address="unix:///run/containerd/s/e262020aa9b346abcc55aa71caf65f0a52410fb8be6193bd05a1d85db760603b" namespace=k8s.io protocol=ttrpc version=3 May 15 12:28:46.443331 systemd[1]: Started cri-containerd-4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b.scope - libcontainer container 4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b. May 15 12:28:46.578304 containerd[1734]: time="2025-05-15T12:28:46.578279399Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b7f8fc4d6-6ch4b,Uid:dea655ce-5eab-4e67-a276-be0e9b39cc85,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b\"" May 15 12:28:46.792307 systemd-networkd[1361]: cali392e1acd4dd: Gained IPv6LL May 15 12:28:47.240367 systemd-networkd[1361]: cali45ba0ebc4b2: Gained IPv6LL May 15 12:28:47.304338 systemd-networkd[1361]: calie19ca3bb5bb: Gained IPv6LL May 15 12:28:47.598261 systemd[1]: Started sshd@10-10.200.8.35:22-10.200.16.10:45680.service - OpenSSH per-connection server daemon (10.200.16.10:45680). 
May 15 12:28:48.235031 sshd[6220]: Accepted publickey for core from 10.200.16.10 port 45680 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk May 15 12:28:48.236049 sshd-session[6220]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 12:28:48.239981 systemd-logind[1702]: New session 13 of user core. May 15 12:28:48.244320 systemd[1]: Started session-13.scope - Session 13 of User core. May 15 12:28:48.724358 sshd[6222]: Connection closed by 10.200.16.10 port 45680 May 15 12:28:48.724764 sshd-session[6220]: pam_unix(sshd:session): session closed for user core May 15 12:28:48.727295 systemd[1]: sshd@10-10.200.8.35:22-10.200.16.10:45680.service: Deactivated successfully. May 15 12:28:48.728920 systemd[1]: session-13.scope: Deactivated successfully. May 15 12:28:48.729598 systemd-logind[1702]: Session 13 logged out. Waiting for processes to exit. May 15 12:28:48.730843 systemd-logind[1702]: Removed session 13. May 15 12:28:53.841067 systemd[1]: Started sshd@11-10.200.8.35:22-10.200.16.10:36502.service - OpenSSH per-connection server daemon (10.200.16.10:36502). 
May 15 12:28:54.167938 containerd[1734]: time="2025-05-15T12:28:54.167904855Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:28:54.173686 containerd[1734]: time="2025-05-15T12:28:54.173652294Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.3: active requests=0, bytes read=34789138" May 15 12:28:54.215372 containerd[1734]: time="2025-05-15T12:28:54.215309688Z" level=info msg="ImageCreate event name:\"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:28:54.279860 containerd[1734]: time="2025-05-15T12:28:54.279802078Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:feaab0197035d474845e0f8137a99a78cab274f0a3cac4d5485cf9b1bdf9ffa9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:28:54.280587 containerd[1734]: time="2025-05-15T12:28:54.280278253Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" with image id \"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:feaab0197035d474845e0f8137a99a78cab274f0a3cac4d5485cf9b1bdf9ffa9\", size \"36281728\" in 12.04638418s" May 15 12:28:54.280587 containerd[1734]: time="2025-05-15T12:28:54.280304533Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" returns image reference \"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\"" May 15 12:28:54.281064 containerd[1734]: time="2025-05-15T12:28:54.281045007Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\"" May 15 12:28:54.292487 containerd[1734]: time="2025-05-15T12:28:54.292460891Z" level=info msg="CreateContainer within sandbox 
\"3b5233077cbe73a610dfc4e28881a4e0d8ba920446a9ac5f0e3ac0168dba007f\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" May 15 12:28:54.471078 containerd[1734]: time="2025-05-15T12:28:54.470985033Z" level=info msg="Container 14a3f8f65399a7a5f1b36c32c1688a00f655e302de44be23c8d9c837423e5583: CDI devices from CRI Config.CDIDevices: []" May 15 12:28:54.483925 sshd[6251]: Accepted publickey for core from 10.200.16.10 port 36502 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk May 15 12:28:54.484848 sshd-session[6251]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 12:28:54.488628 systemd-logind[1702]: New session 14 of user core. May 15 12:28:54.491354 systemd[1]: Started session-14.scope - Session 14 of User core. May 15 12:28:54.569595 containerd[1734]: time="2025-05-15T12:28:54.569573048Z" level=info msg="CreateContainer within sandbox \"3b5233077cbe73a610dfc4e28881a4e0d8ba920446a9ac5f0e3ac0168dba007f\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"14a3f8f65399a7a5f1b36c32c1688a00f655e302de44be23c8d9c837423e5583\"" May 15 12:28:54.570191 containerd[1734]: time="2025-05-15T12:28:54.569967742Z" level=info msg="StartContainer for \"14a3f8f65399a7a5f1b36c32c1688a00f655e302de44be23c8d9c837423e5583\"" May 15 12:28:54.571054 containerd[1734]: time="2025-05-15T12:28:54.571018414Z" level=info msg="connecting to shim 14a3f8f65399a7a5f1b36c32c1688a00f655e302de44be23c8d9c837423e5583" address="unix:///run/containerd/s/e14c98647b9723ca25ee59bb6a2b3153607f528c3dc7cdb998d5e212f3d6c401" protocol=ttrpc version=3 May 15 12:28:54.592338 systemd[1]: Started cri-containerd-14a3f8f65399a7a5f1b36c32c1688a00f655e302de44be23c8d9c837423e5583.scope - libcontainer container 14a3f8f65399a7a5f1b36c32c1688a00f655e302de44be23c8d9c837423e5583. 
May 15 12:28:54.631241 containerd[1734]: time="2025-05-15T12:28:54.631218122Z" level=info msg="StartContainer for \"14a3f8f65399a7a5f1b36c32c1688a00f655e302de44be23c8d9c837423e5583\" returns successfully" May 15 12:28:54.813154 kubelet[3331]: I0515 12:28:54.812958 3331 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-5547fc8788-7mx2l" podStartSLOduration=36.765558461 podStartE2EDuration="48.812853004s" podCreationTimestamp="2025-05-15 12:28:06 +0000 UTC" firstStartedPulling="2025-05-15 12:28:42.233654806 +0000 UTC m=+120.868988240" lastFinishedPulling="2025-05-15 12:28:54.280949328 +0000 UTC m=+132.916282783" observedRunningTime="2025-05-15 12:28:54.811971645 +0000 UTC m=+133.447305091" watchObservedRunningTime="2025-05-15 12:28:54.812853004 +0000 UTC m=+133.448186445" May 15 12:28:54.976860 sshd[6255]: Connection closed by 10.200.16.10 port 36502 May 15 12:28:54.977284 sshd-session[6251]: pam_unix(sshd:session): session closed for user core May 15 12:28:54.980340 systemd[1]: sshd@11-10.200.8.35:22-10.200.16.10:36502.service: Deactivated successfully. May 15 12:28:54.982036 systemd[1]: session-14.scope: Deactivated successfully. May 15 12:28:54.982760 systemd-logind[1702]: Session 14 logged out. Waiting for processes to exit. May 15 12:28:54.984148 systemd-logind[1702]: Removed session 14. May 15 12:28:55.766753 containerd[1734]: time="2025-05-15T12:28:55.766701051Z" level=info msg="TaskExit event in podsandbox handler container_id:\"14a3f8f65399a7a5f1b36c32c1688a00f655e302de44be23c8d9c837423e5583\" id:\"af34571b181f0d1ce570d4f4c5403bc78a8c2368b022c324c4b0cb1d84077174\" pid:6311 exited_at:{seconds:1747312135 nanos:766448124}" May 15 12:29:00.101313 systemd[1]: Started sshd@12-10.200.8.35:22-10.200.16.10:50886.service - OpenSSH per-connection server daemon (10.200.16.10:50886). 
May 15 12:29:00.465449 containerd[1734]: time="2025-05-15T12:29:00.465415816Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6f3538d4d98068af62053dba94b18585d28518dccba039cd134b61c00ee8af7d\" id:\"8562799712b91b6ace5bd10c652f69f8e46a764de08d30d384078dc9769f1d09\" pid:6343 exited_at:{seconds:1747312140 nanos:465162491}" May 15 12:29:00.734112 sshd[6329]: Accepted publickey for core from 10.200.16.10 port 50886 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk May 15 12:29:00.735257 sshd-session[6329]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 12:29:00.739333 systemd-logind[1702]: New session 15 of user core. May 15 12:29:00.743324 systemd[1]: Started session-15.scope - Session 15 of User core. May 15 12:29:01.225680 sshd[6355]: Connection closed by 10.200.16.10 port 50886 May 15 12:29:01.226066 sshd-session[6329]: pam_unix(sshd:session): session closed for user core May 15 12:29:01.228215 systemd[1]: sshd@12-10.200.8.35:22-10.200.16.10:50886.service: Deactivated successfully. May 15 12:29:01.229692 systemd[1]: session-15.scope: Deactivated successfully. May 15 12:29:01.231323 systemd-logind[1702]: Session 15 logged out. Waiting for processes to exit. May 15 12:29:01.232138 systemd-logind[1702]: Removed session 15. 
May 15 12:29:05.875852 containerd[1734]: time="2025-05-15T12:29:05.875812862Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:29:05.924853 containerd[1734]: time="2025-05-15T12:29:05.922861481Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=43021437" May 15 12:29:05.971576 containerd[1734]: time="2025-05-15T12:29:05.971508042Z" level=info msg="ImageCreate event name:\"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:29:06.019472 containerd[1734]: time="2025-05-15T12:29:06.019406617Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:29:06.019912 containerd[1734]: time="2025-05-15T12:29:06.019863424Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"44514075\" in 11.738789909s" May 15 12:29:06.019912 containerd[1734]: time="2025-05-15T12:29:06.019888738Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\"" May 15 12:29:06.020935 containerd[1734]: time="2025-05-15T12:29:06.020913817Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\"" May 15 12:29:06.022326 containerd[1734]: time="2025-05-15T12:29:06.022298462Z" level=info msg="CreateContainer within sandbox 
\"71ad1db5bdb98ad5fd0d6251136e1ad0f839dfb27698dc0f2bbf41bdbf5b2e12\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 15 12:29:06.226997 containerd[1734]: time="2025-05-15T12:29:06.226882911Z" level=info msg="Container 80f7528c04e896f058502ccf9bb8efc5ce5a14e2a07e07faefa5832dabaf9240: CDI devices from CRI Config.CDIDevices: []" May 15 12:29:06.330581 containerd[1734]: time="2025-05-15T12:29:06.330539625Z" level=info msg="CreateContainer within sandbox \"71ad1db5bdb98ad5fd0d6251136e1ad0f839dfb27698dc0f2bbf41bdbf5b2e12\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"80f7528c04e896f058502ccf9bb8efc5ce5a14e2a07e07faefa5832dabaf9240\"" May 15 12:29:06.331182 containerd[1734]: time="2025-05-15T12:29:06.331021046Z" level=info msg="StartContainer for \"80f7528c04e896f058502ccf9bb8efc5ce5a14e2a07e07faefa5832dabaf9240\"" May 15 12:29:06.332069 containerd[1734]: time="2025-05-15T12:29:06.332036460Z" level=info msg="connecting to shim 80f7528c04e896f058502ccf9bb8efc5ce5a14e2a07e07faefa5832dabaf9240" address="unix:///run/containerd/s/9ea6969a1b82b5458667673d3b57b90e34ae72430673e9cb65458b7eac0506e5" protocol=ttrpc version=3 May 15 12:29:06.351310 systemd[1]: Started cri-containerd-80f7528c04e896f058502ccf9bb8efc5ce5a14e2a07e07faefa5832dabaf9240.scope - libcontainer container 80f7528c04e896f058502ccf9bb8efc5ce5a14e2a07e07faefa5832dabaf9240. May 15 12:29:06.352407 systemd[1]: Started sshd@13-10.200.8.35:22-10.200.16.10:50898.service - OpenSSH per-connection server daemon (10.200.16.10:50898). 
May 15 12:29:06.397217 containerd[1734]: time="2025-05-15T12:29:06.397167040Z" level=info msg="StartContainer for \"80f7528c04e896f058502ccf9bb8efc5ce5a14e2a07e07faefa5832dabaf9240\" returns successfully" May 15 12:29:06.966264 kubelet[3331]: I0515 12:29:06.966118 3331 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-84478f496d-j8zht" podStartSLOduration=104.725979601 podStartE2EDuration="2m5.966101401s" podCreationTimestamp="2025-05-15 12:27:01 +0000 UTC" firstStartedPulling="2025-05-15 12:28:44.780572513 +0000 UTC m=+123.415905953" lastFinishedPulling="2025-05-15 12:29:06.020694314 +0000 UTC m=+144.656027753" observedRunningTime="2025-05-15 12:29:06.764123513 +0000 UTC m=+145.399456972" watchObservedRunningTime="2025-05-15 12:29:06.966101401 +0000 UTC m=+145.601434844" May 15 12:29:06.991479 sshd[6391]: Accepted publickey for core from 10.200.16.10 port 50898 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk May 15 12:29:06.992626 sshd-session[6391]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 12:29:07.001223 systemd-logind[1702]: New session 16 of user core. May 15 12:29:07.006328 systemd[1]: Started session-16.scope - Session 16 of User core. 
May 15 12:29:07.325845 containerd[1734]: time="2025-05-15T12:29:07.325736666Z" level=info msg="TaskExit event in podsandbox handler container_id:\"14a3f8f65399a7a5f1b36c32c1688a00f655e302de44be23c8d9c837423e5583\" id:\"be50b161ffb28c396d95c62c2db739ba17934977fea649bb7a96f5edb8abdfd8\" pid:6447 exited_at:{seconds:1747312147 nanos:325012024}" May 15 12:29:07.326552 containerd[1734]: time="2025-05-15T12:29:07.326501035Z" level=info msg="TaskExit event in podsandbox handler container_id:\"14a3f8f65399a7a5f1b36c32c1688a00f655e302de44be23c8d9c837423e5583\" id:\"e3f3fd1298442506486e6fbdf1492a417cf6bf1d43d37c585ff69ff6c7fb5e76\" pid:6445 exited_at:{seconds:1747312147 nanos:326254416}" May 15 12:29:07.483518 sshd[6419]: Connection closed by 10.200.16.10 port 50898 May 15 12:29:07.483888 sshd-session[6391]: pam_unix(sshd:session): session closed for user core May 15 12:29:07.486242 systemd[1]: sshd@13-10.200.8.35:22-10.200.16.10:50898.service: Deactivated successfully. May 15 12:29:07.487804 systemd[1]: session-16.scope: Deactivated successfully. May 15 12:29:07.488924 systemd-logind[1702]: Session 16 logged out. Waiting for processes to exit. May 15 12:29:07.490093 systemd-logind[1702]: Removed session 16. May 15 12:29:07.595698 systemd[1]: Started sshd@14-10.200.8.35:22-10.200.16.10:50908.service - OpenSSH per-connection server daemon (10.200.16.10:50908). May 15 12:29:08.231886 sshd[6475]: Accepted publickey for core from 10.200.16.10 port 50908 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk May 15 12:29:08.232831 sshd-session[6475]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 12:29:08.236216 systemd-logind[1702]: New session 17 of user core. May 15 12:29:08.240311 systemd[1]: Started session-17.scope - Session 17 of User core. 
May 15 12:29:08.749783 sshd[6477]: Connection closed by 10.200.16.10 port 50908 May 15 12:29:08.750210 sshd-session[6475]: pam_unix(sshd:session): session closed for user core May 15 12:29:08.752164 systemd[1]: sshd@14-10.200.8.35:22-10.200.16.10:50908.service: Deactivated successfully. May 15 12:29:08.753691 systemd[1]: session-17.scope: Deactivated successfully. May 15 12:29:08.754814 systemd-logind[1702]: Session 17 logged out. Waiting for processes to exit. May 15 12:29:08.756268 systemd-logind[1702]: Removed session 17. May 15 12:29:08.861832 systemd[1]: Started sshd@15-10.200.8.35:22-10.200.16.10:60830.service - OpenSSH per-connection server daemon (10.200.16.10:60830). May 15 12:29:09.495530 sshd[6486]: Accepted publickey for core from 10.200.16.10 port 60830 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk May 15 12:29:09.496435 sshd-session[6486]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 12:29:09.500060 systemd-logind[1702]: New session 18 of user core. May 15 12:29:09.506310 systemd[1]: Started session-18.scope - Session 18 of User core. 
May 15 12:29:09.622794 containerd[1734]: time="2025-05-15T12:29:09.622749951Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 15 12:29:09.668523 containerd[1734]: time="2025-05-15T12:29:09.668496221Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=77"
May 15 12:29:09.669920 containerd[1734]: time="2025-05-15T12:29:09.669880272Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"44514075\" in 3.648921562s"
May 15 12:29:09.669985 containerd[1734]: time="2025-05-15T12:29:09.669923890Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\""
May 15 12:29:09.671396 containerd[1734]: time="2025-05-15T12:29:09.671292400Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.3\""
May 15 12:29:09.672369 containerd[1734]: time="2025-05-15T12:29:09.672335137Z" level=info msg="CreateContainer within sandbox \"93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
May 15 12:29:09.924359 containerd[1734]: time="2025-05-15T12:29:09.924297269Z" level=info msg="Container 566c283f07aad937c537c81fe72f373d93224e8ed584df2077de2020d5c5d713: CDI devices from CRI Config.CDIDevices: []"
May 15 12:29:09.930403 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3986685248.mount: Deactivated successfully.
May 15 12:29:09.984199 sshd[6488]: Connection closed by 10.200.16.10 port 60830
May 15 12:29:09.984529 sshd-session[6486]: pam_unix(sshd:session): session closed for user core
May 15 12:29:09.986343 systemd[1]: sshd@15-10.200.8.35:22-10.200.16.10:60830.service: Deactivated successfully.
May 15 12:29:09.987687 systemd[1]: session-18.scope: Deactivated successfully.
May 15 12:29:09.989309 systemd-logind[1702]: Session 18 logged out. Waiting for processes to exit.
May 15 12:29:09.990043 systemd-logind[1702]: Removed session 18.
May 15 12:29:10.066218 containerd[1734]: time="2025-05-15T12:29:10.066183167Z" level=info msg="CreateContainer within sandbox \"93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"566c283f07aad937c537c81fe72f373d93224e8ed584df2077de2020d5c5d713\""
May 15 12:29:10.067243 containerd[1734]: time="2025-05-15T12:29:10.066555143Z" level=info msg="StartContainer for \"566c283f07aad937c537c81fe72f373d93224e8ed584df2077de2020d5c5d713\""
May 15 12:29:10.067600 containerd[1734]: time="2025-05-15T12:29:10.067580278Z" level=info msg="connecting to shim 566c283f07aad937c537c81fe72f373d93224e8ed584df2077de2020d5c5d713" address="unix:///run/containerd/s/0b2afe43d8f0f6dda0ff08f5fceb0cd01eb53f657c2242d75c7da13398c0ebea" protocol=ttrpc version=3
May 15 12:29:10.085338 systemd[1]: Started cri-containerd-566c283f07aad937c537c81fe72f373d93224e8ed584df2077de2020d5c5d713.scope - libcontainer container 566c283f07aad937c537c81fe72f373d93224e8ed584df2077de2020d5c5d713.
May 15 12:29:10.122316 containerd[1734]: time="2025-05-15T12:29:10.122283519Z" level=info msg="StartContainer for \"566c283f07aad937c537c81fe72f373d93224e8ed584df2077de2020d5c5d713\" returns successfully"
May 15 12:29:10.773571 kubelet[3331]: I0515 12:29:10.773029 3331 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-b7f8fc4d6-nk2q9" podStartSLOduration=106.778381016 podStartE2EDuration="2m10.773013266s" podCreationTimestamp="2025-05-15 12:27:00 +0000 UTC" firstStartedPulling="2025-05-15 12:28:45.67627335 +0000 UTC m=+124.311606787" lastFinishedPulling="2025-05-15 12:29:09.670905597 +0000 UTC m=+148.306239037" observedRunningTime="2025-05-15 12:29:10.772593694 +0000 UTC m=+149.407927134" watchObservedRunningTime="2025-05-15 12:29:10.773013266 +0000 UTC m=+149.408346707"
May 15 12:29:15.120164 systemd[1]: Started sshd@16-10.200.8.35:22-10.200.16.10:60838.service - OpenSSH per-connection server daemon (10.200.16.10:60838).
May 15 12:29:15.621846 containerd[1734]: time="2025-05-15T12:29:15.621801721Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 15 12:29:15.668245 containerd[1734]: time="2025-05-15T12:29:15.668215548Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.3: active requests=0, bytes read=7912898"
May 15 12:29:15.670705 containerd[1734]: time="2025-05-15T12:29:15.670665017Z" level=info msg="ImageCreate event name:\"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 15 12:29:15.717372 containerd[1734]: time="2025-05-15T12:29:15.717317803Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:72455a36febc7c56ec8881007f4805caed5764026a0694e4f86a2503209b2d31\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 15 12:29:15.717993 containerd[1734]: time="2025-05-15T12:29:15.717899651Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.3\" with image id \"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:72455a36febc7c56ec8881007f4805caed5764026a0694e4f86a2503209b2d31\", size \"9405520\" in 6.046578676s"
May 15 12:29:15.717993 containerd[1734]: time="2025-05-15T12:29:15.717925325Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.3\" returns image reference \"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\""
May 15 12:29:15.718807 containerd[1734]: time="2025-05-15T12:29:15.718781793Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\""
May 15 12:29:15.719831 containerd[1734]: time="2025-05-15T12:29:15.719812190Z" level=info msg="CreateContainer within sandbox \"c274e41b43f14e904b98f8cc0378eaa1801754a53280774dde37e4416dcbc034\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}"
May 15 12:29:15.753046 sshd[6539]: Accepted publickey for core from 10.200.16.10 port 60838 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:29:15.753990 sshd-session[6539]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:29:15.758277 systemd-logind[1702]: New session 19 of user core.
May 15 12:29:15.763315 systemd[1]: Started session-19.scope - Session 19 of User core.
May 15 12:29:15.925383 containerd[1734]: time="2025-05-15T12:29:15.925359426Z" level=info msg="Container cbfae31c382f8e90b2dc924095ad8d08266b9d2032882aa692dcb01a4bff8eb4: CDI devices from CRI Config.CDIDevices: []"
May 15 12:29:15.931517 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3755875491.mount: Deactivated successfully.
May 15 12:29:16.079396 containerd[1734]: time="2025-05-15T12:29:16.079375117Z" level=info msg="CreateContainer within sandbox \"c274e41b43f14e904b98f8cc0378eaa1801754a53280774dde37e4416dcbc034\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"cbfae31c382f8e90b2dc924095ad8d08266b9d2032882aa692dcb01a4bff8eb4\""
May 15 12:29:16.079755 containerd[1734]: time="2025-05-15T12:29:16.079733252Z" level=info msg="StartContainer for \"cbfae31c382f8e90b2dc924095ad8d08266b9d2032882aa692dcb01a4bff8eb4\""
May 15 12:29:16.080765 containerd[1734]: time="2025-05-15T12:29:16.080727006Z" level=info msg="connecting to shim cbfae31c382f8e90b2dc924095ad8d08266b9d2032882aa692dcb01a4bff8eb4" address="unix:///run/containerd/s/194a212d6993460147836338e92322d3da6b755cf092f352c13968992de7bf91" protocol=ttrpc version=3
May 15 12:29:16.102377 systemd[1]: Started cri-containerd-cbfae31c382f8e90b2dc924095ad8d08266b9d2032882aa692dcb01a4bff8eb4.scope - libcontainer container cbfae31c382f8e90b2dc924095ad8d08266b9d2032882aa692dcb01a4bff8eb4.
May 15 12:29:16.137023 containerd[1734]: time="2025-05-15T12:29:16.136989547Z" level=info msg="StartContainer for \"cbfae31c382f8e90b2dc924095ad8d08266b9d2032882aa692dcb01a4bff8eb4\" returns successfully"
May 15 12:29:16.258057 sshd[6548]: Connection closed by 10.200.16.10 port 60838
May 15 12:29:16.258917 sshd-session[6539]: pam_unix(sshd:session): session closed for user core
May 15 12:29:16.262000 systemd[1]: sshd@16-10.200.8.35:22-10.200.16.10:60838.service: Deactivated successfully.
May 15 12:29:16.263498 systemd[1]: session-19.scope: Deactivated successfully.
May 15 12:29:16.264110 systemd-logind[1702]: Session 19 logged out. Waiting for processes to exit.
May 15 12:29:16.265073 systemd-logind[1702]: Removed session 19.
May 15 12:29:17.219742 containerd[1734]: time="2025-05-15T12:29:17.219706877Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 15 12:29:17.221947 containerd[1734]: time="2025-05-15T12:29:17.221918460Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=77"
May 15 12:29:17.223283 containerd[1734]: time="2025-05-15T12:29:17.223258125Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"44514075\" in 1.504450398s"
May 15 12:29:17.223354 containerd[1734]: time="2025-05-15T12:29:17.223288199Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\""
May 15 12:29:17.224117 containerd[1734]: time="2025-05-15T12:29:17.224030152Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\""
May 15 12:29:17.225298 containerd[1734]: time="2025-05-15T12:29:17.225245388Z" level=info msg="CreateContainer within sandbox \"4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
May 15 12:29:17.476219 containerd[1734]: time="2025-05-15T12:29:17.475475318Z" level=info msg="Container 817299cc37575a0fa3268df3f0dcf4c0b1211e6099b7512574c661eb63fc0a1e: CDI devices from CRI Config.CDIDevices: []"
May 15 12:29:17.616148 containerd[1734]: time="2025-05-15T12:29:17.616111476Z" level=info msg="CreateContainer within sandbox \"4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"817299cc37575a0fa3268df3f0dcf4c0b1211e6099b7512574c661eb63fc0a1e\""
May 15 12:29:17.616596 containerd[1734]: time="2025-05-15T12:29:17.616448822Z" level=info msg="StartContainer for \"817299cc37575a0fa3268df3f0dcf4c0b1211e6099b7512574c661eb63fc0a1e\""
May 15 12:29:17.617604 containerd[1734]: time="2025-05-15T12:29:17.617581380Z" level=info msg="connecting to shim 817299cc37575a0fa3268df3f0dcf4c0b1211e6099b7512574c661eb63fc0a1e" address="unix:///run/containerd/s/e262020aa9b346abcc55aa71caf65f0a52410fb8be6193bd05a1d85db760603b" protocol=ttrpc version=3
May 15 12:29:17.635326 systemd[1]: Started cri-containerd-817299cc37575a0fa3268df3f0dcf4c0b1211e6099b7512574c661eb63fc0a1e.scope - libcontainer container 817299cc37575a0fa3268df3f0dcf4c0b1211e6099b7512574c661eb63fc0a1e.
May 15 12:29:17.673616 containerd[1734]: time="2025-05-15T12:29:17.673590324Z" level=info msg="StartContainer for \"817299cc37575a0fa3268df3f0dcf4c0b1211e6099b7512574c661eb63fc0a1e\" returns successfully"
May 15 12:29:17.776646 containerd[1734]: time="2025-05-15T12:29:17.776585956Z" level=info msg="StopContainer for \"817299cc37575a0fa3268df3f0dcf4c0b1211e6099b7512574c661eb63fc0a1e\" with timeout 30 (s)"
May 15 12:29:17.778270 containerd[1734]: time="2025-05-15T12:29:17.778162649Z" level=info msg="Stop container \"817299cc37575a0fa3268df3f0dcf4c0b1211e6099b7512574c661eb63fc0a1e\" with signal terminated"
May 15 12:29:17.787871 systemd[1]: cri-containerd-817299cc37575a0fa3268df3f0dcf4c0b1211e6099b7512574c661eb63fc0a1e.scope: Deactivated successfully.
May 15 12:29:17.790941 containerd[1734]: time="2025-05-15T12:29:17.790903435Z" level=info msg="received exit event container_id:\"817299cc37575a0fa3268df3f0dcf4c0b1211e6099b7512574c661eb63fc0a1e\" id:\"817299cc37575a0fa3268df3f0dcf4c0b1211e6099b7512574c661eb63fc0a1e\" pid:6601 exit_status:1 exited_at:{seconds:1747312157 nanos:790638373}"
May 15 12:29:17.793035 containerd[1734]: time="2025-05-15T12:29:17.792919121Z" level=info msg="TaskExit event in podsandbox handler container_id:\"817299cc37575a0fa3268df3f0dcf4c0b1211e6099b7512574c661eb63fc0a1e\" id:\"817299cc37575a0fa3268df3f0dcf4c0b1211e6099b7512574c661eb63fc0a1e\" pid:6601 exit_status:1 exited_at:{seconds:1747312157 nanos:790638373}"
May 15 12:29:17.828296 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-817299cc37575a0fa3268df3f0dcf4c0b1211e6099b7512574c661eb63fc0a1e-rootfs.mount: Deactivated successfully.
May 15 12:29:21.369947 systemd[1]: Started sshd@17-10.200.8.35:22-10.200.16.10:53190.service - OpenSSH per-connection server daemon (10.200.16.10:53190).
May 15 12:29:22.000560 sshd[6649]: Accepted publickey for core from 10.200.16.10 port 53190 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:29:22.001506 sshd-session[6649]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:29:22.005584 systemd-logind[1702]: New session 20 of user core.
May 15 12:29:22.009321 systemd[1]: Started session-20.scope - Session 20 of User core.
May 15 12:29:22.490761 sshd[6651]: Connection closed by 10.200.16.10 port 53190
May 15 12:29:22.491148 sshd-session[6649]: pam_unix(sshd:session): session closed for user core
May 15 12:29:22.493800 systemd[1]: sshd@17-10.200.8.35:22-10.200.16.10:53190.service: Deactivated successfully.
May 15 12:29:22.495492 systemd[1]: session-20.scope: Deactivated successfully.
May 15 12:29:22.496246 systemd-logind[1702]: Session 20 logged out. Waiting for processes to exit.
May 15 12:29:22.497249 systemd-logind[1702]: Removed session 20.
May 15 12:29:27.604279 systemd[1]: Started sshd@18-10.200.8.35:22-10.200.16.10:53198.service - OpenSSH per-connection server daemon (10.200.16.10:53198).
May 15 12:29:27.791596 containerd[1734]: time="2025-05-15T12:29:27.791556670Z" level=error msg="failed to handle container TaskExit event container_id:\"817299cc37575a0fa3268df3f0dcf4c0b1211e6099b7512574c661eb63fc0a1e\" id:\"817299cc37575a0fa3268df3f0dcf4c0b1211e6099b7512574c661eb63fc0a1e\" pid:6601 exit_status:1 exited_at:{seconds:1747312157 nanos:790638373}" error="failed to stop container: failed to delete task: context deadline exceeded"
May 15 12:29:28.242787 sshd[6673]: Accepted publickey for core from 10.200.16.10 port 53198 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:29:28.269456 sshd-session[6673]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:29:28.273258 systemd-logind[1702]: New session 21 of user core.
May 15 12:29:28.279304 systemd[1]: Started session-21.scope - Session 21 of User core.
May 15 12:29:28.412333 containerd[1734]: time="2025-05-15T12:29:28.412272514Z" level=error msg="ttrpc: received message on inactive stream" stream=35
May 15 12:29:28.733888 sshd[6675]: Connection closed by 10.200.16.10 port 53198
May 15 12:29:28.734283 sshd-session[6673]: pam_unix(sshd:session): session closed for user core
May 15 12:29:28.736279 systemd[1]: sshd@18-10.200.8.35:22-10.200.16.10:53198.service: Deactivated successfully.
May 15 12:29:28.737821 systemd[1]: session-21.scope: Deactivated successfully.
May 15 12:29:28.739422 systemd-logind[1702]: Session 21 logged out. Waiting for processes to exit.
May 15 12:29:28.740469 systemd-logind[1702]: Removed session 21.
May 15 12:29:29.539186 containerd[1734]: time="2025-05-15T12:29:29.539123878Z" level=info msg="TaskExit event container_id:\"817299cc37575a0fa3268df3f0dcf4c0b1211e6099b7512574c661eb63fc0a1e\" id:\"817299cc37575a0fa3268df3f0dcf4c0b1211e6099b7512574c661eb63fc0a1e\" pid:6601 exit_status:1 exited_at:{seconds:1747312157 nanos:790638373}"
May 15 12:29:29.540702 containerd[1734]: time="2025-05-15T12:29:29.540677544Z" level=info msg="Ensure that container 817299cc37575a0fa3268df3f0dcf4c0b1211e6099b7512574c661eb63fc0a1e in task-service has been cleanup successfully"
May 15 12:29:29.545899 containerd[1734]: time="2025-05-15T12:29:29.545874791Z" level=info msg="StopContainer for \"817299cc37575a0fa3268df3f0dcf4c0b1211e6099b7512574c661eb63fc0a1e\" returns successfully"
May 15 12:29:29.546233 containerd[1734]: time="2025-05-15T12:29:29.546216873Z" level=info msg="StopPodSandbox for \"4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b\""
May 15 12:29:29.546310 containerd[1734]: time="2025-05-15T12:29:29.546282377Z" level=info msg="Container to stop \"817299cc37575a0fa3268df3f0dcf4c0b1211e6099b7512574c661eb63fc0a1e\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
May 15 12:29:29.551696 systemd[1]: cri-containerd-4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b.scope: Deactivated successfully.
May 15 12:29:29.553154 containerd[1734]: time="2025-05-15T12:29:29.553135199Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b\" id:\"4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b\" pid:6203 exit_status:137 exited_at:{seconds:1747312169 nanos:552529309}"
May 15 12:29:29.572978 containerd[1734]: time="2025-05-15T12:29:29.572883961Z" level=info msg="received exit event sandbox_id:\"4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b\" exit_status:137 exited_at:{seconds:1747312169 nanos:552529309}"
May 15 12:29:29.573129 containerd[1734]: time="2025-05-15T12:29:29.573105979Z" level=info msg="shim disconnected" id=4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b namespace=k8s.io
May 15 12:29:29.573313 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b-rootfs.mount: Deactivated successfully.
May 15 12:29:29.577413 containerd[1734]: time="2025-05-15T12:29:29.573767936Z" level=warning msg="cleaning up after shim disconnected" id=4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b namespace=k8s.io
May 15 12:29:29.577413 containerd[1734]: time="2025-05-15T12:29:29.573786278Z" level=info msg="cleaning up dead shim" namespace=k8s.io
May 15 12:29:29.577220 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b-shm.mount: Deactivated successfully.
May 15 12:29:29.620366 kubelet[3331]: I0515 12:29:29.619987 3331 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-b7f8fc4d6-6ch4b" podStartSLOduration=118.975411977 podStartE2EDuration="2m29.619969281s" podCreationTimestamp="2025-05-15 12:27:00 +0000 UTC" firstStartedPulling="2025-05-15 12:28:46.579260505 +0000 UTC m=+125.214593936" lastFinishedPulling="2025-05-15 12:29:17.223817812 +0000 UTC m=+155.859151240" observedRunningTime="2025-05-15 12:29:17.793651393 +0000 UTC m=+156.428984832" watchObservedRunningTime="2025-05-15 12:29:29.619969281 +0000 UTC m=+168.255302719"
May 15 12:29:29.622296 systemd-networkd[1361]: cali45ba0ebc4b2: Link DOWN
May 15 12:29:29.622880 systemd-networkd[1361]: cali45ba0ebc4b2: Lost carrier
May 15 12:29:29.683159 containerd[1734]: 2025-05-15 12:29:29.617 [INFO][6740] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b"
May 15 12:29:29.683159 containerd[1734]: 2025-05-15 12:29:29.619 [INFO][6740] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" iface="eth0" netns="/var/run/netns/cni-eb3dc3e3-de42-5087-d89c-18618a7d7f7f"
May 15 12:29:29.683159 containerd[1734]: 2025-05-15 12:29:29.621 [INFO][6740] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" iface="eth0" netns="/var/run/netns/cni-eb3dc3e3-de42-5087-d89c-18618a7d7f7f"
May 15 12:29:29.683159 containerd[1734]: 2025-05-15 12:29:29.631 [INFO][6740] cni-plugin/dataplane_linux.go 604: Deleted device in netns. ContainerID="4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" after=10.883897ms iface="eth0" netns="/var/run/netns/cni-eb3dc3e3-de42-5087-d89c-18618a7d7f7f"
May 15 12:29:29.683159 containerd[1734]: 2025-05-15 12:29:29.631 [INFO][6740] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b"
May 15 12:29:29.683159 containerd[1734]: 2025-05-15 12:29:29.631 [INFO][6740] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b"
May 15 12:29:29.683159 containerd[1734]: 2025-05-15 12:29:29.650 [INFO][6752] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" HandleID="k8s-pod-network.4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" Workload="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--6ch4b-eth0"
May 15 12:29:29.683159 containerd[1734]: 2025-05-15 12:29:29.650 [INFO][6752] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
May 15 12:29:29.683159 containerd[1734]: 2025-05-15 12:29:29.650 [INFO][6752] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
May 15 12:29:29.683159 containerd[1734]: 2025-05-15 12:29:29.681 [INFO][6752] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" HandleID="k8s-pod-network.4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" Workload="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--6ch4b-eth0"
May 15 12:29:29.683159 containerd[1734]: 2025-05-15 12:29:29.681 [INFO][6752] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" HandleID="k8s-pod-network.4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" Workload="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--6ch4b-eth0"
May 15 12:29:29.683159 containerd[1734]: 2025-05-15 12:29:29.681 [INFO][6752] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
May 15 12:29:29.683159 containerd[1734]: 2025-05-15 12:29:29.682 [INFO][6740] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b"
May 15 12:29:29.685120 systemd[1]: run-netns-cni\x2deb3dc3e3\x2dde42\x2d5087\x2dd89c\x2d18618a7d7f7f.mount: Deactivated successfully.
May 15 12:29:29.685383 containerd[1734]: time="2025-05-15T12:29:29.685281367Z" level=info msg="TearDown network for sandbox \"4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b\" successfully"
May 15 12:29:29.685383 containerd[1734]: time="2025-05-15T12:29:29.685308066Z" level=info msg="StopPodSandbox for \"4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b\" returns successfully"
May 15 12:29:29.793997 kubelet[3331]: I0515 12:29:29.793936 3331 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/dea655ce-5eab-4e67-a276-be0e9b39cc85-calico-apiserver-certs\") pod \"dea655ce-5eab-4e67-a276-be0e9b39cc85\" (UID: \"dea655ce-5eab-4e67-a276-be0e9b39cc85\") "
May 15 12:29:29.793997 kubelet[3331]: I0515 12:29:29.793972 3331 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g8g5\" (UniqueName: \"kubernetes.io/projected/dea655ce-5eab-4e67-a276-be0e9b39cc85-kube-api-access-6g8g5\") pod \"dea655ce-5eab-4e67-a276-be0e9b39cc85\" (UID: \"dea655ce-5eab-4e67-a276-be0e9b39cc85\") "
May 15 12:29:29.798784 kubelet[3331]: I0515 12:29:29.798743 3331 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dea655ce-5eab-4e67-a276-be0e9b39cc85-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "dea655ce-5eab-4e67-a276-be0e9b39cc85" (UID: "dea655ce-5eab-4e67-a276-be0e9b39cc85"). InnerVolumeSpecName "calico-apiserver-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
May 15 12:29:29.799610 kubelet[3331]: I0515 12:29:29.799589 3331 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dea655ce-5eab-4e67-a276-be0e9b39cc85-kube-api-access-6g8g5" (OuterVolumeSpecName: "kube-api-access-6g8g5") pod "dea655ce-5eab-4e67-a276-be0e9b39cc85" (UID: "dea655ce-5eab-4e67-a276-be0e9b39cc85"). InnerVolumeSpecName "kube-api-access-6g8g5". PluginName "kubernetes.io/projected", VolumeGidValue ""
May 15 12:29:29.799811 systemd[1]: var-lib-kubelet-pods-dea655ce\x2d5eab\x2d4e67\x2da276\x2dbe0e9b39cc85-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d6g8g5.mount: Deactivated successfully.
May 15 12:29:29.801283 kubelet[3331]: I0515 12:29:29.801074 3331 scope.go:117] "RemoveContainer" containerID="817299cc37575a0fa3268df3f0dcf4c0b1211e6099b7512574c661eb63fc0a1e"
May 15 12:29:29.803982 systemd[1]: var-lib-kubelet-pods-dea655ce\x2d5eab\x2d4e67\x2da276\x2dbe0e9b39cc85-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully.
May 15 12:29:29.805101 containerd[1734]: time="2025-05-15T12:29:29.805083681Z" level=info msg="RemoveContainer for \"817299cc37575a0fa3268df3f0dcf4c0b1211e6099b7512574c661eb63fc0a1e\""
May 15 12:29:29.807809 systemd[1]: Removed slice kubepods-besteffort-poddea655ce_5eab_4e67_a276_be0e9b39cc85.slice - libcontainer container kubepods-besteffort-poddea655ce_5eab_4e67_a276_be0e9b39cc85.slice.
May 15 12:29:29.894271 kubelet[3331]: I0515 12:29:29.894248 3331 reconciler_common.go:289] "Volume detached for volume \"kube-api-access-6g8g5\" (UniqueName: \"kubernetes.io/projected/dea655ce-5eab-4e67-a276-be0e9b39cc85-kube-api-access-6g8g5\") on node \"ci-4334.0.0-a-9b1bbdffc7\" DevicePath \"\""
May 15 12:29:29.894363 kubelet[3331]: I0515 12:29:29.894285 3331 reconciler_common.go:289] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/dea655ce-5eab-4e67-a276-be0e9b39cc85-calico-apiserver-certs\") on node \"ci-4334.0.0-a-9b1bbdffc7\" DevicePath \"\""
May 15 12:29:30.465428 containerd[1734]: time="2025-05-15T12:29:30.465386046Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6f3538d4d98068af62053dba94b18585d28518dccba039cd134b61c00ee8af7d\" id:\"6a681b15dedf32dc20cc90bb503eddfc3a26455ba7366f63c044ca9a926775a4\" pid:6777 exited_at:{seconds:1747312170 nanos:465033626}"
May 15 12:29:31.434290 kubelet[3331]: I0515 12:29:31.434262 3331 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dea655ce-5eab-4e67-a276-be0e9b39cc85" path="/var/lib/kubelet/pods/dea655ce-5eab-4e67-a276-be0e9b39cc85/volumes"
May 15 12:29:31.930426 containerd[1734]: time="2025-05-15T12:29:31.930383066Z" level=info msg="RemoveContainer for \"817299cc37575a0fa3268df3f0dcf4c0b1211e6099b7512574c661eb63fc0a1e\" returns successfully"
May 15 12:29:31.930722 kubelet[3331]: I0515 12:29:31.930566 3331 scope.go:117] "RemoveContainer" containerID="817299cc37575a0fa3268df3f0dcf4c0b1211e6099b7512574c661eb63fc0a1e"
May 15 12:29:31.930876 containerd[1734]: time="2025-05-15T12:29:31.930834145Z" level=error msg="ContainerStatus for \"817299cc37575a0fa3268df3f0dcf4c0b1211e6099b7512574c661eb63fc0a1e\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"817299cc37575a0fa3268df3f0dcf4c0b1211e6099b7512574c661eb63fc0a1e\": not found"
May 15 12:29:31.930987 kubelet[3331]: E0515 12:29:31.930956 3331 remote_runtime.go:432] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"817299cc37575a0fa3268df3f0dcf4c0b1211e6099b7512574c661eb63fc0a1e\": not found" containerID="817299cc37575a0fa3268df3f0dcf4c0b1211e6099b7512574c661eb63fc0a1e"
May 15 12:29:31.931078 kubelet[3331]: I0515 12:29:31.930994 3331 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"817299cc37575a0fa3268df3f0dcf4c0b1211e6099b7512574c661eb63fc0a1e"} err="failed to get container status \"817299cc37575a0fa3268df3f0dcf4c0b1211e6099b7512574c661eb63fc0a1e\": rpc error: code = NotFound desc = an error occurred when try to find container \"817299cc37575a0fa3268df3f0dcf4c0b1211e6099b7512574c661eb63fc0a1e\": not found"
May 15 12:29:33.849011 systemd[1]: Started sshd@19-10.200.8.35:22-10.200.16.10:51114.service - OpenSSH per-connection server daemon (10.200.16.10:51114).
May 15 12:29:34.475568 containerd[1734]: time="2025-05-15T12:29:34.475518063Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 15 12:29:34.485132 sshd[6791]: Accepted publickey for core from 10.200.16.10 port 51114 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:29:34.486121 sshd-session[6791]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:29:34.489959 systemd-logind[1702]: New session 22 of user core.
May 15 12:29:34.494321 systemd[1]: Started session-22.scope - Session 22 of User core.
May 15 12:29:34.522871 containerd[1734]: time="2025-05-15T12:29:34.522836560Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3: active requests=0, bytes read=13991773"
May 15 12:29:34.570713 containerd[1734]: time="2025-05-15T12:29:34.570663894Z" level=info msg="ImageCreate event name:\"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 15 12:29:34.618118 containerd[1734]: time="2025-05-15T12:29:34.618056825Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:3f15090a9bb45773d1fd019455ec3d3f3746f3287c35d8013e497b38d8237324\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 15 12:29:34.618647 containerd[1734]: time="2025-05-15T12:29:34.618549361Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" with image id \"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:3f15090a9bb45773d1fd019455ec3d3f3746f3287c35d8013e497b38d8237324\", size \"15484347\" in 17.394490764s"
May 15 12:29:34.618647 containerd[1734]: time="2025-05-15T12:29:34.618575757Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" returns image reference \"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\""
May 15 12:29:34.620459 containerd[1734]: time="2025-05-15T12:29:34.620428284Z" level=info msg="CreateContainer within sandbox \"c274e41b43f14e904b98f8cc0378eaa1801754a53280774dde37e4416dcbc034\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
May 15 12:29:34.777731 containerd[1734]: time="2025-05-15T12:29:34.777669725Z" level=info msg="Container c44d04f97ee1c80f58485ae60e25aafe5cb730485504392451943a79362c7274: CDI devices from CRI Config.CDIDevices: []"
May 15 12:29:34.924802 containerd[1734]: time="2025-05-15T12:29:34.924778911Z" level=info msg="CreateContainer within sandbox \"c274e41b43f14e904b98f8cc0378eaa1801754a53280774dde37e4416dcbc034\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"c44d04f97ee1c80f58485ae60e25aafe5cb730485504392451943a79362c7274\""
May 15 12:29:34.925147 containerd[1734]: time="2025-05-15T12:29:34.925122530Z" level=info msg="StartContainer for \"c44d04f97ee1c80f58485ae60e25aafe5cb730485504392451943a79362c7274\""
May 15 12:29:34.926244 containerd[1734]: time="2025-05-15T12:29:34.926220733Z" level=info msg="connecting to shim c44d04f97ee1c80f58485ae60e25aafe5cb730485504392451943a79362c7274" address="unix:///run/containerd/s/194a212d6993460147836338e92322d3da6b755cf092f352c13968992de7bf91" protocol=ttrpc version=3
May 15 12:29:34.947329 systemd[1]: Started cri-containerd-c44d04f97ee1c80f58485ae60e25aafe5cb730485504392451943a79362c7274.scope - libcontainer container c44d04f97ee1c80f58485ae60e25aafe5cb730485504392451943a79362c7274.
May 15 12:29:34.975527 sshd[6801]: Connection closed by 10.200.16.10 port 51114
May 15 12:29:34.975767 sshd-session[6791]: pam_unix(sshd:session): session closed for user core
May 15 12:29:34.976744 containerd[1734]: time="2025-05-15T12:29:34.976726475Z" level=info msg="StartContainer for \"c44d04f97ee1c80f58485ae60e25aafe5cb730485504392451943a79362c7274\" returns successfully"
May 15 12:29:34.979740 systemd[1]: sshd@19-10.200.8.35:22-10.200.16.10:51114.service: Deactivated successfully.
May 15 12:29:34.981451 systemd[1]: session-22.scope: Deactivated successfully.
May 15 12:29:34.982318 systemd-logind[1702]: Session 22 logged out. Waiting for processes to exit.
May 15 12:29:34.983299 systemd-logind[1702]: Removed session 22.
May 15 12:29:35.174621 kubelet[3331]: I0515 12:29:35.174601 3331 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
May 15 12:29:35.174621 kubelet[3331]: I0515 12:29:35.174627 3331 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
May 15 12:29:35.824194 kubelet[3331]: I0515 12:29:35.823755 3331 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-fthxd" podStartSLOduration=107.274583079 podStartE2EDuration="2m35.823739463s" podCreationTimestamp="2025-05-15 12:27:00 +0000 UTC" firstStartedPulling="2025-05-15 12:28:46.070114101 +0000 UTC m=+124.705447533" lastFinishedPulling="2025-05-15 12:29:34.619270477 +0000 UTC m=+173.254603917" observedRunningTime="2025-05-15 12:29:35.82270655 +0000 UTC m=+174.458039992" watchObservedRunningTime="2025-05-15 12:29:35.823739463 +0000 UTC m=+174.459072936"
May 15 12:29:37.317881 containerd[1734]: time="2025-05-15T12:29:37.317845585Z" level=info msg="TaskExit event in podsandbox handler container_id:\"14a3f8f65399a7a5f1b36c32c1688a00f655e302de44be23c8d9c837423e5583\" id:\"cee1d4ff9da5501b4331aa80339fa19af0af3902d8fd514be42d0f01d8b33da5\" pid:6860 exited_at:{seconds:1747312177 nanos:317655706}"
May 15 12:29:40.090211 systemd[1]: Started sshd@20-10.200.8.35:22-10.200.16.10:47698.service - OpenSSH per-connection server daemon (10.200.16.10:47698).
May 15 12:29:40.722757 sshd[6873]: Accepted publickey for core from 10.200.16.10 port 47698 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:29:40.723677 sshd-session[6873]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:29:40.727306 systemd-logind[1702]: New session 23 of user core.
May 15 12:29:40.731311 systemd[1]: Started session-23.scope - Session 23 of User core.
May 15 12:29:41.212762 sshd[6875]: Connection closed by 10.200.16.10 port 47698
May 15 12:29:41.213144 sshd-session[6873]: pam_unix(sshd:session): session closed for user core
May 15 12:29:41.215133 systemd[1]: sshd@20-10.200.8.35:22-10.200.16.10:47698.service: Deactivated successfully.
May 15 12:29:41.216920 systemd[1]: session-23.scope: Deactivated successfully.
May 15 12:29:41.218142 systemd-logind[1702]: Session 23 logged out. Waiting for processes to exit.
May 15 12:29:41.219858 systemd-logind[1702]: Removed session 23.
May 15 12:29:41.869234 containerd[1734]: time="2025-05-15T12:29:41.869192475Z" level=info msg="StopPodSandbox for \"4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b\""
May 15 12:29:41.913435 containerd[1734]: 2025-05-15 12:29:41.892 [WARNING][6901] cni-plugin/k8s.go 566: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--6ch4b-eth0"
May 15 12:29:41.913435 containerd[1734]: 2025-05-15 12:29:41.892 [INFO][6901] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b"
May 15 12:29:41.913435 containerd[1734]: 2025-05-15 12:29:41.892 [INFO][6901] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" iface="eth0" netns=""
May 15 12:29:41.913435 containerd[1734]: 2025-05-15 12:29:41.892 [INFO][6901] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b"
May 15 12:29:41.913435 containerd[1734]: 2025-05-15 12:29:41.892 [INFO][6901] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b"
May 15 12:29:41.913435 containerd[1734]: 2025-05-15 12:29:41.907 [INFO][6908] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" HandleID="k8s-pod-network.4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" Workload="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--6ch4b-eth0"
May 15 12:29:41.913435 containerd[1734]: 2025-05-15 12:29:41.907 [INFO][6908] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
May 15 12:29:41.913435 containerd[1734]: 2025-05-15 12:29:41.907 [INFO][6908] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
May 15 12:29:41.913435 containerd[1734]: 2025-05-15 12:29:41.911 [WARNING][6908] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" HandleID="k8s-pod-network.4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" Workload="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--6ch4b-eth0"
May 15 12:29:41.913435 containerd[1734]: 2025-05-15 12:29:41.911 [INFO][6908] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" HandleID="k8s-pod-network.4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" Workload="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--6ch4b-eth0"
May 15 12:29:41.913435 containerd[1734]: 2025-05-15 12:29:41.912 [INFO][6908] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
May 15 12:29:41.913435 containerd[1734]: 2025-05-15 12:29:41.912 [INFO][6901] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b"
May 15 12:29:41.913861 containerd[1734]: time="2025-05-15T12:29:41.913495659Z" level=info msg="TearDown network for sandbox \"4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b\" successfully"
May 15 12:29:41.913861 containerd[1734]: time="2025-05-15T12:29:41.913521082Z" level=info msg="StopPodSandbox for \"4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b\" returns successfully"
May 15 12:29:41.913861 containerd[1734]: time="2025-05-15T12:29:41.913836413Z" level=info msg="RemovePodSandbox for \"4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b\""
May 15 12:29:41.913861 containerd[1734]: time="2025-05-15T12:29:41.913858977Z" level=info msg="Forcibly stopping sandbox \"4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b\""
May 15 12:29:41.954832 containerd[1734]: 2025-05-15 12:29:41.935 [WARNING][6926] cni-plugin/k8s.go 566: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--6ch4b-eth0"
May 15 12:29:41.954832 containerd[1734]: 2025-05-15 12:29:41.935 [INFO][6926] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b"
May 15 12:29:41.954832 containerd[1734]: 2025-05-15 12:29:41.935 [INFO][6926] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" iface="eth0" netns=""
May 15 12:29:41.954832 containerd[1734]: 2025-05-15 12:29:41.936 [INFO][6926] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b"
May 15 12:29:41.954832 containerd[1734]: 2025-05-15 12:29:41.936 [INFO][6926] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b"
May 15 12:29:41.954832 containerd[1734]: 2025-05-15 12:29:41.948 [INFO][6934] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" HandleID="k8s-pod-network.4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" Workload="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--6ch4b-eth0"
May 15 12:29:41.954832 containerd[1734]: 2025-05-15 12:29:41.948 [INFO][6934] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
May 15 12:29:41.954832 containerd[1734]: 2025-05-15 12:29:41.948 [INFO][6934] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
May 15 12:29:41.954832 containerd[1734]: 2025-05-15 12:29:41.952 [WARNING][6934] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" HandleID="k8s-pod-network.4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" Workload="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--6ch4b-eth0"
May 15 12:29:41.954832 containerd[1734]: 2025-05-15 12:29:41.952 [INFO][6934] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" HandleID="k8s-pod-network.4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b" Workload="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--6ch4b-eth0"
May 15 12:29:41.954832 containerd[1734]: 2025-05-15 12:29:41.953 [INFO][6934] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
May 15 12:29:41.954832 containerd[1734]: 2025-05-15 12:29:41.954 [INFO][6926] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b"
May 15 12:29:41.955116 containerd[1734]: time="2025-05-15T12:29:41.954928069Z" level=info msg="TearDown network for sandbox \"4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b\" successfully"
May 15 12:29:41.957689 containerd[1734]: time="2025-05-15T12:29:41.957621702Z" level=info msg="Ensure that sandbox 4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b in task-service has been cleanup successfully"
May 15 12:29:41.967409 containerd[1734]: time="2025-05-15T12:29:41.967386091Z" level=info msg="RemovePodSandbox \"4772b911a9c954a194b183672bf766233cd91d1ecabc5c6f821732559dd36d6b\" returns successfully"
May 15 12:29:46.325128 systemd[1]: Started sshd@21-10.200.8.35:22-10.200.16.10:47706.service - OpenSSH per-connection server daemon (10.200.16.10:47706).
May 15 12:29:46.958209 sshd[6941]: Accepted publickey for core from 10.200.16.10 port 47706 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:29:46.959138 sshd-session[6941]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:29:46.963382 systemd-logind[1702]: New session 24 of user core.
May 15 12:29:46.969548 systemd[1]: Started session-24.scope - Session 24 of User core.
May 15 12:29:47.447945 sshd[6943]: Connection closed by 10.200.16.10 port 47706
May 15 12:29:47.449503 sshd-session[6941]: pam_unix(sshd:session): session closed for user core
May 15 12:29:47.452528 systemd[1]: sshd@21-10.200.8.35:22-10.200.16.10:47706.service: Deactivated successfully.
May 15 12:29:47.454034 systemd[1]: session-24.scope: Deactivated successfully.
May 15 12:29:47.454775 systemd-logind[1702]: Session 24 logged out. Waiting for processes to exit.
May 15 12:29:47.455750 systemd-logind[1702]: Removed session 24.
May 15 12:29:47.669375 containerd[1734]: time="2025-05-15T12:29:47.669324676Z" level=info msg="StopContainer for \"566c283f07aad937c537c81fe72f373d93224e8ed584df2077de2020d5c5d713\" with timeout 30 (s)"
May 15 12:29:47.669772 containerd[1734]: time="2025-05-15T12:29:47.669754660Z" level=info msg="Stop container \"566c283f07aad937c537c81fe72f373d93224e8ed584df2077de2020d5c5d713\" with signal terminated"
May 15 12:29:47.767618 systemd[1]: cri-containerd-566c283f07aad937c537c81fe72f373d93224e8ed584df2077de2020d5c5d713.scope: Deactivated successfully.
May 15 12:29:47.769809 containerd[1734]: time="2025-05-15T12:29:47.769733417Z" level=info msg="received exit event container_id:\"566c283f07aad937c537c81fe72f373d93224e8ed584df2077de2020d5c5d713\" id:\"566c283f07aad937c537c81fe72f373d93224e8ed584df2077de2020d5c5d713\" pid:6512 exit_status:1 exited_at:{seconds:1747312187 nanos:769469959}"
May 15 12:29:47.769883 containerd[1734]: time="2025-05-15T12:29:47.769733379Z" level=info msg="TaskExit event in podsandbox handler container_id:\"566c283f07aad937c537c81fe72f373d93224e8ed584df2077de2020d5c5d713\" id:\"566c283f07aad937c537c81fe72f373d93224e8ed584df2077de2020d5c5d713\" pid:6512 exit_status:1 exited_at:{seconds:1747312187 nanos:769469959}"
May 15 12:29:47.789617 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-566c283f07aad937c537c81fe72f373d93224e8ed584df2077de2020d5c5d713-rootfs.mount: Deactivated successfully.
May 15 12:29:51.363871 containerd[1734]: time="2025-05-15T12:29:51.363837434Z" level=info msg="StopContainer for \"566c283f07aad937c537c81fe72f373d93224e8ed584df2077de2020d5c5d713\" returns successfully"
May 15 12:29:51.364426 containerd[1734]: time="2025-05-15T12:29:51.364234391Z" level=info msg="StopPodSandbox for \"93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525\""
May 15 12:29:51.364426 containerd[1734]: time="2025-05-15T12:29:51.364307699Z" level=info msg="Container to stop \"566c283f07aad937c537c81fe72f373d93224e8ed584df2077de2020d5c5d713\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
May 15 12:29:51.370406 systemd[1]: cri-containerd-93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525.scope: Deactivated successfully.
May 15 12:29:51.375740 containerd[1734]: time="2025-05-15T12:29:51.375702702Z" level=info msg="TaskExit event in podsandbox handler container_id:\"93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525\" id:\"93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525\" pid:6048 exit_status:137 exited_at:{seconds:1747312191 nanos:375403503}"
May 15 12:29:51.396972 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525-rootfs.mount: Deactivated successfully.
May 15 12:29:51.398077 containerd[1734]: time="2025-05-15T12:29:51.398025134Z" level=info msg="shim disconnected" id=93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525 namespace=k8s.io
May 15 12:29:51.398494 containerd[1734]: time="2025-05-15T12:29:51.398407829Z" level=warning msg="cleaning up after shim disconnected" id=93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525 namespace=k8s.io
May 15 12:29:51.398494 containerd[1734]: time="2025-05-15T12:29:51.398424367Z" level=info msg="cleaning up dead shim" namespace=k8s.io
May 15 12:29:51.919187 containerd[1734]: time="2025-05-15T12:29:51.917242550Z" level=error msg="Failed to handle event container_id:\"93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525\" id:\"93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525\" pid:6048 exit_status:137 exited_at:{seconds:1747312191 nanos:375403503} for 93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" error="failed to handle container TaskExit event: failed to stop sandbox: failed to delete task: ttrpc: closed"
May 15 12:29:51.919187 containerd[1734]: time="2025-05-15T12:29:51.917241437Z" level=info msg="received exit event sandbox_id:\"93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525\" exit_status:137 exited_at:{seconds:1747312191 nanos:375403503}"
May 15 12:29:51.919464 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525-shm.mount: Deactivated successfully.
May 15 12:29:51.956466 systemd-networkd[1361]: cali392e1acd4dd: Link DOWN
May 15 12:29:51.958138 systemd-networkd[1361]: cali392e1acd4dd: Lost carrier
May 15 12:29:52.015930 containerd[1734]: 2025-05-15 12:29:51.954 [INFO][7033] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525"
May 15 12:29:52.015930 containerd[1734]: 2025-05-15 12:29:51.954 [INFO][7033] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" iface="eth0" netns="/var/run/netns/cni-9049174f-f09e-4ee0-fab7-7aa96fed3bcc"
May 15 12:29:52.015930 containerd[1734]: 2025-05-15 12:29:51.955 [INFO][7033] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" iface="eth0" netns="/var/run/netns/cni-9049174f-f09e-4ee0-fab7-7aa96fed3bcc"
May 15 12:29:52.015930 containerd[1734]: 2025-05-15 12:29:51.964 [INFO][7033] cni-plugin/dataplane_linux.go 604: Deleted device in netns. ContainerID="93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" after=9.708209ms iface="eth0" netns="/var/run/netns/cni-9049174f-f09e-4ee0-fab7-7aa96fed3bcc"
May 15 12:29:52.015930 containerd[1734]: 2025-05-15 12:29:51.964 [INFO][7033] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525"
May 15 12:29:52.015930 containerd[1734]: 2025-05-15 12:29:51.964 [INFO][7033] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525"
May 15 12:29:52.015930 containerd[1734]: 2025-05-15 12:29:51.985 [INFO][7042] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" HandleID="k8s-pod-network.93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" Workload="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--nk2q9-eth0"
May 15 12:29:52.015930 containerd[1734]: 2025-05-15 12:29:51.985 [INFO][7042] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
May 15 12:29:52.015930 containerd[1734]: 2025-05-15 12:29:51.985 [INFO][7042] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
May 15 12:29:52.015930 containerd[1734]: 2025-05-15 12:29:52.013 [INFO][7042] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" HandleID="k8s-pod-network.93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" Workload="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--nk2q9-eth0"
May 15 12:29:52.015930 containerd[1734]: 2025-05-15 12:29:52.013 [INFO][7042] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" HandleID="k8s-pod-network.93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" Workload="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--nk2q9-eth0"
May 15 12:29:52.015930 containerd[1734]: 2025-05-15 12:29:52.014 [INFO][7042] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
May 15 12:29:52.015930 containerd[1734]: 2025-05-15 12:29:52.015 [INFO][7033] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525"
May 15 12:29:52.018531 containerd[1734]: time="2025-05-15T12:29:52.016255662Z" level=info msg="TearDown network for sandbox \"93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525\" successfully"
May 15 12:29:52.018531 containerd[1734]: time="2025-05-15T12:29:52.016278862Z" level=info msg="StopPodSandbox for \"93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525\" returns successfully"
May 15 12:29:52.018802 systemd[1]: run-netns-cni\x2d9049174f\x2df09e\x2d4ee0\x2dfab7\x2d7aa96fed3bcc.mount: Deactivated successfully.
May 15 12:29:52.198164 kubelet[3331]: I0515 12:29:52.198007 3331 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/1d54623e-7c9b-4f0a-8922-6bb47d7e26af-calico-apiserver-certs\") pod \"1d54623e-7c9b-4f0a-8922-6bb47d7e26af\" (UID: \"1d54623e-7c9b-4f0a-8922-6bb47d7e26af\") "
May 15 12:29:52.198164 kubelet[3331]: I0515 12:29:52.198045 3331 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvldk\" (UniqueName: \"kubernetes.io/projected/1d54623e-7c9b-4f0a-8922-6bb47d7e26af-kube-api-access-dvldk\") pod \"1d54623e-7c9b-4f0a-8922-6bb47d7e26af\" (UID: \"1d54623e-7c9b-4f0a-8922-6bb47d7e26af\") "
May 15 12:29:52.201266 kubelet[3331]: I0515 12:29:52.200812 3331 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d54623e-7c9b-4f0a-8922-6bb47d7e26af-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "1d54623e-7c9b-4f0a-8922-6bb47d7e26af" (UID: "1d54623e-7c9b-4f0a-8922-6bb47d7e26af"). InnerVolumeSpecName "calico-apiserver-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
May 15 12:29:52.201677 kubelet[3331]: I0515 12:29:52.201652 3331 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d54623e-7c9b-4f0a-8922-6bb47d7e26af-kube-api-access-dvldk" (OuterVolumeSpecName: "kube-api-access-dvldk") pod "1d54623e-7c9b-4f0a-8922-6bb47d7e26af" (UID: "1d54623e-7c9b-4f0a-8922-6bb47d7e26af"). InnerVolumeSpecName "kube-api-access-dvldk". PluginName "kubernetes.io/projected", VolumeGidValue ""
May 15 12:29:52.202956 systemd[1]: var-lib-kubelet-pods-1d54623e\x2d7c9b\x2d4f0a\x2d8922\x2d6bb47d7e26af-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2ddvldk.mount: Deactivated successfully.
May 15 12:29:52.203046 systemd[1]: var-lib-kubelet-pods-1d54623e\x2d7c9b\x2d4f0a\x2d8922\x2d6bb47d7e26af-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully.
May 15 12:29:52.298493 kubelet[3331]: I0515 12:29:52.298459 3331 reconciler_common.go:289] "Volume detached for volume \"kube-api-access-dvldk\" (UniqueName: \"kubernetes.io/projected/1d54623e-7c9b-4f0a-8922-6bb47d7e26af-kube-api-access-dvldk\") on node \"ci-4334.0.0-a-9b1bbdffc7\" DevicePath \"\""
May 15 12:29:52.298493 kubelet[3331]: I0515 12:29:52.298492 3331 reconciler_common.go:289] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/1d54623e-7c9b-4f0a-8922-6bb47d7e26af-calico-apiserver-certs\") on node \"ci-4334.0.0-a-9b1bbdffc7\" DevicePath \"\""
May 15 12:29:52.559935 systemd[1]: Started sshd@22-10.200.8.35:22-10.200.16.10:41784.service - OpenSSH per-connection server daemon (10.200.16.10:41784).
May 15 12:29:52.843954 kubelet[3331]: I0515 12:29:52.843894 3331 scope.go:117] "RemoveContainer" containerID="566c283f07aad937c537c81fe72f373d93224e8ed584df2077de2020d5c5d713"
May 15 12:29:52.847497 containerd[1734]: time="2025-05-15T12:29:52.847230177Z" level=info msg="RemoveContainer for \"566c283f07aad937c537c81fe72f373d93224e8ed584df2077de2020d5c5d713\""
May 15 12:29:52.849573 systemd[1]: Removed slice kubepods-besteffort-pod1d54623e_7c9b_4f0a_8922_6bb47d7e26af.slice - libcontainer container kubepods-besteffort-pod1d54623e_7c9b_4f0a_8922_6bb47d7e26af.slice.
May 15 12:29:52.870273 containerd[1734]: time="2025-05-15T12:29:52.870160783Z" level=info msg="RemoveContainer for \"566c283f07aad937c537c81fe72f373d93224e8ed584df2077de2020d5c5d713\" returns successfully"
May 15 12:29:53.193016 sshd[7058]: Accepted publickey for core from 10.200.16.10 port 41784 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:29:53.193945 sshd-session[7058]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:29:53.197842 systemd-logind[1702]: New session 25 of user core.
May 15 12:29:53.202312 systemd[1]: Started session-25.scope - Session 25 of User core.
May 15 12:29:53.281831 containerd[1734]: time="2025-05-15T12:29:53.281792479Z" level=info msg="TaskExit event in podsandbox handler container_id:\"93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525\" id:\"93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525\" pid:6048 exit_status:137 exited_at:{seconds:1747312191 nanos:375403503}"
May 15 12:29:53.434382 kubelet[3331]: I0515 12:29:53.434365 3331 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d54623e-7c9b-4f0a-8922-6bb47d7e26af" path="/var/lib/kubelet/pods/1d54623e-7c9b-4f0a-8922-6bb47d7e26af/volumes"
May 15 12:29:53.684770 sshd[7062]: Connection closed by 10.200.16.10 port 41784
May 15 12:29:53.685127 sshd-session[7058]: pam_unix(sshd:session): session closed for user core
May 15 12:29:53.687686 systemd[1]: sshd@22-10.200.8.35:22-10.200.16.10:41784.service: Deactivated successfully.
May 15 12:29:53.689303 systemd[1]: session-25.scope: Deactivated successfully.
May 15 12:29:53.689982 systemd-logind[1702]: Session 25 logged out. Waiting for processes to exit.
May 15 12:29:53.691281 systemd-logind[1702]: Removed session 25.
May 15 12:29:58.797072 systemd[1]: Started sshd@23-10.200.8.35:22-10.200.16.10:55438.service - OpenSSH per-connection server daemon (10.200.16.10:55438).
May 15 12:29:59.431698 sshd[7076]: Accepted publickey for core from 10.200.16.10 port 55438 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:29:59.432989 sshd-session[7076]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:29:59.437827 systemd-logind[1702]: New session 26 of user core.
May 15 12:29:59.443304 systemd[1]: Started session-26.scope - Session 26 of User core.
May 15 12:29:59.923818 sshd[7084]: Connection closed by 10.200.16.10 port 55438
May 15 12:29:59.924210 sshd-session[7076]: pam_unix(sshd:session): session closed for user core
May 15 12:29:59.926709 systemd[1]: sshd@23-10.200.8.35:22-10.200.16.10:55438.service: Deactivated successfully.
May 15 12:29:59.928208 systemd[1]: session-26.scope: Deactivated successfully.
May 15 12:29:59.928881 systemd-logind[1702]: Session 26 logged out. Waiting for processes to exit.
May 15 12:29:59.929969 systemd-logind[1702]: Removed session 26.
May 15 12:30:00.465479 containerd[1734]: time="2025-05-15T12:30:00.465370267Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6f3538d4d98068af62053dba94b18585d28518dccba039cd134b61c00ee8af7d\" id:\"c81473f58339c7eb67442770e81b3a4827b50ed1fdfce4bdb565e1a3f2c45522\" pid:7107 exited_at:{seconds:1747312200 nanos:465150807}"
May 15 12:30:05.037304 systemd[1]: Started sshd@24-10.200.8.35:22-10.200.16.10:55450.service - OpenSSH per-connection server daemon (10.200.16.10:55450).
May 15 12:30:05.675105 sshd[7124]: Accepted publickey for core from 10.200.16.10 port 55450 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:30:05.676063 sshd-session[7124]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:30:05.680007 systemd-logind[1702]: New session 27 of user core.
May 15 12:30:05.686324 systemd[1]: Started session-27.scope - Session 27 of User core.
May 15 12:30:06.171080 sshd[7126]: Connection closed by 10.200.16.10 port 55450
May 15 12:30:06.171475 sshd-session[7124]: pam_unix(sshd:session): session closed for user core
May 15 12:30:06.174266 systemd-logind[1702]: Session 27 logged out. Waiting for processes to exit.
May 15 12:30:06.174724 systemd[1]: sshd@24-10.200.8.35:22-10.200.16.10:55450.service: Deactivated successfully.
May 15 12:30:06.177801 systemd[1]: session-27.scope: Deactivated successfully.
May 15 12:30:06.182777 systemd-logind[1702]: Removed session 27.
May 15 12:30:07.331317 containerd[1734]: time="2025-05-15T12:30:07.331278505Z" level=info msg="TaskExit event in podsandbox handler container_id:\"14a3f8f65399a7a5f1b36c32c1688a00f655e302de44be23c8d9c837423e5583\" id:\"89435d5018b14a103c1c1da5b155fcc4680d55d8d1b792d6b9e53cb43ddd443b\" pid:7158 exited_at:{seconds:1747312207 nanos:330266188}"
May 15 12:30:07.332538 containerd[1734]: time="2025-05-15T12:30:07.332504276Z" level=info msg="TaskExit event in podsandbox handler container_id:\"14a3f8f65399a7a5f1b36c32c1688a00f655e302de44be23c8d9c837423e5583\" id:\"70a5489212d23d1c628de8683616cd3b9067539912f165ee14401b98aeb76ca1\" pid:7169 exited_at:{seconds:1747312207 nanos:332361419}"
May 15 12:30:11.284453 systemd[1]: Started sshd@25-10.200.8.35:22-10.200.16.10:52070.service - OpenSSH per-connection server daemon (10.200.16.10:52070).
May 15 12:30:11.919780 sshd[7182]: Accepted publickey for core from 10.200.16.10 port 52070 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:30:11.920733 sshd-session[7182]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:30:11.924232 systemd-logind[1702]: New session 28 of user core.
May 15 12:30:11.930315 systemd[1]: Started session-28.scope - Session 28 of User core.
May 15 12:30:12.429402 sshd[7189]: Connection closed by 10.200.16.10 port 52070
May 15 12:30:12.429810 sshd-session[7182]: pam_unix(sshd:session): session closed for user core
May 15 12:30:12.433212 systemd[1]: sshd@25-10.200.8.35:22-10.200.16.10:52070.service: Deactivated successfully.
May 15 12:30:12.435791 systemd[1]: session-28.scope: Deactivated successfully.
May 15 12:30:12.437092 systemd-logind[1702]: Session 28 logged out. Waiting for processes to exit.
May 15 12:30:12.438587 systemd-logind[1702]: Removed session 28.
May 15 12:30:17.542287 systemd[1]: Started sshd@26-10.200.8.35:22-10.200.16.10:52082.service - OpenSSH per-connection server daemon (10.200.16.10:52082).
May 15 12:30:18.182550 sshd[7214]: Accepted publickey for core from 10.200.16.10 port 52082 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:30:18.183499 sshd-session[7214]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:30:18.186947 systemd-logind[1702]: New session 29 of user core.
May 15 12:30:18.192333 systemd[1]: Started session-29.scope - Session 29 of User core.
May 15 12:30:18.671003 sshd[7216]: Connection closed by 10.200.16.10 port 52082
May 15 12:30:18.671383 sshd-session[7214]: pam_unix(sshd:session): session closed for user core
May 15 12:30:18.673469 systemd[1]: sshd@26-10.200.8.35:22-10.200.16.10:52082.service: Deactivated successfully.
May 15 12:30:18.675069 systemd[1]: session-29.scope: Deactivated successfully.
May 15 12:30:18.676365 systemd-logind[1702]: Session 29 logged out. Waiting for processes to exit.
May 15 12:30:18.677516 systemd-logind[1702]: Removed session 29.
May 15 12:30:23.786290 systemd[1]: Started sshd@27-10.200.8.35:22-10.200.16.10:54688.service - OpenSSH per-connection server daemon (10.200.16.10:54688).
May 15 12:30:24.421658 sshd[7230]: Accepted publickey for core from 10.200.16.10 port 54688 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:30:24.422633 sshd-session[7230]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:30:24.426237 systemd-logind[1702]: New session 30 of user core.
May 15 12:30:24.432324 systemd[1]: Started session-30.scope - Session 30 of User core.
May 15 12:30:24.912582 sshd[7232]: Connection closed by 10.200.16.10 port 54688
May 15 12:30:24.912914 sshd-session[7230]: pam_unix(sshd:session): session closed for user core
May 15 12:30:24.915560 systemd[1]: sshd@27-10.200.8.35:22-10.200.16.10:54688.service: Deactivated successfully.
May 15 12:30:24.917135 systemd[1]: session-30.scope: Deactivated successfully.
May 15 12:30:24.917765 systemd-logind[1702]: Session 30 logged out. Waiting for processes to exit.
May 15 12:30:24.918874 systemd-logind[1702]: Removed session 30.
May 15 12:30:30.026123 systemd[1]: Started sshd@28-10.200.8.35:22-10.200.16.10:38640.service - OpenSSH per-connection server daemon (10.200.16.10:38640).
May 15 12:30:30.466364 containerd[1734]: time="2025-05-15T12:30:30.466328943Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6f3538d4d98068af62053dba94b18585d28518dccba039cd134b61c00ee8af7d\" id:\"ce84b35bf03e45e0209c27423a451ed52aebcbe6f096c648479547006f6d47fd\" pid:7257 exited_at:{seconds:1747312230 nanos:466054704}"
May 15 12:30:30.661277 sshd[7243]: Accepted publickey for core from 10.200.16.10 port 38640 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:30:30.662230 sshd-session[7243]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:30:30.666122 systemd-logind[1702]: New session 31 of user core.
May 15 12:30:30.671312 systemd[1]: Started session-31.scope - Session 31 of User core.
May 15 12:30:31.150610 sshd[7270]: Connection closed by 10.200.16.10 port 38640
May 15 12:30:31.150935 sshd-session[7243]: pam_unix(sshd:session): session closed for user core
May 15 12:30:31.153378 systemd[1]: sshd@28-10.200.8.35:22-10.200.16.10:38640.service: Deactivated successfully.
May 15 12:30:31.155015 systemd[1]: session-31.scope: Deactivated successfully.
May 15 12:30:31.155677 systemd-logind[1702]: Session 31 logged out. Waiting for processes to exit.
May 15 12:30:31.156946 systemd-logind[1702]: Removed session 31.
May 15 12:30:31.264729 systemd[1]: Started sshd@29-10.200.8.35:22-10.200.16.10:38656.service - OpenSSH per-connection server daemon (10.200.16.10:38656).
May 15 12:30:31.898855 sshd[7282]: Accepted publickey for core from 10.200.16.10 port 38656 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:30:31.899759 sshd-session[7282]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:30:31.903758 systemd-logind[1702]: New session 32 of user core.
May 15 12:30:31.906315 systemd[1]: Started session-32.scope - Session 32 of User core.
May 15 12:30:32.508145 sshd[7284]: Connection closed by 10.200.16.10 port 38656
May 15 12:30:32.508591 sshd-session[7282]: pam_unix(sshd:session): session closed for user core
May 15 12:30:32.511046 systemd[1]: sshd@29-10.200.8.35:22-10.200.16.10:38656.service: Deactivated successfully.
May 15 12:30:32.512764 systemd[1]: session-32.scope: Deactivated successfully.
May 15 12:30:32.514450 systemd-logind[1702]: Session 32 logged out. Waiting for processes to exit.
May 15 12:30:32.515305 systemd-logind[1702]: Removed session 32.
May 15 12:30:32.621046 systemd[1]: Started sshd@30-10.200.8.35:22-10.200.16.10:38660.service - OpenSSH per-connection server daemon (10.200.16.10:38660).
May 15 12:30:33.259083 sshd[7296]: Accepted publickey for core from 10.200.16.10 port 38660 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:30:33.260078 sshd-session[7296]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:30:33.264021 systemd-logind[1702]: New session 33 of user core.
May 15 12:30:33.269324 systemd[1]: Started session-33.scope - Session 33 of User core.
May 15 12:30:35.247922 sshd[7298]: Connection closed by 10.200.16.10 port 38660
May 15 12:30:35.248427 sshd-session[7296]: pam_unix(sshd:session): session closed for user core
May 15 12:30:35.251049 systemd[1]: sshd@30-10.200.8.35:22-10.200.16.10:38660.service: Deactivated successfully.
May 15 12:30:35.252694 systemd[1]: session-33.scope: Deactivated successfully.
May 15 12:30:35.252893 systemd[1]: session-33.scope: Consumed 383ms CPU time, 74M memory peak.
May 15 12:30:35.253491 systemd-logind[1702]: Session 33 logged out. Waiting for processes to exit.
May 15 12:30:35.255164 systemd-logind[1702]: Removed session 33.
May 15 12:30:35.363869 systemd[1]: Started sshd@31-10.200.8.35:22-10.200.16.10:38672.service - OpenSSH per-connection server daemon (10.200.16.10:38672).
May 15 12:30:35.996051 sshd[7315]: Accepted publickey for core from 10.200.16.10 port 38672 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:30:35.996996 sshd-session[7315]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:30:36.000834 systemd-logind[1702]: New session 34 of user core.
May 15 12:30:36.005336 systemd[1]: Started session-34.scope - Session 34 of User core.
May 15 12:30:36.558016 sshd[7317]: Connection closed by 10.200.16.10 port 38672
May 15 12:30:36.558451 sshd-session[7315]: pam_unix(sshd:session): session closed for user core
May 15 12:30:36.561134 systemd[1]: sshd@31-10.200.8.35:22-10.200.16.10:38672.service: Deactivated successfully.
May 15 12:30:36.562726 systemd[1]: session-34.scope: Deactivated successfully.
May 15 12:30:36.563352 systemd-logind[1702]: Session 34 logged out. Waiting for processes to exit.
May 15 12:30:36.564538 systemd-logind[1702]: Removed session 34.
May 15 12:30:36.668611 systemd[1]: Started sshd@32-10.200.8.35:22-10.200.16.10:38678.service - OpenSSH per-connection server daemon (10.200.16.10:38678).
May 15 12:30:37.300270 sshd[7326]: Accepted publickey for core from 10.200.16.10 port 38678 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:30:37.301268 sshd-session[7326]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:30:37.306387 systemd[1]: Started session-35.scope - Session 35 of User core.
May 15 12:30:37.306423 systemd-logind[1702]: New session 35 of user core.
May 15 12:30:37.321300 containerd[1734]: time="2025-05-15T12:30:37.321267724Z" level=info msg="TaskExit event in podsandbox handler container_id:\"14a3f8f65399a7a5f1b36c32c1688a00f655e302de44be23c8d9c837423e5583\" id:\"d5f385de603b320adb058f6a3a216153a5c1b4385e2aace7598daa4e59513e1e\" pid:7341 exited_at:{seconds:1747312237 nanos:321006217}"
May 15 12:30:37.797490 sshd[7346]: Connection closed by 10.200.16.10 port 38678
May 15 12:30:37.797872 sshd-session[7326]: pam_unix(sshd:session): session closed for user core
May 15 12:30:37.800393 systemd[1]: sshd@32-10.200.8.35:22-10.200.16.10:38678.service: Deactivated successfully.
May 15 12:30:37.801941 systemd[1]: session-35.scope: Deactivated successfully.
May 15 12:30:37.802729 systemd-logind[1702]: Session 35 logged out. Waiting for processes to exit.
May 15 12:30:37.803825 systemd-logind[1702]: Removed session 35.
May 15 12:30:41.971794 containerd[1734]: time="2025-05-15T12:30:41.971687336Z" level=info msg="StopPodSandbox for \"93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525\""
May 15 12:30:42.020656 containerd[1734]: 2025-05-15 12:30:41.999 [WARNING][7374] cni-plugin/k8s.go 566: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--nk2q9-eth0"
May 15 12:30:42.020656 containerd[1734]: 2025-05-15 12:30:41.999 [INFO][7374] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525"
May 15 12:30:42.020656 containerd[1734]: 2025-05-15 12:30:41.999 [INFO][7374] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" iface="eth0" netns=""
May 15 12:30:42.020656 containerd[1734]: 2025-05-15 12:30:41.999 [INFO][7374] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525"
May 15 12:30:42.020656 containerd[1734]: 2025-05-15 12:30:41.999 [INFO][7374] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525"
May 15 12:30:42.020656 containerd[1734]: 2025-05-15 12:30:42.015 [INFO][7382] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" HandleID="k8s-pod-network.93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" Workload="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--nk2q9-eth0"
May 15 12:30:42.020656 containerd[1734]: 2025-05-15 12:30:42.015 [INFO][7382] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
May 15 12:30:42.020656 containerd[1734]: 2025-05-15 12:30:42.015 [INFO][7382] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
May 15 12:30:42.020656 containerd[1734]: 2025-05-15 12:30:42.018 [WARNING][7382] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" HandleID="k8s-pod-network.93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" Workload="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--nk2q9-eth0"
May 15 12:30:42.020656 containerd[1734]: 2025-05-15 12:30:42.018 [INFO][7382] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" HandleID="k8s-pod-network.93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" Workload="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--nk2q9-eth0"
May 15 12:30:42.020656 containerd[1734]: 2025-05-15 12:30:42.019 [INFO][7382] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
May 15 12:30:42.020656 containerd[1734]: 2025-05-15 12:30:42.020 [INFO][7374] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525"
May 15 12:30:42.021087 containerd[1734]: time="2025-05-15T12:30:42.020685068Z" level=info msg="TearDown network for sandbox \"93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525\" successfully"
May 15 12:30:42.021087 containerd[1734]: time="2025-05-15T12:30:42.020704675Z" level=info msg="StopPodSandbox for \"93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525\" returns successfully"
May 15 12:30:42.021087 containerd[1734]: time="2025-05-15T12:30:42.020961128Z" level=info msg="RemovePodSandbox for \"93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525\""
May 15 12:30:42.021087 containerd[1734]: time="2025-05-15T12:30:42.020983721Z" level=info msg="Forcibly stopping sandbox \"93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525\""
May 15 12:30:42.062691 containerd[1734]: 2025-05-15 12:30:42.043 [WARNING][7400] cni-plugin/k8s.go 566: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" WorkloadEndpoint="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--nk2q9-eth0"
May 15 12:30:42.062691 containerd[1734]: 2025-05-15 12:30:42.043 [INFO][7400] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525"
May 15 12:30:42.062691 containerd[1734]: 2025-05-15 12:30:42.043 [INFO][7400] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" iface="eth0" netns=""
May 15 12:30:42.062691 containerd[1734]: 2025-05-15 12:30:42.043 [INFO][7400] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525"
May 15 12:30:42.062691 containerd[1734]: 2025-05-15 12:30:42.043 [INFO][7400] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525"
May 15 12:30:42.062691 containerd[1734]: 2025-05-15 12:30:42.056 [INFO][7407] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" HandleID="k8s-pod-network.93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" Workload="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--nk2q9-eth0"
May 15 12:30:42.062691 containerd[1734]: 2025-05-15 12:30:42.056 [INFO][7407] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
May 15 12:30:42.062691 containerd[1734]: 2025-05-15 12:30:42.056 [INFO][7407] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
May 15 12:30:42.062691 containerd[1734]: 2025-05-15 12:30:42.060 [WARNING][7407] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" HandleID="k8s-pod-network.93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" Workload="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--nk2q9-eth0"
May 15 12:30:42.062691 containerd[1734]: 2025-05-15 12:30:42.060 [INFO][7407] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" HandleID="k8s-pod-network.93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525" Workload="ci--4334.0.0--a--9b1bbdffc7-k8s-calico--apiserver--b7f8fc4d6--nk2q9-eth0"
May 15 12:30:42.062691 containerd[1734]: 2025-05-15 12:30:42.061 [INFO][7407] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
May 15 12:30:42.062691 containerd[1734]: 2025-05-15 12:30:42.062 [INFO][7400] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525"
May 15 12:30:42.063119 containerd[1734]: time="2025-05-15T12:30:42.062774826Z" level=info msg="TearDown network for sandbox \"93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525\" successfully"
May 15 12:30:42.064033 containerd[1734]: time="2025-05-15T12:30:42.064009828Z" level=info msg="Ensure that sandbox 93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525 in task-service has been cleanup successfully"
May 15 12:30:42.120888 containerd[1734]: time="2025-05-15T12:30:42.120864043Z" level=info msg="RemovePodSandbox \"93ee5f82526b1209c5bcb721eb397fb1e6fcef85f966471ecb30c35c0211f525\" returns successfully"
May 15 12:30:42.918923 systemd[1]: Started sshd@33-10.200.8.35:22-10.200.16.10:45298.service - OpenSSH per-connection server daemon (10.200.16.10:45298).
May 15 12:30:43.552308 sshd[7414]: Accepted publickey for core from 10.200.16.10 port 45298 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:30:43.553259 sshd-session[7414]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:30:43.557112 systemd-logind[1702]: New session 36 of user core.
May 15 12:30:43.560292 systemd[1]: Started session-36.scope - Session 36 of User core.
May 15 12:30:44.059137 sshd[7416]: Connection closed by 10.200.16.10 port 45298
May 15 12:30:44.059519 sshd-session[7414]: pam_unix(sshd:session): session closed for user core
May 15 12:30:44.061954 systemd[1]: sshd@33-10.200.8.35:22-10.200.16.10:45298.service: Deactivated successfully.
May 15 12:30:44.063592 systemd[1]: session-36.scope: Deactivated successfully.
May 15 12:30:44.064248 systemd-logind[1702]: Session 36 logged out. Waiting for processes to exit.
May 15 12:30:44.065451 systemd-logind[1702]: Removed session 36.
May 15 12:30:49.181168 systemd[1]: Started sshd@34-10.200.8.35:22-10.200.16.10:48424.service - OpenSSH per-connection server daemon (10.200.16.10:48424).
May 15 12:30:49.813266 sshd[7428]: Accepted publickey for core from 10.200.16.10 port 48424 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:30:49.814222 sshd-session[7428]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:30:49.818151 systemd-logind[1702]: New session 37 of user core.
May 15 12:30:49.822345 systemd[1]: Started session-37.scope - Session 37 of User core.
May 15 12:30:50.317954 sshd[7430]: Connection closed by 10.200.16.10 port 48424
May 15 12:30:50.318345 sshd-session[7428]: pam_unix(sshd:session): session closed for user core
May 15 12:30:50.320841 systemd[1]: sshd@34-10.200.8.35:22-10.200.16.10:48424.service: Deactivated successfully.
May 15 12:30:50.322334 systemd[1]: session-37.scope: Deactivated successfully.
May 15 12:30:50.323020 systemd-logind[1702]: Session 37 logged out. Waiting for processes to exit.
May 15 12:30:50.324138 systemd-logind[1702]: Removed session 37.
May 15 12:30:57.588037 systemd[1]: Started sshd@35-10.200.8.35:22-10.200.16.10:48434.service - OpenSSH per-connection server daemon (10.200.16.10:48434).
May 15 12:30:58.233609 sshd[7444]: Accepted publickey for core from 10.200.16.10 port 48434 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:30:58.234604 sshd-session[7444]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:30:58.238226 systemd-logind[1702]: New session 38 of user core.
May 15 12:30:58.242314 systemd[1]: Started session-38.scope - Session 38 of User core.
May 15 12:30:58.721278 sshd[7446]: Connection closed by 10.200.16.10 port 48434
May 15 12:30:58.721674 sshd-session[7444]: pam_unix(sshd:session): session closed for user core
May 15 12:30:58.724232 systemd[1]: sshd@35-10.200.8.35:22-10.200.16.10:48434.service: Deactivated successfully.
May 15 12:30:58.725736 systemd[1]: session-38.scope: Deactivated successfully.
May 15 12:30:58.726436 systemd-logind[1702]: Session 38 logged out. Waiting for processes to exit.
May 15 12:30:58.727524 systemd-logind[1702]: Removed session 38.
May 15 12:31:00.473955 containerd[1734]: time="2025-05-15T12:31:00.473914435Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6f3538d4d98068af62053dba94b18585d28518dccba039cd134b61c00ee8af7d\" id:\"92674bcd754a7ee3e49051f7cede8aacc8f89487a5127fcdffac73808cce5245\" pid:7469 exited_at:{seconds:1747312260 nanos:473559832}"
May 15 12:31:03.834166 systemd[1]: Started sshd@36-10.200.8.35:22-10.200.16.10:58704.service - OpenSSH per-connection server daemon (10.200.16.10:58704).
May 15 12:31:04.468439 sshd[7486]: Accepted publickey for core from 10.200.16.10 port 58704 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:31:04.469433 sshd-session[7486]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:31:04.473347 systemd-logind[1702]: New session 39 of user core.
May 15 12:31:04.477311 systemd[1]: Started session-39.scope - Session 39 of User core.
May 15 12:31:04.959266 sshd[7488]: Connection closed by 10.200.16.10 port 58704
May 15 12:31:04.959653 sshd-session[7486]: pam_unix(sshd:session): session closed for user core
May 15 12:31:04.961583 systemd[1]: sshd@36-10.200.8.35:22-10.200.16.10:58704.service: Deactivated successfully.
May 15 12:31:04.964056 systemd[1]: session-39.scope: Deactivated successfully.
May 15 12:31:04.965963 systemd-logind[1702]: Session 39 logged out. Waiting for processes to exit.
May 15 12:31:04.967033 systemd-logind[1702]: Removed session 39.
May 15 12:31:07.327848 containerd[1734]: time="2025-05-15T12:31:07.327807829Z" level=info msg="TaskExit event in podsandbox handler container_id:\"14a3f8f65399a7a5f1b36c32c1688a00f655e302de44be23c8d9c837423e5583\" id:\"94fb0faf0f762e660ecb051a3417e01a7d8473ef3ae87f1531bbf423987a0601\" pid:7524 exited_at:{seconds:1747312267 nanos:327379594}"
May 15 12:31:07.328242 containerd[1734]: time="2025-05-15T12:31:07.327960763Z" level=info msg="TaskExit event in podsandbox handler container_id:\"14a3f8f65399a7a5f1b36c32c1688a00f655e302de44be23c8d9c837423e5583\" id:\"b338476750ae3f79e21d81ba047c65621a4ee50ebd02de2680c743a5f76fd188\" pid:7523 exited_at:{seconds:1747312267 nanos:327801886}"
May 15 12:31:10.077135 systemd[1]: Started sshd@37-10.200.8.35:22-10.200.16.10:58880.service - OpenSSH per-connection server daemon (10.200.16.10:58880).
May 15 12:31:10.815883 sshd[7542]: Accepted publickey for core from 10.200.16.10 port 58880 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:31:10.817038 sshd-session[7542]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:31:10.820986 systemd-logind[1702]: New session 40 of user core.
May 15 12:31:10.827316 systemd[1]: Started session-40.scope - Session 40 of User core.
May 15 12:31:11.302999 sshd[7544]: Connection closed by 10.200.16.10 port 58880
May 15 12:31:11.303436 sshd-session[7542]: pam_unix(sshd:session): session closed for user core
May 15 12:31:11.306073 systemd[1]: sshd@37-10.200.8.35:22-10.200.16.10:58880.service: Deactivated successfully.
May 15 12:31:11.307825 systemd[1]: session-40.scope: Deactivated successfully.
May 15 12:31:11.308550 systemd-logind[1702]: Session 40 logged out. Waiting for processes to exit.
May 15 12:31:11.309920 systemd-logind[1702]: Removed session 40.
May 15 12:31:16.421215 systemd[1]: Started sshd@38-10.200.8.35:22-10.200.16.10:58882.service - OpenSSH per-connection server daemon (10.200.16.10:58882).
May 15 12:31:17.099845 sshd[7558]: Accepted publickey for core from 10.200.16.10 port 58882 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:31:17.100821 sshd-session[7558]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:31:17.104226 systemd-logind[1702]: New session 41 of user core.
May 15 12:31:17.109307 systemd[1]: Started session-41.scope - Session 41 of User core.
May 15 12:31:17.599671 sshd[7560]: Connection closed by 10.200.16.10 port 58882
May 15 12:31:17.600068 sshd-session[7558]: pam_unix(sshd:session): session closed for user core
May 15 12:31:17.602675 systemd[1]: sshd@38-10.200.8.35:22-10.200.16.10:58882.service: Deactivated successfully.
May 15 12:31:17.604304 systemd[1]: session-41.scope: Deactivated successfully.
May 15 12:31:17.604923 systemd-logind[1702]: Session 41 logged out. Waiting for processes to exit.
May 15 12:31:17.606066 systemd-logind[1702]: Removed session 41.
May 15 12:31:22.714959 systemd[1]: Started sshd@39-10.200.8.35:22-10.200.16.10:45154.service - OpenSSH per-connection server daemon (10.200.16.10:45154).
May 15 12:31:23.353931 sshd[7578]: Accepted publickey for core from 10.200.16.10 port 45154 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:31:23.354932 sshd-session[7578]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:31:23.358975 systemd-logind[1702]: New session 42 of user core.
May 15 12:31:23.363333 systemd[1]: Started session-42.scope - Session 42 of User core.
May 15 12:31:23.842220 sshd[7582]: Connection closed by 10.200.16.10 port 45154
May 15 12:31:23.842627 sshd-session[7578]: pam_unix(sshd:session): session closed for user core
May 15 12:31:23.845214 systemd[1]: sshd@39-10.200.8.35:22-10.200.16.10:45154.service: Deactivated successfully.
May 15 12:31:23.846738 systemd[1]: session-42.scope: Deactivated successfully.
May 15 12:31:23.847409 systemd-logind[1702]: Session 42 logged out. Waiting for processes to exit.
May 15 12:31:23.848454 systemd-logind[1702]: Removed session 42.
May 15 12:31:28.960783 systemd[1]: Started sshd@40-10.200.8.35:22-10.200.16.10:45330.service - OpenSSH per-connection server daemon (10.200.16.10:45330).
May 15 12:31:29.594483 sshd[7594]: Accepted publickey for core from 10.200.16.10 port 45330 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:31:29.595471 sshd-session[7594]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:31:29.599683 systemd-logind[1702]: New session 43 of user core.
May 15 12:31:29.606305 systemd[1]: Started session-43.scope - Session 43 of User core.
May 15 12:31:30.083737 sshd[7596]: Connection closed by 10.200.16.10 port 45330
May 15 12:31:30.084123 sshd-session[7594]: pam_unix(sshd:session): session closed for user core
May 15 12:31:30.086719 systemd[1]: sshd@40-10.200.8.35:22-10.200.16.10:45330.service: Deactivated successfully.
May 15 12:31:30.088393 systemd[1]: session-43.scope: Deactivated successfully.
May 15 12:31:30.089047 systemd-logind[1702]: Session 43 logged out. Waiting for processes to exit.
May 15 12:31:30.090415 systemd-logind[1702]: Removed session 43.
May 15 12:31:30.467595 containerd[1734]: time="2025-05-15T12:31:30.467549305Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6f3538d4d98068af62053dba94b18585d28518dccba039cd134b61c00ee8af7d\" id:\"4ac2f0036e283bd8eb315966e9e320a0666a2d885916afbc1990a903df95766d\" pid:7620 exited_at:{seconds:1747312290 nanos:467303213}"
May 15 12:31:31.748369 containerd[1734]: time="2025-05-15T12:31:31.748287903Z" level=warning msg="container event discarded" container=fe3b5393091ee576d32f3c456ad0b0f8c3351f3f0b276f6e0a88c687423cb30a type=CONTAINER_CREATED_EVENT
May 15 12:31:31.759517 containerd[1734]: time="2025-05-15T12:31:31.759469203Z" level=warning msg="container event discarded" container=fe3b5393091ee576d32f3c456ad0b0f8c3351f3f0b276f6e0a88c687423cb30a type=CONTAINER_STARTED_EVENT
May 15 12:31:31.792696 containerd[1734]: time="2025-05-15T12:31:31.792668292Z" level=warning msg="container event discarded" container=0bd366c51f612378b249eb726925139573e5e1aa4c4ffcfa7fd42e7367a14d71 type=CONTAINER_CREATED_EVENT
May 15 12:31:31.792696 containerd[1734]: time="2025-05-15T12:31:31.792693454Z" level=warning msg="container event discarded" container=0bd366c51f612378b249eb726925139573e5e1aa4c4ffcfa7fd42e7367a14d71 type=CONTAINER_STARTED_EVENT
May 15 12:31:31.887920 containerd[1734]: time="2025-05-15T12:31:31.887885616Z" level=warning msg="container event discarded" container=c0e8f7b363b0759157c313e0bcdb91d417ea700e4ad91a8928099692154e9531 type=CONTAINER_CREATED_EVENT
May 15 12:31:31.887920 containerd[1734]: time="2025-05-15T12:31:31.887917514Z" level=warning msg="container event discarded" container=c0e8f7b363b0759157c313e0bcdb91d417ea700e4ad91a8928099692154e9531 type=CONTAINER_STARTED_EVENT
May 15 12:31:32.584472 containerd[1734]: time="2025-05-15T12:31:32.584408867Z" level=warning msg="container event discarded" container=89840f2815328e0f0000d050449d9d28b646ed9f17378e82b68b085852060e07 type=CONTAINER_CREATED_EVENT
May 15 12:31:32.695722 containerd[1734]: time="2025-05-15T12:31:32.695677794Z" level=warning msg="container event discarded" container=89840f2815328e0f0000d050449d9d28b646ed9f17378e82b68b085852060e07 type=CONTAINER_STARTED_EVENT
May 15 12:31:32.740922 containerd[1734]: time="2025-05-15T12:31:32.740875394Z" level=warning msg="container event discarded" container=6fb03d3ad558c05de6e7df7af2b7343d3f1b4b24f9c6fadcb2f3c42dc4196d8c type=CONTAINER_CREATED_EVENT
May 15 12:31:32.740922 containerd[1734]: time="2025-05-15T12:31:32.740912220Z" level=warning msg="container event discarded" container=a8824afbdeff9232c5f692445a308d6293067ca4db723734293ae0441cdd3edc type=CONTAINER_CREATED_EVENT
May 15 12:31:32.871644 containerd[1734]: time="2025-05-15T12:31:32.839134694Z" level=warning msg="container event discarded" container=6fb03d3ad558c05de6e7df7af2b7343d3f1b4b24f9c6fadcb2f3c42dc4196d8c type=CONTAINER_STARTED_EVENT
May 15 12:31:32.871644 containerd[1734]: time="2025-05-15T12:31:32.862286170Z" level=warning msg="container event discarded" container=a8824afbdeff9232c5f692445a308d6293067ca4db723734293ae0441cdd3edc type=CONTAINER_STARTED_EVENT
May 15 12:31:35.200207 systemd[1]: Started sshd@41-10.200.8.35:22-10.200.16.10:45344.service - OpenSSH per-connection server daemon (10.200.16.10:45344).
May 15 12:31:35.837946 sshd[7633]: Accepted publickey for core from 10.200.16.10 port 45344 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:31:35.839056 sshd-session[7633]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:31:35.843028 systemd-logind[1702]: New session 44 of user core.
May 15 12:31:35.846342 systemd[1]: Started session-44.scope - Session 44 of User core.
May 15 12:31:36.327930 sshd[7635]: Connection closed by 10.200.16.10 port 45344
May 15 12:31:36.328347 sshd-session[7633]: pam_unix(sshd:session): session closed for user core
May 15 12:31:36.330342 systemd[1]: sshd@41-10.200.8.35:22-10.200.16.10:45344.service: Deactivated successfully.
May 15 12:31:36.332034 systemd[1]: session-44.scope: Deactivated successfully.
May 15 12:31:36.333319 systemd-logind[1702]: Session 44 logged out. Waiting for processes to exit.
May 15 12:31:36.334310 systemd-logind[1702]: Removed session 44.
May 15 12:31:37.318442 containerd[1734]: time="2025-05-15T12:31:37.318347021Z" level=info msg="TaskExit event in podsandbox handler container_id:\"14a3f8f65399a7a5f1b36c32c1688a00f655e302de44be23c8d9c837423e5583\" id:\"c2487f82b5d1083d79dca2be16b11b18dd9d5b26510b1034ddbf0864ddba2ac6\" pid:7659 exited_at:{seconds:1747312297 nanos:318108754}"
May 15 12:31:41.439022 systemd[1]: Started sshd@42-10.200.8.35:22-10.200.16.10:54510.service - OpenSSH per-connection server daemon (10.200.16.10:54510).
May 15 12:31:42.072983 sshd[7672]: Accepted publickey for core from 10.200.16.10 port 54510 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:31:42.073934 sshd-session[7672]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:31:42.077922 systemd-logind[1702]: New session 45 of user core.
May 15 12:31:42.084309 systemd[1]: Started session-45.scope - Session 45 of User core.
May 15 12:31:42.579127 sshd[7674]: Connection closed by 10.200.16.10 port 54510
May 15 12:31:42.579565 sshd-session[7672]: pam_unix(sshd:session): session closed for user core
May 15 12:31:42.582275 systemd[1]: sshd@42-10.200.8.35:22-10.200.16.10:54510.service: Deactivated successfully.
May 15 12:31:42.583986 systemd[1]: session-45.scope: Deactivated successfully.
May 15 12:31:42.584670 systemd-logind[1702]: Session 45 logged out. Waiting for processes to exit.
May 15 12:31:42.585926 systemd-logind[1702]: Removed session 45.
May 15 12:31:47.704073 systemd[1]: Started sshd@43-10.200.8.35:22-10.200.16.10:54516.service - OpenSSH per-connection server daemon (10.200.16.10:54516).
May 15 12:31:48.338555 sshd[7686]: Accepted publickey for core from 10.200.16.10 port 54516 ssh2: RSA SHA256:ZDl06a3Sf9XP4eO+idWT+NJh8pNyolZk1CfkC8ApwGk
May 15 12:31:48.339577 sshd-session[7686]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:31:48.343617 systemd-logind[1702]: New session 46 of user core.
May 15 12:31:48.347319 systemd[1]: Started session-46.scope - Session 46 of User core.
May 15 12:31:48.845986 sshd[7688]: Connection closed by 10.200.16.10 port 54516
May 15 12:31:48.846421 sshd-session[7686]: pam_unix(sshd:session): session closed for user core
May 15 12:31:48.849229 systemd[1]: sshd@43-10.200.8.35:22-10.200.16.10:54516.service: Deactivated successfully.
May 15 12:31:48.850781 systemd[1]: session-46.scope: Deactivated successfully.
May 15 12:31:48.851687 systemd-logind[1702]: Session 46 logged out. Waiting for processes to exit.
May 15 12:31:48.852746 systemd-logind[1702]: Removed session 46.