Jul 15 05:17:27.994580 kernel: Linux version 6.12.36-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Tue Jul 15 03:28:48 -00 2025 Jul 15 05:17:27.994604 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlyprintk=ttyS0,115200 flatcar.first_boot=detected flatcar.oem.id=azure flatcar.autologin verity.usrhash=926b029026d98240a9e8b6527b65fc026ae523bea87c3b77ffd7237bcc7be4fb Jul 15 05:17:27.994613 kernel: BIOS-provided physical RAM map: Jul 15 05:17:27.994619 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable Jul 15 05:17:27.994624 kernel: BIOS-e820: [mem 0x00000000000c0000-0x00000000000fffff] reserved Jul 15 05:17:27.994630 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000044fdfff] usable Jul 15 05:17:27.994637 kernel: BIOS-e820: [mem 0x00000000044fe000-0x00000000048fdfff] reserved Jul 15 05:17:27.994644 kernel: BIOS-e820: [mem 0x00000000048fe000-0x000000003ff1efff] usable Jul 15 05:17:27.994650 kernel: BIOS-e820: [mem 0x000000003ff1f000-0x000000003ffc8fff] reserved Jul 15 05:17:27.994656 kernel: BIOS-e820: [mem 0x000000003ffc9000-0x000000003fffafff] ACPI data Jul 15 05:17:27.994662 kernel: BIOS-e820: [mem 0x000000003fffb000-0x000000003fffefff] ACPI NVS Jul 15 05:17:27.994667 kernel: BIOS-e820: [mem 0x000000003ffff000-0x000000003fffffff] usable Jul 15 05:17:27.994673 kernel: BIOS-e820: [mem 0x0000000100000000-0x00000002bfffffff] usable Jul 15 05:17:27.994679 kernel: printk: legacy bootconsole [earlyser0] enabled Jul 15 05:17:27.994687 kernel: NX (Execute Disable) protection: active Jul 15 05:17:27.994694 kernel: APIC: Static calls initialized Jul 15 05:17:27.994700 kernel: efi: EFI v2.7 by Microsoft Jul 15 05:17:27.994706 kernel: efi: ACPI=0x3fffa000 
ACPI 2.0=0x3fffa014 SMBIOS=0x3ff88000 SMBIOS 3.0=0x3ff86000 MEMATTR=0x3e9da718 RNG=0x3ffd2018 Jul 15 05:17:27.994712 kernel: random: crng init done Jul 15 05:17:27.994719 kernel: secureboot: Secure boot disabled Jul 15 05:17:27.994724 kernel: SMBIOS 3.1.0 present. Jul 15 05:17:27.994730 kernel: DMI: Microsoft Corporation Virtual Machine/Virtual Machine, BIOS Hyper-V UEFI Release v4.1 01/28/2025 Jul 15 05:17:27.994736 kernel: DMI: Memory slots populated: 2/2 Jul 15 05:17:27.994744 kernel: Hypervisor detected: Microsoft Hyper-V Jul 15 05:17:27.994749 kernel: Hyper-V: privilege flags low 0xae7f, high 0x3b8030, hints 0x9e4e24, misc 0xe0bed7b2 Jul 15 05:17:27.994756 kernel: Hyper-V: Nested features: 0x3e0101 Jul 15 05:17:27.994762 kernel: Hyper-V: LAPIC Timer Frequency: 0x30d40 Jul 15 05:17:27.994768 kernel: Hyper-V: Using hypercall for remote TLB flush Jul 15 05:17:27.994774 kernel: clocksource: hyperv_clocksource_tsc_page: mask: 0xffffffffffffffff max_cycles: 0x24e6a1710, max_idle_ns: 440795202120 ns Jul 15 05:17:27.994780 kernel: clocksource: hyperv_clocksource_msr: mask: 0xffffffffffffffff max_cycles: 0x24e6a1710, max_idle_ns: 440795202120 ns Jul 15 05:17:27.994786 kernel: tsc: Detected 2299.998 MHz processor Jul 15 05:17:27.994793 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Jul 15 05:17:27.994800 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Jul 15 05:17:27.994808 kernel: last_pfn = 0x2c0000 max_arch_pfn = 0x10000000000 Jul 15 05:17:27.994814 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs Jul 15 05:17:27.994821 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Jul 15 05:17:27.994827 kernel: e820: update [mem 0x48000000-0xffffffff] usable ==> reserved Jul 15 05:17:27.994833 kernel: last_pfn = 0x40000 max_arch_pfn = 0x10000000000 Jul 15 05:17:27.994840 kernel: Using GB pages for direct mapping Jul 15 05:17:27.994846 kernel: ACPI: Early table checksum verification disabled Jul 15 
05:17:27.994855 kernel: ACPI: RSDP 0x000000003FFFA014 000024 (v02 VRTUAL) Jul 15 05:17:27.994864 kernel: ACPI: XSDT 0x000000003FFF90E8 00005C (v01 VRTUAL MICROSFT 00000001 MSFT 00000001) Jul 15 05:17:27.994870 kernel: ACPI: FACP 0x000000003FFF8000 000114 (v06 VRTUAL MICROSFT 00000001 MSFT 00000001) Jul 15 05:17:27.994877 kernel: ACPI: DSDT 0x000000003FFD6000 01E27A (v02 MSFTVM DSDT01 00000001 INTL 20230628) Jul 15 05:17:27.994884 kernel: ACPI: FACS 0x000000003FFFE000 000040 Jul 15 05:17:27.994890 kernel: ACPI: OEM0 0x000000003FFF7000 000064 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001) Jul 15 05:17:27.994897 kernel: ACPI: SPCR 0x000000003FFF6000 000050 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001) Jul 15 05:17:27.994905 kernel: ACPI: WAET 0x000000003FFF5000 000028 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001) Jul 15 05:17:27.994912 kernel: ACPI: APIC 0x000000003FFD5000 000052 (v05 HVLITE HVLITETB 00000000 MSHV 00000000) Jul 15 05:17:27.994918 kernel: ACPI: SRAT 0x000000003FFD4000 0000A0 (v03 HVLITE HVLITETB 00000000 MSHV 00000000) Jul 15 05:17:27.994925 kernel: ACPI: BGRT 0x000000003FFD3000 000038 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001) Jul 15 05:17:27.994932 kernel: ACPI: Reserving FACP table memory at [mem 0x3fff8000-0x3fff8113] Jul 15 05:17:27.994939 kernel: ACPI: Reserving DSDT table memory at [mem 0x3ffd6000-0x3fff4279] Jul 15 05:17:27.994945 kernel: ACPI: Reserving FACS table memory at [mem 0x3fffe000-0x3fffe03f] Jul 15 05:17:27.994952 kernel: ACPI: Reserving OEM0 table memory at [mem 0x3fff7000-0x3fff7063] Jul 15 05:17:27.994958 kernel: ACPI: Reserving SPCR table memory at [mem 0x3fff6000-0x3fff604f] Jul 15 05:17:27.994966 kernel: ACPI: Reserving WAET table memory at [mem 0x3fff5000-0x3fff5027] Jul 15 05:17:27.994973 kernel: ACPI: Reserving APIC table memory at [mem 0x3ffd5000-0x3ffd5051] Jul 15 05:17:27.994979 kernel: ACPI: Reserving SRAT table memory at [mem 0x3ffd4000-0x3ffd409f] Jul 15 05:17:27.994986 kernel: ACPI: Reserving BGRT table memory at [mem 
0x3ffd3000-0x3ffd3037] Jul 15 05:17:27.994993 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x3fffffff] Jul 15 05:17:27.994999 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x2bfffffff] Jul 15 05:17:27.995006 kernel: NUMA: Node 0 [mem 0x00001000-0x3fffffff] + [mem 0x100000000-0x2bfffffff] -> [mem 0x00001000-0x2bfffffff] Jul 15 05:17:27.995013 kernel: NODE_DATA(0) allocated [mem 0x2bfff8dc0-0x2bfffffff] Jul 15 05:17:27.995020 kernel: Zone ranges: Jul 15 05:17:27.995028 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Jul 15 05:17:27.995034 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff] Jul 15 05:17:27.995041 kernel: Normal [mem 0x0000000100000000-0x00000002bfffffff] Jul 15 05:17:27.995047 kernel: Device empty Jul 15 05:17:27.995054 kernel: Movable zone start for each node Jul 15 05:17:27.995060 kernel: Early memory node ranges Jul 15 05:17:27.995067 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff] Jul 15 05:17:27.995073 kernel: node 0: [mem 0x0000000000100000-0x00000000044fdfff] Jul 15 05:17:27.995080 kernel: node 0: [mem 0x00000000048fe000-0x000000003ff1efff] Jul 15 05:17:27.995088 kernel: node 0: [mem 0x000000003ffff000-0x000000003fffffff] Jul 15 05:17:27.995094 kernel: node 0: [mem 0x0000000100000000-0x00000002bfffffff] Jul 15 05:17:27.995101 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x00000002bfffffff] Jul 15 05:17:27.995107 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Jul 15 05:17:27.995114 kernel: On node 0, zone DMA: 96 pages in unavailable ranges Jul 15 05:17:27.995120 kernel: On node 0, zone DMA32: 1024 pages in unavailable ranges Jul 15 05:17:27.995127 kernel: On node 0, zone DMA32: 224 pages in unavailable ranges Jul 15 05:17:27.995133 kernel: ACPI: PM-Timer IO Port: 0x408 Jul 15 05:17:27.995140 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 Jul 15 05:17:27.995148 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Jul 15 05:17:27.995155 kernel: ACPI: 
Using ACPI (MADT) for SMP configuration information Jul 15 05:17:27.995161 kernel: ACPI: SPCR: console: uart,io,0x3f8,115200 Jul 15 05:17:27.995168 kernel: TSC deadline timer available Jul 15 05:17:27.995175 kernel: CPU topo: Max. logical packages: 1 Jul 15 05:17:27.995181 kernel: CPU topo: Max. logical dies: 1 Jul 15 05:17:27.995187 kernel: CPU topo: Max. dies per package: 1 Jul 15 05:17:27.995194 kernel: CPU topo: Max. threads per core: 2 Jul 15 05:17:27.995200 kernel: CPU topo: Num. cores per package: 1 Jul 15 05:17:27.995208 kernel: CPU topo: Num. threads per package: 2 Jul 15 05:17:27.995215 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs Jul 15 05:17:27.995222 kernel: [mem 0x40000000-0xffffffff] available for PCI devices Jul 15 05:17:27.995228 kernel: Booting paravirtualized kernel on Hyper-V Jul 15 05:17:27.995235 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Jul 15 05:17:27.995242 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1 Jul 15 05:17:27.995249 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576 Jul 15 05:17:27.995255 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152 Jul 15 05:17:27.995262 kernel: pcpu-alloc: [0] 0 1 Jul 15 05:17:27.995270 kernel: Hyper-V: PV spinlocks enabled Jul 15 05:17:27.995276 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear) Jul 15 05:17:27.995284 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlyprintk=ttyS0,115200 flatcar.first_boot=detected flatcar.oem.id=azure flatcar.autologin verity.usrhash=926b029026d98240a9e8b6527b65fc026ae523bea87c3b77ffd7237bcc7be4fb Jul 15 05:17:27.995291 kernel: Unknown kernel command line parameters 
"BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Jul 15 05:17:27.995298 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear) Jul 15 05:17:27.995305 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Jul 15 05:17:27.995311 kernel: Fallback order for Node 0: 0 Jul 15 05:17:27.995318 kernel: Built 1 zonelists, mobility grouping on. Total pages: 2095807 Jul 15 05:17:27.995326 kernel: Policy zone: Normal Jul 15 05:17:27.995332 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Jul 15 05:17:27.995339 kernel: software IO TLB: area num 2. Jul 15 05:17:27.995346 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 Jul 15 05:17:27.995352 kernel: ftrace: allocating 40097 entries in 157 pages Jul 15 05:17:27.995358 kernel: ftrace: allocated 157 pages with 5 groups Jul 15 05:17:27.995365 kernel: Dynamic Preempt: voluntary Jul 15 05:17:27.995372 kernel: rcu: Preemptible hierarchical RCU implementation. Jul 15 05:17:27.995379 kernel: rcu: RCU event tracing is enabled. Jul 15 05:17:27.995393 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. Jul 15 05:17:27.995400 kernel: Trampoline variant of Tasks RCU enabled. Jul 15 05:17:27.995408 kernel: Rude variant of Tasks RCU enabled. Jul 15 05:17:27.995416 kernel: Tracing variant of Tasks RCU enabled. Jul 15 05:17:27.995423 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Jul 15 05:17:27.995443 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 Jul 15 05:17:27.995451 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Jul 15 05:17:27.995458 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Jul 15 05:17:27.995465 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. 
Jul 15 05:17:27.995473 kernel: Using NULL legacy PIC Jul 15 05:17:27.995482 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 0 Jul 15 05:17:27.995490 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Jul 15 05:17:27.995497 kernel: Console: colour dummy device 80x25 Jul 15 05:17:27.995504 kernel: printk: legacy console [tty1] enabled Jul 15 05:17:27.995511 kernel: printk: legacy console [ttyS0] enabled Jul 15 05:17:27.995518 kernel: printk: legacy bootconsole [earlyser0] disabled Jul 15 05:17:27.995525 kernel: ACPI: Core revision 20240827 Jul 15 05:17:27.995534 kernel: Failed to register legacy timer interrupt Jul 15 05:17:27.995541 kernel: APIC: Switch to symmetric I/O mode setup Jul 15 05:17:27.995548 kernel: x2apic enabled Jul 15 05:17:27.995555 kernel: APIC: Switched APIC routing to: physical x2apic Jul 15 05:17:27.995563 kernel: Hyper-V: Host Build 10.0.26100.1261-1-0 Jul 15 05:17:27.995570 kernel: Hyper-V: enabling crash_kexec_post_notifiers Jul 15 05:17:27.995577 kernel: Hyper-V: Disabling IBT because of Hyper-V bug Jul 15 05:17:27.995584 kernel: Hyper-V: Using IPI hypercalls Jul 15 05:17:27.995591 kernel: APIC: send_IPI() replaced with hv_send_ipi() Jul 15 05:17:27.995599 kernel: APIC: send_IPI_mask() replaced with hv_send_ipi_mask() Jul 15 05:17:27.995607 kernel: APIC: send_IPI_mask_allbutself() replaced with hv_send_ipi_mask_allbutself() Jul 15 05:17:27.995615 kernel: APIC: send_IPI_allbutself() replaced with hv_send_ipi_allbutself() Jul 15 05:17:27.995622 kernel: APIC: send_IPI_all() replaced with hv_send_ipi_all() Jul 15 05:17:27.995629 kernel: APIC: send_IPI_self() replaced with hv_send_ipi_self() Jul 15 05:17:27.995636 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x212733415c7, max_idle_ns: 440795236380 ns Jul 15 05:17:27.995644 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 
4599.99 BogoMIPS (lpj=2299998) Jul 15 05:17:27.995651 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated Jul 15 05:17:27.995659 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0 Jul 15 05:17:27.995666 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0 Jul 15 05:17:27.995673 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Jul 15 05:17:27.995680 kernel: Spectre V2 : Mitigation: Retpolines Jul 15 05:17:27.995687 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT Jul 15 05:17:27.995695 kernel: RETBleed: WARNING: Spectre v2 mitigation leaves CPU vulnerable to RETBleed attacks, data leaks possible! Jul 15 05:17:27.995702 kernel: RETBleed: Vulnerable Jul 15 05:17:27.995709 kernel: Speculative Store Bypass: Vulnerable Jul 15 05:17:27.995716 kernel: ITS: Mitigation: Aligned branch/return thunks Jul 15 05:17:27.995723 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Jul 15 05:17:27.995730 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Jul 15 05:17:27.995738 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Jul 15 05:17:27.995745 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask' Jul 15 05:17:27.995752 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256' Jul 15 05:17:27.995759 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256' Jul 15 05:17:27.995766 kernel: x86/fpu: Supporting XSAVE feature 0x800: 'Control-flow User registers' Jul 15 05:17:27.995773 kernel: x86/fpu: Supporting XSAVE feature 0x20000: 'AMX Tile config' Jul 15 05:17:27.995780 kernel: x86/fpu: Supporting XSAVE feature 0x40000: 'AMX Tile data' Jul 15 05:17:27.995787 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Jul 15 05:17:27.995794 kernel: x86/fpu: xstate_offset[5]: 832, xstate_sizes[5]: 64 Jul 15 05:17:27.995801 kernel: x86/fpu: xstate_offset[6]: 896, xstate_sizes[6]: 512 Jul 15 
05:17:27.995808 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 1024 Jul 15 05:17:27.995816 kernel: x86/fpu: xstate_offset[11]: 2432, xstate_sizes[11]: 16 Jul 15 05:17:27.995823 kernel: x86/fpu: xstate_offset[17]: 2496, xstate_sizes[17]: 64 Jul 15 05:17:27.995831 kernel: x86/fpu: xstate_offset[18]: 2560, xstate_sizes[18]: 8192 Jul 15 05:17:27.995838 kernel: x86/fpu: Enabled xstate features 0x608e7, context size is 10752 bytes, using 'compacted' format. Jul 15 05:17:27.995845 kernel: Freeing SMP alternatives memory: 32K Jul 15 05:17:27.995852 kernel: pid_max: default: 32768 minimum: 301 Jul 15 05:17:27.995858 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Jul 15 05:17:27.995865 kernel: landlock: Up and running. Jul 15 05:17:27.995872 kernel: SELinux: Initializing. Jul 15 05:17:27.995879 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear) Jul 15 05:17:27.995886 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear) Jul 15 05:17:27.995893 kernel: smpboot: CPU0: Intel INTEL(R) XEON(R) PLATINUM 8573C (family: 0x6, model: 0xcf, stepping: 0x2) Jul 15 05:17:27.995902 kernel: Performance Events: unsupported p6 CPU model 207 no PMU driver, software events only. Jul 15 05:17:27.995910 kernel: signal: max sigframe size: 11952 Jul 15 05:17:27.995917 kernel: rcu: Hierarchical SRCU implementation. Jul 15 05:17:27.995924 kernel: rcu: Max phase no-delay instances is 400. Jul 15 05:17:27.995931 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level Jul 15 05:17:27.995938 kernel: NMI watchdog: Perf NMI watchdog permanently disabled Jul 15 05:17:27.995945 kernel: smp: Bringing up secondary CPUs ... Jul 15 05:17:27.995952 kernel: smpboot: x86: Booting SMP configuration: Jul 15 05:17:27.995959 kernel: .... 
node #0, CPUs: #1 Jul 15 05:17:27.995968 kernel: smp: Brought up 1 node, 2 CPUs Jul 15 05:17:27.995975 kernel: smpboot: Total of 2 processors activated (9199.99 BogoMIPS) Jul 15 05:17:27.995982 kernel: Memory: 8077024K/8383228K available (14336K kernel code, 2430K rwdata, 9956K rodata, 54608K init, 2360K bss, 299988K reserved, 0K cma-reserved) Jul 15 05:17:27.995990 kernel: devtmpfs: initialized Jul 15 05:17:27.995997 kernel: x86/mm: Memory block size: 128MB Jul 15 05:17:27.996004 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x3fffb000-0x3fffefff] (16384 bytes) Jul 15 05:17:27.996011 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Jul 15 05:17:27.996018 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) Jul 15 05:17:27.996026 kernel: pinctrl core: initialized pinctrl subsystem Jul 15 05:17:27.996034 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Jul 15 05:17:27.996041 kernel: audit: initializing netlink subsys (disabled) Jul 15 05:17:27.996048 kernel: audit: type=2000 audit(1752556644.030:1): state=initialized audit_enabled=0 res=1 Jul 15 05:17:27.996055 kernel: thermal_sys: Registered thermal governor 'step_wise' Jul 15 05:17:27.996062 kernel: thermal_sys: Registered thermal governor 'user_space' Jul 15 05:17:27.996069 kernel: cpuidle: using governor menu Jul 15 05:17:27.996077 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Jul 15 05:17:27.996084 kernel: dca service started, version 1.12.1 Jul 15 05:17:27.996091 kernel: e820: reserve RAM buffer [mem 0x044fe000-0x07ffffff] Jul 15 05:17:27.996099 kernel: e820: reserve RAM buffer [mem 0x3ff1f000-0x3fffffff] Jul 15 05:17:27.996107 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
Jul 15 05:17:27.996114 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Jul 15 05:17:27.996121 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Jul 15 05:17:27.996128 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Jul 15 05:17:27.996135 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Jul 15 05:17:27.996143 kernel: ACPI: Added _OSI(Module Device) Jul 15 05:17:27.996149 kernel: ACPI: Added _OSI(Processor Device) Jul 15 05:17:27.996158 kernel: ACPI: Added _OSI(Processor Aggregator Device) Jul 15 05:17:27.996165 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Jul 15 05:17:27.996173 kernel: ACPI: Interpreter enabled Jul 15 05:17:27.996180 kernel: ACPI: PM: (supports S0 S5) Jul 15 05:17:27.996187 kernel: ACPI: Using IOAPIC for interrupt routing Jul 15 05:17:27.996194 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Jul 15 05:17:27.996201 kernel: PCI: Ignoring E820 reservations for host bridge windows Jul 15 05:17:27.996208 kernel: ACPI: Enabled 1 GPEs in block 00 to 0F Jul 15 05:17:27.996216 kernel: iommu: Default domain type: Translated Jul 15 05:17:27.996222 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Jul 15 05:17:27.996231 kernel: efivars: Registered efivars operations Jul 15 05:17:27.996238 kernel: PCI: Using ACPI for IRQ routing Jul 15 05:17:27.996245 kernel: PCI: System does not support PCI Jul 15 05:17:27.996252 kernel: vgaarb: loaded Jul 15 05:17:27.996259 kernel: clocksource: Switched to clocksource tsc-early Jul 15 05:17:27.996266 kernel: VFS: Disk quotas dquot_6.6.0 Jul 15 05:17:27.996273 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jul 15 05:17:27.996281 kernel: pnp: PnP ACPI init Jul 15 05:17:27.996288 kernel: pnp: PnP ACPI: found 3 devices Jul 15 05:17:27.996297 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Jul 15 
05:17:27.996304 kernel: NET: Registered PF_INET protocol family Jul 15 05:17:27.996311 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear) Jul 15 05:17:27.996319 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear) Jul 15 05:17:27.996326 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jul 15 05:17:27.996333 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear) Jul 15 05:17:27.996341 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear) Jul 15 05:17:27.996348 kernel: TCP: Hash tables configured (established 65536 bind 65536) Jul 15 05:17:27.996357 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear) Jul 15 05:17:27.996365 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear) Jul 15 05:17:27.996372 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jul 15 05:17:27.996379 kernel: NET: Registered PF_XDP protocol family Jul 15 05:17:27.996386 kernel: PCI: CLS 0 bytes, default 64 Jul 15 05:17:27.996393 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Jul 15 05:17:27.996401 kernel: software IO TLB: mapped [mem 0x000000003a9da000-0x000000003e9da000] (64MB) Jul 15 05:17:27.996408 kernel: RAPL PMU: API unit is 2^-32 Joules, 1 fixed counters, 10737418240 ms ovfl timer Jul 15 05:17:27.996415 kernel: RAPL PMU: hw unit of domain psys 2^-0 Joules Jul 15 05:17:27.996424 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x212733415c7, max_idle_ns: 440795236380 ns Jul 15 05:17:27.997750 kernel: clocksource: Switched to clocksource tsc Jul 15 05:17:27.997767 kernel: Initialise system trusted keyrings Jul 15 05:17:27.997778 kernel: workingset: timestamp_bits=39 max_order=21 bucket_order=0 Jul 15 05:17:27.997786 kernel: Key type asymmetric registered Jul 15 05:17:27.997793 kernel: Asymmetric key parser 'x509' registered Jul 15 05:17:27.997801 kernel: Block layer SCSI 
generic (bsg) driver version 0.4 loaded (major 250) Jul 15 05:17:27.997809 kernel: io scheduler mq-deadline registered Jul 15 05:17:27.997818 kernel: io scheduler kyber registered Jul 15 05:17:27.997832 kernel: io scheduler bfq registered Jul 15 05:17:27.997841 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Jul 15 05:17:27.997850 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jul 15 05:17:27.997859 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Jul 15 05:17:27.997867 kernel: 00:01: ttyS1 at I/O 0x2f8 (irq = 3, base_baud = 115200) is a 16550A Jul 15 05:17:27.997876 kernel: serial8250: ttyS2 at I/O 0x3e8 (irq = 4, base_baud = 115200) is a 16550A Jul 15 05:17:27.997885 kernel: i8042: PNP: No PS/2 controller found. Jul 15 05:17:27.998045 kernel: rtc_cmos 00:02: registered as rtc0 Jul 15 05:17:27.998133 kernel: rtc_cmos 00:02: setting system clock to 2025-07-15T05:17:27 UTC (1752556647) Jul 15 05:17:27.998211 kernel: rtc_cmos 00:02: alarms up to one month, 114 bytes nvram Jul 15 05:17:27.998220 kernel: intel_pstate: Intel P-state driver initializing Jul 15 05:17:27.998228 kernel: efifb: probing for efifb Jul 15 05:17:27.998236 kernel: efifb: framebuffer at 0x40000000, using 3072k, total 3072k Jul 15 05:17:27.998243 kernel: efifb: mode is 1024x768x32, linelength=4096, pages=1 Jul 15 05:17:27.998250 kernel: efifb: scrolling: redraw Jul 15 05:17:27.998257 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 Jul 15 05:17:27.998264 kernel: Console: switching to colour frame buffer device 128x48 Jul 15 05:17:27.998272 kernel: fb0: EFI VGA frame buffer device Jul 15 05:17:27.998283 kernel: pstore: Using crash dump compression: deflate Jul 15 05:17:27.998293 kernel: pstore: Registered efi_pstore as persistent store backend Jul 15 05:17:27.998301 kernel: NET: Registered PF_INET6 protocol family Jul 15 05:17:27.998308 kernel: Segment Routing with IPv6 Jul 15 05:17:27.998315 kernel: In-situ OAM (IOAM) with IPv6 Jul 15 
05:17:27.998323 kernel: NET: Registered PF_PACKET protocol family Jul 15 05:17:27.998330 kernel: Key type dns_resolver registered Jul 15 05:17:27.998336 kernel: IPI shorthand broadcast: enabled Jul 15 05:17:27.998346 kernel: sched_clock: Marking stable (2829153477, 92238649)->(3256667863, -335275737) Jul 15 05:17:27.998353 kernel: registered taskstats version 1 Jul 15 05:17:27.998360 kernel: Loading compiled-in X.509 certificates Jul 15 05:17:27.998368 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.36-flatcar: a24478b628e55368911ce1800a2bd6bc158938c7' Jul 15 05:17:27.998376 kernel: Demotion targets for Node 0: null Jul 15 05:17:27.998383 kernel: Key type .fscrypt registered Jul 15 05:17:27.998391 kernel: Key type fscrypt-provisioning registered Jul 15 05:17:27.998399 kernel: ima: No TPM chip found, activating TPM-bypass! Jul 15 05:17:27.998406 kernel: ima: Allocated hash algorithm: sha1 Jul 15 05:17:27.998416 kernel: ima: No architecture policies found Jul 15 05:17:27.998424 kernel: clk: Disabling unused clocks Jul 15 05:17:27.999166 kernel: Warning: unable to open an initial console. Jul 15 05:17:27.999177 kernel: Freeing unused kernel image (initmem) memory: 54608K Jul 15 05:17:27.999185 kernel: Write protecting the kernel read-only data: 24576k Jul 15 05:17:27.999193 kernel: Freeing unused kernel image (rodata/data gap) memory: 284K Jul 15 05:17:27.999200 kernel: Run /init as init process Jul 15 05:17:27.999208 kernel: with arguments: Jul 15 05:17:27.999215 kernel: /init Jul 15 05:17:27.999225 kernel: with environment: Jul 15 05:17:27.999232 kernel: HOME=/ Jul 15 05:17:27.999239 kernel: TERM=linux Jul 15 05:17:27.999247 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Jul 15 05:17:27.999261 systemd[1]: Successfully made /usr/ read-only. 
Jul 15 05:17:27.999271 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jul 15 05:17:27.999279 systemd[1]: Detected virtualization microsoft. Jul 15 05:17:27.999287 systemd[1]: Detected architecture x86-64. Jul 15 05:17:27.999293 systemd[1]: Running in initrd. Jul 15 05:17:27.999300 systemd[1]: No hostname configured, using default hostname. Jul 15 05:17:27.999309 systemd[1]: Hostname set to . Jul 15 05:17:27.999316 systemd[1]: Initializing machine ID from random generator. Jul 15 05:17:27.999324 systemd[1]: Queued start job for default target initrd.target. Jul 15 05:17:27.999331 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jul 15 05:17:27.999339 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jul 15 05:17:27.999355 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jul 15 05:17:27.999364 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jul 15 05:17:27.999371 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jul 15 05:17:27.999380 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jul 15 05:17:27.999389 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Jul 15 05:17:27.999397 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... 
Jul 15 05:17:27.999405 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jul 15 05:17:27.999416 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jul 15 05:17:27.999424 systemd[1]: Reached target paths.target - Path Units. Jul 15 05:17:27.999470 systemd[1]: Reached target slices.target - Slice Units. Jul 15 05:17:27.999478 systemd[1]: Reached target swap.target - Swaps. Jul 15 05:17:27.999485 systemd[1]: Reached target timers.target - Timer Units. Jul 15 05:17:27.999492 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jul 15 05:17:27.999500 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jul 15 05:17:27.999508 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jul 15 05:17:27.999515 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Jul 15 05:17:27.999524 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jul 15 05:17:27.999532 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jul 15 05:17:27.999540 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jul 15 05:17:27.999546 systemd[1]: Reached target sockets.target - Socket Units. Jul 15 05:17:27.999553 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jul 15 05:17:27.999561 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jul 15 05:17:27.999569 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jul 15 05:17:27.999576 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Jul 15 05:17:27.999585 systemd[1]: Starting systemd-fsck-usr.service... Jul 15 05:17:27.999593 systemd[1]: Starting systemd-journald.service - Journal Service... 
Jul 15 05:17:27.999601 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jul 15 05:17:27.999616 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jul 15 05:17:27.999625 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jul 15 05:17:27.999655 systemd-journald[205]: Collecting audit messages is disabled. Jul 15 05:17:27.999677 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jul 15 05:17:27.999686 systemd[1]: Finished systemd-fsck-usr.service. Jul 15 05:17:27.999695 systemd-journald[205]: Journal started Jul 15 05:17:27.999716 systemd-journald[205]: Runtime Journal (/run/log/journal/88bb7732b8374007b525fb77891f659b) is 8M, max 158.9M, 150.9M free. Jul 15 05:17:27.998301 systemd-modules-load[206]: Inserted module 'overlay' Jul 15 05:17:28.003444 systemd[1]: Started systemd-journald.service - Journal Service. Jul 15 05:17:28.005601 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jul 15 05:17:28.010942 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jul 15 05:17:28.017535 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jul 15 05:17:28.022957 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jul 15 05:17:28.033496 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jul 15 05:17:28.037791 systemd-modules-load[206]: Inserted module 'br_netfilter' Jul 15 05:17:28.038818 kernel: Bridge firewalling registered Jul 15 05:17:28.039599 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jul 15 05:17:28.039833 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. 
Jul 15 05:17:28.041081 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Jul 15 05:17:28.045000 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Jul 15 05:17:28.049083 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jul 15 05:17:28.052713 systemd-tmpfiles[221]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Jul 15 05:17:28.059683 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Jul 15 05:17:28.064909 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Jul 15 05:17:28.074714 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Jul 15 05:17:28.079511 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jul 15 05:17:28.085726 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Jul 15 05:17:28.089860 dracut-cmdline[241]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlyprintk=ttyS0,115200 flatcar.first_boot=detected flatcar.oem.id=azure flatcar.autologin verity.usrhash=926b029026d98240a9e8b6527b65fc026ae523bea87c3b77ffd7237bcc7be4fb
Jul 15 05:17:28.130307 systemd-resolved[248]: Positive Trust Anchors:
Jul 15 05:17:28.131016 systemd-resolved[248]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Jul 15 05:17:28.131506 systemd-resolved[248]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Jul 15 05:17:28.145367 systemd-resolved[248]: Defaulting to hostname 'linux'.
Jul 15 05:17:28.147872 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Jul 15 05:17:28.153667 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Jul 15 05:17:28.160443 kernel: SCSI subsystem initialized
Jul 15 05:17:28.167446 kernel: Loading iSCSI transport class v2.0-870.
Jul 15 05:17:28.175448 kernel: iscsi: registered transport (tcp)
Jul 15 05:17:28.190539 kernel: iscsi: registered transport (qla4xxx)
Jul 15 05:17:28.190580 kernel: QLogic iSCSI HBA Driver
Jul 15 05:17:28.202505 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Jul 15 05:17:28.216183 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Jul 15 05:17:28.220265 systemd[1]: Reached target network-pre.target - Preparation for Network.
Jul 15 05:17:28.247585 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Jul 15 05:17:28.248740 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Jul 15 05:17:28.287451 kernel: raid6: avx512x4 gen() 44422 MB/s
Jul 15 05:17:28.305438 kernel: raid6: avx512x2 gen() 44072 MB/s
Jul 15 05:17:28.322440 kernel: raid6: avx512x1 gen() 27667 MB/s
Jul 15 05:17:28.340439 kernel: raid6: avx2x4 gen() 42949 MB/s
Jul 15 05:17:28.357439 kernel: raid6: avx2x2 gen() 43057 MB/s
Jul 15 05:17:28.375146 kernel: raid6: avx2x1 gen() 30156 MB/s
Jul 15 05:17:28.375167 kernel: raid6: using algorithm avx512x4 gen() 44422 MB/s
Jul 15 05:17:28.394474 kernel: raid6: .... xor() 7187 MB/s, rmw enabled
Jul 15 05:17:28.394492 kernel: raid6: using avx512x2 recovery algorithm
Jul 15 05:17:28.410444 kernel: xor: automatically using best checksumming function avx
Jul 15 05:17:28.512451 kernel: Btrfs loaded, zoned=no, fsverity=no
Jul 15 05:17:28.516772 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Jul 15 05:17:28.519575 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jul 15 05:17:28.537525 systemd-udevd[454]: Using default interface naming scheme 'v255'.
Jul 15 05:17:28.540999 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jul 15 05:17:28.544710 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Jul 15 05:17:28.560725 dracut-pre-trigger[462]: rd.md=0: removing MD RAID activation
Jul 15 05:17:28.576283 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Jul 15 05:17:28.580375 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Jul 15 05:17:28.604851 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Jul 15 05:17:28.611732 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Jul 15 05:17:28.647468 kernel: cryptd: max_cpu_qlen set to 1000
Jul 15 05:17:28.661453 kernel: AES CTR mode by8 optimization enabled
Jul 15 05:17:28.668526 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jul 15 05:17:28.668761 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jul 15 05:17:28.675813 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Jul 15 05:17:28.687610 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jul 15 05:17:28.700528 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jul 15 05:17:28.700602 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jul 15 05:17:28.704624 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jul 15 05:17:28.712488 kernel: hv_vmbus: Vmbus version:5.3
Jul 15 05:17:28.718432 kernel: pps_core: LinuxPPS API ver. 1 registered
Jul 15 05:17:28.718461 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti
Jul 15 05:17:28.723449 kernel: hv_vmbus: registering driver hyperv_keyboard
Jul 15 05:17:28.732060 kernel: PTP clock support registered
Jul 15 05:17:28.738462 kernel: hv_vmbus: registering driver hv_netvsc
Jul 15 05:17:28.745074 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Jul 15 05:17:28.720268 kernel: hv_utils: Registering HyperV Utility Driver
Jul 15 05:17:28.723398 kernel: hv_vmbus: registering driver hv_utils
Jul 15 05:17:28.723414 kernel: hv_utils: Shutdown IC version 3.2
Jul 15 05:17:28.723422 kernel: hv_utils: Heartbeat IC version 3.0
Jul 15 05:17:28.723430 kernel: hv_utils: TimeSync IC version 4.0
Jul 15 05:17:28.723438 kernel: input: AT Translated Set 2 keyboard as /devices/LNXSYSTM:00/LNXSYBUS:00/ACPI0004:00/MSFT1000:00/d34b2567-b9b6-42b9-8778-0a4ec0b955bf/serio0/input/input0
Jul 15 05:17:28.723448 kernel: hv_vmbus: registering driver hv_pci
Jul 15 05:17:28.723458 systemd-journald[205]: Time jumped backwards, rotating.
Jul 15 05:17:28.716123 systemd-resolved[248]: Clock change detected. Flushing caches.
Jul 15 05:17:28.733289 kernel: hv_vmbus: registering driver hv_storvsc
Jul 15 05:17:28.733324 kernel: hv_pci 7ad35d50-c05b-47ab-b3a0-56a9a845852b: PCI VMBus probing: Using version 0x10004
Jul 15 05:17:28.733801 kernel: hid: raw HID events driver (C) Jiri Kosina
Jul 15 05:17:28.735103 kernel: hv_netvsc f8615163-0000-1000-2000-7ced8d496b19 (unnamed net_device) (uninitialized): VF slot 1 added
Jul 15 05:17:28.738251 kernel: scsi host0: storvsc_host_t
Jul 15 05:17:28.743745 kernel: scsi 0:0:0:2: CD-ROM Msft Virtual DVD-ROM 1.0 PQ: 0 ANSI: 5
Jul 15 05:17:28.743920 kernel: hv_pci 7ad35d50-c05b-47ab-b3a0-56a9a845852b: PCI host bridge to bus c05b:00
Jul 15 05:17:28.744498 kernel: pci_bus c05b:00: root bus resource [mem 0xfc0000000-0xfc007ffff window]
Jul 15 05:17:28.746136 kernel: pci_bus c05b:00: No busn resource found for root bus, will use [bus 00-ff]
Jul 15 05:17:28.750009 kernel: pci c05b:00:00.0: [1414:00a9] type 00 class 0x010802 PCIe Endpoint
Jul 15 05:17:28.755149 kernel: pci c05b:00:00.0: BAR 0 [mem 0xfc0000000-0xfc007ffff 64bit]
Jul 15 05:17:28.756978 kernel: hv_vmbus: registering driver hid_hyperv
Jul 15 05:17:28.761815 kernel: input: Microsoft Vmbus HID-compliant Mouse as /devices/0006:045E:0621.0001/input/input1
Jul 15 05:17:28.761844 kernel: hid-hyperv 0006:045E:0621.0001: input: VIRTUAL HID v0.01 Mouse [Microsoft Vmbus HID-compliant Mouse] on
Jul 15 05:17:28.769974 kernel: pci c05b:00:00.0: 32.000 Gb/s available PCIe bandwidth, limited by 2.5 GT/s PCIe x16 link at c05b:00:00.0 (capable of 1024.000 Gb/s with 64.0 GT/s PCIe x16 link)
Jul 15 05:17:28.775641 kernel: sr 0:0:0:2: [sr0] scsi-1 drive
Jul 15 05:17:28.776080 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Jul 15 05:17:28.776094 kernel: pci_bus c05b:00: busn_res: [bus 00-ff] end is updated to 00
Jul 15 05:17:28.777455 kernel: pci c05b:00:00.0: BAR 0 [mem 0xfc0000000-0xfc007ffff 64bit]: assigned
Jul 15 05:17:28.779128 kernel: sr 0:0:0:2: Attached scsi CD-ROM sr0
Jul 15 05:17:28.791822 kernel: nvme nvme0: pci function c05b:00:00.0
Jul 15 05:17:28.792007 kernel: nvme c05b:00:00.0: enabling device (0000 -> 0002)
Jul 15 05:17:28.796013 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#196 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
Jul 15 05:17:28.812978 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#20 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
Jul 15 05:17:29.054056 kernel: nvme nvme0: 2/0/0 default/read/poll queues
Jul 15 05:17:29.248982 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9
Jul 15 05:17:29.498982 kernel: nvme nvme0: using unchecked data buffer
Jul 15 05:17:29.760926 kernel: hv_pci 00000001-7870-47b5-b203-907d12ca697e: PCI VMBus probing: Using version 0x10004
Jul 15 05:17:29.761143 kernel: hv_pci 00000001-7870-47b5-b203-907d12ca697e: PCI host bridge to bus 7870:00
Jul 15 05:17:29.763489 kernel: pci_bus 7870:00: root bus resource [mem 0xfc2000000-0xfc4007fff window]
Jul 15 05:17:29.765067 kernel: pci_bus 7870:00: No busn resource found for root bus, will use [bus 00-ff]
Jul 15 05:17:29.769100 kernel: pci 7870:00:00.0: [1414:00ba] type 00 class 0x020000 PCIe Endpoint
Jul 15 05:17:29.772995 kernel: pci 7870:00:00.0: BAR 0 [mem 0xfc2000000-0xfc3ffffff 64bit pref]
Jul 15 05:17:29.777509 kernel: pci 7870:00:00.0: BAR 4 [mem 0xfc4000000-0xfc4007fff 64bit pref]
Jul 15 05:17:29.777530 kernel: pci 7870:00:00.0: enabling Extended Tags
Jul 15 05:17:29.791532 kernel: pci_bus 7870:00: busn_res: [bus 00-ff] end is updated to 00
Jul 15 05:17:29.791714 kernel: pci 7870:00:00.0: BAR 0 [mem 0xfc2000000-0xfc3ffffff 64bit pref]: assigned
Jul 15 05:17:29.793407 kernel: pci 7870:00:00.0: BAR 4 [mem 0xfc4000000-0xfc4007fff 64bit pref]: assigned
Jul 15 05:17:29.800349 kernel: mana 7870:00:00.0: enabling device (0000 -> 0002)
Jul 15 05:17:29.809980 kernel: mana 7870:00:00.0: Microsoft Azure Network Adapter protocol version: 0.1.1
Jul 15 05:17:29.812970 kernel: hv_netvsc f8615163-0000-1000-2000-7ced8d496b19 eth0: VF registering: eth1
Jul 15 05:17:29.813109 kernel: mana 7870:00:00.0 eth1: joined to eth0
Jul 15 05:17:29.817973 kernel: mana 7870:00:00.0 enP30832s1: renamed from eth1
Jul 15 05:17:30.363277 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - MSFT NVMe Accelerator v1.0 EFI-SYSTEM.
Jul 15 05:17:30.375188 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - MSFT NVMe Accelerator v1.0 OEM.
Jul 15 05:17:30.384899 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - MSFT NVMe Accelerator v1.0 USR-A.
Jul 15 05:17:30.385930 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - MSFT NVMe Accelerator v1.0 USR-A.
Jul 15 05:17:30.393824 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Jul 15 05:17:30.409124 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - MSFT NVMe Accelerator v1.0 ROOT.
Jul 15 05:17:31.324258 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Jul 15 05:17:31.328739 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Jul 15 05:17:31.331932 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Jul 15 05:17:31.334130 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Jul 15 05:17:31.336298 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Jul 15 05:17:31.354898 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Jul 15 05:17:31.418982 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9
Jul 15 05:17:31.419312 disk-uuid[667]: The operation has completed successfully.
Jul 15 05:17:31.469598 systemd[1]: disk-uuid.service: Deactivated successfully.
Jul 15 05:17:31.469692 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Jul 15 05:17:31.498615 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Jul 15 05:17:31.508862 sh[717]: Success
Jul 15 05:17:31.537210 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jul 15 05:17:31.537253 kernel: device-mapper: uevent: version 1.0.3
Jul 15 05:17:31.538437 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Jul 15 05:17:31.545970 kernel: device-mapper: verity: sha256 using shash "sha256-ni"
Jul 15 05:17:31.864539 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Jul 15 05:17:31.869078 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Jul 15 05:17:31.882778 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Jul 15 05:17:31.902429 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay'
Jul 15 05:17:31.902473 kernel: BTRFS: device fsid eb96c768-dac4-4ca9-ae1d-82815d4ce00b devid 1 transid 36 /dev/mapper/usr (254:0) scanned by mount (730)
Jul 15 05:17:31.903973 kernel: BTRFS info (device dm-0): first mount of filesystem eb96c768-dac4-4ca9-ae1d-82815d4ce00b
Jul 15 05:17:31.906471 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Jul 15 05:17:31.908014 kernel: BTRFS info (device dm-0): using free-space-tree
Jul 15 05:17:32.456667 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Jul 15 05:17:32.460516 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Jul 15 05:17:32.465049 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Jul 15 05:17:32.466561 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Jul 15 05:17:32.480658 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Jul 15 05:17:32.537223 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/nvme0n1p6 (259:12) scanned by mount (753)
Jul 15 05:17:32.537260 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 86e7a055-b4ff-48a6-9a0a-c301ff74862f
Jul 15 05:17:32.539433 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-intel) checksum algorithm
Jul 15 05:17:32.539467 kernel: BTRFS info (device nvme0n1p6): using free-space-tree
Jul 15 05:17:32.552433 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Jul 15 05:17:32.557089 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Jul 15 05:17:32.569108 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem 86e7a055-b4ff-48a6-9a0a-c301ff74862f
Jul 15 05:17:32.569038 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Jul 15 05:17:32.571912 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Jul 15 05:17:32.589271 systemd-networkd[893]: lo: Link UP
Jul 15 05:17:32.593454 kernel: mana 7870:00:00.0 enP30832s1: Configured vPort 0 PD 18 DB 16
Jul 15 05:17:32.596056 kernel: mana 7870:00:00.0 enP30832s1: Configured steering vPort 0 entries 64
Jul 15 05:17:32.589278 systemd-networkd[893]: lo: Gained carrier
Jul 15 05:17:32.598721 kernel: hv_netvsc f8615163-0000-1000-2000-7ced8d496b19 eth0: Data path switched to VF: enP30832s1
Jul 15 05:17:32.590731 systemd-networkd[893]: Enumeration completed
Jul 15 05:17:32.590794 systemd[1]: Started systemd-networkd.service - Network Configuration.
Jul 15 05:17:32.591117 systemd-networkd[893]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jul 15 05:17:32.591120 systemd-networkd[893]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Jul 15 05:17:32.594120 systemd[1]: Reached target network.target - Network.
Jul 15 05:17:32.598688 systemd-networkd[893]: enP30832s1: Link UP
Jul 15 05:17:32.598744 systemd-networkd[893]: eth0: Link UP
Jul 15 05:17:32.598884 systemd-networkd[893]: eth0: Gained carrier
Jul 15 05:17:32.598893 systemd-networkd[893]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jul 15 05:17:32.604480 systemd-networkd[893]: enP30832s1: Gained carrier
Jul 15 05:17:32.610991 systemd-networkd[893]: eth0: DHCPv4 address 10.200.8.4/24, gateway 10.200.8.1 acquired from 168.63.129.16
Jul 15 05:17:33.938665 ignition[900]: Ignition 2.21.0
Jul 15 05:17:33.938677 ignition[900]: Stage: fetch-offline
Jul 15 05:17:33.940887 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Jul 15 05:17:33.938760 ignition[900]: no configs at "/usr/lib/ignition/base.d"
Jul 15 05:17:33.943194 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Jul 15 05:17:33.938767 ignition[900]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Jul 15 05:17:33.938850 ignition[900]: parsed url from cmdline: ""
Jul 15 05:17:33.938853 ignition[900]: no config URL provided
Jul 15 05:17:33.938857 ignition[900]: reading system config file "/usr/lib/ignition/user.ign"
Jul 15 05:17:33.938862 ignition[900]: no config at "/usr/lib/ignition/user.ign"
Jul 15 05:17:33.938866 ignition[900]: failed to fetch config: resource requires networking
Jul 15 05:17:33.939130 ignition[900]: Ignition finished successfully
Jul 15 05:17:33.974948 ignition[916]: Ignition 2.21.0
Jul 15 05:17:33.974976 ignition[916]: Stage: fetch
Jul 15 05:17:33.975158 ignition[916]: no configs at "/usr/lib/ignition/base.d"
Jul 15 05:17:33.975165 ignition[916]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Jul 15 05:17:33.975870 ignition[916]: parsed url from cmdline: ""
Jul 15 05:17:33.975872 ignition[916]: no config URL provided
Jul 15 05:17:33.975875 ignition[916]: reading system config file "/usr/lib/ignition/user.ign"
Jul 15 05:17:33.975879 ignition[916]: no config at "/usr/lib/ignition/user.ign"
Jul 15 05:17:33.975903 ignition[916]: GET http://169.254.169.254/metadata/instance/compute/userData?api-version=2021-01-01&format=text: attempt #1
Jul 15 05:17:33.994084 systemd-networkd[893]: enP30832s1: Gained IPv6LL
Jul 15 05:17:34.043268 ignition[916]: GET result: OK
Jul 15 05:17:34.043345 ignition[916]: config has been read from IMDS userdata
Jul 15 05:17:34.043371 ignition[916]: parsing config with SHA512: 64e275655b1cec308110bf3029c18e83283d0659419a42a8621afeac7811db76979812ff995d2836424aff9fe355e20f7b635b592f10c1fa191fe3f88c027897
Jul 15 05:17:34.049420 unknown[916]: fetched base config from "system"
Jul 15 05:17:34.049428 unknown[916]: fetched base config from "system"
Jul 15 05:17:34.050479 ignition[916]: fetch: fetch complete
Jul 15 05:17:34.049433 unknown[916]: fetched user config from "azure"
Jul 15 05:17:34.050554 ignition[916]: fetch: fetch passed
Jul 15 05:17:34.054069 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Jul 15 05:17:34.050645 ignition[916]: Ignition finished successfully
Jul 15 05:17:34.058354 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Jul 15 05:17:34.077618 ignition[922]: Ignition 2.21.0
Jul 15 05:17:34.077627 ignition[922]: Stage: kargs
Jul 15 05:17:34.079851 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Jul 15 05:17:34.077804 ignition[922]: no configs at "/usr/lib/ignition/base.d"
Jul 15 05:17:34.081881 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Jul 15 05:17:34.077811 ignition[922]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Jul 15 05:17:34.078421 ignition[922]: kargs: kargs passed
Jul 15 05:17:34.078455 ignition[922]: Ignition finished successfully
Jul 15 05:17:34.101652 ignition[928]: Ignition 2.21.0
Jul 15 05:17:34.101661 ignition[928]: Stage: disks
Jul 15 05:17:34.103635 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Jul 15 05:17:34.101844 ignition[928]: no configs at "/usr/lib/ignition/base.d"
Jul 15 05:17:34.107168 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Jul 15 05:17:34.101851 ignition[928]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Jul 15 05:17:34.112011 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Jul 15 05:17:34.102700 ignition[928]: disks: disks passed
Jul 15 05:17:34.113909 systemd[1]: Reached target local-fs.target - Local File Systems.
Jul 15 05:17:34.102735 ignition[928]: Ignition finished successfully
Jul 15 05:17:34.114948 systemd[1]: Reached target sysinit.target - System Initialization.
Jul 15 05:17:34.116997 systemd[1]: Reached target basic.target - Basic System.
Jul 15 05:17:34.120706 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Jul 15 05:17:34.193559 systemd-fsck[936]: ROOT: clean, 15/7326000 files, 477845/7359488 blocks
Jul 15 05:17:34.197360 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Jul 15 05:17:34.201521 systemd[1]: Mounting sysroot.mount - /sysroot...
Jul 15 05:17:34.378091 systemd-networkd[893]: eth0: Gained IPv6LL
Jul 15 05:17:34.626989 kernel: EXT4-fs (nvme0n1p9): mounted filesystem 277c3938-5262-4ab1-8fa3-62fde82f8257 r/w with ordered data mode. Quota mode: none.
Jul 15 05:17:34.627140 systemd[1]: Mounted sysroot.mount - /sysroot.
Jul 15 05:17:34.629407 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Jul 15 05:17:34.647897 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Jul 15 05:17:34.648967 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Jul 15 05:17:34.659175 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Jul 15 05:17:34.663347 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Jul 15 05:17:34.663378 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Jul 15 05:17:34.668786 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Jul 15 05:17:34.677821 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/nvme0n1p6 (259:12) scanned by mount (945)
Jul 15 05:17:34.677034 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Jul 15 05:17:34.681875 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 86e7a055-b4ff-48a6-9a0a-c301ff74862f
Jul 15 05:17:34.682013 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-intel) checksum algorithm
Jul 15 05:17:34.682026 kernel: BTRFS info (device nvme0n1p6): using free-space-tree
Jul 15 05:17:34.687538 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Jul 15 05:17:35.235147 coreos-metadata[947]: Jul 15 05:17:35.235 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1
Jul 15 05:17:35.247156 coreos-metadata[947]: Jul 15 05:17:35.247 INFO Fetch successful
Jul 15 05:17:35.248230 coreos-metadata[947]: Jul 15 05:17:35.247 INFO Fetching http://169.254.169.254/metadata/instance/compute/name?api-version=2017-08-01&format=text: Attempt #1
Jul 15 05:17:35.255265 coreos-metadata[947]: Jul 15 05:17:35.255 INFO Fetch successful
Jul 15 05:17:35.269962 coreos-metadata[947]: Jul 15 05:17:35.269 INFO wrote hostname ci-4396.0.0-n-1e5a06c7e3 to /sysroot/etc/hostname
Jul 15 05:17:35.273299 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Jul 15 05:17:35.362596 initrd-setup-root[976]: cut: /sysroot/etc/passwd: No such file or directory
Jul 15 05:17:35.395085 initrd-setup-root[983]: cut: /sysroot/etc/group: No such file or directory
Jul 15 05:17:35.398787 initrd-setup-root[990]: cut: /sysroot/etc/shadow: No such file or directory
Jul 15 05:17:35.417463 initrd-setup-root[997]: cut: /sysroot/etc/gshadow: No such file or directory
Jul 15 05:17:37.553492 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Jul 15 05:17:37.555948 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Jul 15 05:17:37.561225 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Jul 15 05:17:37.577846 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Jul 15 05:17:37.579929 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem 86e7a055-b4ff-48a6-9a0a-c301ff74862f
Jul 15 05:17:37.597401 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Jul 15 05:17:37.603506 ignition[1065]: INFO : Ignition 2.21.0
Jul 15 05:17:37.603506 ignition[1065]: INFO : Stage: mount
Jul 15 05:17:37.610359 ignition[1065]: INFO : no configs at "/usr/lib/ignition/base.d"
Jul 15 05:17:37.610359 ignition[1065]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Jul 15 05:17:37.610359 ignition[1065]: INFO : mount: mount passed
Jul 15 05:17:37.610359 ignition[1065]: INFO : Ignition finished successfully
Jul 15 05:17:37.605570 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Jul 15 05:17:37.608055 systemd[1]: Starting ignition-files.service - Ignition (files)...
Jul 15 05:17:37.642835 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Jul 15 05:17:37.665969 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/nvme0n1p6 (259:12) scanned by mount (1077)
Jul 15 05:17:37.666001 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 86e7a055-b4ff-48a6-9a0a-c301ff74862f
Jul 15 05:17:37.668067 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-intel) checksum algorithm
Jul 15 05:17:37.669219 kernel: BTRFS info (device nvme0n1p6): using free-space-tree
Jul 15 05:17:37.674362 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Jul 15 05:17:37.701645 ignition[1094]: INFO : Ignition 2.21.0
Jul 15 05:17:37.701645 ignition[1094]: INFO : Stage: files
Jul 15 05:17:37.705492 ignition[1094]: INFO : no configs at "/usr/lib/ignition/base.d"
Jul 15 05:17:37.705492 ignition[1094]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Jul 15 05:17:37.705492 ignition[1094]: DEBUG : files: compiled without relabeling support, skipping
Jul 15 05:17:37.719735 ignition[1094]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Jul 15 05:17:37.719735 ignition[1094]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Jul 15 05:17:37.772640 ignition[1094]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Jul 15 05:17:37.774475 ignition[1094]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Jul 15 05:17:37.774475 ignition[1094]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Jul 15 05:17:37.772977 unknown[1094]: wrote ssh authorized keys file for user: core
Jul 15 05:17:37.780044 ignition[1094]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Jul 15 05:17:37.780044 ignition[1094]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1
Jul 15 05:17:38.081816 ignition[1094]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Jul 15 05:17:38.339696 ignition[1094]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Jul 15 05:17:38.339696 ignition[1094]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Jul 15 05:17:38.345995 ignition[1094]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Jul 15 05:17:38.345995 ignition[1094]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Jul 15 05:17:38.345995 ignition[1094]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Jul 15 05:17:38.345995 ignition[1094]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Jul 15 05:17:38.345995 ignition[1094]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Jul 15 05:17:38.345995 ignition[1094]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Jul 15 05:17:38.345995 ignition[1094]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Jul 15 05:17:38.415586 ignition[1094]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Jul 15 05:17:38.417692 ignition[1094]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Jul 15 05:17:38.417692 ignition[1094]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Jul 15 05:17:38.423286 ignition[1094]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Jul 15 05:17:38.423286 ignition[1094]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Jul 15 05:17:38.423286 ignition[1094]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-x86-64.raw: attempt #1
Jul 15 05:17:39.248594 ignition[1094]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Jul 15 05:17:42.537120 ignition[1094]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Jul 15 05:17:42.537120 ignition[1094]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Jul 15 05:17:42.551917 ignition[1094]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Jul 15 05:17:42.562356 ignition[1094]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Jul 15 05:17:42.562356 ignition[1094]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Jul 15 05:17:42.562356 ignition[1094]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Jul 15 05:17:42.574061 ignition[1094]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Jul 15 05:17:42.574061 ignition[1094]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
Jul 15 05:17:42.574061 ignition[1094]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
Jul 15 05:17:42.574061 ignition[1094]: INFO : files: files passed
Jul 15 05:17:42.574061 ignition[1094]: INFO : Ignition finished successfully
Jul 15 05:17:42.566050 systemd[1]: Finished ignition-files.service - Ignition (files).
Jul 15 05:17:42.572894 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Jul 15 05:17:42.589072 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Jul 15 05:17:42.595266 systemd[1]: ignition-quench.service: Deactivated successfully.
Jul 15 05:17:42.595372 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Jul 15 05:17:42.612209 initrd-setup-root-after-ignition[1124]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Jul 15 05:17:42.612209 initrd-setup-root-after-ignition[1124]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Jul 15 05:17:42.617068 initrd-setup-root-after-ignition[1128]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Jul 15 05:17:42.616740 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Jul 15 05:17:42.620444 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Jul 15 05:17:42.625690 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Jul 15 05:17:42.664097 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jul 15 05:17:42.664173 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Jul 15 05:17:42.665055 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Jul 15 05:17:42.665329 systemd[1]: Reached target initrd.target - Initrd Default Target.
Jul 15 05:17:42.665416 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Jul 15 05:17:42.667058 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Jul 15 05:17:42.678984 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Jul 15 05:17:42.682516 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Jul 15 05:17:42.699692 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Jul 15 05:17:42.699839 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Jul 15 05:17:42.700028 systemd[1]: Stopped target timers.target - Timer Units.
Jul 15 05:17:42.700297 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jul 15 05:17:42.700388 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Jul 15 05:17:42.700644 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Jul 15 05:17:42.700894 systemd[1]: Stopped target basic.target - Basic System.
Jul 15 05:17:42.701566 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Jul 15 05:17:42.702044 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Jul 15 05:17:42.702285 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Jul 15 05:17:42.702584 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Jul 15 05:17:42.702834 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Jul 15 05:17:42.703105 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Jul 15 05:17:42.703362 systemd[1]: Stopped target sysinit.target - System Initialization.
Jul 15 05:17:42.703667 systemd[1]: Stopped target local-fs.target - Local File Systems.
Jul 15 05:17:42.703830 systemd[1]: Stopped target swap.target - Swaps.
Jul 15 05:17:42.704090 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jul 15 05:17:42.704186 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Jul 15 05:17:42.704758 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Jul 15 05:17:42.705070 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jul 15 05:17:42.705304 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Jul 15 05:17:42.708149 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jul 15 05:17:42.725744 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jul 15 05:17:42.725868 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Jul 15 05:17:42.738786 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Jul 15 05:17:42.738887 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Jul 15 05:17:42.743174 systemd[1]: ignition-files.service: Deactivated successfully.
Jul 15 05:17:42.744297 systemd[1]: Stopped ignition-files.service - Ignition (files).
Jul 15 05:17:42.746352 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Jul 15 05:17:42.746452 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Jul 15 05:17:42.787459 ignition[1148]: INFO : Ignition 2.21.0
Jul 15 05:17:42.787459 ignition[1148]: INFO : Stage: umount
Jul 15 05:17:42.787459 ignition[1148]: INFO : no configs at "/usr/lib/ignition/base.d"
Jul 15 05:17:42.787459 ignition[1148]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Jul 15 05:17:42.750937 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Jul 15 05:17:42.805045 ignition[1148]: INFO : umount: umount passed
Jul 15 05:17:42.805045 ignition[1148]: INFO : Ignition finished successfully
Jul 15 05:17:42.755023 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jul 15 05:17:42.755131 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Jul 15 05:17:42.773326 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Jul 15 05:17:42.774832 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jul 15 05:17:42.778151 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Jul 15 05:17:42.783176 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jul 15 05:17:42.783305 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Jul 15 05:17:42.791276 systemd[1]: ignition-mount.service: Deactivated successfully.
Jul 15 05:17:42.793361 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Jul 15 05:17:42.799040 systemd[1]: ignition-disks.service: Deactivated successfully.
Jul 15 05:17:42.799203 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Jul 15 05:17:42.803790 systemd[1]: ignition-kargs.service: Deactivated successfully.
Jul 15 05:17:42.803825 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Jul 15 05:17:42.807064 systemd[1]: ignition-fetch.service: Deactivated successfully.
Jul 15 05:17:42.807105 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Jul 15 05:17:42.811375 systemd[1]: Stopped target network.target - Network.
Jul 15 05:17:42.813820 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Jul 15 05:17:42.814684 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Jul 15 05:17:42.817333 systemd[1]: Stopped target paths.target - Path Units.
Jul 15 05:17:42.820990 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jul 15 05:17:42.826033 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jul 15 05:17:42.828500 systemd[1]: Stopped target slices.target - Slice Units.
Jul 15 05:17:42.834702 systemd[1]: Stopped target sockets.target - Socket Units.
Jul 15 05:17:42.839215 systemd[1]: iscsid.socket: Deactivated successfully.
Jul 15 05:17:42.839256 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Jul 15 05:17:42.842898 systemd[1]: iscsiuio.socket: Deactivated successfully.
Jul 15 05:17:42.842996 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Jul 15 05:17:42.845382 systemd[1]: ignition-setup.service: Deactivated successfully.
Jul 15 05:17:42.846033 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Jul 15 05:17:42.849218 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Jul 15 05:17:42.849253 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Jul 15 05:17:42.849614 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Jul 15 05:17:42.911706 kernel: hv_netvsc f8615163-0000-1000-2000-7ced8d496b19 eth0: Data path switched from VF: enP30832s1
Jul 15 05:17:42.911869 kernel: mana 7870:00:00.0 enP30832s1: Configured steering vPort 0 entries 64
Jul 15 05:17:42.850099 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Jul 15 05:17:42.850593 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jul 15 05:17:42.850721 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Jul 15 05:17:42.864393 systemd[1]: systemd-resolved.service: Deactivated successfully.
Jul 15 05:17:42.864469 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Jul 15 05:17:42.868773 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Jul 15 05:17:42.868885 systemd[1]: systemd-networkd.service: Deactivated successfully.
Jul 15 05:17:42.868965 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Jul 15 05:17:42.875249 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Jul 15 05:17:42.875679 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Jul 15 05:17:42.880059 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Jul 15 05:17:42.880091 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Jul 15 05:17:42.885589 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Jul 15 05:17:42.885742 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Jul 15 05:17:42.885778 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Jul 15 05:17:42.885832 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jul 15 05:17:42.885860 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Jul 15 05:17:42.889211 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jul 15 05:17:42.889255 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Jul 15 05:17:42.889405 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jul 15 05:17:42.889433 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Jul 15 05:17:42.890309 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jul 15 05:17:42.891482 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jul 15 05:17:42.891526 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Jul 15 05:17:42.901592 systemd[1]: systemd-udevd.service: Deactivated successfully.
Jul 15 05:17:42.902109 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jul 15 05:17:42.909270 systemd[1]: network-cleanup.service: Deactivated successfully.
Jul 15 05:17:42.909341 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Jul 15 05:17:42.912340 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jul 15 05:17:42.912401 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Jul 15 05:17:42.924283 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jul 15 05:17:42.924307 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Jul 15 05:17:42.929754 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jul 15 05:17:42.929786 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Jul 15 05:17:42.960995 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jul 15 05:17:42.961132 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Jul 15 05:17:42.971992 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Jul 15 05:17:42.972877 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jul 15 05:17:42.979503 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Jul 15 05:17:42.982424 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Jul 15 05:17:42.982478 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Jul 15 05:17:42.983669 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jul 15 05:17:42.983713 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jul 15 05:17:42.987568 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jul 15 05:17:42.987829 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jul 15 05:17:42.993501 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
Jul 15 05:17:42.993548 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Jul 15 05:17:42.993581 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Jul 15 05:17:42.993820 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jul 15 05:17:42.993880 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Jul 15 05:17:43.322387 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Jul 15 05:17:43.344794 systemd[1]: sysroot-boot.service: Deactivated successfully.
Jul 15 05:17:43.345711 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Jul 15 05:17:43.350082 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Jul 15 05:17:43.352375 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Jul 15 05:17:43.353030 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Jul 15 05:17:43.356732 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Jul 15 05:17:43.368573 systemd[1]: Switching root.
Jul 15 05:17:43.433930 systemd-journald[205]: Journal stopped
Jul 15 05:17:47.089633 systemd-journald[205]: Received SIGTERM from PID 1 (systemd).
Jul 15 05:17:47.089661 kernel: SELinux: policy capability network_peer_controls=1
Jul 15 05:17:47.089671 kernel: SELinux: policy capability open_perms=1
Jul 15 05:17:47.089679 kernel: SELinux: policy capability extended_socket_class=1
Jul 15 05:17:47.089687 kernel: SELinux: policy capability always_check_network=0
Jul 15 05:17:47.089694 kernel: SELinux: policy capability cgroup_seclabel=1
Jul 15 05:17:47.089704 kernel: SELinux: policy capability nnp_nosuid_transition=1
Jul 15 05:17:47.089712 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Jul 15 05:17:47.089720 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Jul 15 05:17:47.089728 kernel: SELinux: policy capability userspace_initial_context=0
Jul 15 05:17:47.089736 kernel: audit: type=1403 audit(1752556664.598:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jul 15 05:17:47.089746 systemd[1]: Successfully loaded SELinux policy in 144.153ms.
Jul 15 05:17:47.089756 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 3.920ms.
Jul 15 05:17:47.089767 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Jul 15 05:17:47.089776 systemd[1]: Detected virtualization microsoft.
Jul 15 05:17:47.089783 systemd[1]: Detected architecture x86-64.
Jul 15 05:17:47.089792 systemd[1]: Detected first boot.
Jul 15 05:17:47.089801 systemd[1]: Hostname set to .
Jul 15 05:17:47.089810 systemd[1]: Initializing machine ID from random generator.
Jul 15 05:17:47.089818 zram_generator::config[1192]: No configuration found.
Jul 15 05:17:47.089828 kernel: Guest personality initialized and is inactive
Jul 15 05:17:47.089836 kernel: VMCI host device registered (name=vmci, major=10, minor=124)
Jul 15 05:17:47.090077 kernel: Initialized host personality
Jul 15 05:17:47.090144 kernel: NET: Registered PF_VSOCK protocol family
Jul 15 05:17:47.090154 systemd[1]: Populated /etc with preset unit settings.
Jul 15 05:17:47.090222 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Jul 15 05:17:47.090285 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Jul 15 05:17:47.090295 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Jul 15 05:17:47.090305 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jul 15 05:17:47.090371 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Jul 15 05:17:47.090434 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Jul 15 05:17:47.090445 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Jul 15 05:17:47.090517 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Jul 15 05:17:47.090533 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Jul 15 05:17:47.090542 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Jul 15 05:17:47.090555 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Jul 15 05:17:47.090568 systemd[1]: Created slice user.slice - User and Session Slice.
Jul 15 05:17:47.090581 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jul 15 05:17:47.090593 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jul 15 05:17:47.090606 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Jul 15 05:17:47.090623 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Jul 15 05:17:47.090638 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Jul 15 05:17:47.090652 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Jul 15 05:17:47.090664 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Jul 15 05:17:47.090677 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jul 15 05:17:47.090690 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Jul 15 05:17:47.090706 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Jul 15 05:17:47.090720 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Jul 15 05:17:47.090734 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Jul 15 05:17:47.090748 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Jul 15 05:17:47.090763 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Jul 15 05:17:47.090774 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Jul 15 05:17:47.090788 systemd[1]: Reached target slices.target - Slice Units.
Jul 15 05:17:47.090800 systemd[1]: Reached target swap.target - Swaps.
Jul 15 05:17:47.090812 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Jul 15 05:17:47.090825 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Jul 15 05:17:47.090842 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Jul 15 05:17:47.090857 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Jul 15 05:17:47.090869 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Jul 15 05:17:47.090881 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Jul 15 05:17:47.090895 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Jul 15 05:17:47.090909 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Jul 15 05:17:47.090921 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Jul 15 05:17:47.090935 systemd[1]: Mounting media.mount - External Media Directory...
Jul 15 05:17:47.090947 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jul 15 05:17:47.093247 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Jul 15 05:17:47.093262 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Jul 15 05:17:47.093272 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Jul 15 05:17:47.093283 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jul 15 05:17:47.093296 systemd[1]: Reached target machines.target - Containers.
Jul 15 05:17:47.093305 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Jul 15 05:17:47.093314 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Jul 15 05:17:47.093324 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Jul 15 05:17:47.093334 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Jul 15 05:17:47.093343 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Jul 15 05:17:47.093353 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Jul 15 05:17:47.093362 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Jul 15 05:17:47.093371 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Jul 15 05:17:47.093382 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Jul 15 05:17:47.093392 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Jul 15 05:17:47.093402 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Jul 15 05:17:47.093411 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Jul 15 05:17:47.093420 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Jul 15 05:17:47.093429 systemd[1]: Stopped systemd-fsck-usr.service.
Jul 15 05:17:47.093439 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Jul 15 05:17:47.093449 systemd[1]: Starting systemd-journald.service - Journal Service...
Jul 15 05:17:47.093460 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Jul 15 05:17:47.093469 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Jul 15 05:17:47.093479 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Jul 15 05:17:47.093488 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Jul 15 05:17:47.093497 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Jul 15 05:17:47.093506 systemd[1]: verity-setup.service: Deactivated successfully.
Jul 15 05:17:47.093515 systemd[1]: Stopped verity-setup.service.
Jul 15 05:17:47.093525 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jul 15 05:17:47.093556 systemd-journald[1275]: Collecting audit messages is disabled.
Jul 15 05:17:47.093578 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Jul 15 05:17:47.093588 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Jul 15 05:17:47.093597 systemd-journald[1275]: Journal started
Jul 15 05:17:47.093621 systemd-journald[1275]: Runtime Journal (/run/log/journal/55c628f724b6430889886ff44522eb53) is 8M, max 158.9M, 150.9M free.
Jul 15 05:17:46.738782 systemd[1]: Queued start job for default target multi-user.target.
Jul 15 05:17:46.747349 systemd[1]: Unnecessary job was removed for dev-nvme0n1p6.device - /dev/nvme0n1p6.
Jul 15 05:17:46.747634 systemd[1]: systemd-journald.service: Deactivated successfully.
Jul 15 05:17:47.095974 systemd[1]: Started systemd-journald.service - Journal Service.
Jul 15 05:17:47.097821 systemd[1]: Mounted media.mount - External Media Directory.
Jul 15 05:17:47.103003 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Jul 15 05:17:47.104793 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Jul 15 05:17:47.107671 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Jul 15 05:17:47.109291 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Jul 15 05:17:47.120990 kernel: fuse: init (API version 7.41)
Jul 15 05:17:47.121027 kernel: loop: module loaded
Jul 15 05:17:47.111588 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jul 15 05:17:47.111934 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Jul 15 05:17:47.114366 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Jul 15 05:17:47.114510 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Jul 15 05:17:47.117130 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jul 15 05:17:47.117296 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Jul 15 05:17:47.119949 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jul 15 05:17:47.120400 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Jul 15 05:17:47.122453 systemd[1]: modprobe@loop.service: Deactivated successfully.
Jul 15 05:17:47.122579 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Jul 15 05:17:47.125089 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Jul 15 05:17:47.127318 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Jul 15 05:17:47.134993 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Jul 15 05:17:47.139492 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Jul 15 05:17:47.143070 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Jul 15 05:17:47.144893 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Jul 15 05:17:47.145063 systemd[1]: Reached target local-fs.target - Local File Systems.
Jul 15 05:17:47.146734 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Jul 15 05:17:47.153089 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Jul 15 05:17:47.154725 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jul 15 05:17:47.158128 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Jul 15 05:17:47.161512 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Jul 15 05:17:47.163806 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jul 15 05:17:47.166120 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Jul 15 05:17:47.169269 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Jul 15 05:17:47.172531 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Jul 15 05:17:47.179070 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Jul 15 05:17:47.185833 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Jul 15 05:17:47.191266 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Jul 15 05:17:47.193936 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Jul 15 05:17:47.196291 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Jul 15 05:17:47.198772 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Jul 15 05:17:47.205391 systemd[1]: Reached target network-pre.target - Preparation for Network.
Jul 15 05:17:47.214193 systemd-journald[1275]: Time spent on flushing to /var/log/journal/55c628f724b6430889886ff44522eb53 is 30.069ms for 981 entries.
Jul 15 05:17:47.214193 systemd-journald[1275]: System Journal (/var/log/journal/55c628f724b6430889886ff44522eb53) is 11.8M, max 2.6G, 2.6G free.
Jul 15 05:17:47.341576 systemd-journald[1275]: Received client request to flush runtime journal.
Jul 15 05:17:47.341611 kernel: loop0: detected capacity change from 0 to 28624
Jul 15 05:17:47.341623 systemd-journald[1275]: /var/log/journal/55c628f724b6430889886ff44522eb53/system.journal: Realtime clock jumped backwards relative to last journal entry, rotating.
Jul 15 05:17:47.341639 systemd-journald[1275]: Rotating system journal.
Jul 15 05:17:47.219677 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Jul 15 05:17:47.234712 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Jul 15 05:17:47.236323 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Jul 15 05:17:47.240065 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Jul 15 05:17:47.292088 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Jul 15 05:17:47.307407 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Jul 15 05:17:47.310674 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Jul 15 05:17:47.342608 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Jul 15 05:17:47.361331 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Jul 15 05:17:47.373678 systemd-tmpfiles[1344]: ACLs are not supported, ignoring.
Jul 15 05:17:47.373901 systemd-tmpfiles[1344]: ACLs are not supported, ignoring.
Jul 15 05:17:47.376386 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jul 15 05:17:47.384437 kernel: ACPI: bus type drm_connector registered
Jul 15 05:17:47.384226 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jul 15 05:17:47.384379 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Jul 15 05:17:47.641980 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Jul 15 05:17:47.694972 kernel: loop1: detected capacity change from 0 to 146488
Jul 15 05:17:47.748970 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Jul 15 05:17:48.168981 kernel: loop2: detected capacity change from 0 to 114000
Jul 15 05:17:48.246433 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Jul 15 05:17:48.250163 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jul 15 05:17:48.281641 systemd-udevd[1356]: Using default interface naming scheme 'v255'.
Jul 15 05:17:48.350608 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jul 15 05:17:48.354978 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Jul 15 05:17:48.408173 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Jul 15 05:17:48.439075 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Jul 15 05:17:48.497973 kernel: loop3: detected capacity change from 0 to 221472
Jul 15 05:17:48.515193 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Jul 15 05:17:48.532986 kernel: hv_vmbus: registering driver hyperv_fb
Jul 15 05:17:48.536082 kernel: hyperv_fb: Synthvid Version major 3, minor 5
Jul 15 05:17:48.540407 kernel: hyperv_fb: Screen resolution: 1024x768, Color depth: 32, Frame buffer size: 8388608
Jul 15 05:17:48.540450 kernel: loop4: detected capacity change from 0 to 28624
Jul 15 05:17:48.540463 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#51 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
Jul 15 05:17:48.540627 kernel: Console: switching to colour dummy device 80x25
Jul 15 05:17:48.546256 kernel: Console: switching to colour frame buffer device 128x48
Jul 15 05:17:48.552985 kernel: mousedev: PS/2 mouse device common for all mice
Jul 15 05:17:48.565981 kernel: loop5: detected capacity change from 0 to 146488
Jul 15 05:17:48.584973 kernel: loop6: detected capacity change from 0 to 114000
Jul 15 05:17:48.609420 kernel: loop7: detected capacity change from 0 to 221472
Jul 15 05:17:48.609143 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jul 15 05:17:48.615978 kernel: hv_vmbus: registering driver hv_balloon
Jul 15 05:17:48.618510 systemd-networkd[1366]: lo: Link UP
Jul 15 05:17:48.618578 systemd-networkd[1366]: lo: Gained carrier
Jul 15 05:17:48.621454 systemd-networkd[1366]: Enumeration completed
Jul 15 05:17:48.621523 systemd[1]: Started systemd-networkd.service - Network Configuration.
Jul 15 05:17:48.624846 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Jul 15 05:17:48.628382 systemd-networkd[1366]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jul 15 05:17:48.628388 systemd-networkd[1366]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Jul 15 05:17:48.629045 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Jul 15 05:17:48.635975 kernel: mana 7870:00:00.0 enP30832s1: Configured vPort 0 PD 18 DB 16
Jul 15 05:17:48.650414 kernel: mana 7870:00:00.0 enP30832s1: Configured steering vPort 0 entries 64
Jul 15 05:17:48.655212 kernel: hv_netvsc f8615163-0000-1000-2000-7ced8d496b19 eth0: Data path switched to VF: enP30832s1
Jul 15 05:17:48.661391 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jul 15 05:17:48.661614 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jul 15 05:17:48.663375 systemd-networkd[1366]: enP30832s1: Link UP
Jul 15 05:17:48.663434 systemd-networkd[1366]: eth0: Link UP
Jul 15 05:17:48.663436 systemd-networkd[1366]: eth0: Gained carrier
Jul 15 05:17:48.663449 systemd-networkd[1366]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jul 15 05:17:48.664086 (sd-merge)[1408]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-azure'.
Jul 15 05:17:48.666792 (sd-merge)[1408]: Merged extensions into '/usr'.
Jul 15 05:17:48.669205 systemd-networkd[1366]: enP30832s1: Gained carrier
Jul 15 05:17:48.678996 systemd-networkd[1366]: eth0: DHCPv4 address 10.200.8.4/24, gateway 10.200.8.1 acquired from 168.63.129.16
Jul 15 05:17:48.697425 systemd[1]: Reload requested from client PID 1330 ('systemd-sysext') (unit systemd-sysext.service)...
Jul 15 05:17:48.697436 systemd[1]: Reloading...
Jul 15 05:17:48.703985 kernel: hv_balloon: Using Dynamic Memory protocol version 2.0
Jul 15 05:17:48.816852 zram_generator::config[1468]: No configuration found.
Jul 15 05:17:48.865120 kernel: kvm_intel: Using Hyper-V Enlightened VMCS
Jul 15 05:17:48.932904 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Jul 15 05:17:49.013750 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - MSFT NVMe Accelerator v1.0 OEM.
Jul 15 05:17:49.015337 systemd[1]: Reloading finished in 317 ms.
Jul 15 05:17:49.034626 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Jul 15 05:17:49.038292 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Jul 15 05:17:49.073025 systemd[1]: Starting ensure-sysext.service...
Jul 15 05:17:49.084077 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Jul 15 05:17:49.086897 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Jul 15 05:17:49.091107 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jul 15 05:17:49.117051 systemd[1]: Reload requested from client PID 1536 ('systemctl') (unit ensure-sysext.service)...
Jul 15 05:17:49.118221 systemd[1]: Reloading...
Jul 15 05:17:49.118614 systemd-tmpfiles[1538]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Jul 15 05:17:49.118826 systemd-tmpfiles[1538]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Jul 15 05:17:49.119084 systemd-tmpfiles[1538]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Jul 15 05:17:49.119312 systemd-tmpfiles[1538]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Jul 15 05:17:49.119915 systemd-tmpfiles[1538]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Jul 15 05:17:49.120179 systemd-tmpfiles[1538]: ACLs are not supported, ignoring.
Jul 15 05:17:49.120255 systemd-tmpfiles[1538]: ACLs are not supported, ignoring.
Jul 15 05:17:49.123682 systemd-tmpfiles[1538]: Detected autofs mount point /boot during canonicalization of boot.
Jul 15 05:17:49.123771 systemd-tmpfiles[1538]: Skipping /boot
Jul 15 05:17:49.129022 systemd-tmpfiles[1538]: Detected autofs mount point /boot during canonicalization of boot.
Jul 15 05:17:49.129083 systemd-tmpfiles[1538]: Skipping /boot
Jul 15 05:17:49.170012 zram_generator::config[1571]: No configuration found.
Jul 15 05:17:49.256634 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Jul 15 05:17:49.335682 systemd[1]: Reloading finished in 217 ms.
Jul 15 05:17:49.354798 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Jul 15 05:17:49.356639 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Jul 15 05:17:49.363640 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Jul 15 05:17:49.372706 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Jul 15 05:17:49.375733 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Jul 15 05:17:49.379014 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Jul 15 05:17:49.383196 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Jul 15 05:17:49.388257 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jul 15 05:17:49.388402 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Jul 15 05:17:49.390529 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Jul 15 05:17:49.398919 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Jul 15 05:17:49.400910 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Jul 15 05:17:49.402653 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jul 15 05:17:49.402761 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Jul 15 05:17:49.402848 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jul 15 05:17:49.410842 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jul 15 05:17:49.411060 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Jul 15 05:17:49.411254 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jul 15 05:17:49.411383 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Jul 15 05:17:49.411552 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jul 15 05:17:49.414744 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Jul 15 05:17:49.414947 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Jul 15 05:17:49.417372 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Jul 15 05:17:49.422556 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Jul 15 05:17:49.429145 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jul 15 05:17:49.429274 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Jul 15 05:17:49.431712 systemd[1]: modprobe@loop.service: Deactivated successfully.
Jul 15 05:17:49.432066 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Jul 15 05:17:49.440914 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jul 15 05:17:49.441566 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Jul 15 05:17:49.444145 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Jul 15 05:17:49.450037 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Jul 15 05:17:49.456390 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Jul 15 05:17:49.461137 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Jul 15 05:17:49.464039 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jul 15 05:17:49.464153 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Jul 15 05:17:49.464294 systemd[1]: Reached target time-set.target - System Time Set.
Jul 15 05:17:49.466376 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jul 15 05:17:49.468297 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Jul 15 05:17:49.468552 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Jul 15 05:17:49.471771 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jul 15 05:17:49.477133 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Jul 15 05:17:49.479577 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Jul 15 05:17:49.489608 systemd[1]: Finished ensure-sysext.service.
Jul 15 05:17:49.491421 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jul 15 05:17:49.491633 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Jul 15 05:17:49.493328 systemd[1]: modprobe@loop.service: Deactivated successfully.
Jul 15 05:17:49.493748 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Jul 15 05:17:49.498684 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jul 15 05:17:49.498737 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Jul 15 05:17:49.511587 systemd-resolved[1640]: Positive Trust Anchors:
Jul 15 05:17:49.511595 systemd-resolved[1640]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Jul 15 05:17:49.511618 systemd-resolved[1640]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Jul 15 05:17:49.528659 systemd-resolved[1640]: Using system hostname 'ci-4396.0.0-n-1e5a06c7e3'.
Jul 15 05:17:49.529987 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Jul 15 05:17:49.533090 systemd[1]: Reached target network.target - Network.
Jul 15 05:17:49.536058 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Jul 15 05:17:49.542564 augenrules[1681]: No rules
Jul 15 05:17:49.543267 systemd[1]: audit-rules.service: Deactivated successfully.
Jul 15 05:17:49.543434 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Jul 15 05:17:49.794164 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Jul 15 05:17:49.795883 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Jul 15 05:17:50.186182 systemd-networkd[1366]: eth0: Gained IPv6LL
Jul 15 05:17:50.188533 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Jul 15 05:17:50.193179 systemd[1]: Reached target network-online.target - Network is Online.
Jul 15 05:17:50.506089 systemd-networkd[1366]: enP30832s1: Gained IPv6LL
Jul 15 05:17:52.506519 ldconfig[1325]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Jul 15 05:17:52.517772 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Jul 15 05:17:52.521075 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Jul 15 05:17:52.537225 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Jul 15 05:17:52.540168 systemd[1]: Reached target sysinit.target - System Initialization.
Jul 15 05:17:52.541464 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Jul 15 05:17:52.544049 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Jul 15 05:17:52.545360 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer.
Jul 15 05:17:52.548091 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Jul 15 05:17:52.549295 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Jul 15 05:17:52.552006 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Jul 15 05:17:52.555005 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Jul 15 05:17:52.555035 systemd[1]: Reached target paths.target - Path Units.
Jul 15 05:17:52.557991 systemd[1]: Reached target timers.target - Timer Units.
Jul 15 05:17:52.561981 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Jul 15 05:17:52.564102 systemd[1]: Starting docker.socket - Docker Socket for the API...
Jul 15 05:17:52.566553 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Jul 15 05:17:52.569166 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Jul 15 05:17:52.572037 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Jul 15 05:17:52.584386 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Jul 15 05:17:52.585781 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Jul 15 05:17:52.590501 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Jul 15 05:17:52.594613 systemd[1]: Reached target sockets.target - Socket Units.
Jul 15 05:17:52.596996 systemd[1]: Reached target basic.target - Basic System.
Jul 15 05:17:52.597845 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Jul 15 05:17:52.597864 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Jul 15 05:17:52.599541 systemd[1]: Starting chronyd.service - NTP client/server...
Jul 15 05:17:52.603034 systemd[1]: Starting containerd.service - containerd container runtime...
Jul 15 05:17:52.608458 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Jul 15 05:17:52.612825 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Jul 15 05:17:52.618067 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Jul 15 05:17:52.622070 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Jul 15 05:17:52.626527 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Jul 15 05:17:52.628122 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Jul 15 05:17:52.629568 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh...
Jul 15 05:17:52.631381 systemd[1]: hv_fcopy_uio_daemon.service - Hyper-V FCOPY UIO daemon was skipped because of an unmet condition check (ConditionPathExists=/sys/bus/vmbus/devices/eb765408-105f-49b6-b4aa-c123b64d17d4/uio).
Jul 15 05:17:52.633060 systemd[1]: Started hv_kvp_daemon.service - Hyper-V KVP daemon.
Jul 15 05:17:52.634871 systemd[1]: hv_vss_daemon.service - Hyper-V VSS daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/vmbus/hv_vss).
Jul 15 05:17:52.636403 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jul 15 05:17:52.641017 jq[1702]: false
Jul 15 05:17:52.643139 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Jul 15 05:17:52.650342 KVP[1705]: KVP starting; pid is:1705
Jul 15 05:17:52.651468 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Jul 15 05:17:52.653980 kernel: hv_utils: KVP IC version 4.0
Jul 15 05:17:52.654111 KVP[1705]: KVP LIC Version: 3.1
Jul 15 05:17:52.655799 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Jul 15 05:17:52.661546 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Jul 15 05:17:52.664753 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Jul 15 05:17:52.673102 systemd[1]: Starting systemd-logind.service - User Login Management...
Jul 15 05:17:52.675582 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Jul 15 05:17:52.675977 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Jul 15 05:17:52.681195 systemd[1]: Starting update-engine.service - Update Engine...
Jul 15 05:17:52.684012 google_oslogin_nss_cache[1704]: oslogin_cache_refresh[1704]: Refreshing passwd entry cache
Jul 15 05:17:52.684998 oslogin_cache_refresh[1704]: Refreshing passwd entry cache
Jul 15 05:17:52.688561 extend-filesystems[1703]: Found /dev/nvme0n1p6
Jul 15 05:17:52.690160 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Jul 15 05:17:52.696344 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Jul 15 05:17:52.698628 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Jul 15 05:17:52.698782 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Jul 15 05:17:52.701276 systemd[1]: motdgen.service: Deactivated successfully.
Jul 15 05:17:52.702277 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Jul 15 05:17:52.709535 google_oslogin_nss_cache[1704]: oslogin_cache_refresh[1704]: Failure getting users, quitting
Jul 15 05:17:52.709535 google_oslogin_nss_cache[1704]: oslogin_cache_refresh[1704]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Jul 15 05:17:52.709535 google_oslogin_nss_cache[1704]: oslogin_cache_refresh[1704]: Refreshing group entry cache
Jul 15 05:17:52.709391 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Jul 15 05:17:52.709249 oslogin_cache_refresh[1704]: Failure getting users, quitting
Jul 15 05:17:52.709544 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Jul 15 05:17:52.709262 oslogin_cache_refresh[1704]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Jul 15 05:17:52.709294 oslogin_cache_refresh[1704]: Refreshing group entry cache
Jul 15 05:17:52.710525 jq[1721]: true
Jul 15 05:17:52.712096 extend-filesystems[1703]: Found /dev/nvme0n1p9
Jul 15 05:17:52.719724 extend-filesystems[1703]: Checking size of /dev/nvme0n1p9
Jul 15 05:17:52.723030 google_oslogin_nss_cache[1704]: oslogin_cache_refresh[1704]: Failure getting groups, quitting
Jul 15 05:17:52.723030 google_oslogin_nss_cache[1704]: oslogin_cache_refresh[1704]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Jul 15 05:17:52.722281 oslogin_cache_refresh[1704]: Failure getting groups, quitting
Jul 15 05:17:52.722289 oslogin_cache_refresh[1704]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Jul 15 05:17:52.724251 systemd[1]: google-oslogin-cache.service: Deactivated successfully.
Jul 15 05:17:52.724435 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh.
Jul 15 05:17:52.739458 (ntainerd)[1743]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Jul 15 05:17:52.745256 (chronyd)[1694]: chronyd.service: Referenced but unset environment variable evaluates to an empty string: OPTIONS
Jul 15 05:17:52.753116 jq[1732]: true
Jul 15 05:17:52.758489 chronyd[1750]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +ASYNCDNS +NTS +SECHASH +IPV6 -DEBUG)
Jul 15 05:17:52.765243 extend-filesystems[1703]: Old size kept for /dev/nvme0n1p9
Jul 15 05:17:52.768333 systemd[1]: extend-filesystems.service: Deactivated successfully.
Jul 15 05:17:52.768513 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Jul 15 05:17:52.783007 chronyd[1750]: Timezone right/UTC failed leap second check, ignoring
Jul 15 05:17:52.789199 chronyd[1750]: Loaded seccomp filter (level 2)
Jul 15 05:17:52.790747 systemd[1]: Started chronyd.service - NTP client/server.
Jul 15 05:17:52.812472 update_engine[1718]: I20250715 05:17:52.812398 1718 main.cc:92] Flatcar Update Engine starting
Jul 15 05:17:52.821164 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Jul 15 05:17:52.826767 tar[1730]: linux-amd64/helm
Jul 15 05:17:52.827828 systemd-logind[1717]: New seat seat0.
Jul 15 05:17:52.832306 systemd-logind[1717]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Jul 15 05:17:52.832426 systemd[1]: Started systemd-logind.service - User Login Management.
Jul 15 05:17:52.866574 bash[1771]: Updated "/home/core/.ssh/authorized_keys"
Jul 15 05:17:52.867352 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Jul 15 05:17:52.871361 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
Jul 15 05:17:52.878665 dbus-daemon[1697]: [system] SELinux support is enabled
Jul 15 05:17:52.879256 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Jul 15 05:17:52.884381 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Jul 15 05:17:52.884411 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Jul 15 05:17:52.886644 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Jul 15 05:17:52.886664 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Jul 15 05:17:52.896314 dbus-daemon[1697]: [system] Successfully activated service 'org.freedesktop.systemd1'
Jul 15 05:17:52.899245 systemd[1]: Started update-engine.service - Update Engine.
Jul 15 05:17:52.904351 update_engine[1718]: I20250715 05:17:52.904309 1718 update_check_scheduler.cc:74] Next update check in 5m52s
Jul 15 05:17:52.905143 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Jul 15 05:17:53.000323 coreos-metadata[1696]: Jul 15 05:17:53.000 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1
Jul 15 05:17:53.010513 coreos-metadata[1696]: Jul 15 05:17:53.010 INFO Fetch successful
Jul 15 05:17:53.010669 coreos-metadata[1696]: Jul 15 05:17:53.010 INFO Fetching http://168.63.129.16/machine/?comp=goalstate: Attempt #1
Jul 15 05:17:53.014023 coreos-metadata[1696]: Jul 15 05:17:53.013 INFO Fetch successful
Jul 15 05:17:53.014023 coreos-metadata[1696]: Jul 15 05:17:53.013 INFO Fetching http://168.63.129.16/machine/3f52465f-3f7b-4fa7-8229-624ee60a9d64/ac78a18c%2D5c00%2D4052%2D9ada%2De543b5b280a0.%5Fci%2D4396.0.0%2Dn%2D1e5a06c7e3?comp=config&type=sharedConfig&incarnation=1: Attempt #1
Jul 15 05:17:53.015413 coreos-metadata[1696]: Jul 15 05:17:53.015 INFO Fetch successful
Jul 15 05:17:53.019521 coreos-metadata[1696]: Jul 15 05:17:53.016 INFO Fetching http://169.254.169.254/metadata/instance/compute/vmSize?api-version=2017-08-01&format=text: Attempt #1
Jul 15 05:17:53.028184 coreos-metadata[1696]: Jul 15 05:17:53.027 INFO Fetch successful
Jul 15 05:17:53.093919 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Jul 15 05:17:53.095999 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Jul 15 05:17:53.163298 sshd_keygen[1727]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Jul 15 05:17:53.185248 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Jul 15 05:17:53.187950 locksmithd[1790]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Jul 15 05:17:53.191212 systemd[1]: Starting issuegen.service - Generate /run/issue...
Jul 15 05:17:53.194099 systemd[1]: Starting waagent.service - Microsoft Azure Linux Agent...
Jul 15 05:17:53.218339 systemd[1]: issuegen.service: Deactivated successfully.
Jul 15 05:17:53.218507 systemd[1]: Finished issuegen.service - Generate /run/issue.
Jul 15 05:17:53.222401 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Jul 15 05:17:53.236816 systemd[1]: Started waagent.service - Microsoft Azure Linux Agent.
Jul 15 05:17:53.251221 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Jul 15 05:17:53.256341 systemd[1]: Started getty@tty1.service - Getty on tty1.
Jul 15 05:17:53.262197 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
Jul 15 05:17:53.264206 systemd[1]: Reached target getty.target - Login Prompts.
Jul 15 05:17:53.411068 tar[1730]: linux-amd64/LICENSE
Jul 15 05:17:53.411212 tar[1730]: linux-amd64/README.md
Jul 15 05:17:53.424922 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Jul 15 05:17:53.930113 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jul 15 05:17:53.942212 (kubelet)[1846]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Jul 15 05:17:54.002103 containerd[1743]: time="2025-07-15T05:17:54Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Jul 15 05:17:54.003737 containerd[1743]: time="2025-07-15T05:17:54.003696241Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5
Jul 15 05:17:54.011933 containerd[1743]: time="2025-07-15T05:17:54.011902211Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="7.068µs"
Jul 15 05:17:54.011933 containerd[1743]: time="2025-07-15T05:17:54.011925119Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Jul 15 05:17:54.012037 containerd[1743]: time="2025-07-15T05:17:54.011941158Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Jul 15 05:17:54.012648 containerd[1743]: time="2025-07-15T05:17:54.012073984Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Jul 15 05:17:54.012648 containerd[1743]: time="2025-07-15T05:17:54.012087809Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Jul 15 05:17:54.012648 containerd[1743]: time="2025-07-15T05:17:54.012106118Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Jul 15 05:17:54.012648 containerd[1743]: time="2025-07-15T05:17:54.012148029Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Jul 15 05:17:54.012648 containerd[1743]: time="2025-07-15T05:17:54.012156964Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Jul 15 05:17:54.012648 containerd[1743]: time="2025-07-15T05:17:54.012351425Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Jul 15 05:17:54.012648 containerd[1743]: time="2025-07-15T05:17:54.012361445Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Jul 15 05:17:54.012648 containerd[1743]: time="2025-07-15T05:17:54.012370172Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Jul 15 05:17:54.012648 containerd[1743]: time="2025-07-15T05:17:54.012377204Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Jul 15 05:17:54.012648 containerd[1743]: time="2025-07-15T05:17:54.012422082Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Jul 15 05:17:54.012648 containerd[1743]: time="2025-07-15T05:17:54.012563154Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Jul 15 05:17:54.012865 containerd[1743]: time="2025-07-15T05:17:54.012581283Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Jul 15 05:17:54.012865 containerd[1743]: time="2025-07-15T05:17:54.012589383Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Jul 15 05:17:54.012865 containerd[1743]: time="2025-07-15T05:17:54.012624942Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Jul 15 05:17:54.012917 containerd[1743]: time="2025-07-15T05:17:54.012875958Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Jul 15 05:17:54.012942 containerd[1743]: time="2025-07-15T05:17:54.012930813Z" level=info msg="metadata content store policy set" policy=shared
Jul 15 05:17:54.031442 containerd[1743]: time="2025-07-15T05:17:54.031411311Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Jul 15 05:17:54.031517 containerd[1743]: time="2025-07-15T05:17:54.031458684Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Jul 15 05:17:54.031517 containerd[1743]: time="2025-07-15T05:17:54.031477308Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Jul 15 05:17:54.031517 containerd[1743]: time="2025-07-15T05:17:54.031489256Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Jul 15 05:17:54.031517 containerd[1743]: time="2025-07-15T05:17:54.031502588Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Jul 15 05:17:54.031517 containerd[1743]: time="2025-07-15T05:17:54.031515744Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Jul 15 05:17:54.031601 containerd[1743]: time="2025-07-15T05:17:54.031529422Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Jul 15 05:17:54.031601 containerd[1743]: time="2025-07-15T05:17:54.031539847Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Jul 15 05:17:54.031601 containerd[1743]: time="2025-07-15T05:17:54.031551028Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Jul 15 05:17:54.031601 containerd[1743]: time="2025-07-15T05:17:54.031560950Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Jul 15 05:17:54.031601 containerd[1743]: time="2025-07-15T05:17:54.031570633Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Jul 15 05:17:54.031601 containerd[1743]: time="2025-07-15T05:17:54.031584634Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Jul 15 05:17:54.031698 containerd[1743]: time="2025-07-15T05:17:54.031687690Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Jul 15 05:17:54.031716 containerd[1743]: time="2025-07-15T05:17:54.031703716Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Jul 15 05:17:54.031732 containerd[1743]: time="2025-07-15T05:17:54.031721537Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content
type=io.containerd.grpc.v1 Jul 15 05:17:54.031751 containerd[1743]: time="2025-07-15T05:17:54.031732586Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Jul 15 05:17:54.031751 containerd[1743]: time="2025-07-15T05:17:54.031744017Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Jul 15 05:17:54.031784 containerd[1743]: time="2025-07-15T05:17:54.031753242Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Jul 15 05:17:54.031784 containerd[1743]: time="2025-07-15T05:17:54.031763304Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Jul 15 05:17:54.031784 containerd[1743]: time="2025-07-15T05:17:54.031772793Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Jul 15 05:17:54.031838 containerd[1743]: time="2025-07-15T05:17:54.031782322Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Jul 15 05:17:54.031838 containerd[1743]: time="2025-07-15T05:17:54.031791279Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Jul 15 05:17:54.031838 containerd[1743]: time="2025-07-15T05:17:54.031800073Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Jul 15 05:17:54.031883 containerd[1743]: time="2025-07-15T05:17:54.031855671Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Jul 15 05:17:54.031883 containerd[1743]: time="2025-07-15T05:17:54.031866524Z" level=info msg="Start snapshots syncer" Jul 15 05:17:54.031919 containerd[1743]: time="2025-07-15T05:17:54.031890759Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Jul 15 05:17:54.032564 containerd[1743]: time="2025-07-15T05:17:54.032140122Z" 
level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Jul 15 05:17:54.032564 containerd[1743]: time="2025-07-15T05:17:54.032188776Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Jul 15 05:17:54.032714 containerd[1743]: 
time="2025-07-15T05:17:54.032251244Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Jul 15 05:17:54.032714 containerd[1743]: time="2025-07-15T05:17:54.032336652Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Jul 15 05:17:54.032714 containerd[1743]: time="2025-07-15T05:17:54.032353772Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Jul 15 05:17:54.032714 containerd[1743]: time="2025-07-15T05:17:54.032363147Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Jul 15 05:17:54.032714 containerd[1743]: time="2025-07-15T05:17:54.032373500Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Jul 15 05:17:54.032714 containerd[1743]: time="2025-07-15T05:17:54.032384448Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Jul 15 05:17:54.032714 containerd[1743]: time="2025-07-15T05:17:54.032395395Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Jul 15 05:17:54.032714 containerd[1743]: time="2025-07-15T05:17:54.032404597Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Jul 15 05:17:54.032714 containerd[1743]: time="2025-07-15T05:17:54.032423618Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Jul 15 05:17:54.032714 containerd[1743]: time="2025-07-15T05:17:54.032432994Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Jul 15 05:17:54.032714 containerd[1743]: time="2025-07-15T05:17:54.032442170Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Jul 15 05:17:54.032714 containerd[1743]: 
time="2025-07-15T05:17:54.032473992Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jul 15 05:17:54.032714 containerd[1743]: time="2025-07-15T05:17:54.032487173Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jul 15 05:17:54.032714 containerd[1743]: time="2025-07-15T05:17:54.032496040Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jul 15 05:17:54.032941 containerd[1743]: time="2025-07-15T05:17:54.032504383Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jul 15 05:17:54.032941 containerd[1743]: time="2025-07-15T05:17:54.032511208Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Jul 15 05:17:54.032941 containerd[1743]: time="2025-07-15T05:17:54.032519522Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Jul 15 05:17:54.032941 containerd[1743]: time="2025-07-15T05:17:54.032529629Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Jul 15 05:17:54.032941 containerd[1743]: time="2025-07-15T05:17:54.032543256Z" level=info msg="runtime interface created" Jul 15 05:17:54.032941 containerd[1743]: time="2025-07-15T05:17:54.032547999Z" level=info msg="created NRI interface" Jul 15 05:17:54.032941 containerd[1743]: time="2025-07-15T05:17:54.032554686Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Jul 15 05:17:54.032941 containerd[1743]: time="2025-07-15T05:17:54.032564660Z" level=info msg="Connect containerd service" Jul 15 05:17:54.032941 containerd[1743]: time="2025-07-15T05:17:54.032585825Z" level=info msg="using experimental NRI integration - 
disable nri plugin to prevent this" Jul 15 05:17:54.033300 containerd[1743]: time="2025-07-15T05:17:54.033283306Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jul 15 05:17:54.568855 kubelet[1846]: E0715 05:17:54.568809 1846 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 15 05:17:54.570707 waagent[1830]: 2025-07-15T05:17:54.570651Z INFO Daemon Daemon Azure Linux Agent Version: 2.12.0.4 Jul 15 05:17:54.571141 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 15 05:17:54.571276 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 15 05:17:54.572237 systemd[1]: kubelet.service: Consumed 838ms CPU time, 262.9M memory peak. 
Jul 15 05:17:54.574076 waagent[1830]: 2025-07-15T05:17:54.574023Z INFO Daemon Daemon OS: flatcar 4396.0.0 Jul 15 05:17:54.575228 waagent[1830]: 2025-07-15T05:17:54.575192Z INFO Daemon Daemon Python: 3.11.13 Jul 15 05:17:54.576389 waagent[1830]: 2025-07-15T05:17:54.576350Z INFO Daemon Daemon Run daemon Jul 15 05:17:54.577729 waagent[1830]: 2025-07-15T05:17:54.577694Z INFO Daemon Daemon No RDMA handler exists for distro='Flatcar Container Linux by Kinvolk' version='4396.0.0' Jul 15 05:17:54.581024 waagent[1830]: 2025-07-15T05:17:54.580989Z INFO Daemon Daemon Using waagent for provisioning Jul 15 05:17:54.582210 waagent[1830]: 2025-07-15T05:17:54.582180Z INFO Daemon Daemon Activate resource disk Jul 15 05:17:54.583306 waagent[1830]: 2025-07-15T05:17:54.583279Z INFO Daemon Daemon Searching gen1 prefix 00000000-0001 or gen2 f8b3781a-1e82-4818-a1c3-63d806ec15bb Jul 15 05:17:54.587110 waagent[1830]: 2025-07-15T05:17:54.587074Z INFO Daemon Daemon Found device: None Jul 15 05:17:54.590063 waagent[1830]: 2025-07-15T05:17:54.590023Z ERROR Daemon Daemon Failed to mount resource disk [ResourceDiskError] unable to detect disk topology Jul 15 05:17:54.591767 waagent[1830]: 2025-07-15T05:17:54.591720Z ERROR Daemon Daemon Event: name=WALinuxAgent, op=ActivateResourceDisk, message=[ResourceDiskError] unable to detect disk topology, duration=0 Jul 15 05:17:54.594397 waagent[1830]: 2025-07-15T05:17:54.594351Z INFO Daemon Daemon Clean protocol and wireserver endpoint Jul 15 05:17:54.597429 waagent[1830]: 2025-07-15T05:17:54.597081Z INFO Daemon Daemon Running default provisioning handler Jul 15 05:17:54.605418 waagent[1830]: 2025-07-15T05:17:54.605273Z INFO Daemon Daemon Unable to get cloud-init enabled status from systemctl: Command '['systemctl', 'is-enabled', 'cloud-init-local.service']' returned non-zero exit status 4. 
Jul 15 05:17:54.610375 waagent[1830]: 2025-07-15T05:17:54.610342Z INFO Daemon Daemon Unable to get cloud-init enabled status from service: [Errno 2] No such file or directory: 'service' Jul 15 05:17:54.612190 waagent[1830]: 2025-07-15T05:17:54.612158Z INFO Daemon Daemon cloud-init is enabled: False Jul 15 05:17:54.615980 waagent[1830]: 2025-07-15T05:17:54.615921Z INFO Daemon Daemon Copying ovf-env.xml Jul 15 05:17:54.712230 waagent[1830]: 2025-07-15T05:17:54.712092Z INFO Daemon Daemon Successfully mounted dvd Jul 15 05:17:54.733296 containerd[1743]: time="2025-07-15T05:17:54.733260080Z" level=info msg="Start subscribing containerd event" Jul 15 05:17:54.734277 containerd[1743]: time="2025-07-15T05:17:54.733318270Z" level=info msg="Start recovering state" Jul 15 05:17:54.734277 containerd[1743]: time="2025-07-15T05:17:54.733415506Z" level=info msg="Start event monitor" Jul 15 05:17:54.734277 containerd[1743]: time="2025-07-15T05:17:54.733426440Z" level=info msg="Start cni network conf syncer for default" Jul 15 05:17:54.734277 containerd[1743]: time="2025-07-15T05:17:54.733437286Z" level=info msg="Start streaming server" Jul 15 05:17:54.734277 containerd[1743]: time="2025-07-15T05:17:54.733451840Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Jul 15 05:17:54.734277 containerd[1743]: time="2025-07-15T05:17:54.733459787Z" level=info msg="runtime interface starting up..." Jul 15 05:17:54.734277 containerd[1743]: time="2025-07-15T05:17:54.733465507Z" level=info msg="starting plugins..." Jul 15 05:17:54.734277 containerd[1743]: time="2025-07-15T05:17:54.733477011Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Jul 15 05:17:54.734277 containerd[1743]: time="2025-07-15T05:17:54.733870334Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jul 15 05:17:54.734277 containerd[1743]: time="2025-07-15T05:17:54.733909357Z" level=info msg=serving... 
address=/run/containerd/containerd.sock Jul 15 05:17:54.734038 systemd[1]: Started containerd.service - containerd container runtime. Jul 15 05:17:54.737010 containerd[1743]: time="2025-07-15T05:17:54.736982081Z" level=info msg="containerd successfully booted in 0.732434s" Jul 15 05:17:54.738045 systemd[1]: mnt-cdrom-secure.mount: Deactivated successfully. Jul 15 05:17:54.738698 systemd[1]: Reached target multi-user.target - Multi-User System. Jul 15 05:17:54.742586 systemd[1]: Startup finished in 2.978s (kernel) + 16.763s (initrd) + 10.286s (userspace) = 30.028s. Jul 15 05:17:54.743648 waagent[1830]: 2025-07-15T05:17:54.743569Z INFO Daemon Daemon Detect protocol endpoint Jul 15 05:17:54.745532 waagent[1830]: 2025-07-15T05:17:54.745036Z INFO Daemon Daemon Clean protocol and wireserver endpoint Jul 15 05:17:54.746458 waagent[1830]: 2025-07-15T05:17:54.746417Z INFO Daemon Daemon WireServer endpoint is not found. Rerun dhcp handler Jul 15 05:17:54.750504 waagent[1830]: 2025-07-15T05:17:54.750133Z INFO Daemon Daemon Test for route to 168.63.129.16 Jul 15 05:17:54.751480 waagent[1830]: 2025-07-15T05:17:54.751448Z INFO Daemon Daemon Route to 168.63.129.16 exists Jul 15 05:17:54.754681 waagent[1830]: 2025-07-15T05:17:54.754537Z INFO Daemon Daemon Wire server endpoint:168.63.129.16 Jul 15 05:17:54.766481 waagent[1830]: 2025-07-15T05:17:54.766451Z INFO Daemon Daemon Fabric preferred wire protocol version:2015-04-05 Jul 15 05:17:54.768071 waagent[1830]: 2025-07-15T05:17:54.768054Z INFO Daemon Daemon Wire protocol version:2012-11-30 Jul 15 05:17:54.771013 waagent[1830]: 2025-07-15T05:17:54.770989Z INFO Daemon Daemon Server preferred version:2015-04-05 Jul 15 05:17:54.876016 waagent[1830]: 2025-07-15T05:17:54.875918Z INFO Daemon Daemon Initializing goal state during protocol detection Jul 15 05:17:54.876619 waagent[1830]: 2025-07-15T05:17:54.876108Z INFO Daemon Daemon Forcing an update of the goal state. 
Jul 15 05:17:54.880399 waagent[1830]: 2025-07-15T05:17:54.880370Z INFO Daemon Fetched a new incarnation for the WireServer goal state [incarnation 1] Jul 15 05:17:54.910418 waagent[1830]: 2025-07-15T05:17:54.910383Z INFO Daemon Daemon HostGAPlugin version: 1.0.8.175 Jul 15 05:17:54.912811 waagent[1830]: 2025-07-15T05:17:54.910866Z INFO Daemon Jul 15 05:17:54.912811 waagent[1830]: 2025-07-15T05:17:54.911114Z INFO Daemon Fetched new vmSettings [HostGAPlugin correlation ID: cd050bec-a863-4e1e-8b5a-8f0d0516b00a eTag: 14110770053555717338 source: Fabric] Jul 15 05:17:54.912811 waagent[1830]: 2025-07-15T05:17:54.911593Z INFO Daemon The vmSettings originated via Fabric; will ignore them. Jul 15 05:17:54.912811 waagent[1830]: 2025-07-15T05:17:54.911905Z INFO Daemon Jul 15 05:17:54.912811 waagent[1830]: 2025-07-15T05:17:54.912065Z INFO Daemon Fetching full goal state from the WireServer [incarnation 1] Jul 15 05:17:54.926331 waagent[1830]: 2025-07-15T05:17:54.926306Z INFO Daemon Daemon Downloading artifacts profile blob Jul 15 05:17:54.998552 waagent[1830]: 2025-07-15T05:17:54.998505Z INFO Daemon Downloaded certificate {'thumbprint': '143EF3980F616EA0B645B5EE30F4A8DF0D43C0B8', 'hasPrivateKey': True} Jul 15 05:17:55.001650 waagent[1830]: 2025-07-15T05:17:54.998934Z INFO Daemon Fetch goal state completed Jul 15 05:17:55.006778 waagent[1830]: 2025-07-15T05:17:55.006742Z INFO Daemon Daemon Starting provisioning Jul 15 05:17:55.009206 waagent[1830]: 2025-07-15T05:17:55.006898Z INFO Daemon Daemon Handle ovf-env.xml. 
Jul 15 05:17:55.009206 waagent[1830]: 2025-07-15T05:17:55.007154Z INFO Daemon Daemon Set hostname [ci-4396.0.0-n-1e5a06c7e3] Jul 15 05:17:55.022625 waagent[1830]: 2025-07-15T05:17:55.022587Z INFO Daemon Daemon Publish hostname [ci-4396.0.0-n-1e5a06c7e3] Jul 15 05:17:55.022973 waagent[1830]: 2025-07-15T05:17:55.022836Z INFO Daemon Daemon Examine /proc/net/route for primary interface Jul 15 05:17:55.023769 waagent[1830]: 2025-07-15T05:17:55.023096Z INFO Daemon Daemon Primary interface is [eth0] Jul 15 05:17:55.030003 systemd-networkd[1366]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jul 15 05:17:55.030016 systemd-networkd[1366]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jul 15 05:17:55.030037 systemd-networkd[1366]: eth0: DHCP lease lost Jul 15 05:17:55.030841 waagent[1830]: 2025-07-15T05:17:55.030797Z INFO Daemon Daemon Create user account if not exists Jul 15 05:17:55.031125 waagent[1830]: 2025-07-15T05:17:55.031096Z INFO Daemon Daemon User core already exists, skip useradd Jul 15 05:17:55.031211 waagent[1830]: 2025-07-15T05:17:55.031193Z INFO Daemon Daemon Configure sudoer Jul 15 05:17:55.055982 systemd-networkd[1366]: eth0: DHCPv4 address 10.200.8.4/24, gateway 10.200.8.1 acquired from 168.63.129.16 Jul 15 05:17:55.455924 waagent[1830]: 2025-07-15T05:17:55.455828Z INFO Daemon Daemon Configure sshd Jul 15 05:17:55.463723 waagent[1830]: 2025-07-15T05:17:55.463669Z INFO Daemon Daemon Added a configuration snippet disabling SSH password-based authentication methods. It also configures SSH client probing to keep connections alive. Jul 15 05:17:55.464867 waagent[1830]: 2025-07-15T05:17:55.464380Z INFO Daemon Daemon Deploy ssh public key. 
Jul 15 05:17:55.470462 login[1833]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Jul 15 05:17:55.470480 login[1832]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Jul 15 05:17:55.480774 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jul 15 05:17:55.482317 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jul 15 05:17:55.489498 systemd-logind[1717]: New session 1 of user core. Jul 15 05:17:55.493000 systemd-logind[1717]: New session 2 of user core. Jul 15 05:17:55.498241 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jul 15 05:17:55.500140 systemd[1]: Starting user@500.service - User Manager for UID 500... Jul 15 05:17:55.525500 (systemd)[1893]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Jul 15 05:17:55.527108 systemd-logind[1717]: New session c1 of user core. Jul 15 05:17:55.626345 systemd[1893]: Queued start job for default target default.target. Jul 15 05:17:55.635622 systemd[1893]: Created slice app.slice - User Application Slice. Jul 15 05:17:55.635649 systemd[1893]: Reached target paths.target - Paths. Jul 15 05:17:55.635678 systemd[1893]: Reached target timers.target - Timers. Jul 15 05:17:55.636433 systemd[1893]: Starting dbus.socket - D-Bus User Message Bus Socket... Jul 15 05:17:55.643089 systemd[1893]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jul 15 05:17:55.643132 systemd[1893]: Reached target sockets.target - Sockets. Jul 15 05:17:55.643164 systemd[1893]: Reached target basic.target - Basic System. Jul 15 05:17:55.643224 systemd[1893]: Reached target default.target - Main User Target. Jul 15 05:17:55.643243 systemd[1893]: Startup finished in 112ms. Jul 15 05:17:55.643271 systemd[1]: Started user@500.service - User Manager for UID 500. Jul 15 05:17:55.644622 systemd[1]: Started session-1.scope - Session 1 of User core. 
Jul 15 05:17:55.645407 systemd[1]: Started session-2.scope - Session 2 of User core. Jul 15 05:17:56.550584 waagent[1830]: 2025-07-15T05:17:56.550516Z INFO Daemon Daemon Provisioning complete Jul 15 05:17:56.569184 waagent[1830]: 2025-07-15T05:17:56.569152Z INFO Daemon Daemon RDMA capabilities are not enabled, skipping Jul 15 05:17:56.570381 waagent[1830]: 2025-07-15T05:17:56.570350Z INFO Daemon Daemon End of log to /dev/console. The agent will now check for updates and then will process extensions. Jul 15 05:17:56.572205 waagent[1830]: 2025-07-15T05:17:56.572138Z INFO Daemon Daemon Installed Agent WALinuxAgent-2.12.0.4 is the most current agent Jul 15 05:17:56.663690 waagent[1926]: 2025-07-15T05:17:56.663624Z INFO ExtHandler ExtHandler Azure Linux Agent (Goal State Agent version 2.12.0.4) Jul 15 05:17:56.663950 waagent[1926]: 2025-07-15T05:17:56.663715Z INFO ExtHandler ExtHandler OS: flatcar 4396.0.0 Jul 15 05:17:56.663950 waagent[1926]: 2025-07-15T05:17:56.663751Z INFO ExtHandler ExtHandler Python: 3.11.13 Jul 15 05:17:56.663950 waagent[1926]: 2025-07-15T05:17:56.663787Z INFO ExtHandler ExtHandler CPU Arch: x86_64 Jul 15 05:17:56.686568 waagent[1926]: 2025-07-15T05:17:56.686523Z INFO ExtHandler ExtHandler Distro: flatcar-4396.0.0; OSUtil: FlatcarUtil; AgentService: waagent; Python: 3.11.13; Arch: x86_64; systemd: True; LISDrivers: Absent; logrotate: logrotate 3.22.0; Jul 15 05:17:56.686695 waagent[1926]: 2025-07-15T05:17:56.686672Z INFO ExtHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Jul 15 05:17:56.686756 waagent[1926]: 2025-07-15T05:17:56.686721Z INFO ExtHandler ExtHandler Wire server endpoint:168.63.129.16 Jul 15 05:17:56.690972 waagent[1926]: 2025-07-15T05:17:56.690918Z INFO ExtHandler Fetched a new incarnation for the WireServer goal state [incarnation 1] Jul 15 05:17:56.695472 waagent[1926]: 2025-07-15T05:17:56.695442Z INFO ExtHandler ExtHandler HostGAPlugin version: 1.0.8.175 Jul 15 05:17:56.695774 waagent[1926]: 
2025-07-15T05:17:56.695746Z INFO ExtHandler Jul 15 05:17:56.695815 waagent[1926]: 2025-07-15T05:17:56.695792Z INFO ExtHandler Fetched new vmSettings [HostGAPlugin correlation ID: 85e7a048-c90a-4857-b916-ab927c4c35da eTag: 14110770053555717338 source: Fabric] Jul 15 05:17:56.696006 waagent[1926]: 2025-07-15T05:17:56.695975Z INFO ExtHandler The vmSettings originated via Fabric; will ignore them. Jul 15 05:17:56.696292 waagent[1926]: 2025-07-15T05:17:56.696268Z INFO ExtHandler Jul 15 05:17:56.696334 waagent[1926]: 2025-07-15T05:17:56.696303Z INFO ExtHandler Fetching full goal state from the WireServer [incarnation 1] Jul 15 05:17:56.699968 waagent[1926]: 2025-07-15T05:17:56.699939Z INFO ExtHandler ExtHandler Downloading artifacts profile blob Jul 15 05:17:56.752778 waagent[1926]: 2025-07-15T05:17:56.752731Z INFO ExtHandler Downloaded certificate {'thumbprint': '143EF3980F616EA0B645B5EE30F4A8DF0D43C0B8', 'hasPrivateKey': True} Jul 15 05:17:56.753146 waagent[1926]: 2025-07-15T05:17:56.753119Z INFO ExtHandler Fetch goal state completed Jul 15 05:17:56.769350 waagent[1926]: 2025-07-15T05:17:56.769310Z INFO ExtHandler ExtHandler OpenSSL version: OpenSSL 3.4.1 11 Feb 2025 (Library: OpenSSL 3.4.1 11 Feb 2025) Jul 15 05:17:56.772925 waagent[1926]: 2025-07-15T05:17:56.772873Z INFO ExtHandler ExtHandler WALinuxAgent-2.12.0.4 running as process 1926 Jul 15 05:17:56.773026 waagent[1926]: 2025-07-15T05:17:56.772998Z INFO ExtHandler ExtHandler ******** AutoUpdate.Enabled is set to False, not processing the operation ******** Jul 15 05:17:56.773228 waagent[1926]: 2025-07-15T05:17:56.773209Z INFO ExtHandler ExtHandler ******** AutoUpdate.UpdateToLatestVersion is set to False, not processing the operation ******** Jul 15 05:17:56.774087 waagent[1926]: 2025-07-15T05:17:56.774056Z INFO ExtHandler ExtHandler [CGI] Cgroup monitoring is not supported on ['flatcar', '4396.0.0', '', 'Flatcar Container Linux by Kinvolk'] Jul 15 05:17:56.774331 waagent[1926]: 2025-07-15T05:17:56.774309Z INFO 
ExtHandler ExtHandler [CGI] Agent will reset the quotas in case distro: ['flatcar', '4396.0.0', '', 'Flatcar Container Linux by Kinvolk'] went from supported to unsupported Jul 15 05:17:56.774428 waagent[1926]: 2025-07-15T05:17:56.774408Z INFO ExtHandler ExtHandler [CGI] Agent cgroups enabled: False Jul 15 05:17:56.774743 waagent[1926]: 2025-07-15T05:17:56.774721Z INFO ExtHandler ExtHandler Starting setup for Persistent firewall rules Jul 15 05:17:56.792414 waagent[1926]: 2025-07-15T05:17:56.792392Z INFO ExtHandler ExtHandler Firewalld service not running/unavailable, trying to set up waagent-network-setup.service Jul 15 05:17:56.792524 waagent[1926]: 2025-07-15T05:17:56.792504Z INFO ExtHandler ExtHandler Successfully updated the Binary file /var/lib/waagent/waagent-network-setup.py for firewall setup Jul 15 05:17:56.797399 waagent[1926]: 2025-07-15T05:17:56.797091Z INFO ExtHandler ExtHandler Service: waagent-network-setup.service not enabled. Adding it now Jul 15 05:17:56.801634 systemd[1]: Reload requested from client PID 1941 ('systemctl') (unit waagent.service)... Jul 15 05:17:56.801647 systemd[1]: Reloading... Jul 15 05:17:56.866970 zram_generator::config[1976]: No configuration found. Jul 15 05:17:56.942662 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jul 15 05:17:57.024099 systemd[1]: Reloading finished in 222 ms. Jul 15 05:17:57.049553 waagent[1926]: 2025-07-15T05:17:57.049491Z INFO ExtHandler ExtHandler Successfully added and enabled the waagent-network-setup.service Jul 15 05:17:57.049614 waagent[1926]: 2025-07-15T05:17:57.049599Z INFO ExtHandler ExtHandler Persistent firewall rules setup successfully Jul 15 05:17:57.867654 waagent[1926]: 2025-07-15T05:17:57.867578Z INFO ExtHandler ExtHandler DROP rule is not available which implies no firewall rules are set yet. 
Environment thread will set it up. Jul 15 05:17:57.867951 waagent[1926]: 2025-07-15T05:17:57.867910Z INFO ExtHandler ExtHandler Checking if log collection is allowed at this time [False]. All three conditions must be met: 1. configuration enabled [True], 2. cgroups v1 enabled [False] OR cgroups v2 is in use and v2 resource limiting configuration enabled [False], 3. python supported: [True] Jul 15 05:17:57.868637 waagent[1926]: 2025-07-15T05:17:57.868520Z INFO ExtHandler ExtHandler Starting env monitor service. Jul 15 05:17:57.868872 waagent[1926]: 2025-07-15T05:17:57.868836Z INFO ExtHandler ExtHandler Start SendTelemetryHandler service. Jul 15 05:17:57.869191 waagent[1926]: 2025-07-15T05:17:57.869133Z INFO SendTelemetryHandler ExtHandler Successfully started the SendTelemetryHandler thread Jul 15 05:17:57.869232 waagent[1926]: 2025-07-15T05:17:57.869198Z INFO ExtHandler ExtHandler Start Extension Telemetry service. Jul 15 05:17:57.869254 waagent[1926]: 2025-07-15T05:17:57.869238Z INFO EnvHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Jul 15 05:17:57.869299 waagent[1926]: 2025-07-15T05:17:57.869282Z INFO EnvHandler ExtHandler Wire server endpoint:168.63.129.16 Jul 15 05:17:57.869418 waagent[1926]: 2025-07-15T05:17:57.869399Z INFO EnvHandler ExtHandler Configure routes Jul 15 05:17:57.869462 waagent[1926]: 2025-07-15T05:17:57.869448Z INFO EnvHandler ExtHandler Gateway:None Jul 15 05:17:57.869589 waagent[1926]: 2025-07-15T05:17:57.869488Z INFO EnvHandler ExtHandler Routes:None Jul 15 05:17:57.869764 waagent[1926]: 2025-07-15T05:17:57.869746Z INFO MonitorHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Jul 15 05:17:57.870037 waagent[1926]: 2025-07-15T05:17:57.870013Z INFO MonitorHandler ExtHandler Wire server endpoint:168.63.129.16 Jul 15 05:17:57.870113 waagent[1926]: 2025-07-15T05:17:57.870079Z INFO TelemetryEventsCollector ExtHandler Extension Telemetry pipeline enabled: True Jul 15 05:17:57.870184 waagent[1926]: 
2025-07-15T05:17:57.870155Z INFO TelemetryEventsCollector ExtHandler Successfully started the TelemetryEventsCollector thread Jul 15 05:17:57.870453 waagent[1926]: 2025-07-15T05:17:57.870434Z INFO MonitorHandler ExtHandler Monitor.NetworkConfigurationChanges is disabled. Jul 15 05:17:57.870524 waagent[1926]: 2025-07-15T05:17:57.870492Z INFO ExtHandler ExtHandler Goal State Period: 6 sec. This indicates how often the agent checks for new goal states and reports status. Jul 15 05:17:57.871010 waagent[1926]: 2025-07-15T05:17:57.870975Z INFO MonitorHandler ExtHandler Routing table from /proc/net/route: Jul 15 05:17:57.871010 waagent[1926]: Iface Destination Gateway Flags RefCnt Use Metric Mask MTU Window IRTT Jul 15 05:17:57.871010 waagent[1926]: eth0 00000000 0108C80A 0003 0 0 1024 00000000 0 0 0 Jul 15 05:17:57.871010 waagent[1926]: eth0 0008C80A 00000000 0001 0 0 1024 00FFFFFF 0 0 0 Jul 15 05:17:57.871010 waagent[1926]: eth0 0108C80A 00000000 0005 0 0 1024 FFFFFFFF 0 0 0 Jul 15 05:17:57.871010 waagent[1926]: eth0 10813FA8 0108C80A 0007 0 0 1024 FFFFFFFF 0 0 0 Jul 15 05:17:57.871010 waagent[1926]: eth0 FEA9FEA9 0108C80A 0007 0 0 1024 FFFFFFFF 0 0 0 Jul 15 05:17:57.880982 waagent[1926]: 2025-07-15T05:17:57.880929Z INFO ExtHandler ExtHandler Jul 15 05:17:57.881052 waagent[1926]: 2025-07-15T05:17:57.881010Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState started [incarnation_1 channel: WireServer source: Fabric activity: d103bdcd-e5c8-4fc6-b945-f2ff161313e3 correlation cb2e4acf-0689-4173-8673-459d20473579 created: 2025-07-15T05:16:53.531593Z] Jul 15 05:17:57.881265 waagent[1926]: 2025-07-15T05:17:57.881242Z INFO ExtHandler ExtHandler No extension handlers found, not processing anything. 
Jul 15 05:17:57.881605 waagent[1926]: 2025-07-15T05:17:57.881583Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState completed [incarnation_1 0 ms] Jul 15 05:17:57.914390 waagent[1926]: 2025-07-15T05:17:57.914354Z WARNING ExtHandler ExtHandler Failed to get firewall packets: 'iptables -w -t security -L OUTPUT --zero OUTPUT -nxv' failed: 2 (iptables v1.8.11 (nf_tables): Illegal option `--numeric' with this command Jul 15 05:17:57.914390 waagent[1926]: Try `iptables -h' or 'iptables --help' for more information.) Jul 15 05:17:57.914800 waagent[1926]: 2025-07-15T05:17:57.914775Z INFO ExtHandler ExtHandler [HEARTBEAT] Agent WALinuxAgent-2.12.0.4 is running as the goal state agent [DEBUG HeartbeatCounter: 0;HeartbeatId: 5FA6BBAF-6210-4C3C-A2D7-88257B698CEB;DroppedPackets: -1;UpdateGSErrors: 0;AutoUpdate: 0;UpdateMode: SelfUpdate;] Jul 15 05:17:57.941058 waagent[1926]: 2025-07-15T05:17:57.941021Z INFO MonitorHandler ExtHandler Network interfaces: Jul 15 05:17:57.941058 waagent[1926]: Executing ['ip', '-a', '-o', 'link']: Jul 15 05:17:57.941058 waagent[1926]: 1: lo: mtu 65536 qdisc noqueue state UNKNOWN mode DEFAULT group default qlen 1000\ link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 Jul 15 05:17:57.941058 waagent[1926]: 2: eth0: mtu 1500 qdisc mq state UP mode DEFAULT group default qlen 1000\ link/ether 7c:ed:8d:49:6b:19 brd ff:ff:ff:ff:ff:ff\ alias Network Device Jul 15 05:17:57.941058 waagent[1926]: 3: enP30832s1: mtu 1500 qdisc mq master eth0 state UP mode DEFAULT group default qlen 1000\ link/ether 7c:ed:8d:49:6b:19 brd ff:ff:ff:ff:ff:ff\ altname enP30832p0s0 Jul 15 05:17:57.941058 waagent[1926]: Executing ['ip', '-4', '-a', '-o', 'address']: Jul 15 05:17:57.941058 waagent[1926]: 1: lo inet 127.0.0.1/8 scope host lo\ valid_lft forever preferred_lft forever Jul 15 05:17:57.941058 waagent[1926]: 2: eth0 inet 10.200.8.4/24 metric 1024 brd 10.200.8.255 scope global eth0\ valid_lft forever preferred_lft forever Jul 15 05:17:57.941058 waagent[1926]: Executing 
['ip', '-6', '-a', '-o', 'address']: Jul 15 05:17:57.941058 waagent[1926]: 1: lo inet6 ::1/128 scope host noprefixroute \ valid_lft forever preferred_lft forever Jul 15 05:17:57.941058 waagent[1926]: 2: eth0 inet6 fe80::7eed:8dff:fe49:6b19/64 scope link proto kernel_ll \ valid_lft forever preferred_lft forever Jul 15 05:17:57.941058 waagent[1926]: 3: enP30832s1 inet6 fe80::7eed:8dff:fe49:6b19/64 scope link proto kernel_ll \ valid_lft forever preferred_lft forever Jul 15 05:17:57.979837 waagent[1926]: 2025-07-15T05:17:57.979789Z INFO EnvHandler ExtHandler Created firewall rules for the Azure Fabric: Jul 15 05:17:57.979837 waagent[1926]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes) Jul 15 05:17:57.979837 waagent[1926]: pkts bytes target prot opt in out source destination Jul 15 05:17:57.979837 waagent[1926]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes) Jul 15 05:17:57.979837 waagent[1926]: pkts bytes target prot opt in out source destination Jul 15 05:17:57.979837 waagent[1926]: Chain OUTPUT (policy ACCEPT 1 packets, 60 bytes) Jul 15 05:17:57.979837 waagent[1926]: pkts bytes target prot opt in out source destination Jul 15 05:17:57.979837 waagent[1926]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53 Jul 15 05:17:57.979837 waagent[1926]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0 Jul 15 05:17:57.979837 waagent[1926]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW Jul 15 05:17:57.982226 waagent[1926]: 2025-07-15T05:17:57.982184Z INFO EnvHandler ExtHandler Current Firewall rules: Jul 15 05:17:57.982226 waagent[1926]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes) Jul 15 05:17:57.982226 waagent[1926]: pkts bytes target prot opt in out source destination Jul 15 05:17:57.982226 waagent[1926]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes) Jul 15 05:17:57.982226 waagent[1926]: pkts bytes target prot opt in out source destination Jul 15 05:17:57.982226 waagent[1926]: Chain OUTPUT (policy ACCEPT 1 packets, 60 bytes) Jul 
15 05:17:57.982226 waagent[1926]: pkts bytes target prot opt in out source destination Jul 15 05:17:57.982226 waagent[1926]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53 Jul 15 05:17:57.982226 waagent[1926]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0 Jul 15 05:17:57.982226 waagent[1926]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW Jul 15 05:18:04.587192 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jul 15 05:18:04.589182 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 05:18:05.181839 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 05:18:05.187140 (kubelet)[2077]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 15 05:18:05.215780 kubelet[2077]: E0715 05:18:05.215745 2077 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 15 05:18:05.218392 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 15 05:18:05.218509 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 15 05:18:05.218835 systemd[1]: kubelet.service: Consumed 117ms CPU time, 110.5M memory peak. Jul 15 05:18:13.338356 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Jul 15 05:18:13.339348 systemd[1]: Started sshd@0-10.200.8.4:22-10.200.16.10:52730.service - OpenSSH per-connection server daemon (10.200.16.10:52730). 
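The earlier WARNING ("Illegal option `--numeric' with this command") comes from combining a list (`-L`) and a zero (`--zero`) operation in a single iptables invocation, which the nf_tables-backed iptables v1.8.11 rejects when `-n` is present. A hedged sketch of one workaround, building the list and zero steps as two separate invocations (command construction only; this is illustrative, not the agent's actual fix):

```python
from typing import List, Tuple

def split_firewall_commands(table: str = "security",
                            chain: str = "OUTPUT") -> Tuple[List[str], List[str]]:
    """Build separate list and zero invocations instead of the combined form
    'iptables -w -t security -L OUTPUT --zero OUTPUT -nxv' that iptables-nft
    rejects. Returns (list_cmd, zero_cmd) ready for subprocess.run()."""
    list_cmd = ["iptables", "-w", "-t", table, "-L", chain, "-n", "-x", "-v"]
    zero_cmd = ["iptables", "-w", "-t", table, "-Z", chain]
    return list_cmd, zero_cmd

list_cmd, zero_cmd = split_firewall_commands()
```

The numeric/verbose flags stay only on the list step, and the zero step touches nothing but the chain counters.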
Jul 15 05:18:14.071897 sshd[2085]: Accepted publickey for core from 10.200.16.10 port 52730 ssh2: RSA SHA256:WldnkDLUmxPu3hAw0Hy9/rmNIFGGN2Id/cNmLPJWvxw Jul 15 05:18:14.073252 sshd-session[2085]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:18:14.077917 systemd-logind[1717]: New session 3 of user core. Jul 15 05:18:14.087096 systemd[1]: Started session-3.scope - Session 3 of User core. Jul 15 05:18:14.622133 systemd[1]: Started sshd@1-10.200.8.4:22-10.200.16.10:52734.service - OpenSSH per-connection server daemon (10.200.16.10:52734). Jul 15 05:18:15.248075 sshd[2091]: Accepted publickey for core from 10.200.16.10 port 52734 ssh2: RSA SHA256:WldnkDLUmxPu3hAw0Hy9/rmNIFGGN2Id/cNmLPJWvxw Jul 15 05:18:15.249431 sshd-session[2091]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:18:15.250619 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jul 15 05:18:15.252871 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 05:18:15.255680 systemd-logind[1717]: New session 4 of user core. Jul 15 05:18:15.263829 systemd[1]: Started session-4.scope - Session 4 of User core. Jul 15 05:18:15.688767 sshd[2097]: Connection closed by 10.200.16.10 port 52734 Jul 15 05:18:15.689378 sshd-session[2091]: pam_unix(sshd:session): session closed for user core Jul 15 05:18:15.693114 systemd[1]: sshd@1-10.200.8.4:22-10.200.16.10:52734.service: Deactivated successfully. Jul 15 05:18:15.694532 systemd[1]: session-4.scope: Deactivated successfully. Jul 15 05:18:15.695162 systemd-logind[1717]: Session 4 logged out. Waiting for processes to exit. Jul 15 05:18:15.696089 systemd-logind[1717]: Removed session 4. Jul 15 05:18:15.731404 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jul 15 05:18:15.742167 (kubelet)[2107]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 15 05:18:15.783876 kubelet[2107]: E0715 05:18:15.783841 2107 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 15 05:18:15.785240 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 15 05:18:15.785354 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 15 05:18:15.785613 systemd[1]: kubelet.service: Consumed 120ms CPU time, 111.7M memory peak. Jul 15 05:18:15.802042 systemd[1]: Started sshd@2-10.200.8.4:22-10.200.16.10:52740.service - OpenSSH per-connection server daemon (10.200.16.10:52740). Jul 15 05:18:16.431436 sshd[2116]: Accepted publickey for core from 10.200.16.10 port 52740 ssh2: RSA SHA256:WldnkDLUmxPu3hAw0Hy9/rmNIFGGN2Id/cNmLPJWvxw Jul 15 05:18:16.432796 sshd-session[2116]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:18:16.437328 systemd-logind[1717]: New session 5 of user core. Jul 15 05:18:16.443101 systemd[1]: Started session-5.scope - Session 5 of User core. Jul 15 05:18:16.577183 chronyd[1750]: Selected source PHC0 Jul 15 05:18:16.869559 sshd[2119]: Connection closed by 10.200.16.10 port 52740 Jul 15 05:18:16.870151 sshd-session[2116]: pam_unix(sshd:session): session closed for user core Jul 15 05:18:16.873794 systemd[1]: sshd@2-10.200.8.4:22-10.200.16.10:52740.service: Deactivated successfully. Jul 15 05:18:16.875201 systemd[1]: session-5.scope: Deactivated successfully. Jul 15 05:18:16.875779 systemd-logind[1717]: Session 5 logged out. Waiting for processes to exit. 
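The repeated kubelet failures above are emitted in klog's header format: a severity letter, MMDD date, wall-clock time, PID, then source file and line before the message. A quick sketch of picking that header apart (the regex and field names are my own):

```python
import re

# klog header: <sev><MMDD> <HH:MM:SS.us> <pid> <file:line>]
KLOG_HEADER = re.compile(
    r"^(?P<sev>[IWEF])(?P<month>\d{2})(?P<day>\d{2}) "
    r"(?P<time>\d{2}:\d{2}:\d{2}\.\d+)\s+(?P<pid>\d+)\s+(?P<loc>[\w.]+:\d+)\]"
)

line = 'E0715 05:18:15.783841 2107 run.go:72] "command failed"'
m = KLOG_HEADER.match(line)
print(m.group("sev"), m.group("pid"), m.group("loc"))  # → E 2107 run.go:72
```

Here the `E` severity and `run.go:72` location identify the fatal startup path; the message itself explains the cause (no /var/lib/kubelet/config.yaml yet, which is expected before the node is bootstrapped).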
Jul 15 05:18:16.876795 systemd-logind[1717]: Removed session 5. Jul 15 05:18:16.980011 systemd[1]: Started sshd@3-10.200.8.4:22-10.200.16.10:52756.service - OpenSSH per-connection server daemon (10.200.16.10:52756). Jul 15 05:18:17.608818 sshd[2125]: Accepted publickey for core from 10.200.16.10 port 52756 ssh2: RSA SHA256:WldnkDLUmxPu3hAw0Hy9/rmNIFGGN2Id/cNmLPJWvxw Jul 15 05:18:17.610127 sshd-session[2125]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:18:17.614477 systemd-logind[1717]: New session 6 of user core. Jul 15 05:18:17.620100 systemd[1]: Started session-6.scope - Session 6 of User core. Jul 15 05:18:18.049824 sshd[2128]: Connection closed by 10.200.16.10 port 52756 Jul 15 05:18:18.050396 sshd-session[2125]: pam_unix(sshd:session): session closed for user core Jul 15 05:18:18.054366 systemd[1]: sshd@3-10.200.8.4:22-10.200.16.10:52756.service: Deactivated successfully. Jul 15 05:18:18.055747 systemd[1]: session-6.scope: Deactivated successfully. Jul 15 05:18:18.056380 systemd-logind[1717]: Session 6 logged out. Waiting for processes to exit. Jul 15 05:18:18.057504 systemd-logind[1717]: Removed session 6. Jul 15 05:18:18.168786 systemd[1]: Started sshd@4-10.200.8.4:22-10.200.16.10:52772.service - OpenSSH per-connection server daemon (10.200.16.10:52772). Jul 15 05:18:18.795670 sshd[2134]: Accepted publickey for core from 10.200.16.10 port 52772 ssh2: RSA SHA256:WldnkDLUmxPu3hAw0Hy9/rmNIFGGN2Id/cNmLPJWvxw Jul 15 05:18:18.797042 sshd-session[2134]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:18:18.801678 systemd-logind[1717]: New session 7 of user core. Jul 15 05:18:18.811093 systemd[1]: Started session-7.scope - Session 7 of User core. 
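Each SSH session in this stretch opens with an sshd "Accepted publickey" line carrying the user, peer address, port, key type, and key fingerprint. A sketch of extracting those fields (the regex and group names are my own):

```python
import re

ACCEPTED = re.compile(
    r"Accepted publickey for (?P<user>\S+) from (?P<ip>\S+) port (?P<port>\d+) "
    r"ssh2: (?P<keytype>\S+) (?P<fingerprint>\S+)"
)

# A line copied from the log above (fingerprint shortened only in this comment).
line = ("Accepted publickey for core from 10.200.16.10 port 52756 "
        "ssh2: RSA SHA256:WldnkDLUmxPu3hAw0Hy9/rmNIFGGN2Id/cNmLPJWvxw")
m = ACCEPTED.search(line)
print(m.group("user"), m.group("ip"), m.group("keytype"))  # → core 10.200.16.10 RSA
```

Grouping sessions by the fingerprint field would show that all of these logins use the same key, consistent with one client cycling through sessions 3 through 9.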
Jul 15 05:18:19.237577 sudo[2138]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jul 15 05:18:19.237768 sudo[2138]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 15 05:18:19.257593 sudo[2138]: pam_unix(sudo:session): session closed for user root Jul 15 05:18:19.357523 sshd[2137]: Connection closed by 10.200.16.10 port 52772 Jul 15 05:18:19.358289 sshd-session[2134]: pam_unix(sshd:session): session closed for user core Jul 15 05:18:19.362151 systemd[1]: sshd@4-10.200.8.4:22-10.200.16.10:52772.service: Deactivated successfully. Jul 15 05:18:19.363548 systemd[1]: session-7.scope: Deactivated successfully. Jul 15 05:18:19.364269 systemd-logind[1717]: Session 7 logged out. Waiting for processes to exit. Jul 15 05:18:19.365206 systemd-logind[1717]: Removed session 7. Jul 15 05:18:19.483380 systemd[1]: Started sshd@5-10.200.8.4:22-10.200.16.10:52780.service - OpenSSH per-connection server daemon (10.200.16.10:52780). Jul 15 05:18:20.113107 sshd[2144]: Accepted publickey for core from 10.200.16.10 port 52780 ssh2: RSA SHA256:WldnkDLUmxPu3hAw0Hy9/rmNIFGGN2Id/cNmLPJWvxw Jul 15 05:18:20.114487 sshd-session[2144]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:18:20.118719 systemd-logind[1717]: New session 8 of user core. Jul 15 05:18:20.125079 systemd[1]: Started session-8.scope - Session 8 of User core. 
Jul 15 05:18:20.455143 sudo[2149]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jul 15 05:18:20.455477 sudo[2149]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 15 05:18:20.465595 sudo[2149]: pam_unix(sudo:session): session closed for user root Jul 15 05:18:20.469181 sudo[2148]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jul 15 05:18:20.469368 sudo[2148]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 15 05:18:20.476257 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jul 15 05:18:20.502904 augenrules[2171]: No rules Jul 15 05:18:20.503737 systemd[1]: audit-rules.service: Deactivated successfully. Jul 15 05:18:20.503927 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jul 15 05:18:20.504802 sudo[2148]: pam_unix(sudo:session): session closed for user root Jul 15 05:18:20.604373 sshd[2147]: Connection closed by 10.200.16.10 port 52780 Jul 15 05:18:20.604847 sshd-session[2144]: pam_unix(sshd:session): session closed for user core Jul 15 05:18:20.608137 systemd[1]: sshd@5-10.200.8.4:22-10.200.16.10:52780.service: Deactivated successfully. Jul 15 05:18:20.609382 systemd[1]: session-8.scope: Deactivated successfully. Jul 15 05:18:20.610016 systemd-logind[1717]: Session 8 logged out. Waiting for processes to exit. Jul 15 05:18:20.610985 systemd-logind[1717]: Removed session 8. Jul 15 05:18:20.720111 systemd[1]: Started sshd@6-10.200.8.4:22-10.200.16.10:51814.service - OpenSSH per-connection server daemon (10.200.16.10:51814). 
Jul 15 05:18:21.347245 sshd[2180]: Accepted publickey for core from 10.200.16.10 port 51814 ssh2: RSA SHA256:WldnkDLUmxPu3hAw0Hy9/rmNIFGGN2Id/cNmLPJWvxw Jul 15 05:18:21.348550 sshd-session[2180]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:18:21.352803 systemd-logind[1717]: New session 9 of user core. Jul 15 05:18:21.356112 systemd[1]: Started session-9.scope - Session 9 of User core. Jul 15 05:18:21.689950 sudo[2184]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jul 15 05:18:21.690169 sudo[2184]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 15 05:18:22.957178 systemd[1]: Starting docker.service - Docker Application Container Engine... Jul 15 05:18:22.973223 (dockerd)[2203]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jul 15 05:18:23.543101 dockerd[2203]: time="2025-07-15T05:18:23.543052072Z" level=info msg="Starting up" Jul 15 05:18:23.543751 dockerd[2203]: time="2025-07-15T05:18:23.543714531Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Jul 15 05:18:23.551972 dockerd[2203]: time="2025-07-15T05:18:23.551921019Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Jul 15 05:18:23.708420 dockerd[2203]: time="2025-07-15T05:18:23.708381046Z" level=info msg="Loading containers: start." Jul 15 05:18:23.732973 kernel: Initializing XFRM netlink socket Jul 15 05:18:24.112482 systemd-networkd[1366]: docker0: Link UP Jul 15 05:18:24.124675 dockerd[2203]: time="2025-07-15T05:18:24.124637268Z" level=info msg="Loading containers: done." 
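The dockerd records above are logfmt-style key=value pairs, with quoted values for `time` and `msg` and bare values for `level`. A minimal sketch of pulling those fields out (not a full logfmt parser; it only handles the two value shapes seen here):

```python
import re

def parse_dockerd_line(line: str) -> dict:
    """Extract key=value pairs, accepting both key="quoted value" and key=bare."""
    pairs = re.findall(r'(\w+)=(?:"([^"]*)"|(\S+))', line)
    return {key: quoted or bare for key, quoted, bare in pairs}

rec = parse_dockerd_line(
    'time="2025-07-15T05:18:23.543052072Z" level=info msg="Starting up"'
)
print(rec["level"], rec["msg"])  # → info Starting up
```

Filtering such parsed records on `level` is an easy way to separate the daemon's warnings (like the overlay2 diff note later in the log) from routine startup chatter.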
Jul 15 05:18:24.151058 dockerd[2203]: time="2025-07-15T05:18:24.151023043Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jul 15 05:18:24.151169 dockerd[2203]: time="2025-07-15T05:18:24.151092629Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Jul 15 05:18:24.151169 dockerd[2203]: time="2025-07-15T05:18:24.151154184Z" level=info msg="Initializing buildkit" Jul 15 05:18:24.206511 dockerd[2203]: time="2025-07-15T05:18:24.206467942Z" level=info msg="Completed buildkit initialization" Jul 15 05:18:24.212249 dockerd[2203]: time="2025-07-15T05:18:24.212203348Z" level=info msg="Daemon has completed initialization" Jul 15 05:18:24.212317 dockerd[2203]: time="2025-07-15T05:18:24.212260749Z" level=info msg="API listen on /run/docker.sock" Jul 15 05:18:24.212448 systemd[1]: Started docker.service - Docker Application Container Engine. Jul 15 05:18:25.585911 containerd[1743]: time="2025-07-15T05:18:25.585874377Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.10\"" Jul 15 05:18:25.837142 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Jul 15 05:18:25.839098 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 05:18:26.494715 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jul 15 05:18:26.498735 (kubelet)[2418]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 15 05:18:26.527202 kubelet[2418]: E0715 05:18:26.527171 2418 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 15 05:18:26.528605 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 15 05:18:26.528740 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 15 05:18:26.529109 systemd[1]: kubelet.service: Consumed 116ms CPU time, 108.3M memory peak. Jul 15 05:18:26.837271 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3344698166.mount: Deactivated successfully. Jul 15 05:18:28.126824 containerd[1743]: time="2025-07-15T05:18:28.126772011Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:18:28.129988 containerd[1743]: time="2025-07-15T05:18:28.129964171Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.10: active requests=0, bytes read=28077752" Jul 15 05:18:28.132705 containerd[1743]: time="2025-07-15T05:18:28.132668121Z" level=info msg="ImageCreate event name:\"sha256:74c5154ea84d9a53c406e6c00e53cf66145cce821fd80e3c74e2e1bf312f3977\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:18:28.136257 containerd[1743]: time="2025-07-15T05:18:28.136100685Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:083d7d64af31cd090f870eb49fb815e6bb42c175fc602ee9dae2f28f082bd4dc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:18:28.136638 containerd[1743]: 
time="2025-07-15T05:18:28.136617143Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.10\" with image id \"sha256:74c5154ea84d9a53c406e6c00e53cf66145cce821fd80e3c74e2e1bf312f3977\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.10\", repo digest \"registry.k8s.io/kube-apiserver@sha256:083d7d64af31cd090f870eb49fb815e6bb42c175fc602ee9dae2f28f082bd4dc\", size \"28074544\" in 2.55070772s" Jul 15 05:18:28.136671 containerd[1743]: time="2025-07-15T05:18:28.136649075Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.10\" returns image reference \"sha256:74c5154ea84d9a53c406e6c00e53cf66145cce821fd80e3c74e2e1bf312f3977\"" Jul 15 05:18:28.137349 containerd[1743]: time="2025-07-15T05:18:28.137200445Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.10\"" Jul 15 05:18:29.362392 containerd[1743]: time="2025-07-15T05:18:29.362344774Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:18:29.364464 containerd[1743]: time="2025-07-15T05:18:29.364434988Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.10: active requests=0, bytes read=24713302" Jul 15 05:18:29.367372 containerd[1743]: time="2025-07-15T05:18:29.367334980Z" level=info msg="ImageCreate event name:\"sha256:c285c4e62c91c434e9928bee7063b361509f43f43faa31641b626d6eff97616d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:18:29.378964 containerd[1743]: time="2025-07-15T05:18:29.378917669Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:3c67387d023c6114879f1e817669fd641797d30f117230682faf3930ecaaf0fe\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:18:29.379718 containerd[1743]: time="2025-07-15T05:18:29.379541814Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.10\" with image id 
\"sha256:c285c4e62c91c434e9928bee7063b361509f43f43faa31641b626d6eff97616d\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.10\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:3c67387d023c6114879f1e817669fd641797d30f117230682faf3930ecaaf0fe\", size \"26315128\" in 1.242315741s" Jul 15 05:18:29.379718 containerd[1743]: time="2025-07-15T05:18:29.379570405Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.10\" returns image reference \"sha256:c285c4e62c91c434e9928bee7063b361509f43f43faa31641b626d6eff97616d\"" Jul 15 05:18:29.380082 containerd[1743]: time="2025-07-15T05:18:29.380047044Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.10\"" Jul 15 05:18:30.596871 containerd[1743]: time="2025-07-15T05:18:30.596826736Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:18:30.599351 containerd[1743]: time="2025-07-15T05:18:30.599313940Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.10: active requests=0, bytes read=18783679" Jul 15 05:18:30.602011 containerd[1743]: time="2025-07-15T05:18:30.601976554Z" level=info msg="ImageCreate event name:\"sha256:61daeb7d112d9547792027cb16242b1d131f357f511545477381457fff5a69e2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:18:30.606403 containerd[1743]: time="2025-07-15T05:18:30.605672901Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:284dc2a5cf6afc9b76e39ad4b79c680c23d289488517643b28784a06d0141272\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:18:30.606403 containerd[1743]: time="2025-07-15T05:18:30.606229051Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.10\" with image id \"sha256:61daeb7d112d9547792027cb16242b1d131f357f511545477381457fff5a69e2\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.10\", repo digest 
\"registry.k8s.io/kube-scheduler@sha256:284dc2a5cf6afc9b76e39ad4b79c680c23d289488517643b28784a06d0141272\", size \"20385523\" in 1.22614619s" Jul 15 05:18:30.606403 containerd[1743]: time="2025-07-15T05:18:30.606252938Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.10\" returns image reference \"sha256:61daeb7d112d9547792027cb16242b1d131f357f511545477381457fff5a69e2\"" Jul 15 05:18:30.606868 containerd[1743]: time="2025-07-15T05:18:30.606852951Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.10\"" Jul 15 05:18:31.457838 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1513075462.mount: Deactivated successfully. Jul 15 05:18:31.786247 containerd[1743]: time="2025-07-15T05:18:31.786143679Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:18:31.788530 containerd[1743]: time="2025-07-15T05:18:31.788506029Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.10: active requests=0, bytes read=30383951" Jul 15 05:18:31.791422 containerd[1743]: time="2025-07-15T05:18:31.791384110Z" level=info msg="ImageCreate event name:\"sha256:3ed600862d3e69931e0f9f4dbf5c2b46343af40aa079772434f13de771bdc30c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:18:31.795028 containerd[1743]: time="2025-07-15T05:18:31.794988820Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:bcbb293812bdf587b28ea98369a8c347ca84884160046296761acdf12b27029d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:18:31.795329 containerd[1743]: time="2025-07-15T05:18:31.795274132Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.10\" with image id \"sha256:3ed600862d3e69931e0f9f4dbf5c2b46343af40aa079772434f13de771bdc30c\", repo tag \"registry.k8s.io/kube-proxy:v1.31.10\", repo digest 
\"registry.k8s.io/kube-proxy@sha256:bcbb293812bdf587b28ea98369a8c347ca84884160046296761acdf12b27029d\", size \"30382962\" in 1.188334238s" Jul 15 05:18:31.795329 containerd[1743]: time="2025-07-15T05:18:31.795304781Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.10\" returns image reference \"sha256:3ed600862d3e69931e0f9f4dbf5c2b46343af40aa079772434f13de771bdc30c\"" Jul 15 05:18:31.795872 containerd[1743]: time="2025-07-15T05:18:31.795842455Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Jul 15 05:18:32.408281 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3782136419.mount: Deactivated successfully. Jul 15 05:18:33.397707 containerd[1743]: time="2025-07-15T05:18:33.397655846Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:18:33.400060 containerd[1743]: time="2025-07-15T05:18:33.400036309Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565249" Jul 15 05:18:33.402733 containerd[1743]: time="2025-07-15T05:18:33.402568088Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:18:33.406672 containerd[1743]: time="2025-07-15T05:18:33.406646302Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:18:33.407294 containerd[1743]: time="2025-07-15T05:18:33.407270528Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest 
\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.611403299s" Jul 15 05:18:33.407344 containerd[1743]: time="2025-07-15T05:18:33.407301591Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Jul 15 05:18:33.407863 containerd[1743]: time="2025-07-15T05:18:33.407844725Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Jul 15 05:18:33.945974 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4249071528.mount: Deactivated successfully. Jul 15 05:18:33.966758 containerd[1743]: time="2025-07-15T05:18:33.966719632Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jul 15 05:18:33.969906 containerd[1743]: time="2025-07-15T05:18:33.969872528Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321146" Jul 15 05:18:33.973138 containerd[1743]: time="2025-07-15T05:18:33.973110906Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jul 15 05:18:33.976686 containerd[1743]: time="2025-07-15T05:18:33.976636420Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jul 15 05:18:33.977314 containerd[1743]: time="2025-07-15T05:18:33.977030475Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag 
\"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 569.159143ms" Jul 15 05:18:33.977314 containerd[1743]: time="2025-07-15T05:18:33.977056918Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Jul 15 05:18:33.977520 containerd[1743]: time="2025-07-15T05:18:33.977502566Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\"" Jul 15 05:18:34.576341 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3968256517.mount: Deactivated successfully. Jul 15 05:18:36.270157 containerd[1743]: time="2025-07-15T05:18:36.270109291Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:18:36.272658 containerd[1743]: time="2025-07-15T05:18:36.272627071Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56780021" Jul 15 05:18:36.275832 containerd[1743]: time="2025-07-15T05:18:36.275797116Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:18:36.283164 containerd[1743]: time="2025-07-15T05:18:36.283131816Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:18:36.283843 containerd[1743]: time="2025-07-15T05:18:36.283820463Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size 
\"56909194\" in 2.306292281s" Jul 15 05:18:36.283888 containerd[1743]: time="2025-07-15T05:18:36.283849255Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\"" Jul 15 05:18:36.587200 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Jul 15 05:18:36.589081 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 05:18:36.825974 kernel: hv_balloon: Max. dynamic memory size: 8192 MB Jul 15 05:18:37.174443 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 05:18:37.183179 (kubelet)[2634]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 15 05:18:37.220320 kubelet[2634]: E0715 05:18:37.220282 2634 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 15 05:18:37.222092 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 15 05:18:37.222202 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 15 05:18:37.222467 systemd[1]: kubelet.service: Consumed 116ms CPU time, 108.5M memory peak. Jul 15 05:18:37.719752 update_engine[1718]: I20250715 05:18:37.719177 1718 update_attempter.cc:509] Updating boot flags... Jul 15 05:18:38.389027 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 05:18:38.389317 systemd[1]: kubelet.service: Consumed 116ms CPU time, 108.5M memory peak. Jul 15 05:18:38.390995 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Jul 15 05:18:38.408864 systemd[1]: Reload requested from client PID 2671 ('systemctl') (unit session-9.scope)... Jul 15 05:18:38.408945 systemd[1]: Reloading... Jul 15 05:18:38.491051 zram_generator::config[2717]: No configuration found. Jul 15 05:18:38.647290 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jul 15 05:18:38.729901 systemd[1]: Reloading finished in 320 ms. Jul 15 05:18:38.759250 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jul 15 05:18:38.759316 systemd[1]: kubelet.service: Failed with result 'signal'. Jul 15 05:18:38.759574 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 05:18:38.759609 systemd[1]: kubelet.service: Consumed 69ms CPU time, 78M memory peak. Jul 15 05:18:38.761153 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 05:18:39.390875 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 05:18:39.397189 (kubelet)[2784]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jul 15 05:18:39.430372 kubelet[2784]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jul 15 05:18:39.430372 kubelet[2784]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jul 15 05:18:39.430372 kubelet[2784]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jul 15 05:18:39.430629 kubelet[2784]: I0715 05:18:39.430420 2784 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jul 15 05:18:39.504555 kubelet[2784]: I0715 05:18:39.504530 2784 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Jul 15 05:18:39.504555 kubelet[2784]: I0715 05:18:39.504549 2784 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jul 15 05:18:39.504735 kubelet[2784]: I0715 05:18:39.504725 2784 server.go:934] "Client rotation is on, will bootstrap in background" Jul 15 05:18:39.532353 kubelet[2784]: I0715 05:18:39.532333 2784 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jul 15 05:18:39.538536 kubelet[2784]: I0715 05:18:39.538519 2784 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jul 15 05:18:39.539585 kubelet[2784]: E0715 05:18:39.539526 2784 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.200.8.4:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.200.8.4:6443: connect: connection refused" logger="UnhandledError" Jul 15 05:18:39.542912 kubelet[2784]: I0715 05:18:39.542889 2784 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jul 15 05:18:39.543736 kubelet[2784]: I0715 05:18:39.543717 2784 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Jul 15 05:18:39.543857 kubelet[2784]: I0715 05:18:39.543838 2784 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jul 15 05:18:39.544108 kubelet[2784]: I0715 05:18:39.543860 2784 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4396.0.0-n-1e5a06c7e3","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","Topolog
yManagerPolicyOptions":null,"CgroupVersion":2} Jul 15 05:18:39.544218 kubelet[2784]: I0715 05:18:39.544116 2784 topology_manager.go:138] "Creating topology manager with none policy" Jul 15 05:18:39.544218 kubelet[2784]: I0715 05:18:39.544129 2784 container_manager_linux.go:300] "Creating device plugin manager" Jul 15 05:18:39.544218 kubelet[2784]: I0715 05:18:39.544213 2784 state_mem.go:36] "Initialized new in-memory state store" Jul 15 05:18:39.551649 kubelet[2784]: I0715 05:18:39.551628 2784 kubelet.go:408] "Attempting to sync node with API server" Jul 15 05:18:39.551649 kubelet[2784]: I0715 05:18:39.551648 2784 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Jul 15 05:18:39.551751 kubelet[2784]: I0715 05:18:39.551675 2784 kubelet.go:314] "Adding apiserver pod source" Jul 15 05:18:39.551751 kubelet[2784]: I0715 05:18:39.551692 2784 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jul 15 05:18:39.552674 kubelet[2784]: W0715 05:18:39.552608 2784 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.200.8.4:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4396.0.0-n-1e5a06c7e3&limit=500&resourceVersion=0": dial tcp 10.200.8.4:6443: connect: connection refused Jul 15 05:18:39.552674 kubelet[2784]: E0715 05:18:39.552660 2784 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.200.8.4:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4396.0.0-n-1e5a06c7e3&limit=500&resourceVersion=0\": dial tcp 10.200.8.4:6443: connect: connection refused" logger="UnhandledError" Jul 15 05:18:39.556355 kubelet[2784]: W0715 05:18:39.556213 2784 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.200.8.4:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.200.8.4:6443: connect: connection 
refused Jul 15 05:18:39.556355 kubelet[2784]: E0715 05:18:39.556267 2784 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.200.8.4:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.8.4:6443: connect: connection refused" logger="UnhandledError" Jul 15 05:18:39.556598 kubelet[2784]: I0715 05:18:39.556588 2784 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Jul 15 05:18:39.556975 kubelet[2784]: I0715 05:18:39.556944 2784 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jul 15 05:18:39.557540 kubelet[2784]: W0715 05:18:39.557525 2784 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Jul 15 05:18:39.559405 kubelet[2784]: I0715 05:18:39.559197 2784 server.go:1274] "Started kubelet" Jul 15 05:18:39.560275 kubelet[2784]: I0715 05:18:39.560242 2784 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jul 15 05:18:39.565217 kubelet[2784]: E0715 05:18:39.563660 2784 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.200.8.4:6443/api/v1/namespaces/default/events\": dial tcp 10.200.8.4:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4396.0.0-n-1e5a06c7e3.1852550d0e0c6363 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4396.0.0-n-1e5a06c7e3,UID:ci-4396.0.0-n-1e5a06c7e3,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4396.0.0-n-1e5a06c7e3,},FirstTimestamp:2025-07-15 05:18:39.559172963 +0000 UTC m=+0.158527414,LastTimestamp:2025-07-15 05:18:39.559172963 +0000 UTC 
m=+0.158527414,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4396.0.0-n-1e5a06c7e3,}" Jul 15 05:18:39.565666 kubelet[2784]: I0715 05:18:39.565636 2784 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jul 15 05:18:39.566557 kubelet[2784]: I0715 05:18:39.566534 2784 server.go:449] "Adding debug handlers to kubelet server" Jul 15 05:18:39.567027 kubelet[2784]: I0715 05:18:39.567006 2784 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jul 15 05:18:39.567249 kubelet[2784]: I0715 05:18:39.567226 2784 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jul 15 05:18:39.567414 kubelet[2784]: I0715 05:18:39.567399 2784 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jul 15 05:18:39.569025 kubelet[2784]: I0715 05:18:39.569014 2784 volume_manager.go:289] "Starting Kubelet Volume Manager" Jul 15 05:18:39.569330 kubelet[2784]: E0715 05:18:39.569317 2784 kubelet.go:1478] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jul 15 05:18:39.570078 kubelet[2784]: I0715 05:18:39.569616 2784 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Jul 15 05:18:39.570078 kubelet[2784]: I0715 05:18:39.569657 2784 reconciler.go:26] "Reconciler: start to sync state" Jul 15 05:18:39.570313 kubelet[2784]: W0715 05:18:39.570282 2784 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.200.8.4:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.8.4:6443: connect: connection refused Jul 15 05:18:39.570380 kubelet[2784]: E0715 05:18:39.570368 2784 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.200.8.4:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.200.8.4:6443: connect: connection refused" logger="UnhandledError" Jul 15 05:18:39.570562 kubelet[2784]: I0715 05:18:39.570552 2784 factory.go:221] Registration of the systemd container factory successfully Jul 15 05:18:39.570664 kubelet[2784]: I0715 05:18:39.570655 2784 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jul 15 05:18:39.571525 kubelet[2784]: E0715 05:18:39.571506 2784 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4396.0.0-n-1e5a06c7e3\" not found" Jul 15 05:18:39.571822 kubelet[2784]: E0715 05:18:39.571803 2784 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.4:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4396.0.0-n-1e5a06c7e3?timeout=10s\": dial tcp 10.200.8.4:6443: connect: connection refused" interval="200ms" Jul 15 05:18:39.572002 kubelet[2784]: 
I0715 05:18:39.571994 2784 factory.go:221] Registration of the containerd container factory successfully Jul 15 05:18:39.592160 kubelet[2784]: I0715 05:18:39.592139 2784 cpu_manager.go:214] "Starting CPU manager" policy="none" Jul 15 05:18:39.592160 kubelet[2784]: I0715 05:18:39.592158 2784 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Jul 15 05:18:39.592242 kubelet[2784]: I0715 05:18:39.592171 2784 state_mem.go:36] "Initialized new in-memory state store" Jul 15 05:18:39.597222 kubelet[2784]: I0715 05:18:39.597188 2784 policy_none.go:49] "None policy: Start" Jul 15 05:18:39.597633 kubelet[2784]: I0715 05:18:39.597615 2784 memory_manager.go:170] "Starting memorymanager" policy="None" Jul 15 05:18:39.597633 kubelet[2784]: I0715 05:18:39.597631 2784 state_mem.go:35] "Initializing new in-memory state store" Jul 15 05:18:39.606838 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jul 15 05:18:39.608843 kubelet[2784]: I0715 05:18:39.608815 2784 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jul 15 05:18:39.609593 kubelet[2784]: I0715 05:18:39.609570 2784 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jul 15 05:18:39.609593 kubelet[2784]: I0715 05:18:39.609588 2784 status_manager.go:217] "Starting to sync pod status with apiserver" Jul 15 05:18:39.609670 kubelet[2784]: I0715 05:18:39.609601 2784 kubelet.go:2321] "Starting kubelet main sync loop" Jul 15 05:18:39.609670 kubelet[2784]: E0715 05:18:39.609628 2784 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jul 15 05:18:39.615107 kubelet[2784]: W0715 05:18:39.615056 2784 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.200.8.4:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.8.4:6443: connect: connection refused Jul 15 05:18:39.615107 kubelet[2784]: E0715 05:18:39.615086 2784 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.200.8.4:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.8.4:6443: connect: connection refused" logger="UnhandledError" Jul 15 05:18:39.616046 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jul 15 05:18:39.618534 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Jul 15 05:18:39.625392 kubelet[2784]: I0715 05:18:39.625368 2784 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jul 15 05:18:39.625509 kubelet[2784]: I0715 05:18:39.625498 2784 eviction_manager.go:189] "Eviction manager: starting control loop" Jul 15 05:18:39.625539 kubelet[2784]: I0715 05:18:39.625508 2784 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jul 15 05:18:39.625862 kubelet[2784]: I0715 05:18:39.625846 2784 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jul 15 05:18:39.627532 kubelet[2784]: E0715 05:18:39.627518 2784 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4396.0.0-n-1e5a06c7e3\" not found" Jul 15 05:18:39.717726 systemd[1]: Created slice kubepods-burstable-pod459f17f3fe2d9b00711e1bfdba0cf76c.slice - libcontainer container kubepods-burstable-pod459f17f3fe2d9b00711e1bfdba0cf76c.slice. Jul 15 05:18:39.727305 kubelet[2784]: I0715 05:18:39.727265 2784 kubelet_node_status.go:72] "Attempting to register node" node="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:18:39.727525 kubelet[2784]: E0715 05:18:39.727506 2784 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.200.8.4:6443/api/v1/nodes\": dial tcp 10.200.8.4:6443: connect: connection refused" node="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:18:39.735653 systemd[1]: Created slice kubepods-burstable-podb9cb2820fa90450bce250323da5b62f0.slice - libcontainer container kubepods-burstable-podb9cb2820fa90450bce250323da5b62f0.slice. Jul 15 05:18:39.747853 systemd[1]: Created slice kubepods-burstable-pod29222c309a3cb664d9bcbfa56b357906.slice - libcontainer container kubepods-burstable-pod29222c309a3cb664d9bcbfa56b357906.slice. 
Jul 15 05:18:39.772637 kubelet[2784]: E0715 05:18:39.772611 2784 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.4:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4396.0.0-n-1e5a06c7e3?timeout=10s\": dial tcp 10.200.8.4:6443: connect: connection refused" interval="400ms" Jul 15 05:18:39.871089 kubelet[2784]: I0715 05:18:39.871050 2784 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/459f17f3fe2d9b00711e1bfdba0cf76c-k8s-certs\") pod \"kube-apiserver-ci-4396.0.0-n-1e5a06c7e3\" (UID: \"459f17f3fe2d9b00711e1bfdba0cf76c\") " pod="kube-system/kube-apiserver-ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:18:39.871089 kubelet[2784]: I0715 05:18:39.871090 2784 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/b9cb2820fa90450bce250323da5b62f0-k8s-certs\") pod \"kube-controller-manager-ci-4396.0.0-n-1e5a06c7e3\" (UID: \"b9cb2820fa90450bce250323da5b62f0\") " pod="kube-system/kube-controller-manager-ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:18:39.871244 kubelet[2784]: I0715 05:18:39.871115 2784 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b9cb2820fa90450bce250323da5b62f0-kubeconfig\") pod \"kube-controller-manager-ci-4396.0.0-n-1e5a06c7e3\" (UID: \"b9cb2820fa90450bce250323da5b62f0\") " pod="kube-system/kube-controller-manager-ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:18:39.871244 kubelet[2784]: I0715 05:18:39.871134 2784 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/b9cb2820fa90450bce250323da5b62f0-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4396.0.0-n-1e5a06c7e3\" (UID: \"b9cb2820fa90450bce250323da5b62f0\") 
" pod="kube-system/kube-controller-manager-ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:18:39.871244 kubelet[2784]: I0715 05:18:39.871154 2784 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/459f17f3fe2d9b00711e1bfdba0cf76c-ca-certs\") pod \"kube-apiserver-ci-4396.0.0-n-1e5a06c7e3\" (UID: \"459f17f3fe2d9b00711e1bfdba0cf76c\") " pod="kube-system/kube-apiserver-ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:18:39.871244 kubelet[2784]: I0715 05:18:39.871175 2784 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/459f17f3fe2d9b00711e1bfdba0cf76c-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4396.0.0-n-1e5a06c7e3\" (UID: \"459f17f3fe2d9b00711e1bfdba0cf76c\") " pod="kube-system/kube-apiserver-ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:18:39.871244 kubelet[2784]: I0715 05:18:39.871195 2784 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/b9cb2820fa90450bce250323da5b62f0-ca-certs\") pod \"kube-controller-manager-ci-4396.0.0-n-1e5a06c7e3\" (UID: \"b9cb2820fa90450bce250323da5b62f0\") " pod="kube-system/kube-controller-manager-ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:18:39.871403 kubelet[2784]: I0715 05:18:39.871226 2784 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/b9cb2820fa90450bce250323da5b62f0-flexvolume-dir\") pod \"kube-controller-manager-ci-4396.0.0-n-1e5a06c7e3\" (UID: \"b9cb2820fa90450bce250323da5b62f0\") " pod="kube-system/kube-controller-manager-ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:18:39.871403 kubelet[2784]: I0715 05:18:39.871250 2784 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: 
\"kubernetes.io/host-path/29222c309a3cb664d9bcbfa56b357906-kubeconfig\") pod \"kube-scheduler-ci-4396.0.0-n-1e5a06c7e3\" (UID: \"29222c309a3cb664d9bcbfa56b357906\") " pod="kube-system/kube-scheduler-ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:18:39.929910 kubelet[2784]: I0715 05:18:39.929882 2784 kubelet_node_status.go:72] "Attempting to register node" node="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:18:39.930308 kubelet[2784]: E0715 05:18:39.930281 2784 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.200.8.4:6443/api/v1/nodes\": dial tcp 10.200.8.4:6443: connect: connection refused" node="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:18:40.034842 containerd[1743]: time="2025-07-15T05:18:40.034728840Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4396.0.0-n-1e5a06c7e3,Uid:459f17f3fe2d9b00711e1bfdba0cf76c,Namespace:kube-system,Attempt:0,}" Jul 15 05:18:40.038205 containerd[1743]: time="2025-07-15T05:18:40.038170360Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4396.0.0-n-1e5a06c7e3,Uid:b9cb2820fa90450bce250323da5b62f0,Namespace:kube-system,Attempt:0,}" Jul 15 05:18:40.050241 containerd[1743]: time="2025-07-15T05:18:40.050212896Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4396.0.0-n-1e5a06c7e3,Uid:29222c309a3cb664d9bcbfa56b357906,Namespace:kube-system,Attempt:0,}" Jul 15 05:18:40.131974 containerd[1743]: time="2025-07-15T05:18:40.131327397Z" level=info msg="connecting to shim 040cd4d4bffa9f3f0d59a43c59365e0c7fbf67350ec2345772c0634cb4d8e0d7" address="unix:///run/containerd/s/21e3694cd89cd4ac9531ab36d41c3a6c5b2bbb4b8c17ac26c6bf02bd90651204" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:18:40.154091 systemd[1]: Started cri-containerd-040cd4d4bffa9f3f0d59a43c59365e0c7fbf67350ec2345772c0634cb4d8e0d7.scope - libcontainer container 040cd4d4bffa9f3f0d59a43c59365e0c7fbf67350ec2345772c0634cb4d8e0d7. 
Jul 15 05:18:40.174351 kubelet[2784]: E0715 05:18:40.173443 2784 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.4:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4396.0.0-n-1e5a06c7e3?timeout=10s\": dial tcp 10.200.8.4:6443: connect: connection refused" interval="800ms" Jul 15 05:18:40.179810 containerd[1743]: time="2025-07-15T05:18:40.179475274Z" level=info msg="connecting to shim 08a6df1742555eef7bf7af77c9e1e8262d2fda8e0a48c6ec64b72cd1adeea2c6" address="unix:///run/containerd/s/d3aaa87f36a15056538ad4f8671e715ab8f82419e009fa9ca26c4554b71c3ed1" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:18:40.184337 containerd[1743]: time="2025-07-15T05:18:40.184313376Z" level=info msg="connecting to shim fefaeb0f7782953dae08ba73731c4094b0ae5a6f48bd413f903d28b2f8c08d04" address="unix:///run/containerd/s/5ad2a30039154b24ad5cd544f51689518c2ab24e9d13f9ebe9512a6a37c08e7a" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:18:40.215096 systemd[1]: Started cri-containerd-fefaeb0f7782953dae08ba73731c4094b0ae5a6f48bd413f903d28b2f8c08d04.scope - libcontainer container fefaeb0f7782953dae08ba73731c4094b0ae5a6f48bd413f903d28b2f8c08d04. Jul 15 05:18:40.218138 systemd[1]: Started cri-containerd-08a6df1742555eef7bf7af77c9e1e8262d2fda8e0a48c6ec64b72cd1adeea2c6.scope - libcontainer container 08a6df1742555eef7bf7af77c9e1e8262d2fda8e0a48c6ec64b72cd1adeea2c6. 
Jul 15 05:18:40.229594 containerd[1743]: time="2025-07-15T05:18:40.229517368Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4396.0.0-n-1e5a06c7e3,Uid:459f17f3fe2d9b00711e1bfdba0cf76c,Namespace:kube-system,Attempt:0,} returns sandbox id \"040cd4d4bffa9f3f0d59a43c59365e0c7fbf67350ec2345772c0634cb4d8e0d7\"" Jul 15 05:18:40.234109 containerd[1743]: time="2025-07-15T05:18:40.234071117Z" level=info msg="CreateContainer within sandbox \"040cd4d4bffa9f3f0d59a43c59365e0c7fbf67350ec2345772c0634cb4d8e0d7\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jul 15 05:18:40.256715 containerd[1743]: time="2025-07-15T05:18:40.256517088Z" level=info msg="Container 66b073f477c413b58cc4965cd5a735fcafbec419b771ec5391bbf84be9fe9b19: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:18:40.275936 containerd[1743]: time="2025-07-15T05:18:40.275915185Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4396.0.0-n-1e5a06c7e3,Uid:b9cb2820fa90450bce250323da5b62f0,Namespace:kube-system,Attempt:0,} returns sandbox id \"fefaeb0f7782953dae08ba73731c4094b0ae5a6f48bd413f903d28b2f8c08d04\"" Jul 15 05:18:40.276262 containerd[1743]: time="2025-07-15T05:18:40.276239279Z" level=info msg="CreateContainer within sandbox \"040cd4d4bffa9f3f0d59a43c59365e0c7fbf67350ec2345772c0634cb4d8e0d7\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"66b073f477c413b58cc4965cd5a735fcafbec419b771ec5391bbf84be9fe9b19\"" Jul 15 05:18:40.276927 containerd[1743]: time="2025-07-15T05:18:40.276780300Z" level=info msg="StartContainer for \"66b073f477c413b58cc4965cd5a735fcafbec419b771ec5391bbf84be9fe9b19\"" Jul 15 05:18:40.279187 containerd[1743]: time="2025-07-15T05:18:40.279124438Z" level=info msg="CreateContainer within sandbox \"fefaeb0f7782953dae08ba73731c4094b0ae5a6f48bd413f903d28b2f8c08d04\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jul 15 05:18:40.279187 containerd[1743]: 
time="2025-07-15T05:18:40.279148309Z" level=info msg="connecting to shim 66b073f477c413b58cc4965cd5a735fcafbec419b771ec5391bbf84be9fe9b19" address="unix:///run/containerd/s/21e3694cd89cd4ac9531ab36d41c3a6c5b2bbb4b8c17ac26c6bf02bd90651204" protocol=ttrpc version=3 Jul 15 05:18:40.279723 containerd[1743]: time="2025-07-15T05:18:40.279700149Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4396.0.0-n-1e5a06c7e3,Uid:29222c309a3cb664d9bcbfa56b357906,Namespace:kube-system,Attempt:0,} returns sandbox id \"08a6df1742555eef7bf7af77c9e1e8262d2fda8e0a48c6ec64b72cd1adeea2c6\"" Jul 15 05:18:40.283387 containerd[1743]: time="2025-07-15T05:18:40.283292423Z" level=info msg="CreateContainer within sandbox \"08a6df1742555eef7bf7af77c9e1e8262d2fda8e0a48c6ec64b72cd1adeea2c6\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jul 15 05:18:40.298076 systemd[1]: Started cri-containerd-66b073f477c413b58cc4965cd5a735fcafbec419b771ec5391bbf84be9fe9b19.scope - libcontainer container 66b073f477c413b58cc4965cd5a735fcafbec419b771ec5391bbf84be9fe9b19. 
Jul 15 05:18:40.303254 containerd[1743]: time="2025-07-15T05:18:40.303227938Z" level=info msg="Container 4e88ac7896dec9ebeb27c2e9b1ca454db0cb60c0119cf677cf85a34be259176c: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:18:40.309380 containerd[1743]: time="2025-07-15T05:18:40.309355912Z" level=info msg="Container fc5676cdefad548d91e80b55b00ead818045575e4a7cf3ad1bd6d488b8f208e0: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:18:40.324158 containerd[1743]: time="2025-07-15T05:18:40.323326238Z" level=info msg="CreateContainer within sandbox \"fefaeb0f7782953dae08ba73731c4094b0ae5a6f48bd413f903d28b2f8c08d04\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"4e88ac7896dec9ebeb27c2e9b1ca454db0cb60c0119cf677cf85a34be259176c\"" Jul 15 05:18:40.324158 containerd[1743]: time="2025-07-15T05:18:40.323772051Z" level=info msg="StartContainer for \"4e88ac7896dec9ebeb27c2e9b1ca454db0cb60c0119cf677cf85a34be259176c\"" Jul 15 05:18:40.324846 containerd[1743]: time="2025-07-15T05:18:40.324791932Z" level=info msg="connecting to shim 4e88ac7896dec9ebeb27c2e9b1ca454db0cb60c0119cf677cf85a34be259176c" address="unix:///run/containerd/s/5ad2a30039154b24ad5cd544f51689518c2ab24e9d13f9ebe9512a6a37c08e7a" protocol=ttrpc version=3 Jul 15 05:18:40.332834 kubelet[2784]: I0715 05:18:40.332808 2784 kubelet_node_status.go:72] "Attempting to register node" node="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:18:40.333171 kubelet[2784]: E0715 05:18:40.333150 2784 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.200.8.4:6443/api/v1/nodes\": dial tcp 10.200.8.4:6443: connect: connection refused" node="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:18:40.335919 containerd[1743]: time="2025-07-15T05:18:40.335892032Z" level=info msg="CreateContainer within sandbox \"08a6df1742555eef7bf7af77c9e1e8262d2fda8e0a48c6ec64b72cd1adeea2c6\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id 
\"fc5676cdefad548d91e80b55b00ead818045575e4a7cf3ad1bd6d488b8f208e0\"" Jul 15 05:18:40.340079 containerd[1743]: time="2025-07-15T05:18:40.339842005Z" level=info msg="StartContainer for \"fc5676cdefad548d91e80b55b00ead818045575e4a7cf3ad1bd6d488b8f208e0\"" Jul 15 05:18:40.340964 containerd[1743]: time="2025-07-15T05:18:40.340929356Z" level=info msg="connecting to shim fc5676cdefad548d91e80b55b00ead818045575e4a7cf3ad1bd6d488b8f208e0" address="unix:///run/containerd/s/d3aaa87f36a15056538ad4f8671e715ab8f82419e009fa9ca26c4554b71c3ed1" protocol=ttrpc version=3 Jul 15 05:18:40.343145 systemd[1]: Started cri-containerd-4e88ac7896dec9ebeb27c2e9b1ca454db0cb60c0119cf677cf85a34be259176c.scope - libcontainer container 4e88ac7896dec9ebeb27c2e9b1ca454db0cb60c0119cf677cf85a34be259176c. Jul 15 05:18:40.353271 containerd[1743]: time="2025-07-15T05:18:40.353252489Z" level=info msg="StartContainer for \"66b073f477c413b58cc4965cd5a735fcafbec419b771ec5391bbf84be9fe9b19\" returns successfully" Jul 15 05:18:40.363251 systemd[1]: Started cri-containerd-fc5676cdefad548d91e80b55b00ead818045575e4a7cf3ad1bd6d488b8f208e0.scope - libcontainer container fc5676cdefad548d91e80b55b00ead818045575e4a7cf3ad1bd6d488b8f208e0. 
Jul 15 05:18:40.426051 containerd[1743]: time="2025-07-15T05:18:40.426011663Z" level=info msg="StartContainer for \"4e88ac7896dec9ebeb27c2e9b1ca454db0cb60c0119cf677cf85a34be259176c\" returns successfully" Jul 15 05:18:40.477693 containerd[1743]: time="2025-07-15T05:18:40.477620711Z" level=info msg="StartContainer for \"fc5676cdefad548d91e80b55b00ead818045575e4a7cf3ad1bd6d488b8f208e0\" returns successfully" Jul 15 05:18:41.138294 kubelet[2784]: I0715 05:18:41.138269 2784 kubelet_node_status.go:72] "Attempting to register node" node="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:18:41.926176 kubelet[2784]: I0715 05:18:41.926142 2784 kubelet_node_status.go:75] "Successfully registered node" node="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:18:41.937399 kubelet[2784]: E0715 05:18:41.937307 2784 event.go:359] "Server rejected event (will not retry!)" err="namespaces \"default\" not found" event="&Event{ObjectMeta:{ci-4396.0.0-n-1e5a06c7e3.1852550d0e0c6363 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4396.0.0-n-1e5a06c7e3,UID:ci-4396.0.0-n-1e5a06c7e3,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4396.0.0-n-1e5a06c7e3,},FirstTimestamp:2025-07-15 05:18:39.559172963 +0000 UTC m=+0.158527414,LastTimestamp:2025-07-15 05:18:39.559172963 +0000 UTC m=+0.158527414,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4396.0.0-n-1e5a06c7e3,}" Jul 15 05:18:42.557642 kubelet[2784]: I0715 05:18:42.557597 2784 apiserver.go:52] "Watching apiserver" Jul 15 05:18:42.569896 kubelet[2784]: I0715 05:18:42.569794 2784 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Jul 15 05:18:43.836022 systemd[1]: Reload requested from client PID 3050 ('systemctl') (unit session-9.scope)... Jul 15 05:18:43.836035 systemd[1]: Reloading... 
Jul 15 05:18:43.902981 zram_generator::config[3095]: No configuration found.
Jul 15 05:18:43.988433 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Jul 15 05:18:44.084237 systemd[1]: Reloading finished in 247 ms.
Jul 15 05:18:44.109158 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Jul 15 05:18:44.121146 systemd[1]: kubelet.service: Deactivated successfully.
Jul 15 05:18:44.121334 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Jul 15 05:18:44.121372 systemd[1]: kubelet.service: Consumed 411ms CPU time, 128.6M memory peak.
Jul 15 05:18:44.122995 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jul 15 05:18:44.619919 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jul 15 05:18:44.625374 (kubelet)[3163]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Jul 15 05:18:44.663444 kubelet[3163]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jul 15 05:18:44.663444 kubelet[3163]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Jul 15 05:18:44.663444 kubelet[3163]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jul 15 05:18:44.663671 kubelet[3163]: I0715 05:18:44.663518 3163 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Jul 15 05:18:44.668542 kubelet[3163]: I0715 05:18:44.668510 3163 server.go:491] "Kubelet version" kubeletVersion="v1.31.8"
Jul 15 05:18:44.668620 kubelet[3163]: I0715 05:18:44.668545 3163 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Jul 15 05:18:44.668925 kubelet[3163]: I0715 05:18:44.668807 3163 server.go:934] "Client rotation is on, will bootstrap in background"
Jul 15 05:18:44.670137 kubelet[3163]: I0715 05:18:44.670079 3163 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Jul 15 05:18:44.671835 kubelet[3163]: I0715 05:18:44.671767 3163 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Jul 15 05:18:44.674692 kubelet[3163]: I0715 05:18:44.674676 3163 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Jul 15 05:18:44.676862 kubelet[3163]: I0715 05:18:44.676839 3163 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Jul 15 05:18:44.676971 kubelet[3163]: I0715 05:18:44.676946 3163 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Jul 15 05:18:44.677061 kubelet[3163]: I0715 05:18:44.677042 3163 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Jul 15 05:18:44.677962 kubelet[3163]: I0715 05:18:44.677063 3163 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4396.0.0-n-1e5a06c7e3","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Jul 15 05:18:44.677962 kubelet[3163]: I0715 05:18:44.677332 3163 topology_manager.go:138] "Creating topology manager with none policy"
Jul 15 05:18:44.677962 kubelet[3163]: I0715 05:18:44.677342 3163 container_manager_linux.go:300] "Creating device plugin manager"
Jul 15 05:18:44.677962 kubelet[3163]: I0715 05:18:44.677378 3163 state_mem.go:36] "Initialized new in-memory state store"
Jul 15 05:18:44.677962 kubelet[3163]: I0715 05:18:44.677513 3163 kubelet.go:408] "Attempting to sync node with API server"
Jul 15 05:18:44.678162 kubelet[3163]: I0715 05:18:44.677522 3163 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Jul 15 05:18:44.678162 kubelet[3163]: I0715 05:18:44.677550 3163 kubelet.go:314] "Adding apiserver pod source"
Jul 15 05:18:44.678162 kubelet[3163]: I0715 05:18:44.677559 3163 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Jul 15 05:18:44.681038 kubelet[3163]: I0715 05:18:44.681021 3163 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Jul 15 05:18:44.681483 kubelet[3163]: I0715 05:18:44.681470 3163 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Jul 15 05:18:44.681888 kubelet[3163]: I0715 05:18:44.681872 3163 server.go:1274] "Started kubelet"
Jul 15 05:18:44.689378 kubelet[3163]: I0715 05:18:44.689360 3163 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Jul 15 05:18:44.693021 kubelet[3163]: I0715 05:18:44.692052 3163 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Jul 15 05:18:44.696804 kubelet[3163]: I0715 05:18:44.693100 3163 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Jul 15 05:18:44.703633 kubelet[3163]: I0715 05:18:44.703609 3163 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Jul 15 05:18:44.703695 kubelet[3163]: I0715 05:18:44.696014 3163 volume_manager.go:289] "Starting Kubelet Volume Manager"
Jul 15 05:18:44.703772 kubelet[3163]: I0715 05:18:44.693454 3163 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Jul 15 05:18:44.703853 kubelet[3163]: I0715 05:18:44.696028 3163 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Jul 15 05:18:44.703934 kubelet[3163]: I0715 05:18:44.703926 3163 reconciler.go:26] "Reconciler: start to sync state"
Jul 15 05:18:44.704294 kubelet[3163]: I0715 05:18:44.704269 3163 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Jul 15 05:18:44.704825 kubelet[3163]: E0715 05:18:44.696135 3163 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4396.0.0-n-1e5a06c7e3\" not found"
Jul 15 05:18:44.705752 kubelet[3163]: I0715 05:18:44.705729 3163 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Jul 15 05:18:44.705752 kubelet[3163]: I0715 05:18:44.705756 3163 status_manager.go:217] "Starting to sync pod status with apiserver"
Jul 15 05:18:44.705828 kubelet[3163]: I0715 05:18:44.705774 3163 kubelet.go:2321] "Starting kubelet main sync loop"
Jul 15 05:18:44.705828 kubelet[3163]: E0715 05:18:44.705802 3163 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Jul 15 05:18:44.706736 kubelet[3163]: I0715 05:18:44.700064 3163 server.go:449] "Adding debug handlers to kubelet server"
Jul 15 05:18:44.715217 kubelet[3163]: I0715 05:18:44.715199 3163 factory.go:221] Registration of the containerd container factory successfully
Jul 15 05:18:44.715217 kubelet[3163]: I0715 05:18:44.715215 3163 factory.go:221] Registration of the systemd container factory successfully
Jul 15 05:18:44.715303 kubelet[3163]: I0715 05:18:44.715281 3163 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Jul 15 05:18:44.774509 kubelet[3163]: I0715 05:18:44.774483 3163 cpu_manager.go:214] "Starting CPU manager" policy="none"
Jul 15 05:18:44.774509 kubelet[3163]: I0715 05:18:44.774506 3163 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Jul 15 05:18:44.774593 kubelet[3163]: I0715 05:18:44.774520 3163 state_mem.go:36] "Initialized new in-memory state store"
Jul 15 05:18:44.774656 kubelet[3163]: I0715 05:18:44.774627 3163 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Jul 15 05:18:44.774683 kubelet[3163]: I0715 05:18:44.774642 3163 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Jul 15 05:18:44.774683 kubelet[3163]: I0715 05:18:44.774671 3163 policy_none.go:49] "None policy: Start"
Jul 15 05:18:44.775108 kubelet[3163]: I0715 05:18:44.775095 3163 memory_manager.go:170] "Starting memorymanager" policy="None"
Jul 15 05:18:44.775157 kubelet[3163]: I0715 05:18:44.775112 3163 state_mem.go:35] "Initializing new in-memory state store"
Jul 15 05:18:44.775230 kubelet[3163]: I0715 05:18:44.775219 3163 state_mem.go:75] "Updated machine memory state"
Jul 15 05:18:44.778062 kubelet[3163]: I0715 05:18:44.778030 3163 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Jul 15 05:18:44.778470 kubelet[3163]: I0715 05:18:44.778461 3163 eviction_manager.go:189] "Eviction manager: starting control loop"
Jul 15 05:18:44.778545 kubelet[3163]: I0715 05:18:44.778525 3163 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Jul 15 05:18:44.779156 kubelet[3163]: I0715 05:18:44.779146 3163 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Jul 15 05:18:44.815110 kubelet[3163]: W0715 05:18:44.815096 3163 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Jul 15 05:18:44.818930 kubelet[3163]: W0715 05:18:44.818912 3163 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Jul 15 05:18:44.819369 kubelet[3163]: W0715 05:18:44.819275 3163 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Jul 15 05:18:44.881753 kubelet[3163]: I0715 05:18:44.881694 3163 kubelet_node_status.go:72] "Attempting to register node" node="ci-4396.0.0-n-1e5a06c7e3"
Jul 15 05:18:44.892112 kubelet[3163]: I0715 05:18:44.892089 3163 kubelet_node_status.go:111] "Node was previously registered" node="ci-4396.0.0-n-1e5a06c7e3"
Jul 15 05:18:44.892174 kubelet[3163]: I0715 05:18:44.892136 3163 kubelet_node_status.go:75] "Successfully registered node" node="ci-4396.0.0-n-1e5a06c7e3"
Jul 15 05:18:44.904888 kubelet[3163]: I0715 05:18:44.904667 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/b9cb2820fa90450bce250323da5b62f0-ca-certs\") pod \"kube-controller-manager-ci-4396.0.0-n-1e5a06c7e3\" (UID: \"b9cb2820fa90450bce250323da5b62f0\") " pod="kube-system/kube-controller-manager-ci-4396.0.0-n-1e5a06c7e3"
Jul 15 05:18:44.904888 kubelet[3163]: I0715 05:18:44.904692 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b9cb2820fa90450bce250323da5b62f0-kubeconfig\") pod \"kube-controller-manager-ci-4396.0.0-n-1e5a06c7e3\" (UID: \"b9cb2820fa90450bce250323da5b62f0\") " pod="kube-system/kube-controller-manager-ci-4396.0.0-n-1e5a06c7e3"
Jul 15 05:18:44.904888 kubelet[3163]: I0715 05:18:44.904711 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/b9cb2820fa90450bce250323da5b62f0-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4396.0.0-n-1e5a06c7e3\" (UID: \"b9cb2820fa90450bce250323da5b62f0\") " pod="kube-system/kube-controller-manager-ci-4396.0.0-n-1e5a06c7e3"
Jul 15 05:18:44.904888 kubelet[3163]: I0715 05:18:44.904729 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/29222c309a3cb664d9bcbfa56b357906-kubeconfig\") pod \"kube-scheduler-ci-4396.0.0-n-1e5a06c7e3\" (UID: \"29222c309a3cb664d9bcbfa56b357906\") " pod="kube-system/kube-scheduler-ci-4396.0.0-n-1e5a06c7e3"
Jul 15 05:18:44.904888 kubelet[3163]: I0715 05:18:44.904745 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/459f17f3fe2d9b00711e1bfdba0cf76c-ca-certs\") pod \"kube-apiserver-ci-4396.0.0-n-1e5a06c7e3\" (UID: \"459f17f3fe2d9b00711e1bfdba0cf76c\") " pod="kube-system/kube-apiserver-ci-4396.0.0-n-1e5a06c7e3"
Jul 15 05:18:44.905099 kubelet[3163]: I0715 05:18:44.904761 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/b9cb2820fa90450bce250323da5b62f0-flexvolume-dir\") pod \"kube-controller-manager-ci-4396.0.0-n-1e5a06c7e3\" (UID: \"b9cb2820fa90450bce250323da5b62f0\") " pod="kube-system/kube-controller-manager-ci-4396.0.0-n-1e5a06c7e3"
Jul 15 05:18:44.905099 kubelet[3163]: I0715 05:18:44.904777 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/b9cb2820fa90450bce250323da5b62f0-k8s-certs\") pod \"kube-controller-manager-ci-4396.0.0-n-1e5a06c7e3\" (UID: \"b9cb2820fa90450bce250323da5b62f0\") " pod="kube-system/kube-controller-manager-ci-4396.0.0-n-1e5a06c7e3"
Jul 15 05:18:44.905099 kubelet[3163]: I0715 05:18:44.904793 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/459f17f3fe2d9b00711e1bfdba0cf76c-k8s-certs\") pod \"kube-apiserver-ci-4396.0.0-n-1e5a06c7e3\" (UID: \"459f17f3fe2d9b00711e1bfdba0cf76c\") " pod="kube-system/kube-apiserver-ci-4396.0.0-n-1e5a06c7e3"
Jul 15 05:18:44.905099 kubelet[3163]: I0715 05:18:44.904811 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/459f17f3fe2d9b00711e1bfdba0cf76c-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4396.0.0-n-1e5a06c7e3\" (UID: \"459f17f3fe2d9b00711e1bfdba0cf76c\") " pod="kube-system/kube-apiserver-ci-4396.0.0-n-1e5a06c7e3"
Jul 15 05:18:45.679249 kubelet[3163]: I0715 05:18:45.679212 3163 apiserver.go:52] "Watching apiserver"
Jul 15 05:18:45.704909 kubelet[3163]: I0715 05:18:45.704867 3163 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world"
Jul 15 05:18:45.756134 kubelet[3163]: W0715 05:18:45.756100 3163 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Jul 15 05:18:45.756314 kubelet[3163]: E0715 05:18:45.756254 3163 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-4396.0.0-n-1e5a06c7e3\" already exists" pod="kube-system/kube-apiserver-ci-4396.0.0-n-1e5a06c7e3"
Jul 15 05:18:45.765760 kubelet[3163]: I0715 05:18:45.765688 3163 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4396.0.0-n-1e5a06c7e3" podStartSLOduration=1.765675785 podStartE2EDuration="1.765675785s" podCreationTimestamp="2025-07-15 05:18:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 05:18:45.765509682 +0000 UTC m=+1.136504578" watchObservedRunningTime="2025-07-15 05:18:45.765675785 +0000 UTC m=+1.136670681"
Jul 15 05:18:45.788162 kubelet[3163]: I0715 05:18:45.788118 3163 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4396.0.0-n-1e5a06c7e3" podStartSLOduration=1.788104209 podStartE2EDuration="1.788104209s" podCreationTimestamp="2025-07-15 05:18:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 05:18:45.776258383 +0000 UTC m=+1.147253282" watchObservedRunningTime="2025-07-15 05:18:45.788104209 +0000 UTC m=+1.159099222"
Jul 15 05:18:45.798014 kubelet[3163]: I0715 05:18:45.797980 3163 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4396.0.0-n-1e5a06c7e3" podStartSLOduration=1.797946797 podStartE2EDuration="1.797946797s" podCreationTimestamp="2025-07-15 05:18:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 05:18:45.788391501 +0000 UTC m=+1.159386398" watchObservedRunningTime="2025-07-15 05:18:45.797946797 +0000 UTC m=+1.168941694"
Jul 15 05:18:50.625026 kubelet[3163]: I0715 05:18:50.624991 3163 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Jul 15 05:18:50.625571 containerd[1743]: time="2025-07-15T05:18:50.625538595Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Jul 15 05:18:50.626538 kubelet[3163]: I0715 05:18:50.625801 3163 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Jul 15 05:18:50.659156 systemd[1]: Created slice kubepods-besteffort-pod1f013bd4_12df_4863_8b43_cf0de0c6f018.slice - libcontainer container kubepods-besteffort-pod1f013bd4_12df_4863_8b43_cf0de0c6f018.slice.
Jul 15 05:18:50.741256 kubelet[3163]: I0715 05:18:50.741229 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1f013bd4-12df-4863-8b43-cf0de0c6f018-lib-modules\") pod \"kube-proxy-p4qll\" (UID: \"1f013bd4-12df-4863-8b43-cf0de0c6f018\") " pod="kube-system/kube-proxy-p4qll"
Jul 15 05:18:50.741256 kubelet[3163]: I0715 05:18:50.741260 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/1f013bd4-12df-4863-8b43-cf0de0c6f018-xtables-lock\") pod \"kube-proxy-p4qll\" (UID: \"1f013bd4-12df-4863-8b43-cf0de0c6f018\") " pod="kube-system/kube-proxy-p4qll"
Jul 15 05:18:50.741399 kubelet[3163]: I0715 05:18:50.741285 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/1f013bd4-12df-4863-8b43-cf0de0c6f018-kube-proxy\") pod \"kube-proxy-p4qll\" (UID: \"1f013bd4-12df-4863-8b43-cf0de0c6f018\") " pod="kube-system/kube-proxy-p4qll"
Jul 15 05:18:50.741399 kubelet[3163]: I0715 05:18:50.741300 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jls7v\" (UniqueName: \"kubernetes.io/projected/1f013bd4-12df-4863-8b43-cf0de0c6f018-kube-api-access-jls7v\") pod \"kube-proxy-p4qll\" (UID: \"1f013bd4-12df-4863-8b43-cf0de0c6f018\") " pod="kube-system/kube-proxy-p4qll"
Jul 15 05:18:50.849578 kubelet[3163]: E0715 05:18:50.849538 3163 projected.go:288] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found
Jul 15 05:18:50.849578 kubelet[3163]: E0715 05:18:50.849577 3163 projected.go:194] Error preparing data for projected volume kube-api-access-jls7v for pod kube-system/kube-proxy-p4qll: configmap "kube-root-ca.crt" not found
Jul 15 05:18:50.849881 kubelet[3163]: E0715 05:18:50.849633 3163 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1f013bd4-12df-4863-8b43-cf0de0c6f018-kube-api-access-jls7v podName:1f013bd4-12df-4863-8b43-cf0de0c6f018 nodeName:}" failed. No retries permitted until 2025-07-15 05:18:51.349613025 +0000 UTC m=+6.720607926 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-jls7v" (UniqueName: "kubernetes.io/projected/1f013bd4-12df-4863-8b43-cf0de0c6f018-kube-api-access-jls7v") pod "kube-proxy-p4qll" (UID: "1f013bd4-12df-4863-8b43-cf0de0c6f018") : configmap "kube-root-ca.crt" not found
Jul 15 05:18:51.565732 containerd[1743]: time="2025-07-15T05:18:51.565689930Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-p4qll,Uid:1f013bd4-12df-4863-8b43-cf0de0c6f018,Namespace:kube-system,Attempt:0,}"
Jul 15 05:18:51.615346 containerd[1743]: time="2025-07-15T05:18:51.615282320Z" level=info msg="connecting to shim d9f582508d6bf1f026932f9d07977669eae41a13b4f393f4036f7258cc8f7932" address="unix:///run/containerd/s/a6a88d1882644d87afa97caf06313fb58fb7da2291dcd4846a6808fb2e959f2b" namespace=k8s.io protocol=ttrpc version=3
Jul 15 05:18:51.635129 systemd[1]: Started cri-containerd-d9f582508d6bf1f026932f9d07977669eae41a13b4f393f4036f7258cc8f7932.scope - libcontainer container d9f582508d6bf1f026932f9d07977669eae41a13b4f393f4036f7258cc8f7932.
Jul 15 05:18:51.655707 containerd[1743]: time="2025-07-15T05:18:51.655673394Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-p4qll,Uid:1f013bd4-12df-4863-8b43-cf0de0c6f018,Namespace:kube-system,Attempt:0,} returns sandbox id \"d9f582508d6bf1f026932f9d07977669eae41a13b4f393f4036f7258cc8f7932\""
Jul 15 05:18:51.658168 containerd[1743]: time="2025-07-15T05:18:51.658126084Z" level=info msg="CreateContainer within sandbox \"d9f582508d6bf1f026932f9d07977669eae41a13b4f393f4036f7258cc8f7932\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Jul 15 05:18:51.680080 containerd[1743]: time="2025-07-15T05:18:51.677104414Z" level=info msg="Container 131142d6231603e89e15dd8d75698aea813040c40cf03eabf2303c002f4732f5: CDI devices from CRI Config.CDIDevices: []"
Jul 15 05:18:51.696293 containerd[1743]: time="2025-07-15T05:18:51.696267195Z" level=info msg="CreateContainer within sandbox \"d9f582508d6bf1f026932f9d07977669eae41a13b4f393f4036f7258cc8f7932\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"131142d6231603e89e15dd8d75698aea813040c40cf03eabf2303c002f4732f5\""
Jul 15 05:18:51.696812 containerd[1743]: time="2025-07-15T05:18:51.696702762Z" level=info msg="StartContainer for \"131142d6231603e89e15dd8d75698aea813040c40cf03eabf2303c002f4732f5\""
Jul 15 05:18:51.698298 containerd[1743]: time="2025-07-15T05:18:51.698261548Z" level=info msg="connecting to shim 131142d6231603e89e15dd8d75698aea813040c40cf03eabf2303c002f4732f5" address="unix:///run/containerd/s/a6a88d1882644d87afa97caf06313fb58fb7da2291dcd4846a6808fb2e959f2b" protocol=ttrpc version=3
Jul 15 05:18:51.720279 systemd[1]: Started cri-containerd-131142d6231603e89e15dd8d75698aea813040c40cf03eabf2303c002f4732f5.scope - libcontainer container 131142d6231603e89e15dd8d75698aea813040c40cf03eabf2303c002f4732f5.
Jul 15 05:18:51.768224 kubelet[3163]: W0715 05:18:51.768201 3163 reflector.go:561] object-"tigera-operator"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ci-4396.0.0-n-1e5a06c7e3" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 'ci-4396.0.0-n-1e5a06c7e3' and this object
Jul 15 05:18:51.768462 kubelet[3163]: E0715 05:18:51.768241 3163 reflector.go:158] "Unhandled Error" err="object-\"tigera-operator\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ci-4396.0.0-n-1e5a06c7e3\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"tigera-operator\": no relationship found between node 'ci-4396.0.0-n-1e5a06c7e3' and this object" logger="UnhandledError"
Jul 15 05:18:51.768462 kubelet[3163]: W0715 05:18:51.768286 3163 reflector.go:561] object-"tigera-operator"/"kubernetes-services-endpoint": failed to list *v1.ConfigMap: configmaps "kubernetes-services-endpoint" is forbidden: User "system:node:ci-4396.0.0-n-1e5a06c7e3" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 'ci-4396.0.0-n-1e5a06c7e3' and this object
Jul 15 05:18:51.768462 kubelet[3163]: E0715 05:18:51.768297 3163 reflector.go:158] "Unhandled Error" err="object-\"tigera-operator\"/\"kubernetes-services-endpoint\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kubernetes-services-endpoint\" is forbidden: User \"system:node:ci-4396.0.0-n-1e5a06c7e3\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"tigera-operator\": no relationship found between node 'ci-4396.0.0-n-1e5a06c7e3' and this object" logger="UnhandledError"
Jul 15 05:18:51.770443 systemd[1]: Created slice kubepods-besteffort-pod82b550b7_9a6b_4b60_a726_1cdbe323e4b6.slice - libcontainer container kubepods-besteffort-pod82b550b7_9a6b_4b60_a726_1cdbe323e4b6.slice.
Jul 15 05:18:51.817226 containerd[1743]: time="2025-07-15T05:18:51.817150039Z" level=info msg="StartContainer for \"131142d6231603e89e15dd8d75698aea813040c40cf03eabf2303c002f4732f5\" returns successfully"
Jul 15 05:18:51.847356 kubelet[3163]: I0715 05:18:51.847330 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/82b550b7-9a6b-4b60-a726-1cdbe323e4b6-var-lib-calico\") pod \"tigera-operator-5bf8dfcb4-8qvf7\" (UID: \"82b550b7-9a6b-4b60-a726-1cdbe323e4b6\") " pod="tigera-operator/tigera-operator-5bf8dfcb4-8qvf7"
Jul 15 05:18:51.847356 kubelet[3163]: I0715 05:18:51.847362 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbhfk\" (UniqueName: \"kubernetes.io/projected/82b550b7-9a6b-4b60-a726-1cdbe323e4b6-kube-api-access-zbhfk\") pod \"tigera-operator-5bf8dfcb4-8qvf7\" (UID: \"82b550b7-9a6b-4b60-a726-1cdbe323e4b6\") " pod="tigera-operator/tigera-operator-5bf8dfcb4-8qvf7"
Jul 15 05:18:52.775331 kubelet[3163]: I0715 05:18:52.775269 3163 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-p4qll" podStartSLOduration=2.775250797 podStartE2EDuration="2.775250797s" podCreationTimestamp="2025-07-15 05:18:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 05:18:52.775089643 +0000 UTC m=+8.146084560" watchObservedRunningTime="2025-07-15 05:18:52.775250797 +0000 UTC m=+8.146245694"
Jul 15 05:18:52.951271 kubelet[3163]: E0715 05:18:52.951235 3163 projected.go:288] Couldn't get configMap tigera-operator/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Jul 15 05:18:52.951271 kubelet[3163]: E0715 05:18:52.951270 3163 projected.go:194] Error preparing data for projected volume kube-api-access-zbhfk for pod tigera-operator/tigera-operator-5bf8dfcb4-8qvf7: failed to sync configmap cache: timed out waiting for the condition
Jul 15 05:18:52.951451 kubelet[3163]: E0715 05:18:52.951329 3163 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82b550b7-9a6b-4b60-a726-1cdbe323e4b6-kube-api-access-zbhfk podName:82b550b7-9a6b-4b60-a726-1cdbe323e4b6 nodeName:}" failed. No retries permitted until 2025-07-15 05:18:53.451308171 +0000 UTC m=+8.822303069 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-zbhfk" (UniqueName: "kubernetes.io/projected/82b550b7-9a6b-4b60-a726-1cdbe323e4b6-kube-api-access-zbhfk") pod "tigera-operator-5bf8dfcb4-8qvf7" (UID: "82b550b7-9a6b-4b60-a726-1cdbe323e4b6") : failed to sync configmap cache: timed out waiting for the condition
Jul 15 05:18:53.574265 containerd[1743]: time="2025-07-15T05:18:53.574146660Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5bf8dfcb4-8qvf7,Uid:82b550b7-9a6b-4b60-a726-1cdbe323e4b6,Namespace:tigera-operator,Attempt:0,}"
Jul 15 05:18:53.614601 containerd[1743]: time="2025-07-15T05:18:53.614560714Z" level=info msg="connecting to shim 70c58ba97c14014ca900430f780adbfb4c8a813b6df8534abe0ea5d51ddcdce2" address="unix:///run/containerd/s/6687c23f86be99ddfc1dbc0c119587adbcfa3fe83cbbfe3bce20099470cae163" namespace=k8s.io protocol=ttrpc version=3
Jul 15 05:18:53.631148 systemd[1]: Started cri-containerd-70c58ba97c14014ca900430f780adbfb4c8a813b6df8534abe0ea5d51ddcdce2.scope - libcontainer container 70c58ba97c14014ca900430f780adbfb4c8a813b6df8534abe0ea5d51ddcdce2.
Jul 15 05:18:53.666106 containerd[1743]: time="2025-07-15T05:18:53.666078125Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5bf8dfcb4-8qvf7,Uid:82b550b7-9a6b-4b60-a726-1cdbe323e4b6,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"70c58ba97c14014ca900430f780adbfb4c8a813b6df8534abe0ea5d51ddcdce2\""
Jul 15 05:18:53.667299 containerd[1743]: time="2025-07-15T05:18:53.667273004Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\""
Jul 15 05:18:55.121412 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3715414648.mount: Deactivated successfully.
Jul 15 05:18:55.513381 containerd[1743]: time="2025-07-15T05:18:55.513285888Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 05:18:55.515647 containerd[1743]: time="2025-07-15T05:18:55.515611628Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.3: active requests=0, bytes read=25056543"
Jul 15 05:18:55.518693 containerd[1743]: time="2025-07-15T05:18:55.518662533Z" level=info msg="ImageCreate event name:\"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 05:18:55.522054 containerd[1743]: time="2025-07-15T05:18:55.522013126Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 05:18:55.522430 containerd[1743]: time="2025-07-15T05:18:55.522409838Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.3\" with image id \"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\", repo tag \"quay.io/tigera/operator:v1.38.3\", repo digest \"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\", size \"25052538\" in 1.855106196s"
Jul 15 05:18:55.522468 containerd[1743]: time="2025-07-15T05:18:55.522436255Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\" returns image reference \"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\""
Jul 15 05:18:55.524800 containerd[1743]: time="2025-07-15T05:18:55.524121128Z" level=info msg="CreateContainer within sandbox \"70c58ba97c14014ca900430f780adbfb4c8a813b6df8534abe0ea5d51ddcdce2\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Jul 15 05:18:55.538971 containerd[1743]: time="2025-07-15T05:18:55.538881074Z" level=info msg="Container 3ad1cbd9a1c91120a044083f702d54e26938a890596baa61b1c09258867d9ebe: CDI devices from CRI Config.CDIDevices: []"
Jul 15 05:18:55.553681 containerd[1743]: time="2025-07-15T05:18:55.553652328Z" level=info msg="CreateContainer within sandbox \"70c58ba97c14014ca900430f780adbfb4c8a813b6df8534abe0ea5d51ddcdce2\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"3ad1cbd9a1c91120a044083f702d54e26938a890596baa61b1c09258867d9ebe\""
Jul 15 05:18:55.554160 containerd[1743]: time="2025-07-15T05:18:55.554137780Z" level=info msg="StartContainer for \"3ad1cbd9a1c91120a044083f702d54e26938a890596baa61b1c09258867d9ebe\""
Jul 15 05:18:55.554852 containerd[1743]: time="2025-07-15T05:18:55.554819393Z" level=info msg="connecting to shim 3ad1cbd9a1c91120a044083f702d54e26938a890596baa61b1c09258867d9ebe" address="unix:///run/containerd/s/6687c23f86be99ddfc1dbc0c119587adbcfa3fe83cbbfe3bce20099470cae163" protocol=ttrpc version=3
Jul 15 05:18:55.573084 systemd[1]: Started cri-containerd-3ad1cbd9a1c91120a044083f702d54e26938a890596baa61b1c09258867d9ebe.scope - libcontainer container 3ad1cbd9a1c91120a044083f702d54e26938a890596baa61b1c09258867d9ebe.
Jul 15 05:18:55.595964 containerd[1743]: time="2025-07-15T05:18:55.595938339Z" level=info msg="StartContainer for \"3ad1cbd9a1c91120a044083f702d54e26938a890596baa61b1c09258867d9ebe\" returns successfully" Jul 15 05:18:55.791898 kubelet[3163]: I0715 05:18:55.791656 3163 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-5bf8dfcb4-8qvf7" podStartSLOduration=2.935497925 podStartE2EDuration="4.791637806s" podCreationTimestamp="2025-07-15 05:18:51 +0000 UTC" firstStartedPulling="2025-07-15 05:18:53.666875797 +0000 UTC m=+9.037870693" lastFinishedPulling="2025-07-15 05:18:55.523015678 +0000 UTC m=+10.894010574" observedRunningTime="2025-07-15 05:18:55.779769305 +0000 UTC m=+11.150764203" watchObservedRunningTime="2025-07-15 05:18:55.791637806 +0000 UTC m=+11.162632702" Jul 15 05:19:01.063921 sudo[2184]: pam_unix(sudo:session): session closed for user root Jul 15 05:19:01.164396 sshd[2183]: Connection closed by 10.200.16.10 port 51814 Jul 15 05:19:01.165115 sshd-session[2180]: pam_unix(sshd:session): session closed for user core Jul 15 05:19:01.170317 systemd-logind[1717]: Session 9 logged out. Waiting for processes to exit. Jul 15 05:19:01.171368 systemd[1]: sshd@6-10.200.8.4:22-10.200.16.10:51814.service: Deactivated successfully. Jul 15 05:19:01.174779 systemd[1]: session-9.scope: Deactivated successfully. Jul 15 05:19:01.174937 systemd[1]: session-9.scope: Consumed 2.954s CPU time, 222.2M memory peak. Jul 15 05:19:01.180827 systemd-logind[1717]: Removed session 9. Jul 15 05:19:06.026654 systemd[1]: Created slice kubepods-besteffort-pod9faceaa2_1232_400f_85d0_a081e3610126.slice - libcontainer container kubepods-besteffort-pod9faceaa2_1232_400f_85d0_a081e3610126.slice. 
Jul 15 05:19:06.035276 kubelet[3163]: I0715 05:19:06.035246 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9faceaa2-1232-400f-85d0-a081e3610126-tigera-ca-bundle\") pod \"calico-typha-b7d489c59-bb4nc\" (UID: \"9faceaa2-1232-400f-85d0-a081e3610126\") " pod="calico-system/calico-typha-b7d489c59-bb4nc" Jul 15 05:19:06.035526 kubelet[3163]: I0715 05:19:06.035285 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxck2\" (UniqueName: \"kubernetes.io/projected/9faceaa2-1232-400f-85d0-a081e3610126-kube-api-access-nxck2\") pod \"calico-typha-b7d489c59-bb4nc\" (UID: \"9faceaa2-1232-400f-85d0-a081e3610126\") " pod="calico-system/calico-typha-b7d489c59-bb4nc" Jul 15 05:19:06.035526 kubelet[3163]: I0715 05:19:06.035313 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/9faceaa2-1232-400f-85d0-a081e3610126-typha-certs\") pod \"calico-typha-b7d489c59-bb4nc\" (UID: \"9faceaa2-1232-400f-85d0-a081e3610126\") " pod="calico-system/calico-typha-b7d489c59-bb4nc" Jul 15 05:19:06.192544 systemd[1]: Created slice kubepods-besteffort-pod229887aa_90dd_46ed_be77_4f3d33f40e4c.slice - libcontainer container kubepods-besteffort-pod229887aa_90dd_46ed_be77_4f3d33f40e4c.slice. 
Jul 15 05:19:06.236252 kubelet[3163]: I0715 05:19:06.236189 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/229887aa-90dd-46ed-be77-4f3d33f40e4c-var-lib-calico\") pod \"calico-node-6cpwz\" (UID: \"229887aa-90dd-46ed-be77-4f3d33f40e4c\") " pod="calico-system/calico-node-6cpwz" Jul 15 05:19:06.236252 kubelet[3163]: I0715 05:19:06.236225 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/229887aa-90dd-46ed-be77-4f3d33f40e4c-cni-net-dir\") pod \"calico-node-6cpwz\" (UID: \"229887aa-90dd-46ed-be77-4f3d33f40e4c\") " pod="calico-system/calico-node-6cpwz" Jul 15 05:19:06.236252 kubelet[3163]: I0715 05:19:06.236242 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/229887aa-90dd-46ed-be77-4f3d33f40e4c-lib-modules\") pod \"calico-node-6cpwz\" (UID: \"229887aa-90dd-46ed-be77-4f3d33f40e4c\") " pod="calico-system/calico-node-6cpwz" Jul 15 05:19:06.236252 kubelet[3163]: I0715 05:19:06.236255 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/229887aa-90dd-46ed-be77-4f3d33f40e4c-policysync\") pod \"calico-node-6cpwz\" (UID: \"229887aa-90dd-46ed-be77-4f3d33f40e4c\") " pod="calico-system/calico-node-6cpwz" Jul 15 05:19:06.236421 kubelet[3163]: I0715 05:19:06.236269 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/229887aa-90dd-46ed-be77-4f3d33f40e4c-var-run-calico\") pod \"calico-node-6cpwz\" (UID: \"229887aa-90dd-46ed-be77-4f3d33f40e4c\") " pod="calico-system/calico-node-6cpwz" Jul 15 05:19:06.236421 kubelet[3163]: I0715 05:19:06.236282 3163 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/229887aa-90dd-46ed-be77-4f3d33f40e4c-tigera-ca-bundle\") pod \"calico-node-6cpwz\" (UID: \"229887aa-90dd-46ed-be77-4f3d33f40e4c\") " pod="calico-system/calico-node-6cpwz" Jul 15 05:19:06.236421 kubelet[3163]: I0715 05:19:06.236296 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/229887aa-90dd-46ed-be77-4f3d33f40e4c-node-certs\") pod \"calico-node-6cpwz\" (UID: \"229887aa-90dd-46ed-be77-4f3d33f40e4c\") " pod="calico-system/calico-node-6cpwz" Jul 15 05:19:06.236421 kubelet[3163]: I0715 05:19:06.236310 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/229887aa-90dd-46ed-be77-4f3d33f40e4c-xtables-lock\") pod \"calico-node-6cpwz\" (UID: \"229887aa-90dd-46ed-be77-4f3d33f40e4c\") " pod="calico-system/calico-node-6cpwz" Jul 15 05:19:06.236421 kubelet[3163]: I0715 05:19:06.236323 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jbx2\" (UniqueName: \"kubernetes.io/projected/229887aa-90dd-46ed-be77-4f3d33f40e4c-kube-api-access-2jbx2\") pod \"calico-node-6cpwz\" (UID: \"229887aa-90dd-46ed-be77-4f3d33f40e4c\") " pod="calico-system/calico-node-6cpwz" Jul 15 05:19:06.236519 kubelet[3163]: I0715 05:19:06.236341 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/229887aa-90dd-46ed-be77-4f3d33f40e4c-flexvol-driver-host\") pod \"calico-node-6cpwz\" (UID: \"229887aa-90dd-46ed-be77-4f3d33f40e4c\") " pod="calico-system/calico-node-6cpwz" Jul 15 05:19:06.236519 kubelet[3163]: I0715 05:19:06.236355 3163 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/229887aa-90dd-46ed-be77-4f3d33f40e4c-cni-bin-dir\") pod \"calico-node-6cpwz\" (UID: \"229887aa-90dd-46ed-be77-4f3d33f40e4c\") " pod="calico-system/calico-node-6cpwz" Jul 15 05:19:06.236519 kubelet[3163]: I0715 05:19:06.236370 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/229887aa-90dd-46ed-be77-4f3d33f40e4c-cni-log-dir\") pod \"calico-node-6cpwz\" (UID: \"229887aa-90dd-46ed-be77-4f3d33f40e4c\") " pod="calico-system/calico-node-6cpwz" Jul 15 05:19:06.330139 kubelet[3163]: E0715 05:19:06.330034 3163 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dg9p5" podUID="21f441b2-c7b8-4d3b-84a5-4e3e2ffc87ba" Jul 15 05:19:06.334726 containerd[1743]: time="2025-07-15T05:19:06.334690818Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-b7d489c59-bb4nc,Uid:9faceaa2-1232-400f-85d0-a081e3610126,Namespace:calico-system,Attempt:0,}" Jul 15 05:19:06.340531 kubelet[3163]: E0715 05:19:06.340093 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.340531 kubelet[3163]: W0715 05:19:06.340108 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.340531 kubelet[3163]: E0715 05:19:06.340131 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:06.342338 kubelet[3163]: E0715 05:19:06.342280 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.342471 kubelet[3163]: W0715 05:19:06.342460 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.342520 kubelet[3163]: E0715 05:19:06.342513 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:19:06.342843 kubelet[3163]: E0715 05:19:06.342749 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.342843 kubelet[3163]: W0715 05:19:06.342757 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.342843 kubelet[3163]: E0715 05:19:06.342765 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:06.345128 kubelet[3163]: E0715 05:19:06.345092 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.345128 kubelet[3163]: W0715 05:19:06.345126 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.345226 kubelet[3163]: E0715 05:19:06.345141 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:19:06.352023 kubelet[3163]: E0715 05:19:06.352006 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.352023 kubelet[3163]: W0715 05:19:06.352019 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.352370 kubelet[3163]: E0715 05:19:06.352034 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:06.359351 kubelet[3163]: E0715 05:19:06.359337 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.359448 kubelet[3163]: W0715 05:19:06.359414 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.359448 kubelet[3163]: E0715 05:19:06.359428 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:19:06.386576 containerd[1743]: time="2025-07-15T05:19:06.386541245Z" level=info msg="connecting to shim 986fbae175cf2eb714abc40e15f8075550513874d669d54f5cc824fe1cbd7aaa" address="unix:///run/containerd/s/4b7dfccad6f575f26d4343425b6289f60c84b56a5d708a10578ad54632ae0fb1" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:19:06.409106 systemd[1]: Started cri-containerd-986fbae175cf2eb714abc40e15f8075550513874d669d54f5cc824fe1cbd7aaa.scope - libcontainer container 986fbae175cf2eb714abc40e15f8075550513874d669d54f5cc824fe1cbd7aaa. Jul 15 05:19:06.422152 kubelet[3163]: E0715 05:19:06.422133 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.422152 kubelet[3163]: W0715 05:19:06.422150 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.422245 kubelet[3163]: E0715 05:19:06.422163 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:06.422321 kubelet[3163]: E0715 05:19:06.422313 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.422352 kubelet[3163]: W0715 05:19:06.422322 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.422352 kubelet[3163]: E0715 05:19:06.422330 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:19:06.422434 kubelet[3163]: E0715 05:19:06.422426 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.422454 kubelet[3163]: W0715 05:19:06.422433 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.422454 kubelet[3163]: E0715 05:19:06.422448 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:06.422541 kubelet[3163]: E0715 05:19:06.422534 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.422561 kubelet[3163]: W0715 05:19:06.422541 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.422561 kubelet[3163]: E0715 05:19:06.422546 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:19:06.422647 kubelet[3163]: E0715 05:19:06.422639 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.422647 kubelet[3163]: W0715 05:19:06.422647 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.422694 kubelet[3163]: E0715 05:19:06.422654 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:06.422747 kubelet[3163]: E0715 05:19:06.422737 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.422770 kubelet[3163]: W0715 05:19:06.422752 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.422770 kubelet[3163]: E0715 05:19:06.422758 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:19:06.422845 kubelet[3163]: E0715 05:19:06.422836 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.422867 kubelet[3163]: W0715 05:19:06.422845 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.422867 kubelet[3163]: E0715 05:19:06.422851 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:06.424105 kubelet[3163]: E0715 05:19:06.424080 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.424105 kubelet[3163]: W0715 05:19:06.424094 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.424105 kubelet[3163]: E0715 05:19:06.424106 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:19:06.424251 kubelet[3163]: E0715 05:19:06.424242 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.424276 kubelet[3163]: W0715 05:19:06.424252 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.424276 kubelet[3163]: E0715 05:19:06.424261 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:06.424352 kubelet[3163]: E0715 05:19:06.424345 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.424373 kubelet[3163]: W0715 05:19:06.424353 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.424373 kubelet[3163]: E0715 05:19:06.424359 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:19:06.424822 kubelet[3163]: E0715 05:19:06.424443 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.424822 kubelet[3163]: W0715 05:19:06.424448 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.424822 kubelet[3163]: E0715 05:19:06.424454 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:06.424822 kubelet[3163]: E0715 05:19:06.424539 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.424822 kubelet[3163]: W0715 05:19:06.424543 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.424822 kubelet[3163]: E0715 05:19:06.424550 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:19:06.425104 kubelet[3163]: E0715 05:19:06.425086 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.425130 kubelet[3163]: W0715 05:19:06.425106 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.425130 kubelet[3163]: E0715 05:19:06.425117 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:06.425694 kubelet[3163]: E0715 05:19:06.425679 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.425694 kubelet[3163]: W0715 05:19:06.425695 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.425767 kubelet[3163]: E0715 05:19:06.425704 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:19:06.425844 kubelet[3163]: E0715 05:19:06.425835 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.425863 kubelet[3163]: W0715 05:19:06.425845 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.425863 kubelet[3163]: E0715 05:19:06.425852 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:06.425963 kubelet[3163]: E0715 05:19:06.425946 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.426008 kubelet[3163]: W0715 05:19:06.426001 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.426033 kubelet[3163]: E0715 05:19:06.426011 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:19:06.426117 kubelet[3163]: E0715 05:19:06.426110 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.426137 kubelet[3163]: W0715 05:19:06.426117 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.426137 kubelet[3163]: E0715 05:19:06.426124 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:06.426215 kubelet[3163]: E0715 05:19:06.426208 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.426238 kubelet[3163]: W0715 05:19:06.426215 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.426238 kubelet[3163]: E0715 05:19:06.426221 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:19:06.426304 kubelet[3163]: E0715 05:19:06.426298 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.426322 kubelet[3163]: W0715 05:19:06.426304 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.426322 kubelet[3163]: E0715 05:19:06.426309 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:06.426390 kubelet[3163]: E0715 05:19:06.426384 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.426408 kubelet[3163]: W0715 05:19:06.426390 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.426408 kubelet[3163]: E0715 05:19:06.426396 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:19:06.438245 kubelet[3163]: E0715 05:19:06.438111 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.438245 kubelet[3163]: W0715 05:19:06.438128 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.438245 kubelet[3163]: E0715 05:19:06.438142 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:06.438245 kubelet[3163]: I0715 05:19:06.438167 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/21f441b2-c7b8-4d3b-84a5-4e3e2ffc87ba-socket-dir\") pod \"csi-node-driver-dg9p5\" (UID: \"21f441b2-c7b8-4d3b-84a5-4e3e2ffc87ba\") " pod="calico-system/csi-node-driver-dg9p5" Jul 15 05:19:06.438570 kubelet[3163]: E0715 05:19:06.438492 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.438570 kubelet[3163]: W0715 05:19:06.438503 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.438570 kubelet[3163]: E0715 05:19:06.438517 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:06.438570 kubelet[3163]: I0715 05:19:06.438535 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/21f441b2-c7b8-4d3b-84a5-4e3e2ffc87ba-kubelet-dir\") pod \"csi-node-driver-dg9p5\" (UID: \"21f441b2-c7b8-4d3b-84a5-4e3e2ffc87ba\") " pod="calico-system/csi-node-driver-dg9p5" Jul 15 05:19:06.438999 kubelet[3163]: E0715 05:19:06.438911 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.439523 kubelet[3163]: W0715 05:19:06.439102 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.439764 kubelet[3163]: E0715 05:19:06.439615 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:06.439764 kubelet[3163]: I0715 05:19:06.439642 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/21f441b2-c7b8-4d3b-84a5-4e3e2ffc87ba-registration-dir\") pod \"csi-node-driver-dg9p5\" (UID: \"21f441b2-c7b8-4d3b-84a5-4e3e2ffc87ba\") " pod="calico-system/csi-node-driver-dg9p5" Jul 15 05:19:06.441002 kubelet[3163]: E0715 05:19:06.440940 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.441106 kubelet[3163]: W0715 05:19:06.441089 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.441245 kubelet[3163]: E0715 05:19:06.441236 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:19:06.442263 kubelet[3163]: E0715 05:19:06.442191 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.442263 kubelet[3163]: W0715 05:19:06.442203 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.442541 kubelet[3163]: E0715 05:19:06.442519 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:06.443047 kubelet[3163]: E0715 05:19:06.443029 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.443418 kubelet[3163]: W0715 05:19:06.443195 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.443418 kubelet[3163]: E0715 05:19:06.443237 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:19:06.443698 kubelet[3163]: E0715 05:19:06.443676 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.443698 kubelet[3163]: W0715 05:19:06.443691 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.443848 kubelet[3163]: E0715 05:19:06.443793 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:06.443848 kubelet[3163]: I0715 05:19:06.443816 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/21f441b2-c7b8-4d3b-84a5-4e3e2ffc87ba-varrun\") pod \"csi-node-driver-dg9p5\" (UID: \"21f441b2-c7b8-4d3b-84a5-4e3e2ffc87ba\") " pod="calico-system/csi-node-driver-dg9p5" Jul 15 05:19:06.444018 kubelet[3163]: E0715 05:19:06.443943 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.444192 kubelet[3163]: W0715 05:19:06.443950 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.444345 kubelet[3163]: E0715 05:19:06.444283 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:19:06.444644 kubelet[3163]: E0715 05:19:06.444627 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.444644 kubelet[3163]: W0715 05:19:06.444640 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.444812 kubelet[3163]: E0715 05:19:06.444651 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:06.445360 kubelet[3163]: E0715 05:19:06.445328 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.445360 kubelet[3163]: W0715 05:19:06.445342 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.445446 kubelet[3163]: E0715 05:19:06.445438 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:19:06.445946 kubelet[3163]: E0715 05:19:06.445911 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.445946 kubelet[3163]: W0715 05:19:06.445925 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.445946 kubelet[3163]: E0715 05:19:06.445939 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:06.446901 kubelet[3163]: E0715 05:19:06.446881 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.446901 kubelet[3163]: W0715 05:19:06.446900 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.447123 kubelet[3163]: E0715 05:19:06.446914 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:19:06.447123 kubelet[3163]: I0715 05:19:06.446933 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwp9q\" (UniqueName: \"kubernetes.io/projected/21f441b2-c7b8-4d3b-84a5-4e3e2ffc87ba-kube-api-access-nwp9q\") pod \"csi-node-driver-dg9p5\" (UID: \"21f441b2-c7b8-4d3b-84a5-4e3e2ffc87ba\") " pod="calico-system/csi-node-driver-dg9p5" Jul 15 05:19:06.447421 kubelet[3163]: E0715 05:19:06.447294 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.447421 kubelet[3163]: W0715 05:19:06.447306 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.447421 kubelet[3163]: E0715 05:19:06.447318 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:06.447780 kubelet[3163]: E0715 05:19:06.447758 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.447780 kubelet[3163]: W0715 05:19:06.447773 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.447935 kubelet[3163]: E0715 05:19:06.447785 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:19:06.448830 kubelet[3163]: E0715 05:19:06.448813 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.448830 kubelet[3163]: W0715 05:19:06.448832 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.449011 kubelet[3163]: E0715 05:19:06.448845 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:06.458125 containerd[1743]: time="2025-07-15T05:19:06.458054125Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-b7d489c59-bb4nc,Uid:9faceaa2-1232-400f-85d0-a081e3610126,Namespace:calico-system,Attempt:0,} returns sandbox id \"986fbae175cf2eb714abc40e15f8075550513874d669d54f5cc824fe1cbd7aaa\"" Jul 15 05:19:06.459991 containerd[1743]: time="2025-07-15T05:19:06.459950456Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\"" Jul 15 05:19:06.495762 containerd[1743]: time="2025-07-15T05:19:06.495734069Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-6cpwz,Uid:229887aa-90dd-46ed-be77-4f3d33f40e4c,Namespace:calico-system,Attempt:0,}" Jul 15 05:19:06.541868 containerd[1743]: time="2025-07-15T05:19:06.541587933Z" level=info msg="connecting to shim 7940b31810c2c207cea4ec299ae479317d5ed7f35a45965872ebb4a35d3da4a7" address="unix:///run/containerd/s/ed17e44c79ee9b68cc57a1a230caf47d288758c8d3bd1ce9d5ab079116fff63f" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:19:06.547986 kubelet[3163]: E0715 05:19:06.547951 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.547986 kubelet[3163]: W0715 05:19:06.547983 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.548082 kubelet[3163]: E0715 05:19:06.547997 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:06.548384 kubelet[3163]: E0715 05:19:06.548374 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.548519 kubelet[3163]: W0715 05:19:06.548437 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.548519 kubelet[3163]: E0715 05:19:06.548454 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:19:06.548730 kubelet[3163]: E0715 05:19:06.548685 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.548730 kubelet[3163]: W0715 05:19:06.548694 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.548730 kubelet[3163]: E0715 05:19:06.548708 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:06.549505 kubelet[3163]: E0715 05:19:06.549488 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.549505 kubelet[3163]: W0715 05:19:06.549501 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.549606 kubelet[3163]: E0715 05:19:06.549517 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:19:06.549693 kubelet[3163]: E0715 05:19:06.549682 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.549693 kubelet[3163]: W0715 05:19:06.549691 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.549783 kubelet[3163]: E0715 05:19:06.549763 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:06.549821 kubelet[3163]: E0715 05:19:06.549789 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.549821 kubelet[3163]: W0715 05:19:06.549793 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.549860 kubelet[3163]: E0715 05:19:06.549841 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:19:06.549931 kubelet[3163]: E0715 05:19:06.549920 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.549931 kubelet[3163]: W0715 05:19:06.549927 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.550044 kubelet[3163]: E0715 05:19:06.549939 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:06.550094 kubelet[3163]: E0715 05:19:06.550084 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.550094 kubelet[3163]: W0715 05:19:06.550093 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.551038 kubelet[3163]: E0715 05:19:06.551019 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:19:06.551193 kubelet[3163]: E0715 05:19:06.551184 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.551237 kubelet[3163]: W0715 05:19:06.551194 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.551303 kubelet[3163]: E0715 05:19:06.551270 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:06.551329 kubelet[3163]: E0715 05:19:06.551305 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.551329 kubelet[3163]: W0715 05:19:06.551310 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.551409 kubelet[3163]: E0715 05:19:06.551357 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:19:06.551409 kubelet[3163]: E0715 05:19:06.551406 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.551501 kubelet[3163]: W0715 05:19:06.551411 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.551501 kubelet[3163]: E0715 05:19:06.551485 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:06.551545 kubelet[3163]: E0715 05:19:06.551509 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.551545 kubelet[3163]: W0715 05:19:06.551513 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.551589 kubelet[3163]: E0715 05:19:06.551583 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:19:06.551691 kubelet[3163]: E0715 05:19:06.551660 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.551691 kubelet[3163]: W0715 05:19:06.551669 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.551691 kubelet[3163]: E0715 05:19:06.551682 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:06.551984 kubelet[3163]: E0715 05:19:06.551873 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.551984 kubelet[3163]: W0715 05:19:06.551882 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.551984 kubelet[3163]: E0715 05:19:06.551894 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:19:06.552236 kubelet[3163]: E0715 05:19:06.552181 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.552236 kubelet[3163]: W0715 05:19:06.552190 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.552236 kubelet[3163]: E0715 05:19:06.552201 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:06.552423 kubelet[3163]: E0715 05:19:06.552412 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.552449 kubelet[3163]: W0715 05:19:06.552423 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.552449 kubelet[3163]: E0715 05:19:06.552434 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:19:06.552550 kubelet[3163]: E0715 05:19:06.552539 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.552550 kubelet[3163]: W0715 05:19:06.552547 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.552592 kubelet[3163]: E0715 05:19:06.552559 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:06.552696 kubelet[3163]: E0715 05:19:06.552689 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.552719 kubelet[3163]: W0715 05:19:06.552696 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.552778 kubelet[3163]: E0715 05:19:06.552766 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:19:06.552839 kubelet[3163]: E0715 05:19:06.552793 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.552839 kubelet[3163]: W0715 05:19:06.552839 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.552919 kubelet[3163]: E0715 05:19:06.552912 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:06.553095 kubelet[3163]: E0715 05:19:06.553083 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.553137 kubelet[3163]: W0715 05:19:06.553095 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.553137 kubelet[3163]: E0715 05:19:06.553109 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:19:06.553224 kubelet[3163]: E0715 05:19:06.553211 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.553224 kubelet[3163]: W0715 05:19:06.553221 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.553263 kubelet[3163]: E0715 05:19:06.553234 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:06.553436 kubelet[3163]: E0715 05:19:06.553398 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.553436 kubelet[3163]: W0715 05:19:06.553405 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.553495 kubelet[3163]: E0715 05:19:06.553442 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:19:06.553980 kubelet[3163]: E0715 05:19:06.553927 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.553980 kubelet[3163]: W0715 05:19:06.553939 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.553980 kubelet[3163]: E0715 05:19:06.553949 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:06.554254 kubelet[3163]: E0715 05:19:06.554184 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.554254 kubelet[3163]: W0715 05:19:06.554193 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.554254 kubelet[3163]: E0715 05:19:06.554204 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:19:06.554699 kubelet[3163]: E0715 05:19:06.554516 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.554699 kubelet[3163]: W0715 05:19:06.554527 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.554699 kubelet[3163]: E0715 05:19:06.554536 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:06.562054 kubelet[3163]: E0715 05:19:06.562032 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:06.562054 kubelet[3163]: W0715 05:19:06.562046 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:06.562054 kubelet[3163]: E0715 05:19:06.562057 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:19:06.569247 systemd[1]: Started cri-containerd-7940b31810c2c207cea4ec299ae479317d5ed7f35a45965872ebb4a35d3da4a7.scope - libcontainer container 7940b31810c2c207cea4ec299ae479317d5ed7f35a45965872ebb4a35d3da4a7. Jul 15 05:19:06.593619 containerd[1743]: time="2025-07-15T05:19:06.593597914Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-6cpwz,Uid:229887aa-90dd-46ed-be77-4f3d33f40e4c,Namespace:calico-system,Attempt:0,} returns sandbox id \"7940b31810c2c207cea4ec299ae479317d5ed7f35a45965872ebb4a35d3da4a7\"" Jul 15 05:19:07.616045 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1531535675.mount: Deactivated successfully. 
Jul 15 05:19:07.706270 kubelet[3163]: E0715 05:19:07.706212 3163 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dg9p5" podUID="21f441b2-c7b8-4d3b-84a5-4e3e2ffc87ba" Jul 15 05:19:08.232607 containerd[1743]: time="2025-07-15T05:19:08.232564751Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:19:08.235342 containerd[1743]: time="2025-07-15T05:19:08.235304890Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.2: active requests=0, bytes read=35233364" Jul 15 05:19:08.237947 containerd[1743]: time="2025-07-15T05:19:08.237909882Z" level=info msg="ImageCreate event name:\"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:19:08.240846 containerd[1743]: time="2025-07-15T05:19:08.240802038Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:19:08.241294 containerd[1743]: time="2025-07-15T05:19:08.241191433Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.2\" with image id \"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\", size \"35233218\" in 1.781192279s" Jul 15 05:19:08.241294 containerd[1743]: time="2025-07-15T05:19:08.241217777Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\" returns image reference 
\"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\"" Jul 15 05:19:08.242741 containerd[1743]: time="2025-07-15T05:19:08.242682635Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\"" Jul 15 05:19:08.258026 containerd[1743]: time="2025-07-15T05:19:08.258005749Z" level=info msg="CreateContainer within sandbox \"986fbae175cf2eb714abc40e15f8075550513874d669d54f5cc824fe1cbd7aaa\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jul 15 05:19:08.276150 containerd[1743]: time="2025-07-15T05:19:08.276128295Z" level=info msg="Container 029795e2d14f4ca7e71831c59ddcf3feadcf8975acf1c745b11541ff89b0b27c: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:19:08.280835 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2709815278.mount: Deactivated successfully. Jul 15 05:19:08.291402 containerd[1743]: time="2025-07-15T05:19:08.291377855Z" level=info msg="CreateContainer within sandbox \"986fbae175cf2eb714abc40e15f8075550513874d669d54f5cc824fe1cbd7aaa\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"029795e2d14f4ca7e71831c59ddcf3feadcf8975acf1c745b11541ff89b0b27c\"" Jul 15 05:19:08.291894 containerd[1743]: time="2025-07-15T05:19:08.291827035Z" level=info msg="StartContainer for \"029795e2d14f4ca7e71831c59ddcf3feadcf8975acf1c745b11541ff89b0b27c\"" Jul 15 05:19:08.292900 containerd[1743]: time="2025-07-15T05:19:08.292875372Z" level=info msg="connecting to shim 029795e2d14f4ca7e71831c59ddcf3feadcf8975acf1c745b11541ff89b0b27c" address="unix:///run/containerd/s/4b7dfccad6f575f26d4343425b6289f60c84b56a5d708a10578ad54632ae0fb1" protocol=ttrpc version=3 Jul 15 05:19:08.307114 systemd[1]: Started cri-containerd-029795e2d14f4ca7e71831c59ddcf3feadcf8975acf1c745b11541ff89b0b27c.scope - libcontainer container 029795e2d14f4ca7e71831c59ddcf3feadcf8975acf1c745b11541ff89b0b27c. 
Jul 15 05:19:08.348235 containerd[1743]: time="2025-07-15T05:19:08.348192457Z" level=info msg="StartContainer for \"029795e2d14f4ca7e71831c59ddcf3feadcf8975acf1c745b11541ff89b0b27c\" returns successfully" Jul 15 05:19:08.809880 kubelet[3163]: I0715 05:19:08.809673 3163 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-b7d489c59-bb4nc" podStartSLOduration=2.0272031679999998 podStartE2EDuration="3.809656042s" podCreationTimestamp="2025-07-15 05:19:05 +0000 UTC" firstStartedPulling="2025-07-15 05:19:06.459401462 +0000 UTC m=+21.830396364" lastFinishedPulling="2025-07-15 05:19:08.241854353 +0000 UTC m=+23.612849238" observedRunningTime="2025-07-15 05:19:08.809229532 +0000 UTC m=+24.180224430" watchObservedRunningTime="2025-07-15 05:19:08.809656042 +0000 UTC m=+24.180650943" Jul 15 05:19:08.842659 kubelet[3163]: E0715 05:19:08.842636 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:08.842659 kubelet[3163]: W0715 05:19:08.842654 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:08.842778 kubelet[3163]: E0715 05:19:08.842671 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:08.866317 kubelet[3163]: E0715 05:19:08.866280 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:09.707115 kubelet[3163]: E0715 05:19:09.707071 3163 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dg9p5" podUID="21f441b2-c7b8-4d3b-84a5-4e3e2ffc87ba" Jul 15 05:19:09.852902 kubelet[3163]: E0715 05:19:09.852869 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:09.852902 kubelet[3163]: W0715 05:19:09.852888 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:09.852902 kubelet[3163]: E0715 05:19:09.852907 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:19:09.853391 kubelet[3163]: E0715 05:19:09.853024 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:09.853391 kubelet[3163]: W0715 05:19:09.853030 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:09.853391 kubelet[3163]: E0715 05:19:09.853039 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:09.871127 kubelet[3163]: E0715 05:19:09.871076 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:09.871163 kubelet[3163]: E0715 05:19:09.871157 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:09.871163 kubelet[3163]: W0715 05:19:09.871162 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:09.871207 kubelet[3163]: E0715 05:19:09.871170 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:19:09.871336 kubelet[3163]: E0715 05:19:09.871313 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:09.871336 kubelet[3163]: W0715 05:19:09.871334 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:09.871384 kubelet[3163]: E0715 05:19:09.871343 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:09.871608 kubelet[3163]: E0715 05:19:09.871589 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:09.871635 kubelet[3163]: W0715 05:19:09.871613 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:09.871635 kubelet[3163]: E0715 05:19:09.871622 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:19:09.871744 kubelet[3163]: E0715 05:19:09.871736 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:09.871744 kubelet[3163]: W0715 05:19:09.871743 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:09.871800 kubelet[3163]: E0715 05:19:09.871749 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:09.871892 kubelet[3163]: E0715 05:19:09.871869 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:09.871892 kubelet[3163]: W0715 05:19:09.871889 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:09.871980 kubelet[3163]: E0715 05:19:09.871972 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:19:09.872027 kubelet[3163]: E0715 05:19:09.872012 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:09.872027 kubelet[3163]: W0715 05:19:09.872020 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:09.872165 kubelet[3163]: E0715 05:19:09.872101 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:09.872165 kubelet[3163]: W0715 05:19:09.872108 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:09.872165 kubelet[3163]: E0715 05:19:09.872112 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:09.872165 kubelet[3163]: E0715 05:19:09.872116 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:19:09.872318 kubelet[3163]: E0715 05:19:09.872295 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:09.872318 kubelet[3163]: W0715 05:19:09.872316 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:09.872382 kubelet[3163]: E0715 05:19:09.872325 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:19:09.872424 kubelet[3163]: E0715 05:19:09.872418 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:09.872446 kubelet[3163]: W0715 05:19:09.872425 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:09.872446 kubelet[3163]: E0715 05:19:09.872432 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:09.872572 kubelet[3163]: E0715 05:19:09.872552 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:09.872604 kubelet[3163]: W0715 05:19:09.872573 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:09.872604 kubelet[3163]: E0715 05:19:09.872590 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:19:09.872918 kubelet[3163]: E0715 05:19:09.872761 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:09.872918 kubelet[3163]: W0715 05:19:09.872770 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:09.872918 kubelet[3163]: E0715 05:19:09.872778 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:09.873157 kubelet[3163]: E0715 05:19:09.873129 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:09.873187 kubelet[3163]: W0715 05:19:09.873157 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:09.873187 kubelet[3163]: E0715 05:19:09.873170 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:19:09.873297 kubelet[3163]: E0715 05:19:09.873271 3163 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:19:09.873297 kubelet[3163]: W0715 05:19:09.873294 3163 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:19:09.873349 kubelet[3163]: E0715 05:19:09.873308 3163 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:19:10.302643 containerd[1743]: time="2025-07-15T05:19:10.302187366Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:19:10.305005 containerd[1743]: time="2025-07-15T05:19:10.304973847Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2: active requests=0, bytes read=4446956" Jul 15 05:19:10.307686 containerd[1743]: time="2025-07-15T05:19:10.307653474Z" level=info msg="ImageCreate event name:\"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:19:10.312230 containerd[1743]: time="2025-07-15T05:19:10.312203218Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:19:10.312786 containerd[1743]: time="2025-07-15T05:19:10.312526981Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" with image id \"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\", size \"5939619\" in 2.069812508s" Jul 15 05:19:10.312786 containerd[1743]: time="2025-07-15T05:19:10.312553639Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" returns image reference \"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\"" Jul 15 05:19:10.314620 containerd[1743]: time="2025-07-15T05:19:10.314594774Z" level=info msg="CreateContainer within sandbox \"7940b31810c2c207cea4ec299ae479317d5ed7f35a45965872ebb4a35d3da4a7\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jul 15 05:19:10.329532 containerd[1743]: time="2025-07-15T05:19:10.329491571Z" level=info msg="Container 0cd2d5b120d46a27daf21b97d68370102b4c7c348fc607c160534bce58c55039: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:19:10.348309 containerd[1743]: time="2025-07-15T05:19:10.348279215Z" level=info msg="CreateContainer within sandbox \"7940b31810c2c207cea4ec299ae479317d5ed7f35a45965872ebb4a35d3da4a7\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"0cd2d5b120d46a27daf21b97d68370102b4c7c348fc607c160534bce58c55039\"" Jul 15 05:19:10.348700 containerd[1743]: time="2025-07-15T05:19:10.348653435Z" level=info msg="StartContainer for \"0cd2d5b120d46a27daf21b97d68370102b4c7c348fc607c160534bce58c55039\"" Jul 15 05:19:10.350129 containerd[1743]: time="2025-07-15T05:19:10.350066513Z" level=info msg="connecting to shim 0cd2d5b120d46a27daf21b97d68370102b4c7c348fc607c160534bce58c55039" address="unix:///run/containerd/s/ed17e44c79ee9b68cc57a1a230caf47d288758c8d3bd1ce9d5ab079116fff63f" protocol=ttrpc version=3 Jul 15 05:19:10.372103 systemd[1]: Started cri-containerd-0cd2d5b120d46a27daf21b97d68370102b4c7c348fc607c160534bce58c55039.scope - libcontainer container 0cd2d5b120d46a27daf21b97d68370102b4c7c348fc607c160534bce58c55039. Jul 15 05:19:10.404523 containerd[1743]: time="2025-07-15T05:19:10.404452636Z" level=info msg="StartContainer for \"0cd2d5b120d46a27daf21b97d68370102b4c7c348fc607c160534bce58c55039\" returns successfully" Jul 15 05:19:10.407892 systemd[1]: cri-containerd-0cd2d5b120d46a27daf21b97d68370102b4c7c348fc607c160534bce58c55039.scope: Deactivated successfully. 
Jul 15 05:19:10.410281 containerd[1743]: time="2025-07-15T05:19:10.410255987Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0cd2d5b120d46a27daf21b97d68370102b4c7c348fc607c160534bce58c55039\" id:\"0cd2d5b120d46a27daf21b97d68370102b4c7c348fc607c160534bce58c55039\" pid:3861 exited_at:{seconds:1752556750 nanos:409370059}"
Jul 15 05:19:10.411417 containerd[1743]: time="2025-07-15T05:19:10.410407778Z" level=info msg="received exit event container_id:\"0cd2d5b120d46a27daf21b97d68370102b4c7c348fc607c160534bce58c55039\" id:\"0cd2d5b120d46a27daf21b97d68370102b4c7c348fc607c160534bce58c55039\" pid:3861 exited_at:{seconds:1752556750 nanos:409370059}"
Jul 15 05:19:10.431627 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0cd2d5b120d46a27daf21b97d68370102b4c7c348fc607c160534bce58c55039-rootfs.mount: Deactivated successfully.
Jul 15 05:19:11.706740 kubelet[3163]: E0715 05:19:11.706694 3163 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dg9p5" podUID="21f441b2-c7b8-4d3b-84a5-4e3e2ffc87ba"
Jul 15 05:19:12.808975 containerd[1743]: time="2025-07-15T05:19:12.808718522Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\""
Jul 15 05:19:13.706657 kubelet[3163]: E0715 05:19:13.706588 3163 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dg9p5" podUID="21f441b2-c7b8-4d3b-84a5-4e3e2ffc87ba"
Jul 15 05:19:15.706858 kubelet[3163]: E0715 05:19:15.706816 3163 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dg9p5" podUID="21f441b2-c7b8-4d3b-84a5-4e3e2ffc87ba"
Jul 15 05:19:16.380633 containerd[1743]: time="2025-07-15T05:19:16.380585450Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 05:19:16.383078 containerd[1743]: time="2025-07-15T05:19:16.383036559Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.2: active requests=0, bytes read=70436221"
Jul 15 05:19:16.385736 containerd[1743]: time="2025-07-15T05:19:16.385698911Z" level=info msg="ImageCreate event name:\"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 05:19:16.389004 containerd[1743]: time="2025-07-15T05:19:16.388967232Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 05:19:16.389395 containerd[1743]: time="2025-07-15T05:19:16.389266446Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.2\" with image id \"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\", size \"71928924\" in 3.580501775s"
Jul 15 05:19:16.389395 containerd[1743]: time="2025-07-15T05:19:16.389295044Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\" returns image reference \"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\""
Jul 15 05:19:16.391127 containerd[1743]: time="2025-07-15T05:19:16.391101405Z" level=info msg="CreateContainer within sandbox \"7940b31810c2c207cea4ec299ae479317d5ed7f35a45965872ebb4a35d3da4a7\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Jul 15 05:19:16.404487 containerd[1743]: time="2025-07-15T05:19:16.404417757Z" level=info msg="Container f610964fe8afaa4169d39d998651bedd447b6526ef0e2f3a39f5407a1561db70: CDI devices from CRI Config.CDIDevices: []"
Jul 15 05:19:16.423045 containerd[1743]: time="2025-07-15T05:19:16.423021097Z" level=info msg="CreateContainer within sandbox \"7940b31810c2c207cea4ec299ae479317d5ed7f35a45965872ebb4a35d3da4a7\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"f610964fe8afaa4169d39d998651bedd447b6526ef0e2f3a39f5407a1561db70\""
Jul 15 05:19:16.423475 containerd[1743]: time="2025-07-15T05:19:16.423420227Z" level=info msg="StartContainer for \"f610964fe8afaa4169d39d998651bedd447b6526ef0e2f3a39f5407a1561db70\""
Jul 15 05:19:16.424780 containerd[1743]: time="2025-07-15T05:19:16.424754926Z" level=info msg="connecting to shim f610964fe8afaa4169d39d998651bedd447b6526ef0e2f3a39f5407a1561db70" address="unix:///run/containerd/s/ed17e44c79ee9b68cc57a1a230caf47d288758c8d3bd1ce9d5ab079116fff63f" protocol=ttrpc version=3
Jul 15 05:19:16.451100 systemd[1]: Started cri-containerd-f610964fe8afaa4169d39d998651bedd447b6526ef0e2f3a39f5407a1561db70.scope - libcontainer container f610964fe8afaa4169d39d998651bedd447b6526ef0e2f3a39f5407a1561db70.
Jul 15 05:19:16.484549 containerd[1743]: time="2025-07-15T05:19:16.484522885Z" level=info msg="StartContainer for \"f610964fe8afaa4169d39d998651bedd447b6526ef0e2f3a39f5407a1561db70\" returns successfully"
Jul 15 05:19:17.706743 kubelet[3163]: E0715 05:19:17.706617 3163 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dg9p5" podUID="21f441b2-c7b8-4d3b-84a5-4e3e2ffc87ba"
Jul 15 05:19:17.798464 systemd[1]: cri-containerd-f610964fe8afaa4169d39d998651bedd447b6526ef0e2f3a39f5407a1561db70.scope: Deactivated successfully.
Jul 15 05:19:17.799024 systemd[1]: cri-containerd-f610964fe8afaa4169d39d998651bedd447b6526ef0e2f3a39f5407a1561db70.scope: Consumed 377ms CPU time, 194.3M memory peak, 171.2M written to disk.
Jul 15 05:19:17.799613 containerd[1743]: time="2025-07-15T05:19:17.799195166Z" level=info msg="received exit event container_id:\"f610964fe8afaa4169d39d998651bedd447b6526ef0e2f3a39f5407a1561db70\" id:\"f610964fe8afaa4169d39d998651bedd447b6526ef0e2f3a39f5407a1561db70\" pid:3920 exited_at:{seconds:1752556757 nanos:798920240}"
Jul 15 05:19:17.800196 containerd[1743]: time="2025-07-15T05:19:17.799801878Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f610964fe8afaa4169d39d998651bedd447b6526ef0e2f3a39f5407a1561db70\" id:\"f610964fe8afaa4169d39d998651bedd447b6526ef0e2f3a39f5407a1561db70\" pid:3920 exited_at:{seconds:1752556757 nanos:798920240}"
Jul 15 05:19:17.817564 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f610964fe8afaa4169d39d998651bedd447b6526ef0e2f3a39f5407a1561db70-rootfs.mount: Deactivated successfully.
Jul 15 05:19:17.841315 kubelet[3163]: I0715 05:19:17.840608 3163 kubelet_node_status.go:488] "Fast updating node status as it just became ready"
Jul 15 05:19:17.882904 systemd[1]: Created slice kubepods-burstable-podd0441816_eb8b_4adf_9c57_452d4b3a9e9f.slice - libcontainer container kubepods-burstable-podd0441816_eb8b_4adf_9c57_452d4b3a9e9f.slice.
Jul 15 05:19:17.894580 kubelet[3163]: W0715 05:19:17.894248 3163 reflector.go:561] object-"calico-apiserver"/"calico-apiserver-certs": failed to list *v1.Secret: secrets "calico-apiserver-certs" is forbidden: User "system:node:ci-4396.0.0-n-1e5a06c7e3" cannot list resource "secrets" in API group "" in the namespace "calico-apiserver": no relationship found between node 'ci-4396.0.0-n-1e5a06c7e3' and this object
Jul 15 05:19:17.894580 kubelet[3163]: E0715 05:19:17.894283 3163 reflector.go:158] "Unhandled Error" err="object-\"calico-apiserver\"/\"calico-apiserver-certs\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"calico-apiserver-certs\" is forbidden: User \"system:node:ci-4396.0.0-n-1e5a06c7e3\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'ci-4396.0.0-n-1e5a06c7e3' and this object" logger="UnhandledError"
Jul 15 05:19:17.894580 kubelet[3163]: W0715 05:19:17.894331 3163 reflector.go:561] object-"calico-system"/"goldmane-key-pair": failed to list *v1.Secret: secrets "goldmane-key-pair" is forbidden: User "system:node:ci-4396.0.0-n-1e5a06c7e3" cannot list resource "secrets" in API group "" in the namespace "calico-system": no relationship found between node 'ci-4396.0.0-n-1e5a06c7e3' and this object
Jul 15 05:19:17.894580 kubelet[3163]: E0715 05:19:17.894340 3163 reflector.go:158] "Unhandled Error" err="object-\"calico-system\"/\"goldmane-key-pair\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"goldmane-key-pair\" is forbidden: User \"system:node:ci-4396.0.0-n-1e5a06c7e3\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4396.0.0-n-1e5a06c7e3' and this object" logger="UnhandledError"
Jul 15 05:19:17.896475 systemd[1]: Created slice kubepods-besteffort-pod9c7ce63e_afea_4962_bead_e12dc034d4c7.slice - libcontainer container kubepods-besteffort-pod9c7ce63e_afea_4962_bead_e12dc034d4c7.slice.
Jul 15 05:19:17.899344 kubelet[3163]: W0715 05:19:17.899252 3163 reflector.go:561] object-"calico-system"/"goldmane-ca-bundle": failed to list *v1.ConfigMap: configmaps "goldmane-ca-bundle" is forbidden: User "system:node:ci-4396.0.0-n-1e5a06c7e3" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'ci-4396.0.0-n-1e5a06c7e3' and this object
Jul 15 05:19:17.901974 kubelet[3163]: E0715 05:19:17.899424 3163 reflector.go:158] "Unhandled Error" err="object-\"calico-system\"/\"goldmane-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"goldmane-ca-bundle\" is forbidden: User \"system:node:ci-4396.0.0-n-1e5a06c7e3\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4396.0.0-n-1e5a06c7e3' and this object" logger="UnhandledError"
Jul 15 05:19:17.906358 systemd[1]: Created slice kubepods-burstable-pod7cbdf8eb_12a8_45a5_adbf_c095ae49accc.slice - libcontainer container kubepods-burstable-pod7cbdf8eb_12a8_45a5_adbf_c095ae49accc.slice.
Jul 15 05:19:17.914604 systemd[1]: Created slice kubepods-besteffort-pod89a2e3cd_7a2d_4602_b983_f5c595a6dead.slice - libcontainer container kubepods-besteffort-pod89a2e3cd_7a2d_4602_b983_f5c595a6dead.slice.
Jul 15 05:19:17.918381 kubelet[3163]: I0715 05:19:17.918357 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp6wd\" (UniqueName: \"kubernetes.io/projected/7cbdf8eb-12a8-45a5-adbf-c095ae49accc-kube-api-access-cp6wd\") pod \"coredns-7c65d6cfc9-c5cfl\" (UID: \"7cbdf8eb-12a8-45a5-adbf-c095ae49accc\") " pod="kube-system/coredns-7c65d6cfc9-c5cfl"
Jul 15 05:19:17.918459 kubelet[3163]: I0715 05:19:17.918388 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c7ce63e-afea-4962-bead-e12dc034d4c7-tigera-ca-bundle\") pod \"calico-kube-controllers-fbd55b5d6-zlcp7\" (UID: \"9c7ce63e-afea-4962-bead-e12dc034d4c7\") " pod="calico-system/calico-kube-controllers-fbd55b5d6-zlcp7"
Jul 15 05:19:17.918459 kubelet[3163]: I0715 05:19:17.918406 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5jhd\" (UniqueName: \"kubernetes.io/projected/9c7ce63e-afea-4962-bead-e12dc034d4c7-kube-api-access-c5jhd\") pod \"calico-kube-controllers-fbd55b5d6-zlcp7\" (UID: \"9c7ce63e-afea-4962-bead-e12dc034d4c7\") " pod="calico-system/calico-kube-controllers-fbd55b5d6-zlcp7"
Jul 15 05:19:17.918459 kubelet[3163]: I0715 05:19:17.918423 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/89a2e3cd-7a2d-4602-b983-f5c595a6dead-calico-apiserver-certs\") pod \"calico-apiserver-796c4d775c-wrw9j\" (UID: \"89a2e3cd-7a2d-4602-b983-f5c595a6dead\") " pod="calico-apiserver/calico-apiserver-796c4d775c-wrw9j"
Jul 15 05:19:17.918459 kubelet[3163]: I0715 05:19:17.918440 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/bdb5035e-0d08-4b3f-8e08-97dc3e8ae8b7-whisker-backend-key-pair\") pod \"whisker-f987b6b86-zq6pf\" (UID: \"bdb5035e-0d08-4b3f-8e08-97dc3e8ae8b7\") " pod="calico-system/whisker-f987b6b86-zq6pf"
Jul 15 05:19:17.918459 kubelet[3163]: I0715 05:19:17.918455 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7cbdf8eb-12a8-45a5-adbf-c095ae49accc-config-volume\") pod \"coredns-7c65d6cfc9-c5cfl\" (UID: \"7cbdf8eb-12a8-45a5-adbf-c095ae49accc\") " pod="kube-system/coredns-7c65d6cfc9-c5cfl"
Jul 15 05:19:17.918577 kubelet[3163]: I0715 05:19:17.918471 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6td99\" (UniqueName: \"kubernetes.io/projected/07ac1f5f-59ee-45e7-8079-7306d9542bfe-kube-api-access-6td99\") pod \"calico-apiserver-796c4d775c-b6jrw\" (UID: \"07ac1f5f-59ee-45e7-8079-7306d9542bfe\") " pod="calico-apiserver/calico-apiserver-796c4d775c-b6jrw"
Jul 15 05:19:17.918577 kubelet[3163]: I0715 05:19:17.918489 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a75b180-3ce9-4020-99a4-ab3fc3f07d66-config\") pod \"goldmane-58fd7646b9-sxqg6\" (UID: \"9a75b180-3ce9-4020-99a4-ab3fc3f07d66\") " pod="calico-system/goldmane-58fd7646b9-sxqg6"
Jul 15 05:19:17.918577 kubelet[3163]: I0715 05:19:17.918506 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bplg5\" (UniqueName: \"kubernetes.io/projected/bdb5035e-0d08-4b3f-8e08-97dc3e8ae8b7-kube-api-access-bplg5\") pod \"whisker-f987b6b86-zq6pf\" (UID: \"bdb5035e-0d08-4b3f-8e08-97dc3e8ae8b7\") " pod="calico-system/whisker-f987b6b86-zq6pf"
Jul 15 05:19:17.918577 kubelet[3163]: I0715 05:19:17.918527 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d0441816-eb8b-4adf-9c57-452d4b3a9e9f-config-volume\") pod \"coredns-7c65d6cfc9-h92x4\" (UID: \"d0441816-eb8b-4adf-9c57-452d4b3a9e9f\") " pod="kube-system/coredns-7c65d6cfc9-h92x4"
Jul 15 05:19:17.918577 kubelet[3163]: I0715 05:19:17.918547 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/07ac1f5f-59ee-45e7-8079-7306d9542bfe-calico-apiserver-certs\") pod \"calico-apiserver-796c4d775c-b6jrw\" (UID: \"07ac1f5f-59ee-45e7-8079-7306d9542bfe\") " pod="calico-apiserver/calico-apiserver-796c4d775c-b6jrw"
Jul 15 05:19:17.918680 kubelet[3163]: I0715 05:19:17.918564 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a75b180-3ce9-4020-99a4-ab3fc3f07d66-goldmane-ca-bundle\") pod \"goldmane-58fd7646b9-sxqg6\" (UID: \"9a75b180-3ce9-4020-99a4-ab3fc3f07d66\") " pod="calico-system/goldmane-58fd7646b9-sxqg6"
Jul 15 05:19:17.918680 kubelet[3163]: I0715 05:19:17.918581 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/9a75b180-3ce9-4020-99a4-ab3fc3f07d66-goldmane-key-pair\") pod \"goldmane-58fd7646b9-sxqg6\" (UID: \"9a75b180-3ce9-4020-99a4-ab3fc3f07d66\") " pod="calico-system/goldmane-58fd7646b9-sxqg6"
Jul 15 05:19:17.918680 kubelet[3163]: I0715 05:19:17.918596 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk6vz\" (UniqueName: \"kubernetes.io/projected/9a75b180-3ce9-4020-99a4-ab3fc3f07d66-kube-api-access-hk6vz\") pod \"goldmane-58fd7646b9-sxqg6\" (UID: \"9a75b180-3ce9-4020-99a4-ab3fc3f07d66\") " pod="calico-system/goldmane-58fd7646b9-sxqg6"
Jul 15 05:19:17.918680 kubelet[3163]: I0715 05:19:17.918617 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6clr\" (UniqueName: \"kubernetes.io/projected/89a2e3cd-7a2d-4602-b983-f5c595a6dead-kube-api-access-d6clr\") pod \"calico-apiserver-796c4d775c-wrw9j\" (UID: \"89a2e3cd-7a2d-4602-b983-f5c595a6dead\") " pod="calico-apiserver/calico-apiserver-796c4d775c-wrw9j"
Jul 15 05:19:17.918680 kubelet[3163]: I0715 05:19:17.918632 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdb5035e-0d08-4b3f-8e08-97dc3e8ae8b7-whisker-ca-bundle\") pod \"whisker-f987b6b86-zq6pf\" (UID: \"bdb5035e-0d08-4b3f-8e08-97dc3e8ae8b7\") " pod="calico-system/whisker-f987b6b86-zq6pf"
Jul 15 05:19:17.918780 kubelet[3163]: I0715 05:19:17.918648 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw2cr\" (UniqueName: \"kubernetes.io/projected/d0441816-eb8b-4adf-9c57-452d4b3a9e9f-kube-api-access-pw2cr\") pod \"coredns-7c65d6cfc9-h92x4\" (UID: \"d0441816-eb8b-4adf-9c57-452d4b3a9e9f\") " pod="kube-system/coredns-7c65d6cfc9-h92x4"
Jul 15 05:19:17.923014 systemd[1]: Created slice kubepods-besteffort-pod07ac1f5f_59ee_45e7_8079_7306d9542bfe.slice - libcontainer container kubepods-besteffort-pod07ac1f5f_59ee_45e7_8079_7306d9542bfe.slice.
Jul 15 05:19:17.926515 systemd[1]: Created slice kubepods-besteffort-pod9a75b180_3ce9_4020_99a4_ab3fc3f07d66.slice - libcontainer container kubepods-besteffort-pod9a75b180_3ce9_4020_99a4_ab3fc3f07d66.slice.
Jul 15 05:19:17.939867 systemd[1]: Created slice kubepods-besteffort-podbdb5035e_0d08_4b3f_8e08_97dc3e8ae8b7.slice - libcontainer container kubepods-besteffort-podbdb5035e_0d08_4b3f_8e08_97dc3e8ae8b7.slice.
Jul 15 05:19:18.191424 containerd[1743]: time="2025-07-15T05:19:18.191382419Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-h92x4,Uid:d0441816-eb8b-4adf-9c57-452d4b3a9e9f,Namespace:kube-system,Attempt:0,}" Jul 15 05:19:18.246205 containerd[1743]: time="2025-07-15T05:19:18.246004063Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-f987b6b86-zq6pf,Uid:bdb5035e-0d08-4b3f-8e08-97dc3e8ae8b7,Namespace:calico-system,Attempt:0,}" Jul 15 05:19:18.246205 containerd[1743]: time="2025-07-15T05:19:18.246048980Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-fbd55b5d6-zlcp7,Uid:9c7ce63e-afea-4962-bead-e12dc034d4c7,Namespace:calico-system,Attempt:0,}" Jul 15 05:19:18.246205 containerd[1743]: time="2025-07-15T05:19:18.246003966Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-c5cfl,Uid:7cbdf8eb-12a8-45a5-adbf-c095ae49accc,Namespace:kube-system,Attempt:0,}" Jul 15 05:19:18.789891 containerd[1743]: time="2025-07-15T05:19:18.789789907Z" level=error msg="Failed to destroy network for sandbox \"19ec0834f525da462859eed68007c0d37d0ab47ee2e806c3381de470a26c566b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:19:18.792991 containerd[1743]: time="2025-07-15T05:19:18.792876236Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-h92x4,Uid:d0441816-eb8b-4adf-9c57-452d4b3a9e9f,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"19ec0834f525da462859eed68007c0d37d0ab47ee2e806c3381de470a26c566b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:19:18.793320 kubelet[3163]: E0715 
05:19:18.793256 3163 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"19ec0834f525da462859eed68007c0d37d0ab47ee2e806c3381de470a26c566b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:19:18.793874 kubelet[3163]: E0715 05:19:18.793343 3163 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"19ec0834f525da462859eed68007c0d37d0ab47ee2e806c3381de470a26c566b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-h92x4" Jul 15 05:19:18.793874 kubelet[3163]: E0715 05:19:18.793362 3163 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"19ec0834f525da462859eed68007c0d37d0ab47ee2e806c3381de470a26c566b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-h92x4" Jul 15 05:19:18.793874 kubelet[3163]: E0715 05:19:18.793417 3163 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-h92x4_kube-system(d0441816-eb8b-4adf-9c57-452d4b3a9e9f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-h92x4_kube-system(d0441816-eb8b-4adf-9c57-452d4b3a9e9f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"19ec0834f525da462859eed68007c0d37d0ab47ee2e806c3381de470a26c566b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-h92x4" podUID="d0441816-eb8b-4adf-9c57-452d4b3a9e9f" Jul 15 05:19:18.794027 containerd[1743]: time="2025-07-15T05:19:18.793501050Z" level=error msg="Failed to destroy network for sandbox \"d4b8f37b98fb8f061161529193f70f6c8032a816f1a4675ea2800237c238a221\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:19:18.796600 containerd[1743]: time="2025-07-15T05:19:18.796562728Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-f987b6b86-zq6pf,Uid:bdb5035e-0d08-4b3f-8e08-97dc3e8ae8b7,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d4b8f37b98fb8f061161529193f70f6c8032a816f1a4675ea2800237c238a221\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:19:18.797081 kubelet[3163]: E0715 05:19:18.797049 3163 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d4b8f37b98fb8f061161529193f70f6c8032a816f1a4675ea2800237c238a221\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:19:18.797168 kubelet[3163]: E0715 05:19:18.797107 3163 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d4b8f37b98fb8f061161529193f70f6c8032a816f1a4675ea2800237c238a221\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/" pod="calico-system/whisker-f987b6b86-zq6pf" Jul 15 05:19:18.797168 kubelet[3163]: E0715 05:19:18.797124 3163 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d4b8f37b98fb8f061161529193f70f6c8032a816f1a4675ea2800237c238a221\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-f987b6b86-zq6pf" Jul 15 05:19:18.798286 kubelet[3163]: E0715 05:19:18.797163 3163 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-f987b6b86-zq6pf_calico-system(bdb5035e-0d08-4b3f-8e08-97dc3e8ae8b7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-f987b6b86-zq6pf_calico-system(bdb5035e-0d08-4b3f-8e08-97dc3e8ae8b7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d4b8f37b98fb8f061161529193f70f6c8032a816f1a4675ea2800237c238a221\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-f987b6b86-zq6pf" podUID="bdb5035e-0d08-4b3f-8e08-97dc3e8ae8b7" Jul 15 05:19:18.807097 containerd[1743]: time="2025-07-15T05:19:18.807067837Z" level=error msg="Failed to destroy network for sandbox \"05d9afa220d7ac6d532a532acd08903c8fc38360a3a2730489da15ba22fef7ce\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:19:18.809675 containerd[1743]: time="2025-07-15T05:19:18.809636658Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-fbd55b5d6-zlcp7,Uid:9c7ce63e-afea-4962-bead-e12dc034d4c7,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"05d9afa220d7ac6d532a532acd08903c8fc38360a3a2730489da15ba22fef7ce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:19:18.809933 kubelet[3163]: E0715 05:19:18.809912 3163 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"05d9afa220d7ac6d532a532acd08903c8fc38360a3a2730489da15ba22fef7ce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:19:18.810180 kubelet[3163]: E0715 05:19:18.810161 3163 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"05d9afa220d7ac6d532a532acd08903c8fc38360a3a2730489da15ba22fef7ce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-fbd55b5d6-zlcp7" Jul 15 05:19:18.810258 kubelet[3163]: E0715 05:19:18.810246 3163 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"05d9afa220d7ac6d532a532acd08903c8fc38360a3a2730489da15ba22fef7ce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-fbd55b5d6-zlcp7" Jul 15 05:19:18.810340 kubelet[3163]: E0715 05:19:18.810322 
3163 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-fbd55b5d6-zlcp7_calico-system(9c7ce63e-afea-4962-bead-e12dc034d4c7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-fbd55b5d6-zlcp7_calico-system(9c7ce63e-afea-4962-bead-e12dc034d4c7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"05d9afa220d7ac6d532a532acd08903c8fc38360a3a2730489da15ba22fef7ce\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-fbd55b5d6-zlcp7" podUID="9c7ce63e-afea-4962-bead-e12dc034d4c7" Jul 15 05:19:18.815113 containerd[1743]: time="2025-07-15T05:19:18.815085748Z" level=error msg="Failed to destroy network for sandbox \"ed2d33a39fc2b8353a302afa6ad55e422d9046b09438fba2916e5fe3348e1e19\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:19:18.818296 containerd[1743]: time="2025-07-15T05:19:18.818227223Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-c5cfl,Uid:7cbdf8eb-12a8-45a5-adbf-c095ae49accc,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ed2d33a39fc2b8353a302afa6ad55e422d9046b09438fba2916e5fe3348e1e19\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:19:18.819644 kubelet[3163]: E0715 05:19:18.819304 3163 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"ed2d33a39fc2b8353a302afa6ad55e422d9046b09438fba2916e5fe3348e1e19\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:19:18.819749 kubelet[3163]: E0715 05:19:18.819717 3163 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ed2d33a39fc2b8353a302afa6ad55e422d9046b09438fba2916e5fe3348e1e19\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-c5cfl" Jul 15 05:19:18.819749 kubelet[3163]: E0715 05:19:18.819737 3163 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ed2d33a39fc2b8353a302afa6ad55e422d9046b09438fba2916e5fe3348e1e19\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-c5cfl" Jul 15 05:19:18.820124 kubelet[3163]: E0715 05:19:18.819815 3163 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-c5cfl_kube-system(7cbdf8eb-12a8-45a5-adbf-c095ae49accc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-c5cfl_kube-system(7cbdf8eb-12a8-45a5-adbf-c095ae49accc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ed2d33a39fc2b8353a302afa6ad55e422d9046b09438fba2916e5fe3348e1e19\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-c5cfl" 
podUID="7cbdf8eb-12a8-45a5-adbf-c095ae49accc" Jul 15 05:19:18.824175 containerd[1743]: time="2025-07-15T05:19:18.824153624Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\"" Jul 15 05:19:19.020259 kubelet[3163]: E0715 05:19:19.020212 3163 configmap.go:193] Couldn't get configMap calico-system/goldmane-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Jul 15 05:19:19.020375 kubelet[3163]: E0715 05:19:19.020298 3163 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9a75b180-3ce9-4020-99a4-ab3fc3f07d66-goldmane-ca-bundle podName:9a75b180-3ce9-4020-99a4-ab3fc3f07d66 nodeName:}" failed. No retries permitted until 2025-07-15 05:19:19.52028106 +0000 UTC m=+34.891275951 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "goldmane-ca-bundle" (UniqueName: "kubernetes.io/configmap/9a75b180-3ce9-4020-99a4-ab3fc3f07d66-goldmane-ca-bundle") pod "goldmane-58fd7646b9-sxqg6" (UID: "9a75b180-3ce9-4020-99a4-ab3fc3f07d66") : failed to sync configmap cache: timed out waiting for the condition Jul 15 05:19:19.020462 kubelet[3163]: E0715 05:19:19.020212 3163 secret.go:189] Couldn't get secret calico-system/goldmane-key-pair: failed to sync secret cache: timed out waiting for the condition Jul 15 05:19:19.020486 kubelet[3163]: E0715 05:19:19.020471 3163 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a75b180-3ce9-4020-99a4-ab3fc3f07d66-goldmane-key-pair podName:9a75b180-3ce9-4020-99a4-ab3fc3f07d66 nodeName:}" failed. No retries permitted until 2025-07-15 05:19:19.520456331 +0000 UTC m=+34.891451226 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "goldmane-key-pair" (UniqueName: "kubernetes.io/secret/9a75b180-3ce9-4020-99a4-ab3fc3f07d66-goldmane-key-pair") pod "goldmane-58fd7646b9-sxqg6" (UID: "9a75b180-3ce9-4020-99a4-ab3fc3f07d66") : failed to sync secret cache: timed out waiting for the condition Jul 15 05:19:19.118364 containerd[1743]: time="2025-07-15T05:19:19.118243080Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-796c4d775c-wrw9j,Uid:89a2e3cd-7a2d-4602-b983-f5c595a6dead,Namespace:calico-apiserver,Attempt:0,}" Jul 15 05:19:19.125768 containerd[1743]: time="2025-07-15T05:19:19.125739751Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-796c4d775c-b6jrw,Uid:07ac1f5f-59ee-45e7-8079-7306d9542bfe,Namespace:calico-apiserver,Attempt:0,}" Jul 15 05:19:19.185171 containerd[1743]: time="2025-07-15T05:19:19.183655834Z" level=error msg="Failed to destroy network for sandbox \"566d75acc1ee083092c3b8e8b3a071a8d1faeede8103e6b795f5c8481380c181\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:19:19.186090 systemd[1]: run-netns-cni\x2d1042c100\x2db42c\x2de170\x2d84d9\x2d2e09f9ed8724.mount: Deactivated successfully. 
Jul 15 05:19:19.191356 containerd[1743]: time="2025-07-15T05:19:19.191327784Z" level=error msg="Failed to destroy network for sandbox \"c49468d089de9c42a631cd8e16078a8836af1d8f5fc8fbd2c644304c82ed5638\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:19:19.194591 containerd[1743]: time="2025-07-15T05:19:19.194563884Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-796c4d775c-wrw9j,Uid:89a2e3cd-7a2d-4602-b983-f5c595a6dead,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"566d75acc1ee083092c3b8e8b3a071a8d1faeede8103e6b795f5c8481380c181\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:19:19.194819 kubelet[3163]: E0715 05:19:19.194791 3163 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"566d75acc1ee083092c3b8e8b3a071a8d1faeede8103e6b795f5c8481380c181\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:19:19.194880 kubelet[3163]: E0715 05:19:19.194839 3163 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"566d75acc1ee083092c3b8e8b3a071a8d1faeede8103e6b795f5c8481380c181\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-796c4d775c-wrw9j" Jul 15 05:19:19.194880 kubelet[3163]: E0715 05:19:19.194859 3163 
kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"566d75acc1ee083092c3b8e8b3a071a8d1faeede8103e6b795f5c8481380c181\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-796c4d775c-wrw9j" Jul 15 05:19:19.194931 kubelet[3163]: E0715 05:19:19.194898 3163 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-796c4d775c-wrw9j_calico-apiserver(89a2e3cd-7a2d-4602-b983-f5c595a6dead)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-796c4d775c-wrw9j_calico-apiserver(89a2e3cd-7a2d-4602-b983-f5c595a6dead)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"566d75acc1ee083092c3b8e8b3a071a8d1faeede8103e6b795f5c8481380c181\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-796c4d775c-wrw9j" podUID="89a2e3cd-7a2d-4602-b983-f5c595a6dead" Jul 15 05:19:19.197273 containerd[1743]: time="2025-07-15T05:19:19.197238439Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-796c4d775c-b6jrw,Uid:07ac1f5f-59ee-45e7-8079-7306d9542bfe,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c49468d089de9c42a631cd8e16078a8836af1d8f5fc8fbd2c644304c82ed5638\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:19:19.197434 kubelet[3163]: E0715 05:19:19.197399 3163 log.go:32] "RunPodSandbox from runtime service failed" err="rpc 
error: code = Unknown desc = failed to setup network for sandbox \"c49468d089de9c42a631cd8e16078a8836af1d8f5fc8fbd2c644304c82ed5638\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:19:19.197497 kubelet[3163]: E0715 05:19:19.197452 3163 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c49468d089de9c42a631cd8e16078a8836af1d8f5fc8fbd2c644304c82ed5638\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-796c4d775c-b6jrw" Jul 15 05:19:19.197497 kubelet[3163]: E0715 05:19:19.197473 3163 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c49468d089de9c42a631cd8e16078a8836af1d8f5fc8fbd2c644304c82ed5638\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-796c4d775c-b6jrw" Jul 15 05:19:19.197552 kubelet[3163]: E0715 05:19:19.197522 3163 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-796c4d775c-b6jrw_calico-apiserver(07ac1f5f-59ee-45e7-8079-7306d9542bfe)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-796c4d775c-b6jrw_calico-apiserver(07ac1f5f-59ee-45e7-8079-7306d9542bfe)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c49468d089de9c42a631cd8e16078a8836af1d8f5fc8fbd2c644304c82ed5638\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-796c4d775c-b6jrw" podUID="07ac1f5f-59ee-45e7-8079-7306d9542bfe" Jul 15 05:19:19.710603 systemd[1]: Created slice kubepods-besteffort-pod21f441b2_c7b8_4d3b_84a5_4e3e2ffc87ba.slice - libcontainer container kubepods-besteffort-pod21f441b2_c7b8_4d3b_84a5_4e3e2ffc87ba.slice. Jul 15 05:19:19.712338 containerd[1743]: time="2025-07-15T05:19:19.712298121Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dg9p5,Uid:21f441b2-c7b8-4d3b-84a5-4e3e2ffc87ba,Namespace:calico-system,Attempt:0,}" Jul 15 05:19:19.739171 containerd[1743]: time="2025-07-15T05:19:19.739060790Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-sxqg6,Uid:9a75b180-3ce9-4020-99a4-ab3fc3f07d66,Namespace:calico-system,Attempt:0,}" Jul 15 05:19:19.760734 containerd[1743]: time="2025-07-15T05:19:19.760664787Z" level=error msg="Failed to destroy network for sandbox \"07cd033147a20358c110536d6b49ae46ea2782decd2708192ca24eb198e101d5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:19:19.763366 containerd[1743]: time="2025-07-15T05:19:19.763321156Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dg9p5,Uid:21f441b2-c7b8-4d3b-84a5-4e3e2ffc87ba,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"07cd033147a20358c110536d6b49ae46ea2782decd2708192ca24eb198e101d5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:19:19.763686 kubelet[3163]: E0715 05:19:19.763652 3163 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"07cd033147a20358c110536d6b49ae46ea2782decd2708192ca24eb198e101d5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:19:19.763765 kubelet[3163]: E0715 05:19:19.763704 3163 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"07cd033147a20358c110536d6b49ae46ea2782decd2708192ca24eb198e101d5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dg9p5" Jul 15 05:19:19.763765 kubelet[3163]: E0715 05:19:19.763734 3163 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"07cd033147a20358c110536d6b49ae46ea2782decd2708192ca24eb198e101d5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dg9p5" Jul 15 05:19:19.763814 kubelet[3163]: E0715 05:19:19.763780 3163 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-dg9p5_calico-system(21f441b2-c7b8-4d3b-84a5-4e3e2ffc87ba)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-dg9p5_calico-system(21f441b2-c7b8-4d3b-84a5-4e3e2ffc87ba)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"07cd033147a20358c110536d6b49ae46ea2782decd2708192ca24eb198e101d5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dg9p5" 
podUID="21f441b2-c7b8-4d3b-84a5-4e3e2ffc87ba" Jul 15 05:19:19.792360 containerd[1743]: time="2025-07-15T05:19:19.792320999Z" level=error msg="Failed to destroy network for sandbox \"9d4722bbbba80d3f616553eea50cd7930fde3d3faa8dc9ea3a019ffbd4674322\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:19:19.795708 containerd[1743]: time="2025-07-15T05:19:19.795683624Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-sxqg6,Uid:9a75b180-3ce9-4020-99a4-ab3fc3f07d66,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9d4722bbbba80d3f616553eea50cd7930fde3d3faa8dc9ea3a019ffbd4674322\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:19:19.795842 kubelet[3163]: E0715 05:19:19.795818 3163 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9d4722bbbba80d3f616553eea50cd7930fde3d3faa8dc9ea3a019ffbd4674322\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:19:19.796157 kubelet[3163]: E0715 05:19:19.795862 3163 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9d4722bbbba80d3f616553eea50cd7930fde3d3faa8dc9ea3a019ffbd4674322\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-58fd7646b9-sxqg6" Jul 15 05:19:19.796157 kubelet[3163]: E0715 
05:19:19.795879 3163 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9d4722bbbba80d3f616553eea50cd7930fde3d3faa8dc9ea3a019ffbd4674322\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-58fd7646b9-sxqg6" Jul 15 05:19:19.796157 kubelet[3163]: E0715 05:19:19.795917 3163 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-58fd7646b9-sxqg6_calico-system(9a75b180-3ce9-4020-99a4-ab3fc3f07d66)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-58fd7646b9-sxqg6_calico-system(9a75b180-3ce9-4020-99a4-ab3fc3f07d66)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9d4722bbbba80d3f616553eea50cd7930fde3d3faa8dc9ea3a019ffbd4674322\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-58fd7646b9-sxqg6" podUID="9a75b180-3ce9-4020-99a4-ab3fc3f07d66" Jul 15 05:19:19.818867 systemd[1]: run-netns-cni\x2dfc6e7fb1\x2da2a6\x2dfdf8\x2d40ef\x2d507a71b1f4f7.mount: Deactivated successfully. Jul 15 05:19:25.422072 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3761153122.mount: Deactivated successfully. 
Jul 15 05:19:25.464917 containerd[1743]: time="2025-07-15T05:19:25.464873232Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:19:25.467628 containerd[1743]: time="2025-07-15T05:19:25.467449525Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.2: active requests=0, bytes read=158500163" Jul 15 05:19:25.471993 containerd[1743]: time="2025-07-15T05:19:25.471966371Z" level=info msg="ImageCreate event name:\"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:19:25.477774 containerd[1743]: time="2025-07-15T05:19:25.477745226Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:19:25.478296 containerd[1743]: time="2025-07-15T05:19:25.478178032Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.2\" with image id \"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\", size \"158500025\" in 6.653998019s" Jul 15 05:19:25.478296 containerd[1743]: time="2025-07-15T05:19:25.478209159Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\" returns image reference \"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\"" Jul 15 05:19:25.489384 containerd[1743]: time="2025-07-15T05:19:25.489259545Z" level=info msg="CreateContainer within sandbox \"7940b31810c2c207cea4ec299ae479317d5ed7f35a45965872ebb4a35d3da4a7\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jul 15 05:19:25.510640 containerd[1743]: time="2025-07-15T05:19:25.507820940Z" level=info msg="Container 
aca2ad3acbdae752ca5f94faed6c4d40176863e037e91f6c37624d3519cab7db: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:19:25.528848 containerd[1743]: time="2025-07-15T05:19:25.528819923Z" level=info msg="CreateContainer within sandbox \"7940b31810c2c207cea4ec299ae479317d5ed7f35a45965872ebb4a35d3da4a7\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"aca2ad3acbdae752ca5f94faed6c4d40176863e037e91f6c37624d3519cab7db\"" Jul 15 05:19:25.529225 containerd[1743]: time="2025-07-15T05:19:25.529206174Z" level=info msg="StartContainer for \"aca2ad3acbdae752ca5f94faed6c4d40176863e037e91f6c37624d3519cab7db\"" Jul 15 05:19:25.530273 containerd[1743]: time="2025-07-15T05:19:25.530178752Z" level=info msg="connecting to shim aca2ad3acbdae752ca5f94faed6c4d40176863e037e91f6c37624d3519cab7db" address="unix:///run/containerd/s/ed17e44c79ee9b68cc57a1a230caf47d288758c8d3bd1ce9d5ab079116fff63f" protocol=ttrpc version=3 Jul 15 05:19:25.549098 systemd[1]: Started cri-containerd-aca2ad3acbdae752ca5f94faed6c4d40176863e037e91f6c37624d3519cab7db.scope - libcontainer container aca2ad3acbdae752ca5f94faed6c4d40176863e037e91f6c37624d3519cab7db. Jul 15 05:19:25.581305 containerd[1743]: time="2025-07-15T05:19:25.581270677Z" level=info msg="StartContainer for \"aca2ad3acbdae752ca5f94faed6c4d40176863e037e91f6c37624d3519cab7db\" returns successfully" Jul 15 05:19:25.798176 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jul 15 05:19:25.798299 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
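The `Pulled image ... in 6.653998019s` duration above is simply the wall-clock difference between when containerd started pulling and when the pull completed, using the RFC 3339 timestamps it logs. A small sketch of that computation (the helper name is hypothetical; the timestamp format matches the containerd entries, with nanoseconds truncated to Python's microsecond precision):

```python
from datetime import datetime

def pull_duration(start: str, end: str) -> float:
    """Seconds between two containerd-style RFC 3339 timestamps,
    e.g. '2025-07-15T05:19:25.478178032Z'."""
    def parse(ts: str) -> datetime:
        ts = ts.rstrip("Z")
        head, _, frac = ts.partition(".")
        micro = (frac + "000000")[:6]  # nanoseconds -> microseconds
        return datetime.strptime(head + "." + micro, "%Y-%m-%dT%H:%M:%S.%f")
    return (parse(end) - parse(start)).total_seconds()
```

The kubelet's `podStartE2EDuration` in the next record is derived the same way, from `podCreationTimestamp` to the observed running time.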
Jul 15 05:19:25.920327 containerd[1743]: time="2025-07-15T05:19:25.920283316Z" level=info msg="TaskExit event in podsandbox handler container_id:\"aca2ad3acbdae752ca5f94faed6c4d40176863e037e91f6c37624d3519cab7db\" id:\"9251396635d53a55e2391d1f2034d0232ce936d6f3d1fcc75efd5225433697d2\" pid:4236 exit_status:1 exited_at:{seconds:1752556765 nanos:919890813}" Jul 15 05:19:25.943589 kubelet[3163]: I0715 05:19:25.943533 3163 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-6cpwz" podStartSLOduration=1.059160126 podStartE2EDuration="19.943514586s" podCreationTimestamp="2025-07-15 05:19:06 +0000 UTC" firstStartedPulling="2025-07-15 05:19:06.594525368 +0000 UTC m=+21.965520254" lastFinishedPulling="2025-07-15 05:19:25.478879815 +0000 UTC m=+40.849874714" observedRunningTime="2025-07-15 05:19:25.863479918 +0000 UTC m=+41.234474815" watchObservedRunningTime="2025-07-15 05:19:25.943514586 +0000 UTC m=+41.314509554" Jul 15 05:19:25.967158 kubelet[3163]: I0715 05:19:25.967131 3163 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdb5035e-0d08-4b3f-8e08-97dc3e8ae8b7-whisker-ca-bundle\") pod \"bdb5035e-0d08-4b3f-8e08-97dc3e8ae8b7\" (UID: \"bdb5035e-0d08-4b3f-8e08-97dc3e8ae8b7\") " Jul 15 05:19:25.967263 kubelet[3163]: I0715 05:19:25.967173 3163 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/bdb5035e-0d08-4b3f-8e08-97dc3e8ae8b7-whisker-backend-key-pair\") pod \"bdb5035e-0d08-4b3f-8e08-97dc3e8ae8b7\" (UID: \"bdb5035e-0d08-4b3f-8e08-97dc3e8ae8b7\") " Jul 15 05:19:25.967263 kubelet[3163]: I0715 05:19:25.967197 3163 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bplg5\" (UniqueName: \"kubernetes.io/projected/bdb5035e-0d08-4b3f-8e08-97dc3e8ae8b7-kube-api-access-bplg5\") pod 
\"bdb5035e-0d08-4b3f-8e08-97dc3e8ae8b7\" (UID: \"bdb5035e-0d08-4b3f-8e08-97dc3e8ae8b7\") " Jul 15 05:19:25.967977 kubelet[3163]: I0715 05:19:25.967928 3163 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdb5035e-0d08-4b3f-8e08-97dc3e8ae8b7-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "bdb5035e-0d08-4b3f-8e08-97dc3e8ae8b7" (UID: "bdb5035e-0d08-4b3f-8e08-97dc3e8ae8b7"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jul 15 05:19:25.972482 kubelet[3163]: I0715 05:19:25.972401 3163 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdb5035e-0d08-4b3f-8e08-97dc3e8ae8b7-kube-api-access-bplg5" (OuterVolumeSpecName: "kube-api-access-bplg5") pod "bdb5035e-0d08-4b3f-8e08-97dc3e8ae8b7" (UID: "bdb5035e-0d08-4b3f-8e08-97dc3e8ae8b7"). InnerVolumeSpecName "kube-api-access-bplg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jul 15 05:19:25.973031 kubelet[3163]: I0715 05:19:25.973006 3163 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdb5035e-0d08-4b3f-8e08-97dc3e8ae8b7-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "bdb5035e-0d08-4b3f-8e08-97dc3e8ae8b7" (UID: "bdb5035e-0d08-4b3f-8e08-97dc3e8ae8b7"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jul 15 05:19:26.067422 kubelet[3163]: I0715 05:19:26.067392 3163 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bplg5\" (UniqueName: \"kubernetes.io/projected/bdb5035e-0d08-4b3f-8e08-97dc3e8ae8b7-kube-api-access-bplg5\") on node \"ci-4396.0.0-n-1e5a06c7e3\" DevicePath \"\"" Jul 15 05:19:26.067422 kubelet[3163]: I0715 05:19:26.067422 3163 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdb5035e-0d08-4b3f-8e08-97dc3e8ae8b7-whisker-ca-bundle\") on node \"ci-4396.0.0-n-1e5a06c7e3\" DevicePath \"\"" Jul 15 05:19:26.067566 kubelet[3163]: I0715 05:19:26.067435 3163 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/bdb5035e-0d08-4b3f-8e08-97dc3e8ae8b7-whisker-backend-key-pair\") on node \"ci-4396.0.0-n-1e5a06c7e3\" DevicePath \"\"" Jul 15 05:19:26.422085 systemd[1]: var-lib-kubelet-pods-bdb5035e\x2d0d08\x2d4b3f\x2d8e08\x2d97dc3e8ae8b7-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dbplg5.mount: Deactivated successfully. Jul 15 05:19:26.422192 systemd[1]: var-lib-kubelet-pods-bdb5035e\x2d0d08\x2d4b3f\x2d8e08\x2d97dc3e8ae8b7-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Jul 15 05:19:26.711993 systemd[1]: Removed slice kubepods-besteffort-podbdb5035e_0d08_4b3f_8e08_97dc3e8ae8b7.slice - libcontainer container kubepods-besteffort-podbdb5035e_0d08_4b3f_8e08_97dc3e8ae8b7.slice. Jul 15 05:19:26.924373 systemd[1]: Created slice kubepods-besteffort-pod29e6854a_0ea3_4a3a_89f9_b08119e1120d.slice - libcontainer container kubepods-besteffort-pod29e6854a_0ea3_4a3a_89f9_b08119e1120d.slice. 
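The odd-looking mount unit names above (`...kubernetes.io\x7eprojected-kube\x2dapi\x2daccess...`) come from systemd's path escaping: `/` separators become `-`, and any character outside `[a-zA-Z0-9:_.]` (including the literal `-` in a pod UID, and the `~` in `kubernetes.io~projected`) is rewritten as `\xXX`. A sketch of that rule, assuming the same behavior as `systemd-escape --path` (a leading `.` is also escaped):

```python
ALLOWED = set(
    "abcdefghijklmnopqrstuvwxyz"
    "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
    "0123456789:_."
)

def systemd_escape_path(path: str) -> str:
    """Turn a filesystem path into a systemd unit-name stem, as seen in
    the .mount units logged above: '/' -> '-', unsafe chars -> \\xXX."""
    trimmed = path.strip("/")
    out = []
    for i, ch in enumerate(trimmed):
        if ch == "/":
            out.append("-")
        elif ch in ALLOWED and not (i == 0 and ch == "."):
            out.append(ch)
        else:
            out.append("\\x%02x" % ord(ch))
    return "".join(out)
```

Applied to `/run/netns/cni-fc6e7fb1-...` this yields the `run-netns-cni\x2dfc6e7fb1\x2d...` stem that systemd appends `.mount` to.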
Jul 15 05:19:26.936895 containerd[1743]: time="2025-07-15T05:19:26.936857255Z" level=info msg="TaskExit event in podsandbox handler container_id:\"aca2ad3acbdae752ca5f94faed6c4d40176863e037e91f6c37624d3519cab7db\" id:\"f11cc45c47c539c72af17e1d14a7c3b77e27d83bb3a3b0af46b915412779a420\" pid:4282 exit_status:1 exited_at:{seconds:1752556766 nanos:936634447}" Jul 15 05:19:26.973665 kubelet[3163]: I0715 05:19:26.973531 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/29e6854a-0ea3-4a3a-89f9-b08119e1120d-whisker-backend-key-pair\") pod \"whisker-f4ddbd7b6-t7wlz\" (UID: \"29e6854a-0ea3-4a3a-89f9-b08119e1120d\") " pod="calico-system/whisker-f4ddbd7b6-t7wlz" Jul 15 05:19:26.973665 kubelet[3163]: I0715 05:19:26.973584 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwmsk\" (UniqueName: \"kubernetes.io/projected/29e6854a-0ea3-4a3a-89f9-b08119e1120d-kube-api-access-rwmsk\") pod \"whisker-f4ddbd7b6-t7wlz\" (UID: \"29e6854a-0ea3-4a3a-89f9-b08119e1120d\") " pod="calico-system/whisker-f4ddbd7b6-t7wlz" Jul 15 05:19:26.973665 kubelet[3163]: I0715 05:19:26.973607 3163 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29e6854a-0ea3-4a3a-89f9-b08119e1120d-whisker-ca-bundle\") pod \"whisker-f4ddbd7b6-t7wlz\" (UID: \"29e6854a-0ea3-4a3a-89f9-b08119e1120d\") " pod="calico-system/whisker-f4ddbd7b6-t7wlz" Jul 15 05:19:27.228742 containerd[1743]: time="2025-07-15T05:19:27.228330881Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-f4ddbd7b6-t7wlz,Uid:29e6854a-0ea3-4a3a-89f9-b08119e1120d,Namespace:calico-system,Attempt:0,}" Jul 15 05:19:27.411233 systemd-networkd[1366]: cali444ea01ab2d: Link UP Jul 15 05:19:27.413525 systemd-networkd[1366]: cali444ea01ab2d: Gained carrier Jul 15 
05:19:27.431531 containerd[1743]: 2025-07-15 05:19:27.266 [INFO][4380] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 15 05:19:27.431531 containerd[1743]: 2025-07-15 05:19:27.277 [INFO][4380] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4396.0.0--n--1e5a06c7e3-k8s-whisker--f4ddbd7b6--t7wlz-eth0 whisker-f4ddbd7b6- calico-system 29e6854a-0ea3-4a3a-89f9-b08119e1120d 914 0 2025-07-15 05:19:26 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:f4ddbd7b6 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4396.0.0-n-1e5a06c7e3 whisker-f4ddbd7b6-t7wlz eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali444ea01ab2d [] [] }} ContainerID="d9e24a7556696e37bce4958b1ef3c75bdb556942f41eed86c52e6907c7bb3769" Namespace="calico-system" Pod="whisker-f4ddbd7b6-t7wlz" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-whisker--f4ddbd7b6--t7wlz-" Jul 15 05:19:27.431531 containerd[1743]: 2025-07-15 05:19:27.277 [INFO][4380] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d9e24a7556696e37bce4958b1ef3c75bdb556942f41eed86c52e6907c7bb3769" Namespace="calico-system" Pod="whisker-f4ddbd7b6-t7wlz" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-whisker--f4ddbd7b6--t7wlz-eth0" Jul 15 05:19:27.431531 containerd[1743]: 2025-07-15 05:19:27.318 [INFO][4391] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d9e24a7556696e37bce4958b1ef3c75bdb556942f41eed86c52e6907c7bb3769" HandleID="k8s-pod-network.d9e24a7556696e37bce4958b1ef3c75bdb556942f41eed86c52e6907c7bb3769" Workload="ci--4396.0.0--n--1e5a06c7e3-k8s-whisker--f4ddbd7b6--t7wlz-eth0" Jul 15 05:19:27.431823 containerd[1743]: 2025-07-15 05:19:27.319 [INFO][4391] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="d9e24a7556696e37bce4958b1ef3c75bdb556942f41eed86c52e6907c7bb3769" HandleID="k8s-pod-network.d9e24a7556696e37bce4958b1ef3c75bdb556942f41eed86c52e6907c7bb3769" Workload="ci--4396.0.0--n--1e5a06c7e3-k8s-whisker--f4ddbd7b6--t7wlz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024eff0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4396.0.0-n-1e5a06c7e3", "pod":"whisker-f4ddbd7b6-t7wlz", "timestamp":"2025-07-15 05:19:27.318594439 +0000 UTC"}, Hostname:"ci-4396.0.0-n-1e5a06c7e3", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 05:19:27.431823 containerd[1743]: 2025-07-15 05:19:27.319 [INFO][4391] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:19:27.431823 containerd[1743]: 2025-07-15 05:19:27.319 [INFO][4391] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
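The IPAM records interleaved here follow a fixed shape: acquire the host-wide lock, find the host's affine block (`192.168.123.64/26`), claim the next free address from it, then release the lock. A toy version of the claim step (real Calico holds the lock and persists the block back to the datastore, both elided here; names are illustrative):

```python
import ipaddress

def auto_assign(block_cidr: str, allocated: set, num: int = 1) -> list:
    """Claim the first `num` free host addresses from an affine block,
    mirroring the 'Attempting to assign 1 addresses from block' flow."""
    block = ipaddress.ip_network(block_cidr)
    claimed = []
    for ip in block.hosts():  # for a /26 this walks .65 .. .126
        if len(claimed) == num:
            break
        if str(ip) not in allocated:
            allocated.add(str(ip))
            claimed.append(str(ip))
    return claimed
```

Walking the block in order explains the addresses seen below: the whisker pod gets `192.168.123.65`, and the calico-apiserver pod, assigned next from the same block, gets `192.168.123.66`.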
Jul 15 05:19:27.431823 containerd[1743]: 2025-07-15 05:19:27.319 [INFO][4391] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4396.0.0-n-1e5a06c7e3' Jul 15 05:19:27.431823 containerd[1743]: 2025-07-15 05:19:27.332 [INFO][4391] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d9e24a7556696e37bce4958b1ef3c75bdb556942f41eed86c52e6907c7bb3769" host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:27.431823 containerd[1743]: 2025-07-15 05:19:27.335 [INFO][4391] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:27.431823 containerd[1743]: 2025-07-15 05:19:27.338 [INFO][4391] ipam/ipam.go 511: Trying affinity for 192.168.123.64/26 host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:27.431823 containerd[1743]: 2025-07-15 05:19:27.340 [INFO][4391] ipam/ipam.go 158: Attempting to load block cidr=192.168.123.64/26 host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:27.431823 containerd[1743]: 2025-07-15 05:19:27.342 [INFO][4391] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.123.64/26 host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:27.433046 containerd[1743]: 2025-07-15 05:19:27.342 [INFO][4391] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.123.64/26 handle="k8s-pod-network.d9e24a7556696e37bce4958b1ef3c75bdb556942f41eed86c52e6907c7bb3769" host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:27.433046 containerd[1743]: 2025-07-15 05:19:27.343 [INFO][4391] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d9e24a7556696e37bce4958b1ef3c75bdb556942f41eed86c52e6907c7bb3769 Jul 15 05:19:27.433046 containerd[1743]: 2025-07-15 05:19:27.354 [INFO][4391] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.123.64/26 handle="k8s-pod-network.d9e24a7556696e37bce4958b1ef3c75bdb556942f41eed86c52e6907c7bb3769" host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:27.433046 containerd[1743]: 2025-07-15 05:19:27.361 [INFO][4391] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.123.65/26] block=192.168.123.64/26 handle="k8s-pod-network.d9e24a7556696e37bce4958b1ef3c75bdb556942f41eed86c52e6907c7bb3769" host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:27.433046 containerd[1743]: 2025-07-15 05:19:27.361 [INFO][4391] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.123.65/26] handle="k8s-pod-network.d9e24a7556696e37bce4958b1ef3c75bdb556942f41eed86c52e6907c7bb3769" host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:27.433046 containerd[1743]: 2025-07-15 05:19:27.361 [INFO][4391] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 05:19:27.433046 containerd[1743]: 2025-07-15 05:19:27.361 [INFO][4391] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.123.65/26] IPv6=[] ContainerID="d9e24a7556696e37bce4958b1ef3c75bdb556942f41eed86c52e6907c7bb3769" HandleID="k8s-pod-network.d9e24a7556696e37bce4958b1ef3c75bdb556942f41eed86c52e6907c7bb3769" Workload="ci--4396.0.0--n--1e5a06c7e3-k8s-whisker--f4ddbd7b6--t7wlz-eth0" Jul 15 05:19:27.433186 containerd[1743]: 2025-07-15 05:19:27.365 [INFO][4380] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d9e24a7556696e37bce4958b1ef3c75bdb556942f41eed86c52e6907c7bb3769" Namespace="calico-system" Pod="whisker-f4ddbd7b6-t7wlz" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-whisker--f4ddbd7b6--t7wlz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396.0.0--n--1e5a06c7e3-k8s-whisker--f4ddbd7b6--t7wlz-eth0", GenerateName:"whisker-f4ddbd7b6-", Namespace:"calico-system", SelfLink:"", UID:"29e6854a-0ea3-4a3a-89f9-b08119e1120d", ResourceVersion:"914", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 19, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"f4ddbd7b6", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396.0.0-n-1e5a06c7e3", ContainerID:"", Pod:"whisker-f4ddbd7b6-t7wlz", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.123.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali444ea01ab2d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:19:27.433186 containerd[1743]: 2025-07-15 05:19:27.366 [INFO][4380] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.123.65/32] ContainerID="d9e24a7556696e37bce4958b1ef3c75bdb556942f41eed86c52e6907c7bb3769" Namespace="calico-system" Pod="whisker-f4ddbd7b6-t7wlz" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-whisker--f4ddbd7b6--t7wlz-eth0" Jul 15 05:19:27.433273 containerd[1743]: 2025-07-15 05:19:27.366 [INFO][4380] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali444ea01ab2d ContainerID="d9e24a7556696e37bce4958b1ef3c75bdb556942f41eed86c52e6907c7bb3769" Namespace="calico-system" Pod="whisker-f4ddbd7b6-t7wlz" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-whisker--f4ddbd7b6--t7wlz-eth0" Jul 15 05:19:27.433273 containerd[1743]: 2025-07-15 05:19:27.413 [INFO][4380] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d9e24a7556696e37bce4958b1ef3c75bdb556942f41eed86c52e6907c7bb3769" Namespace="calico-system" Pod="whisker-f4ddbd7b6-t7wlz" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-whisker--f4ddbd7b6--t7wlz-eth0" Jul 15 05:19:27.433316 containerd[1743]: 2025-07-15 05:19:27.415 [INFO][4380] cni-plugin/k8s.go 446: Added 
Mac, interface name, and active container ID to endpoint ContainerID="d9e24a7556696e37bce4958b1ef3c75bdb556942f41eed86c52e6907c7bb3769" Namespace="calico-system" Pod="whisker-f4ddbd7b6-t7wlz" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-whisker--f4ddbd7b6--t7wlz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396.0.0--n--1e5a06c7e3-k8s-whisker--f4ddbd7b6--t7wlz-eth0", GenerateName:"whisker-f4ddbd7b6-", Namespace:"calico-system", SelfLink:"", UID:"29e6854a-0ea3-4a3a-89f9-b08119e1120d", ResourceVersion:"914", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 19, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"f4ddbd7b6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396.0.0-n-1e5a06c7e3", ContainerID:"d9e24a7556696e37bce4958b1ef3c75bdb556942f41eed86c52e6907c7bb3769", Pod:"whisker-f4ddbd7b6-t7wlz", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.123.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali444ea01ab2d", MAC:"0e:a0:6e:07:b2:ae", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:19:27.433370 containerd[1743]: 2025-07-15 05:19:27.428 [INFO][4380] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d9e24a7556696e37bce4958b1ef3c75bdb556942f41eed86c52e6907c7bb3769" 
Namespace="calico-system" Pod="whisker-f4ddbd7b6-t7wlz" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-whisker--f4ddbd7b6--t7wlz-eth0" Jul 15 05:19:27.495261 containerd[1743]: time="2025-07-15T05:19:27.495092176Z" level=info msg="connecting to shim d9e24a7556696e37bce4958b1ef3c75bdb556942f41eed86c52e6907c7bb3769" address="unix:///run/containerd/s/0d9fbd1e5f17390644eb90052cc9a69d1f2a4c0998306ee0483dff08e8252ae6" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:19:27.524100 systemd[1]: Started cri-containerd-d9e24a7556696e37bce4958b1ef3c75bdb556942f41eed86c52e6907c7bb3769.scope - libcontainer container d9e24a7556696e37bce4958b1ef3c75bdb556942f41eed86c52e6907c7bb3769. Jul 15 05:19:27.566647 containerd[1743]: time="2025-07-15T05:19:27.566610058Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-f4ddbd7b6-t7wlz,Uid:29e6854a-0ea3-4a3a-89f9-b08119e1120d,Namespace:calico-system,Attempt:0,} returns sandbox id \"d9e24a7556696e37bce4958b1ef3c75bdb556942f41eed86c52e6907c7bb3769\"" Jul 15 05:19:27.569853 containerd[1743]: time="2025-07-15T05:19:27.569780279Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\"" Jul 15 05:19:27.807590 systemd-networkd[1366]: vxlan.calico: Link UP Jul 15 05:19:27.807597 systemd-networkd[1366]: vxlan.calico: Gained carrier Jul 15 05:19:28.490074 systemd-networkd[1366]: cali444ea01ab2d: Gained IPv6LL Jul 15 05:19:28.708689 kubelet[3163]: I0715 05:19:28.708646 3163 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdb5035e-0d08-4b3f-8e08-97dc3e8ae8b7" path="/var/lib/kubelet/pods/bdb5035e-0d08-4b3f-8e08-97dc3e8ae8b7/volumes" Jul 15 05:19:28.874106 systemd-networkd[1366]: vxlan.calico: Gained IPv6LL Jul 15 05:19:29.016616 containerd[1743]: time="2025-07-15T05:19:29.016576333Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:19:29.019333 containerd[1743]: 
time="2025-07-15T05:19:29.019301195Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.2: active requests=0, bytes read=4661207" Jul 15 05:19:29.022432 containerd[1743]: time="2025-07-15T05:19:29.022388959Z" level=info msg="ImageCreate event name:\"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:19:29.026796 containerd[1743]: time="2025-07-15T05:19:29.026741092Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:19:29.027443 containerd[1743]: time="2025-07-15T05:19:29.027101007Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.2\" with image id \"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\", size \"6153902\" in 1.457280928s" Jul 15 05:19:29.027443 containerd[1743]: time="2025-07-15T05:19:29.027128306Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\" returns image reference \"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\"" Jul 15 05:19:29.028947 containerd[1743]: time="2025-07-15T05:19:29.028907772Z" level=info msg="CreateContainer within sandbox \"d9e24a7556696e37bce4958b1ef3c75bdb556942f41eed86c52e6907c7bb3769\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Jul 15 05:19:29.059863 containerd[1743]: time="2025-07-15T05:19:29.059834973Z" level=info msg="Container 0656a7496ba96cadd0ddb4b51c88161e42f0bc3501ef73fadcd087b8da323ebc: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:19:29.083458 containerd[1743]: time="2025-07-15T05:19:29.083432879Z" level=info msg="CreateContainer within sandbox 
\"d9e24a7556696e37bce4958b1ef3c75bdb556942f41eed86c52e6907c7bb3769\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"0656a7496ba96cadd0ddb4b51c88161e42f0bc3501ef73fadcd087b8da323ebc\"" Jul 15 05:19:29.083974 containerd[1743]: time="2025-07-15T05:19:29.083795212Z" level=info msg="StartContainer for \"0656a7496ba96cadd0ddb4b51c88161e42f0bc3501ef73fadcd087b8da323ebc\"" Jul 15 05:19:29.084936 containerd[1743]: time="2025-07-15T05:19:29.084896729Z" level=info msg="connecting to shim 0656a7496ba96cadd0ddb4b51c88161e42f0bc3501ef73fadcd087b8da323ebc" address="unix:///run/containerd/s/0d9fbd1e5f17390644eb90052cc9a69d1f2a4c0998306ee0483dff08e8252ae6" protocol=ttrpc version=3 Jul 15 05:19:29.104077 systemd[1]: Started cri-containerd-0656a7496ba96cadd0ddb4b51c88161e42f0bc3501ef73fadcd087b8da323ebc.scope - libcontainer container 0656a7496ba96cadd0ddb4b51c88161e42f0bc3501ef73fadcd087b8da323ebc. Jul 15 05:19:29.149356 containerd[1743]: time="2025-07-15T05:19:29.148585356Z" level=info msg="StartContainer for \"0656a7496ba96cadd0ddb4b51c88161e42f0bc3501ef73fadcd087b8da323ebc\" returns successfully" Jul 15 05:19:29.150132 containerd[1743]: time="2025-07-15T05:19:29.150109335Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\"" Jul 15 05:19:29.707272 containerd[1743]: time="2025-07-15T05:19:29.707220794Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-796c4d775c-wrw9j,Uid:89a2e3cd-7a2d-4602-b983-f5c595a6dead,Namespace:calico-apiserver,Attempt:0,}" Jul 15 05:19:29.823491 systemd-networkd[1366]: cali5279b0827c7: Link UP Jul 15 05:19:29.824494 systemd-networkd[1366]: cali5279b0827c7: Gained carrier Jul 15 05:19:29.841875 containerd[1743]: 2025-07-15 05:19:29.767 [INFO][4595] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4396.0.0--n--1e5a06c7e3-k8s-calico--apiserver--796c4d775c--wrw9j-eth0 calico-apiserver-796c4d775c- calico-apiserver 
89a2e3cd-7a2d-4602-b983-f5c595a6dead 841 0 2025-07-15 05:19:01 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:796c4d775c projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4396.0.0-n-1e5a06c7e3 calico-apiserver-796c4d775c-wrw9j eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali5279b0827c7 [] [] }} ContainerID="fc70e220a5137c9ef62f7a03936882e6f465156b20b85f43fe2da6a25feef50d" Namespace="calico-apiserver" Pod="calico-apiserver-796c4d775c-wrw9j" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-calico--apiserver--796c4d775c--wrw9j-" Jul 15 05:19:29.841875 containerd[1743]: 2025-07-15 05:19:29.767 [INFO][4595] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="fc70e220a5137c9ef62f7a03936882e6f465156b20b85f43fe2da6a25feef50d" Namespace="calico-apiserver" Pod="calico-apiserver-796c4d775c-wrw9j" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-calico--apiserver--796c4d775c--wrw9j-eth0" Jul 15 05:19:29.841875 containerd[1743]: 2025-07-15 05:19:29.787 [INFO][4607] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fc70e220a5137c9ef62f7a03936882e6f465156b20b85f43fe2da6a25feef50d" HandleID="k8s-pod-network.fc70e220a5137c9ef62f7a03936882e6f465156b20b85f43fe2da6a25feef50d" Workload="ci--4396.0.0--n--1e5a06c7e3-k8s-calico--apiserver--796c4d775c--wrw9j-eth0" Jul 15 05:19:29.842080 containerd[1743]: 2025-07-15 05:19:29.787 [INFO][4607] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="fc70e220a5137c9ef62f7a03936882e6f465156b20b85f43fe2da6a25feef50d" HandleID="k8s-pod-network.fc70e220a5137c9ef62f7a03936882e6f465156b20b85f43fe2da6a25feef50d" Workload="ci--4396.0.0--n--1e5a06c7e3-k8s-calico--apiserver--796c4d775c--wrw9j-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024eff0), 
Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4396.0.0-n-1e5a06c7e3", "pod":"calico-apiserver-796c4d775c-wrw9j", "timestamp":"2025-07-15 05:19:29.787737403 +0000 UTC"}, Hostname:"ci-4396.0.0-n-1e5a06c7e3", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 05:19:29.842080 containerd[1743]: 2025-07-15 05:19:29.787 [INFO][4607] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:19:29.842080 containerd[1743]: 2025-07-15 05:19:29.787 [INFO][4607] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 15 05:19:29.842080 containerd[1743]: 2025-07-15 05:19:29.788 [INFO][4607] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4396.0.0-n-1e5a06c7e3' Jul 15 05:19:29.842080 containerd[1743]: 2025-07-15 05:19:29.793 [INFO][4607] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.fc70e220a5137c9ef62f7a03936882e6f465156b20b85f43fe2da6a25feef50d" host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:29.842080 containerd[1743]: 2025-07-15 05:19:29.796 [INFO][4607] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:29.842080 containerd[1743]: 2025-07-15 05:19:29.799 [INFO][4607] ipam/ipam.go 511: Trying affinity for 192.168.123.64/26 host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:29.842080 containerd[1743]: 2025-07-15 05:19:29.800 [INFO][4607] ipam/ipam.go 158: Attempting to load block cidr=192.168.123.64/26 host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:29.842080 containerd[1743]: 2025-07-15 05:19:29.802 [INFO][4607] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.123.64/26 host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:29.842259 containerd[1743]: 2025-07-15 05:19:29.802 [INFO][4607] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.123.64/26 
handle="k8s-pod-network.fc70e220a5137c9ef62f7a03936882e6f465156b20b85f43fe2da6a25feef50d" host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:29.842259 containerd[1743]: 2025-07-15 05:19:29.804 [INFO][4607] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.fc70e220a5137c9ef62f7a03936882e6f465156b20b85f43fe2da6a25feef50d Jul 15 05:19:29.842259 containerd[1743]: 2025-07-15 05:19:29.810 [INFO][4607] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.123.64/26 handle="k8s-pod-network.fc70e220a5137c9ef62f7a03936882e6f465156b20b85f43fe2da6a25feef50d" host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:29.842259 containerd[1743]: 2025-07-15 05:19:29.818 [INFO][4607] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.123.66/26] block=192.168.123.64/26 handle="k8s-pod-network.fc70e220a5137c9ef62f7a03936882e6f465156b20b85f43fe2da6a25feef50d" host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:29.842259 containerd[1743]: 2025-07-15 05:19:29.818 [INFO][4607] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.123.66/26] handle="k8s-pod-network.fc70e220a5137c9ef62f7a03936882e6f465156b20b85f43fe2da6a25feef50d" host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:29.842259 containerd[1743]: 2025-07-15 05:19:29.818 [INFO][4607] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jul 15 05:19:29.842259 containerd[1743]: 2025-07-15 05:19:29.819 [INFO][4607] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.123.66/26] IPv6=[] ContainerID="fc70e220a5137c9ef62f7a03936882e6f465156b20b85f43fe2da6a25feef50d" HandleID="k8s-pod-network.fc70e220a5137c9ef62f7a03936882e6f465156b20b85f43fe2da6a25feef50d" Workload="ci--4396.0.0--n--1e5a06c7e3-k8s-calico--apiserver--796c4d775c--wrw9j-eth0" Jul 15 05:19:29.842389 containerd[1743]: 2025-07-15 05:19:29.820 [INFO][4595] cni-plugin/k8s.go 418: Populated endpoint ContainerID="fc70e220a5137c9ef62f7a03936882e6f465156b20b85f43fe2da6a25feef50d" Namespace="calico-apiserver" Pod="calico-apiserver-796c4d775c-wrw9j" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-calico--apiserver--796c4d775c--wrw9j-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396.0.0--n--1e5a06c7e3-k8s-calico--apiserver--796c4d775c--wrw9j-eth0", GenerateName:"calico-apiserver-796c4d775c-", Namespace:"calico-apiserver", SelfLink:"", UID:"89a2e3cd-7a2d-4602-b983-f5c595a6dead", ResourceVersion:"841", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 19, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"796c4d775c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396.0.0-n-1e5a06c7e3", ContainerID:"", Pod:"calico-apiserver-796c4d775c-wrw9j", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.123.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali5279b0827c7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:19:29.842444 containerd[1743]: 2025-07-15 05:19:29.820 [INFO][4595] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.123.66/32] ContainerID="fc70e220a5137c9ef62f7a03936882e6f465156b20b85f43fe2da6a25feef50d" Namespace="calico-apiserver" Pod="calico-apiserver-796c4d775c-wrw9j" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-calico--apiserver--796c4d775c--wrw9j-eth0" Jul 15 05:19:29.842444 containerd[1743]: 2025-07-15 05:19:29.820 [INFO][4595] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5279b0827c7 ContainerID="fc70e220a5137c9ef62f7a03936882e6f465156b20b85f43fe2da6a25feef50d" Namespace="calico-apiserver" Pod="calico-apiserver-796c4d775c-wrw9j" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-calico--apiserver--796c4d775c--wrw9j-eth0" Jul 15 05:19:29.842444 containerd[1743]: 2025-07-15 05:19:29.825 [INFO][4595] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="fc70e220a5137c9ef62f7a03936882e6f465156b20b85f43fe2da6a25feef50d" Namespace="calico-apiserver" Pod="calico-apiserver-796c4d775c-wrw9j" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-calico--apiserver--796c4d775c--wrw9j-eth0" Jul 15 05:19:29.842497 containerd[1743]: 2025-07-15 05:19:29.826 [INFO][4595] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="fc70e220a5137c9ef62f7a03936882e6f465156b20b85f43fe2da6a25feef50d" Namespace="calico-apiserver" Pod="calico-apiserver-796c4d775c-wrw9j" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-calico--apiserver--796c4d775c--wrw9j-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396.0.0--n--1e5a06c7e3-k8s-calico--apiserver--796c4d775c--wrw9j-eth0", GenerateName:"calico-apiserver-796c4d775c-", Namespace:"calico-apiserver", SelfLink:"", UID:"89a2e3cd-7a2d-4602-b983-f5c595a6dead", ResourceVersion:"841", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 19, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"796c4d775c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396.0.0-n-1e5a06c7e3", ContainerID:"fc70e220a5137c9ef62f7a03936882e6f465156b20b85f43fe2da6a25feef50d", Pod:"calico-apiserver-796c4d775c-wrw9j", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.123.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali5279b0827c7", MAC:"42:cb:c8:95:12:a8", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:19:29.842548 containerd[1743]: 2025-07-15 05:19:29.839 [INFO][4595] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="fc70e220a5137c9ef62f7a03936882e6f465156b20b85f43fe2da6a25feef50d" Namespace="calico-apiserver" Pod="calico-apiserver-796c4d775c-wrw9j" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-calico--apiserver--796c4d775c--wrw9j-eth0" Jul 15 05:19:29.884025 containerd[1743]: time="2025-07-15T05:19:29.883993603Z" level=info 
msg="connecting to shim fc70e220a5137c9ef62f7a03936882e6f465156b20b85f43fe2da6a25feef50d" address="unix:///run/containerd/s/7a53f9b9cb53a70ad184d1d84c39f44f6cfb9ef905b05c590cfa69495ff38ade" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:19:29.902095 systemd[1]: Started cri-containerd-fc70e220a5137c9ef62f7a03936882e6f465156b20b85f43fe2da6a25feef50d.scope - libcontainer container fc70e220a5137c9ef62f7a03936882e6f465156b20b85f43fe2da6a25feef50d. Jul 15 05:19:29.969232 containerd[1743]: time="2025-07-15T05:19:29.969161382Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-796c4d775c-wrw9j,Uid:89a2e3cd-7a2d-4602-b983-f5c595a6dead,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"fc70e220a5137c9ef62f7a03936882e6f465156b20b85f43fe2da6a25feef50d\"" Jul 15 05:19:30.706972 containerd[1743]: time="2025-07-15T05:19:30.706880528Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-fbd55b5d6-zlcp7,Uid:9c7ce63e-afea-4962-bead-e12dc034d4c7,Namespace:calico-system,Attempt:0,}" Jul 15 05:19:30.707452 containerd[1743]: time="2025-07-15T05:19:30.707076364Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-h92x4,Uid:d0441816-eb8b-4adf-9c57-452d4b3a9e9f,Namespace:kube-system,Attempt:0,}" Jul 15 05:19:30.823839 systemd-networkd[1366]: calica70b18471d: Link UP Jul 15 05:19:30.824042 systemd-networkd[1366]: calica70b18471d: Gained carrier Jul 15 05:19:30.837786 containerd[1743]: 2025-07-15 05:19:30.751 [INFO][4673] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4396.0.0--n--1e5a06c7e3-k8s-coredns--7c65d6cfc9--h92x4-eth0 coredns-7c65d6cfc9- kube-system d0441816-eb8b-4adf-9c57-452d4b3a9e9f 834 0 2025-07-15 05:18:51 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s 
ci-4396.0.0-n-1e5a06c7e3 coredns-7c65d6cfc9-h92x4 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calica70b18471d [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="23d84f5cf7012701d0186ff2897ef9fdd640421ef6abf766d8bb83fee961992a" Namespace="kube-system" Pod="coredns-7c65d6cfc9-h92x4" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-coredns--7c65d6cfc9--h92x4-" Jul 15 05:19:30.837786 containerd[1743]: 2025-07-15 05:19:30.751 [INFO][4673] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="23d84f5cf7012701d0186ff2897ef9fdd640421ef6abf766d8bb83fee961992a" Namespace="kube-system" Pod="coredns-7c65d6cfc9-h92x4" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-coredns--7c65d6cfc9--h92x4-eth0" Jul 15 05:19:30.837786 containerd[1743]: 2025-07-15 05:19:30.779 [INFO][4695] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="23d84f5cf7012701d0186ff2897ef9fdd640421ef6abf766d8bb83fee961992a" HandleID="k8s-pod-network.23d84f5cf7012701d0186ff2897ef9fdd640421ef6abf766d8bb83fee961992a" Workload="ci--4396.0.0--n--1e5a06c7e3-k8s-coredns--7c65d6cfc9--h92x4-eth0" Jul 15 05:19:30.838229 containerd[1743]: 2025-07-15 05:19:30.780 [INFO][4695] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="23d84f5cf7012701d0186ff2897ef9fdd640421ef6abf766d8bb83fee961992a" HandleID="k8s-pod-network.23d84f5cf7012701d0186ff2897ef9fdd640421ef6abf766d8bb83fee961992a" Workload="ci--4396.0.0--n--1e5a06c7e3-k8s-coredns--7c65d6cfc9--h92x4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024eff0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4396.0.0-n-1e5a06c7e3", "pod":"coredns-7c65d6cfc9-h92x4", "timestamp":"2025-07-15 05:19:30.779643986 +0000 UTC"}, Hostname:"ci-4396.0.0-n-1e5a06c7e3", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), 
IntendedUse:"Workload"} Jul 15 05:19:30.838229 containerd[1743]: 2025-07-15 05:19:30.780 [INFO][4695] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:19:30.838229 containerd[1743]: 2025-07-15 05:19:30.780 [INFO][4695] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 15 05:19:30.838229 containerd[1743]: 2025-07-15 05:19:30.780 [INFO][4695] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4396.0.0-n-1e5a06c7e3' Jul 15 05:19:30.838229 containerd[1743]: 2025-07-15 05:19:30.787 [INFO][4695] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.23d84f5cf7012701d0186ff2897ef9fdd640421ef6abf766d8bb83fee961992a" host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:30.838229 containerd[1743]: 2025-07-15 05:19:30.791 [INFO][4695] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:30.838229 containerd[1743]: 2025-07-15 05:19:30.794 [INFO][4695] ipam/ipam.go 511: Trying affinity for 192.168.123.64/26 host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:30.838229 containerd[1743]: 2025-07-15 05:19:30.795 [INFO][4695] ipam/ipam.go 158: Attempting to load block cidr=192.168.123.64/26 host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:30.838229 containerd[1743]: 2025-07-15 05:19:30.797 [INFO][4695] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.123.64/26 host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:30.838998 containerd[1743]: 2025-07-15 05:19:30.797 [INFO][4695] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.123.64/26 handle="k8s-pod-network.23d84f5cf7012701d0186ff2897ef9fdd640421ef6abf766d8bb83fee961992a" host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:30.838998 containerd[1743]: 2025-07-15 05:19:30.798 [INFO][4695] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.23d84f5cf7012701d0186ff2897ef9fdd640421ef6abf766d8bb83fee961992a Jul 15 05:19:30.838998 containerd[1743]: 2025-07-15 05:19:30.806 [INFO][4695] 
ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.123.64/26 handle="k8s-pod-network.23d84f5cf7012701d0186ff2897ef9fdd640421ef6abf766d8bb83fee961992a" host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:30.838998 containerd[1743]: 2025-07-15 05:19:30.815 [INFO][4695] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.123.67/26] block=192.168.123.64/26 handle="k8s-pod-network.23d84f5cf7012701d0186ff2897ef9fdd640421ef6abf766d8bb83fee961992a" host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:30.838998 containerd[1743]: 2025-07-15 05:19:30.815 [INFO][4695] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.123.67/26] handle="k8s-pod-network.23d84f5cf7012701d0186ff2897ef9fdd640421ef6abf766d8bb83fee961992a" host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:30.838998 containerd[1743]: 2025-07-15 05:19:30.815 [INFO][4695] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 05:19:30.838998 containerd[1743]: 2025-07-15 05:19:30.815 [INFO][4695] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.123.67/26] IPv6=[] ContainerID="23d84f5cf7012701d0186ff2897ef9fdd640421ef6abf766d8bb83fee961992a" HandleID="k8s-pod-network.23d84f5cf7012701d0186ff2897ef9fdd640421ef6abf766d8bb83fee961992a" Workload="ci--4396.0.0--n--1e5a06c7e3-k8s-coredns--7c65d6cfc9--h92x4-eth0" Jul 15 05:19:30.839294 containerd[1743]: 2025-07-15 05:19:30.819 [INFO][4673] cni-plugin/k8s.go 418: Populated endpoint ContainerID="23d84f5cf7012701d0186ff2897ef9fdd640421ef6abf766d8bb83fee961992a" Namespace="kube-system" Pod="coredns-7c65d6cfc9-h92x4" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-coredns--7c65d6cfc9--h92x4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396.0.0--n--1e5a06c7e3-k8s-coredns--7c65d6cfc9--h92x4-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"d0441816-eb8b-4adf-9c57-452d4b3a9e9f", ResourceVersion:"834", 
Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 18, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396.0.0-n-1e5a06c7e3", ContainerID:"", Pod:"coredns-7c65d6cfc9-h92x4", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.123.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calica70b18471d", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:19:30.839294 containerd[1743]: 2025-07-15 05:19:30.819 [INFO][4673] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.123.67/32] ContainerID="23d84f5cf7012701d0186ff2897ef9fdd640421ef6abf766d8bb83fee961992a" Namespace="kube-system" Pod="coredns-7c65d6cfc9-h92x4" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-coredns--7c65d6cfc9--h92x4-eth0" Jul 15 05:19:30.839294 containerd[1743]: 2025-07-15 05:19:30.819 [INFO][4673] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calica70b18471d 
ContainerID="23d84f5cf7012701d0186ff2897ef9fdd640421ef6abf766d8bb83fee961992a" Namespace="kube-system" Pod="coredns-7c65d6cfc9-h92x4" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-coredns--7c65d6cfc9--h92x4-eth0" Jul 15 05:19:30.839294 containerd[1743]: 2025-07-15 05:19:30.823 [INFO][4673] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="23d84f5cf7012701d0186ff2897ef9fdd640421ef6abf766d8bb83fee961992a" Namespace="kube-system" Pod="coredns-7c65d6cfc9-h92x4" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-coredns--7c65d6cfc9--h92x4-eth0" Jul 15 05:19:30.839294 containerd[1743]: 2025-07-15 05:19:30.823 [INFO][4673] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="23d84f5cf7012701d0186ff2897ef9fdd640421ef6abf766d8bb83fee961992a" Namespace="kube-system" Pod="coredns-7c65d6cfc9-h92x4" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-coredns--7c65d6cfc9--h92x4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396.0.0--n--1e5a06c7e3-k8s-coredns--7c65d6cfc9--h92x4-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"d0441816-eb8b-4adf-9c57-452d4b3a9e9f", ResourceVersion:"834", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 18, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396.0.0-n-1e5a06c7e3", ContainerID:"23d84f5cf7012701d0186ff2897ef9fdd640421ef6abf766d8bb83fee961992a", 
Pod:"coredns-7c65d6cfc9-h92x4", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.123.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calica70b18471d", MAC:"0a:6d:b8:28:78:48", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:19:30.839294 containerd[1743]: 2025-07-15 05:19:30.836 [INFO][4673] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="23d84f5cf7012701d0186ff2897ef9fdd640421ef6abf766d8bb83fee961992a" Namespace="kube-system" Pod="coredns-7c65d6cfc9-h92x4" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-coredns--7c65d6cfc9--h92x4-eth0" Jul 15 05:19:30.889598 containerd[1743]: time="2025-07-15T05:19:30.889317689Z" level=info msg="connecting to shim 23d84f5cf7012701d0186ff2897ef9fdd640421ef6abf766d8bb83fee961992a" address="unix:///run/containerd/s/7953a83d1d3525310ddb1b824e926fd3c44c2d257e383fae5dd100ca5c338fc7" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:19:30.921876 systemd[1]: Started cri-containerd-23d84f5cf7012701d0186ff2897ef9fdd640421ef6abf766d8bb83fee961992a.scope - libcontainer container 23d84f5cf7012701d0186ff2897ef9fdd640421ef6abf766d8bb83fee961992a. 
Jul 15 05:19:30.938018 systemd-networkd[1366]: cali17f1bcb399a: Link UP Jul 15 05:19:30.939552 systemd-networkd[1366]: cali17f1bcb399a: Gained carrier Jul 15 05:19:30.969370 containerd[1743]: 2025-07-15 05:19:30.766 [INFO][4684] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4396.0.0--n--1e5a06c7e3-k8s-calico--kube--controllers--fbd55b5d6--zlcp7-eth0 calico-kube-controllers-fbd55b5d6- calico-system 9c7ce63e-afea-4962-bead-e12dc034d4c7 843 0 2025-07-15 05:19:06 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:fbd55b5d6 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4396.0.0-n-1e5a06c7e3 calico-kube-controllers-fbd55b5d6-zlcp7 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali17f1bcb399a [] [] }} ContainerID="7ea6c00bc9f51957d1660bb1bc312f9076a4607cf4fe3e400f9f5017fa467b7f" Namespace="calico-system" Pod="calico-kube-controllers-fbd55b5d6-zlcp7" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-calico--kube--controllers--fbd55b5d6--zlcp7-" Jul 15 05:19:30.969370 containerd[1743]: 2025-07-15 05:19:30.766 [INFO][4684] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7ea6c00bc9f51957d1660bb1bc312f9076a4607cf4fe3e400f9f5017fa467b7f" Namespace="calico-system" Pod="calico-kube-controllers-fbd55b5d6-zlcp7" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-calico--kube--controllers--fbd55b5d6--zlcp7-eth0" Jul 15 05:19:30.969370 containerd[1743]: 2025-07-15 05:19:30.790 [INFO][4705] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7ea6c00bc9f51957d1660bb1bc312f9076a4607cf4fe3e400f9f5017fa467b7f" HandleID="k8s-pod-network.7ea6c00bc9f51957d1660bb1bc312f9076a4607cf4fe3e400f9f5017fa467b7f" 
Workload="ci--4396.0.0--n--1e5a06c7e3-k8s-calico--kube--controllers--fbd55b5d6--zlcp7-eth0" Jul 15 05:19:30.969370 containerd[1743]: 2025-07-15 05:19:30.790 [INFO][4705] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7ea6c00bc9f51957d1660bb1bc312f9076a4607cf4fe3e400f9f5017fa467b7f" HandleID="k8s-pod-network.7ea6c00bc9f51957d1660bb1bc312f9076a4607cf4fe3e400f9f5017fa467b7f" Workload="ci--4396.0.0--n--1e5a06c7e3-k8s-calico--kube--controllers--fbd55b5d6--zlcp7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5640), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4396.0.0-n-1e5a06c7e3", "pod":"calico-kube-controllers-fbd55b5d6-zlcp7", "timestamp":"2025-07-15 05:19:30.790508846 +0000 UTC"}, Hostname:"ci-4396.0.0-n-1e5a06c7e3", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 05:19:30.969370 containerd[1743]: 2025-07-15 05:19:30.790 [INFO][4705] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:19:30.969370 containerd[1743]: 2025-07-15 05:19:30.815 [INFO][4705] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 15 05:19:30.969370 containerd[1743]: 2025-07-15 05:19:30.815 [INFO][4705] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4396.0.0-n-1e5a06c7e3' Jul 15 05:19:30.969370 containerd[1743]: 2025-07-15 05:19:30.888 [INFO][4705] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7ea6c00bc9f51957d1660bb1bc312f9076a4607cf4fe3e400f9f5017fa467b7f" host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:30.969370 containerd[1743]: 2025-07-15 05:19:30.892 [INFO][4705] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:30.969370 containerd[1743]: 2025-07-15 05:19:30.902 [INFO][4705] ipam/ipam.go 511: Trying affinity for 192.168.123.64/26 host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:30.969370 containerd[1743]: 2025-07-15 05:19:30.903 [INFO][4705] ipam/ipam.go 158: Attempting to load block cidr=192.168.123.64/26 host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:30.969370 containerd[1743]: 2025-07-15 05:19:30.906 [INFO][4705] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.123.64/26 host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:30.969370 containerd[1743]: 2025-07-15 05:19:30.906 [INFO][4705] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.123.64/26 handle="k8s-pod-network.7ea6c00bc9f51957d1660bb1bc312f9076a4607cf4fe3e400f9f5017fa467b7f" host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:30.969370 containerd[1743]: 2025-07-15 05:19:30.907 [INFO][4705] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.7ea6c00bc9f51957d1660bb1bc312f9076a4607cf4fe3e400f9f5017fa467b7f Jul 15 05:19:30.969370 containerd[1743]: 2025-07-15 05:19:30.911 [INFO][4705] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.123.64/26 handle="k8s-pod-network.7ea6c00bc9f51957d1660bb1bc312f9076a4607cf4fe3e400f9f5017fa467b7f" host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:30.969370 containerd[1743]: 2025-07-15 05:19:30.921 [INFO][4705] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.123.68/26] block=192.168.123.64/26 handle="k8s-pod-network.7ea6c00bc9f51957d1660bb1bc312f9076a4607cf4fe3e400f9f5017fa467b7f" host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:30.969370 containerd[1743]: 2025-07-15 05:19:30.921 [INFO][4705] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.123.68/26] handle="k8s-pod-network.7ea6c00bc9f51957d1660bb1bc312f9076a4607cf4fe3e400f9f5017fa467b7f" host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:30.969370 containerd[1743]: 2025-07-15 05:19:30.921 [INFO][4705] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 05:19:30.969370 containerd[1743]: 2025-07-15 05:19:30.921 [INFO][4705] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.123.68/26] IPv6=[] ContainerID="7ea6c00bc9f51957d1660bb1bc312f9076a4607cf4fe3e400f9f5017fa467b7f" HandleID="k8s-pod-network.7ea6c00bc9f51957d1660bb1bc312f9076a4607cf4fe3e400f9f5017fa467b7f" Workload="ci--4396.0.0--n--1e5a06c7e3-k8s-calico--kube--controllers--fbd55b5d6--zlcp7-eth0" Jul 15 05:19:30.969847 containerd[1743]: 2025-07-15 05:19:30.926 [INFO][4684] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7ea6c00bc9f51957d1660bb1bc312f9076a4607cf4fe3e400f9f5017fa467b7f" Namespace="calico-system" Pod="calico-kube-controllers-fbd55b5d6-zlcp7" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-calico--kube--controllers--fbd55b5d6--zlcp7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396.0.0--n--1e5a06c7e3-k8s-calico--kube--controllers--fbd55b5d6--zlcp7-eth0", GenerateName:"calico-kube-controllers-fbd55b5d6-", Namespace:"calico-system", SelfLink:"", UID:"9c7ce63e-afea-4962-bead-e12dc034d4c7", ResourceVersion:"843", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 19, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"fbd55b5d6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396.0.0-n-1e5a06c7e3", ContainerID:"", Pod:"calico-kube-controllers-fbd55b5d6-zlcp7", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.123.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali17f1bcb399a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:19:30.969847 containerd[1743]: 2025-07-15 05:19:30.926 [INFO][4684] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.123.68/32] ContainerID="7ea6c00bc9f51957d1660bb1bc312f9076a4607cf4fe3e400f9f5017fa467b7f" Namespace="calico-system" Pod="calico-kube-controllers-fbd55b5d6-zlcp7" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-calico--kube--controllers--fbd55b5d6--zlcp7-eth0" Jul 15 05:19:30.969847 containerd[1743]: 2025-07-15 05:19:30.926 [INFO][4684] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali17f1bcb399a ContainerID="7ea6c00bc9f51957d1660bb1bc312f9076a4607cf4fe3e400f9f5017fa467b7f" Namespace="calico-system" Pod="calico-kube-controllers-fbd55b5d6-zlcp7" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-calico--kube--controllers--fbd55b5d6--zlcp7-eth0" Jul 15 05:19:30.969847 containerd[1743]: 2025-07-15 05:19:30.944 [INFO][4684] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="7ea6c00bc9f51957d1660bb1bc312f9076a4607cf4fe3e400f9f5017fa467b7f" Namespace="calico-system" Pod="calico-kube-controllers-fbd55b5d6-zlcp7" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-calico--kube--controllers--fbd55b5d6--zlcp7-eth0" Jul 15 05:19:30.969847 containerd[1743]: 2025-07-15 05:19:30.945 [INFO][4684] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7ea6c00bc9f51957d1660bb1bc312f9076a4607cf4fe3e400f9f5017fa467b7f" Namespace="calico-system" Pod="calico-kube-controllers-fbd55b5d6-zlcp7" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-calico--kube--controllers--fbd55b5d6--zlcp7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396.0.0--n--1e5a06c7e3-k8s-calico--kube--controllers--fbd55b5d6--zlcp7-eth0", GenerateName:"calico-kube-controllers-fbd55b5d6-", Namespace:"calico-system", SelfLink:"", UID:"9c7ce63e-afea-4962-bead-e12dc034d4c7", ResourceVersion:"843", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 19, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"fbd55b5d6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396.0.0-n-1e5a06c7e3", ContainerID:"7ea6c00bc9f51957d1660bb1bc312f9076a4607cf4fe3e400f9f5017fa467b7f", Pod:"calico-kube-controllers-fbd55b5d6-zlcp7", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.123.68/32"}, IPNATs:[]v3.IPNAT(nil), 
IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali17f1bcb399a", MAC:"86:c3:23:a4:a2:cd", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:19:30.969847 containerd[1743]: 2025-07-15 05:19:30.965 [INFO][4684] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7ea6c00bc9f51957d1660bb1bc312f9076a4607cf4fe3e400f9f5017fa467b7f" Namespace="calico-system" Pod="calico-kube-controllers-fbd55b5d6-zlcp7" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-calico--kube--controllers--fbd55b5d6--zlcp7-eth0" Jul 15 05:19:30.986136 systemd-networkd[1366]: cali5279b0827c7: Gained IPv6LL Jul 15 05:19:31.001151 containerd[1743]: time="2025-07-15T05:19:31.001129662Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-h92x4,Uid:d0441816-eb8b-4adf-9c57-452d4b3a9e9f,Namespace:kube-system,Attempt:0,} returns sandbox id \"23d84f5cf7012701d0186ff2897ef9fdd640421ef6abf766d8bb83fee961992a\"" Jul 15 05:19:31.003062 containerd[1743]: time="2025-07-15T05:19:31.003009910Z" level=info msg="CreateContainer within sandbox \"23d84f5cf7012701d0186ff2897ef9fdd640421ef6abf766d8bb83fee961992a\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jul 15 05:19:31.028049 containerd[1743]: time="2025-07-15T05:19:31.027825035Z" level=info msg="connecting to shim 7ea6c00bc9f51957d1660bb1bc312f9076a4607cf4fe3e400f9f5017fa467b7f" address="unix:///run/containerd/s/c3024df5297226a79dbe27d60c8efe01c249452fc05fd75d28302808c4b51a56" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:19:31.030556 containerd[1743]: time="2025-07-15T05:19:31.030531217Z" level=info msg="Container b13bd7daba79955cffff1926574474993b14adb44f607f341b3eb7c10b1de72d: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:19:31.055090 systemd[1]: Started 
cri-containerd-7ea6c00bc9f51957d1660bb1bc312f9076a4607cf4fe3e400f9f5017fa467b7f.scope - libcontainer container 7ea6c00bc9f51957d1660bb1bc312f9076a4607cf4fe3e400f9f5017fa467b7f. Jul 15 05:19:31.191708 containerd[1743]: time="2025-07-15T05:19:31.191607797Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-fbd55b5d6-zlcp7,Uid:9c7ce63e-afea-4962-bead-e12dc034d4c7,Namespace:calico-system,Attempt:0,} returns sandbox id \"7ea6c00bc9f51957d1660bb1bc312f9076a4607cf4fe3e400f9f5017fa467b7f\"" Jul 15 05:19:31.192113 containerd[1743]: time="2025-07-15T05:19:31.192079592Z" level=info msg="CreateContainer within sandbox \"23d84f5cf7012701d0186ff2897ef9fdd640421ef6abf766d8bb83fee961992a\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"b13bd7daba79955cffff1926574474993b14adb44f607f341b3eb7c10b1de72d\"" Jul 15 05:19:31.193719 containerd[1743]: time="2025-07-15T05:19:31.192972289Z" level=info msg="StartContainer for \"b13bd7daba79955cffff1926574474993b14adb44f607f341b3eb7c10b1de72d\"" Jul 15 05:19:31.193719 containerd[1743]: time="2025-07-15T05:19:31.193667900Z" level=info msg="connecting to shim b13bd7daba79955cffff1926574474993b14adb44f607f341b3eb7c10b1de72d" address="unix:///run/containerd/s/7953a83d1d3525310ddb1b824e926fd3c44c2d257e383fae5dd100ca5c338fc7" protocol=ttrpc version=3 Jul 15 05:19:31.215165 systemd[1]: Started cri-containerd-b13bd7daba79955cffff1926574474993b14adb44f607f341b3eb7c10b1de72d.scope - libcontainer container b13bd7daba79955cffff1926574474993b14adb44f607f341b3eb7c10b1de72d. 
Jul 15 05:19:31.251301 containerd[1743]: time="2025-07-15T05:19:31.251155321Z" level=info msg="StartContainer for \"b13bd7daba79955cffff1926574474993b14adb44f607f341b3eb7c10b1de72d\" returns successfully" Jul 15 05:19:31.614456 containerd[1743]: time="2025-07-15T05:19:31.614415858Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:19:31.616896 containerd[1743]: time="2025-07-15T05:19:31.616851540Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.2: active requests=0, bytes read=33083477" Jul 15 05:19:31.620325 containerd[1743]: time="2025-07-15T05:19:31.620286596Z" level=info msg="ImageCreate event name:\"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:19:31.623898 containerd[1743]: time="2025-07-15T05:19:31.623846259Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:19:31.624452 containerd[1743]: time="2025-07-15T05:19:31.624346035Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" with image id \"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\", size \"33083307\" in 2.474207203s" Jul 15 05:19:31.624452 containerd[1743]: time="2025-07-15T05:19:31.624374019Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" returns image reference \"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\"" Jul 15 05:19:31.626024 containerd[1743]: time="2025-07-15T05:19:31.625972447Z" 
level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Jul 15 05:19:31.629903 containerd[1743]: time="2025-07-15T05:19:31.629141104Z" level=info msg="CreateContainer within sandbox \"d9e24a7556696e37bce4958b1ef3c75bdb556942f41eed86c52e6907c7bb3769\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Jul 15 05:19:31.663411 containerd[1743]: time="2025-07-15T05:19:31.663379806Z" level=info msg="Container 0bc516275af2153a3be0d1cf1098e2f626426e55f530b6f80b70428f8a603388: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:19:31.681796 containerd[1743]: time="2025-07-15T05:19:31.681770895Z" level=info msg="CreateContainer within sandbox \"d9e24a7556696e37bce4958b1ef3c75bdb556942f41eed86c52e6907c7bb3769\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"0bc516275af2153a3be0d1cf1098e2f626426e55f530b6f80b70428f8a603388\"" Jul 15 05:19:31.682118 containerd[1743]: time="2025-07-15T05:19:31.682102570Z" level=info msg="StartContainer for \"0bc516275af2153a3be0d1cf1098e2f626426e55f530b6f80b70428f8a603388\"" Jul 15 05:19:31.683727 containerd[1743]: time="2025-07-15T05:19:31.683690217Z" level=info msg="connecting to shim 0bc516275af2153a3be0d1cf1098e2f626426e55f530b6f80b70428f8a603388" address="unix:///run/containerd/s/0d9fbd1e5f17390644eb90052cc9a69d1f2a4c0998306ee0483dff08e8252ae6" protocol=ttrpc version=3 Jul 15 05:19:31.698115 systemd[1]: Started cri-containerd-0bc516275af2153a3be0d1cf1098e2f626426e55f530b6f80b70428f8a603388.scope - libcontainer container 0bc516275af2153a3be0d1cf1098e2f626426e55f530b6f80b70428f8a603388. 
Jul 15 05:19:31.739848 containerd[1743]: time="2025-07-15T05:19:31.739823087Z" level=info msg="StartContainer for \"0bc516275af2153a3be0d1cf1098e2f626426e55f530b6f80b70428f8a603388\" returns successfully" Jul 15 05:19:31.873241 kubelet[3163]: I0715 05:19:31.872386 3163 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-h92x4" podStartSLOduration=40.872367814 podStartE2EDuration="40.872367814s" podCreationTimestamp="2025-07-15 05:18:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 05:19:31.871546569 +0000 UTC m=+47.242541470" watchObservedRunningTime="2025-07-15 05:19:31.872367814 +0000 UTC m=+47.243362717" Jul 15 05:19:32.139086 systemd-networkd[1366]: cali17f1bcb399a: Gained IPv6LL Jul 15 05:19:32.707910 containerd[1743]: time="2025-07-15T05:19:32.707428855Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dg9p5,Uid:21f441b2-c7b8-4d3b-84a5-4e3e2ffc87ba,Namespace:calico-system,Attempt:0,}" Jul 15 05:19:32.801646 systemd-networkd[1366]: calie549c14c17f: Link UP Jul 15 05:19:32.802471 systemd-networkd[1366]: calie549c14c17f: Gained carrier Jul 15 05:19:32.814808 kubelet[3163]: I0715 05:19:32.814751 3163 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-f4ddbd7b6-t7wlz" podStartSLOduration=2.757068945 podStartE2EDuration="6.814732163s" podCreationTimestamp="2025-07-15 05:19:26 +0000 UTC" firstStartedPulling="2025-07-15 05:19:27.568019603 +0000 UTC m=+42.939014501" lastFinishedPulling="2025-07-15 05:19:31.625682824 +0000 UTC m=+46.996677719" observedRunningTime="2025-07-15 05:19:31.920025067 +0000 UTC m=+47.291019992" watchObservedRunningTime="2025-07-15 05:19:32.814732163 +0000 UTC m=+48.185727060" Jul 15 05:19:32.816685 containerd[1743]: 2025-07-15 05:19:32.750 [INFO][4899] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: 
&{{WorkloadEndpoint projectcalico.org/v3} {ci--4396.0.0--n--1e5a06c7e3-k8s-csi--node--driver--dg9p5-eth0 csi-node-driver- calico-system 21f441b2-c7b8-4d3b-84a5-4e3e2ffc87ba 721 0 2025-07-15 05:19:06 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:57bd658777 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4396.0.0-n-1e5a06c7e3 csi-node-driver-dg9p5 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calie549c14c17f [] [] }} ContainerID="f567281022f82ccd8a978ddfc3896214b3798a91aed024d9cf677e7aa0ccc31f" Namespace="calico-system" Pod="csi-node-driver-dg9p5" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-csi--node--driver--dg9p5-" Jul 15 05:19:32.816685 containerd[1743]: 2025-07-15 05:19:32.750 [INFO][4899] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f567281022f82ccd8a978ddfc3896214b3798a91aed024d9cf677e7aa0ccc31f" Namespace="calico-system" Pod="csi-node-driver-dg9p5" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-csi--node--driver--dg9p5-eth0" Jul 15 05:19:32.816685 containerd[1743]: 2025-07-15 05:19:32.768 [INFO][4910] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f567281022f82ccd8a978ddfc3896214b3798a91aed024d9cf677e7aa0ccc31f" HandleID="k8s-pod-network.f567281022f82ccd8a978ddfc3896214b3798a91aed024d9cf677e7aa0ccc31f" Workload="ci--4396.0.0--n--1e5a06c7e3-k8s-csi--node--driver--dg9p5-eth0" Jul 15 05:19:32.816685 containerd[1743]: 2025-07-15 05:19:32.768 [INFO][4910] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f567281022f82ccd8a978ddfc3896214b3798a91aed024d9cf677e7aa0ccc31f" HandleID="k8s-pod-network.f567281022f82ccd8a978ddfc3896214b3798a91aed024d9cf677e7aa0ccc31f" Workload="ci--4396.0.0--n--1e5a06c7e3-k8s-csi--node--driver--dg9p5-eth0" 
assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002cd040), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4396.0.0-n-1e5a06c7e3", "pod":"csi-node-driver-dg9p5", "timestamp":"2025-07-15 05:19:32.768381494 +0000 UTC"}, Hostname:"ci-4396.0.0-n-1e5a06c7e3", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 05:19:32.816685 containerd[1743]: 2025-07-15 05:19:32.768 [INFO][4910] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:19:32.816685 containerd[1743]: 2025-07-15 05:19:32.768 [INFO][4910] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 15 05:19:32.816685 containerd[1743]: 2025-07-15 05:19:32.768 [INFO][4910] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4396.0.0-n-1e5a06c7e3' Jul 15 05:19:32.816685 containerd[1743]: 2025-07-15 05:19:32.772 [INFO][4910] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f567281022f82ccd8a978ddfc3896214b3798a91aed024d9cf677e7aa0ccc31f" host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:32.816685 containerd[1743]: 2025-07-15 05:19:32.774 [INFO][4910] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:32.816685 containerd[1743]: 2025-07-15 05:19:32.777 [INFO][4910] ipam/ipam.go 511: Trying affinity for 192.168.123.64/26 host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:32.816685 containerd[1743]: 2025-07-15 05:19:32.778 [INFO][4910] ipam/ipam.go 158: Attempting to load block cidr=192.168.123.64/26 host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:32.816685 containerd[1743]: 2025-07-15 05:19:32.780 [INFO][4910] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.123.64/26 host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:32.816685 containerd[1743]: 2025-07-15 05:19:32.780 [INFO][4910] ipam/ipam.go 1220: 
Attempting to assign 1 addresses from block block=192.168.123.64/26 handle="k8s-pod-network.f567281022f82ccd8a978ddfc3896214b3798a91aed024d9cf677e7aa0ccc31f" host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:32.816685 containerd[1743]: 2025-07-15 05:19:32.781 [INFO][4910] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.f567281022f82ccd8a978ddfc3896214b3798a91aed024d9cf677e7aa0ccc31f Jul 15 05:19:32.816685 containerd[1743]: 2025-07-15 05:19:32.787 [INFO][4910] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.123.64/26 handle="k8s-pod-network.f567281022f82ccd8a978ddfc3896214b3798a91aed024d9cf677e7aa0ccc31f" host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:32.816685 containerd[1743]: 2025-07-15 05:19:32.796 [INFO][4910] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.123.69/26] block=192.168.123.64/26 handle="k8s-pod-network.f567281022f82ccd8a978ddfc3896214b3798a91aed024d9cf677e7aa0ccc31f" host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:32.816685 containerd[1743]: 2025-07-15 05:19:32.796 [INFO][4910] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.123.69/26] handle="k8s-pod-network.f567281022f82ccd8a978ddfc3896214b3798a91aed024d9cf677e7aa0ccc31f" host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:32.816685 containerd[1743]: 2025-07-15 05:19:32.796 [INFO][4910] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jul 15 05:19:32.816685 containerd[1743]: 2025-07-15 05:19:32.796 [INFO][4910] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.123.69/26] IPv6=[] ContainerID="f567281022f82ccd8a978ddfc3896214b3798a91aed024d9cf677e7aa0ccc31f" HandleID="k8s-pod-network.f567281022f82ccd8a978ddfc3896214b3798a91aed024d9cf677e7aa0ccc31f" Workload="ci--4396.0.0--n--1e5a06c7e3-k8s-csi--node--driver--dg9p5-eth0" Jul 15 05:19:32.817346 containerd[1743]: 2025-07-15 05:19:32.798 [INFO][4899] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f567281022f82ccd8a978ddfc3896214b3798a91aed024d9cf677e7aa0ccc31f" Namespace="calico-system" Pod="csi-node-driver-dg9p5" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-csi--node--driver--dg9p5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396.0.0--n--1e5a06c7e3-k8s-csi--node--driver--dg9p5-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"21f441b2-c7b8-4d3b-84a5-4e3e2ffc87ba", ResourceVersion:"721", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 19, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"57bd658777", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396.0.0-n-1e5a06c7e3", ContainerID:"", Pod:"csi-node-driver-dg9p5", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.123.69/32"}, IPNATs:[]v3.IPNAT(nil), 
IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calie549c14c17f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:19:32.817346 containerd[1743]: 2025-07-15 05:19:32.798 [INFO][4899] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.123.69/32] ContainerID="f567281022f82ccd8a978ddfc3896214b3798a91aed024d9cf677e7aa0ccc31f" Namespace="calico-system" Pod="csi-node-driver-dg9p5" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-csi--node--driver--dg9p5-eth0" Jul 15 05:19:32.817346 containerd[1743]: 2025-07-15 05:19:32.798 [INFO][4899] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie549c14c17f ContainerID="f567281022f82ccd8a978ddfc3896214b3798a91aed024d9cf677e7aa0ccc31f" Namespace="calico-system" Pod="csi-node-driver-dg9p5" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-csi--node--driver--dg9p5-eth0" Jul 15 05:19:32.817346 containerd[1743]: 2025-07-15 05:19:32.802 [INFO][4899] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f567281022f82ccd8a978ddfc3896214b3798a91aed024d9cf677e7aa0ccc31f" Namespace="calico-system" Pod="csi-node-driver-dg9p5" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-csi--node--driver--dg9p5-eth0" Jul 15 05:19:32.817346 containerd[1743]: 2025-07-15 05:19:32.803 [INFO][4899] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f567281022f82ccd8a978ddfc3896214b3798a91aed024d9cf677e7aa0ccc31f" Namespace="calico-system" Pod="csi-node-driver-dg9p5" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-csi--node--driver--dg9p5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396.0.0--n--1e5a06c7e3-k8s-csi--node--driver--dg9p5-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", 
SelfLink:"", UID:"21f441b2-c7b8-4d3b-84a5-4e3e2ffc87ba", ResourceVersion:"721", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 19, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"57bd658777", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396.0.0-n-1e5a06c7e3", ContainerID:"f567281022f82ccd8a978ddfc3896214b3798a91aed024d9cf677e7aa0ccc31f", Pod:"csi-node-driver-dg9p5", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.123.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calie549c14c17f", MAC:"0a:88:80:18:16:b3", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:19:32.817346 containerd[1743]: 2025-07-15 05:19:32.814 [INFO][4899] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f567281022f82ccd8a978ddfc3896214b3798a91aed024d9cf677e7aa0ccc31f" Namespace="calico-system" Pod="csi-node-driver-dg9p5" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-csi--node--driver--dg9p5-eth0" Jul 15 05:19:32.842030 systemd-networkd[1366]: calica70b18471d: Gained IPv6LL Jul 15 05:19:32.855368 containerd[1743]: time="2025-07-15T05:19:32.855297980Z" level=info msg="connecting to shim f567281022f82ccd8a978ddfc3896214b3798a91aed024d9cf677e7aa0ccc31f" 
address="unix:///run/containerd/s/1e6d21cc7803fb1fa33d25e40c47c0eacb1d69fad8925fa85319c9ac8c58fba9" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:19:32.877089 systemd[1]: Started cri-containerd-f567281022f82ccd8a978ddfc3896214b3798a91aed024d9cf677e7aa0ccc31f.scope - libcontainer container f567281022f82ccd8a978ddfc3896214b3798a91aed024d9cf677e7aa0ccc31f. Jul 15 05:19:32.898017 containerd[1743]: time="2025-07-15T05:19:32.897986026Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dg9p5,Uid:21f441b2-c7b8-4d3b-84a5-4e3e2ffc87ba,Namespace:calico-system,Attempt:0,} returns sandbox id \"f567281022f82ccd8a978ddfc3896214b3798a91aed024d9cf677e7aa0ccc31f\"" Jul 15 05:19:33.707118 containerd[1743]: time="2025-07-15T05:19:33.707068426Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-796c4d775c-b6jrw,Uid:07ac1f5f-59ee-45e7-8079-7306d9542bfe,Namespace:calico-apiserver,Attempt:0,}" Jul 15 05:19:33.818112 systemd-networkd[1366]: cali630907d0471: Link UP Jul 15 05:19:33.818280 systemd-networkd[1366]: cali630907d0471: Gained carrier Jul 15 05:19:33.834692 containerd[1743]: 2025-07-15 05:19:33.747 [INFO][4979] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4396.0.0--n--1e5a06c7e3-k8s-calico--apiserver--796c4d775c--b6jrw-eth0 calico-apiserver-796c4d775c- calico-apiserver 07ac1f5f-59ee-45e7-8079-7306d9542bfe 845 0 2025-07-15 05:19:01 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:796c4d775c projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4396.0.0-n-1e5a06c7e3 calico-apiserver-796c4d775c-b6jrw eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali630907d0471 [] [] }} ContainerID="9527d49bc903fb0094b6590bc3abffcd1c335c02233af8f9232821f554d14f2e" 
Namespace="calico-apiserver" Pod="calico-apiserver-796c4d775c-b6jrw" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-calico--apiserver--796c4d775c--b6jrw-" Jul 15 05:19:33.834692 containerd[1743]: 2025-07-15 05:19:33.747 [INFO][4979] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9527d49bc903fb0094b6590bc3abffcd1c335c02233af8f9232821f554d14f2e" Namespace="calico-apiserver" Pod="calico-apiserver-796c4d775c-b6jrw" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-calico--apiserver--796c4d775c--b6jrw-eth0" Jul 15 05:19:33.834692 containerd[1743]: 2025-07-15 05:19:33.774 [INFO][4990] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9527d49bc903fb0094b6590bc3abffcd1c335c02233af8f9232821f554d14f2e" HandleID="k8s-pod-network.9527d49bc903fb0094b6590bc3abffcd1c335c02233af8f9232821f554d14f2e" Workload="ci--4396.0.0--n--1e5a06c7e3-k8s-calico--apiserver--796c4d775c--b6jrw-eth0" Jul 15 05:19:33.834692 containerd[1743]: 2025-07-15 05:19:33.774 [INFO][4990] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9527d49bc903fb0094b6590bc3abffcd1c335c02233af8f9232821f554d14f2e" HandleID="k8s-pod-network.9527d49bc903fb0094b6590bc3abffcd1c335c02233af8f9232821f554d14f2e" Workload="ci--4396.0.0--n--1e5a06c7e3-k8s-calico--apiserver--796c4d775c--b6jrw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f230), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4396.0.0-n-1e5a06c7e3", "pod":"calico-apiserver-796c4d775c-b6jrw", "timestamp":"2025-07-15 05:19:33.774439649 +0000 UTC"}, Hostname:"ci-4396.0.0-n-1e5a06c7e3", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 05:19:33.834692 containerd[1743]: 2025-07-15 05:19:33.774 [INFO][4990] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Jul 15 05:19:33.834692 containerd[1743]: 2025-07-15 05:19:33.774 [INFO][4990] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 15 05:19:33.834692 containerd[1743]: 2025-07-15 05:19:33.774 [INFO][4990] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4396.0.0-n-1e5a06c7e3' Jul 15 05:19:33.834692 containerd[1743]: 2025-07-15 05:19:33.780 [INFO][4990] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9527d49bc903fb0094b6590bc3abffcd1c335c02233af8f9232821f554d14f2e" host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:33.834692 containerd[1743]: 2025-07-15 05:19:33.783 [INFO][4990] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:33.834692 containerd[1743]: 2025-07-15 05:19:33.787 [INFO][4990] ipam/ipam.go 511: Trying affinity for 192.168.123.64/26 host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:33.834692 containerd[1743]: 2025-07-15 05:19:33.788 [INFO][4990] ipam/ipam.go 158: Attempting to load block cidr=192.168.123.64/26 host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:33.834692 containerd[1743]: 2025-07-15 05:19:33.790 [INFO][4990] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.123.64/26 host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:33.834692 containerd[1743]: 2025-07-15 05:19:33.790 [INFO][4990] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.123.64/26 handle="k8s-pod-network.9527d49bc903fb0094b6590bc3abffcd1c335c02233af8f9232821f554d14f2e" host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:33.834692 containerd[1743]: 2025-07-15 05:19:33.791 [INFO][4990] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.9527d49bc903fb0094b6590bc3abffcd1c335c02233af8f9232821f554d14f2e Jul 15 05:19:33.834692 containerd[1743]: 2025-07-15 05:19:33.802 [INFO][4990] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.123.64/26 handle="k8s-pod-network.9527d49bc903fb0094b6590bc3abffcd1c335c02233af8f9232821f554d14f2e" 
host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:33.834692 containerd[1743]: 2025-07-15 05:19:33.811 [INFO][4990] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.123.70/26] block=192.168.123.64/26 handle="k8s-pod-network.9527d49bc903fb0094b6590bc3abffcd1c335c02233af8f9232821f554d14f2e" host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:33.834692 containerd[1743]: 2025-07-15 05:19:33.812 [INFO][4990] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.123.70/26] handle="k8s-pod-network.9527d49bc903fb0094b6590bc3abffcd1c335c02233af8f9232821f554d14f2e" host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:33.834692 containerd[1743]: 2025-07-15 05:19:33.812 [INFO][4990] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 05:19:33.834692 containerd[1743]: 2025-07-15 05:19:33.812 [INFO][4990] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.123.70/26] IPv6=[] ContainerID="9527d49bc903fb0094b6590bc3abffcd1c335c02233af8f9232821f554d14f2e" HandleID="k8s-pod-network.9527d49bc903fb0094b6590bc3abffcd1c335c02233af8f9232821f554d14f2e" Workload="ci--4396.0.0--n--1e5a06c7e3-k8s-calico--apiserver--796c4d775c--b6jrw-eth0" Jul 15 05:19:33.836696 containerd[1743]: 2025-07-15 05:19:33.813 [INFO][4979] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9527d49bc903fb0094b6590bc3abffcd1c335c02233af8f9232821f554d14f2e" Namespace="calico-apiserver" Pod="calico-apiserver-796c4d775c-b6jrw" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-calico--apiserver--796c4d775c--b6jrw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396.0.0--n--1e5a06c7e3-k8s-calico--apiserver--796c4d775c--b6jrw-eth0", GenerateName:"calico-apiserver-796c4d775c-", Namespace:"calico-apiserver", SelfLink:"", UID:"07ac1f5f-59ee-45e7-8079-7306d9542bfe", ResourceVersion:"845", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 19, 1, 0, time.Local), DeletionTimestamp:, 
DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"796c4d775c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396.0.0-n-1e5a06c7e3", ContainerID:"", Pod:"calico-apiserver-796c4d775c-b6jrw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.123.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali630907d0471", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:19:33.836696 containerd[1743]: 2025-07-15 05:19:33.813 [INFO][4979] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.123.70/32] ContainerID="9527d49bc903fb0094b6590bc3abffcd1c335c02233af8f9232821f554d14f2e" Namespace="calico-apiserver" Pod="calico-apiserver-796c4d775c-b6jrw" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-calico--apiserver--796c4d775c--b6jrw-eth0" Jul 15 05:19:33.836696 containerd[1743]: 2025-07-15 05:19:33.814 [INFO][4979] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali630907d0471 ContainerID="9527d49bc903fb0094b6590bc3abffcd1c335c02233af8f9232821f554d14f2e" Namespace="calico-apiserver" Pod="calico-apiserver-796c4d775c-b6jrw" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-calico--apiserver--796c4d775c--b6jrw-eth0" Jul 15 05:19:33.836696 containerd[1743]: 2025-07-15 05:19:33.816 [INFO][4979] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="9527d49bc903fb0094b6590bc3abffcd1c335c02233af8f9232821f554d14f2e" Namespace="calico-apiserver" Pod="calico-apiserver-796c4d775c-b6jrw" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-calico--apiserver--796c4d775c--b6jrw-eth0" Jul 15 05:19:33.836696 containerd[1743]: 2025-07-15 05:19:33.816 [INFO][4979] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9527d49bc903fb0094b6590bc3abffcd1c335c02233af8f9232821f554d14f2e" Namespace="calico-apiserver" Pod="calico-apiserver-796c4d775c-b6jrw" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-calico--apiserver--796c4d775c--b6jrw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396.0.0--n--1e5a06c7e3-k8s-calico--apiserver--796c4d775c--b6jrw-eth0", GenerateName:"calico-apiserver-796c4d775c-", Namespace:"calico-apiserver", SelfLink:"", UID:"07ac1f5f-59ee-45e7-8079-7306d9542bfe", ResourceVersion:"845", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 19, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"796c4d775c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396.0.0-n-1e5a06c7e3", ContainerID:"9527d49bc903fb0094b6590bc3abffcd1c335c02233af8f9232821f554d14f2e", Pod:"calico-apiserver-796c4d775c-b6jrw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.123.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali630907d0471", MAC:"16:9c:f2:2c:34:72", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:19:33.836696 containerd[1743]: 2025-07-15 05:19:33.833 [INFO][4979] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9527d49bc903fb0094b6590bc3abffcd1c335c02233af8f9232821f554d14f2e" Namespace="calico-apiserver" Pod="calico-apiserver-796c4d775c-b6jrw" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-calico--apiserver--796c4d775c--b6jrw-eth0" Jul 15 05:19:34.180045 containerd[1743]: time="2025-07-15T05:19:34.179987423Z" level=info msg="connecting to shim 9527d49bc903fb0094b6590bc3abffcd1c335c02233af8f9232821f554d14f2e" address="unix:///run/containerd/s/a1a86566e4b1e33065f46c7c461838512713fade9616da4e3e9ae71f3c69c667" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:19:34.211101 systemd[1]: Started cri-containerd-9527d49bc903fb0094b6590bc3abffcd1c335c02233af8f9232821f554d14f2e.scope - libcontainer container 9527d49bc903fb0094b6590bc3abffcd1c335c02233af8f9232821f554d14f2e. 
Jul 15 05:19:34.262048 containerd[1743]: time="2025-07-15T05:19:34.262009072Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-796c4d775c-b6jrw,Uid:07ac1f5f-59ee-45e7-8079-7306d9542bfe,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"9527d49bc903fb0094b6590bc3abffcd1c335c02233af8f9232821f554d14f2e\"" Jul 15 05:19:34.634155 systemd-networkd[1366]: calie549c14c17f: Gained IPv6LL Jul 15 05:19:34.708050 containerd[1743]: time="2025-07-15T05:19:34.707984105Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-c5cfl,Uid:7cbdf8eb-12a8-45a5-adbf-c095ae49accc,Namespace:kube-system,Attempt:0,}" Jul 15 05:19:34.708278 containerd[1743]: time="2025-07-15T05:19:34.707991306Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-sxqg6,Uid:9a75b180-3ce9-4020-99a4-ab3fc3f07d66,Namespace:calico-system,Attempt:0,}" Jul 15 05:19:34.884883 containerd[1743]: time="2025-07-15T05:19:34.884778767Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:19:34.887777 containerd[1743]: time="2025-07-15T05:19:34.887752494Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=47317977" Jul 15 05:19:34.891982 containerd[1743]: time="2025-07-15T05:19:34.891549852Z" level=info msg="ImageCreate event name:\"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:19:34.896845 systemd-networkd[1366]: calia72c0d46023: Link UP Jul 15 05:19:34.897841 systemd-networkd[1366]: calia72c0d46023: Gained carrier Jul 15 05:19:34.902721 containerd[1743]: time="2025-07-15T05:19:34.902694336Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\" labels:{key:\"io.cri-containerd.image\" 
value:\"managed\"}" Jul 15 05:19:34.906840 containerd[1743]: time="2025-07-15T05:19:34.906782174Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"48810696\" in 3.280674497s" Jul 15 05:19:34.906942 containerd[1743]: time="2025-07-15T05:19:34.906932881Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\"" Jul 15 05:19:34.908847 containerd[1743]: time="2025-07-15T05:19:34.908786670Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\"" Jul 15 05:19:34.911045 containerd[1743]: time="2025-07-15T05:19:34.910984484Z" level=info msg="CreateContainer within sandbox \"fc70e220a5137c9ef62f7a03936882e6f465156b20b85f43fe2da6a25feef50d\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jul 15 05:19:34.933413 containerd[1743]: 2025-07-15 05:19:34.816 [INFO][5054] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4396.0.0--n--1e5a06c7e3-k8s-coredns--7c65d6cfc9--c5cfl-eth0 coredns-7c65d6cfc9- kube-system 7cbdf8eb-12a8-45a5-adbf-c095ae49accc 837 0 2025-07-15 05:18:51 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4396.0.0-n-1e5a06c7e3 coredns-7c65d6cfc9-c5cfl eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calia72c0d46023 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="5caf7e1e7b8be5a9f0ba7d293ce7165737fb9239517e623773a6c968fb59b6ba" Namespace="kube-system" 
Pod="coredns-7c65d6cfc9-c5cfl" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-coredns--7c65d6cfc9--c5cfl-" Jul 15 05:19:34.933413 containerd[1743]: 2025-07-15 05:19:34.817 [INFO][5054] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5caf7e1e7b8be5a9f0ba7d293ce7165737fb9239517e623773a6c968fb59b6ba" Namespace="kube-system" Pod="coredns-7c65d6cfc9-c5cfl" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-coredns--7c65d6cfc9--c5cfl-eth0" Jul 15 05:19:34.933413 containerd[1743]: 2025-07-15 05:19:34.853 [INFO][5086] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5caf7e1e7b8be5a9f0ba7d293ce7165737fb9239517e623773a6c968fb59b6ba" HandleID="k8s-pod-network.5caf7e1e7b8be5a9f0ba7d293ce7165737fb9239517e623773a6c968fb59b6ba" Workload="ci--4396.0.0--n--1e5a06c7e3-k8s-coredns--7c65d6cfc9--c5cfl-eth0" Jul 15 05:19:34.933413 containerd[1743]: 2025-07-15 05:19:34.853 [INFO][5086] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5caf7e1e7b8be5a9f0ba7d293ce7165737fb9239517e623773a6c968fb59b6ba" HandleID="k8s-pod-network.5caf7e1e7b8be5a9f0ba7d293ce7165737fb9239517e623773a6c968fb59b6ba" Workload="ci--4396.0.0--n--1e5a06c7e3-k8s-coredns--7c65d6cfc9--c5cfl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5860), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4396.0.0-n-1e5a06c7e3", "pod":"coredns-7c65d6cfc9-c5cfl", "timestamp":"2025-07-15 05:19:34.85365198 +0000 UTC"}, Hostname:"ci-4396.0.0-n-1e5a06c7e3", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 05:19:34.933413 containerd[1743]: 2025-07-15 05:19:34.853 [INFO][5086] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:19:34.933413 containerd[1743]: 2025-07-15 05:19:34.853 [INFO][5086] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 15 05:19:34.933413 containerd[1743]: 2025-07-15 05:19:34.853 [INFO][5086] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4396.0.0-n-1e5a06c7e3' Jul 15 05:19:34.933413 containerd[1743]: 2025-07-15 05:19:34.861 [INFO][5086] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5caf7e1e7b8be5a9f0ba7d293ce7165737fb9239517e623773a6c968fb59b6ba" host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:34.933413 containerd[1743]: 2025-07-15 05:19:34.864 [INFO][5086] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:34.933413 containerd[1743]: 2025-07-15 05:19:34.870 [INFO][5086] ipam/ipam.go 511: Trying affinity for 192.168.123.64/26 host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:34.933413 containerd[1743]: 2025-07-15 05:19:34.873 [INFO][5086] ipam/ipam.go 158: Attempting to load block cidr=192.168.123.64/26 host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:34.933413 containerd[1743]: 2025-07-15 05:19:34.876 [INFO][5086] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.123.64/26 host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:34.933413 containerd[1743]: 2025-07-15 05:19:34.876 [INFO][5086] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.123.64/26 handle="k8s-pod-network.5caf7e1e7b8be5a9f0ba7d293ce7165737fb9239517e623773a6c968fb59b6ba" host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:34.933413 containerd[1743]: 2025-07-15 05:19:34.878 [INFO][5086] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.5caf7e1e7b8be5a9f0ba7d293ce7165737fb9239517e623773a6c968fb59b6ba Jul 15 05:19:34.933413 containerd[1743]: 2025-07-15 05:19:34.882 [INFO][5086] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.123.64/26 handle="k8s-pod-network.5caf7e1e7b8be5a9f0ba7d293ce7165737fb9239517e623773a6c968fb59b6ba" host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:34.933413 containerd[1743]: 2025-07-15 05:19:34.890 [INFO][5086] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.123.71/26] block=192.168.123.64/26 handle="k8s-pod-network.5caf7e1e7b8be5a9f0ba7d293ce7165737fb9239517e623773a6c968fb59b6ba" host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:34.933413 containerd[1743]: 2025-07-15 05:19:34.890 [INFO][5086] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.123.71/26] handle="k8s-pod-network.5caf7e1e7b8be5a9f0ba7d293ce7165737fb9239517e623773a6c968fb59b6ba" host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:34.933413 containerd[1743]: 2025-07-15 05:19:34.890 [INFO][5086] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 05:19:34.933413 containerd[1743]: 2025-07-15 05:19:34.890 [INFO][5086] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.123.71/26] IPv6=[] ContainerID="5caf7e1e7b8be5a9f0ba7d293ce7165737fb9239517e623773a6c968fb59b6ba" HandleID="k8s-pod-network.5caf7e1e7b8be5a9f0ba7d293ce7165737fb9239517e623773a6c968fb59b6ba" Workload="ci--4396.0.0--n--1e5a06c7e3-k8s-coredns--7c65d6cfc9--c5cfl-eth0" Jul 15 05:19:34.933910 containerd[1743]: 2025-07-15 05:19:34.892 [INFO][5054] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5caf7e1e7b8be5a9f0ba7d293ce7165737fb9239517e623773a6c968fb59b6ba" Namespace="kube-system" Pod="coredns-7c65d6cfc9-c5cfl" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-coredns--7c65d6cfc9--c5cfl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396.0.0--n--1e5a06c7e3-k8s-coredns--7c65d6cfc9--c5cfl-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"7cbdf8eb-12a8-45a5-adbf-c095ae49accc", ResourceVersion:"837", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 18, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396.0.0-n-1e5a06c7e3", ContainerID:"", Pod:"coredns-7c65d6cfc9-c5cfl", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.123.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia72c0d46023", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:19:34.933910 containerd[1743]: 2025-07-15 05:19:34.892 [INFO][5054] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.123.71/32] ContainerID="5caf7e1e7b8be5a9f0ba7d293ce7165737fb9239517e623773a6c968fb59b6ba" Namespace="kube-system" Pod="coredns-7c65d6cfc9-c5cfl" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-coredns--7c65d6cfc9--c5cfl-eth0" Jul 15 05:19:34.933910 containerd[1743]: 2025-07-15 05:19:34.892 [INFO][5054] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia72c0d46023 ContainerID="5caf7e1e7b8be5a9f0ba7d293ce7165737fb9239517e623773a6c968fb59b6ba" Namespace="kube-system" Pod="coredns-7c65d6cfc9-c5cfl" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-coredns--7c65d6cfc9--c5cfl-eth0" Jul 15 05:19:34.933910 containerd[1743]: 2025-07-15 05:19:34.895 [INFO][5054] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5caf7e1e7b8be5a9f0ba7d293ce7165737fb9239517e623773a6c968fb59b6ba" Namespace="kube-system" Pod="coredns-7c65d6cfc9-c5cfl" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-coredns--7c65d6cfc9--c5cfl-eth0" Jul 15 05:19:34.933910 containerd[1743]: 2025-07-15 05:19:34.898 [INFO][5054] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5caf7e1e7b8be5a9f0ba7d293ce7165737fb9239517e623773a6c968fb59b6ba" Namespace="kube-system" Pod="coredns-7c65d6cfc9-c5cfl" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-coredns--7c65d6cfc9--c5cfl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396.0.0--n--1e5a06c7e3-k8s-coredns--7c65d6cfc9--c5cfl-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"7cbdf8eb-12a8-45a5-adbf-c095ae49accc", ResourceVersion:"837", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 18, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396.0.0-n-1e5a06c7e3", ContainerID:"5caf7e1e7b8be5a9f0ba7d293ce7165737fb9239517e623773a6c968fb59b6ba", Pod:"coredns-7c65d6cfc9-c5cfl", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.123.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia72c0d46023", MAC:"ee:04:18:99:f7:11", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:19:34.933910 containerd[1743]: 2025-07-15 05:19:34.914 [INFO][5054] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5caf7e1e7b8be5a9f0ba7d293ce7165737fb9239517e623773a6c968fb59b6ba" Namespace="kube-system" Pod="coredns-7c65d6cfc9-c5cfl" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-coredns--7c65d6cfc9--c5cfl-eth0" Jul 15 05:19:34.941982 containerd[1743]: time="2025-07-15T05:19:34.941330558Z" level=info msg="Container d521b2eb6eae06c7141b186e45d274d83f664bd4225bf9bd434742008c2fc28d: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:19:34.973128 containerd[1743]: time="2025-07-15T05:19:34.973098291Z" level=info msg="CreateContainer within sandbox \"fc70e220a5137c9ef62f7a03936882e6f465156b20b85f43fe2da6a25feef50d\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"d521b2eb6eae06c7141b186e45d274d83f664bd4225bf9bd434742008c2fc28d\"" Jul 15 05:19:34.974085 containerd[1743]: time="2025-07-15T05:19:34.973986697Z" level=info msg="StartContainer for \"d521b2eb6eae06c7141b186e45d274d83f664bd4225bf9bd434742008c2fc28d\"" Jul 15 05:19:34.980523 containerd[1743]: time="2025-07-15T05:19:34.980499447Z" level=info msg="connecting to shim d521b2eb6eae06c7141b186e45d274d83f664bd4225bf9bd434742008c2fc28d" address="unix:///run/containerd/s/7a53f9b9cb53a70ad184d1d84c39f44f6cfb9ef905b05c590cfa69495ff38ade" protocol=ttrpc version=3 Jul 15 05:19:34.998752 systemd[1]: Started 
cri-containerd-d521b2eb6eae06c7141b186e45d274d83f664bd4225bf9bd434742008c2fc28d.scope - libcontainer container d521b2eb6eae06c7141b186e45d274d83f664bd4225bf9bd434742008c2fc28d. Jul 15 05:19:35.006950 containerd[1743]: time="2025-07-15T05:19:35.006923223Z" level=info msg="connecting to shim 5caf7e1e7b8be5a9f0ba7d293ce7165737fb9239517e623773a6c968fb59b6ba" address="unix:///run/containerd/s/2baee7311193bbeb12c4292d154f2a418907c3efc5a20bfdd13147c802f2ad9a" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:19:35.012708 systemd-networkd[1366]: calie36e69354c4: Link UP Jul 15 05:19:35.016728 systemd-networkd[1366]: calie36e69354c4: Gained carrier Jul 15 05:19:35.037209 containerd[1743]: 2025-07-15 05:19:34.827 [INFO][5058] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4396.0.0--n--1e5a06c7e3-k8s-goldmane--58fd7646b9--sxqg6-eth0 goldmane-58fd7646b9- calico-system 9a75b180-3ce9-4020-99a4-ab3fc3f07d66 844 0 2025-07-15 05:19:06 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:58fd7646b9 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4396.0.0-n-1e5a06c7e3 goldmane-58fd7646b9-sxqg6 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calie36e69354c4 [] [] }} ContainerID="7228484e39fe513e52429b680beca104629ea99658960c5cb2fc0d45846b71f9" Namespace="calico-system" Pod="goldmane-58fd7646b9-sxqg6" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-goldmane--58fd7646b9--sxqg6-" Jul 15 05:19:35.037209 containerd[1743]: 2025-07-15 05:19:34.827 [INFO][5058] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7228484e39fe513e52429b680beca104629ea99658960c5cb2fc0d45846b71f9" Namespace="calico-system" Pod="goldmane-58fd7646b9-sxqg6" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-goldmane--58fd7646b9--sxqg6-eth0" Jul 15 05:19:35.037209 containerd[1743]: 2025-07-15 
05:19:34.857 [INFO][5091] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7228484e39fe513e52429b680beca104629ea99658960c5cb2fc0d45846b71f9" HandleID="k8s-pod-network.7228484e39fe513e52429b680beca104629ea99658960c5cb2fc0d45846b71f9" Workload="ci--4396.0.0--n--1e5a06c7e3-k8s-goldmane--58fd7646b9--sxqg6-eth0" Jul 15 05:19:35.037209 containerd[1743]: 2025-07-15 05:19:34.857 [INFO][5091] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7228484e39fe513e52429b680beca104629ea99658960c5cb2fc0d45846b71f9" HandleID="k8s-pod-network.7228484e39fe513e52429b680beca104629ea99658960c5cb2fc0d45846b71f9" Workload="ci--4396.0.0--n--1e5a06c7e3-k8s-goldmane--58fd7646b9--sxqg6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5890), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4396.0.0-n-1e5a06c7e3", "pod":"goldmane-58fd7646b9-sxqg6", "timestamp":"2025-07-15 05:19:34.857587228 +0000 UTC"}, Hostname:"ci-4396.0.0-n-1e5a06c7e3", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 05:19:35.037209 containerd[1743]: 2025-07-15 05:19:34.857 [INFO][5091] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:19:35.037209 containerd[1743]: 2025-07-15 05:19:34.892 [INFO][5091] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 15 05:19:35.037209 containerd[1743]: 2025-07-15 05:19:34.892 [INFO][5091] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4396.0.0-n-1e5a06c7e3' Jul 15 05:19:35.037209 containerd[1743]: 2025-07-15 05:19:34.961 [INFO][5091] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7228484e39fe513e52429b680beca104629ea99658960c5cb2fc0d45846b71f9" host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:35.037209 containerd[1743]: 2025-07-15 05:19:34.965 [INFO][5091] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:35.037209 containerd[1743]: 2025-07-15 05:19:34.968 [INFO][5091] ipam/ipam.go 511: Trying affinity for 192.168.123.64/26 host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:35.037209 containerd[1743]: 2025-07-15 05:19:34.970 [INFO][5091] ipam/ipam.go 158: Attempting to load block cidr=192.168.123.64/26 host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:35.037209 containerd[1743]: 2025-07-15 05:19:34.972 [INFO][5091] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.123.64/26 host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:35.037209 containerd[1743]: 2025-07-15 05:19:34.972 [INFO][5091] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.123.64/26 handle="k8s-pod-network.7228484e39fe513e52429b680beca104629ea99658960c5cb2fc0d45846b71f9" host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:35.037209 containerd[1743]: 2025-07-15 05:19:34.974 [INFO][5091] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.7228484e39fe513e52429b680beca104629ea99658960c5cb2fc0d45846b71f9 Jul 15 05:19:35.037209 containerd[1743]: 2025-07-15 05:19:34.980 [INFO][5091] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.123.64/26 handle="k8s-pod-network.7228484e39fe513e52429b680beca104629ea99658960c5cb2fc0d45846b71f9" host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:35.037209 containerd[1743]: 2025-07-15 05:19:35.002 [INFO][5091] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.123.72/26] block=192.168.123.64/26 handle="k8s-pod-network.7228484e39fe513e52429b680beca104629ea99658960c5cb2fc0d45846b71f9" host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:35.037209 containerd[1743]: 2025-07-15 05:19:35.002 [INFO][5091] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.123.72/26] handle="k8s-pod-network.7228484e39fe513e52429b680beca104629ea99658960c5cb2fc0d45846b71f9" host="ci-4396.0.0-n-1e5a06c7e3" Jul 15 05:19:35.037209 containerd[1743]: 2025-07-15 05:19:35.002 [INFO][5091] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 05:19:35.037209 containerd[1743]: 2025-07-15 05:19:35.002 [INFO][5091] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.123.72/26] IPv6=[] ContainerID="7228484e39fe513e52429b680beca104629ea99658960c5cb2fc0d45846b71f9" HandleID="k8s-pod-network.7228484e39fe513e52429b680beca104629ea99658960c5cb2fc0d45846b71f9" Workload="ci--4396.0.0--n--1e5a06c7e3-k8s-goldmane--58fd7646b9--sxqg6-eth0" Jul 15 05:19:35.037717 containerd[1743]: 2025-07-15 05:19:35.007 [INFO][5058] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7228484e39fe513e52429b680beca104629ea99658960c5cb2fc0d45846b71f9" Namespace="calico-system" Pod="goldmane-58fd7646b9-sxqg6" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-goldmane--58fd7646b9--sxqg6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396.0.0--n--1e5a06c7e3-k8s-goldmane--58fd7646b9--sxqg6-eth0", GenerateName:"goldmane-58fd7646b9-", Namespace:"calico-system", SelfLink:"", UID:"9a75b180-3ce9-4020-99a4-ab3fc3f07d66", ResourceVersion:"844", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 19, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"58fd7646b9", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396.0.0-n-1e5a06c7e3", ContainerID:"", Pod:"goldmane-58fd7646b9-sxqg6", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.123.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calie36e69354c4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:19:35.037717 containerd[1743]: 2025-07-15 05:19:35.008 [INFO][5058] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.123.72/32] ContainerID="7228484e39fe513e52429b680beca104629ea99658960c5cb2fc0d45846b71f9" Namespace="calico-system" Pod="goldmane-58fd7646b9-sxqg6" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-goldmane--58fd7646b9--sxqg6-eth0" Jul 15 05:19:35.037717 containerd[1743]: 2025-07-15 05:19:35.008 [INFO][5058] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie36e69354c4 ContainerID="7228484e39fe513e52429b680beca104629ea99658960c5cb2fc0d45846b71f9" Namespace="calico-system" Pod="goldmane-58fd7646b9-sxqg6" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-goldmane--58fd7646b9--sxqg6-eth0" Jul 15 05:19:35.037717 containerd[1743]: 2025-07-15 05:19:35.018 [INFO][5058] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7228484e39fe513e52429b680beca104629ea99658960c5cb2fc0d45846b71f9" Namespace="calico-system" Pod="goldmane-58fd7646b9-sxqg6" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-goldmane--58fd7646b9--sxqg6-eth0" Jul 15 05:19:35.037717 containerd[1743]: 2025-07-15 05:19:35.018 [INFO][5058] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7228484e39fe513e52429b680beca104629ea99658960c5cb2fc0d45846b71f9" Namespace="calico-system" Pod="goldmane-58fd7646b9-sxqg6" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-goldmane--58fd7646b9--sxqg6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396.0.0--n--1e5a06c7e3-k8s-goldmane--58fd7646b9--sxqg6-eth0", GenerateName:"goldmane-58fd7646b9-", Namespace:"calico-system", SelfLink:"", UID:"9a75b180-3ce9-4020-99a4-ab3fc3f07d66", ResourceVersion:"844", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 19, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"58fd7646b9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396.0.0-n-1e5a06c7e3", ContainerID:"7228484e39fe513e52429b680beca104629ea99658960c5cb2fc0d45846b71f9", Pod:"goldmane-58fd7646b9-sxqg6", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.123.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calie36e69354c4", MAC:"26:d9:60:a7:51:c7", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:19:35.037717 containerd[1743]: 2025-07-15 05:19:35.032 [INFO][5058] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="7228484e39fe513e52429b680beca104629ea99658960c5cb2fc0d45846b71f9" Namespace="calico-system" Pod="goldmane-58fd7646b9-sxqg6" WorkloadEndpoint="ci--4396.0.0--n--1e5a06c7e3-k8s-goldmane--58fd7646b9--sxqg6-eth0" Jul 15 05:19:35.054241 systemd[1]: Started cri-containerd-5caf7e1e7b8be5a9f0ba7d293ce7165737fb9239517e623773a6c968fb59b6ba.scope - libcontainer container 5caf7e1e7b8be5a9f0ba7d293ce7165737fb9239517e623773a6c968fb59b6ba. Jul 15 05:19:35.081871 containerd[1743]: time="2025-07-15T05:19:35.081840601Z" level=info msg="connecting to shim 7228484e39fe513e52429b680beca104629ea99658960c5cb2fc0d45846b71f9" address="unix:///run/containerd/s/2e7368957923d3e4eb42e67a60a23185a17e7006b16fe8c656ad4a067ea0e6bf" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:19:35.109429 containerd[1743]: time="2025-07-15T05:19:35.109346891Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-c5cfl,Uid:7cbdf8eb-12a8-45a5-adbf-c095ae49accc,Namespace:kube-system,Attempt:0,} returns sandbox id \"5caf7e1e7b8be5a9f0ba7d293ce7165737fb9239517e623773a6c968fb59b6ba\"" Jul 15 05:19:35.111778 containerd[1743]: time="2025-07-15T05:19:35.111759510Z" level=info msg="StartContainer for \"d521b2eb6eae06c7141b186e45d274d83f664bd4225bf9bd434742008c2fc28d\" returns successfully" Jul 15 05:19:35.115472 containerd[1743]: time="2025-07-15T05:19:35.115448306Z" level=info msg="CreateContainer within sandbox \"5caf7e1e7b8be5a9f0ba7d293ce7165737fb9239517e623773a6c968fb59b6ba\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jul 15 05:19:35.117146 systemd[1]: Started cri-containerd-7228484e39fe513e52429b680beca104629ea99658960c5cb2fc0d45846b71f9.scope - libcontainer container 7228484e39fe513e52429b680beca104629ea99658960c5cb2fc0d45846b71f9. 
Jul 15 05:19:35.133854 containerd[1743]: time="2025-07-15T05:19:35.133829327Z" level=info msg="Container 01ef0358b41210f6a3bb6cc15fa1df33e2d0116c55503e4bbc33bf6fbe42ab0a: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:19:35.149761 containerd[1743]: time="2025-07-15T05:19:35.149643880Z" level=info msg="CreateContainer within sandbox \"5caf7e1e7b8be5a9f0ba7d293ce7165737fb9239517e623773a6c968fb59b6ba\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"01ef0358b41210f6a3bb6cc15fa1df33e2d0116c55503e4bbc33bf6fbe42ab0a\"" Jul 15 05:19:35.150211 containerd[1743]: time="2025-07-15T05:19:35.150075365Z" level=info msg="StartContainer for \"01ef0358b41210f6a3bb6cc15fa1df33e2d0116c55503e4bbc33bf6fbe42ab0a\"" Jul 15 05:19:35.152021 containerd[1743]: time="2025-07-15T05:19:35.151985544Z" level=info msg="connecting to shim 01ef0358b41210f6a3bb6cc15fa1df33e2d0116c55503e4bbc33bf6fbe42ab0a" address="unix:///run/containerd/s/2baee7311193bbeb12c4292d154f2a418907c3efc5a20bfdd13147c802f2ad9a" protocol=ttrpc version=3 Jul 15 05:19:35.186329 containerd[1743]: time="2025-07-15T05:19:35.186310013Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-sxqg6,Uid:9a75b180-3ce9-4020-99a4-ab3fc3f07d66,Namespace:calico-system,Attempt:0,} returns sandbox id \"7228484e39fe513e52429b680beca104629ea99658960c5cb2fc0d45846b71f9\"" Jul 15 05:19:35.189265 systemd[1]: Started cri-containerd-01ef0358b41210f6a3bb6cc15fa1df33e2d0116c55503e4bbc33bf6fbe42ab0a.scope - libcontainer container 01ef0358b41210f6a3bb6cc15fa1df33e2d0116c55503e4bbc33bf6fbe42ab0a. 
Jul 15 05:19:35.216204 containerd[1743]: time="2025-07-15T05:19:35.216177241Z" level=info msg="StartContainer for \"01ef0358b41210f6a3bb6cc15fa1df33e2d0116c55503e4bbc33bf6fbe42ab0a\" returns successfully" Jul 15 05:19:35.402132 systemd-networkd[1366]: cali630907d0471: Gained IPv6LL Jul 15 05:19:35.894979 kubelet[3163]: I0715 05:19:35.894917 3163 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-c5cfl" podStartSLOduration=44.894899953 podStartE2EDuration="44.894899953s" podCreationTimestamp="2025-07-15 05:18:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 05:19:35.893605546 +0000 UTC m=+51.264600446" watchObservedRunningTime="2025-07-15 05:19:35.894899953 +0000 UTC m=+51.265894852" Jul 15 05:19:36.106122 systemd-networkd[1366]: calia72c0d46023: Gained IPv6LL Jul 15 05:19:36.885060 kubelet[3163]: I0715 05:19:36.885028 3163 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 15 05:19:36.938170 systemd-networkd[1366]: calie36e69354c4: Gained IPv6LL Jul 15 05:19:38.059518 containerd[1743]: time="2025-07-15T05:19:38.059470205Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:19:38.062286 containerd[1743]: time="2025-07-15T05:19:38.062220238Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.2: active requests=0, bytes read=51276688" Jul 15 05:19:38.064908 containerd[1743]: time="2025-07-15T05:19:38.064873824Z" level=info msg="ImageCreate event name:\"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:19:38.074019 containerd[1743]: time="2025-07-15T05:19:38.073573142Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:19:38.075564 containerd[1743]: time="2025-07-15T05:19:38.075537052Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" with image id \"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\", size \"52769359\" in 3.166720643s" Jul 15 05:19:38.075663 containerd[1743]: time="2025-07-15T05:19:38.075646836Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" returns image reference \"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\"" Jul 15 05:19:38.077190 containerd[1743]: time="2025-07-15T05:19:38.077167644Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\"" Jul 15 05:19:38.091163 containerd[1743]: time="2025-07-15T05:19:38.091043049Z" level=info msg="CreateContainer within sandbox \"7ea6c00bc9f51957d1660bb1bc312f9076a4607cf4fe3e400f9f5017fa467b7f\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Jul 15 05:19:38.099806 containerd[1743]: time="2025-07-15T05:19:38.099771645Z" level=info msg="TaskExit event in podsandbox handler container_id:\"aca2ad3acbdae752ca5f94faed6c4d40176863e037e91f6c37624d3519cab7db\" id:\"d90467dc9cd662fb4dfc71d4944363e185e3e536227a6cf9424543cf5309b8b7\" pid:5305 exited_at:{seconds:1752556778 nanos:99068174}" Jul 15 05:19:38.114973 containerd[1743]: time="2025-07-15T05:19:38.113163678Z" level=info msg="Container d0e143fe032330837bacce8ea478609dbfbb8828cc790d5949bf12cacde98030: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:19:38.119972 kubelet[3163]: I0715 05:19:38.119245 3163 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="calico-apiserver/calico-apiserver-796c4d775c-wrw9j" podStartSLOduration=32.181887229 podStartE2EDuration="37.119226615s" podCreationTimestamp="2025-07-15 05:19:01 +0000 UTC" firstStartedPulling="2025-07-15 05:19:29.970147679 +0000 UTC m=+45.341142565" lastFinishedPulling="2025-07-15 05:19:34.907487065 +0000 UTC m=+50.278481951" observedRunningTime="2025-07-15 05:19:35.932973816 +0000 UTC m=+51.303968712" watchObservedRunningTime="2025-07-15 05:19:38.119226615 +0000 UTC m=+53.490221523" Jul 15 05:19:38.132524 containerd[1743]: time="2025-07-15T05:19:38.132497955Z" level=info msg="CreateContainer within sandbox \"7ea6c00bc9f51957d1660bb1bc312f9076a4607cf4fe3e400f9f5017fa467b7f\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"d0e143fe032330837bacce8ea478609dbfbb8828cc790d5949bf12cacde98030\"" Jul 15 05:19:38.132903 containerd[1743]: time="2025-07-15T05:19:38.132885893Z" level=info msg="StartContainer for \"d0e143fe032330837bacce8ea478609dbfbb8828cc790d5949bf12cacde98030\"" Jul 15 05:19:38.134970 containerd[1743]: time="2025-07-15T05:19:38.134714845Z" level=info msg="connecting to shim d0e143fe032330837bacce8ea478609dbfbb8828cc790d5949bf12cacde98030" address="unix:///run/containerd/s/c3024df5297226a79dbe27d60c8efe01c249452fc05fd75d28302808c4b51a56" protocol=ttrpc version=3 Jul 15 05:19:38.155089 systemd[1]: Started cri-containerd-d0e143fe032330837bacce8ea478609dbfbb8828cc790d5949bf12cacde98030.scope - libcontainer container d0e143fe032330837bacce8ea478609dbfbb8828cc790d5949bf12cacde98030. 
Jul 15 05:19:38.194651 containerd[1743]: time="2025-07-15T05:19:38.194617735Z" level=info msg="StartContainer for \"d0e143fe032330837bacce8ea478609dbfbb8828cc790d5949bf12cacde98030\" returns successfully" Jul 15 05:19:38.906280 kubelet[3163]: I0715 05:19:38.906223 3163 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-fbd55b5d6-zlcp7" podStartSLOduration=26.022254324 podStartE2EDuration="32.906175203s" podCreationTimestamp="2025-07-15 05:19:06 +0000 UTC" firstStartedPulling="2025-07-15 05:19:31.192843629 +0000 UTC m=+46.563838524" lastFinishedPulling="2025-07-15 05:19:38.076764507 +0000 UTC m=+53.447759403" observedRunningTime="2025-07-15 05:19:38.905873055 +0000 UTC m=+54.276867952" watchObservedRunningTime="2025-07-15 05:19:38.906175203 +0000 UTC m=+54.277170094" Jul 15 05:19:38.932003 containerd[1743]: time="2025-07-15T05:19:38.931968269Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d0e143fe032330837bacce8ea478609dbfbb8828cc790d5949bf12cacde98030\" id:\"ed14261328dd1389eed3a8c870a6288f6c170a63b0d0d0d08f91616f12bb13ac\" pid:5379 exited_at:{seconds:1752556778 nanos:931622586}" Jul 15 05:19:39.374000 containerd[1743]: time="2025-07-15T05:19:39.373944979Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:19:39.376409 containerd[1743]: time="2025-07-15T05:19:39.376373211Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.2: active requests=0, bytes read=8759190" Jul 15 05:19:39.379550 containerd[1743]: time="2025-07-15T05:19:39.379506804Z" level=info msg="ImageCreate event name:\"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:19:39.383922 containerd[1743]: time="2025-07-15T05:19:39.383447433Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:19:39.383922 containerd[1743]: time="2025-07-15T05:19:39.383809942Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.2\" with image id \"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\", size \"10251893\" in 1.306613387s" Jul 15 05:19:39.383922 containerd[1743]: time="2025-07-15T05:19:39.383833473Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\" returns image reference \"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\"" Jul 15 05:19:39.384731 containerd[1743]: time="2025-07-15T05:19:39.384712617Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Jul 15 05:19:39.385805 containerd[1743]: time="2025-07-15T05:19:39.385775668Z" level=info msg="CreateContainer within sandbox \"f567281022f82ccd8a978ddfc3896214b3798a91aed024d9cf677e7aa0ccc31f\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Jul 15 05:19:39.410656 containerd[1743]: time="2025-07-15T05:19:39.410625659Z" level=info msg="Container f984981548f962b0012ce0733961536112f71fb6239dcf9fdd9d6d4db7085634: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:19:39.458933 containerd[1743]: time="2025-07-15T05:19:39.458908944Z" level=info msg="CreateContainer within sandbox \"f567281022f82ccd8a978ddfc3896214b3798a91aed024d9cf677e7aa0ccc31f\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"f984981548f962b0012ce0733961536112f71fb6239dcf9fdd9d6d4db7085634\"" Jul 15 05:19:39.459548 containerd[1743]: time="2025-07-15T05:19:39.459341323Z" level=info msg="StartContainer for \"f984981548f962b0012ce0733961536112f71fb6239dcf9fdd9d6d4db7085634\"" Jul 
15 05:19:39.460789 containerd[1743]: time="2025-07-15T05:19:39.460761000Z" level=info msg="connecting to shim f984981548f962b0012ce0733961536112f71fb6239dcf9fdd9d6d4db7085634" address="unix:///run/containerd/s/1e6d21cc7803fb1fa33d25e40c47c0eacb1d69fad8925fa85319c9ac8c58fba9" protocol=ttrpc version=3 Jul 15 05:19:39.480135 systemd[1]: Started cri-containerd-f984981548f962b0012ce0733961536112f71fb6239dcf9fdd9d6d4db7085634.scope - libcontainer container f984981548f962b0012ce0733961536112f71fb6239dcf9fdd9d6d4db7085634. Jul 15 05:19:39.518450 containerd[1743]: time="2025-07-15T05:19:39.518432030Z" level=info msg="StartContainer for \"f984981548f962b0012ce0733961536112f71fb6239dcf9fdd9d6d4db7085634\" returns successfully" Jul 15 05:19:39.725145 containerd[1743]: time="2025-07-15T05:19:39.725055038Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:19:39.727982 containerd[1743]: time="2025-07-15T05:19:39.727927679Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=77" Jul 15 05:19:39.730984 containerd[1743]: time="2025-07-15T05:19:39.729435375Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"48810696\" in 344.620364ms" Jul 15 05:19:39.731109 containerd[1743]: time="2025-07-15T05:19:39.731002339Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\"" Jul 15 05:19:39.733447 containerd[1743]: time="2025-07-15T05:19:39.732898217Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\"" Jul 
15 05:19:39.734967 containerd[1743]: time="2025-07-15T05:19:39.733840749Z" level=info msg="CreateContainer within sandbox \"9527d49bc903fb0094b6590bc3abffcd1c335c02233af8f9232821f554d14f2e\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jul 15 05:19:40.069080 containerd[1743]: time="2025-07-15T05:19:40.068091208Z" level=info msg="Container 1d42acff4807457824bfb3c81a9b80190acc6a9587ca79ac09a2c04b50dc51e2: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:19:40.087803 containerd[1743]: time="2025-07-15T05:19:40.087768419Z" level=info msg="CreateContainer within sandbox \"9527d49bc903fb0094b6590bc3abffcd1c335c02233af8f9232821f554d14f2e\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"1d42acff4807457824bfb3c81a9b80190acc6a9587ca79ac09a2c04b50dc51e2\"" Jul 15 05:19:40.088276 containerd[1743]: time="2025-07-15T05:19:40.088255435Z" level=info msg="StartContainer for \"1d42acff4807457824bfb3c81a9b80190acc6a9587ca79ac09a2c04b50dc51e2\"" Jul 15 05:19:40.089317 containerd[1743]: time="2025-07-15T05:19:40.089281390Z" level=info msg="connecting to shim 1d42acff4807457824bfb3c81a9b80190acc6a9587ca79ac09a2c04b50dc51e2" address="unix:///run/containerd/s/a1a86566e4b1e33065f46c7c461838512713fade9616da4e3e9ae71f3c69c667" protocol=ttrpc version=3 Jul 15 05:19:40.108129 systemd[1]: Started cri-containerd-1d42acff4807457824bfb3c81a9b80190acc6a9587ca79ac09a2c04b50dc51e2.scope - libcontainer container 1d42acff4807457824bfb3c81a9b80190acc6a9587ca79ac09a2c04b50dc51e2. Jul 15 05:19:40.152657 containerd[1743]: time="2025-07-15T05:19:40.152513644Z" level=info msg="StartContainer for \"1d42acff4807457824bfb3c81a9b80190acc6a9587ca79ac09a2c04b50dc51e2\" returns successfully" Jul 15 05:19:41.907674 kubelet[3163]: I0715 05:19:41.906825 3163 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 15 05:19:42.421897 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4245875380.mount: Deactivated successfully. 
Jul 15 05:19:43.309982 containerd[1743]: time="2025-07-15T05:19:43.309843103Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:19:43.312234 containerd[1743]: time="2025-07-15T05:19:43.312200208Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.2: active requests=0, bytes read=66352308" Jul 15 05:19:43.314924 containerd[1743]: time="2025-07-15T05:19:43.314873551Z" level=info msg="ImageCreate event name:\"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:19:43.320930 containerd[1743]: time="2025-07-15T05:19:43.318663733Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:19:43.321398 containerd[1743]: time="2025-07-15T05:19:43.321376222Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" with image id \"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\", size \"66352154\" in 3.588446227s" Jul 15 05:19:43.321489 containerd[1743]: time="2025-07-15T05:19:43.321477943Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" returns image reference \"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\"" Jul 15 05:19:43.322591 containerd[1743]: time="2025-07-15T05:19:43.322551897Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\"" Jul 15 05:19:43.325027 containerd[1743]: time="2025-07-15T05:19:43.323945848Z" level=info msg="CreateContainer within sandbox 
\"7228484e39fe513e52429b680beca104629ea99658960c5cb2fc0d45846b71f9\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Jul 15 05:19:43.353123 containerd[1743]: time="2025-07-15T05:19:43.353095875Z" level=info msg="Container 4b84ed86c8c874a8ab20efbbf0bc4ab1c0e6c5396aa1a4422f51b28b127a56b5: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:19:43.371403 containerd[1743]: time="2025-07-15T05:19:43.371359860Z" level=info msg="CreateContainer within sandbox \"7228484e39fe513e52429b680beca104629ea99658960c5cb2fc0d45846b71f9\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"4b84ed86c8c874a8ab20efbbf0bc4ab1c0e6c5396aa1a4422f51b28b127a56b5\"" Jul 15 05:19:43.371980 containerd[1743]: time="2025-07-15T05:19:43.371961443Z" level=info msg="StartContainer for \"4b84ed86c8c874a8ab20efbbf0bc4ab1c0e6c5396aa1a4422f51b28b127a56b5\"" Jul 15 05:19:43.372976 containerd[1743]: time="2025-07-15T05:19:43.372889423Z" level=info msg="connecting to shim 4b84ed86c8c874a8ab20efbbf0bc4ab1c0e6c5396aa1a4422f51b28b127a56b5" address="unix:///run/containerd/s/2e7368957923d3e4eb42e67a60a23185a17e7006b16fe8c656ad4a067ea0e6bf" protocol=ttrpc version=3 Jul 15 05:19:43.402244 systemd[1]: Started cri-containerd-4b84ed86c8c874a8ab20efbbf0bc4ab1c0e6c5396aa1a4422f51b28b127a56b5.scope - libcontainer container 4b84ed86c8c874a8ab20efbbf0bc4ab1c0e6c5396aa1a4422f51b28b127a56b5. 
Jul 15 05:19:43.494495 containerd[1743]: time="2025-07-15T05:19:43.494455682Z" level=info msg="StartContainer for \"4b84ed86c8c874a8ab20efbbf0bc4ab1c0e6c5396aa1a4422f51b28b127a56b5\" returns successfully" Jul 15 05:19:43.934384 kubelet[3163]: I0715 05:19:43.933635 3163 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-796c4d775c-b6jrw" podStartSLOduration=37.465201965 podStartE2EDuration="42.933620379s" podCreationTimestamp="2025-07-15 05:19:01 +0000 UTC" firstStartedPulling="2025-07-15 05:19:34.263363142 +0000 UTC m=+49.634358030" lastFinishedPulling="2025-07-15 05:19:39.731781547 +0000 UTC m=+55.102776444" observedRunningTime="2025-07-15 05:19:40.926923982 +0000 UTC m=+56.297918881" watchObservedRunningTime="2025-07-15 05:19:43.933620379 +0000 UTC m=+59.304615280" Jul 15 05:19:43.995701 containerd[1743]: time="2025-07-15T05:19:43.995659382Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4b84ed86c8c874a8ab20efbbf0bc4ab1c0e6c5396aa1a4422f51b28b127a56b5\" id:\"171a151ff4aca35c28d9ff312b7ce8b891f2179fd56348380bef2c0f983d3bb0\" pid:5521 exit_status:1 exited_at:{seconds:1752556783 nanos:995022259}" Jul 15 05:19:45.052458 containerd[1743]: time="2025-07-15T05:19:45.052416724Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4b84ed86c8c874a8ab20efbbf0bc4ab1c0e6c5396aa1a4422f51b28b127a56b5\" id:\"5bb20007fa78f14119ea9bbfc816862d1ac7acd6028bb5c0a51ff8663dc79e18\" pid:5549 exit_status:1 exited_at:{seconds:1752556785 nanos:52001507}" Jul 15 05:19:45.938971 containerd[1743]: time="2025-07-15T05:19:45.937229292Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:19:45.940065 containerd[1743]: time="2025-07-15T05:19:45.940039581Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2: active requests=0, bytes read=14703784" Jul 15 
05:19:45.943258 containerd[1743]: time="2025-07-15T05:19:45.942847927Z" level=info msg="ImageCreate event name:\"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:19:45.947967 containerd[1743]: time="2025-07-15T05:19:45.947533368Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:19:45.948216 containerd[1743]: time="2025-07-15T05:19:45.948075772Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" with image id \"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\", size \"16196439\" in 2.624900464s" Jul 15 05:19:45.948216 containerd[1743]: time="2025-07-15T05:19:45.948121127Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" returns image reference \"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\"" Jul 15 05:19:45.950765 containerd[1743]: time="2025-07-15T05:19:45.950734294Z" level=info msg="CreateContainer within sandbox \"f567281022f82ccd8a978ddfc3896214b3798a91aed024d9cf677e7aa0ccc31f\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Jul 15 05:19:45.974239 containerd[1743]: time="2025-07-15T05:19:45.973224138Z" level=info msg="Container d397dcf84c416f3b6fa341510929e3676f18c3966743f466b62fc128c5d397bb: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:19:45.975665 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount972642592.mount: Deactivated successfully. 
Jul 15 05:19:45.994624 containerd[1743]: time="2025-07-15T05:19:45.994595098Z" level=info msg="CreateContainer within sandbox \"f567281022f82ccd8a978ddfc3896214b3798a91aed024d9cf677e7aa0ccc31f\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"d397dcf84c416f3b6fa341510929e3676f18c3966743f466b62fc128c5d397bb\"" Jul 15 05:19:45.995426 containerd[1743]: time="2025-07-15T05:19:45.995404264Z" level=info msg="StartContainer for \"d397dcf84c416f3b6fa341510929e3676f18c3966743f466b62fc128c5d397bb\"" Jul 15 05:19:45.997421 containerd[1743]: time="2025-07-15T05:19:45.997348503Z" level=info msg="connecting to shim d397dcf84c416f3b6fa341510929e3676f18c3966743f466b62fc128c5d397bb" address="unix:///run/containerd/s/1e6d21cc7803fb1fa33d25e40c47c0eacb1d69fad8925fa85319c9ac8c58fba9" protocol=ttrpc version=3 Jul 15 05:19:46.004987 containerd[1743]: time="2025-07-15T05:19:46.004266256Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4b84ed86c8c874a8ab20efbbf0bc4ab1c0e6c5396aa1a4422f51b28b127a56b5\" id:\"8898a3e32d21e92d7ad50dafe8883350b9520478bb98b08236ec38aa43f77bf8\" pid:5578 exited_at:{seconds:1752556786 nanos:3642886}" Jul 15 05:19:46.028162 systemd[1]: Started cri-containerd-d397dcf84c416f3b6fa341510929e3676f18c3966743f466b62fc128c5d397bb.scope - libcontainer container d397dcf84c416f3b6fa341510929e3676f18c3966743f466b62fc128c5d397bb. 
Jul 15 05:19:46.032826 kubelet[3163]: I0715 05:19:46.032777 3163 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-58fd7646b9-sxqg6" podStartSLOduration=31.897650579 podStartE2EDuration="40.032760629s" podCreationTimestamp="2025-07-15 05:19:06 +0000 UTC" firstStartedPulling="2025-07-15 05:19:35.18726883 +0000 UTC m=+50.558263720" lastFinishedPulling="2025-07-15 05:19:43.322378882 +0000 UTC m=+58.693373770" observedRunningTime="2025-07-15 05:19:43.935950948 +0000 UTC m=+59.306945858" watchObservedRunningTime="2025-07-15 05:19:46.032760629 +0000 UTC m=+61.403755528" Jul 15 05:19:46.081049 containerd[1743]: time="2025-07-15T05:19:46.080973617Z" level=info msg="StartContainer for \"d397dcf84c416f3b6fa341510929e3676f18c3966743f466b62fc128c5d397bb\" returns successfully" Jul 15 05:19:46.643987 kubelet[3163]: I0715 05:19:46.643620 3163 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 15 05:19:46.818178 kubelet[3163]: I0715 05:19:46.818153 3163 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Jul 15 05:19:46.818352 kubelet[3163]: I0715 05:19:46.818344 3163 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Jul 15 05:19:46.940769 kubelet[3163]: I0715 05:19:46.940115 3163 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-dg9p5" podStartSLOduration=27.889406678 podStartE2EDuration="40.940097668s" podCreationTimestamp="2025-07-15 05:19:06 +0000 UTC" firstStartedPulling="2025-07-15 05:19:32.898835458 +0000 UTC m=+48.269830344" lastFinishedPulling="2025-07-15 05:19:45.949526438 +0000 UTC m=+61.320521334" observedRunningTime="2025-07-15 05:19:46.939654968 +0000 UTC m=+62.310649865" watchObservedRunningTime="2025-07-15 05:19:46.940097668 
+0000 UTC m=+62.311092656" Jul 15 05:19:48.297750 containerd[1743]: time="2025-07-15T05:19:48.297706259Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d0e143fe032330837bacce8ea478609dbfbb8828cc790d5949bf12cacde98030\" id:\"d5adbbc43426858dbde8b14ecb70893e3a7ccf03501e5dfd2baf70e6736e6dd7\" pid:5641 exited_at:{seconds:1752556788 nanos:297197320}" Jul 15 05:19:49.901004 containerd[1743]: time="2025-07-15T05:19:49.900963604Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4b84ed86c8c874a8ab20efbbf0bc4ab1c0e6c5396aa1a4422f51b28b127a56b5\" id:\"0dfb5a95b39bfcbe89e8214ab488f82e3b7f57ce2421b8502949c701ad905362\" pid:5663 exited_at:{seconds:1752556789 nanos:900454295}" Jul 15 05:19:54.660269 containerd[1743]: time="2025-07-15T05:19:54.660225248Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4b84ed86c8c874a8ab20efbbf0bc4ab1c0e6c5396aa1a4422f51b28b127a56b5\" id:\"23f76d55a433d6c16fadc66fa4ecc9949d9ad1bba15267d495bf0693913995e6\" pid:5687 exited_at:{seconds:1752556794 nanos:659546870}" Jul 15 05:20:08.070542 containerd[1743]: time="2025-07-15T05:20:08.070495142Z" level=info msg="TaskExit event in podsandbox handler container_id:\"aca2ad3acbdae752ca5f94faed6c4d40176863e037e91f6c37624d3519cab7db\" id:\"2e76a00c7f3db4eb12e8dbf8aae9e095bff3aea194125ddceef0b46638ab79a0\" pid:5714 exited_at:{seconds:1752556808 nanos:70251960}" Jul 15 05:20:18.241337 containerd[1743]: time="2025-07-15T05:20:18.241291666Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d0e143fe032330837bacce8ea478609dbfbb8828cc790d5949bf12cacde98030\" id:\"082089269401d48815d1e37c32697949e09063cf2049044112089022403a36bf\" pid:5748 exited_at:{seconds:1752556818 nanos:240841648}" Jul 15 05:20:18.886031 kubelet[3163]: I0715 05:20:18.885850 3163 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 15 05:20:19.807897 containerd[1743]: time="2025-07-15T05:20:19.807847089Z" level=info msg="TaskExit event in 
podsandbox handler container_id:\"4b84ed86c8c874a8ab20efbbf0bc4ab1c0e6c5396aa1a4422f51b28b127a56b5\" id:\"b7c33ce7e6396b5287a2f989c069b9097baf23d675b3d644d45b64f2afe7f0b1\" pid:5773 exited_at:{seconds:1752556819 nanos:807565445}" Jul 15 05:20:30.952435 containerd[1743]: time="2025-07-15T05:20:30.952390013Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d0e143fe032330837bacce8ea478609dbfbb8828cc790d5949bf12cacde98030\" id:\"b8ea732c9cb2b9e1713ef197701837abd72fef99056b333304f5040cf596c840\" pid:5798 exited_at:{seconds:1752556830 nanos:952161135}" Jul 15 05:20:38.070190 containerd[1743]: time="2025-07-15T05:20:38.070144765Z" level=info msg="TaskExit event in podsandbox handler container_id:\"aca2ad3acbdae752ca5f94faed6c4d40176863e037e91f6c37624d3519cab7db\" id:\"1fb12f62b77be3eb863d6fe55e7d0aca90cf162f863ad5ddc4cf4f87e3e46b50\" pid:5820 exited_at:{seconds:1752556838 nanos:69765737}" Jul 15 05:20:48.237363 containerd[1743]: time="2025-07-15T05:20:48.237225954Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d0e143fe032330837bacce8ea478609dbfbb8828cc790d5949bf12cacde98030\" id:\"3feb193dec9d4ef51a1bd7caf2665200e31e5a20f4e9001966b4dbcb96d5c79c\" pid:5854 exited_at:{seconds:1752556848 nanos:237055624}" Jul 15 05:20:49.798548 containerd[1743]: time="2025-07-15T05:20:49.798486024Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4b84ed86c8c874a8ab20efbbf0bc4ab1c0e6c5396aa1a4422f51b28b127a56b5\" id:\"643d641adf5bc4acc3d4f8d9d886074863b2db5df639fc9d6c0d6537e65d01f6\" pid:5875 exited_at:{seconds:1752556849 nanos:798282242}" Jul 15 05:20:54.525763 containerd[1743]: time="2025-07-15T05:20:54.525643387Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4b84ed86c8c874a8ab20efbbf0bc4ab1c0e6c5396aa1a4422f51b28b127a56b5\" id:\"994b0a0407e49393c1533a6965afe111289f64dedfa16e2b1f22b3e4f2a203ac\" pid:5901 exited_at:{seconds:1752556854 nanos:525400616}" Jul 15 05:21:08.070693 containerd[1743]: 
time="2025-07-15T05:21:08.070640690Z" level=info msg="TaskExit event in podsandbox handler container_id:\"aca2ad3acbdae752ca5f94faed6c4d40176863e037e91f6c37624d3519cab7db\" id:\"967052b790b5087a48ee11e7657808e87eca240ba85360cf2e0493021609569a\" pid:5947 exited_at:{seconds:1752556868 nanos:70277465}" Jul 15 05:21:13.354658 systemd[1]: Started sshd@7-10.200.8.4:22-10.200.16.10:34384.service - OpenSSH per-connection server daemon (10.200.16.10:34384). Jul 15 05:21:13.987078 sshd[5963]: Accepted publickey for core from 10.200.16.10 port 34384 ssh2: RSA SHA256:WldnkDLUmxPu3hAw0Hy9/rmNIFGGN2Id/cNmLPJWvxw Jul 15 05:21:13.989119 sshd-session[5963]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:21:13.993499 systemd-logind[1717]: New session 10 of user core. Jul 15 05:21:13.998295 systemd[1]: Started session-10.scope - Session 10 of User core. Jul 15 05:21:14.517059 sshd[5966]: Connection closed by 10.200.16.10 port 34384 Jul 15 05:21:14.517924 sshd-session[5963]: pam_unix(sshd:session): session closed for user core Jul 15 05:21:14.521923 systemd-logind[1717]: Session 10 logged out. Waiting for processes to exit. Jul 15 05:21:14.523328 systemd[1]: sshd@7-10.200.8.4:22-10.200.16.10:34384.service: Deactivated successfully. Jul 15 05:21:14.525556 systemd[1]: session-10.scope: Deactivated successfully. Jul 15 05:21:14.529398 systemd-logind[1717]: Removed session 10. Jul 15 05:21:18.241158 containerd[1743]: time="2025-07-15T05:21:18.241097128Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d0e143fe032330837bacce8ea478609dbfbb8828cc790d5949bf12cacde98030\" id:\"7bc088467c426faf50eb15893bfeeef59ab8beec053059b57872aae162f55b3a\" pid:5991 exited_at:{seconds:1752556878 nanos:240870641}" Jul 15 05:21:19.636528 systemd[1]: Started sshd@8-10.200.8.4:22-10.200.16.10:34386.service - OpenSSH per-connection server daemon (10.200.16.10:34386). 
Jul 15 05:21:19.798716 containerd[1743]: time="2025-07-15T05:21:19.798677727Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4b84ed86c8c874a8ab20efbbf0bc4ab1c0e6c5396aa1a4422f51b28b127a56b5\" id:\"d738529c3c4b068ffc28cb363169bc5ac5ba4bd7c36a2057fc9725a3783295da\" pid:6017 exited_at:{seconds:1752556879 nanos:798447696}" Jul 15 05:21:20.262358 sshd[6002]: Accepted publickey for core from 10.200.16.10 port 34386 ssh2: RSA SHA256:WldnkDLUmxPu3hAw0Hy9/rmNIFGGN2Id/cNmLPJWvxw Jul 15 05:21:20.263445 sshd-session[6002]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:21:20.267457 systemd-logind[1717]: New session 11 of user core. Jul 15 05:21:20.272131 systemd[1]: Started session-11.scope - Session 11 of User core. Jul 15 05:21:20.746918 sshd[6027]: Connection closed by 10.200.16.10 port 34386 Jul 15 05:21:20.747435 sshd-session[6002]: pam_unix(sshd:session): session closed for user core Jul 15 05:21:20.750967 systemd[1]: sshd@8-10.200.8.4:22-10.200.16.10:34386.service: Deactivated successfully. Jul 15 05:21:20.752755 systemd[1]: session-11.scope: Deactivated successfully. Jul 15 05:21:20.753465 systemd-logind[1717]: Session 11 logged out. Waiting for processes to exit. Jul 15 05:21:20.754814 systemd-logind[1717]: Removed session 11. Jul 15 05:21:25.865681 systemd[1]: Started sshd@9-10.200.8.4:22-10.200.16.10:52110.service - OpenSSH per-connection server daemon (10.200.16.10:52110). Jul 15 05:21:26.491068 sshd[6043]: Accepted publickey for core from 10.200.16.10 port 52110 ssh2: RSA SHA256:WldnkDLUmxPu3hAw0Hy9/rmNIFGGN2Id/cNmLPJWvxw Jul 15 05:21:26.492165 sshd-session[6043]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:21:26.496098 systemd-logind[1717]: New session 12 of user core. Jul 15 05:21:26.499089 systemd[1]: Started session-12.scope - Session 12 of User core. 
Jul 15 05:21:26.978939 sshd[6046]: Connection closed by 10.200.16.10 port 52110 Jul 15 05:21:26.979465 sshd-session[6043]: pam_unix(sshd:session): session closed for user core Jul 15 05:21:26.982492 systemd[1]: sshd@9-10.200.8.4:22-10.200.16.10:52110.service: Deactivated successfully. Jul 15 05:21:26.984069 systemd[1]: session-12.scope: Deactivated successfully. Jul 15 05:21:26.984795 systemd-logind[1717]: Session 12 logged out. Waiting for processes to exit. Jul 15 05:21:26.985796 systemd-logind[1717]: Removed session 12. Jul 15 05:21:27.094591 systemd[1]: Started sshd@10-10.200.8.4:22-10.200.16.10:52116.service - OpenSSH per-connection server daemon (10.200.16.10:52116). Jul 15 05:21:27.719705 sshd[6059]: Accepted publickey for core from 10.200.16.10 port 52116 ssh2: RSA SHA256:WldnkDLUmxPu3hAw0Hy9/rmNIFGGN2Id/cNmLPJWvxw Jul 15 05:21:27.720820 sshd-session[6059]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:21:27.724924 systemd-logind[1717]: New session 13 of user core. Jul 15 05:21:27.729100 systemd[1]: Started session-13.scope - Session 13 of User core. Jul 15 05:21:28.232025 sshd[6062]: Connection closed by 10.200.16.10 port 52116 Jul 15 05:21:28.232536 sshd-session[6059]: pam_unix(sshd:session): session closed for user core Jul 15 05:21:28.235229 systemd[1]: sshd@10-10.200.8.4:22-10.200.16.10:52116.service: Deactivated successfully. Jul 15 05:21:28.237151 systemd[1]: session-13.scope: Deactivated successfully. Jul 15 05:21:28.239076 systemd-logind[1717]: Session 13 logged out. Waiting for processes to exit. Jul 15 05:21:28.239748 systemd-logind[1717]: Removed session 13. Jul 15 05:21:28.352488 systemd[1]: Started sshd@11-10.200.8.4:22-10.200.16.10:52122.service - OpenSSH per-connection server daemon (10.200.16.10:52122). 
Jul 15 05:21:28.976272 sshd[6072]: Accepted publickey for core from 10.200.16.10 port 52122 ssh2: RSA SHA256:WldnkDLUmxPu3hAw0Hy9/rmNIFGGN2Id/cNmLPJWvxw
Jul 15 05:21:28.977424 sshd-session[6072]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:21:28.981488 systemd-logind[1717]: New session 14 of user core.
Jul 15 05:21:28.987092 systemd[1]: Started session-14.scope - Session 14 of User core.
Jul 15 05:21:29.461564 sshd[6075]: Connection closed by 10.200.16.10 port 52122
Jul 15 05:21:29.462146 sshd-session[6072]: pam_unix(sshd:session): session closed for user core
Jul 15 05:21:29.465182 systemd[1]: sshd@11-10.200.8.4:22-10.200.16.10:52122.service: Deactivated successfully.
Jul 15 05:21:29.466938 systemd[1]: session-14.scope: Deactivated successfully.
Jul 15 05:21:29.467819 systemd-logind[1717]: Session 14 logged out. Waiting for processes to exit.
Jul 15 05:21:29.468924 systemd-logind[1717]: Removed session 14.
Jul 15 05:21:30.955712 containerd[1743]: time="2025-07-15T05:21:30.955664068Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d0e143fe032330837bacce8ea478609dbfbb8828cc790d5949bf12cacde98030\" id:\"831a6feefbf810e82df7547b9f6c632dd3c98b3d781eb69d753361adfb463519\" pid:6100 exited_at:{seconds:1752556890 nanos:955244649}"
Jul 15 05:21:34.574771 systemd[1]: Started sshd@12-10.200.8.4:22-10.200.16.10:54984.service - OpenSSH per-connection server daemon (10.200.16.10:54984).
Jul 15 05:21:35.199985 sshd[6110]: Accepted publickey for core from 10.200.16.10 port 54984 ssh2: RSA SHA256:WldnkDLUmxPu3hAw0Hy9/rmNIFGGN2Id/cNmLPJWvxw
Jul 15 05:21:35.201053 sshd-session[6110]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:21:35.205064 systemd-logind[1717]: New session 15 of user core.
Jul 15 05:21:35.209114 systemd[1]: Started session-15.scope - Session 15 of User core.
Jul 15 05:21:35.684693 sshd[6113]: Connection closed by 10.200.16.10 port 54984
Jul 15 05:21:35.685138 sshd-session[6110]: pam_unix(sshd:session): session closed for user core
Jul 15 05:21:35.688151 systemd[1]: sshd@12-10.200.8.4:22-10.200.16.10:54984.service: Deactivated successfully.
Jul 15 05:21:35.689636 systemd[1]: session-15.scope: Deactivated successfully.
Jul 15 05:21:35.690313 systemd-logind[1717]: Session 15 logged out. Waiting for processes to exit.
Jul 15 05:21:35.691351 systemd-logind[1717]: Removed session 15.
Jul 15 05:21:38.068121 containerd[1743]: time="2025-07-15T05:21:38.068057564Z" level=info msg="TaskExit event in podsandbox handler container_id:\"aca2ad3acbdae752ca5f94faed6c4d40176863e037e91f6c37624d3519cab7db\" id:\"a75bc753dca0d1bdac1efcf052b8af4dc5aa685f4aecfe78acbecb088d5188f1\" pid:6136 exited_at:{seconds:1752556898 nanos:67727684}"
Jul 15 05:21:40.807652 systemd[1]: Started sshd@13-10.200.8.4:22-10.200.16.10:53658.service - OpenSSH per-connection server daemon (10.200.16.10:53658).
Jul 15 05:21:41.437255 sshd[6150]: Accepted publickey for core from 10.200.16.10 port 53658 ssh2: RSA SHA256:WldnkDLUmxPu3hAw0Hy9/rmNIFGGN2Id/cNmLPJWvxw
Jul 15 05:21:41.438331 sshd-session[6150]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:21:41.442583 systemd-logind[1717]: New session 16 of user core.
Jul 15 05:21:41.445149 systemd[1]: Started session-16.scope - Session 16 of User core.
Jul 15 05:21:41.920746 sshd[6153]: Connection closed by 10.200.16.10 port 53658
Jul 15 05:21:41.921242 sshd-session[6150]: pam_unix(sshd:session): session closed for user core
Jul 15 05:21:41.924439 systemd[1]: sshd@13-10.200.8.4:22-10.200.16.10:53658.service: Deactivated successfully.
Jul 15 05:21:41.926213 systemd[1]: session-16.scope: Deactivated successfully.
Jul 15 05:21:41.927477 systemd-logind[1717]: Session 16 logged out. Waiting for processes to exit.
Jul 15 05:21:41.928284 systemd-logind[1717]: Removed session 16.
Jul 15 05:21:47.032836 systemd[1]: Started sshd@14-10.200.8.4:22-10.200.16.10:53660.service - OpenSSH per-connection server daemon (10.200.16.10:53660).
Jul 15 05:21:47.660378 sshd[6171]: Accepted publickey for core from 10.200.16.10 port 53660 ssh2: RSA SHA256:WldnkDLUmxPu3hAw0Hy9/rmNIFGGN2Id/cNmLPJWvxw
Jul 15 05:21:47.661557 sshd-session[6171]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:21:47.665404 systemd-logind[1717]: New session 17 of user core.
Jul 15 05:21:47.672094 systemd[1]: Started session-17.scope - Session 17 of User core.
Jul 15 05:21:48.154982 sshd[6174]: Connection closed by 10.200.16.10 port 53660
Jul 15 05:21:48.153635 sshd-session[6171]: pam_unix(sshd:session): session closed for user core
Jul 15 05:21:48.158326 systemd-logind[1717]: Session 17 logged out. Waiting for processes to exit.
Jul 15 05:21:48.160320 systemd[1]: sshd@14-10.200.8.4:22-10.200.16.10:53660.service: Deactivated successfully.
Jul 15 05:21:48.162890 systemd[1]: session-17.scope: Deactivated successfully.
Jul 15 05:21:48.166387 systemd-logind[1717]: Removed session 17.
Jul 15 05:21:48.239823 containerd[1743]: time="2025-07-15T05:21:48.239784350Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d0e143fe032330837bacce8ea478609dbfbb8828cc790d5949bf12cacde98030\" id:\"2f55bb351fd47536ae7956b299e4b0831e90b3f7dcf89846ab2ffb23f485df91\" pid:6198 exited_at:{seconds:1752556908 nanos:239285139}"
Jul 15 05:21:49.796861 containerd[1743]: time="2025-07-15T05:21:49.796811706Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4b84ed86c8c874a8ab20efbbf0bc4ab1c0e6c5396aa1a4422f51b28b127a56b5\" id:\"b40bc96510dbafb8f85dd008b34b0c8ecaba34544fb94cc9691b6b9f02d8bc25\" pid:6219 exited_at:{seconds:1752556909 nanos:796552365}"
Jul 15 05:21:53.268888 systemd[1]: Started sshd@15-10.200.8.4:22-10.200.16.10:55520.service - OpenSSH per-connection server daemon (10.200.16.10:55520).
Jul 15 05:21:53.900597 sshd[6233]: Accepted publickey for core from 10.200.16.10 port 55520 ssh2: RSA SHA256:WldnkDLUmxPu3hAw0Hy9/rmNIFGGN2Id/cNmLPJWvxw
Jul 15 05:21:53.901673 sshd-session[6233]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:21:53.905774 systemd-logind[1717]: New session 18 of user core.
Jul 15 05:21:53.912094 systemd[1]: Started session-18.scope - Session 18 of User core.
Jul 15 05:21:54.386595 sshd[6236]: Connection closed by 10.200.16.10 port 55520
Jul 15 05:21:54.387119 sshd-session[6233]: pam_unix(sshd:session): session closed for user core
Jul 15 05:21:54.390140 systemd[1]: sshd@15-10.200.8.4:22-10.200.16.10:55520.service: Deactivated successfully.
Jul 15 05:21:54.391963 systemd[1]: session-18.scope: Deactivated successfully.
Jul 15 05:21:54.392660 systemd-logind[1717]: Session 18 logged out. Waiting for processes to exit.
Jul 15 05:21:54.393675 systemd-logind[1717]: Removed session 18.
Jul 15 05:21:54.523167 containerd[1743]: time="2025-07-15T05:21:54.523118937Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4b84ed86c8c874a8ab20efbbf0bc4ab1c0e6c5396aa1a4422f51b28b127a56b5\" id:\"bfa81ad26bcb92cd963ca325af3a29ea00715ba17ce57b23e7fe5f04fdad5b03\" pid:6260 exited_at:{seconds:1752556914 nanos:522842694}"
Jul 15 05:21:59.496906 systemd[1]: Started sshd@16-10.200.8.4:22-10.200.16.10:55532.service - OpenSSH per-connection server daemon (10.200.16.10:55532).
Jul 15 05:22:00.120637 sshd[6272]: Accepted publickey for core from 10.200.16.10 port 55532 ssh2: RSA SHA256:WldnkDLUmxPu3hAw0Hy9/rmNIFGGN2Id/cNmLPJWvxw
Jul 15 05:22:00.121783 sshd-session[6272]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:22:00.125853 systemd-logind[1717]: New session 19 of user core.
Jul 15 05:22:00.130094 systemd[1]: Started session-19.scope - Session 19 of User core.
Jul 15 05:22:00.608649 sshd[6275]: Connection closed by 10.200.16.10 port 55532
Jul 15 05:22:00.609171 sshd-session[6272]: pam_unix(sshd:session): session closed for user core
Jul 15 05:22:00.611924 systemd[1]: sshd@16-10.200.8.4:22-10.200.16.10:55532.service: Deactivated successfully.
Jul 15 05:22:00.613591 systemd[1]: session-19.scope: Deactivated successfully.
Jul 15 05:22:00.614859 systemd-logind[1717]: Session 19 logged out. Waiting for processes to exit.
Jul 15 05:22:00.616018 systemd-logind[1717]: Removed session 19.
Jul 15 05:22:00.722552 systemd[1]: Started sshd@17-10.200.8.4:22-10.200.16.10:53656.service - OpenSSH per-connection server daemon (10.200.16.10:53656).
Jul 15 05:22:01.347333 sshd[6287]: Accepted publickey for core from 10.200.16.10 port 53656 ssh2: RSA SHA256:WldnkDLUmxPu3hAw0Hy9/rmNIFGGN2Id/cNmLPJWvxw
Jul 15 05:22:01.348475 sshd-session[6287]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:22:01.352439 systemd-logind[1717]: New session 20 of user core.
Jul 15 05:22:01.355116 systemd[1]: Started session-20.scope - Session 20 of User core.
Jul 15 05:22:01.898594 sshd[6290]: Connection closed by 10.200.16.10 port 53656
Jul 15 05:22:01.899119 sshd-session[6287]: pam_unix(sshd:session): session closed for user core
Jul 15 05:22:01.902203 systemd[1]: sshd@17-10.200.8.4:22-10.200.16.10:53656.service: Deactivated successfully.
Jul 15 05:22:01.904073 systemd[1]: session-20.scope: Deactivated successfully.
Jul 15 05:22:01.905708 systemd-logind[1717]: Session 20 logged out. Waiting for processes to exit.
Jul 15 05:22:01.906486 systemd-logind[1717]: Removed session 20.
Jul 15 05:22:02.008903 systemd[1]: Started sshd@18-10.200.8.4:22-10.200.16.10:53662.service - OpenSSH per-connection server daemon (10.200.16.10:53662).
Jul 15 05:22:02.630589 sshd[6299]: Accepted publickey for core from 10.200.16.10 port 53662 ssh2: RSA SHA256:WldnkDLUmxPu3hAw0Hy9/rmNIFGGN2Id/cNmLPJWvxw
Jul 15 05:22:02.631724 sshd-session[6299]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:22:02.635808 systemd-logind[1717]: New session 21 of user core.
Jul 15 05:22:02.641126 systemd[1]: Started session-21.scope - Session 21 of User core.
Jul 15 05:22:04.438776 sshd[6302]: Connection closed by 10.200.16.10 port 53662
Jul 15 05:22:04.439375 sshd-session[6299]: pam_unix(sshd:session): session closed for user core
Jul 15 05:22:04.442549 systemd[1]: sshd@18-10.200.8.4:22-10.200.16.10:53662.service: Deactivated successfully.
Jul 15 05:22:04.444242 systemd[1]: session-21.scope: Deactivated successfully.
Jul 15 05:22:04.444406 systemd[1]: session-21.scope: Consumed 411ms CPU time, 73.1M memory peak.
Jul 15 05:22:04.445004 systemd-logind[1717]: Session 21 logged out. Waiting for processes to exit.
Jul 15 05:22:04.446198 systemd-logind[1717]: Removed session 21.
Jul 15 05:22:04.548523 systemd[1]: Started sshd@19-10.200.8.4:22-10.200.16.10:53674.service - OpenSSH per-connection server daemon (10.200.16.10:53674).
Jul 15 05:22:05.174789 sshd[6319]: Accepted publickey for core from 10.200.16.10 port 53674 ssh2: RSA SHA256:WldnkDLUmxPu3hAw0Hy9/rmNIFGGN2Id/cNmLPJWvxw
Jul 15 05:22:05.175833 sshd-session[6319]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:22:05.179494 systemd-logind[1717]: New session 22 of user core.
Jul 15 05:22:05.186106 systemd[1]: Started session-22.scope - Session 22 of User core.
Jul 15 05:22:05.749344 sshd[6323]: Connection closed by 10.200.16.10 port 53674
Jul 15 05:22:05.749861 sshd-session[6319]: pam_unix(sshd:session): session closed for user core
Jul 15 05:22:05.753089 systemd[1]: sshd@19-10.200.8.4:22-10.200.16.10:53674.service: Deactivated successfully.
Jul 15 05:22:05.754860 systemd[1]: session-22.scope: Deactivated successfully.
Jul 15 05:22:05.755577 systemd-logind[1717]: Session 22 logged out. Waiting for processes to exit.
Jul 15 05:22:05.757090 systemd-logind[1717]: Removed session 22.
Jul 15 05:22:05.883037 systemd[1]: Started sshd@20-10.200.8.4:22-10.200.16.10:53678.service - OpenSSH per-connection server daemon (10.200.16.10:53678).
Jul 15 05:22:06.507706 sshd[6333]: Accepted publickey for core from 10.200.16.10 port 53678 ssh2: RSA SHA256:WldnkDLUmxPu3hAw0Hy9/rmNIFGGN2Id/cNmLPJWvxw
Jul 15 05:22:06.508806 sshd-session[6333]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:22:06.513026 systemd-logind[1717]: New session 23 of user core.
Jul 15 05:22:06.521103 systemd[1]: Started session-23.scope - Session 23 of User core.
Jul 15 05:22:06.994153 sshd[6336]: Connection closed by 10.200.16.10 port 53678
Jul 15 05:22:06.994628 sshd-session[6333]: pam_unix(sshd:session): session closed for user core
Jul 15 05:22:06.997598 systemd[1]: sshd@20-10.200.8.4:22-10.200.16.10:53678.service: Deactivated successfully.
Jul 15 05:22:06.999288 systemd[1]: session-23.scope: Deactivated successfully.
Jul 15 05:22:07.000147 systemd-logind[1717]: Session 23 logged out. Waiting for processes to exit.
Jul 15 05:22:07.001261 systemd-logind[1717]: Removed session 23.
Jul 15 05:22:08.071138 containerd[1743]: time="2025-07-15T05:22:08.071045872Z" level=info msg="TaskExit event in podsandbox handler container_id:\"aca2ad3acbdae752ca5f94faed6c4d40176863e037e91f6c37624d3519cab7db\" id:\"b4035558ecca496ee9274e5a4e3a05afd14a6b729c26691ffa8fe3041d09eb20\" pid:6359 exited_at:{seconds:1752556928 nanos:70723871}"
Jul 15 05:22:12.113991 systemd[1]: Started sshd@21-10.200.8.4:22-10.200.16.10:43932.service - OpenSSH per-connection server daemon (10.200.16.10:43932).
Jul 15 05:22:12.739395 sshd[6380]: Accepted publickey for core from 10.200.16.10 port 43932 ssh2: RSA SHA256:WldnkDLUmxPu3hAw0Hy9/rmNIFGGN2Id/cNmLPJWvxw
Jul 15 05:22:12.741217 sshd-session[6380]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:22:12.745581 systemd-logind[1717]: New session 24 of user core.
Jul 15 05:22:12.753259 systemd[1]: Started session-24.scope - Session 24 of User core.
Jul 15 05:22:13.223362 sshd[6383]: Connection closed by 10.200.16.10 port 43932
Jul 15 05:22:13.223869 sshd-session[6380]: pam_unix(sshd:session): session closed for user core
Jul 15 05:22:13.227121 systemd[1]: sshd@21-10.200.8.4:22-10.200.16.10:43932.service: Deactivated successfully.
Jul 15 05:22:13.228789 systemd[1]: session-24.scope: Deactivated successfully.
Jul 15 05:22:13.229514 systemd-logind[1717]: Session 24 logged out. Waiting for processes to exit.
Jul 15 05:22:13.230548 systemd-logind[1717]: Removed session 24.
Jul 15 05:22:18.241799 containerd[1743]: time="2025-07-15T05:22:18.241640489Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d0e143fe032330837bacce8ea478609dbfbb8828cc790d5949bf12cacde98030\" id:\"0faccd5b3a11140487cc09ac6c781f9f677e3aa0cd1f5a303051b8f3f50dd69d\" pid:6407 exited_at:{seconds:1752556938 nanos:240946785}"
Jul 15 05:22:18.334571 systemd[1]: Started sshd@22-10.200.8.4:22-10.200.16.10:43940.service - OpenSSH per-connection server daemon (10.200.16.10:43940).
Jul 15 05:22:18.968980 sshd[6417]: Accepted publickey for core from 10.200.16.10 port 43940 ssh2: RSA SHA256:WldnkDLUmxPu3hAw0Hy9/rmNIFGGN2Id/cNmLPJWvxw
Jul 15 05:22:18.968835 sshd-session[6417]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:22:18.976617 systemd-logind[1717]: New session 25 of user core.
Jul 15 05:22:18.982356 systemd[1]: Started session-25.scope - Session 25 of User core.
Jul 15 05:22:19.480005 sshd[6420]: Connection closed by 10.200.16.10 port 43940
Jul 15 05:22:19.480549 sshd-session[6417]: pam_unix(sshd:session): session closed for user core
Jul 15 05:22:19.483810 systemd[1]: sshd@22-10.200.8.4:22-10.200.16.10:43940.service: Deactivated successfully.
Jul 15 05:22:19.485394 systemd[1]: session-25.scope: Deactivated successfully.
Jul 15 05:22:19.486104 systemd-logind[1717]: Session 25 logged out. Waiting for processes to exit.
Jul 15 05:22:19.487029 systemd-logind[1717]: Removed session 25.
Jul 15 05:22:19.801245 containerd[1743]: time="2025-07-15T05:22:19.800912780Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4b84ed86c8c874a8ab20efbbf0bc4ab1c0e6c5396aa1a4422f51b28b127a56b5\" id:\"e34d434b5444eeb709c8ec6f7ebacf3eeeb368ad6751a89c29fc9ac10138f701\" pid:6444 exited_at:{seconds:1752556939 nanos:800521145}"
Jul 15 05:22:24.599004 systemd[1]: Started sshd@23-10.200.8.4:22-10.200.16.10:48246.service - OpenSSH per-connection server daemon (10.200.16.10:48246).
Jul 15 05:22:25.228529 sshd[6457]: Accepted publickey for core from 10.200.16.10 port 48246 ssh2: RSA SHA256:WldnkDLUmxPu3hAw0Hy9/rmNIFGGN2Id/cNmLPJWvxw
Jul 15 05:22:25.229734 sshd-session[6457]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:22:25.233838 systemd-logind[1717]: New session 26 of user core.
Jul 15 05:22:25.239109 systemd[1]: Started session-26.scope - Session 26 of User core.
Jul 15 05:22:25.711763 sshd[6460]: Connection closed by 10.200.16.10 port 48246
Jul 15 05:22:25.712323 sshd-session[6457]: pam_unix(sshd:session): session closed for user core
Jul 15 05:22:25.715053 systemd[1]: sshd@23-10.200.8.4:22-10.200.16.10:48246.service: Deactivated successfully.
Jul 15 05:22:25.716547 systemd[1]: session-26.scope: Deactivated successfully.
Jul 15 05:22:25.718196 systemd-logind[1717]: Session 26 logged out. Waiting for processes to exit.
Jul 15 05:22:25.718849 systemd-logind[1717]: Removed session 26.
Jul 15 05:22:30.822576 systemd[1]: Started sshd@24-10.200.8.4:22-10.200.16.10:60434.service - OpenSSH per-connection server daemon (10.200.16.10:60434).
Jul 15 05:22:30.954832 containerd[1743]: time="2025-07-15T05:22:30.954794599Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d0e143fe032330837bacce8ea478609dbfbb8828cc790d5949bf12cacde98030\" id:\"f03b1609e3dc4f862d63760549394f21e42ee1ddb0774ad265a771670e58a53a\" pid:6487 exited_at:{seconds:1752556950 nanos:954560425}"
Jul 15 05:22:31.451150 sshd[6472]: Accepted publickey for core from 10.200.16.10 port 60434 ssh2: RSA SHA256:WldnkDLUmxPu3hAw0Hy9/rmNIFGGN2Id/cNmLPJWvxw
Jul 15 05:22:31.452310 sshd-session[6472]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:22:31.456337 systemd-logind[1717]: New session 27 of user core.
Jul 15 05:22:31.461105 systemd[1]: Started session-27.scope - Session 27 of User core.
Jul 15 05:22:31.932509 sshd[6497]: Connection closed by 10.200.16.10 port 60434
Jul 15 05:22:31.933040 sshd-session[6472]: pam_unix(sshd:session): session closed for user core
Jul 15 05:22:31.936229 systemd[1]: sshd@24-10.200.8.4:22-10.200.16.10:60434.service: Deactivated successfully.
Jul 15 05:22:31.937826 systemd[1]: session-27.scope: Deactivated successfully.
Jul 15 05:22:31.938489 systemd-logind[1717]: Session 27 logged out. Waiting for processes to exit.
Jul 15 05:22:31.939485 systemd-logind[1717]: Removed session 27.
Jul 15 05:22:37.050815 systemd[1]: Started sshd@25-10.200.8.4:22-10.200.16.10:60436.service - OpenSSH per-connection server daemon (10.200.16.10:60436).
Jul 15 05:22:37.677475 sshd[6530]: Accepted publickey for core from 10.200.16.10 port 60436 ssh2: RSA SHA256:WldnkDLUmxPu3hAw0Hy9/rmNIFGGN2Id/cNmLPJWvxw
Jul 15 05:22:37.678672 sshd-session[6530]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:22:37.682899 systemd-logind[1717]: New session 28 of user core.
Jul 15 05:22:37.688096 systemd[1]: Started session-28.scope - Session 28 of User core.
Jul 15 05:22:38.091344 containerd[1743]: time="2025-07-15T05:22:38.091302685Z" level=info msg="TaskExit event in podsandbox handler container_id:\"aca2ad3acbdae752ca5f94faed6c4d40176863e037e91f6c37624d3519cab7db\" id:\"f7e7199167e1543619486267660a72fa36c5826c113eb7a93d3c49445541880d\" pid:6554 exited_at:{seconds:1752556958 nanos:90606795}"
Jul 15 05:22:38.198770 sshd[6533]: Connection closed by 10.200.16.10 port 60436
Jul 15 05:22:38.199233 sshd-session[6530]: pam_unix(sshd:session): session closed for user core
Jul 15 05:22:38.202224 systemd[1]: sshd@25-10.200.8.4:22-10.200.16.10:60436.service: Deactivated successfully.
Jul 15 05:22:38.203936 systemd[1]: session-28.scope: Deactivated successfully.
Jul 15 05:22:38.204622 systemd-logind[1717]: Session 28 logged out. Waiting for processes to exit.
Jul 15 05:22:38.205661 systemd-logind[1717]: Removed session 28.