Aug 19 08:15:36.996673 kernel: Linux version 6.12.41-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Mon Aug 18 22:19:37 -00 2025
Aug 19 08:15:36.996696 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlyprintk=ttyS0,115200 flatcar.first_boot=detected flatcar.oem.id=azure flatcar.autologin verity.usrhash=cc23dd01793203541561c15ffc568736bb5dae0d652141296dd11bf777bdf42f
Aug 19 08:15:36.996706 kernel: BIOS-provided physical RAM map:
Aug 19 08:15:36.996712 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable
Aug 19 08:15:36.996717 kernel: BIOS-e820: [mem 0x00000000000c0000-0x00000000000fffff] reserved
Aug 19 08:15:36.996723 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000044fdfff] usable
Aug 19 08:15:36.996730 kernel: BIOS-e820: [mem 0x00000000044fe000-0x00000000048fdfff] reserved
Aug 19 08:15:36.996737 kernel: BIOS-e820: [mem 0x00000000048fe000-0x000000003ff1efff] usable
Aug 19 08:15:36.996743 kernel: BIOS-e820: [mem 0x000000003ff1f000-0x000000003ffc8fff] reserved
Aug 19 08:15:36.996749 kernel: BIOS-e820: [mem 0x000000003ffc9000-0x000000003fffafff] ACPI data
Aug 19 08:15:36.996755 kernel: BIOS-e820: [mem 0x000000003fffb000-0x000000003fffefff] ACPI NVS
Aug 19 08:15:36.996760 kernel: BIOS-e820: [mem 0x000000003ffff000-0x000000003fffffff] usable
Aug 19 08:15:36.996766 kernel: BIOS-e820: [mem 0x0000000100000000-0x00000002bfffffff] usable
Aug 19 08:15:36.996772 kernel: printk: legacy bootconsole [earlyser0] enabled
Aug 19 08:15:36.996781 kernel: NX (Execute Disable) protection: active
Aug 19 08:15:36.996787 kernel: APIC: Static calls initialized
Aug 19 08:15:36.996793 kernel: efi: EFI v2.7 by Microsoft
Aug 19 08:15:36.996800 kernel: efi: ACPI=0x3fffa000 ACPI 2.0=0x3fffa014 SMBIOS=0x3ff88000 SMBIOS 3.0=0x3ff86000 MEMATTR=0x3e9da518 RNG=0x3ffd2018
Aug 19 08:15:36.996806 kernel: random: crng init done
Aug 19 08:15:36.996813 kernel: secureboot: Secure boot disabled
Aug 19 08:15:36.996819 kernel: SMBIOS 3.1.0 present.
Aug 19 08:15:36.996826 kernel: DMI: Microsoft Corporation Virtual Machine/Virtual Machine, BIOS Hyper-V UEFI Release v4.1 01/28/2025
Aug 19 08:15:36.996832 kernel: DMI: Memory slots populated: 2/2
Aug 19 08:15:36.996840 kernel: Hypervisor detected: Microsoft Hyper-V
Aug 19 08:15:36.996846 kernel: Hyper-V: privilege flags low 0xae7f, high 0x3b8030, hints 0x9e4e24, misc 0xe0bed7b2
Aug 19 08:15:36.996852 kernel: Hyper-V: Nested features: 0x3e0101
Aug 19 08:15:36.996859 kernel: Hyper-V: LAPIC Timer Frequency: 0x30d40
Aug 19 08:15:36.996865 kernel: Hyper-V: Using hypercall for remote TLB flush
Aug 19 08:15:36.996872 kernel: clocksource: hyperv_clocksource_tsc_page: mask: 0xffffffffffffffff max_cycles: 0x24e6a1710, max_idle_ns: 440795202120 ns
Aug 19 08:15:36.996878 kernel: clocksource: hyperv_clocksource_msr: mask: 0xffffffffffffffff max_cycles: 0x24e6a1710, max_idle_ns: 440795202120 ns
Aug 19 08:15:36.996884 kernel: tsc: Detected 2300.001 MHz processor
Aug 19 08:15:36.996890 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Aug 19 08:15:36.996897 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Aug 19 08:15:36.996904 kernel: last_pfn = 0x2c0000 max_arch_pfn = 0x10000000000
Aug 19 08:15:36.996912 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs
Aug 19 08:15:36.996919 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Aug 19 08:15:36.996925 kernel: e820: update [mem 0x48000000-0xffffffff] usable ==> reserved
Aug 19 08:15:36.996932 kernel: last_pfn = 0x40000 max_arch_pfn = 0x10000000000
Aug 19 08:15:36.996939 kernel: Using GB pages for direct mapping
Aug 19 08:15:36.996946 kernel: ACPI: Early table checksum verification disabled
Aug 19 08:15:36.996955 kernel: ACPI: RSDP 0x000000003FFFA014 000024 (v02 VRTUAL)
Aug 19 08:15:36.996963 kernel: ACPI: XSDT 0x000000003FFF90E8 00005C (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Aug 19 08:15:36.996970 kernel: ACPI: FACP 0x000000003FFF8000 000114 (v06 VRTUAL MICROSFT 00000001 MSFT 00000001)
Aug 19 08:15:36.996977 kernel: ACPI: DSDT 0x000000003FFD6000 01E27A (v02 MSFTVM DSDT01 00000001 INTL 20230628)
Aug 19 08:15:36.996984 kernel: ACPI: FACS 0x000000003FFFE000 000040
Aug 19 08:15:36.996991 kernel: ACPI: OEM0 0x000000003FFF7000 000064 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Aug 19 08:15:36.996997 kernel: ACPI: SPCR 0x000000003FFF6000 000050 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Aug 19 08:15:36.997006 kernel: ACPI: WAET 0x000000003FFF5000 000028 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Aug 19 08:15:36.997013 kernel: ACPI: APIC 0x000000003FFD5000 000052 (v05 HVLITE HVLITETB 00000000 MSHV 00000000)
Aug 19 08:15:36.997020 kernel: ACPI: SRAT 0x000000003FFD4000 0000A0 (v03 HVLITE HVLITETB 00000000 MSHV 00000000)
Aug 19 08:15:36.997026 kernel: ACPI: BGRT 0x000000003FFD3000 000038 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Aug 19 08:15:36.997034 kernel: ACPI: Reserving FACP table memory at [mem 0x3fff8000-0x3fff8113]
Aug 19 08:15:36.997040 kernel: ACPI: Reserving DSDT table memory at [mem 0x3ffd6000-0x3fff4279]
Aug 19 08:15:36.997047 kernel: ACPI: Reserving FACS table memory at [mem 0x3fffe000-0x3fffe03f]
Aug 19 08:15:36.997054 kernel: ACPI: Reserving OEM0 table memory at [mem 0x3fff7000-0x3fff7063]
Aug 19 08:15:36.997061 kernel: ACPI: Reserving SPCR table memory at [mem 0x3fff6000-0x3fff604f]
Aug 19 08:15:36.997069 kernel: ACPI: Reserving WAET table memory at [mem 0x3fff5000-0x3fff5027]
Aug 19 08:15:36.997075 kernel: ACPI: Reserving APIC table memory at [mem 0x3ffd5000-0x3ffd5051]
Aug 19 08:15:36.997082 kernel: ACPI: Reserving SRAT table memory at [mem 0x3ffd4000-0x3ffd409f]
Aug 19 08:15:36.997089 kernel: ACPI: Reserving BGRT table memory at [mem 0x3ffd3000-0x3ffd3037]
Aug 19 08:15:36.997096 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x3fffffff]
Aug 19 08:15:36.997103 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x2bfffffff]
Aug 19 08:15:36.997110 kernel: NUMA: Node 0 [mem 0x00001000-0x3fffffff] + [mem 0x100000000-0x2bfffffff] -> [mem 0x00001000-0x2bfffffff]
Aug 19 08:15:36.997117 kernel: NODE_DATA(0) allocated [mem 0x2bfff8dc0-0x2bfffffff]
Aug 19 08:15:36.997123 kernel: Zone ranges:
Aug 19 08:15:36.997132 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Aug 19 08:15:36.997139 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff]
Aug 19 08:15:36.997145 kernel: Normal [mem 0x0000000100000000-0x00000002bfffffff]
Aug 19 08:15:36.997152 kernel: Device empty
Aug 19 08:15:36.997159 kernel: Movable zone start for each node
Aug 19 08:15:36.997165 kernel: Early memory node ranges
Aug 19 08:15:36.997172 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff]
Aug 19 08:15:36.997179 kernel: node 0: [mem 0x0000000000100000-0x00000000044fdfff]
Aug 19 08:15:36.997186 kernel: node 0: [mem 0x00000000048fe000-0x000000003ff1efff]
Aug 19 08:15:36.997194 kernel: node 0: [mem 0x000000003ffff000-0x000000003fffffff]
Aug 19 08:15:36.997201 kernel: node 0: [mem 0x0000000100000000-0x00000002bfffffff]
Aug 19 08:15:36.997208 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x00000002bfffffff]
Aug 19 08:15:36.997215 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Aug 19 08:15:36.997221 kernel: On node 0, zone DMA: 96 pages in unavailable ranges
Aug 19 08:15:36.997228 kernel: On node 0, zone DMA32: 1024 pages in unavailable ranges
Aug 19 08:15:36.997235 kernel: On node 0, zone DMA32: 224 pages in unavailable ranges
Aug 19 08:15:36.997241 kernel: ACPI: PM-Timer IO Port: 0x408
Aug 19 08:15:36.997249 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Aug 19 08:15:36.997257 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Aug 19 08:15:36.997264 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Aug 19 08:15:36.997271 kernel: ACPI: SPCR: console: uart,io,0x3f8,115200
Aug 19 08:15:36.997278 kernel: TSC deadline timer available
Aug 19 08:15:36.997285 kernel: CPU topo: Max. logical packages: 1
Aug 19 08:15:36.997291 kernel: CPU topo: Max. logical dies: 1
Aug 19 08:15:36.997298 kernel: CPU topo: Max. dies per package: 1
Aug 19 08:15:36.997304 kernel: CPU topo: Max. threads per core: 2
Aug 19 08:15:36.997311 kernel: CPU topo: Num. cores per package: 1
Aug 19 08:15:36.997320 kernel: CPU topo: Num. threads per package: 2
Aug 19 08:15:36.997326 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs
Aug 19 08:15:36.997333 kernel: [mem 0x40000000-0xffffffff] available for PCI devices
Aug 19 08:15:36.997340 kernel: Booting paravirtualized kernel on Hyper-V
Aug 19 08:15:36.997347 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Aug 19 08:15:36.997354 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Aug 19 08:15:36.997360 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576
Aug 19 08:15:36.997367 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152
Aug 19 08:15:36.997374 kernel: pcpu-alloc: [0] 0 1
Aug 19 08:15:36.997382 kernel: Hyper-V: PV spinlocks enabled
Aug 19 08:15:36.997389 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Aug 19 08:15:36.997397 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlyprintk=ttyS0,115200 flatcar.first_boot=detected flatcar.oem.id=azure flatcar.autologin verity.usrhash=cc23dd01793203541561c15ffc568736bb5dae0d652141296dd11bf777bdf42f
Aug 19 08:15:36.997415 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Aug 19 08:15:36.997422 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Aug 19 08:15:36.997429 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Aug 19 08:15:36.997436 kernel: Fallback order for Node 0: 0
Aug 19 08:15:36.997443 kernel: Built 1 zonelists, mobility grouping on. Total pages: 2095807
Aug 19 08:15:36.997451 kernel: Policy zone: Normal
Aug 19 08:15:36.997458 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Aug 19 08:15:36.997464 kernel: software IO TLB: area num 2.
Aug 19 08:15:36.997471 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Aug 19 08:15:36.997478 kernel: ftrace: allocating 40101 entries in 157 pages
Aug 19 08:15:36.997485 kernel: ftrace: allocated 157 pages with 5 groups
Aug 19 08:15:36.997491 kernel: Dynamic Preempt: voluntary
Aug 19 08:15:36.997498 kernel: rcu: Preemptible hierarchical RCU implementation.
Aug 19 08:15:36.997506 kernel: rcu: RCU event tracing is enabled.
Aug 19 08:15:36.997520 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Aug 19 08:15:36.997527 kernel: Trampoline variant of Tasks RCU enabled.
Aug 19 08:15:36.997535 kernel: Rude variant of Tasks RCU enabled.
Aug 19 08:15:36.997543 kernel: Tracing variant of Tasks RCU enabled.
Aug 19 08:15:36.997550 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Aug 19 08:15:36.997558 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Aug 19 08:15:36.997565 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Aug 19 08:15:36.997572 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Aug 19 08:15:36.997580 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Aug 19 08:15:36.997587 kernel: Using NULL legacy PIC
Aug 19 08:15:36.997596 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 0
Aug 19 08:15:36.997603 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Aug 19 08:15:36.997611 kernel: Console: colour dummy device 80x25
Aug 19 08:15:36.997619 kernel: printk: legacy console [tty1] enabled
Aug 19 08:15:36.997626 kernel: printk: legacy console [ttyS0] enabled
Aug 19 08:15:36.997633 kernel: printk: legacy bootconsole [earlyser0] disabled
Aug 19 08:15:36.997641 kernel: ACPI: Core revision 20240827
Aug 19 08:15:36.997649 kernel: Failed to register legacy timer interrupt
Aug 19 08:15:36.997656 kernel: APIC: Switch to symmetric I/O mode setup
Aug 19 08:15:36.997664 kernel: x2apic enabled
Aug 19 08:15:36.997671 kernel: APIC: Switched APIC routing to: physical x2apic
Aug 19 08:15:36.997679 kernel: Hyper-V: Host Build 10.0.26100.1293-1-0
Aug 19 08:15:36.997686 kernel: Hyper-V: enabling crash_kexec_post_notifiers
Aug 19 08:15:36.997694 kernel: Hyper-V: Disabling IBT because of Hyper-V bug
Aug 19 08:15:36.997701 kernel: Hyper-V: Using IPI hypercalls
Aug 19 08:15:36.997708 kernel: APIC: send_IPI() replaced with hv_send_ipi()
Aug 19 08:15:36.997717 kernel: APIC: send_IPI_mask() replaced with hv_send_ipi_mask()
Aug 19 08:15:36.997725 kernel: APIC: send_IPI_mask_allbutself() replaced with hv_send_ipi_mask_allbutself()
Aug 19 08:15:36.997732 kernel: APIC: send_IPI_allbutself() replaced with hv_send_ipi_allbutself()
Aug 19 08:15:36.997740 kernel: APIC: send_IPI_all() replaced with hv_send_ipi_all()
Aug 19 08:15:36.997747 kernel: APIC: send_IPI_self() replaced with hv_send_ipi_self()
Aug 19 08:15:36.997755 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x212735f0517, max_idle_ns: 440795237604 ns
Aug 19 08:15:36.997763 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 4600.00 BogoMIPS (lpj=2300001)
Aug 19 08:15:36.997770 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Aug 19 08:15:36.997778 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0
Aug 19 08:15:36.997786 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0
Aug 19 08:15:36.997794 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Aug 19 08:15:36.997800 kernel: Spectre V2 : Mitigation: Retpolines
Aug 19 08:15:36.997808 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Aug 19 08:15:36.997815 kernel: RETBleed: WARNING: Spectre v2 mitigation leaves CPU vulnerable to RETBleed attacks, data leaks possible!
Aug 19 08:15:36.997822 kernel: RETBleed: Vulnerable
Aug 19 08:15:36.997831 kernel: Speculative Store Bypass: Vulnerable
Aug 19 08:15:36.997838 kernel: ITS: Mitigation: Aligned branch/return thunks
Aug 19 08:15:36.997845 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Aug 19 08:15:36.997852 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Aug 19 08:15:36.997857 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Aug 19 08:15:36.997863 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask'
Aug 19 08:15:36.997868 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256'
Aug 19 08:15:36.997872 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256'
Aug 19 08:15:36.997877 kernel: x86/fpu: Supporting XSAVE feature 0x800: 'Control-flow User registers'
Aug 19 08:15:36.997881 kernel: x86/fpu: Supporting XSAVE feature 0x20000: 'AMX Tile config'
Aug 19 08:15:36.997886 kernel: x86/fpu: Supporting XSAVE feature 0x40000: 'AMX Tile data'
Aug 19 08:15:36.997890 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Aug 19 08:15:36.997895 kernel: x86/fpu: xstate_offset[5]: 832, xstate_sizes[5]: 64
Aug 19 08:15:36.997899 kernel: x86/fpu: xstate_offset[6]: 896, xstate_sizes[6]: 512
Aug 19 08:15:36.997904 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 1024
Aug 19 08:15:36.997910 kernel: x86/fpu: xstate_offset[11]: 2432, xstate_sizes[11]: 16
Aug 19 08:15:36.997917 kernel: x86/fpu: xstate_offset[17]: 2496, xstate_sizes[17]: 64
Aug 19 08:15:36.997925 kernel: x86/fpu: xstate_offset[18]: 2560, xstate_sizes[18]: 8192
Aug 19 08:15:36.997932 kernel: x86/fpu: Enabled xstate features 0x608e7, context size is 10752 bytes, using 'compacted' format.
Aug 19 08:15:36.997938 kernel: Freeing SMP alternatives memory: 32K
Aug 19 08:15:36.997942 kernel: pid_max: default: 32768 minimum: 301
Aug 19 08:15:36.997947 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Aug 19 08:15:36.997952 kernel: landlock: Up and running.
Aug 19 08:15:36.997956 kernel: SELinux: Initializing.
Aug 19 08:15:36.997961 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Aug 19 08:15:36.997965 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Aug 19 08:15:36.997970 kernel: smpboot: CPU0: Intel INTEL(R) XEON(R) PLATINUM 8573C (family: 0x6, model: 0xcf, stepping: 0x2)
Aug 19 08:15:36.997975 kernel: Performance Events: unsupported p6 CPU model 207 no PMU driver, software events only.
Aug 19 08:15:36.997980 kernel: signal: max sigframe size: 11952
Aug 19 08:15:36.997984 kernel: rcu: Hierarchical SRCU implementation.
Aug 19 08:15:36.997989 kernel: rcu: Max phase no-delay instances is 400.
Aug 19 08:15:36.997994 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Aug 19 08:15:36.997998 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Aug 19 08:15:36.998003 kernel: smp: Bringing up secondary CPUs ...
Aug 19 08:15:36.998008 kernel: smpboot: x86: Booting SMP configuration:
Aug 19 08:15:36.998016 kernel: .... node #0, CPUs: #1
Aug 19 08:15:36.998027 kernel: smp: Brought up 1 node, 2 CPUs
Aug 19 08:15:36.998035 kernel: smpboot: Total of 2 processors activated (9200.00 BogoMIPS)
Aug 19 08:15:36.998040 kernel: Memory: 8077020K/8383228K available (14336K kernel code, 2430K rwdata, 9960K rodata, 54040K init, 2928K bss, 299992K reserved, 0K cma-reserved)
Aug 19 08:15:36.998044 kernel: devtmpfs: initialized
Aug 19 08:15:36.998049 kernel: x86/mm: Memory block size: 128MB
Aug 19 08:15:36.998053 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x3fffb000-0x3fffefff] (16384 bytes)
Aug 19 08:15:36.998058 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Aug 19 08:15:36.998063 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Aug 19 08:15:36.998071 kernel: pinctrl core: initialized pinctrl subsystem
Aug 19 08:15:36.998083 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Aug 19 08:15:36.998091 kernel: audit: initializing netlink subsys (disabled)
Aug 19 08:15:36.998098 kernel: audit: type=2000 audit(1755591333.030:1): state=initialized audit_enabled=0 res=1
Aug 19 08:15:36.998104 kernel: thermal_sys: Registered thermal governor 'step_wise'
Aug 19 08:15:36.998109 kernel: thermal_sys: Registered thermal governor 'user_space'
Aug 19 08:15:36.998113 kernel: cpuidle: using governor menu
Aug 19 08:15:36.998118 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Aug 19 08:15:36.998122 kernel: dca service started, version 1.12.1
Aug 19 08:15:36.998127 kernel: e820: reserve RAM buffer [mem 0x044fe000-0x07ffffff]
Aug 19 08:15:36.998133 kernel: e820: reserve RAM buffer [mem 0x3ff1f000-0x3fffffff]
Aug 19 08:15:36.998141 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Aug 19 08:15:36.998148 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Aug 19 08:15:36.998156 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Aug 19 08:15:36.998163 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Aug 19 08:15:36.998168 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Aug 19 08:15:36.998172 kernel: ACPI: Added _OSI(Module Device)
Aug 19 08:15:36.998177 kernel: ACPI: Added _OSI(Processor Device)
Aug 19 08:15:36.998181 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Aug 19 08:15:36.998188 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Aug 19 08:15:36.998197 kernel: ACPI: Interpreter enabled
Aug 19 08:15:36.998204 kernel: ACPI: PM: (supports S0 S5)
Aug 19 08:15:36.998209 kernel: ACPI: Using IOAPIC for interrupt routing
Aug 19 08:15:36.998214 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Aug 19 08:15:36.998218 kernel: PCI: Ignoring E820 reservations for host bridge windows
Aug 19 08:15:36.998223 kernel: ACPI: Enabled 1 GPEs in block 00 to 0F
Aug 19 08:15:36.998231 kernel: iommu: Default domain type: Translated
Aug 19 08:15:36.998241 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Aug 19 08:15:36.998250 kernel: efivars: Registered efivars operations
Aug 19 08:15:36.998255 kernel: PCI: Using ACPI for IRQ routing
Aug 19 08:15:36.998260 kernel: PCI: System does not support PCI
Aug 19 08:15:36.998264 kernel: vgaarb: loaded
Aug 19 08:15:36.998269 kernel: clocksource: Switched to clocksource tsc-early
Aug 19 08:15:36.998274 kernel: VFS: Disk quotas dquot_6.6.0
Aug 19 08:15:36.998283 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Aug 19 08:15:36.998292 kernel: pnp: PnP ACPI init
Aug 19 08:15:36.998300 kernel: pnp: PnP ACPI: found 3 devices
Aug 19 08:15:36.998307 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Aug 19 08:15:36.998311 kernel: NET: Registered PF_INET protocol family
Aug 19 08:15:36.998316 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Aug 19 08:15:36.998320 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Aug 19 08:15:36.998325 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Aug 19 08:15:36.998332 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Aug 19 08:15:36.998342 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear)
Aug 19 08:15:36.998350 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Aug 19 08:15:36.998358 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Aug 19 08:15:36.998364 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Aug 19 08:15:36.998368 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Aug 19 08:15:36.998373 kernel: NET: Registered PF_XDP protocol family
Aug 19 08:15:36.998377 kernel: PCI: CLS 0 bytes, default 64
Aug 19 08:15:36.998382 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Aug 19 08:15:36.998387 kernel: software IO TLB: mapped [mem 0x000000003a9da000-0x000000003e9da000] (64MB)
Aug 19 08:15:36.998395 kernel: RAPL PMU: API unit is 2^-32 Joules, 1 fixed counters, 10737418240 ms ovfl timer
Aug 19 08:15:36.998413 kernel: RAPL PMU: hw unit of domain psys 2^-0 Joules
Aug 19 08:15:36.998419 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x212735f0517, max_idle_ns: 440795237604 ns
Aug 19 08:15:36.998425 kernel: clocksource: Switched to clocksource tsc
Aug 19 08:15:36.998430 kernel: Initialise system trusted keyrings
Aug 19 08:15:36.998434 kernel: workingset: timestamp_bits=39 max_order=21 bucket_order=0
Aug 19 08:15:36.998439 kernel: Key type asymmetric registered
Aug 19 08:15:36.998445 kernel: Asymmetric key parser 'x509' registered
Aug 19 08:15:36.998456 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
Aug 19 08:15:36.998463 kernel: io scheduler mq-deadline registered
Aug 19 08:15:36.998470 kernel: io scheduler kyber registered
Aug 19 08:15:36.998474 kernel: io scheduler bfq registered
Aug 19 08:15:36.998480 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Aug 19 08:15:36.998485 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Aug 19 08:15:36.998490 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Aug 19 08:15:36.998494 kernel: 00:01: ttyS1 at I/O 0x2f8 (irq = 3, base_baud = 115200) is a 16550A
Aug 19 08:15:36.998502 kernel: serial8250: ttyS2 at I/O 0x3e8 (irq = 4, base_baud = 115200) is a 16550A
Aug 19 08:15:36.998510 kernel: i8042: PNP: No PS/2 controller found.
Aug 19 08:15:36.998617 kernel: rtc_cmos 00:02: registered as rtc0
Aug 19 08:15:36.998665 kernel: rtc_cmos 00:02: setting system clock to 2025-08-19T08:15:36 UTC (1755591336)
Aug 19 08:15:36.998729 kernel: rtc_cmos 00:02: alarms up to one month, 114 bytes nvram
Aug 19 08:15:36.998735 kernel: intel_pstate: Intel P-state driver initializing
Aug 19 08:15:36.998740 kernel: efifb: probing for efifb
Aug 19 08:15:36.998745 kernel: efifb: framebuffer at 0x40000000, using 3072k, total 3072k
Aug 19 08:15:36.998750 kernel: efifb: mode is 1024x768x32, linelength=4096, pages=1
Aug 19 08:15:36.998755 kernel: efifb: scrolling: redraw
Aug 19 08:15:36.998765 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0
Aug 19 08:15:36.998773 kernel: Console: switching to colour frame buffer device 128x48
Aug 19 08:15:36.998781 kernel: fb0: EFI VGA frame buffer device
Aug 19 08:15:36.998786 kernel: pstore: Using crash dump compression: deflate
Aug 19 08:15:36.998791 kernel: pstore: Registered efi_pstore as persistent store backend
Aug 19 08:15:36.998795 kernel: NET: Registered PF_INET6 protocol family
Aug 19 08:15:36.998801 kernel: Segment Routing with IPv6
Aug 19 08:15:36.998810 kernel: In-situ OAM (IOAM) with IPv6
Aug 19 08:15:36.998818 kernel: NET: Registered PF_PACKET protocol family
Aug 19 08:15:36.998824 kernel: Key type dns_resolver registered
Aug 19 08:15:36.998829 kernel: IPI shorthand broadcast: enabled
Aug 19 08:15:36.998835 kernel: sched_clock: Marking stable (3087004217, 112911162)->(3547437812, -347522433)
Aug 19 08:15:36.998841 kernel: registered taskstats version 1
Aug 19 08:15:36.998848 kernel: Loading compiled-in X.509 certificates
Aug 19 08:15:36.998856 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.41-flatcar: 93a065b103c00d4b81cc5822e4e7f9674e63afaf'
Aug 19 08:15:36.998863 kernel: Demotion targets for Node 0: null
Aug 19 08:15:36.998867 kernel: Key type .fscrypt registered
Aug 19 08:15:36.998872 kernel: Key type fscrypt-provisioning registered
Aug 19 08:15:36.998877 kernel: ima: No TPM chip found, activating TPM-bypass!
Aug 19 08:15:36.998884 kernel: ima: Allocated hash algorithm: sha1
Aug 19 08:15:36.998896 kernel: ima: No architecture policies found
Aug 19 08:15:36.998903 kernel: clk: Disabling unused clocks
Aug 19 08:15:36.998910 kernel: Warning: unable to open an initial console.
Aug 19 08:15:36.998915 kernel: Freeing unused kernel image (initmem) memory: 54040K
Aug 19 08:15:36.998919 kernel: Write protecting the kernel read-only data: 24576k
Aug 19 08:15:36.998924 kernel: Freeing unused kernel image (rodata/data gap) memory: 280K
Aug 19 08:15:36.998928 kernel: Run /init as init process
Aug 19 08:15:36.998936 kernel: with arguments:
Aug 19 08:15:36.998945 kernel: /init
Aug 19 08:15:36.998958 kernel: with environment:
Aug 19 08:15:36.998964 kernel: HOME=/
Aug 19 08:15:36.998969 kernel: TERM=linux
Aug 19 08:15:36.998973 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Aug 19 08:15:36.998979 systemd[1]: Successfully made /usr/ read-only.
Aug 19 08:15:36.998990 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Aug 19 08:15:36.999000 systemd[1]: Detected virtualization microsoft.
Aug 19 08:15:36.999007 systemd[1]: Detected architecture x86-64.
Aug 19 08:15:36.999019 systemd[1]: Running in initrd.
Aug 19 08:15:36.999026 systemd[1]: No hostname configured, using default hostname.
Aug 19 08:15:36.999032 systemd[1]: Hostname set to .
Aug 19 08:15:36.999036 systemd[1]: Initializing machine ID from random generator.
Aug 19 08:15:36.999041 systemd[1]: Queued start job for default target initrd.target.
Aug 19 08:15:36.999048 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Aug 19 08:15:36.999056 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Aug 19 08:15:36.999065 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Aug 19 08:15:36.999072 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Aug 19 08:15:36.999077 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Aug 19 08:15:36.999083 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Aug 19 08:15:36.999092 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Aug 19 08:15:36.999100 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Aug 19 08:15:36.999107 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Aug 19 08:15:36.999115 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Aug 19 08:15:36.999120 systemd[1]: Reached target paths.target - Path Units.
Aug 19 08:15:36.999125 systemd[1]: Reached target slices.target - Slice Units.
Aug 19 08:15:36.999134 systemd[1]: Reached target swap.target - Swaps.
Aug 19 08:15:36.999142 systemd[1]: Reached target timers.target - Timer Units.
Aug 19 08:15:36.999150 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Aug 19 08:15:36.999157 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Aug 19 08:15:36.999163 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Aug 19 08:15:36.999168 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Aug 19 08:15:36.999174 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Aug 19 08:15:36.999182 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Aug 19 08:15:36.999190 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Aug 19 08:15:36.999198 systemd[1]: Reached target sockets.target - Socket Units.
Aug 19 08:15:36.999205 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Aug 19 08:15:37.000444 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Aug 19 08:15:37.000457 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Aug 19 08:15:37.000467 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Aug 19 08:15:37.000478 systemd[1]: Starting systemd-fsck-usr.service...
Aug 19 08:15:37.000486 systemd[1]: Starting systemd-journald.service - Journal Service...
Aug 19 08:15:37.000494 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Aug 19 08:15:37.000512 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Aug 19 08:15:37.000521 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Aug 19 08:15:37.000550 systemd-journald[205]: Collecting audit messages is disabled.
Aug 19 08:15:37.000570 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Aug 19 08:15:37.000581 systemd-journald[205]: Journal started
Aug 19 08:15:37.000603 systemd-journald[205]: Runtime Journal (/run/log/journal/57138af4c98247038263c448a4225f9a) is 8M, max 158.9M, 150.9M free.
Aug 19 08:15:37.007420 systemd[1]: Started systemd-journald.service - Journal Service.
Aug 19 08:15:37.011016 systemd[1]: Finished systemd-fsck-usr.service.
Aug 19 08:15:37.012206 systemd-modules-load[206]: Inserted module 'overlay'
Aug 19 08:15:37.013605 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Aug 19 08:15:37.021503 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Aug 19 08:15:37.026465 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Aug 19 08:15:37.034947 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Aug 19 08:15:37.042715 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Aug 19 08:15:37.047487 kernel: Bridge firewalling registered
Aug 19 08:15:37.049516 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Aug 19 08:15:37.050461 systemd-modules-load[206]: Inserted module 'br_netfilter'
Aug 19 08:15:37.051418 systemd-tmpfiles[221]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Aug 19 08:15:37.054511 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Aug 19 08:15:37.058459 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Aug 19 08:15:37.061997 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Aug 19 08:15:37.066631 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Aug 19 08:15:37.069976 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Aug 19 08:15:37.083505 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Aug 19 08:15:37.094726 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Aug 19 08:15:37.099574 dracut-cmdline[236]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlyprintk=ttyS0,115200 flatcar.first_boot=detected flatcar.oem.id=azure flatcar.autologin verity.usrhash=cc23dd01793203541561c15ffc568736bb5dae0d652141296dd11bf777bdf42f Aug 19 08:15:37.100154 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Aug 19 08:15:37.112394 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Aug 19 08:15:37.140133 systemd-resolved[258]: Positive Trust Anchors: Aug 19 08:15:37.141442 systemd-resolved[258]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Aug 19 08:15:37.141537 systemd-resolved[258]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Aug 19 08:15:37.143978 systemd-resolved[258]: Defaulting to hostname 'linux'. Aug 19 08:15:37.159262 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Aug 19 08:15:37.161953 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Aug 19 08:15:37.180418 kernel: SCSI subsystem initialized Aug 19 08:15:37.187413 kernel: Loading iSCSI transport class v2.0-870. Aug 19 08:15:37.195419 kernel: iscsi: registered transport (tcp) Aug 19 08:15:37.212460 kernel: iscsi: registered transport (qla4xxx) Aug 19 08:15:37.212498 kernel: QLogic iSCSI HBA Driver Aug 19 08:15:37.225105 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Aug 19 08:15:37.237516 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Aug 19 08:15:37.238190 systemd[1]: Reached target network-pre.target - Preparation for Network. Aug 19 08:15:37.271035 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Aug 19 08:15:37.273509 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... 
Aug 19 08:15:37.325421 kernel: raid6: avx512x4 gen() 44277 MB/s Aug 19 08:15:37.342413 kernel: raid6: avx512x2 gen() 44340 MB/s Aug 19 08:15:37.359418 kernel: raid6: avx512x1 gen() 27558 MB/s Aug 19 08:15:37.378409 kernel: raid6: avx2x4 gen() 36443 MB/s Aug 19 08:15:37.396411 kernel: raid6: avx2x2 gen() 38538 MB/s Aug 19 08:15:37.413796 kernel: raid6: avx2x1 gen() 30958 MB/s Aug 19 08:15:37.413816 kernel: raid6: using algorithm avx512x2 gen() 44340 MB/s Aug 19 08:15:37.433419 kernel: raid6: .... xor() 31751 MB/s, rmw enabled Aug 19 08:15:37.433438 kernel: raid6: using avx512x2 recovery algorithm Aug 19 08:15:37.450419 kernel: xor: automatically using best checksumming function avx Aug 19 08:15:37.562420 kernel: Btrfs loaded, zoned=no, fsverity=no Aug 19 08:15:37.567238 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Aug 19 08:15:37.570326 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Aug 19 08:15:37.591131 systemd-udevd[453]: Using default interface naming scheme 'v255'. Aug 19 08:15:37.595389 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Aug 19 08:15:37.602686 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Aug 19 08:15:37.623879 dracut-pre-trigger[464]: rd.md=0: removing MD RAID activation Aug 19 08:15:37.639948 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Aug 19 08:15:37.641084 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Aug 19 08:15:37.677801 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Aug 19 08:15:37.684935 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Aug 19 08:15:37.731427 kernel: cryptd: max_cpu_qlen set to 1000 Aug 19 08:15:37.746470 kernel: AES CTR mode by8 optimization enabled Aug 19 08:15:37.752619 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. 
Aug 19 08:15:37.752730 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Aug 19 08:15:37.761619 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Aug 19 08:15:37.776893 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Aug 19 08:15:37.778826 kernel: hv_vmbus: Vmbus version:5.3 Aug 19 08:15:37.794898 kernel: pps_core: LinuxPPS API ver. 1 registered Aug 19 08:15:37.794933 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Aug 19 08:15:37.794945 kernel: PTP clock support registered Aug 19 08:15:37.794956 kernel: hv_utils: Registering HyperV Utility Driver Aug 19 08:15:37.796656 kernel: hv_vmbus: registering driver hv_utils Aug 19 08:15:37.798049 kernel: hv_utils: Shutdown IC version 3.2 Aug 19 08:15:37.800525 kernel: hv_utils: Heartbeat IC version 3.0 Aug 19 08:15:37.803210 kernel: hv_vmbus: registering driver hyperv_keyboard Aug 19 08:15:37.800589 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Aug 19 08:15:37.800683 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Aug 19 08:15:37.813635 kernel: hv_utils: TimeSync IC version 4.0 Aug 19 08:15:37.813730 kernel: hv_vmbus: registering driver hv_netvsc Aug 19 08:15:37.487692 systemd-resolved[258]: Clock change detected. Flushing caches. Aug 19 08:15:37.495945 systemd-journald[205]: Time jumped backwards, rotating. Aug 19 08:15:37.495990 kernel: hv_vmbus: registering driver hv_pci Aug 19 08:15:37.506510 kernel: input: AT Translated Set 2 keyboard as /devices/LNXSYSTM:00/LNXSYBUS:00/ACPI0004:00/MSFT1000:00/d34b2567-b9b6-42b9-8778-0a4ec0b955bf/serio0/input/input0 Aug 19 08:15:37.506546 kernel: hv_pci 7ad35d50-c05b-47ab-b3a0-56a9a845852b: PCI VMBus probing: Using version 0x10004 Aug 19 08:15:37.496134 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Aug 19 08:15:37.514103 kernel: hid: raw HID events driver (C) Jiri Kosina Aug 19 08:15:37.516553 kernel: hv_vmbus: registering driver hid_hyperv Aug 19 08:15:37.522126 kernel: hv_netvsc f8615163-0000-1000-2000-7c1e52fada84 (unnamed net_device) (uninitialized): VF slot 1 added Aug 19 08:15:37.526081 kernel: input: Microsoft Vmbus HID-compliant Mouse as /devices/0006:045E:0621.0001/input/input1 Aug 19 08:15:37.533684 kernel: hv_pci 7ad35d50-c05b-47ab-b3a0-56a9a845852b: PCI host bridge to bus c05b:00 Aug 19 08:15:37.533830 kernel: hid-hyperv 0006:045E:0621.0001: input: VIRTUAL HID v0.01 Mouse [Microsoft Vmbus HID-compliant Mouse] on Aug 19 08:15:37.538315 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Aug 19 08:15:37.542139 kernel: hv_vmbus: registering driver hv_storvsc Aug 19 08:15:37.544879 kernel: pci_bus c05b:00: root bus resource [mem 0xfc0000000-0xfc007ffff window] Aug 19 08:15:37.545353 kernel: pci_bus c05b:00: No busn resource found for root bus, will use [bus 00-ff] Aug 19 08:15:37.548324 kernel: scsi host0: storvsc_host_t Aug 19 08:15:37.548376 kernel: pci c05b:00:00.0: [1414:00a9] type 00 class 0x010802 PCIe Endpoint Aug 19 08:15:37.554240 kernel: scsi 0:0:0:2: CD-ROM Msft Virtual DVD-ROM 1.0 PQ: 0 ANSI: 5 Aug 19 08:15:37.554282 kernel: pci c05b:00:00.0: BAR 0 [mem 0xfc0000000-0xfc007ffff 64bit] Aug 19 08:15:37.565551 kernel: sr 0:0:0:2: [sr0] scsi-1 drive Aug 19 08:15:37.565707 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Aug 19 08:15:37.567079 kernel: sr 0:0:0:2: Attached scsi CD-ROM sr0 Aug 19 08:15:37.567516 kernel: pci c05b:00:00.0: 32.000 Gb/s available PCIe bandwidth, limited by 2.5 GT/s PCIe x16 link at c05b:00:00.0 (capable of 1024.000 Gb/s with 64.0 GT/s PCIe x16 link) Aug 19 08:15:37.575222 kernel: pci_bus c05b:00: busn_res: [bus 00-ff] end is updated to 00 Aug 19 08:15:37.575366 kernel: pci c05b:00:00.0: BAR 0 [mem 0xfc0000000-0xfc007ffff 64bit]: assigned Aug 19 08:15:37.583180 kernel: hv_storvsc 
f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#199 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001 Aug 19 08:15:37.591001 kernel: nvme nvme0: pci function c05b:00:00.0 Aug 19 08:15:37.591200 kernel: nvme c05b:00:00.0: enabling device (0000 -> 0002) Aug 19 08:15:37.605113 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#144 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001 Aug 19 08:15:37.768084 kernel: nvme nvme0: 2/0/0 default/read/poll queues Aug 19 08:15:37.782050 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 Aug 19 08:15:38.125076 kernel: nvme nvme0: using unchecked data buffer Aug 19 08:15:38.531315 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - MSFT NVMe Accelerator v1.0 USR-A. Aug 19 08:15:38.532737 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - MSFT NVMe Accelerator v1.0 USR-A. Aug 19 08:15:38.546328 kernel: hv_pci 00000001-7870-47b5-b203-907d12ca697e: PCI VMBus probing: Using version 0x10004 Aug 19 08:15:38.537835 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... 
Aug 19 08:15:38.553960 kernel: hv_pci 00000001-7870-47b5-b203-907d12ca697e: PCI host bridge to bus 7870:00 Aug 19 08:15:38.554183 kernel: pci_bus 7870:00: root bus resource [mem 0xfc2000000-0xfc4007fff window] Aug 19 08:15:38.554281 kernel: pci_bus 7870:00: No busn resource found for root bus, will use [bus 00-ff] Aug 19 08:15:38.558259 kernel: pci 7870:00:00.0: [1414:00ba] type 00 class 0x020000 PCIe Endpoint Aug 19 08:15:38.562172 kernel: pci 7870:00:00.0: BAR 0 [mem 0xfc2000000-0xfc3ffffff 64bit pref] Aug 19 08:15:38.567561 kernel: pci 7870:00:00.0: BAR 4 [mem 0xfc4000000-0xfc4007fff 64bit pref] Aug 19 08:15:38.567585 kernel: pci 7870:00:00.0: enabling Extended Tags Aug 19 08:15:38.582963 kernel: pci_bus 7870:00: busn_res: [bus 00-ff] end is updated to 00 Aug 19 08:15:38.583164 kernel: pci 7870:00:00.0: BAR 0 [mem 0xfc2000000-0xfc3ffffff 64bit pref]: assigned Aug 19 08:15:38.587949 kernel: pci 7870:00:00.0: BAR 4 [mem 0xfc4000000-0xfc4007fff 64bit pref]: assigned Aug 19 08:15:38.592014 kernel: mana 7870:00:00.0: enabling device (0000 -> 0002) Aug 19 08:15:38.602048 kernel: mana 7870:00:00.0: Microsoft Azure Network Adapter protocol version: 0.1.1 Aug 19 08:15:38.606382 kernel: hv_netvsc f8615163-0000-1000-2000-7c1e52fada84 eth0: VF registering: eth1 Aug 19 08:15:38.606550 kernel: mana 7870:00:00.0 eth1: joined to eth0 Aug 19 08:15:38.610124 kernel: mana 7870:00:00.0 enP30832s1: renamed from eth1 Aug 19 08:15:38.883946 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - MSFT NVMe Accelerator v1.0 EFI-SYSTEM. Aug 19 08:15:38.913013 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - MSFT NVMe Accelerator v1.0 OEM. Aug 19 08:15:38.926441 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - MSFT NVMe Accelerator v1.0 ROOT. Aug 19 08:15:39.010695 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Aug 19 08:15:39.014828 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. 
Aug 19 08:15:39.016284 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Aug 19 08:15:39.020721 systemd[1]: Reached target remote-fs.target - Remote File Systems. Aug 19 08:15:39.026011 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Aug 19 08:15:39.042554 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Aug 19 08:15:39.633056 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 Aug 19 08:15:39.633412 disk-uuid[663]: The operation has completed successfully. Aug 19 08:15:39.685439 systemd[1]: disk-uuid.service: Deactivated successfully. Aug 19 08:15:39.685519 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Aug 19 08:15:39.713303 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Aug 19 08:15:39.721390 sh[717]: Success Aug 19 08:15:39.753515 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Aug 19 08:15:39.753579 kernel: device-mapper: uevent: version 1.0.3 Aug 19 08:15:39.754581 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Aug 19 08:15:39.763048 kernel: device-mapper: verity: sha256 using shash "sha256-ni" Aug 19 08:15:40.048346 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Aug 19 08:15:40.054058 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Aug 19 08:15:40.070339 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
Aug 19 08:15:40.136114 kernel: BTRFS: device fsid 99050df3-5e04-4f37-acde-dec46aab7896 devid 1 transid 39 /dev/mapper/usr (254:0) scanned by mount (730) Aug 19 08:15:40.136152 kernel: BTRFS info (device dm-0): first mount of filesystem 99050df3-5e04-4f37-acde-dec46aab7896 Aug 19 08:15:40.138594 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Aug 19 08:15:40.140337 kernel: BTRFS info (device dm-0): using free-space-tree Aug 19 08:15:40.410430 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Aug 19 08:15:40.411139 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Aug 19 08:15:40.416592 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Aug 19 08:15:40.417206 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Aug 19 08:15:40.424229 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Aug 19 08:15:40.450197 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/nvme0n1p6 (259:5) scanned by mount (761) Aug 19 08:15:40.457065 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 43dd0637-5e0b-4b8d-a544-a82ca0652f6f Aug 19 08:15:40.457104 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-intel) checksum algorithm Aug 19 08:15:40.457114 kernel: BTRFS info (device nvme0n1p6): using free-space-tree Aug 19 08:15:40.480079 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem 43dd0637-5e0b-4b8d-a544-a82ca0652f6f Aug 19 08:15:40.481008 systemd[1]: Finished ignition-setup.service - Ignition (setup). Aug 19 08:15:40.486158 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Aug 19 08:15:40.498827 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Aug 19 08:15:40.502159 systemd[1]: Starting systemd-networkd.service - Network Configuration... 
Aug 19 08:15:40.529336 systemd-networkd[899]: lo: Link UP Aug 19 08:15:40.529343 systemd-networkd[899]: lo: Gained carrier Aug 19 08:15:40.537130 kernel: mana 7870:00:00.0 enP30832s1: Configured vPort 0 PD 18 DB 16 Aug 19 08:15:40.537303 kernel: mana 7870:00:00.0 enP30832s1: Configured steering vPort 0 entries 64 Aug 19 08:15:40.530328 systemd-networkd[899]: Enumeration completed Aug 19 08:15:40.541107 kernel: hv_netvsc f8615163-0000-1000-2000-7c1e52fada84 eth0: Data path switched to VF: enP30832s1 Aug 19 08:15:40.530691 systemd-networkd[899]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Aug 19 08:15:40.530694 systemd-networkd[899]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Aug 19 08:15:40.530979 systemd[1]: Started systemd-networkd.service - Network Configuration. Aug 19 08:15:40.531273 systemd[1]: Reached target network.target - Network. Aug 19 08:15:40.540444 systemd-networkd[899]: enP30832s1: Link UP Aug 19 08:15:40.540511 systemd-networkd[899]: eth0: Link UP Aug 19 08:15:40.540619 systemd-networkd[899]: eth0: Gained carrier Aug 19 08:15:40.540629 systemd-networkd[899]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Aug 19 08:15:40.544169 systemd-networkd[899]: enP30832s1: Gained carrier Aug 19 08:15:40.559066 systemd-networkd[899]: eth0: DHCPv4 address 10.200.8.40/24, gateway 10.200.8.1 acquired from 168.63.129.16 Aug 19 08:15:41.748900 ignition[878]: Ignition 2.21.0 Aug 19 08:15:41.748911 ignition[878]: Stage: fetch-offline Aug 19 08:15:41.749002 ignition[878]: no configs at "/usr/lib/ignition/base.d" Aug 19 08:15:41.750575 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). 
Aug 19 08:15:41.749009 ignition[878]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Aug 19 08:15:41.758648 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Aug 19 08:15:41.749117 ignition[878]: parsed url from cmdline: "" Aug 19 08:15:41.749120 ignition[878]: no config URL provided Aug 19 08:15:41.749125 ignition[878]: reading system config file "/usr/lib/ignition/user.ign" Aug 19 08:15:41.749130 ignition[878]: no config at "/usr/lib/ignition/user.ign" Aug 19 08:15:41.749135 ignition[878]: failed to fetch config: resource requires networking Aug 19 08:15:41.749287 ignition[878]: Ignition finished successfully Aug 19 08:15:41.780160 ignition[919]: Ignition 2.21.0 Aug 19 08:15:41.780171 ignition[919]: Stage: fetch Aug 19 08:15:41.780342 ignition[919]: no configs at "/usr/lib/ignition/base.d" Aug 19 08:15:41.780349 ignition[919]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Aug 19 08:15:41.780415 ignition[919]: parsed url from cmdline: "" Aug 19 08:15:41.780417 ignition[919]: no config URL provided Aug 19 08:15:41.780421 ignition[919]: reading system config file "/usr/lib/ignition/user.ign" Aug 19 08:15:41.780427 ignition[919]: no config at "/usr/lib/ignition/user.ign" Aug 19 08:15:41.780458 ignition[919]: GET http://169.254.169.254/metadata/instance/compute/userData?api-version=2021-01-01&format=text: attempt #1 Aug 19 08:15:41.841824 ignition[919]: GET result: OK Aug 19 08:15:41.841893 ignition[919]: config has been read from IMDS userdata Aug 19 08:15:41.841919 ignition[919]: parsing config with SHA512: ff10b24173fb93eebf1171601d7feeaf332d519a4bb4d057944b93851e9b3222de9cdfc2b779f25606d799e6bc8c7b9096a3206767935f143da7c86b9fbbcd49 Aug 19 08:15:41.845502 unknown[919]: fetched base config from "system" Aug 19 08:15:41.845511 unknown[919]: fetched base config from "system" Aug 19 08:15:41.845825 ignition[919]: fetch: fetch complete Aug 19 08:15:41.845516 unknown[919]: fetched user config from "azure" Aug 19 08:15:41.845829 
ignition[919]: fetch: fetch passed Aug 19 08:15:41.848436 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Aug 19 08:15:41.845863 ignition[919]: Ignition finished successfully Aug 19 08:15:41.853728 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Aug 19 08:15:41.882532 ignition[925]: Ignition 2.21.0 Aug 19 08:15:41.882542 ignition[925]: Stage: kargs Aug 19 08:15:41.882767 ignition[925]: no configs at "/usr/lib/ignition/base.d" Aug 19 08:15:41.884633 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Aug 19 08:15:41.882775 ignition[925]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Aug 19 08:15:41.887211 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Aug 19 08:15:41.883603 ignition[925]: kargs: kargs passed Aug 19 08:15:41.883639 ignition[925]: Ignition finished successfully Aug 19 08:15:41.907335 ignition[931]: Ignition 2.21.0 Aug 19 08:15:41.907345 ignition[931]: Stage: disks Aug 19 08:15:41.907505 ignition[931]: no configs at "/usr/lib/ignition/base.d" Aug 19 08:15:41.909719 systemd[1]: Finished ignition-disks.service - Ignition (disks). Aug 19 08:15:41.907512 ignition[931]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Aug 19 08:15:41.908203 ignition[931]: disks: disks passed Aug 19 08:15:41.908234 ignition[931]: Ignition finished successfully Aug 19 08:15:41.917136 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Aug 19 08:15:41.918272 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Aug 19 08:15:41.923067 systemd[1]: Reached target local-fs.target - Local File Systems. Aug 19 08:15:41.926237 systemd[1]: Reached target sysinit.target - System Initialization. Aug 19 08:15:41.932073 systemd[1]: Reached target basic.target - Basic System. Aug 19 08:15:41.935957 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... 
Aug 19 08:15:42.145503 systemd-fsck[939]: ROOT: clean, 15/7326000 files, 477845/7359488 blocks Aug 19 08:15:42.150818 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Aug 19 08:15:42.155341 systemd[1]: Mounting sysroot.mount - /sysroot... Aug 19 08:15:42.458324 kernel: EXT4-fs (nvme0n1p9): mounted filesystem 41966107-04fa-426e-9830-6b4efa50e27b r/w with ordered data mode. Quota mode: none. Aug 19 08:15:42.458920 systemd[1]: Mounted sysroot.mount - /sysroot. Aug 19 08:15:42.460428 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Aug 19 08:15:42.477456 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Aug 19 08:15:42.482126 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Aug 19 08:15:42.493219 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Aug 19 08:15:42.497716 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Aug 19 08:15:42.497748 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Aug 19 08:15:42.500505 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Aug 19 08:15:42.516520 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/nvme0n1p6 (259:5) scanned by mount (948) Aug 19 08:15:42.516549 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 43dd0637-5e0b-4b8d-a544-a82ca0652f6f Aug 19 08:15:42.516559 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-intel) checksum algorithm Aug 19 08:15:42.516567 kernel: BTRFS info (device nvme0n1p6): using free-space-tree Aug 19 08:15:42.509296 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Aug 19 08:15:42.518945 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Aug 19 08:15:42.533161 systemd-networkd[899]: eth0: Gained IPv6LL Aug 19 08:15:43.023217 coreos-metadata[950]: Aug 19 08:15:43.023 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1 Aug 19 08:15:43.039806 coreos-metadata[950]: Aug 19 08:15:43.039 INFO Fetch successful Aug 19 08:15:43.039806 coreos-metadata[950]: Aug 19 08:15:43.039 INFO Fetching http://169.254.169.254/metadata/instance/compute/name?api-version=2017-08-01&format=text: Attempt #1 Aug 19 08:15:43.050973 coreos-metadata[950]: Aug 19 08:15:43.050 INFO Fetch successful Aug 19 08:15:43.068899 coreos-metadata[950]: Aug 19 08:15:43.068 INFO wrote hostname ci-4426.0.0-a-5588c1b4cf to /sysroot/etc/hostname Aug 19 08:15:43.071872 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Aug 19 08:15:43.396773 initrd-setup-root[978]: cut: /sysroot/etc/passwd: No such file or directory Aug 19 08:15:43.430864 initrd-setup-root[985]: cut: /sysroot/etc/group: No such file or directory Aug 19 08:15:43.468533 initrd-setup-root[992]: cut: /sysroot/etc/shadow: No such file or directory Aug 19 08:15:43.486634 initrd-setup-root[999]: cut: /sysroot/etc/gshadow: No such file or directory Aug 19 08:15:44.960591 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Aug 19 08:15:44.963058 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Aug 19 08:15:44.967368 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Aug 19 08:15:44.977433 systemd[1]: sysroot-oem.mount: Deactivated successfully. Aug 19 08:15:44.982233 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem 43dd0637-5e0b-4b8d-a544-a82ca0652f6f Aug 19 08:15:44.995817 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
Aug 19 08:15:45.006112 ignition[1067]: INFO : Ignition 2.21.0 Aug 19 08:15:45.006112 ignition[1067]: INFO : Stage: mount Aug 19 08:15:45.011649 ignition[1067]: INFO : no configs at "/usr/lib/ignition/base.d" Aug 19 08:15:45.011649 ignition[1067]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Aug 19 08:15:45.011649 ignition[1067]: INFO : mount: mount passed Aug 19 08:15:45.011649 ignition[1067]: INFO : Ignition finished successfully Aug 19 08:15:45.008149 systemd[1]: Finished ignition-mount.service - Ignition (mount). Aug 19 08:15:45.010249 systemd[1]: Starting ignition-files.service - Ignition (files)... Aug 19 08:15:45.032264 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Aug 19 08:15:45.053051 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/nvme0n1p6 (259:5) scanned by mount (1078) Aug 19 08:15:45.055227 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 43dd0637-5e0b-4b8d-a544-a82ca0652f6f Aug 19 08:15:45.055263 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-intel) checksum algorithm Aug 19 08:15:45.056198 kernel: BTRFS info (device nvme0n1p6): using free-space-tree Aug 19 08:15:45.060900 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Aug 19 08:15:45.080497 ignition[1095]: INFO : Ignition 2.21.0 Aug 19 08:15:45.080497 ignition[1095]: INFO : Stage: files Aug 19 08:15:45.082393 ignition[1095]: INFO : no configs at "/usr/lib/ignition/base.d" Aug 19 08:15:45.082393 ignition[1095]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Aug 19 08:15:45.082393 ignition[1095]: DEBUG : files: compiled without relabeling support, skipping Aug 19 08:15:45.109695 ignition[1095]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Aug 19 08:15:45.109695 ignition[1095]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Aug 19 08:15:45.129542 ignition[1095]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Aug 19 08:15:45.132096 ignition[1095]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Aug 19 08:15:45.132096 ignition[1095]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Aug 19 08:15:45.129879 unknown[1095]: wrote ssh authorized keys file for user: core Aug 19 08:15:45.145521 ignition[1095]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Aug 19 08:15:45.149097 ignition[1095]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 Aug 19 08:16:04.516818 ignition[1095]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Aug 19 08:16:04.824284 ignition[1095]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Aug 19 08:16:04.824284 ignition[1095]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Aug 19 08:16:04.833128 ignition[1095]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" 
Aug 19 08:16:04.833128 ignition[1095]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Aug 19 08:16:04.833128 ignition[1095]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Aug 19 08:16:04.833128 ignition[1095]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Aug 19 08:16:04.833128 ignition[1095]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Aug 19 08:16:04.833128 ignition[1095]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Aug 19 08:16:04.833128 ignition[1095]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Aug 19 08:16:04.833128 ignition[1095]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Aug 19 08:16:04.833128 ignition[1095]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Aug 19 08:16:04.833128 ignition[1095]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Aug 19 08:16:04.862072 ignition[1095]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Aug 19 08:16:04.862072 ignition[1095]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Aug 19 08:16:04.862072 ignition[1095]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET 
https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-x86-64.raw: attempt #1 Aug 19 08:16:05.429395 ignition[1095]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Aug 19 08:16:06.460224 ignition[1095]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Aug 19 08:16:06.460224 ignition[1095]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Aug 19 08:16:06.505607 ignition[1095]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Aug 19 08:16:06.512167 ignition[1095]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Aug 19 08:16:06.512167 ignition[1095]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Aug 19 08:16:06.517981 ignition[1095]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Aug 19 08:16:06.517981 ignition[1095]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Aug 19 08:16:06.517981 ignition[1095]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Aug 19 08:16:06.517981 ignition[1095]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Aug 19 08:16:06.517981 ignition[1095]: INFO : files: files passed Aug 19 08:16:06.517981 ignition[1095]: INFO : Ignition finished successfully Aug 19 08:16:06.515420 systemd[1]: Finished ignition-files.service - Ignition (files). Aug 19 08:16:06.520148 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Aug 19 08:16:06.539793 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... 
Aug 19 08:16:06.545375 systemd[1]: ignition-quench.service: Deactivated successfully. Aug 19 08:16:06.545844 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Aug 19 08:16:06.558972 initrd-setup-root-after-ignition[1124]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Aug 19 08:16:06.558972 initrd-setup-root-after-ignition[1124]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Aug 19 08:16:06.566104 initrd-setup-root-after-ignition[1128]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Aug 19 08:16:06.562719 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Aug 19 08:16:06.566828 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Aug 19 08:16:06.572013 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Aug 19 08:16:06.599508 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Aug 19 08:16:06.599592 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Aug 19 08:16:06.604254 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Aug 19 08:16:06.607047 systemd[1]: Reached target initrd.target - Initrd Default Target. Aug 19 08:16:06.609239 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Aug 19 08:16:06.609821 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Aug 19 08:16:06.631571 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Aug 19 08:16:06.633406 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Aug 19 08:16:06.648720 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Aug 19 08:16:06.651200 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. 
Aug 19 08:16:06.653610 systemd[1]: Stopped target timers.target - Timer Units. Aug 19 08:16:06.654530 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Aug 19 08:16:06.654630 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Aug 19 08:16:06.655084 systemd[1]: Stopped target initrd.target - Initrd Default Target. Aug 19 08:16:06.655357 systemd[1]: Stopped target basic.target - Basic System. Aug 19 08:16:06.655734 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Aug 19 08:16:06.662184 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Aug 19 08:16:06.664334 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Aug 19 08:16:06.664596 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Aug 19 08:16:06.664834 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Aug 19 08:16:06.665131 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Aug 19 08:16:06.672184 systemd[1]: Stopped target sysinit.target - System Initialization. Aug 19 08:16:06.676178 systemd[1]: Stopped target local-fs.target - Local File Systems. Aug 19 08:16:06.680167 systemd[1]: Stopped target swap.target - Swaps. Aug 19 08:16:06.680293 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Aug 19 08:16:06.680404 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Aug 19 08:16:06.684121 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Aug 19 08:16:06.687159 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Aug 19 08:16:06.692145 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Aug 19 08:16:06.692583 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Aug 19 08:16:06.694239 systemd[1]: dracut-initqueue.service: Deactivated successfully. 
Aug 19 08:16:06.694329 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Aug 19 08:16:06.711581 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Aug 19 08:16:06.711699 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Aug 19 08:16:06.713502 systemd[1]: ignition-files.service: Deactivated successfully. Aug 19 08:16:06.713612 systemd[1]: Stopped ignition-files.service - Ignition (files). Aug 19 08:16:06.718087 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Aug 19 08:16:06.718195 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Aug 19 08:16:06.723604 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Aug 19 08:16:06.729097 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Aug 19 08:16:06.729286 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Aug 19 08:16:06.735797 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Aug 19 08:16:06.743202 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Aug 19 08:16:06.743346 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Aug 19 08:16:06.759129 ignition[1148]: INFO : Ignition 2.21.0 Aug 19 08:16:06.759129 ignition[1148]: INFO : Stage: umount Aug 19 08:16:06.759129 ignition[1148]: INFO : no configs at "/usr/lib/ignition/base.d" Aug 19 08:16:06.759129 ignition[1148]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Aug 19 08:16:06.759129 ignition[1148]: INFO : umount: umount passed Aug 19 08:16:06.759129 ignition[1148]: INFO : Ignition finished successfully Aug 19 08:16:06.752240 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Aug 19 08:16:06.752346 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Aug 19 08:16:06.757820 systemd[1]: ignition-mount.service: Deactivated successfully. 
Aug 19 08:16:06.757909 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Aug 19 08:16:06.765018 systemd[1]: initrd-cleanup.service: Deactivated successfully. Aug 19 08:16:06.765136 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Aug 19 08:16:06.773948 systemd[1]: ignition-disks.service: Deactivated successfully. Aug 19 08:16:06.774017 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Aug 19 08:16:06.776124 systemd[1]: ignition-kargs.service: Deactivated successfully. Aug 19 08:16:06.776165 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Aug 19 08:16:06.779102 systemd[1]: ignition-fetch.service: Deactivated successfully. Aug 19 08:16:06.779137 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Aug 19 08:16:06.782108 systemd[1]: Stopped target network.target - Network. Aug 19 08:16:06.785084 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Aug 19 08:16:06.785131 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Aug 19 08:16:06.788109 systemd[1]: Stopped target paths.target - Path Units. Aug 19 08:16:06.789895 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Aug 19 08:16:06.790644 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Aug 19 08:16:06.794172 systemd[1]: Stopped target slices.target - Slice Units. Aug 19 08:16:06.797230 systemd[1]: Stopped target sockets.target - Socket Units. Aug 19 08:16:06.800692 systemd[1]: iscsid.socket: Deactivated successfully. Aug 19 08:16:06.800733 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Aug 19 08:16:06.803505 systemd[1]: iscsiuio.socket: Deactivated successfully. Aug 19 08:16:06.803533 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Aug 19 08:16:06.806842 systemd[1]: ignition-setup.service: Deactivated successfully. 
Aug 19 08:16:06.806886 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Aug 19 08:16:06.810271 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Aug 19 08:16:06.810304 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Aug 19 08:16:06.816202 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Aug 19 08:16:06.818715 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Aug 19 08:16:06.822068 systemd[1]: sysroot-boot.mount: Deactivated successfully. Aug 19 08:16:06.829324 systemd[1]: systemd-resolved.service: Deactivated successfully. Aug 19 08:16:06.829404 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Aug 19 08:16:06.835526 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Aug 19 08:16:06.835696 systemd[1]: systemd-networkd.service: Deactivated successfully. Aug 19 08:16:06.835770 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Aug 19 08:16:06.839500 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Aug 19 08:16:06.840051 systemd[1]: Stopped target network-pre.target - Preparation for Network. Aug 19 08:16:06.895287 kernel: hv_netvsc f8615163-0000-1000-2000-7c1e52fada84 eth0: Data path switched from VF: enP30832s1 Aug 19 08:16:06.896580 kernel: mana 7870:00:00.0 enP30832s1: Configured steering vPort 0 entries 64 Aug 19 08:16:06.842324 systemd[1]: systemd-networkd.socket: Deactivated successfully. Aug 19 08:16:06.842360 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Aug 19 08:16:06.844268 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Aug 19 08:16:06.848123 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Aug 19 08:16:06.848990 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. 
Aug 19 08:16:06.853449 systemd[1]: systemd-sysctl.service: Deactivated successfully. Aug 19 08:16:06.853495 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Aug 19 08:16:06.860597 systemd[1]: systemd-modules-load.service: Deactivated successfully. Aug 19 08:16:06.860638 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Aug 19 08:16:06.865337 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Aug 19 08:16:06.865672 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Aug 19 08:16:06.869246 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Aug 19 08:16:06.874598 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Aug 19 08:16:06.874644 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Aug 19 08:16:06.880226 systemd[1]: systemd-udevd.service: Deactivated successfully. Aug 19 08:16:06.885419 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Aug 19 08:16:06.890149 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Aug 19 08:16:06.890207 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Aug 19 08:16:06.895148 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Aug 19 08:16:06.895180 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Aug 19 08:16:06.897645 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Aug 19 08:16:06.897688 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Aug 19 08:16:06.899947 systemd[1]: dracut-cmdline.service: Deactivated successfully. Aug 19 08:16:06.899986 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Aug 19 08:16:06.902128 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. 
Aug 19 08:16:06.902164 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Aug 19 08:16:06.909597 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Aug 19 08:16:06.914097 systemd[1]: systemd-network-generator.service: Deactivated successfully. Aug 19 08:16:06.914492 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Aug 19 08:16:06.918941 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Aug 19 08:16:06.918978 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Aug 19 08:16:06.924224 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Aug 19 08:16:06.924275 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Aug 19 08:16:06.928678 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. Aug 19 08:16:06.928722 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Aug 19 08:16:06.928754 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Aug 19 08:16:06.928991 systemd[1]: network-cleanup.service: Deactivated successfully. Aug 19 08:16:06.929084 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Aug 19 08:16:06.933458 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Aug 19 08:16:06.933530 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Aug 19 08:16:07.088185 systemd[1]: sysroot-boot.service: Deactivated successfully. Aug 19 08:16:07.088288 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Aug 19 08:16:07.090636 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Aug 19 08:16:07.094099 systemd[1]: initrd-setup-root.service: Deactivated successfully. 
Aug 19 08:16:07.094147 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Aug 19 08:16:07.099854 systemd[1]: Starting initrd-switch-root.service - Switch Root... Aug 19 08:16:07.111322 systemd[1]: Switching root. Aug 19 08:16:07.175279 systemd-journald[205]: Journal stopped Aug 19 08:16:11.043690 systemd-journald[205]: Received SIGTERM from PID 1 (systemd). Aug 19 08:16:11.043724 kernel: SELinux: policy capability network_peer_controls=1 Aug 19 08:16:11.043738 kernel: SELinux: policy capability open_perms=1 Aug 19 08:16:11.043748 kernel: SELinux: policy capability extended_socket_class=1 Aug 19 08:16:11.043757 kernel: SELinux: policy capability always_check_network=0 Aug 19 08:16:11.043766 kernel: SELinux: policy capability cgroup_seclabel=1 Aug 19 08:16:11.043776 kernel: SELinux: policy capability nnp_nosuid_transition=1 Aug 19 08:16:11.043787 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Aug 19 08:16:11.043796 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Aug 19 08:16:11.043805 kernel: SELinux: policy capability userspace_initial_context=0 Aug 19 08:16:11.043814 kernel: audit: type=1403 audit(1755591368.131:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Aug 19 08:16:11.043824 systemd[1]: Successfully loaded SELinux policy in 124.916ms. Aug 19 08:16:11.043835 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 3.622ms. Aug 19 08:16:11.043847 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Aug 19 08:16:11.043860 systemd[1]: Detected virtualization microsoft. Aug 19 08:16:11.043870 systemd[1]: Detected architecture x86-64. Aug 19 08:16:11.043880 systemd[1]: Detected first boot. 
Aug 19 08:16:11.043891 systemd[1]: Hostname set to . Aug 19 08:16:11.043936 systemd[1]: Initializing machine ID from random generator. Aug 19 08:16:11.043947 zram_generator::config[1191]: No configuration found. Aug 19 08:16:11.043958 kernel: Guest personality initialized and is inactive Aug 19 08:16:11.043968 kernel: VMCI host device registered (name=vmci, major=10, minor=124) Aug 19 08:16:11.043978 kernel: Initialized host personality Aug 19 08:16:11.043987 kernel: NET: Registered PF_VSOCK protocol family Aug 19 08:16:11.043997 systemd[1]: Populated /etc with preset unit settings. Aug 19 08:16:11.044012 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Aug 19 08:16:11.044023 systemd[1]: initrd-switch-root.service: Deactivated successfully. Aug 19 08:16:11.044033 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Aug 19 08:16:11.044066 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Aug 19 08:16:11.044076 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Aug 19 08:16:11.044086 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Aug 19 08:16:11.044095 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Aug 19 08:16:11.044105 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Aug 19 08:16:11.044117 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Aug 19 08:16:11.044126 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Aug 19 08:16:11.044136 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Aug 19 08:16:11.044146 systemd[1]: Created slice user.slice - User and Session Slice. Aug 19 08:16:11.044155 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. 
Aug 19 08:16:11.044164 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Aug 19 08:16:11.044173 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Aug 19 08:16:11.044186 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Aug 19 08:16:11.044198 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Aug 19 08:16:11.044208 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Aug 19 08:16:11.044219 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Aug 19 08:16:11.044228 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Aug 19 08:16:11.044606 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Aug 19 08:16:11.046512 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Aug 19 08:16:11.046526 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Aug 19 08:16:11.046541 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Aug 19 08:16:11.046553 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Aug 19 08:16:11.046563 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Aug 19 08:16:11.046574 systemd[1]: Reached target remote-fs.target - Remote File Systems. Aug 19 08:16:11.046585 systemd[1]: Reached target slices.target - Slice Units. Aug 19 08:16:11.046596 systemd[1]: Reached target swap.target - Swaps. Aug 19 08:16:11.046607 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Aug 19 08:16:11.046618 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Aug 19 08:16:11.046632 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. 
Aug 19 08:16:11.046643 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Aug 19 08:16:11.046654 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Aug 19 08:16:11.046665 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Aug 19 08:16:11.046676 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Aug 19 08:16:11.046689 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Aug 19 08:16:11.046700 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Aug 19 08:16:11.046712 systemd[1]: Mounting media.mount - External Media Directory... Aug 19 08:16:11.046723 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Aug 19 08:16:11.046735 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Aug 19 08:16:11.046746 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Aug 19 08:16:11.046757 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Aug 19 08:16:11.046769 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Aug 19 08:16:11.046780 systemd[1]: Reached target machines.target - Containers. Aug 19 08:16:11.046793 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Aug 19 08:16:11.046805 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Aug 19 08:16:11.046816 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Aug 19 08:16:11.046827 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Aug 19 08:16:11.046838 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... 
Aug 19 08:16:11.046849 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Aug 19 08:16:11.046860 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Aug 19 08:16:11.046871 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Aug 19 08:16:11.046883 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Aug 19 08:16:11.046895 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Aug 19 08:16:11.046905 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Aug 19 08:16:11.046916 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Aug 19 08:16:11.046926 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Aug 19 08:16:11.046937 systemd[1]: Stopped systemd-fsck-usr.service. Aug 19 08:16:11.046948 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Aug 19 08:16:11.046959 systemd[1]: Starting systemd-journald.service - Journal Service... Aug 19 08:16:11.046971 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Aug 19 08:16:11.046982 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Aug 19 08:16:11.046993 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Aug 19 08:16:11.047004 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Aug 19 08:16:11.047014 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Aug 19 08:16:11.047024 systemd[1]: verity-setup.service: Deactivated successfully. Aug 19 08:16:11.047076 systemd[1]: Stopped verity-setup.service. 
Aug 19 08:16:11.047089 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Aug 19 08:16:11.047102 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Aug 19 08:16:11.047113 kernel: loop: module loaded Aug 19 08:16:11.047124 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Aug 19 08:16:11.047134 systemd[1]: Mounted media.mount - External Media Directory. Aug 19 08:16:11.047145 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Aug 19 08:16:11.047156 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Aug 19 08:16:11.047167 kernel: fuse: init (API version 7.41) Aug 19 08:16:11.047204 systemd-journald[1291]: Collecting audit messages is disabled. Aug 19 08:16:11.047233 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Aug 19 08:16:11.047244 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Aug 19 08:16:11.047255 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Aug 19 08:16:11.047267 systemd-journald[1291]: Journal started Aug 19 08:16:11.047293 systemd-journald[1291]: Runtime Journal (/run/log/journal/0f8c814771e74c7993b57c293e663374) is 8M, max 158.9M, 150.9M free. Aug 19 08:16:10.615284 systemd[1]: Queued start job for default target multi-user.target. Aug 19 08:16:10.626444 systemd[1]: Unnecessary job was removed for dev-nvme0n1p6.device - /dev/nvme0n1p6. Aug 19 08:16:10.626757 systemd[1]: systemd-journald.service: Deactivated successfully. Aug 19 08:16:11.050312 systemd[1]: Started systemd-journald.service - Journal Service. Aug 19 08:16:11.053571 systemd[1]: modprobe@configfs.service: Deactivated successfully. Aug 19 08:16:11.053736 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Aug 19 08:16:11.055628 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. 
Aug 19 08:16:11.055787 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Aug 19 08:16:11.059284 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Aug 19 08:16:11.059426 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Aug 19 08:16:11.063279 systemd[1]: modprobe@fuse.service: Deactivated successfully. Aug 19 08:16:11.063418 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Aug 19 08:16:11.065632 systemd[1]: modprobe@loop.service: Deactivated successfully. Aug 19 08:16:11.065796 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Aug 19 08:16:11.067458 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Aug 19 08:16:11.070284 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Aug 19 08:16:11.071960 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Aug 19 08:16:11.078916 systemd[1]: Reached target network-pre.target - Preparation for Network. Aug 19 08:16:11.085911 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Aug 19 08:16:11.090950 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Aug 19 08:16:11.095138 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Aug 19 08:16:11.095172 systemd[1]: Reached target local-fs.target - Local File Systems. Aug 19 08:16:11.098865 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Aug 19 08:16:11.104198 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Aug 19 08:16:11.107481 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Aug 19 08:16:11.116373 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... 
Aug 19 08:16:11.118925 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Aug 19 08:16:11.120845 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Aug 19 08:16:11.122200 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Aug 19 08:16:11.123991 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Aug 19 08:16:11.128188 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Aug 19 08:16:11.136474 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Aug 19 08:16:11.142158 systemd[1]: Starting systemd-sysusers.service - Create System Users... Aug 19 08:16:11.147363 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Aug 19 08:16:11.149225 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Aug 19 08:16:11.152234 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Aug 19 08:16:11.153997 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Aug 19 08:16:11.165189 systemd-journald[1291]: Time spent on flushing to /var/log/journal/0f8c814771e74c7993b57c293e663374 is 41.170ms for 980 entries. Aug 19 08:16:11.165189 systemd-journald[1291]: System Journal (/var/log/journal/0f8c814771e74c7993b57c293e663374) is 11.8M, max 2.6G, 2.6G free. Aug 19 08:16:11.244195 systemd-journald[1291]: Received client request to flush runtime journal. Aug 19 08:16:11.244230 systemd-journald[1291]: /var/log/journal/0f8c814771e74c7993b57c293e663374/system.journal: Realtime clock jumped backwards relative to last journal entry, rotating. 
Aug 19 08:16:11.244247 kernel: loop0: detected capacity change from 0 to 29256 Aug 19 08:16:11.244258 systemd-journald[1291]: Rotating system journal. Aug 19 08:16:11.244273 kernel: ACPI: bus type drm_connector registered Aug 19 08:16:11.201650 systemd[1]: modprobe@drm.service: Deactivated successfully. Aug 19 08:16:11.201777 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Aug 19 08:16:11.226918 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Aug 19 08:16:11.244867 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Aug 19 08:16:11.274402 systemd[1]: Finished systemd-sysusers.service - Create System Users. Aug 19 08:16:11.277299 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Aug 19 08:16:11.313460 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Aug 19 08:16:11.315991 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Aug 19 08:16:11.320570 systemd-tmpfiles[1345]: ACLs are not supported, ignoring. Aug 19 08:16:11.320817 systemd-tmpfiles[1345]: ACLs are not supported, ignoring. Aug 19 08:16:11.321164 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Aug 19 08:16:11.325372 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Aug 19 08:16:11.408951 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Aug 19 08:16:11.551068 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Aug 19 08:16:11.597056 kernel: loop1: detected capacity change from 0 to 221472 Aug 19 08:16:11.627857 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Aug 19 08:16:11.636066 kernel: loop2: detected capacity change from 0 to 128016 Aug 19 08:16:12.083505 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. 
Aug 19 08:16:12.087361 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Aug 19 08:16:12.113287 systemd-udevd[1356]: Using default interface naming scheme 'v255'.
Aug 19 08:16:12.116059 kernel: loop3: detected capacity change from 0 to 111000
Aug 19 08:16:12.299352 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Aug 19 08:16:12.306308 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Aug 19 08:16:12.371535 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Aug 19 08:16:12.389738 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Aug 19 08:16:12.447104 kernel: hv_vmbus: registering driver hyperv_fb
Aug 19 08:16:12.463466 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Aug 19 08:16:12.468056 kernel: hyperv_fb: Synthvid Version major 3, minor 5
Aug 19 08:16:12.468166 kernel: hv_vmbus: registering driver hv_balloon
Aug 19 08:16:12.471095 kernel: hyperv_fb: Screen resolution: 1024x768, Color depth: 32, Frame buffer size: 8388608
Aug 19 08:16:12.474385 kernel: Console: switching to colour dummy device 80x25
Aug 19 08:16:12.475065 kernel: loop4: detected capacity change from 0 to 29256
Aug 19 08:16:12.477054 kernel: hv_balloon: Using Dynamic Memory protocol version 2.0
Aug 19 08:16:12.484116 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#233 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
Aug 19 08:16:12.486567 kernel: Console: switching to colour frame buffer device 128x48
Aug 19 08:16:12.489160 kernel: mousedev: PS/2 mouse device common for all mice
Aug 19 08:16:12.518054 kernel: loop5: detected capacity change from 0 to 221472
Aug 19 08:16:12.539077 kernel: loop6: detected capacity change from 0 to 128016
Aug 19 08:16:12.555123 kernel: loop7: detected capacity change from 0 to 111000
Aug 19 08:16:12.571228 (sd-merge)[1402]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-azure'.
Aug 19 08:16:12.576086 (sd-merge)[1402]: Merged extensions into '/usr'.
Aug 19 08:16:12.588110 systemd[1]: Reload requested from client PID 1330 ('systemd-sysext') (unit systemd-sysext.service)...
Aug 19 08:16:12.588124 systemd[1]: Reloading...
Aug 19 08:16:12.710100 zram_generator::config[1457]: No configuration found.
Aug 19 08:16:12.727984 systemd-networkd[1370]: lo: Link UP
Aug 19 08:16:12.733742 systemd-networkd[1370]: lo: Gained carrier
Aug 19 08:16:12.738003 systemd-networkd[1370]: Enumeration completed
Aug 19 08:16:12.739595 systemd-networkd[1370]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Aug 19 08:16:12.739846 systemd-networkd[1370]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Aug 19 08:16:12.743075 kernel: mana 7870:00:00.0 enP30832s1: Configured vPort 0 PD 18 DB 16
Aug 19 08:16:12.750149 kernel: mana 7870:00:00.0 enP30832s1: Configured steering vPort 0 entries 64
Aug 19 08:16:12.755020 kernel: hv_netvsc f8615163-0000-1000-2000-7c1e52fada84 eth0: Data path switched to VF: enP30832s1
Aug 19 08:16:12.754487 systemd-networkd[1370]: enP30832s1: Link UP
Aug 19 08:16:12.754640 systemd-networkd[1370]: eth0: Link UP
Aug 19 08:16:12.754643 systemd-networkd[1370]: eth0: Gained carrier
Aug 19 08:16:12.754659 systemd-networkd[1370]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Aug 19 08:16:12.758251 systemd-networkd[1370]: enP30832s1: Gained carrier
Aug 19 08:16:12.773119 systemd-networkd[1370]: eth0: DHCPv4 address 10.200.8.40/24, gateway 10.200.8.1 acquired from 168.63.129.16
Aug 19 08:16:12.849341 kernel: kvm_intel: Using Hyper-V Enlightened VMCS
Aug 19 08:16:12.992547 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - MSFT NVMe Accelerator v1.0 OEM.
Aug 19 08:16:12.994639 systemd[1]: Reloading finished in 406 ms.
Aug 19 08:16:13.019502 systemd[1]: Started systemd-networkd.service - Network Configuration.
Aug 19 08:16:13.023375 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Aug 19 08:16:13.058782 systemd[1]: Starting ensure-sysext.service...
Aug 19 08:16:13.062494 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Aug 19 08:16:13.065654 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Aug 19 08:16:13.068907 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Aug 19 08:16:13.072644 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Aug 19 08:16:13.078115 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Aug 19 08:16:13.090125 systemd[1]: Reload requested from client PID 1528 ('systemctl') (unit ensure-sysext.service)...
Aug 19 08:16:13.090307 systemd[1]: Reloading...
Aug 19 08:16:13.091714 systemd-tmpfiles[1532]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Aug 19 08:16:13.091736 systemd-tmpfiles[1532]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Aug 19 08:16:13.091931 systemd-tmpfiles[1532]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Aug 19 08:16:13.092154 systemd-tmpfiles[1532]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Aug 19 08:16:13.092789 systemd-tmpfiles[1532]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Aug 19 08:16:13.093020 systemd-tmpfiles[1532]: ACLs are not supported, ignoring.
Aug 19 08:16:13.096225 systemd-tmpfiles[1532]: ACLs are not supported, ignoring.
Aug 19 08:16:13.115371 systemd-tmpfiles[1532]: Detected autofs mount point /boot during canonicalization of boot.
Aug 19 08:16:13.115532 systemd-tmpfiles[1532]: Skipping /boot
Aug 19 08:16:13.128529 systemd-tmpfiles[1532]: Detected autofs mount point /boot during canonicalization of boot.
Aug 19 08:16:13.128609 systemd-tmpfiles[1532]: Skipping /boot
Aug 19 08:16:13.164056 zram_generator::config[1564]: No configuration found.
Aug 19 08:16:13.325242 systemd[1]: Reloading finished in 234 ms.
Aug 19 08:16:13.354197 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Aug 19 08:16:13.354945 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Aug 19 08:16:13.355326 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Aug 19 08:16:13.363159 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Aug 19 08:16:13.374155 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Aug 19 08:16:13.378170 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Aug 19 08:16:13.384816 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Aug 19 08:16:13.386154 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Aug 19 08:16:13.394956 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Aug 19 08:16:13.395442 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Aug 19 08:16:13.397500 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Aug 19 08:16:13.401111 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Aug 19 08:16:13.404122 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Aug 19 08:16:13.406416 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Aug 19 08:16:13.406533 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Aug 19 08:16:13.406626 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Aug 19 08:16:13.411711 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Aug 19 08:16:13.411874 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Aug 19 08:16:13.412019 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Aug 19 08:16:13.412146 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Aug 19 08:16:13.412233 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Aug 19 08:16:13.422576 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Aug 19 08:16:13.422877 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Aug 19 08:16:13.431155 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Aug 19 08:16:13.433199 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Aug 19 08:16:13.433486 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Aug 19 08:16:13.433749 systemd[1]: Reached target time-set.target - System Time Set.
Aug 19 08:16:13.435942 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Aug 19 08:16:13.440342 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Aug 19 08:16:13.444258 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Aug 19 08:16:13.446666 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Aug 19 08:16:13.447243 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Aug 19 08:16:13.451210 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Aug 19 08:16:13.458250 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Aug 19 08:16:13.461977 systemd[1]: modprobe@loop.service: Deactivated successfully.
Aug 19 08:16:13.462242 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Aug 19 08:16:13.462903 systemd[1]: modprobe@drm.service: Deactivated successfully.
Aug 19 08:16:13.463021 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Aug 19 08:16:13.470935 systemd[1]: Finished ensure-sysext.service.
Aug 19 08:16:13.475190 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Aug 19 08:16:13.475238 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Aug 19 08:16:13.482141 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Aug 19 08:16:13.523393 augenrules[1670]: No rules
Aug 19 08:16:13.524124 systemd[1]: audit-rules.service: Deactivated successfully.
Aug 19 08:16:13.524287 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Aug 19 08:16:13.555993 systemd-resolved[1633]: Positive Trust Anchors:
Aug 19 08:16:13.556004 systemd-resolved[1633]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Aug 19 08:16:13.556032 systemd-resolved[1633]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Aug 19 08:16:13.559415 systemd-resolved[1633]: Using system hostname 'ci-4426.0.0-a-5588c1b4cf'.
Aug 19 08:16:13.560385 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Aug 19 08:16:13.563130 systemd[1]: Reached target network.target - Network.
Aug 19 08:16:13.563782 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Aug 19 08:16:13.868986 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Aug 19 08:16:13.873249 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Aug 19 08:16:14.725282 systemd-networkd[1370]: eth0: Gained IPv6LL
Aug 19 08:16:14.727029 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Aug 19 08:16:14.730798 systemd[1]: Reached target network-online.target - Network is Online.
Aug 19 08:16:15.871581 ldconfig[1325]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Aug 19 08:16:15.895907 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Aug 19 08:16:15.900209 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Aug 19 08:16:15.920459 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Aug 19 08:16:15.921693 systemd[1]: Reached target sysinit.target - System Initialization.
Aug 19 08:16:15.922932 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Aug 19 08:16:15.925104 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Aug 19 08:16:15.928075 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer.
Aug 19 08:16:15.930169 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Aug 19 08:16:15.933123 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Aug 19 08:16:15.936096 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Aug 19 08:16:15.939085 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Aug 19 08:16:15.939116 systemd[1]: Reached target paths.target - Path Units.
Aug 19 08:16:15.942073 systemd[1]: Reached target timers.target - Timer Units.
Aug 19 08:16:15.944962 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Aug 19 08:16:15.949058 systemd[1]: Starting docker.socket - Docker Socket for the API...
Aug 19 08:16:15.953556 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Aug 19 08:16:15.957195 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Aug 19 08:16:15.958931 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Aug 19 08:16:15.968453 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Aug 19 08:16:15.970211 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Aug 19 08:16:15.973615 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Aug 19 08:16:15.977737 systemd[1]: Reached target sockets.target - Socket Units.
Aug 19 08:16:15.978893 systemd[1]: Reached target basic.target - Basic System.
Aug 19 08:16:15.980301 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Aug 19 08:16:15.980319 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Aug 19 08:16:15.982064 systemd[1]: Starting chronyd.service - NTP client/server...
Aug 19 08:16:15.985818 systemd[1]: Starting containerd.service - containerd container runtime...
Aug 19 08:16:15.990204 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Aug 19 08:16:15.995156 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Aug 19 08:16:15.999278 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Aug 19 08:16:16.005124 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Aug 19 08:16:16.008618 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Aug 19 08:16:16.011160 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Aug 19 08:16:16.013226 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh...
Aug 19 08:16:16.015096 systemd[1]: hv_fcopy_uio_daemon.service - Hyper-V FCOPY UIO daemon was skipped because of an unmet condition check (ConditionPathExists=/sys/bus/vmbus/devices/eb765408-105f-49b6-b4aa-c123b64d17d4/uio).
Aug 19 08:16:16.017203 systemd[1]: Started hv_kvp_daemon.service - Hyper-V KVP daemon.
Aug 19 08:16:16.019317 systemd[1]: hv_vss_daemon.service - Hyper-V VSS daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/vmbus/hv_vss).
Aug 19 08:16:16.023802 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Aug 19 08:16:16.031600 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Aug 19 08:16:16.035312 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Aug 19 08:16:16.039192 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Aug 19 08:16:16.045217 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Aug 19 08:16:16.045572 jq[1688]: false
Aug 19 08:16:16.049933 KVP[1694]: KVP starting; pid is:1694
Aug 19 08:16:16.055818 google_oslogin_nss_cache[1693]: oslogin_cache_refresh[1693]: Refreshing passwd entry cache
Aug 19 08:16:16.056026 oslogin_cache_refresh[1693]: Refreshing passwd entry cache
Aug 19 08:16:16.061533 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Aug 19 08:16:16.063095 KVP[1694]: KVP LIC Version: 3.1
Aug 19 08:16:16.064095 kernel: hv_utils: KVP IC version 4.0
Aug 19 08:16:16.067291 systemd[1]: Starting systemd-logind.service - User Login Management...
Aug 19 08:16:16.069766 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Aug 19 08:16:16.070166 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Aug 19 08:16:16.072264 systemd[1]: Starting update-engine.service - Update Engine...
Aug 19 08:16:16.081452 extend-filesystems[1691]: Found /dev/nvme0n1p6
Aug 19 08:16:16.084420 google_oslogin_nss_cache[1693]: oslogin_cache_refresh[1693]: Failure getting users, quitting
Aug 19 08:16:16.084420 google_oslogin_nss_cache[1693]: oslogin_cache_refresh[1693]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Aug 19 08:16:16.084420 google_oslogin_nss_cache[1693]: oslogin_cache_refresh[1693]: Refreshing group entry cache
Aug 19 08:16:16.084079 oslogin_cache_refresh[1693]: Failure getting users, quitting
Aug 19 08:16:16.084093 oslogin_cache_refresh[1693]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Aug 19 08:16:16.084123 oslogin_cache_refresh[1693]: Refreshing group entry cache
Aug 19 08:16:16.084695 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Aug 19 08:16:16.089376 chronyd[1683]: chronyd version 4.7 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG)
Aug 19 08:16:16.096332 google_oslogin_nss_cache[1693]: oslogin_cache_refresh[1693]: Failure getting groups, quitting
Aug 19 08:16:16.096332 google_oslogin_nss_cache[1693]: oslogin_cache_refresh[1693]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Aug 19 08:16:16.089989 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Aug 19 08:16:16.094134 oslogin_cache_refresh[1693]: Failure getting groups, quitting
Aug 19 08:16:16.093367 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Aug 19 08:16:16.094143 oslogin_cache_refresh[1693]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Aug 19 08:16:16.093541 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Aug 19 08:16:16.095271 systemd[1]: google-oslogin-cache.service: Deactivated successfully.
Aug 19 08:16:16.095442 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh.
Aug 19 08:16:16.109937 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Aug 19 08:16:16.110200 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Aug 19 08:16:16.113131 chronyd[1683]: Timezone right/UTC failed leap second check, ignoring
Aug 19 08:16:16.113468 systemd[1]: Started chronyd.service - NTP client/server.
Aug 19 08:16:16.113252 chronyd[1683]: Loaded seccomp filter (level 2)
Aug 19 08:16:16.118094 extend-filesystems[1691]: Found /dev/nvme0n1p9
Aug 19 08:16:16.117784 systemd[1]: motdgen.service: Deactivated successfully.
Aug 19 08:16:16.120332 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Aug 19 08:16:16.124545 (ntainerd)[1722]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Aug 19 08:16:16.131944 jq[1714]: true
Aug 19 08:16:16.138606 extend-filesystems[1691]: Checking size of /dev/nvme0n1p9
Aug 19 08:16:16.153244 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Aug 19 08:16:16.154391 update_engine[1707]: I20250819 08:16:16.154280 1707 main.cc:92] Flatcar Update Engine starting
Aug 19 08:16:16.161011 jq[1734]: true
Aug 19 08:16:16.171094 extend-filesystems[1691]: Old size kept for /dev/nvme0n1p9
Aug 19 08:16:16.170167 systemd[1]: extend-filesystems.service: Deactivated successfully.
Aug 19 08:16:16.174660 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Aug 19 08:16:16.205700 tar[1718]: linux-amd64/helm
Aug 19 08:16:16.266730 dbus-daemon[1686]: [system] SELinux support is enabled
Aug 19 08:16:16.266850 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Aug 19 08:16:16.274830 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Aug 19 08:16:16.274859 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Aug 19 08:16:16.278175 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Aug 19 08:16:16.278281 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Aug 19 08:16:16.288419 systemd[1]: Started update-engine.service - Update Engine.
Aug 19 08:16:16.291570 update_engine[1707]: I20250819 08:16:16.291333 1707 update_check_scheduler.cc:74] Next update check in 10m32s
Aug 19 08:16:16.300301 systemd-logind[1706]: New seat seat0.
Aug 19 08:16:16.302820 systemd-logind[1706]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Aug 19 08:16:16.303173 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Aug 19 08:16:16.306245 systemd[1]: Started systemd-logind.service - User Login Management.
Aug 19 08:16:16.375569 bash[1770]: Updated "/home/core/.ssh/authorized_keys"
Aug 19 08:16:16.377207 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Aug 19 08:16:16.379828 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
Aug 19 08:16:16.399342 coreos-metadata[1685]: Aug 19 08:16:16.399 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1
Aug 19 08:16:16.403756 coreos-metadata[1685]: Aug 19 08:16:16.403 INFO Fetch successful
Aug 19 08:16:16.403756 coreos-metadata[1685]: Aug 19 08:16:16.403 INFO Fetching http://168.63.129.16/machine/?comp=goalstate: Attempt #1
Aug 19 08:16:16.409010 coreos-metadata[1685]: Aug 19 08:16:16.408 INFO Fetch successful
Aug 19 08:16:16.409010 coreos-metadata[1685]: Aug 19 08:16:16.408 INFO Fetching http://168.63.129.16/machine/35cddf2e-eb07-4179-a155-d16a5538242d/9967b208%2D738b%2D43d7%2D8583%2D50d6918ccff7.%5Fci%2D4426.0.0%2Da%2D5588c1b4cf?comp=config&type=sharedConfig&incarnation=1: Attempt #1
Aug 19 08:16:16.412434 coreos-metadata[1685]: Aug 19 08:16:16.412 INFO Fetch successful
Aug 19 08:16:16.412434 coreos-metadata[1685]: Aug 19 08:16:16.412 INFO Fetching http://169.254.169.254/metadata/instance/compute/vmSize?api-version=2017-08-01&format=text: Attempt #1
Aug 19 08:16:16.424693 coreos-metadata[1685]: Aug 19 08:16:16.423 INFO Fetch successful
Aug 19 08:16:16.452584 sshd_keygen[1736]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Aug 19 08:16:16.489300 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Aug 19 08:16:16.492100 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Aug 19 08:16:16.499846 systemd[1]: Starting issuegen.service - Generate /run/issue...
Aug 19 08:16:16.503155 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Aug 19 08:16:16.506976 systemd[1]: Starting waagent.service - Microsoft Azure Linux Agent...
Aug 19 08:16:16.543645 systemd[1]: issuegen.service: Deactivated successfully.
Aug 19 08:16:16.543820 systemd[1]: Finished issuegen.service - Generate /run/issue.
Aug 19 08:16:16.552753 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Aug 19 08:16:16.571603 systemd[1]: Started waagent.service - Microsoft Azure Linux Agent.
Aug 19 08:16:16.601264 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Aug 19 08:16:16.606358 systemd[1]: Started getty@tty1.service - Getty on tty1.
Aug 19 08:16:16.614171 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
Aug 19 08:16:16.616164 systemd[1]: Reached target getty.target - Login Prompts.
Aug 19 08:16:16.638522 locksmithd[1780]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Aug 19 08:16:16.786632 tar[1718]: linux-amd64/LICENSE
Aug 19 08:16:16.786765 tar[1718]: linux-amd64/README.md
Aug 19 08:16:16.802521 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Aug 19 08:16:16.945364 containerd[1722]: time="2025-08-19T08:16:16Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Aug 19 08:16:16.946404 containerd[1722]: time="2025-08-19T08:16:16.945829961Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5
Aug 19 08:16:16.954307 containerd[1722]: time="2025-08-19T08:16:16.954276702Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="8.669µs"
Aug 19 08:16:16.954365 containerd[1722]: time="2025-08-19T08:16:16.954337740Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Aug 19 08:16:16.954365 containerd[1722]: time="2025-08-19T08:16:16.954359684Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Aug 19 08:16:16.954482 containerd[1722]: time="2025-08-19T08:16:16.954461407Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Aug 19 08:16:16.954482 containerd[1722]: time="2025-08-19T08:16:16.954478034Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Aug 19 08:16:16.954525 containerd[1722]: time="2025-08-19T08:16:16.954497353Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Aug 19 08:16:16.955433 containerd[1722]: time="2025-08-19T08:16:16.954549663Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Aug 19 08:16:16.955433 containerd[1722]: time="2025-08-19T08:16:16.954563005Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Aug 19 08:16:16.955433 containerd[1722]: time="2025-08-19T08:16:16.954761816Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Aug 19 08:16:16.955433 containerd[1722]: time="2025-08-19T08:16:16.954771703Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Aug 19 08:16:16.955433 containerd[1722]: time="2025-08-19T08:16:16.954781831Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Aug 19 08:16:16.955433 containerd[1722]: time="2025-08-19T08:16:16.954789643Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Aug 19 08:16:16.955433 containerd[1722]: time="2025-08-19T08:16:16.954846937Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Aug 19 08:16:16.955433 containerd[1722]: time="2025-08-19T08:16:16.954983663Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Aug 19 08:16:16.955433 containerd[1722]: time="2025-08-19T08:16:16.955001576Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Aug 19 08:16:16.955433 containerd[1722]: time="2025-08-19T08:16:16.955010772Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Aug 19 08:16:16.955433 containerd[1722]: time="2025-08-19T08:16:16.955065262Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Aug 19 08:16:16.955664 containerd[1722]: time="2025-08-19T08:16:16.955288434Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Aug 19 08:16:16.955664 containerd[1722]: time="2025-08-19T08:16:16.955330282Z" level=info msg="metadata content store policy set" policy=shared
Aug 19 08:16:16.969462 containerd[1722]: time="2025-08-19T08:16:16.969417645Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Aug 19 08:16:16.969687 containerd[1722]: time="2025-08-19T08:16:16.969598579Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Aug 19 08:16:16.969687 containerd[1722]: time="2025-08-19T08:16:16.969638599Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Aug 19 08:16:16.969687 containerd[1722]: time="2025-08-19T08:16:16.969651240Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Aug 19 08:16:16.969687 containerd[1722]: time="2025-08-19T08:16:16.969664643Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Aug 19 08:16:16.969865 containerd[1722]: time="2025-08-19T08:16:16.969675082Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Aug 19 08:16:16.969865 containerd[1722]: time="2025-08-19T08:16:16.969803420Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Aug 19 08:16:16.969865 containerd[1722]: time="2025-08-19T08:16:16.969815398Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Aug 19 08:16:16.969865 containerd[1722]: time="2025-08-19T08:16:16.969825779Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Aug 19 08:16:16.969865 containerd[1722]: time="2025-08-19T08:16:16.969834928Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Aug 19 08:16:16.969865 containerd[1722]: time="2025-08-19T08:16:16.969843556Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Aug 19 08:16:16.970128 containerd[1722]: time="2025-08-19T08:16:16.969855827Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Aug 19 08:16:16.970179 containerd[1722]: time="2025-08-19T08:16:16.970115823Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Aug 19 08:16:16.970225 containerd[1722]: time="2025-08-19T08:16:16.970215785Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Aug 19 08:16:16.970339 containerd[1722]: time="2025-08-19T08:16:16.970260719Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Aug 19 08:16:16.970339 containerd[1722]: time="2025-08-19T08:16:16.970271507Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Aug 19 08:16:16.970339 containerd[1722]: time="2025-08-19T08:16:16.970281026Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Aug 19 08:16:16.970339 containerd[1722]: time="2025-08-19T08:16:16.970297972Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Aug 19 08:16:16.970504 containerd[1722]: time="2025-08-19T08:16:16.970315283Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Aug 19 08:16:16.970504 containerd[1722]: time="2025-08-19T08:16:16.970440673Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Aug 19 08:16:16.970504 containerd[1722]: time="2025-08-19T08:16:16.970452129Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Aug 19 08:16:16.970504 containerd[1722]: time="2025-08-19T08:16:16.970462178Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Aug 19 08:16:16.970504 containerd[1722]: time="2025-08-19T08:16:16.970472227Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Aug 19 08:16:16.970688 containerd[1722]: time="2025-08-19T08:16:16.970653329Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Aug 19 08:16:16.970688 containerd[1722]: time="2025-08-19T08:16:16.970669369Z" level=info msg="Start snapshots syncer"
Aug 19 08:16:16.970810 containerd[1722]: time="2025-08-19T08:16:16.970753416Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Aug 19 08:16:16.972248 containerd[1722]: time="2025-08-19T08:16:16.971055583Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Aug 19 08:16:16.972248 containerd[1722]: time="2025-08-19T08:16:16.971108649Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Aug 19 08:16:16.972402 containerd[1722]: time="2025-08-19T08:16:16.971176226Z" level=info
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Aug 19 08:16:16.972402 containerd[1722]: time="2025-08-19T08:16:16.971264142Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Aug 19 08:16:16.972402 containerd[1722]: time="2025-08-19T08:16:16.971280859Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Aug 19 08:16:16.972402 containerd[1722]: time="2025-08-19T08:16:16.971290329Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Aug 19 08:16:16.972402 containerd[1722]: time="2025-08-19T08:16:16.971300582Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Aug 19 08:16:16.972402 containerd[1722]: time="2025-08-19T08:16:16.971311760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Aug 19 08:16:16.972402 containerd[1722]: time="2025-08-19T08:16:16.971322006Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Aug 19 08:16:16.972402 containerd[1722]: time="2025-08-19T08:16:16.971332125Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Aug 19 08:16:16.972402 containerd[1722]: time="2025-08-19T08:16:16.971353110Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Aug 19 08:16:16.972402 containerd[1722]: time="2025-08-19T08:16:16.971362395Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Aug 19 08:16:16.972402 containerd[1722]: time="2025-08-19T08:16:16.971371842Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Aug 19 08:16:16.972402 containerd[1722]: time="2025-08-19T08:16:16.971391141Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Aug 19 08:16:16.972402 containerd[1722]: time="2025-08-19T08:16:16.971402993Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Aug 19 08:16:16.972402 containerd[1722]: time="2025-08-19T08:16:16.971411676Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Aug 19 08:16:16.972623 containerd[1722]: time="2025-08-19T08:16:16.971421377Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Aug 19 08:16:16.972623 containerd[1722]: time="2025-08-19T08:16:16.971428809Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Aug 19 08:16:16.972623 containerd[1722]: time="2025-08-19T08:16:16.971438029Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Aug 19 08:16:16.972623 containerd[1722]: time="2025-08-19T08:16:16.971447514Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Aug 19 08:16:16.972623 containerd[1722]: time="2025-08-19T08:16:16.971462297Z" level=info msg="runtime interface created" Aug 19 08:16:16.972623 containerd[1722]: time="2025-08-19T08:16:16.971467460Z" level=info msg="created NRI interface" Aug 19 08:16:16.972623 containerd[1722]: time="2025-08-19T08:16:16.971475858Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Aug 19 08:16:16.972623 containerd[1722]: time="2025-08-19T08:16:16.971496399Z" level=info msg="Connect containerd service" Aug 19 08:16:16.972623 containerd[1722]: time="2025-08-19T08:16:16.971522279Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Aug 19 08:16:16.972623 
containerd[1722]: time="2025-08-19T08:16:16.972074200Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Aug 19 08:16:17.341479 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 08:16:17.348457 (kubelet)[1845]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 19 08:16:17.483063 containerd[1722]: time="2025-08-19T08:16:17.482798729Z" level=info msg="Start subscribing containerd event" Aug 19 08:16:17.483063 containerd[1722]: time="2025-08-19T08:16:17.482856556Z" level=info msg="Start recovering state" Aug 19 08:16:17.483063 containerd[1722]: time="2025-08-19T08:16:17.482943430Z" level=info msg="Start event monitor" Aug 19 08:16:17.483063 containerd[1722]: time="2025-08-19T08:16:17.482956001Z" level=info msg="Start cni network conf syncer for default" Aug 19 08:16:17.483063 containerd[1722]: time="2025-08-19T08:16:17.482964734Z" level=info msg="Start streaming server" Aug 19 08:16:17.483063 containerd[1722]: time="2025-08-19T08:16:17.482973025Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Aug 19 08:16:17.483063 containerd[1722]: time="2025-08-19T08:16:17.482980644Z" level=info msg="runtime interface starting up..." Aug 19 08:16:17.483063 containerd[1722]: time="2025-08-19T08:16:17.482987660Z" level=info msg="starting plugins..." Aug 19 08:16:17.483063 containerd[1722]: time="2025-08-19T08:16:17.482999333Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Aug 19 08:16:17.485229 containerd[1722]: time="2025-08-19T08:16:17.483487236Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Aug 19 08:16:17.485229 containerd[1722]: time="2025-08-19T08:16:17.483541250Z" level=info msg=serving... 
address=/run/containerd/containerd.sock Aug 19 08:16:17.485229 containerd[1722]: time="2025-08-19T08:16:17.485164796Z" level=info msg="containerd successfully booted in 0.540076s" Aug 19 08:16:17.483674 systemd[1]: Started containerd.service - containerd container runtime. Aug 19 08:16:17.486015 systemd[1]: Reached target multi-user.target - Multi-User System. Aug 19 08:16:17.487906 systemd[1]: Startup finished in 3.237s (kernel) + 31.593s (initrd) + 9.480s (userspace) = 44.310s. Aug 19 08:16:17.708182 login[1823]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Aug 19 08:16:17.710702 login[1824]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Aug 19 08:16:17.717703 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Aug 19 08:16:17.718727 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Aug 19 08:16:17.734604 systemd-logind[1706]: New session 1 of user core. Aug 19 08:16:17.738368 systemd-logind[1706]: New session 2 of user core. Aug 19 08:16:17.742569 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Aug 19 08:16:17.744773 systemd[1]: Starting user@500.service - User Manager for UID 500... Aug 19 08:16:17.754895 (systemd)[1860]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Aug 19 08:16:17.757403 systemd-logind[1706]: New session c1 of user core. Aug 19 08:16:17.920222 systemd[1860]: Queued start job for default target default.target. 
Aug 19 08:16:17.927258 kubelet[1845]: E0819 08:16:17.927224 1845 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 19 08:16:17.928763 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 19 08:16:17.928888 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 19 08:16:17.929365 systemd[1]: kubelet.service: Consumed 881ms CPU time, 263.6M memory peak. Aug 19 08:16:17.930150 systemd[1860]: Created slice app.slice - User Application Slice. Aug 19 08:16:17.930185 systemd[1860]: Reached target paths.target - Paths. Aug 19 08:16:17.930220 systemd[1860]: Reached target timers.target - Timers. Aug 19 08:16:17.931141 systemd[1860]: Starting dbus.socket - D-Bus User Message Bus Socket... Aug 19 08:16:17.939365 systemd[1860]: Listening on dbus.socket - D-Bus User Message Bus Socket. Aug 19 08:16:17.939483 systemd[1860]: Reached target sockets.target - Sockets. Aug 19 08:16:17.939568 systemd[1860]: Reached target basic.target - Basic System. Aug 19 08:16:17.939653 systemd[1]: Started user@500.service - User Manager for UID 500. Aug 19 08:16:17.939670 systemd[1860]: Reached target default.target - Main User Target. Aug 19 08:16:17.939692 systemd[1860]: Startup finished in 177ms. Aug 19 08:16:17.946155 systemd[1]: Started session-1.scope - Session 1 of User core. Aug 19 08:16:17.946737 systemd[1]: Started session-2.scope - Session 2 of User core. 
Aug 19 08:16:18.350431 waagent[1818]: 2025-08-19T08:16:18.350353Z INFO Daemon Daemon Azure Linux Agent Version: 2.12.0.4 Aug 19 08:16:18.352181 waagent[1818]: 2025-08-19T08:16:18.352136Z INFO Daemon Daemon OS: flatcar 4426.0.0 Aug 19 08:16:18.353348 waagent[1818]: 2025-08-19T08:16:18.353275Z INFO Daemon Daemon Python: 3.11.13 Aug 19 08:16:18.354649 waagent[1818]: 2025-08-19T08:16:18.354585Z INFO Daemon Daemon Run daemon Aug 19 08:16:18.355760 waagent[1818]: 2025-08-19T08:16:18.355730Z INFO Daemon Daemon No RDMA handler exists for distro='Flatcar Container Linux by Kinvolk' version='4426.0.0' Aug 19 08:16:18.355978 waagent[1818]: 2025-08-19T08:16:18.355954Z INFO Daemon Daemon Using waagent for provisioning Aug 19 08:16:18.358715 waagent[1818]: 2025-08-19T08:16:18.358687Z INFO Daemon Daemon Activate resource disk Aug 19 08:16:18.359315 waagent[1818]: 2025-08-19T08:16:18.359153Z INFO Daemon Daemon Searching gen1 prefix 00000000-0001 or gen2 f8b3781a-1e82-4818-a1c3-63d806ec15bb Aug 19 08:16:18.360878 waagent[1818]: 2025-08-19T08:16:18.360828Z INFO Daemon Daemon Found device: None Aug 19 08:16:18.360952 waagent[1818]: 2025-08-19T08:16:18.360935Z ERROR Daemon Daemon Failed to mount resource disk [ResourceDiskError] unable to detect disk topology Aug 19 08:16:18.361027 waagent[1818]: 2025-08-19T08:16:18.361003Z ERROR Daemon Daemon Event: name=WALinuxAgent, op=ActivateResourceDisk, message=[ResourceDiskError] unable to detect disk topology, duration=0 Aug 19 08:16:18.361476 waagent[1818]: 2025-08-19T08:16:18.361444Z INFO Daemon Daemon Clean protocol and wireserver endpoint Aug 19 08:16:18.361575 waagent[1818]: 2025-08-19T08:16:18.361552Z INFO Daemon Daemon Running default provisioning handler Aug 19 08:16:18.373646 waagent[1818]: 2025-08-19T08:16:18.373595Z INFO Daemon Daemon Unable to get cloud-init enabled status from systemctl: Command '['systemctl', 'is-enabled', 'cloud-init-local.service']' returned non-zero exit status 4. 
Aug 19 08:16:18.376204 waagent[1818]: 2025-08-19T08:16:18.376163Z INFO Daemon Daemon Unable to get cloud-init enabled status from service: [Errno 2] No such file or directory: 'service' Aug 19 08:16:18.376825 waagent[1818]: 2025-08-19T08:16:18.376581Z INFO Daemon Daemon cloud-init is enabled: False Aug 19 08:16:18.376825 waagent[1818]: 2025-08-19T08:16:18.376649Z INFO Daemon Daemon Copying ovf-env.xml Aug 19 08:16:18.453346 waagent[1818]: 2025-08-19T08:16:18.451622Z INFO Daemon Daemon Successfully mounted dvd Aug 19 08:16:18.462357 systemd[1]: mnt-cdrom-secure.mount: Deactivated successfully. Aug 19 08:16:18.464207 waagent[1818]: 2025-08-19T08:16:18.464162Z INFO Daemon Daemon Detect protocol endpoint Aug 19 08:16:18.464661 waagent[1818]: 2025-08-19T08:16:18.464632Z INFO Daemon Daemon Clean protocol and wireserver endpoint Aug 19 08:16:18.466519 waagent[1818]: 2025-08-19T08:16:18.466493Z INFO Daemon Daemon WireServer endpoint is not found. Rerun dhcp handler Aug 19 08:16:18.467282 waagent[1818]: 2025-08-19T08:16:18.467259Z INFO Daemon Daemon Test for route to 168.63.129.16 Aug 19 08:16:18.468228 waagent[1818]: 2025-08-19T08:16:18.468206Z INFO Daemon Daemon Route to 168.63.129.16 exists Aug 19 08:16:18.468292 waagent[1818]: 2025-08-19T08:16:18.468276Z INFO Daemon Daemon Wire server endpoint:168.63.129.16 Aug 19 08:16:18.479347 waagent[1818]: 2025-08-19T08:16:18.479317Z INFO Daemon Daemon Fabric preferred wire protocol version:2015-04-05 Aug 19 08:16:18.480494 waagent[1818]: 2025-08-19T08:16:18.479893Z INFO Daemon Daemon Wire protocol version:2012-11-30 Aug 19 08:16:18.480494 waagent[1818]: 2025-08-19T08:16:18.480052Z INFO Daemon Daemon Server preferred version:2015-04-05 Aug 19 08:16:18.544653 waagent[1818]: 2025-08-19T08:16:18.544599Z INFO Daemon Daemon Initializing goal state during protocol detection Aug 19 08:16:18.545916 waagent[1818]: 2025-08-19T08:16:18.545723Z INFO Daemon Daemon Forcing an update of the goal state. 
Aug 19 08:16:18.554811 waagent[1818]: 2025-08-19T08:16:18.554777Z INFO Daemon Fetched a new incarnation for the WireServer goal state [incarnation 1] Aug 19 08:16:18.572685 waagent[1818]: 2025-08-19T08:16:18.572658Z INFO Daemon Daemon HostGAPlugin version: 1.0.8.175 Aug 19 08:16:18.574046 waagent[1818]: 2025-08-19T08:16:18.574005Z INFO Daemon Aug 19 08:16:18.574429 waagent[1818]: 2025-08-19T08:16:18.574256Z INFO Daemon Fetched new vmSettings [HostGAPlugin correlation ID: bcf77315-afd9-4eae-b644-371e3fa490e5 eTag: 1949534227960534138 source: Fabric] Aug 19 08:16:18.576798 waagent[1818]: 2025-08-19T08:16:18.576772Z INFO Daemon The vmSettings originated via Fabric; will ignore them. Aug 19 08:16:18.578077 waagent[1818]: 2025-08-19T08:16:18.578052Z INFO Daemon Aug 19 08:16:18.578569 waagent[1818]: 2025-08-19T08:16:18.578236Z INFO Daemon Fetching full goal state from the WireServer [incarnation 1] Aug 19 08:16:18.583136 waagent[1818]: 2025-08-19T08:16:18.583113Z INFO Daemon Daemon Downloading artifacts profile blob Aug 19 08:16:18.669189 waagent[1818]: 2025-08-19T08:16:18.669115Z INFO Daemon Downloaded certificate {'thumbprint': '9FE5332B09D9CC2F6E8B26FAAE2A88C9835609A2', 'hasPrivateKey': True} Aug 19 08:16:18.671353 waagent[1818]: 2025-08-19T08:16:18.671322Z INFO Daemon Fetch goal state completed Aug 19 08:16:18.680522 waagent[1818]: 2025-08-19T08:16:18.680488Z INFO Daemon Daemon Starting provisioning Aug 19 08:16:18.681355 waagent[1818]: 2025-08-19T08:16:18.680961Z INFO Daemon Daemon Handle ovf-env.xml. 
Aug 19 08:16:18.682104 waagent[1818]: 2025-08-19T08:16:18.681704Z INFO Daemon Daemon Set hostname [ci-4426.0.0-a-5588c1b4cf] Aug 19 08:16:18.700239 waagent[1818]: 2025-08-19T08:16:18.700200Z INFO Daemon Daemon Publish hostname [ci-4426.0.0-a-5588c1b4cf] Aug 19 08:16:18.700801 waagent[1818]: 2025-08-19T08:16:18.700766Z INFO Daemon Daemon Examine /proc/net/route for primary interface Aug 19 08:16:18.702487 waagent[1818]: 2025-08-19T08:16:18.702461Z INFO Daemon Daemon Primary interface is [eth0] Aug 19 08:16:18.708955 systemd-networkd[1370]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Aug 19 08:16:18.708960 systemd-networkd[1370]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Aug 19 08:16:18.708981 systemd-networkd[1370]: eth0: DHCP lease lost Aug 19 08:16:18.709725 waagent[1818]: 2025-08-19T08:16:18.709682Z INFO Daemon Daemon Create user account if not exists Aug 19 08:16:18.710862 waagent[1818]: 2025-08-19T08:16:18.709973Z INFO Daemon Daemon User core already exists, skip useradd Aug 19 08:16:18.710862 waagent[1818]: 2025-08-19T08:16:18.710409Z INFO Daemon Daemon Configure sudoer Aug 19 08:16:18.715976 waagent[1818]: 2025-08-19T08:16:18.715932Z INFO Daemon Daemon Configure sshd Aug 19 08:16:18.720183 waagent[1818]: 2025-08-19T08:16:18.720146Z INFO Daemon Daemon Added a configuration snippet disabling SSH password-based authentication methods. It also configures SSH client probing to keep connections alive. Aug 19 08:16:18.720386 waagent[1818]: 2025-08-19T08:16:18.720362Z INFO Daemon Daemon Deploy ssh public key. 
Aug 19 08:16:18.743097 systemd-networkd[1370]: eth0: DHCPv4 address 10.200.8.40/24, gateway 10.200.8.1 acquired from 168.63.129.16 Aug 19 08:16:19.796011 waagent[1818]: 2025-08-19T08:16:19.795971Z INFO Daemon Daemon Provisioning complete Aug 19 08:16:19.805347 waagent[1818]: 2025-08-19T08:16:19.805315Z INFO Daemon Daemon RDMA capabilities are not enabled, skipping Aug 19 08:16:19.806220 waagent[1818]: 2025-08-19T08:16:19.805778Z INFO Daemon Daemon End of log to /dev/console. The agent will now check for updates and then will process extensions. Aug 19 08:16:19.806220 waagent[1818]: 2025-08-19T08:16:19.806019Z INFO Daemon Daemon Installed Agent WALinuxAgent-2.12.0.4 is the most current agent Aug 19 08:16:19.908800 waagent[1916]: 2025-08-19T08:16:19.908725Z INFO ExtHandler ExtHandler Azure Linux Agent (Goal State Agent version 2.12.0.4) Aug 19 08:16:19.909098 waagent[1916]: 2025-08-19T08:16:19.908822Z INFO ExtHandler ExtHandler OS: flatcar 4426.0.0 Aug 19 08:16:19.909098 waagent[1916]: 2025-08-19T08:16:19.908861Z INFO ExtHandler ExtHandler Python: 3.11.13 Aug 19 08:16:19.909098 waagent[1916]: 2025-08-19T08:16:19.908897Z INFO ExtHandler ExtHandler CPU Arch: x86_64 Aug 19 08:16:19.948588 waagent[1916]: 2025-08-19T08:16:19.948541Z INFO ExtHandler ExtHandler Distro: flatcar-4426.0.0; OSUtil: FlatcarUtil; AgentService: waagent; Python: 3.11.13; Arch: x86_64; systemd: True; LISDrivers: Absent; logrotate: logrotate 3.22.0; Aug 19 08:16:19.948717 waagent[1916]: 2025-08-19T08:16:19.948692Z INFO ExtHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Aug 19 08:16:19.948761 waagent[1916]: 2025-08-19T08:16:19.948743Z INFO ExtHandler ExtHandler Wire server endpoint:168.63.129.16 Aug 19 08:16:19.958811 waagent[1916]: 2025-08-19T08:16:19.958756Z INFO ExtHandler Fetched a new incarnation for the WireServer goal state [incarnation 1] Aug 19 08:16:19.966153 waagent[1916]: 2025-08-19T08:16:19.966121Z INFO ExtHandler ExtHandler HostGAPlugin version: 1.0.8.175 Aug 19 
08:16:19.966502 waagent[1916]: 2025-08-19T08:16:19.966469Z INFO ExtHandler Aug 19 08:16:19.966543 waagent[1916]: 2025-08-19T08:16:19.966522Z INFO ExtHandler Fetched new vmSettings [HostGAPlugin correlation ID: cc8cfeee-b6f9-498a-974f-4e45f5aafc8c eTag: 1949534227960534138 source: Fabric] Aug 19 08:16:19.966742 waagent[1916]: 2025-08-19T08:16:19.966716Z INFO ExtHandler The vmSettings originated via Fabric; will ignore them. Aug 19 08:16:19.967087 waagent[1916]: 2025-08-19T08:16:19.967057Z INFO ExtHandler Aug 19 08:16:19.967133 waagent[1916]: 2025-08-19T08:16:19.967099Z INFO ExtHandler Fetching full goal state from the WireServer [incarnation 1] Aug 19 08:16:19.969446 waagent[1916]: 2025-08-19T08:16:19.969420Z INFO ExtHandler ExtHandler Downloading artifacts profile blob Aug 19 08:16:20.048380 waagent[1916]: 2025-08-19T08:16:20.048309Z INFO ExtHandler Downloaded certificate {'thumbprint': '9FE5332B09D9CC2F6E8B26FAAE2A88C9835609A2', 'hasPrivateKey': True} Aug 19 08:16:20.048655 waagent[1916]: 2025-08-19T08:16:20.048630Z INFO ExtHandler Fetch goal state completed Aug 19 08:16:20.062811 waagent[1916]: 2025-08-19T08:16:20.062767Z INFO ExtHandler ExtHandler OpenSSL version: OpenSSL 3.4.1 11 Feb 2025 (Library: OpenSSL 3.4.1 11 Feb 2025) Aug 19 08:16:20.066441 waagent[1916]: 2025-08-19T08:16:20.066400Z INFO ExtHandler ExtHandler WALinuxAgent-2.12.0.4 running as process 1916 Aug 19 08:16:20.066541 waagent[1916]: 2025-08-19T08:16:20.066508Z INFO ExtHandler ExtHandler ******** AutoUpdate.Enabled is set to False, not processing the operation ******** Aug 19 08:16:20.066748 waagent[1916]: 2025-08-19T08:16:20.066730Z INFO ExtHandler ExtHandler ******** AutoUpdate.UpdateToLatestVersion is set to False, not processing the operation ******** Aug 19 08:16:20.067683 waagent[1916]: 2025-08-19T08:16:20.067652Z INFO ExtHandler ExtHandler [CGI] Cgroup monitoring is not supported on ['flatcar', '4426.0.0', '', 'Flatcar Container Linux by Kinvolk'] Aug 19 08:16:20.067950 waagent[1916]: 
2025-08-19T08:16:20.067928Z INFO ExtHandler ExtHandler [CGI] Agent will reset the quotas in case distro: ['flatcar', '4426.0.0', '', 'Flatcar Container Linux by Kinvolk'] went from supported to unsupported Aug 19 08:16:20.068101 waagent[1916]: 2025-08-19T08:16:20.068033Z INFO ExtHandler ExtHandler [CGI] Agent cgroups enabled: False Aug 19 08:16:20.068446 waagent[1916]: 2025-08-19T08:16:20.068427Z INFO ExtHandler ExtHandler Starting setup for Persistent firewall rules Aug 19 08:16:20.103022 waagent[1916]: 2025-08-19T08:16:20.102998Z INFO ExtHandler ExtHandler Firewalld service not running/unavailable, trying to set up waagent-network-setup.service Aug 19 08:16:20.103158 waagent[1916]: 2025-08-19T08:16:20.103137Z INFO ExtHandler ExtHandler Successfully updated the Binary file /var/lib/waagent/waagent-network-setup.py for firewall setup Aug 19 08:16:20.107996 waagent[1916]: 2025-08-19T08:16:20.107877Z INFO ExtHandler ExtHandler Service: waagent-network-setup.service not enabled. Adding it now Aug 19 08:16:20.112375 systemd[1]: Reload requested from client PID 1931 ('systemctl') (unit waagent.service)... Aug 19 08:16:20.112385 systemd[1]: Reloading... Aug 19 08:16:20.182071 zram_generator::config[1966]: No configuration found. Aug 19 08:16:20.350865 systemd[1]: Reloading finished in 238 ms. Aug 19 08:16:20.365030 waagent[1916]: 2025-08-19T08:16:20.364955Z INFO ExtHandler ExtHandler Successfully added and enabled the waagent-network-setup.service Aug 19 08:16:20.365322 waagent[1916]: 2025-08-19T08:16:20.365295Z INFO ExtHandler ExtHandler Persistent firewall rules setup successfully Aug 19 08:16:20.498057 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#226 cmd 0x4a status: scsi 0x0 srb 0x20 hv 0xc0000001 Aug 19 08:16:21.242600 waagent[1916]: 2025-08-19T08:16:21.242530Z INFO ExtHandler ExtHandler DROP rule is not available which implies no firewall rules are set yet. Environment thread will set it up. 
Aug 19 08:16:21.242911 waagent[1916]: 2025-08-19T08:16:21.242870Z INFO ExtHandler ExtHandler Checking if log collection is allowed at this time [False]. All three conditions must be met: 1. configuration enabled [True], 2. cgroups v1 enabled [False] OR cgroups v2 is in use and v2 resource limiting configuration enabled [False], 3. python supported: [True] Aug 19 08:16:21.243668 waagent[1916]: 2025-08-19T08:16:21.243612Z INFO ExtHandler ExtHandler Starting env monitor service. Aug 19 08:16:21.243740 waagent[1916]: 2025-08-19T08:16:21.243669Z INFO MonitorHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Aug 19 08:16:21.243770 waagent[1916]: 2025-08-19T08:16:21.243732Z INFO MonitorHandler ExtHandler Wire server endpoint:168.63.129.16 Aug 19 08:16:21.243920 waagent[1916]: 2025-08-19T08:16:21.243881Z INFO MonitorHandler ExtHandler Monitor.NetworkConfigurationChanges is disabled. Aug 19 08:16:21.244311 waagent[1916]: 2025-08-19T08:16:21.244264Z INFO ExtHandler ExtHandler Start SendTelemetryHandler service. 
Aug 19 08:16:21.244530 waagent[1916]: 2025-08-19T08:16:21.244502Z INFO MonitorHandler ExtHandler Routing table from /proc/net/route: Aug 19 08:16:21.244530 waagent[1916]: Iface Destination Gateway Flags RefCnt Use Metric Mask MTU Window IRTT Aug 19 08:16:21.244530 waagent[1916]: eth0 00000000 0108C80A 0003 0 0 1024 00000000 0 0 0 Aug 19 08:16:21.244530 waagent[1916]: eth0 0008C80A 00000000 0001 0 0 1024 00FFFFFF 0 0 0 Aug 19 08:16:21.244530 waagent[1916]: eth0 0108C80A 00000000 0005 0 0 1024 FFFFFFFF 0 0 0 Aug 19 08:16:21.244530 waagent[1916]: eth0 10813FA8 0108C80A 0007 0 0 1024 FFFFFFFF 0 0 0 Aug 19 08:16:21.244530 waagent[1916]: eth0 FEA9FEA9 0108C80A 0007 0 0 1024 FFFFFFFF 0 0 0 Aug 19 08:16:21.244662 waagent[1916]: 2025-08-19T08:16:21.244575Z INFO EnvHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Aug 19 08:16:21.244662 waagent[1916]: 2025-08-19T08:16:21.244619Z INFO EnvHandler ExtHandler Wire server endpoint:168.63.129.16 Aug 19 08:16:21.244738 waagent[1916]: 2025-08-19T08:16:21.244719Z INFO EnvHandler ExtHandler Configure routes Aug 19 08:16:21.244784 waagent[1916]: 2025-08-19T08:16:21.244767Z INFO EnvHandler ExtHandler Gateway:None Aug 19 08:16:21.244826 waagent[1916]: 2025-08-19T08:16:21.244808Z INFO EnvHandler ExtHandler Routes:None Aug 19 08:16:21.244903 waagent[1916]: 2025-08-19T08:16:21.244870Z INFO SendTelemetryHandler ExtHandler Successfully started the SendTelemetryHandler thread Aug 19 08:16:21.245169 waagent[1916]: 2025-08-19T08:16:21.245132Z INFO ExtHandler ExtHandler Start Extension Telemetry service. Aug 19 08:16:21.245455 waagent[1916]: 2025-08-19T08:16:21.245411Z INFO TelemetryEventsCollector ExtHandler Extension Telemetry pipeline enabled: True Aug 19 08:16:21.245489 waagent[1916]: 2025-08-19T08:16:21.245461Z INFO ExtHandler ExtHandler Goal State Period: 6 sec. This indicates how often the agent checks for new goal states and reports status. 
Aug 19 08:16:21.245936 waagent[1916]: 2025-08-19T08:16:21.245852Z INFO TelemetryEventsCollector ExtHandler Successfully started the TelemetryEventsCollector thread Aug 19 08:16:21.251722 waagent[1916]: 2025-08-19T08:16:21.251691Z INFO ExtHandler ExtHandler Aug 19 08:16:21.251787 waagent[1916]: 2025-08-19T08:16:21.251748Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState started [incarnation_1 channel: WireServer source: Fabric activity: 314732ea-d639-4db9-8827-5dc46a2d4c70 correlation 8758a42e-f2be-49f2-b979-2e737bc3ea02 created: 2025-08-19T08:14:39.549263Z] Aug 19 08:16:21.252020 waagent[1916]: 2025-08-19T08:16:21.251998Z INFO ExtHandler ExtHandler No extension handlers found, not processing anything. Aug 19 08:16:21.252414 waagent[1916]: 2025-08-19T08:16:21.252392Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState completed [incarnation_1 0 ms] Aug 19 08:16:21.294225 waagent[1916]: 2025-08-19T08:16:21.294178Z WARNING ExtHandler ExtHandler Failed to get firewall packets: 'iptables -w -t security -L OUTPUT --zero OUTPUT -nxv' failed: 2 (iptables v1.8.11 (nf_tables): Illegal option `--numeric' with this command Aug 19 08:16:21.294225 waagent[1916]: Try `iptables -h' or 'iptables --help' for more information.) 
Aug 19 08:16:21.294632 waagent[1916]: 2025-08-19T08:16:21.294608Z INFO ExtHandler ExtHandler [HEARTBEAT] Agent WALinuxAgent-2.12.0.4 is running as the goal state agent [DEBUG HeartbeatCounter: 0;HeartbeatId: 2D87330B-DAF6-42DD-8506-C331FC0DCC0D;DroppedPackets: -1;UpdateGSErrors: 0;AutoUpdate: 0;UpdateMode: SelfUpdate;] Aug 19 08:16:21.324226 waagent[1916]: 2025-08-19T08:16:21.324177Z INFO MonitorHandler ExtHandler Network interfaces: Aug 19 08:16:21.324226 waagent[1916]: Executing ['ip', '-a', '-o', 'link']: Aug 19 08:16:21.324226 waagent[1916]: 1: lo: mtu 65536 qdisc noqueue state UNKNOWN mode DEFAULT group default qlen 1000\ link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 Aug 19 08:16:21.324226 waagent[1916]: 2: eth0: mtu 1500 qdisc mq state UP mode DEFAULT group default qlen 1000\ link/ether 7c:1e:52:fa:da:84 brd ff:ff:ff:ff:ff:ff\ alias Network Device Aug 19 08:16:21.324226 waagent[1916]: 3: enP30832s1: mtu 1500 qdisc mq master eth0 state UP mode DEFAULT group default qlen 1000\ link/ether 7c:1e:52:fa:da:84 brd ff:ff:ff:ff:ff:ff\ altname enP30832p0s0 Aug 19 08:16:21.324226 waagent[1916]: Executing ['ip', '-4', '-a', '-o', 'address']: Aug 19 08:16:21.324226 waagent[1916]: 1: lo inet 127.0.0.1/8 scope host lo\ valid_lft forever preferred_lft forever Aug 19 08:16:21.324226 waagent[1916]: 2: eth0 inet 10.200.8.40/24 metric 1024 brd 10.200.8.255 scope global eth0\ valid_lft forever preferred_lft forever Aug 19 08:16:21.324226 waagent[1916]: Executing ['ip', '-6', '-a', '-o', 'address']: Aug 19 08:16:21.324226 waagent[1916]: 1: lo inet6 ::1/128 scope host noprefixroute \ valid_lft forever preferred_lft forever Aug 19 08:16:21.324226 waagent[1916]: 2: eth0 inet6 fe80::7e1e:52ff:fefa:da84/64 scope link proto kernel_ll \ valid_lft forever preferred_lft forever Aug 19 08:16:21.354522 waagent[1916]: 2025-08-19T08:16:21.354474Z INFO EnvHandler ExtHandler Created firewall rules for the Azure Fabric: Aug 19 08:16:21.354522 waagent[1916]: Chain INPUT (policy ACCEPT 0 
packets, 0 bytes) Aug 19 08:16:21.354522 waagent[1916]: pkts bytes target prot opt in out source destination Aug 19 08:16:21.354522 waagent[1916]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes) Aug 19 08:16:21.354522 waagent[1916]: pkts bytes target prot opt in out source destination Aug 19 08:16:21.354522 waagent[1916]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes) Aug 19 08:16:21.354522 waagent[1916]: pkts bytes target prot opt in out source destination Aug 19 08:16:21.354522 waagent[1916]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53 Aug 19 08:16:21.354522 waagent[1916]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0 Aug 19 08:16:21.354522 waagent[1916]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW Aug 19 08:16:21.357517 waagent[1916]: 2025-08-19T08:16:21.357467Z INFO EnvHandler ExtHandler Current Firewall rules: Aug 19 08:16:21.357517 waagent[1916]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes) Aug 19 08:16:21.357517 waagent[1916]: pkts bytes target prot opt in out source destination Aug 19 08:16:21.357517 waagent[1916]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes) Aug 19 08:16:21.357517 waagent[1916]: pkts bytes target prot opt in out source destination Aug 19 08:16:21.357517 waagent[1916]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes) Aug 19 08:16:21.357517 waagent[1916]: pkts bytes target prot opt in out source destination Aug 19 08:16:21.357517 waagent[1916]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53 Aug 19 08:16:21.357517 waagent[1916]: 4 406 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0 Aug 19 08:16:21.357517 waagent[1916]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW Aug 19 08:16:26.447786 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Aug 19 08:16:26.448934 systemd[1]: Started sshd@0-10.200.8.40:22-10.200.16.10:55252.service - OpenSSH per-connection server daemon (10.200.16.10:55252). 
Aug 19 08:16:27.149597 sshd[2061]: Accepted publickey for core from 10.200.16.10 port 55252 ssh2: RSA SHA256:rwMkWSJkyuzYLtFMdKgQVeSj1yB6THB2GYbfaCk21Xc Aug 19 08:16:27.150701 sshd-session[2061]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:16:27.154823 systemd-logind[1706]: New session 3 of user core. Aug 19 08:16:27.160181 systemd[1]: Started session-3.scope - Session 3 of User core. Aug 19 08:16:27.712571 systemd[1]: Started sshd@1-10.200.8.40:22-10.200.16.10:55266.service - OpenSSH per-connection server daemon (10.200.16.10:55266). Aug 19 08:16:27.934017 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Aug 19 08:16:27.935300 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 08:16:28.349131 sshd[2067]: Accepted publickey for core from 10.200.16.10 port 55266 ssh2: RSA SHA256:rwMkWSJkyuzYLtFMdKgQVeSj1yB6THB2GYbfaCk21Xc Aug 19 08:16:28.350215 sshd-session[2067]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:16:28.354117 systemd-logind[1706]: New session 4 of user core. Aug 19 08:16:28.361177 systemd[1]: Started session-4.scope - Session 4 of User core. Aug 19 08:16:28.468895 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Aug 19 08:16:28.471742 (kubelet)[2079]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 19 08:16:28.512407 kubelet[2079]: E0819 08:16:28.512357 2079 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 19 08:16:28.514950 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 19 08:16:28.515091 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 19 08:16:28.515355 systemd[1]: kubelet.service: Consumed 130ms CPU time, 109.3M memory peak. Aug 19 08:16:28.798055 sshd[2073]: Connection closed by 10.200.16.10 port 55266 Aug 19 08:16:28.798510 sshd-session[2067]: pam_unix(sshd:session): session closed for user core Aug 19 08:16:28.801237 systemd[1]: sshd@1-10.200.8.40:22-10.200.16.10:55266.service: Deactivated successfully. Aug 19 08:16:28.802604 systemd[1]: session-4.scope: Deactivated successfully. Aug 19 08:16:28.804102 systemd-logind[1706]: Session 4 logged out. Waiting for processes to exit. Aug 19 08:16:28.804817 systemd-logind[1706]: Removed session 4. Aug 19 08:16:28.917408 systemd[1]: Started sshd@2-10.200.8.40:22-10.200.16.10:55274.service - OpenSSH per-connection server daemon (10.200.16.10:55274). Aug 19 08:16:29.558958 sshd[2091]: Accepted publickey for core from 10.200.16.10 port 55274 ssh2: RSA SHA256:rwMkWSJkyuzYLtFMdKgQVeSj1yB6THB2GYbfaCk21Xc Aug 19 08:16:29.560025 sshd-session[2091]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:16:29.563907 systemd-logind[1706]: New session 5 of user core. Aug 19 08:16:29.571182 systemd[1]: Started session-5.scope - Session 5 of User core. 
Aug 19 08:16:30.012005 sshd[2094]: Connection closed by 10.200.16.10 port 55274 Aug 19 08:16:30.012498 sshd-session[2091]: pam_unix(sshd:session): session closed for user core Aug 19 08:16:30.015232 systemd[1]: sshd@2-10.200.8.40:22-10.200.16.10:55274.service: Deactivated successfully. Aug 19 08:16:30.016624 systemd[1]: session-5.scope: Deactivated successfully. Aug 19 08:16:30.017689 systemd-logind[1706]: Session 5 logged out. Waiting for processes to exit. Aug 19 08:16:30.018741 systemd-logind[1706]: Removed session 5. Aug 19 08:16:30.132420 systemd[1]: Started sshd@3-10.200.8.40:22-10.200.16.10:55276.service - OpenSSH per-connection server daemon (10.200.16.10:55276). Aug 19 08:16:30.768763 sshd[2100]: Accepted publickey for core from 10.200.16.10 port 55276 ssh2: RSA SHA256:rwMkWSJkyuzYLtFMdKgQVeSj1yB6THB2GYbfaCk21Xc Aug 19 08:16:30.769814 sshd-session[2100]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:16:30.773677 systemd-logind[1706]: New session 6 of user core. Aug 19 08:16:30.784211 systemd[1]: Started session-6.scope - Session 6 of User core. Aug 19 08:16:31.220481 sshd[2103]: Connection closed by 10.200.16.10 port 55276 Aug 19 08:16:31.220949 sshd-session[2100]: pam_unix(sshd:session): session closed for user core Aug 19 08:16:31.223797 systemd[1]: sshd@3-10.200.8.40:22-10.200.16.10:55276.service: Deactivated successfully. Aug 19 08:16:31.225216 systemd[1]: session-6.scope: Deactivated successfully. Aug 19 08:16:31.225811 systemd-logind[1706]: Session 6 logged out. Waiting for processes to exit. Aug 19 08:16:31.226779 systemd-logind[1706]: Removed session 6. Aug 19 08:16:31.335472 systemd[1]: Started sshd@4-10.200.8.40:22-10.200.16.10:50370.service - OpenSSH per-connection server daemon (10.200.16.10:50370). 
Aug 19 08:16:31.970723 sshd[2109]: Accepted publickey for core from 10.200.16.10 port 50370 ssh2: RSA SHA256:rwMkWSJkyuzYLtFMdKgQVeSj1yB6THB2GYbfaCk21Xc Aug 19 08:16:31.971845 sshd-session[2109]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:16:31.975899 systemd-logind[1706]: New session 7 of user core. Aug 19 08:16:31.982174 systemd[1]: Started session-7.scope - Session 7 of User core. Aug 19 08:16:32.428230 sudo[2113]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Aug 19 08:16:32.428437 sudo[2113]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 19 08:16:32.438814 sudo[2113]: pam_unix(sudo:session): session closed for user root Aug 19 08:16:32.541905 sshd[2112]: Connection closed by 10.200.16.10 port 50370 Aug 19 08:16:32.542519 sshd-session[2109]: pam_unix(sshd:session): session closed for user core Aug 19 08:16:32.545160 systemd[1]: sshd@4-10.200.8.40:22-10.200.16.10:50370.service: Deactivated successfully. Aug 19 08:16:32.546586 systemd[1]: session-7.scope: Deactivated successfully. Aug 19 08:16:32.548099 systemd-logind[1706]: Session 7 logged out. Waiting for processes to exit. Aug 19 08:16:32.548845 systemd-logind[1706]: Removed session 7. Aug 19 08:16:32.670544 systemd[1]: Started sshd@5-10.200.8.40:22-10.200.16.10:50382.service - OpenSSH per-connection server daemon (10.200.16.10:50382). Aug 19 08:16:33.306931 sshd[2119]: Accepted publickey for core from 10.200.16.10 port 50382 ssh2: RSA SHA256:rwMkWSJkyuzYLtFMdKgQVeSj1yB6THB2GYbfaCk21Xc Aug 19 08:16:33.308097 sshd-session[2119]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:16:33.312119 systemd-logind[1706]: New session 8 of user core. Aug 19 08:16:33.319176 systemd[1]: Started session-8.scope - Session 8 of User core. 
Aug 19 08:16:33.654291 sudo[2124]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Aug 19 08:16:33.654509 sudo[2124]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 19 08:16:33.661874 sudo[2124]: pam_unix(sudo:session): session closed for user root Aug 19 08:16:33.665699 sudo[2123]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Aug 19 08:16:33.665914 sudo[2123]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 19 08:16:33.673414 systemd[1]: Starting audit-rules.service - Load Audit Rules... Aug 19 08:16:33.702335 augenrules[2146]: No rules Aug 19 08:16:33.703144 systemd[1]: audit-rules.service: Deactivated successfully. Aug 19 08:16:33.703328 systemd[1]: Finished audit-rules.service - Load Audit Rules. Aug 19 08:16:33.704400 sudo[2123]: pam_unix(sudo:session): session closed for user root Aug 19 08:16:33.808020 sshd[2122]: Connection closed by 10.200.16.10 port 50382 Aug 19 08:16:33.808440 sshd-session[2119]: pam_unix(sshd:session): session closed for user core Aug 19 08:16:33.811482 systemd[1]: sshd@5-10.200.8.40:22-10.200.16.10:50382.service: Deactivated successfully. Aug 19 08:16:33.812712 systemd[1]: session-8.scope: Deactivated successfully. Aug 19 08:16:33.813345 systemd-logind[1706]: Session 8 logged out. Waiting for processes to exit. Aug 19 08:16:33.814227 systemd-logind[1706]: Removed session 8. Aug 19 08:16:33.925143 systemd[1]: Started sshd@6-10.200.8.40:22-10.200.16.10:50396.service - OpenSSH per-connection server daemon (10.200.16.10:50396). 
Aug 19 08:16:34.560787 sshd[2155]: Accepted publickey for core from 10.200.16.10 port 50396 ssh2: RSA SHA256:rwMkWSJkyuzYLtFMdKgQVeSj1yB6THB2GYbfaCk21Xc Aug 19 08:16:34.561788 sshd-session[2155]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:16:34.565099 systemd-logind[1706]: New session 9 of user core. Aug 19 08:16:34.570202 systemd[1]: Started session-9.scope - Session 9 of User core. Aug 19 08:16:34.907927 sudo[2159]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Aug 19 08:16:34.908165 sudo[2159]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 19 08:16:36.340462 systemd[1]: Starting docker.service - Docker Application Container Engine... Aug 19 08:16:36.349342 (dockerd)[2177]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Aug 19 08:16:37.863430 dockerd[2177]: time="2025-08-19T08:16:37.863380059Z" level=info msg="Starting up" Aug 19 08:16:37.865087 dockerd[2177]: time="2025-08-19T08:16:37.865046836Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Aug 19 08:16:37.873198 dockerd[2177]: time="2025-08-19T08:16:37.873169413Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Aug 19 08:16:38.033568 dockerd[2177]: time="2025-08-19T08:16:38.033491844Z" level=info msg="Loading containers: start." Aug 19 08:16:38.063074 kernel: Initializing XFRM netlink socket Aug 19 08:16:38.430216 systemd-networkd[1370]: docker0: Link UP Aug 19 08:16:38.450608 dockerd[2177]: time="2025-08-19T08:16:38.450574318Z" level=info msg="Loading containers: done." 
Aug 19 08:16:38.469558 dockerd[2177]: time="2025-08-19T08:16:38.469525291Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Aug 19 08:16:38.469659 dockerd[2177]: time="2025-08-19T08:16:38.469593733Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Aug 19 08:16:38.469690 dockerd[2177]: time="2025-08-19T08:16:38.469663996Z" level=info msg="Initializing buildkit" Aug 19 08:16:38.516740 dockerd[2177]: time="2025-08-19T08:16:38.516698218Z" level=info msg="Completed buildkit initialization" Aug 19 08:16:38.522675 dockerd[2177]: time="2025-08-19T08:16:38.522645414Z" level=info msg="Daemon has completed initialization" Aug 19 08:16:38.523125 dockerd[2177]: time="2025-08-19T08:16:38.522686357Z" level=info msg="API listen on /run/docker.sock" Aug 19 08:16:38.522995 systemd[1]: Started docker.service - Docker Application Container Engine. Aug 19 08:16:38.524211 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Aug 19 08:16:38.527199 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 08:16:39.544886 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Aug 19 08:16:39.547590 (kubelet)[2391]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 19 08:16:39.580169 kubelet[2391]: E0819 08:16:39.580120 2391 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 19 08:16:39.581540 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 19 08:16:39.581667 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 19 08:16:39.581985 systemd[1]: kubelet.service: Consumed 129ms CPU time, 108.7M memory peak. Aug 19 08:16:39.897752 chronyd[1683]: Selected source PHC0 Aug 19 08:16:40.191120 containerd[1722]: time="2025-08-19T08:16:40.191068406Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.12\"" Aug 19 08:16:40.976152 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4215705355.mount: Deactivated successfully. 
Aug 19 08:16:42.325187 containerd[1722]: time="2025-08-19T08:16:42.325136261Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:42.328567 containerd[1722]: time="2025-08-19T08:16:42.328540146Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.12: active requests=0, bytes read=28079639" Aug 19 08:16:42.331494 containerd[1722]: time="2025-08-19T08:16:42.331452379Z" level=info msg="ImageCreate event name:\"sha256:b1963c5b49c1722b8f408deaf83aafca7f48f47fed0ed14e5c10e93cc55974a7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:42.337392 containerd[1722]: time="2025-08-19T08:16:42.337348795Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:e9011c3bee8c06ecabd7816e119dca4e448c92f7a78acd891de3d2db1dc6c234\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:42.338218 containerd[1722]: time="2025-08-19T08:16:42.338026053Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.12\" with image id \"sha256:b1963c5b49c1722b8f408deaf83aafca7f48f47fed0ed14e5c10e93cc55974a7\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.12\", repo digest \"registry.k8s.io/kube-apiserver@sha256:e9011c3bee8c06ecabd7816e119dca4e448c92f7a78acd891de3d2db1dc6c234\", size \"28076431\" in 2.146901411s" Aug 19 08:16:42.338218 containerd[1722]: time="2025-08-19T08:16:42.338070806Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.12\" returns image reference \"sha256:b1963c5b49c1722b8f408deaf83aafca7f48f47fed0ed14e5c10e93cc55974a7\"" Aug 19 08:16:42.338756 containerd[1722]: time="2025-08-19T08:16:42.338731581Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.12\"" Aug 19 08:16:43.953281 containerd[1722]: time="2025-08-19T08:16:43.953228484Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.12\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:43.956984 containerd[1722]: time="2025-08-19T08:16:43.956947819Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.12: active requests=0, bytes read=24714689" Aug 19 08:16:43.961055 containerd[1722]: time="2025-08-19T08:16:43.961008088Z" level=info msg="ImageCreate event name:\"sha256:200c1a99a6f2b9d3b0a6e9b7362663513589341e0e58bc3b953a373efa735dfd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:43.967360 containerd[1722]: time="2025-08-19T08:16:43.967234686Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.12\" with image id \"sha256:200c1a99a6f2b9d3b0a6e9b7362663513589341e0e58bc3b953a373efa735dfd\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.12\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:d2862f94d87320267fddbd55db26556a267aa802e51d6b60f25786b4c428afc8\", size \"26317875\" in 1.628473433s" Aug 19 08:16:43.967360 containerd[1722]: time="2025-08-19T08:16:43.967271876Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.12\" returns image reference \"sha256:200c1a99a6f2b9d3b0a6e9b7362663513589341e0e58bc3b953a373efa735dfd\"" Aug 19 08:16:43.967743 containerd[1722]: time="2025-08-19T08:16:43.967721310Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:d2862f94d87320267fddbd55db26556a267aa802e51d6b60f25786b4c428afc8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:43.967979 containerd[1722]: time="2025-08-19T08:16:43.967904987Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.12\"" Aug 19 08:16:45.129858 containerd[1722]: time="2025-08-19T08:16:45.129803775Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:45.132549 containerd[1722]: 
time="2025-08-19T08:16:45.132514080Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.12: active requests=0, bytes read=18782435" Aug 19 08:16:45.135525 containerd[1722]: time="2025-08-19T08:16:45.135477922Z" level=info msg="ImageCreate event name:\"sha256:bcdd9599681a9460a5539177a986dbdaf880ac56eeb117ab94adb8f37889efba\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:45.139450 containerd[1722]: time="2025-08-19T08:16:45.139392307Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:152943b7e30244f4415fd0a5860a2dccd91660fe983d30a28a10edb0cc8f6756\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:45.140168 containerd[1722]: time="2025-08-19T08:16:45.140028367Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.12\" with image id \"sha256:bcdd9599681a9460a5539177a986dbdaf880ac56eeb117ab94adb8f37889efba\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.12\", repo digest \"registry.k8s.io/kube-scheduler@sha256:152943b7e30244f4415fd0a5860a2dccd91660fe983d30a28a10edb0cc8f6756\", size \"20385639\" in 1.172098896s" Aug 19 08:16:45.140168 containerd[1722]: time="2025-08-19T08:16:45.140073240Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.12\" returns image reference \"sha256:bcdd9599681a9460a5539177a986dbdaf880ac56eeb117ab94adb8f37889efba\"" Aug 19 08:16:45.140636 containerd[1722]: time="2025-08-19T08:16:45.140606118Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.12\"" Aug 19 08:16:46.167351 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4293133701.mount: Deactivated successfully. 
Aug 19 08:16:46.523096 containerd[1722]: time="2025-08-19T08:16:46.523048112Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:46.525931 containerd[1722]: time="2025-08-19T08:16:46.525892218Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.12: active requests=0, bytes read=30384263" Aug 19 08:16:46.528604 containerd[1722]: time="2025-08-19T08:16:46.528546682Z" level=info msg="ImageCreate event name:\"sha256:507cc52f5f78c0cff25e904c76c18e6bfc90982e9cc2aa4dcb19033f21c3f679\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:46.532669 containerd[1722]: time="2025-08-19T08:16:46.532602614Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:90aa6b5f4065937521ff8438bc705317485d0be3f8b00a07145e697d92cc2cc6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:46.533110 containerd[1722]: time="2025-08-19T08:16:46.533085349Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.12\" with image id \"sha256:507cc52f5f78c0cff25e904c76c18e6bfc90982e9cc2aa4dcb19033f21c3f679\", repo tag \"registry.k8s.io/kube-proxy:v1.31.12\", repo digest \"registry.k8s.io/kube-proxy@sha256:90aa6b5f4065937521ff8438bc705317485d0be3f8b00a07145e697d92cc2cc6\", size \"30383274\" in 1.392447855s" Aug 19 08:16:46.533416 containerd[1722]: time="2025-08-19T08:16:46.533121762Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.12\" returns image reference \"sha256:507cc52f5f78c0cff25e904c76c18e6bfc90982e9cc2aa4dcb19033f21c3f679\"" Aug 19 08:16:46.533636 containerd[1722]: time="2025-08-19T08:16:46.533619379Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Aug 19 08:16:47.160770 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3227625421.mount: Deactivated successfully. 
Aug 19 08:16:48.220687 containerd[1722]: time="2025-08-19T08:16:48.220638110Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:48.223215 containerd[1722]: time="2025-08-19T08:16:48.223192054Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565249" Aug 19 08:16:48.226115 containerd[1722]: time="2025-08-19T08:16:48.226076282Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:48.236214 containerd[1722]: time="2025-08-19T08:16:48.236169330Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:48.236924 containerd[1722]: time="2025-08-19T08:16:48.236791523Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.703146007s" Aug 19 08:16:48.236924 containerd[1722]: time="2025-08-19T08:16:48.236819355Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Aug 19 08:16:48.237377 containerd[1722]: time="2025-08-19T08:16:48.237349594Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Aug 19 08:16:48.812358 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3467743711.mount: Deactivated successfully. 
Aug 19 08:16:48.853441 containerd[1722]: time="2025-08-19T08:16:48.853397100Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Aug 19 08:16:48.856816 containerd[1722]: time="2025-08-19T08:16:48.856781716Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321146" Aug 19 08:16:48.860097 containerd[1722]: time="2025-08-19T08:16:48.860055564Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Aug 19 08:16:48.863597 containerd[1722]: time="2025-08-19T08:16:48.863558359Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Aug 19 08:16:48.864016 containerd[1722]: time="2025-08-19T08:16:48.863897491Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 626.50523ms" Aug 19 08:16:48.864016 containerd[1722]: time="2025-08-19T08:16:48.863926296Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Aug 19 08:16:48.864514 containerd[1722]: time="2025-08-19T08:16:48.864494585Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\"" Aug 19 08:16:49.575095 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4267604150.mount: Deactivated 
successfully. Aug 19 08:16:49.684226 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Aug 19 08:16:49.685689 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 08:16:50.070621 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 08:16:50.084284 (kubelet)[2540]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 19 08:16:50.114356 kubelet[2540]: E0819 08:16:50.114315 2540 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 19 08:16:50.115723 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 19 08:16:50.115851 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 19 08:16:50.116143 systemd[1]: kubelet.service: Consumed 121ms CPU time, 110.5M memory peak. 
Aug 19 08:16:51.879713 containerd[1722]: time="2025-08-19T08:16:51.879667657Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:51.882239 containerd[1722]: time="2025-08-19T08:16:51.882206137Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56910717" Aug 19 08:16:51.884839 containerd[1722]: time="2025-08-19T08:16:51.884798997Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:51.888834 containerd[1722]: time="2025-08-19T08:16:51.888792769Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:51.889489 containerd[1722]: time="2025-08-19T08:16:51.889377954Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 3.02485772s" Aug 19 08:16:51.889489 containerd[1722]: time="2025-08-19T08:16:51.889404582Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\"" Aug 19 08:16:54.187167 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 08:16:54.187320 systemd[1]: kubelet.service: Consumed 121ms CPU time, 110.5M memory peak. Aug 19 08:16:54.189256 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 08:16:54.210350 systemd[1]: Reload requested from client PID 2620 ('systemctl') (unit session-9.scope)... 
Aug 19 08:16:54.210364 systemd[1]: Reloading... Aug 19 08:16:54.297466 zram_generator::config[2667]: No configuration found. Aug 19 08:16:54.476334 systemd[1]: Reloading finished in 265 ms. Aug 19 08:16:54.553336 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Aug 19 08:16:54.553411 systemd[1]: kubelet.service: Failed with result 'signal'. Aug 19 08:16:54.553688 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 08:16:54.553739 systemd[1]: kubelet.service: Consumed 70ms CPU time, 74.3M memory peak. Aug 19 08:16:54.554979 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 08:16:55.027251 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 08:16:55.036334 (kubelet)[2734]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Aug 19 08:16:55.071477 kubelet[2734]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Aug 19 08:16:55.071477 kubelet[2734]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Aug 19 08:16:55.071477 kubelet[2734]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Aug 19 08:16:55.071737 kubelet[2734]: I0819 08:16:55.071535 2734 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Aug 19 08:16:55.295099 kubelet[2734]: I0819 08:16:55.294970 2734 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Aug 19 08:16:55.295649 kubelet[2734]: I0819 08:16:55.295210 2734 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Aug 19 08:16:55.295649 kubelet[2734]: I0819 08:16:55.295571 2734 server.go:934] "Client rotation is on, will bootstrap in background" Aug 19 08:16:55.322391 kubelet[2734]: E0819 08:16:55.322366 2734 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.200.8.40:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.200.8.40:6443: connect: connection refused" logger="UnhandledError" Aug 19 08:16:55.324171 kubelet[2734]: I0819 08:16:55.324153 2734 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Aug 19 08:16:55.343593 kubelet[2734]: I0819 08:16:55.343576 2734 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Aug 19 08:16:55.359108 kubelet[2734]: I0819 08:16:55.359084 2734 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Aug 19 08:16:55.367535 kubelet[2734]: I0819 08:16:55.367513 2734 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Aug 19 08:16:55.367708 kubelet[2734]: I0819 08:16:55.367676 2734 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Aug 19 08:16:55.367866 kubelet[2734]: I0819 08:16:55.367704 2734 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4426.0.0-a-5588c1b4cf","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","Topolog
yManagerPolicyOptions":null,"CgroupVersion":2} Aug 19 08:16:55.367985 kubelet[2734]: I0819 08:16:55.367871 2734 topology_manager.go:138] "Creating topology manager with none policy" Aug 19 08:16:55.367985 kubelet[2734]: I0819 08:16:55.367882 2734 container_manager_linux.go:300] "Creating device plugin manager" Aug 19 08:16:55.367985 kubelet[2734]: I0819 08:16:55.367976 2734 state_mem.go:36] "Initialized new in-memory state store" Aug 19 08:16:55.373807 kubelet[2734]: I0819 08:16:55.373552 2734 kubelet.go:408] "Attempting to sync node with API server" Aug 19 08:16:55.373807 kubelet[2734]: I0819 08:16:55.373584 2734 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Aug 19 08:16:55.373807 kubelet[2734]: I0819 08:16:55.373619 2734 kubelet.go:314] "Adding apiserver pod source" Aug 19 08:16:55.373807 kubelet[2734]: I0819 08:16:55.373639 2734 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Aug 19 08:16:55.382132 kubelet[2734]: W0819 08:16:55.382089 2734 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.200.8.40:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4426.0.0-a-5588c1b4cf&limit=500&resourceVersion=0": dial tcp 10.200.8.40:6443: connect: connection refused Aug 19 08:16:55.382204 kubelet[2734]: E0819 08:16:55.382159 2734 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.200.8.40:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4426.0.0-a-5588c1b4cf&limit=500&resourceVersion=0\": dial tcp 10.200.8.40:6443: connect: connection refused" logger="UnhandledError" Aug 19 08:16:55.384363 kubelet[2734]: W0819 08:16:55.383819 2734 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.200.8.40:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.200.8.40:6443: connect: 
connection refused Aug 19 08:16:55.384363 kubelet[2734]: E0819 08:16:55.383869 2734 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.200.8.40:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.8.40:6443: connect: connection refused" logger="UnhandledError" Aug 19 08:16:55.384363 kubelet[2734]: I0819 08:16:55.383935 2734 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Aug 19 08:16:55.384736 kubelet[2734]: I0819 08:16:55.384720 2734 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Aug 19 08:16:55.385345 kubelet[2734]: W0819 08:16:55.385328 2734 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Aug 19 08:16:55.388338 kubelet[2734]: I0819 08:16:55.388318 2734 server.go:1274] "Started kubelet" Aug 19 08:16:55.394158 kubelet[2734]: I0819 08:16:55.394138 2734 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Aug 19 08:16:55.395372 kubelet[2734]: E0819 08:16:55.393770 2734 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.200.8.40:6443/api/v1/namespaces/default/events\": dial tcp 10.200.8.40:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4426.0.0-a-5588c1b4cf.185d1d173cf1c514 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4426.0.0-a-5588c1b4cf,UID:ci-4426.0.0-a-5588c1b4cf,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4426.0.0-a-5588c1b4cf,},FirstTimestamp:2025-08-19 08:16:55.388292372 +0000 UTC m=+0.348785350,LastTimestamp:2025-08-19 08:16:55.388292372 +0000 UTC 
m=+0.348785350,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4426.0.0-a-5588c1b4cf,}" Aug 19 08:16:55.398212 kubelet[2734]: E0819 08:16:55.398195 2734 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Aug 19 08:16:55.399544 kubelet[2734]: I0819 08:16:55.399514 2734 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Aug 19 08:16:55.399956 kubelet[2734]: I0819 08:16:55.399937 2734 volume_manager.go:289] "Starting Kubelet Volume Manager" Aug 19 08:16:55.400236 kubelet[2734]: E0819 08:16:55.400217 2734 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4426.0.0-a-5588c1b4cf\" not found" Aug 19 08:16:55.400670 kubelet[2734]: I0819 08:16:55.400656 2734 server.go:449] "Adding debug handlers to kubelet server" Aug 19 08:16:55.402953 kubelet[2734]: I0819 08:16:55.402557 2734 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Aug 19 08:16:55.402953 kubelet[2734]: I0819 08:16:55.402607 2734 reconciler.go:26] "Reconciler: start to sync state" Aug 19 08:16:55.403137 kubelet[2734]: I0819 08:16:55.403113 2734 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Aug 19 08:16:55.403387 kubelet[2734]: I0819 08:16:55.403336 2734 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Aug 19 08:16:55.403721 kubelet[2734]: I0819 08:16:55.403709 2734 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Aug 19 08:16:55.404692 kubelet[2734]: E0819 08:16:55.404659 2734 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://10.200.8.40:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4426.0.0-a-5588c1b4cf?timeout=10s\": dial tcp 10.200.8.40:6443: connect: connection refused" interval="200ms" Aug 19 08:16:55.405830 kubelet[2734]: I0819 08:16:55.405812 2734 factory.go:221] Registration of the containerd container factory successfully Aug 19 08:16:55.405830 kubelet[2734]: I0819 08:16:55.405829 2734 factory.go:221] Registration of the systemd container factory successfully Aug 19 08:16:55.405918 kubelet[2734]: I0819 08:16:55.405900 2734 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Aug 19 08:16:55.412390 kubelet[2734]: I0819 08:16:55.412070 2734 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Aug 19 08:16:55.412988 kubelet[2734]: I0819 08:16:55.412958 2734 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Aug 19 08:16:55.412988 kubelet[2734]: I0819 08:16:55.412978 2734 status_manager.go:217] "Starting to sync pod status with apiserver" Aug 19 08:16:55.413103 kubelet[2734]: I0819 08:16:55.412998 2734 kubelet.go:2321] "Starting kubelet main sync loop" Aug 19 08:16:55.413132 kubelet[2734]: E0819 08:16:55.413031 2734 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Aug 19 08:16:55.419772 kubelet[2734]: W0819 08:16:55.419693 2734 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.200.8.40:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.8.40:6443: connect: connection refused Aug 19 08:16:55.419772 kubelet[2734]: E0819 08:16:55.419754 2734 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list 
*v1.CSIDriver: Get \"https://10.200.8.40:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.200.8.40:6443: connect: connection refused" logger="UnhandledError" Aug 19 08:16:55.419935 kubelet[2734]: W0819 08:16:55.419908 2734 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.200.8.40:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.8.40:6443: connect: connection refused Aug 19 08:16:55.419974 kubelet[2734]: E0819 08:16:55.419940 2734 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.200.8.40:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.8.40:6443: connect: connection refused" logger="UnhandledError" Aug 19 08:16:55.428848 kubelet[2734]: I0819 08:16:55.428816 2734 cpu_manager.go:214] "Starting CPU manager" policy="none" Aug 19 08:16:55.428848 kubelet[2734]: I0819 08:16:55.428843 2734 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Aug 19 08:16:55.428943 kubelet[2734]: I0819 08:16:55.428857 2734 state_mem.go:36] "Initialized new in-memory state store" Aug 19 08:16:55.434792 kubelet[2734]: I0819 08:16:55.434736 2734 policy_none.go:49] "None policy: Start" Aug 19 08:16:55.435269 kubelet[2734]: I0819 08:16:55.435252 2734 memory_manager.go:170] "Starting memorymanager" policy="None" Aug 19 08:16:55.435327 kubelet[2734]: I0819 08:16:55.435273 2734 state_mem.go:35] "Initializing new in-memory state store" Aug 19 08:16:55.444666 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Aug 19 08:16:55.456932 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Aug 19 08:16:55.459686 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Aug 19 08:16:55.466488 kubelet[2734]: I0819 08:16:55.466475 2734 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Aug 19 08:16:55.466904 kubelet[2734]: I0819 08:16:55.466896 2734 eviction_manager.go:189] "Eviction manager: starting control loop" Aug 19 08:16:55.466972 kubelet[2734]: I0819 08:16:55.466952 2734 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Aug 19 08:16:55.467176 kubelet[2734]: I0819 08:16:55.467167 2734 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Aug 19 08:16:55.468453 kubelet[2734]: E0819 08:16:55.468436 2734 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4426.0.0-a-5588c1b4cf\" not found" Aug 19 08:16:55.520854 systemd[1]: Created slice kubepods-burstable-pod3dbabf3eb942a4a19639e8403cebef8d.slice - libcontainer container kubepods-burstable-pod3dbabf3eb942a4a19639e8403cebef8d.slice. Aug 19 08:16:55.535406 systemd[1]: Created slice kubepods-burstable-podbd14e5e9355961a722d5e35bfab36e67.slice - libcontainer container kubepods-burstable-podbd14e5e9355961a722d5e35bfab36e67.slice. Aug 19 08:16:55.545856 systemd[1]: Created slice kubepods-burstable-poda0111172c235d2fe288a8262163313c6.slice - libcontainer container kubepods-burstable-poda0111172c235d2fe288a8262163313c6.slice. 
Aug 19 08:16:55.568878 kubelet[2734]: I0819 08:16:55.568843 2734 kubelet_node_status.go:72] "Attempting to register node" node="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:16:55.569171 kubelet[2734]: E0819 08:16:55.569154 2734 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.200.8.40:6443/api/v1/nodes\": dial tcp 10.200.8.40:6443: connect: connection refused" node="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:16:55.604420 kubelet[2734]: I0819 08:16:55.604355 2734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/bd14e5e9355961a722d5e35bfab36e67-k8s-certs\") pod \"kube-controller-manager-ci-4426.0.0-a-5588c1b4cf\" (UID: \"bd14e5e9355961a722d5e35bfab36e67\") " pod="kube-system/kube-controller-manager-ci-4426.0.0-a-5588c1b4cf" Aug 19 08:16:55.604420 kubelet[2734]: I0819 08:16:55.604395 2734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/bd14e5e9355961a722d5e35bfab36e67-kubeconfig\") pod \"kube-controller-manager-ci-4426.0.0-a-5588c1b4cf\" (UID: \"bd14e5e9355961a722d5e35bfab36e67\") " pod="kube-system/kube-controller-manager-ci-4426.0.0-a-5588c1b4cf" Aug 19 08:16:55.604420 kubelet[2734]: I0819 08:16:55.604412 2734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a0111172c235d2fe288a8262163313c6-kubeconfig\") pod \"kube-scheduler-ci-4426.0.0-a-5588c1b4cf\" (UID: \"a0111172c235d2fe288a8262163313c6\") " pod="kube-system/kube-scheduler-ci-4426.0.0-a-5588c1b4cf" Aug 19 08:16:55.604565 kubelet[2734]: I0819 08:16:55.604432 2734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/bd14e5e9355961a722d5e35bfab36e67-usr-share-ca-certificates\") pod 
\"kube-controller-manager-ci-4426.0.0-a-5588c1b4cf\" (UID: \"bd14e5e9355961a722d5e35bfab36e67\") " pod="kube-system/kube-controller-manager-ci-4426.0.0-a-5588c1b4cf" Aug 19 08:16:55.604565 kubelet[2734]: I0819 08:16:55.604447 2734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3dbabf3eb942a4a19639e8403cebef8d-ca-certs\") pod \"kube-apiserver-ci-4426.0.0-a-5588c1b4cf\" (UID: \"3dbabf3eb942a4a19639e8403cebef8d\") " pod="kube-system/kube-apiserver-ci-4426.0.0-a-5588c1b4cf" Aug 19 08:16:55.604565 kubelet[2734]: I0819 08:16:55.604461 2734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3dbabf3eb942a4a19639e8403cebef8d-k8s-certs\") pod \"kube-apiserver-ci-4426.0.0-a-5588c1b4cf\" (UID: \"3dbabf3eb942a4a19639e8403cebef8d\") " pod="kube-system/kube-apiserver-ci-4426.0.0-a-5588c1b4cf" Aug 19 08:16:55.604565 kubelet[2734]: I0819 08:16:55.604475 2734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3dbabf3eb942a4a19639e8403cebef8d-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4426.0.0-a-5588c1b4cf\" (UID: \"3dbabf3eb942a4a19639e8403cebef8d\") " pod="kube-system/kube-apiserver-ci-4426.0.0-a-5588c1b4cf" Aug 19 08:16:55.604565 kubelet[2734]: I0819 08:16:55.604489 2734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/bd14e5e9355961a722d5e35bfab36e67-ca-certs\") pod \"kube-controller-manager-ci-4426.0.0-a-5588c1b4cf\" (UID: \"bd14e5e9355961a722d5e35bfab36e67\") " pod="kube-system/kube-controller-manager-ci-4426.0.0-a-5588c1b4cf" Aug 19 08:16:55.604642 kubelet[2734]: I0819 08:16:55.604503 2734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/bd14e5e9355961a722d5e35bfab36e67-flexvolume-dir\") pod \"kube-controller-manager-ci-4426.0.0-a-5588c1b4cf\" (UID: \"bd14e5e9355961a722d5e35bfab36e67\") " pod="kube-system/kube-controller-manager-ci-4426.0.0-a-5588c1b4cf" Aug 19 08:16:55.605617 kubelet[2734]: E0819 08:16:55.605596 2734 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.40:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4426.0.0-a-5588c1b4cf?timeout=10s\": dial tcp 10.200.8.40:6443: connect: connection refused" interval="400ms" Aug 19 08:16:55.771290 kubelet[2734]: I0819 08:16:55.771259 2734 kubelet_node_status.go:72] "Attempting to register node" node="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:16:55.771561 kubelet[2734]: E0819 08:16:55.771539 2734 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.200.8.40:6443/api/v1/nodes\": dial tcp 10.200.8.40:6443: connect: connection refused" node="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:16:55.834709 containerd[1722]: time="2025-08-19T08:16:55.834621994Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4426.0.0-a-5588c1b4cf,Uid:3dbabf3eb942a4a19639e8403cebef8d,Namespace:kube-system,Attempt:0,}" Aug 19 08:16:55.845060 containerd[1722]: time="2025-08-19T08:16:55.845022185Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4426.0.0-a-5588c1b4cf,Uid:bd14e5e9355961a722d5e35bfab36e67,Namespace:kube-system,Attempt:0,}" Aug 19 08:16:55.848762 containerd[1722]: time="2025-08-19T08:16:55.848665765Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4426.0.0-a-5588c1b4cf,Uid:a0111172c235d2fe288a8262163313c6,Namespace:kube-system,Attempt:0,}" Aug 19 08:16:55.937181 containerd[1722]: time="2025-08-19T08:16:55.936534775Z" level=info msg="connecting to shim 
eaf512257e70c7e59899f6f0599e7d718ce4de7d92f2c104bc390479a0f4358e" address="unix:///run/containerd/s/d75eabda9f62f425df1fc380e3559f89fd722bce8555718f69353265677736fd" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:16:55.947286 containerd[1722]: time="2025-08-19T08:16:55.947259352Z" level=info msg="connecting to shim 2750c23a61a2d131f9f86a4366ecead5c2c8d16346d429720c9e524b35ac8a57" address="unix:///run/containerd/s/eb877a8136c0b3611e2c8b198e482d80020b371609c68ff371bc42d1c5daf9b1" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:16:55.970486 systemd[1]: Started cri-containerd-eaf512257e70c7e59899f6f0599e7d718ce4de7d92f2c104bc390479a0f4358e.scope - libcontainer container eaf512257e70c7e59899f6f0599e7d718ce4de7d92f2c104bc390479a0f4358e. Aug 19 08:16:55.974535 containerd[1722]: time="2025-08-19T08:16:55.974509377Z" level=info msg="connecting to shim f00c75571c83f0043ca00c0bf1add7e0848ea836ff5de7a2013da17174fda222" address="unix:///run/containerd/s/98de0357e06549c1eab3477bbbb45d6bf3db34a607fe1d08637ce87e537ed160" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:16:55.975360 systemd[1]: Started cri-containerd-2750c23a61a2d131f9f86a4366ecead5c2c8d16346d429720c9e524b35ac8a57.scope - libcontainer container 2750c23a61a2d131f9f86a4366ecead5c2c8d16346d429720c9e524b35ac8a57. Aug 19 08:16:56.003447 systemd[1]: Started cri-containerd-f00c75571c83f0043ca00c0bf1add7e0848ea836ff5de7a2013da17174fda222.scope - libcontainer container f00c75571c83f0043ca00c0bf1add7e0848ea836ff5de7a2013da17174fda222. 
Aug 19 08:16:56.006670 kubelet[2734]: E0819 08:16:56.006639 2734 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.40:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4426.0.0-a-5588c1b4cf?timeout=10s\": dial tcp 10.200.8.40:6443: connect: connection refused" interval="800ms" Aug 19 08:16:56.029574 containerd[1722]: time="2025-08-19T08:16:56.029474833Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4426.0.0-a-5588c1b4cf,Uid:3dbabf3eb942a4a19639e8403cebef8d,Namespace:kube-system,Attempt:0,} returns sandbox id \"eaf512257e70c7e59899f6f0599e7d718ce4de7d92f2c104bc390479a0f4358e\"" Aug 19 08:16:56.033058 containerd[1722]: time="2025-08-19T08:16:56.032566053Z" level=info msg="CreateContainer within sandbox \"eaf512257e70c7e59899f6f0599e7d718ce4de7d92f2c104bc390479a0f4358e\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Aug 19 08:16:56.055187 containerd[1722]: time="2025-08-19T08:16:56.055163910Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4426.0.0-a-5588c1b4cf,Uid:bd14e5e9355961a722d5e35bfab36e67,Namespace:kube-system,Attempt:0,} returns sandbox id \"2750c23a61a2d131f9f86a4366ecead5c2c8d16346d429720c9e524b35ac8a57\"" Aug 19 08:16:56.057318 containerd[1722]: time="2025-08-19T08:16:56.057265979Z" level=info msg="CreateContainer within sandbox \"2750c23a61a2d131f9f86a4366ecead5c2c8d16346d429720c9e524b35ac8a57\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Aug 19 08:16:56.063405 containerd[1722]: time="2025-08-19T08:16:56.063380977Z" level=info msg="Container d9f076a55fb6f2628cc0d59551c0e432933c7e3ab6fe9a9b38d23e59a771000f: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:16:56.068783 containerd[1722]: time="2025-08-19T08:16:56.068760332Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-scheduler-ci-4426.0.0-a-5588c1b4cf,Uid:a0111172c235d2fe288a8262163313c6,Namespace:kube-system,Attempt:0,} returns sandbox id \"f00c75571c83f0043ca00c0bf1add7e0848ea836ff5de7a2013da17174fda222\"" Aug 19 08:16:56.070254 containerd[1722]: time="2025-08-19T08:16:56.070229328Z" level=info msg="CreateContainer within sandbox \"f00c75571c83f0043ca00c0bf1add7e0848ea836ff5de7a2013da17174fda222\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Aug 19 08:16:56.089250 containerd[1722]: time="2025-08-19T08:16:56.089169086Z" level=info msg="CreateContainer within sandbox \"eaf512257e70c7e59899f6f0599e7d718ce4de7d92f2c104bc390479a0f4358e\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"d9f076a55fb6f2628cc0d59551c0e432933c7e3ab6fe9a9b38d23e59a771000f\"" Aug 19 08:16:56.089980 containerd[1722]: time="2025-08-19T08:16:56.089940719Z" level=info msg="StartContainer for \"d9f076a55fb6f2628cc0d59551c0e432933c7e3ab6fe9a9b38d23e59a771000f\"" Aug 19 08:16:56.090707 containerd[1722]: time="2025-08-19T08:16:56.090685542Z" level=info msg="connecting to shim d9f076a55fb6f2628cc0d59551c0e432933c7e3ab6fe9a9b38d23e59a771000f" address="unix:///run/containerd/s/d75eabda9f62f425df1fc380e3559f89fd722bce8555718f69353265677736fd" protocol=ttrpc version=3 Aug 19 08:16:56.106158 systemd[1]: Started cri-containerd-d9f076a55fb6f2628cc0d59551c0e432933c7e3ab6fe9a9b38d23e59a771000f.scope - libcontainer container d9f076a55fb6f2628cc0d59551c0e432933c7e3ab6fe9a9b38d23e59a771000f. 
Aug 19 08:16:56.117397 containerd[1722]: time="2025-08-19T08:16:56.117318462Z" level=info msg="Container d9aa6dd950bb13a604fab67019b97723a776ad5e56713cedac11745b34641cfb: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:16:56.124636 containerd[1722]: time="2025-08-19T08:16:56.124606820Z" level=info msg="Container 6316e41c90f1ccc5050798ad62a78a1814404f69ae00c547873031e39b065ec1: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:16:56.142060 containerd[1722]: time="2025-08-19T08:16:56.142021328Z" level=info msg="CreateContainer within sandbox \"f00c75571c83f0043ca00c0bf1add7e0848ea836ff5de7a2013da17174fda222\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"6316e41c90f1ccc5050798ad62a78a1814404f69ae00c547873031e39b065ec1\"" Aug 19 08:16:56.142570 containerd[1722]: time="2025-08-19T08:16:56.142552360Z" level=info msg="StartContainer for \"6316e41c90f1ccc5050798ad62a78a1814404f69ae00c547873031e39b065ec1\"" Aug 19 08:16:56.143877 containerd[1722]: time="2025-08-19T08:16:56.143840667Z" level=info msg="connecting to shim 6316e41c90f1ccc5050798ad62a78a1814404f69ae00c547873031e39b065ec1" address="unix:///run/containerd/s/98de0357e06549c1eab3477bbbb45d6bf3db34a607fe1d08637ce87e537ed160" protocol=ttrpc version=3 Aug 19 08:16:56.151142 containerd[1722]: time="2025-08-19T08:16:56.151103710Z" level=info msg="CreateContainer within sandbox \"2750c23a61a2d131f9f86a4366ecead5c2c8d16346d429720c9e524b35ac8a57\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"d9aa6dd950bb13a604fab67019b97723a776ad5e56713cedac11745b34641cfb\"" Aug 19 08:16:56.152309 containerd[1722]: time="2025-08-19T08:16:56.152287160Z" level=info msg="StartContainer for \"d9f076a55fb6f2628cc0d59551c0e432933c7e3ab6fe9a9b38d23e59a771000f\" returns successfully" Aug 19 08:16:56.153102 containerd[1722]: time="2025-08-19T08:16:56.152830432Z" level=info msg="StartContainer for \"d9aa6dd950bb13a604fab67019b97723a776ad5e56713cedac11745b34641cfb\"" 
Aug 19 08:16:56.155381 containerd[1722]: time="2025-08-19T08:16:56.155326601Z" level=info msg="connecting to shim d9aa6dd950bb13a604fab67019b97723a776ad5e56713cedac11745b34641cfb" address="unix:///run/containerd/s/eb877a8136c0b3611e2c8b198e482d80020b371609c68ff371bc42d1c5daf9b1" protocol=ttrpc version=3 Aug 19 08:16:56.165183 systemd[1]: Started cri-containerd-6316e41c90f1ccc5050798ad62a78a1814404f69ae00c547873031e39b065ec1.scope - libcontainer container 6316e41c90f1ccc5050798ad62a78a1814404f69ae00c547873031e39b065ec1. Aug 19 08:16:56.174617 kubelet[2734]: I0819 08:16:56.174290 2734 kubelet_node_status.go:72] "Attempting to register node" node="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:16:56.175381 kubelet[2734]: E0819 08:16:56.175355 2734 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.200.8.40:6443/api/v1/nodes\": dial tcp 10.200.8.40:6443: connect: connection refused" node="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:16:56.179328 systemd[1]: Started cri-containerd-d9aa6dd950bb13a604fab67019b97723a776ad5e56713cedac11745b34641cfb.scope - libcontainer container d9aa6dd950bb13a604fab67019b97723a776ad5e56713cedac11745b34641cfb. 
Aug 19 08:16:56.289516 containerd[1722]: time="2025-08-19T08:16:56.289476092Z" level=info msg="StartContainer for \"d9aa6dd950bb13a604fab67019b97723a776ad5e56713cedac11745b34641cfb\" returns successfully" Aug 19 08:16:56.291078 containerd[1722]: time="2025-08-19T08:16:56.289856492Z" level=info msg="StartContainer for \"6316e41c90f1ccc5050798ad62a78a1814404f69ae00c547873031e39b065ec1\" returns successfully" Aug 19 08:16:56.977865 kubelet[2734]: I0819 08:16:56.977559 2734 kubelet_node_status.go:72] "Attempting to register node" node="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:16:57.674192 kubelet[2734]: E0819 08:16:57.674156 2734 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4426.0.0-a-5588c1b4cf\" not found" node="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:16:57.754609 kubelet[2734]: I0819 08:16:57.754567 2734 kubelet_node_status.go:75] "Successfully registered node" node="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:16:57.754729 kubelet[2734]: E0819 08:16:57.754619 2734 kubelet_node_status.go:535] "Error updating node status, will retry" err="error getting node \"ci-4426.0.0-a-5588c1b4cf\": node \"ci-4426.0.0-a-5588c1b4cf\" not found" Aug 19 08:16:57.949676 kubelet[2734]: E0819 08:16:57.949645 2734 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-scheduler-ci-4426.0.0-a-5588c1b4cf\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4426.0.0-a-5588c1b4cf" Aug 19 08:16:58.384947 kubelet[2734]: I0819 08:16:58.384838 2734 apiserver.go:52] "Watching apiserver" Aug 19 08:16:58.403555 kubelet[2734]: I0819 08:16:58.403532 2734 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Aug 19 08:16:59.969863 systemd[1]: Reload requested from client PID 3001 ('systemctl') (unit session-9.scope)... Aug 19 08:16:59.969877 systemd[1]: Reloading... Aug 19 08:17:00.054093 zram_generator::config[3048]: No configuration found. 
Aug 19 08:17:00.270965 systemd[1]: Reloading finished in 300 ms. Aug 19 08:17:00.294608 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 08:17:00.311369 systemd[1]: kubelet.service: Deactivated successfully. Aug 19 08:17:00.311554 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 08:17:00.311592 systemd[1]: kubelet.service: Consumed 605ms CPU time, 129.9M memory peak. Aug 19 08:17:00.313291 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 08:17:00.594097 kernel: hv_balloon: Max. dynamic memory size: 8192 MB Aug 19 08:17:00.729026 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 08:17:00.734505 (kubelet)[3115]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Aug 19 08:17:00.770197 kubelet[3115]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Aug 19 08:17:00.771991 kubelet[3115]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Aug 19 08:17:00.771991 kubelet[3115]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Aug 19 08:17:00.771991 kubelet[3115]: I0819 08:17:00.770536 3115 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Aug 19 08:17:00.778814 kubelet[3115]: I0819 08:17:00.778676 3115 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Aug 19 08:17:00.779551 kubelet[3115]: I0819 08:17:00.778925 3115 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Aug 19 08:17:00.779551 kubelet[3115]: I0819 08:17:00.779218 3115 server.go:934] "Client rotation is on, will bootstrap in background" Aug 19 08:17:00.780604 kubelet[3115]: I0819 08:17:00.780583 3115 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Aug 19 08:17:00.782337 kubelet[3115]: I0819 08:17:00.782321 3115 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Aug 19 08:17:00.786498 kubelet[3115]: I0819 08:17:00.786484 3115 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Aug 19 08:17:00.788886 kubelet[3115]: I0819 08:17:00.788865 3115 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Aug 19 08:17:00.788978 kubelet[3115]: I0819 08:17:00.788967 3115 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Aug 19 08:17:00.789124 kubelet[3115]: I0819 08:17:00.789107 3115 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Aug 19 08:17:00.789389 kubelet[3115]: I0819 08:17:00.789161 3115 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4426.0.0-a-5588c1b4cf","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","Topolog
yManagerPolicyOptions":null,"CgroupVersion":2} Aug 19 08:17:00.791065 kubelet[3115]: I0819 08:17:00.789499 3115 topology_manager.go:138] "Creating topology manager with none policy" Aug 19 08:17:00.791065 kubelet[3115]: I0819 08:17:00.789510 3115 container_manager_linux.go:300] "Creating device plugin manager" Aug 19 08:17:00.791065 kubelet[3115]: I0819 08:17:00.789538 3115 state_mem.go:36] "Initialized new in-memory state store" Aug 19 08:17:00.791065 kubelet[3115]: I0819 08:17:00.789616 3115 kubelet.go:408] "Attempting to sync node with API server" Aug 19 08:17:00.791065 kubelet[3115]: I0819 08:17:00.789630 3115 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Aug 19 08:17:00.791065 kubelet[3115]: I0819 08:17:00.789657 3115 kubelet.go:314] "Adding apiserver pod source" Aug 19 08:17:00.791065 kubelet[3115]: I0819 08:17:00.789667 3115 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Aug 19 08:17:00.799282 kubelet[3115]: I0819 08:17:00.799266 3115 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Aug 19 08:17:00.799992 kubelet[3115]: I0819 08:17:00.799976 3115 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Aug 19 08:17:00.800440 kubelet[3115]: I0819 08:17:00.800427 3115 server.go:1274] "Started kubelet" Aug 19 08:17:00.801955 kubelet[3115]: I0819 08:17:00.801915 3115 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Aug 19 08:17:00.804096 kubelet[3115]: I0819 08:17:00.803443 3115 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Aug 19 08:17:00.804096 kubelet[3115]: I0819 08:17:00.803512 3115 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Aug 19 08:17:00.804834 kubelet[3115]: I0819 08:17:00.804817 3115 server.go:449] "Adding debug handlers to kubelet server" Aug 19 
08:17:00.805051 kubelet[3115]: I0819 08:17:00.805020 3115 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Aug 19 08:17:00.807096 kubelet[3115]: I0819 08:17:00.807076 3115 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Aug 19 08:17:00.809234 kubelet[3115]: I0819 08:17:00.809217 3115 volume_manager.go:289] "Starting Kubelet Volume Manager" Aug 19 08:17:00.809317 kubelet[3115]: I0819 08:17:00.809308 3115 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Aug 19 08:17:00.809400 kubelet[3115]: I0819 08:17:00.809393 3115 reconciler.go:26] "Reconciler: start to sync state" Aug 19 08:17:00.810167 kubelet[3115]: I0819 08:17:00.810150 3115 factory.go:221] Registration of the systemd container factory successfully Aug 19 08:17:00.810240 kubelet[3115]: I0819 08:17:00.810226 3115 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Aug 19 08:17:00.812154 kubelet[3115]: I0819 08:17:00.812137 3115 factory.go:221] Registration of the containerd container factory successfully Aug 19 08:17:00.815496 kubelet[3115]: I0819 08:17:00.815476 3115 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Aug 19 08:17:00.816483 kubelet[3115]: I0819 08:17:00.816468 3115 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Aug 19 08:17:00.816559 kubelet[3115]: I0819 08:17:00.816554 3115 status_manager.go:217] "Starting to sync pod status with apiserver" Aug 19 08:17:00.816605 kubelet[3115]: I0819 08:17:00.816600 3115 kubelet.go:2321] "Starting kubelet main sync loop" Aug 19 08:17:00.816665 kubelet[3115]: E0819 08:17:00.816654 3115 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Aug 19 08:17:00.861120 kubelet[3115]: I0819 08:17:00.861107 3115 cpu_manager.go:214] "Starting CPU manager" policy="none" Aug 19 08:17:00.861120 kubelet[3115]: I0819 08:17:00.861119 3115 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Aug 19 08:17:00.861213 kubelet[3115]: I0819 08:17:00.861132 3115 state_mem.go:36] "Initialized new in-memory state store" Aug 19 08:17:00.861270 kubelet[3115]: I0819 08:17:00.861258 3115 state_mem.go:88] "Updated default CPUSet" cpuSet="" Aug 19 08:17:00.861307 kubelet[3115]: I0819 08:17:00.861272 3115 state_mem.go:96] "Updated CPUSet assignments" assignments={} Aug 19 08:17:00.861307 kubelet[3115]: I0819 08:17:00.861288 3115 policy_none.go:49] "None policy: Start" Aug 19 08:17:00.861807 kubelet[3115]: I0819 08:17:00.861790 3115 memory_manager.go:170] "Starting memorymanager" policy="None" Aug 19 08:17:00.861855 kubelet[3115]: I0819 08:17:00.861812 3115 state_mem.go:35] "Initializing new in-memory state store" Aug 19 08:17:00.861963 kubelet[3115]: I0819 08:17:00.861949 3115 state_mem.go:75] "Updated machine memory state" Aug 19 08:17:00.865083 kubelet[3115]: I0819 08:17:00.864874 3115 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Aug 19 08:17:00.865083 kubelet[3115]: I0819 08:17:00.864995 3115 eviction_manager.go:189] "Eviction manager: starting control loop" Aug 19 08:17:00.865083 kubelet[3115]: I0819 08:17:00.865006 3115 container_log_manager.go:189] 
"Initializing container log rotate workers" workers=1 monitorPeriod="10s" Aug 19 08:17:00.866159 kubelet[3115]: I0819 08:17:00.866136 3115 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Aug 19 08:17:00.924634 kubelet[3115]: W0819 08:17:00.924607 3115 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Aug 19 08:17:00.928317 kubelet[3115]: W0819 08:17:00.928271 3115 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Aug 19 08:17:00.928317 kubelet[3115]: W0819 08:17:00.928289 3115 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Aug 19 08:17:00.969651 kubelet[3115]: I0819 08:17:00.969636 3115 kubelet_node_status.go:72] "Attempting to register node" node="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:00.978481 kubelet[3115]: I0819 08:17:00.978113 3115 kubelet_node_status.go:111] "Node was previously registered" node="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:00.978481 kubelet[3115]: I0819 08:17:00.978175 3115 kubelet_node_status.go:75] "Successfully registered node" node="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:01.010268 kubelet[3115]: I0819 08:17:01.010108 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3dbabf3eb942a4a19639e8403cebef8d-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4426.0.0-a-5588c1b4cf\" (UID: \"3dbabf3eb942a4a19639e8403cebef8d\") " pod="kube-system/kube-apiserver-ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:01.010268 kubelet[3115]: I0819 08:17:01.010134 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/host-path/bd14e5e9355961a722d5e35bfab36e67-ca-certs\") pod \"kube-controller-manager-ci-4426.0.0-a-5588c1b4cf\" (UID: \"bd14e5e9355961a722d5e35bfab36e67\") " pod="kube-system/kube-controller-manager-ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:01.010268 kubelet[3115]: I0819 08:17:01.010153 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/bd14e5e9355961a722d5e35bfab36e67-flexvolume-dir\") pod \"kube-controller-manager-ci-4426.0.0-a-5588c1b4cf\" (UID: \"bd14e5e9355961a722d5e35bfab36e67\") " pod="kube-system/kube-controller-manager-ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:01.010268 kubelet[3115]: I0819 08:17:01.010170 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/bd14e5e9355961a722d5e35bfab36e67-kubeconfig\") pod \"kube-controller-manager-ci-4426.0.0-a-5588c1b4cf\" (UID: \"bd14e5e9355961a722d5e35bfab36e67\") " pod="kube-system/kube-controller-manager-ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:01.010268 kubelet[3115]: I0819 08:17:01.010186 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/bd14e5e9355961a722d5e35bfab36e67-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4426.0.0-a-5588c1b4cf\" (UID: \"bd14e5e9355961a722d5e35bfab36e67\") " pod="kube-system/kube-controller-manager-ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:01.010975 kubelet[3115]: I0819 08:17:01.010201 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3dbabf3eb942a4a19639e8403cebef8d-ca-certs\") pod \"kube-apiserver-ci-4426.0.0-a-5588c1b4cf\" (UID: \"3dbabf3eb942a4a19639e8403cebef8d\") " pod="kube-system/kube-apiserver-ci-4426.0.0-a-5588c1b4cf" Aug 19 
08:17:01.010975 kubelet[3115]: I0819 08:17:01.010215 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3dbabf3eb942a4a19639e8403cebef8d-k8s-certs\") pod \"kube-apiserver-ci-4426.0.0-a-5588c1b4cf\" (UID: \"3dbabf3eb942a4a19639e8403cebef8d\") " pod="kube-system/kube-apiserver-ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:01.010975 kubelet[3115]: I0819 08:17:01.010229 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/bd14e5e9355961a722d5e35bfab36e67-k8s-certs\") pod \"kube-controller-manager-ci-4426.0.0-a-5588c1b4cf\" (UID: \"bd14e5e9355961a722d5e35bfab36e67\") " pod="kube-system/kube-controller-manager-ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:01.010975 kubelet[3115]: I0819 08:17:01.010243 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a0111172c235d2fe288a8262163313c6-kubeconfig\") pod \"kube-scheduler-ci-4426.0.0-a-5588c1b4cf\" (UID: \"a0111172c235d2fe288a8262163313c6\") " pod="kube-system/kube-scheduler-ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:01.177458 update_engine[1707]: I20250819 08:17:01.177074 1707 update_attempter.cc:509] Updating boot flags... 
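The `warnings.go` entries a few lines above note that a pod `metadata.name` containing dots is used in the pod's hostname and is not a valid DNS label. A short check of the RFC 1123 label rule the warning refers to (a sketch of the rule, not the apiserver's validation code):

```python
import re

# RFC 1123 DNS label: lowercase alphanumerics and '-', must start and end
# with an alphanumeric, at most 63 characters. Dots are not permitted.
_LABEL = re.compile(r"^[a-z0-9]([-a-z0-9]*[a-z0-9])?$")

def is_dns1123_label(name: str) -> bool:
    return len(name) <= 63 and bool(_LABEL.match(name))

# The node name in this log contains dots, hence the repeated warning:
print(is_dns1123_label("ci-4426.0.0-a-5588c1b4cf"))  # False
print(is_dns1123_label("ci-4426-0-0-a-5588c1b4cf"))  # True
```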
Aug 19 08:17:01.799869 kubelet[3115]: I0819 08:17:01.799841 3115 apiserver.go:52] "Watching apiserver" Aug 19 08:17:01.809858 kubelet[3115]: I0819 08:17:01.809823 3115 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Aug 19 08:17:01.863894 kubelet[3115]: I0819 08:17:01.863747 3115 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4426.0.0-a-5588c1b4cf" podStartSLOduration=1.863718568 podStartE2EDuration="1.863718568s" podCreationTimestamp="2025-08-19 08:17:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 08:17:01.863494388 +0000 UTC m=+1.124777914" watchObservedRunningTime="2025-08-19 08:17:01.863718568 +0000 UTC m=+1.125002086" Aug 19 08:17:01.882710 kubelet[3115]: I0819 08:17:01.882659 3115 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4426.0.0-a-5588c1b4cf" podStartSLOduration=1.882643345 podStartE2EDuration="1.882643345s" podCreationTimestamp="2025-08-19 08:17:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 08:17:01.882408935 +0000 UTC m=+1.143692456" watchObservedRunningTime="2025-08-19 08:17:01.882643345 +0000 UTC m=+1.143926867" Aug 19 08:17:01.882822 kubelet[3115]: I0819 08:17:01.882734 3115 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4426.0.0-a-5588c1b4cf" podStartSLOduration=1.8827298890000002 podStartE2EDuration="1.882729889s" podCreationTimestamp="2025-08-19 08:17:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 08:17:01.871745735 +0000 UTC m=+1.133029255" watchObservedRunningTime="2025-08-19 08:17:01.882729889 +0000 UTC m=+1.144013406" 
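Each kubelet timestamp above carries an `m=+<seconds>` suffix, the monotonic offset since the process started. Subtracting the offset from the wall-clock seconds should yield the same start instant for every entry, which makes a quick consistency check possible (values copied from the two `podStartSLOduration` entries above; this is an observation about the log format, not kubelet internals):

```python
# (wall-clock seconds past 08:17:00, m=+ monotonic offset) from the log above.
entries = [
    (1.863494388, 1.124777914),  # kube-apiserver pod observedRunningTime
    (1.882408935, 1.143692456),  # kube-controller-manager pod
]
starts = [wall - offset for wall, offset in entries]

# Both entries imply the same kubelet start instant, up to ns rounding.
assert abs(starts[0] - starts[1]) < 1e-6
print(f"kubelet started ~{starts[0]:.6f}s past 08:17:00")
```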
Aug 19 08:17:04.689826 kubelet[3115]: I0819 08:17:04.689789 3115 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Aug 19 08:17:04.690241 containerd[1722]: time="2025-08-19T08:17:04.690153670Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Aug 19 08:17:04.690436 kubelet[3115]: I0819 08:17:04.690375 3115 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Aug 19 08:17:05.675987 systemd[1]: Created slice kubepods-besteffort-pod3f82cbe2_e096_45c5_b35a_656789ef3939.slice - libcontainer container kubepods-besteffort-pod3f82cbe2_e096_45c5_b35a_656789ef3939.slice. Aug 19 08:17:05.746861 kubelet[3115]: I0819 08:17:05.746831 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/3f82cbe2-e096-45c5-b35a-656789ef3939-kube-proxy\") pod \"kube-proxy-jjl9m\" (UID: \"3f82cbe2-e096-45c5-b35a-656789ef3939\") " pod="kube-system/kube-proxy-jjl9m" Aug 19 08:17:05.747163 kubelet[3115]: I0819 08:17:05.746868 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/3f82cbe2-e096-45c5-b35a-656789ef3939-xtables-lock\") pod \"kube-proxy-jjl9m\" (UID: \"3f82cbe2-e096-45c5-b35a-656789ef3939\") " pod="kube-system/kube-proxy-jjl9m" Aug 19 08:17:05.747163 kubelet[3115]: I0819 08:17:05.746887 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3f82cbe2-e096-45c5-b35a-656789ef3939-lib-modules\") pod \"kube-proxy-jjl9m\" (UID: \"3f82cbe2-e096-45c5-b35a-656789ef3939\") " pod="kube-system/kube-proxy-jjl9m" Aug 19 08:17:05.747163 kubelet[3115]: I0819 08:17:05.746902 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-r6j4t\" (UniqueName: \"kubernetes.io/projected/3f82cbe2-e096-45c5-b35a-656789ef3939-kube-api-access-r6j4t\") pod \"kube-proxy-jjl9m\" (UID: \"3f82cbe2-e096-45c5-b35a-656789ef3939\") " pod="kube-system/kube-proxy-jjl9m" Aug 19 08:17:05.792740 systemd[1]: Created slice kubepods-besteffort-podc25cfd08_7f95_4e77_b2a7_bbe07d425eee.slice - libcontainer container kubepods-besteffort-podc25cfd08_7f95_4e77_b2a7_bbe07d425eee.slice. Aug 19 08:17:05.847154 kubelet[3115]: I0819 08:17:05.847096 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/c25cfd08-7f95-4e77-b2a7-bbe07d425eee-var-lib-calico\") pod \"tigera-operator-5bf8dfcb4-9q2f7\" (UID: \"c25cfd08-7f95-4e77-b2a7-bbe07d425eee\") " pod="tigera-operator/tigera-operator-5bf8dfcb4-9q2f7" Aug 19 08:17:05.847154 kubelet[3115]: I0819 08:17:05.847157 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxks6\" (UniqueName: \"kubernetes.io/projected/c25cfd08-7f95-4e77-b2a7-bbe07d425eee-kube-api-access-kxks6\") pod \"tigera-operator-5bf8dfcb4-9q2f7\" (UID: \"c25cfd08-7f95-4e77-b2a7-bbe07d425eee\") " pod="tigera-operator/tigera-operator-5bf8dfcb4-9q2f7" Aug 19 08:17:05.985961 containerd[1722]: time="2025-08-19T08:17:05.985921415Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-jjl9m,Uid:3f82cbe2-e096-45c5-b35a-656789ef3939,Namespace:kube-system,Attempt:0,}" Aug 19 08:17:06.025639 containerd[1722]: time="2025-08-19T08:17:06.025603877Z" level=info msg="connecting to shim 7589e8d2ef5c13aa330cd0065ef99b1687532a492f1882d62d8c828d8b51e4e9" address="unix:///run/containerd/s/5b6ec18b0c18db0e5e20162bba0093b879a658597ecfee0014022363b05b3f80" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:17:06.051165 systemd[1]: Started cri-containerd-7589e8d2ef5c13aa330cd0065ef99b1687532a492f1882d62d8c828d8b51e4e9.scope - libcontainer 
container 7589e8d2ef5c13aa330cd0065ef99b1687532a492f1882d62d8c828d8b51e4e9. Aug 19 08:17:06.071939 containerd[1722]: time="2025-08-19T08:17:06.071908065Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-jjl9m,Uid:3f82cbe2-e096-45c5-b35a-656789ef3939,Namespace:kube-system,Attempt:0,} returns sandbox id \"7589e8d2ef5c13aa330cd0065ef99b1687532a492f1882d62d8c828d8b51e4e9\"" Aug 19 08:17:06.074424 containerd[1722]: time="2025-08-19T08:17:06.074401346Z" level=info msg="CreateContainer within sandbox \"7589e8d2ef5c13aa330cd0065ef99b1687532a492f1882d62d8c828d8b51e4e9\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Aug 19 08:17:06.093287 containerd[1722]: time="2025-08-19T08:17:06.093226668Z" level=info msg="Container 9cabdb5b245da22c04c57f8dd2f6c62ea2a27565e9e61bdfa345bd9061ca4492: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:17:06.095702 containerd[1722]: time="2025-08-19T08:17:06.095679203Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5bf8dfcb4-9q2f7,Uid:c25cfd08-7f95-4e77-b2a7-bbe07d425eee,Namespace:tigera-operator,Attempt:0,}" Aug 19 08:17:06.114652 containerd[1722]: time="2025-08-19T08:17:06.114629262Z" level=info msg="CreateContainer within sandbox \"7589e8d2ef5c13aa330cd0065ef99b1687532a492f1882d62d8c828d8b51e4e9\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"9cabdb5b245da22c04c57f8dd2f6c62ea2a27565e9e61bdfa345bd9061ca4492\"" Aug 19 08:17:06.115017 containerd[1722]: time="2025-08-19T08:17:06.114977878Z" level=info msg="StartContainer for \"9cabdb5b245da22c04c57f8dd2f6c62ea2a27565e9e61bdfa345bd9061ca4492\"" Aug 19 08:17:06.116410 containerd[1722]: time="2025-08-19T08:17:06.116363430Z" level=info msg="connecting to shim 9cabdb5b245da22c04c57f8dd2f6c62ea2a27565e9e61bdfa345bd9061ca4492" address="unix:///run/containerd/s/5b6ec18b0c18db0e5e20162bba0093b879a658597ecfee0014022363b05b3f80" protocol=ttrpc version=3 Aug 19 08:17:06.132173 systemd[1]: Started 
cri-containerd-9cabdb5b245da22c04c57f8dd2f6c62ea2a27565e9e61bdfa345bd9061ca4492.scope - libcontainer container 9cabdb5b245da22c04c57f8dd2f6c62ea2a27565e9e61bdfa345bd9061ca4492. Aug 19 08:17:06.153973 containerd[1722]: time="2025-08-19T08:17:06.153926365Z" level=info msg="connecting to shim 6360d897e94f026ab6b134b43e4f5eb7c02e5b7bd994c66afb43108f289ed377" address="unix:///run/containerd/s/4a48f04ba791a4dcf093d489231c0677f847213ab72d9628d1e40e17e824ee98" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:17:06.174773 containerd[1722]: time="2025-08-19T08:17:06.174751276Z" level=info msg="StartContainer for \"9cabdb5b245da22c04c57f8dd2f6c62ea2a27565e9e61bdfa345bd9061ca4492\" returns successfully" Aug 19 08:17:06.180389 systemd[1]: Started cri-containerd-6360d897e94f026ab6b134b43e4f5eb7c02e5b7bd994c66afb43108f289ed377.scope - libcontainer container 6360d897e94f026ab6b134b43e4f5eb7c02e5b7bd994c66afb43108f289ed377. Aug 19 08:17:06.229897 containerd[1722]: time="2025-08-19T08:17:06.229820211Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5bf8dfcb4-9q2f7,Uid:c25cfd08-7f95-4e77-b2a7-bbe07d425eee,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"6360d897e94f026ab6b134b43e4f5eb7c02e5b7bd994c66afb43108f289ed377\"" Aug 19 08:17:06.232286 containerd[1722]: time="2025-08-19T08:17:06.232222118Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\"" Aug 19 08:17:06.875741 kubelet[3115]: I0819 08:17:06.875694 3115 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-jjl9m" podStartSLOduration=1.8756773469999999 podStartE2EDuration="1.875677347s" podCreationTimestamp="2025-08-19 08:17:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 08:17:06.875427937 +0000 UTC m=+6.136711459" watchObservedRunningTime="2025-08-19 08:17:06.875677347 +0000 UTC m=+6.136960866" Aug 19 08:17:07.990275 systemd[1]: 
var-lib-containerd-tmpmounts-containerd\x2dmount1550128587.mount: Deactivated successfully. Aug 19 08:17:08.544946 containerd[1722]: time="2025-08-19T08:17:08.544890255Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:08.547386 containerd[1722]: time="2025-08-19T08:17:08.547352448Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.3: active requests=0, bytes read=25056543" Aug 19 08:17:08.550415 containerd[1722]: time="2025-08-19T08:17:08.550385941Z" level=info msg="ImageCreate event name:\"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:08.553896 containerd[1722]: time="2025-08-19T08:17:08.553856275Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:08.554404 containerd[1722]: time="2025-08-19T08:17:08.554306386Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.3\" with image id \"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\", repo tag \"quay.io/tigera/operator:v1.38.3\", repo digest \"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\", size \"25052538\" in 2.322054167s" Aug 19 08:17:08.554404 containerd[1722]: time="2025-08-19T08:17:08.554332645Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\" returns image reference \"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\"" Aug 19 08:17:08.556024 containerd[1722]: time="2025-08-19T08:17:08.555990970Z" level=info msg="CreateContainer within sandbox \"6360d897e94f026ab6b134b43e4f5eb7c02e5b7bd994c66afb43108f289ed377\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Aug 19 08:17:08.580623 containerd[1722]: 
time="2025-08-19T08:17:08.580110972Z" level=info msg="Container 6ea088bcab0ac9cd7f10c780544429718d3ba0545a6a6eb307179b6bbc0a126e: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:17:08.602914 containerd[1722]: time="2025-08-19T08:17:08.602889823Z" level=info msg="CreateContainer within sandbox \"6360d897e94f026ab6b134b43e4f5eb7c02e5b7bd994c66afb43108f289ed377\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"6ea088bcab0ac9cd7f10c780544429718d3ba0545a6a6eb307179b6bbc0a126e\"" Aug 19 08:17:08.603392 containerd[1722]: time="2025-08-19T08:17:08.603288098Z" level=info msg="StartContainer for \"6ea088bcab0ac9cd7f10c780544429718d3ba0545a6a6eb307179b6bbc0a126e\"" Aug 19 08:17:08.604288 containerd[1722]: time="2025-08-19T08:17:08.604262325Z" level=info msg="connecting to shim 6ea088bcab0ac9cd7f10c780544429718d3ba0545a6a6eb307179b6bbc0a126e" address="unix:///run/containerd/s/4a48f04ba791a4dcf093d489231c0677f847213ab72d9628d1e40e17e824ee98" protocol=ttrpc version=3 Aug 19 08:17:08.621195 systemd[1]: Started cri-containerd-6ea088bcab0ac9cd7f10c780544429718d3ba0545a6a6eb307179b6bbc0a126e.scope - libcontainer container 6ea088bcab0ac9cd7f10c780544429718d3ba0545a6a6eb307179b6bbc0a126e. 
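The pull records above report `bytes read=25056543` and an elapsed time of `2.322054167s` for the `quay.io/tigera/operator:v1.38.3` image, which implies roughly 10.8 MB/s of effective pull bandwidth (a back-of-the-envelope check on the logged figures, nothing more):

```python
# Effective pull throughput for quay.io/tigera/operator:v1.38.3, computed
# from the byte count and duration reported in the containerd log above.
bytes_read = 25_056_543
elapsed_s = 2.322054167
mb_per_s = bytes_read / elapsed_s / 1_000_000
print(f"{mb_per_s:.2f} MB/s")  # ~10.79 MB/s
```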
Aug 19 08:17:08.647942 containerd[1722]: time="2025-08-19T08:17:08.647868301Z" level=info msg="StartContainer for \"6ea088bcab0ac9cd7f10c780544429718d3ba0545a6a6eb307179b6bbc0a126e\" returns successfully" Aug 19 08:17:09.924054 kubelet[3115]: I0819 08:17:09.923989 3115 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-5bf8dfcb4-9q2f7" podStartSLOduration=2.600129107 podStartE2EDuration="4.923970417s" podCreationTimestamp="2025-08-19 08:17:05 +0000 UTC" firstStartedPulling="2025-08-19 08:17:06.231076957 +0000 UTC m=+5.492360473" lastFinishedPulling="2025-08-19 08:17:08.554918267 +0000 UTC m=+7.816201783" observedRunningTime="2025-08-19 08:17:08.878861296 +0000 UTC m=+8.140144816" watchObservedRunningTime="2025-08-19 08:17:09.923970417 +0000 UTC m=+9.185253963" Aug 19 08:17:14.166076 sudo[2159]: pam_unix(sudo:session): session closed for user root Aug 19 08:17:14.270295 sshd[2158]: Connection closed by 10.200.16.10 port 50396 Aug 19 08:17:14.271125 sshd-session[2155]: pam_unix(sshd:session): session closed for user core Aug 19 08:17:14.278170 systemd[1]: sshd@6-10.200.8.40:22-10.200.16.10:50396.service: Deactivated successfully. Aug 19 08:17:14.283220 systemd[1]: session-9.scope: Deactivated successfully. Aug 19 08:17:14.284145 systemd[1]: session-9.scope: Consumed 3.285s CPU time, 220.7M memory peak. Aug 19 08:17:14.286882 systemd-logind[1706]: Session 9 logged out. Waiting for processes to exit. Aug 19 08:17:14.290586 systemd-logind[1706]: Removed session 9. Aug 19 08:17:17.901904 systemd[1]: Created slice kubepods-besteffort-podb3373c25_af5c_462d_80a7_6003a55f0ff6.slice - libcontainer container kubepods-besteffort-podb3373c25_af5c_462d_80a7_6003a55f0ff6.slice. 
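The `podStartSLOduration` in the tigera-operator entry above is the end-to-end startup duration minus the time spent pulling images, and the logged values confirm that arithmetic exactly (numbers copied from the `pod_startup_latency_tracker` entry, using the `m=+` offsets of `firstStartedPulling` and `lastFinishedPulling`):

```python
# podStartSLOduration = podStartE2EDuration - image pull time, using the
# monotonic offsets logged for the tigera-operator pod above.
e2e = 4.923970417                 # podStartE2EDuration (s)
pull = 7.816201783 - 5.492360473  # lastFinishedPulling - firstStartedPulling
slo = e2e - pull
print(f"{slo:.9f}")  # 2.600129107, matching the logged podStartSLOduration
```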
Aug 19 08:17:17.924871 kubelet[3115]: I0819 08:17:17.924835 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/b3373c25-af5c-462d-80a7-6003a55f0ff6-typha-certs\") pod \"calico-typha-7fcb6746f-p9t8c\" (UID: \"b3373c25-af5c-462d-80a7-6003a55f0ff6\") " pod="calico-system/calico-typha-7fcb6746f-p9t8c" Aug 19 08:17:17.925188 kubelet[3115]: I0819 08:17:17.924882 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3373c25-af5c-462d-80a7-6003a55f0ff6-tigera-ca-bundle\") pod \"calico-typha-7fcb6746f-p9t8c\" (UID: \"b3373c25-af5c-462d-80a7-6003a55f0ff6\") " pod="calico-system/calico-typha-7fcb6746f-p9t8c" Aug 19 08:17:17.925188 kubelet[3115]: I0819 08:17:17.924902 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5wbx\" (UniqueName: \"kubernetes.io/projected/b3373c25-af5c-462d-80a7-6003a55f0ff6-kube-api-access-m5wbx\") pod \"calico-typha-7fcb6746f-p9t8c\" (UID: \"b3373c25-af5c-462d-80a7-6003a55f0ff6\") " pod="calico-system/calico-typha-7fcb6746f-p9t8c" Aug 19 08:17:18.206389 containerd[1722]: time="2025-08-19T08:17:18.206328726Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7fcb6746f-p9t8c,Uid:b3373c25-af5c-462d-80a7-6003a55f0ff6,Namespace:calico-system,Attempt:0,}" Aug 19 08:17:18.259818 systemd[1]: Created slice kubepods-besteffort-podb9e3b2c3_34d9_4ede_8d01_9b7b73956f78.slice - libcontainer container kubepods-besteffort-podb9e3b2c3_34d9_4ede_8d01_9b7b73956f78.slice. 
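The `kubepods-besteffort-pod<uid>.slice` units created above embed the pod UID with its dashes rewritten to underscores, because `-` separates slice path components in systemd unit names. A sketch of the mapping (the helper name is hypothetical; kubelet's cgroup manager does the real conversion):

```python
def besteffort_slice_name(pod_uid: str) -> str:
    # '-' is the slice hierarchy separator in systemd unit names, so the
    # UID's dashes become '_' when embedded in the unit name.
    return f"kubepods-besteffort-pod{pod_uid.replace('-', '_')}.slice"

print(besteffort_slice_name("b9e3b2c3-34d9-4ede-8d01-9b7b73956f78"))
# kubepods-besteffort-podb9e3b2c3_34d9_4ede_8d01_9b7b73956f78.slice
```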
Aug 19 08:17:18.327627 kubelet[3115]: I0819 08:17:18.327542 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/b9e3b2c3-34d9-4ede-8d01-9b7b73956f78-cni-net-dir\") pod \"calico-node-n2h2x\" (UID: \"b9e3b2c3-34d9-4ede-8d01-9b7b73956f78\") " pod="calico-system/calico-node-n2h2x" Aug 19 08:17:18.327627 kubelet[3115]: I0819 08:17:18.327583 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/b9e3b2c3-34d9-4ede-8d01-9b7b73956f78-policysync\") pod \"calico-node-n2h2x\" (UID: \"b9e3b2c3-34d9-4ede-8d01-9b7b73956f78\") " pod="calico-system/calico-node-n2h2x" Aug 19 08:17:18.327627 kubelet[3115]: I0819 08:17:18.327602 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/b9e3b2c3-34d9-4ede-8d01-9b7b73956f78-node-certs\") pod \"calico-node-n2h2x\" (UID: \"b9e3b2c3-34d9-4ede-8d01-9b7b73956f78\") " pod="calico-system/calico-node-n2h2x" Aug 19 08:17:18.327627 kubelet[3115]: I0819 08:17:18.327630 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/b9e3b2c3-34d9-4ede-8d01-9b7b73956f78-flexvol-driver-host\") pod \"calico-node-n2h2x\" (UID: \"b9e3b2c3-34d9-4ede-8d01-9b7b73956f78\") " pod="calico-system/calico-node-n2h2x" Aug 19 08:17:18.327904 kubelet[3115]: I0819 08:17:18.327661 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/b9e3b2c3-34d9-4ede-8d01-9b7b73956f78-var-run-calico\") pod \"calico-node-n2h2x\" (UID: \"b9e3b2c3-34d9-4ede-8d01-9b7b73956f78\") " pod="calico-system/calico-node-n2h2x" Aug 19 08:17:18.327904 kubelet[3115]: I0819 08:17:18.327687 3115 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/b9e3b2c3-34d9-4ede-8d01-9b7b73956f78-xtables-lock\") pod \"calico-node-n2h2x\" (UID: \"b9e3b2c3-34d9-4ede-8d01-9b7b73956f78\") " pod="calico-system/calico-node-n2h2x" Aug 19 08:17:18.327904 kubelet[3115]: I0819 08:17:18.327708 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78m25\" (UniqueName: \"kubernetes.io/projected/b9e3b2c3-34d9-4ede-8d01-9b7b73956f78-kube-api-access-78m25\") pod \"calico-node-n2h2x\" (UID: \"b9e3b2c3-34d9-4ede-8d01-9b7b73956f78\") " pod="calico-system/calico-node-n2h2x" Aug 19 08:17:18.327904 kubelet[3115]: I0819 08:17:18.327723 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/b9e3b2c3-34d9-4ede-8d01-9b7b73956f78-cni-log-dir\") pod \"calico-node-n2h2x\" (UID: \"b9e3b2c3-34d9-4ede-8d01-9b7b73956f78\") " pod="calico-system/calico-node-n2h2x" Aug 19 08:17:18.327904 kubelet[3115]: I0819 08:17:18.327743 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b9e3b2c3-34d9-4ede-8d01-9b7b73956f78-lib-modules\") pod \"calico-node-n2h2x\" (UID: \"b9e3b2c3-34d9-4ede-8d01-9b7b73956f78\") " pod="calico-system/calico-node-n2h2x" Aug 19 08:17:18.328005 kubelet[3115]: I0819 08:17:18.327775 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/b9e3b2c3-34d9-4ede-8d01-9b7b73956f78-cni-bin-dir\") pod \"calico-node-n2h2x\" (UID: \"b9e3b2c3-34d9-4ede-8d01-9b7b73956f78\") " pod="calico-system/calico-node-n2h2x" Aug 19 08:17:18.328005 kubelet[3115]: I0819 08:17:18.327795 3115 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/b9e3b2c3-34d9-4ede-8d01-9b7b73956f78-var-lib-calico\") pod \"calico-node-n2h2x\" (UID: \"b9e3b2c3-34d9-4ede-8d01-9b7b73956f78\") " pod="calico-system/calico-node-n2h2x" Aug 19 08:17:18.328005 kubelet[3115]: I0819 08:17:18.327827 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9e3b2c3-34d9-4ede-8d01-9b7b73956f78-tigera-ca-bundle\") pod \"calico-node-n2h2x\" (UID: \"b9e3b2c3-34d9-4ede-8d01-9b7b73956f78\") " pod="calico-system/calico-node-n2h2x" Aug 19 08:17:18.340904 containerd[1722]: time="2025-08-19T08:17:18.340427926Z" level=info msg="connecting to shim 4a3f4ec70f43f596d2582b1d9653002d170614e79b6907f6d5ec409b77274b06" address="unix:///run/containerd/s/3184d7ad6ed51d29ac401c7fa8301190274c20d8fb5edcdbc1b0bba63533f127" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:17:18.362180 systemd[1]: Started cri-containerd-4a3f4ec70f43f596d2582b1d9653002d170614e79b6907f6d5ec409b77274b06.scope - libcontainer container 4a3f4ec70f43f596d2582b1d9653002d170614e79b6907f6d5ec409b77274b06. 
Aug 19 08:17:18.415241 containerd[1722]: time="2025-08-19T08:17:18.415177027Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7fcb6746f-p9t8c,Uid:b3373c25-af5c-462d-80a7-6003a55f0ff6,Namespace:calico-system,Attempt:0,} returns sandbox id \"4a3f4ec70f43f596d2582b1d9653002d170614e79b6907f6d5ec409b77274b06\"" Aug 19 08:17:18.416363 containerd[1722]: time="2025-08-19T08:17:18.416296965Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\"" Aug 19 08:17:18.436009 kubelet[3115]: E0819 08:17:18.435949 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.436009 kubelet[3115]: W0819 08:17:18.435967 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.436232 kubelet[3115]: E0819 08:17:18.435991 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:18.436381 kubelet[3115]: E0819 08:17:18.436354 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.436381 kubelet[3115]: W0819 08:17:18.436364 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.436471 kubelet[3115]: E0819 08:17:18.436443 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:18.440277 kubelet[3115]: E0819 08:17:18.440257 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.440277 kubelet[3115]: W0819 08:17:18.440273 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.440370 kubelet[3115]: E0819 08:17:18.440287 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:18.552130 kubelet[3115]: E0819 08:17:18.551813 3115 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rw2mz" podUID="0d6984f4-4de2-408b-9ee3-92edac203b52" Aug 19 08:17:18.564023 containerd[1722]: time="2025-08-19T08:17:18.563992203Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-n2h2x,Uid:b9e3b2c3-34d9-4ede-8d01-9b7b73956f78,Namespace:calico-system,Attempt:0,}" Aug 19 08:17:18.606868 containerd[1722]: time="2025-08-19T08:17:18.606821475Z" level=info msg="connecting to shim 4564ecc3ba0262c7733e22741c54ac0740479f5e66f53c77add88d66c61ceaeb" address="unix:///run/containerd/s/cfd0587e3d2bd234b94da1980c1b17db3d83728164009b414766d8363fc44a26" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:17:18.623673 kubelet[3115]: E0819 08:17:18.623655 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.623905 kubelet[3115]: W0819 08:17:18.623891 3115 driver-call.go:149] FlexVolume: driver call failed: executable: 
/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.624272 kubelet[3115]: E0819 08:17:18.624003 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:18.624680 kubelet[3115]: E0819 08:17:18.624669 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.624820 kubelet[3115]: W0819 08:17:18.624745 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.625306 kubelet[3115]: E0819 08:17:18.624798 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:18.625194 systemd[1]: Started cri-containerd-4564ecc3ba0262c7733e22741c54ac0740479f5e66f53c77add88d66c61ceaeb.scope - libcontainer container 4564ecc3ba0262c7733e22741c54ac0740479f5e66f53c77add88d66c61ceaeb. Aug 19 08:17:18.627054 kubelet[3115]: E0819 08:17:18.626896 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.627054 kubelet[3115]: W0819 08:17:18.626938 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.627054 kubelet[3115]: E0819 08:17:18.626952 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:18.627311 kubelet[3115]: E0819 08:17:18.627291 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.627345 kubelet[3115]: W0819 08:17:18.627309 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.627345 kubelet[3115]: E0819 08:17:18.627324 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:18.627507 kubelet[3115]: E0819 08:17:18.627497 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.627538 kubelet[3115]: W0819 08:17:18.627508 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.627538 kubelet[3115]: E0819 08:17:18.627517 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:18.627650 kubelet[3115]: E0819 08:17:18.627642 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.627674 kubelet[3115]: W0819 08:17:18.627651 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.627674 kubelet[3115]: E0819 08:17:18.627657 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:18.627770 kubelet[3115]: E0819 08:17:18.627761 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.627770 kubelet[3115]: W0819 08:17:18.627769 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.627820 kubelet[3115]: E0819 08:17:18.627776 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:18.628032 kubelet[3115]: E0819 08:17:18.627889 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.628032 kubelet[3115]: W0819 08:17:18.627895 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.628032 kubelet[3115]: E0819 08:17:18.627902 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:18.628032 kubelet[3115]: E0819 08:17:18.628012 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.628032 kubelet[3115]: W0819 08:17:18.628028 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.628192 kubelet[3115]: E0819 08:17:18.628048 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:18.628192 kubelet[3115]: E0819 08:17:18.628153 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.628192 kubelet[3115]: W0819 08:17:18.628159 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.628192 kubelet[3115]: E0819 08:17:18.628165 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:18.628277 kubelet[3115]: E0819 08:17:18.628272 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.628303 kubelet[3115]: W0819 08:17:18.628278 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.628303 kubelet[3115]: E0819 08:17:18.628283 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:18.628433 kubelet[3115]: E0819 08:17:18.628378 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.628433 kubelet[3115]: W0819 08:17:18.628384 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.628433 kubelet[3115]: E0819 08:17:18.628390 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:18.628509 kubelet[3115]: E0819 08:17:18.628501 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.628534 kubelet[3115]: W0819 08:17:18.628509 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.628534 kubelet[3115]: E0819 08:17:18.628516 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:18.628615 kubelet[3115]: E0819 08:17:18.628608 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.628638 kubelet[3115]: W0819 08:17:18.628616 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.628638 kubelet[3115]: E0819 08:17:18.628622 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:18.628946 kubelet[3115]: E0819 08:17:18.628719 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.628946 kubelet[3115]: W0819 08:17:18.628738 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.628946 kubelet[3115]: E0819 08:17:18.628744 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:18.628946 kubelet[3115]: E0819 08:17:18.628840 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.628946 kubelet[3115]: W0819 08:17:18.628844 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.628946 kubelet[3115]: E0819 08:17:18.628850 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:18.629117 kubelet[3115]: E0819 08:17:18.628954 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.629117 kubelet[3115]: W0819 08:17:18.628970 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.629117 kubelet[3115]: E0819 08:17:18.628977 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:18.629117 kubelet[3115]: E0819 08:17:18.629099 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.629117 kubelet[3115]: W0819 08:17:18.629104 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.629117 kubelet[3115]: E0819 08:17:18.629110 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:18.629247 kubelet[3115]: E0819 08:17:18.629210 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.629247 kubelet[3115]: W0819 08:17:18.629232 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.629247 kubelet[3115]: E0819 08:17:18.629238 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:18.629457 kubelet[3115]: E0819 08:17:18.629338 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.629457 kubelet[3115]: W0819 08:17:18.629347 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.629457 kubelet[3115]: E0819 08:17:18.629353 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:18.631155 kubelet[3115]: E0819 08:17:18.631135 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.631155 kubelet[3115]: W0819 08:17:18.631155 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.631263 kubelet[3115]: E0819 08:17:18.631168 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:18.631683 kubelet[3115]: I0819 08:17:18.631653 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/0d6984f4-4de2-408b-9ee3-92edac203b52-varrun\") pod \"csi-node-driver-rw2mz\" (UID: \"0d6984f4-4de2-408b-9ee3-92edac203b52\") " pod="calico-system/csi-node-driver-rw2mz" Aug 19 08:17:18.632186 kubelet[3115]: E0819 08:17:18.632168 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.632186 kubelet[3115]: W0819 08:17:18.632184 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.632389 kubelet[3115]: E0819 08:17:18.632330 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:18.632711 kubelet[3115]: E0819 08:17:18.632692 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.632711 kubelet[3115]: W0819 08:17:18.632710 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.632785 kubelet[3115]: E0819 08:17:18.632769 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:18.633024 kubelet[3115]: E0819 08:17:18.633005 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.633024 kubelet[3115]: W0819 08:17:18.633020 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.633311 kubelet[3115]: E0819 08:17:18.633032 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:18.633311 kubelet[3115]: I0819 08:17:18.633115 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0d6984f4-4de2-408b-9ee3-92edac203b52-kubelet-dir\") pod \"csi-node-driver-rw2mz\" (UID: \"0d6984f4-4de2-408b-9ee3-92edac203b52\") " pod="calico-system/csi-node-driver-rw2mz" Aug 19 08:17:18.633380 kubelet[3115]: E0819 08:17:18.633367 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.633404 kubelet[3115]: W0819 08:17:18.633382 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.634096 kubelet[3115]: E0819 08:17:18.634071 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:18.634181 kubelet[3115]: I0819 08:17:18.634168 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0d6984f4-4de2-408b-9ee3-92edac203b52-registration-dir\") pod \"csi-node-driver-rw2mz\" (UID: \"0d6984f4-4de2-408b-9ee3-92edac203b52\") " pod="calico-system/csi-node-driver-rw2mz" Aug 19 08:17:18.634330 kubelet[3115]: E0819 08:17:18.634319 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.634491 kubelet[3115]: W0819 08:17:18.634331 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.635118 kubelet[3115]: E0819 08:17:18.635087 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:18.635379 kubelet[3115]: E0819 08:17:18.635359 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.635514 kubelet[3115]: W0819 08:17:18.635379 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.635514 kubelet[3115]: E0819 08:17:18.635400 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:18.636213 kubelet[3115]: E0819 08:17:18.636196 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.636213 kubelet[3115]: W0819 08:17:18.636212 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.636310 kubelet[3115]: E0819 08:17:18.636232 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:18.636310 kubelet[3115]: I0819 08:17:18.636252 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0d6984f4-4de2-408b-9ee3-92edac203b52-socket-dir\") pod \"csi-node-driver-rw2mz\" (UID: \"0d6984f4-4de2-408b-9ee3-92edac203b52\") " pod="calico-system/csi-node-driver-rw2mz" Aug 19 08:17:18.636406 kubelet[3115]: E0819 08:17:18.636395 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.636442 kubelet[3115]: W0819 08:17:18.636406 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.637747 kubelet[3115]: E0819 08:17:18.637724 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:18.637843 kubelet[3115]: I0819 08:17:18.637760 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phzlr\" (UniqueName: \"kubernetes.io/projected/0d6984f4-4de2-408b-9ee3-92edac203b52-kube-api-access-phzlr\") pod \"csi-node-driver-rw2mz\" (UID: \"0d6984f4-4de2-408b-9ee3-92edac203b52\") " pod="calico-system/csi-node-driver-rw2mz" Aug 19 08:17:18.637973 kubelet[3115]: E0819 08:17:18.637954 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.637973 kubelet[3115]: W0819 08:17:18.637968 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.638111 kubelet[3115]: E0819 08:17:18.638052 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:18.638111 kubelet[3115]: E0819 08:17:18.638098 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.638111 kubelet[3115]: W0819 08:17:18.638105 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.638197 kubelet[3115]: E0819 08:17:18.638176 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:18.638261 kubelet[3115]: E0819 08:17:18.638251 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.638261 kubelet[3115]: W0819 08:17:18.638260 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.638327 kubelet[3115]: E0819 08:17:18.638274 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:18.638390 kubelet[3115]: E0819 08:17:18.638375 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.638390 kubelet[3115]: W0819 08:17:18.638384 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.638441 kubelet[3115]: E0819 08:17:18.638391 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:18.638523 kubelet[3115]: E0819 08:17:18.638514 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.638555 kubelet[3115]: W0819 08:17:18.638525 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.638555 kubelet[3115]: E0819 08:17:18.638532 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:18.639239 kubelet[3115]: E0819 08:17:18.639208 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.639239 kubelet[3115]: W0819 08:17:18.639224 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.639239 kubelet[3115]: E0819 08:17:18.639238 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:18.664772 containerd[1722]: time="2025-08-19T08:17:18.664721325Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-n2h2x,Uid:b9e3b2c3-34d9-4ede-8d01-9b7b73956f78,Namespace:calico-system,Attempt:0,} returns sandbox id \"4564ecc3ba0262c7733e22741c54ac0740479f5e66f53c77add88d66c61ceaeb\"" Aug 19 08:17:18.738984 kubelet[3115]: E0819 08:17:18.738967 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.738984 kubelet[3115]: W0819 08:17:18.738983 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.739104 kubelet[3115]: E0819 08:17:18.738995 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:18.739402 kubelet[3115]: E0819 08:17:18.739384 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.739402 kubelet[3115]: W0819 08:17:18.739398 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.739522 kubelet[3115]: E0819 08:17:18.739510 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:18.739849 kubelet[3115]: E0819 08:17:18.739834 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.739849 kubelet[3115]: W0819 08:17:18.739848 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.740145 kubelet[3115]: E0819 08:17:18.739863 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:18.740235 kubelet[3115]: E0819 08:17:18.740225 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.740263 kubelet[3115]: W0819 08:17:18.740237 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.740347 kubelet[3115]: E0819 08:17:18.740295 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:18.740643 kubelet[3115]: E0819 08:17:18.740631 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.740643 kubelet[3115]: W0819 08:17:18.740642 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.740758 kubelet[3115]: E0819 08:17:18.740746 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:18.740944 kubelet[3115]: E0819 08:17:18.740932 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.741011 kubelet[3115]: W0819 08:17:18.740944 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.741011 kubelet[3115]: E0819 08:17:18.740964 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:18.741149 kubelet[3115]: E0819 08:17:18.741097 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.741149 kubelet[3115]: W0819 08:17:18.741103 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.742072 kubelet[3115]: E0819 08:17:18.741222 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.742072 kubelet[3115]: W0819 08:17:18.741229 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.742072 kubelet[3115]: E0819 08:17:18.741236 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:18.742072 kubelet[3115]: E0819 08:17:18.741332 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.742072 kubelet[3115]: W0819 08:17:18.741337 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.742072 kubelet[3115]: E0819 08:17:18.741343 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:18.742072 kubelet[3115]: E0819 08:17:18.741443 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.742072 kubelet[3115]: W0819 08:17:18.741448 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.742072 kubelet[3115]: E0819 08:17:18.741455 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:18.742072 kubelet[3115]: E0819 08:17:18.741566 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.742351 kubelet[3115]: W0819 08:17:18.741571 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.742351 kubelet[3115]: E0819 08:17:18.741577 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:18.742351 kubelet[3115]: E0819 08:17:18.741789 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:18.742351 kubelet[3115]: E0819 08:17:18.741945 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.742351 kubelet[3115]: W0819 08:17:18.741953 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.742351 kubelet[3115]: E0819 08:17:18.741963 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:18.742351 kubelet[3115]: E0819 08:17:18.742092 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.742351 kubelet[3115]: W0819 08:17:18.742098 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.742351 kubelet[3115]: E0819 08:17:18.742138 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:18.742351 kubelet[3115]: E0819 08:17:18.742312 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.742570 kubelet[3115]: W0819 08:17:18.742317 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.742570 kubelet[3115]: E0819 08:17:18.742333 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:18.742570 kubelet[3115]: E0819 08:17:18.742413 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.742570 kubelet[3115]: W0819 08:17:18.742418 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.742570 kubelet[3115]: E0819 08:17:18.742446 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:18.742570 kubelet[3115]: E0819 08:17:18.742532 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.742570 kubelet[3115]: W0819 08:17:18.742536 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.743077 kubelet[3115]: E0819 08:17:18.742600 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:18.743077 kubelet[3115]: E0819 08:17:18.742663 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.743077 kubelet[3115]: W0819 08:17:18.742682 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.743077 kubelet[3115]: E0819 08:17:18.742690 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:18.743077 kubelet[3115]: E0819 08:17:18.742784 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.743077 kubelet[3115]: W0819 08:17:18.742789 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.743077 kubelet[3115]: E0819 08:17:18.742795 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:18.743077 kubelet[3115]: E0819 08:17:18.742877 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.743077 kubelet[3115]: W0819 08:17:18.742882 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.743077 kubelet[3115]: E0819 08:17:18.742893 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:18.743276 kubelet[3115]: E0819 08:17:18.743018 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.743276 kubelet[3115]: W0819 08:17:18.743023 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.743276 kubelet[3115]: E0819 08:17:18.743030 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:18.743276 kubelet[3115]: E0819 08:17:18.743135 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.743276 kubelet[3115]: W0819 08:17:18.743140 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.743276 kubelet[3115]: E0819 08:17:18.743146 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:18.743276 kubelet[3115]: E0819 08:17:18.743223 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.743276 kubelet[3115]: W0819 08:17:18.743245 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.743276 kubelet[3115]: E0819 08:17:18.743252 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:18.743445 kubelet[3115]: E0819 08:17:18.743359 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.743445 kubelet[3115]: W0819 08:17:18.743363 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.743445 kubelet[3115]: E0819 08:17:18.743369 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:18.743505 kubelet[3115]: E0819 08:17:18.743463 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.743505 kubelet[3115]: W0819 08:17:18.743468 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.743505 kubelet[3115]: E0819 08:17:18.743473 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:18.744361 kubelet[3115]: E0819 08:17:18.743569 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.744361 kubelet[3115]: W0819 08:17:18.743576 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.744361 kubelet[3115]: E0819 08:17:18.743581 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:18.787488 kubelet[3115]: E0819 08:17:18.787475 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:18.787599 kubelet[3115]: W0819 08:17:18.787563 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:18.787599 kubelet[3115]: E0819 08:17:18.787576 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:20.818069 kubelet[3115]: E0819 08:17:20.817609 3115 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rw2mz" podUID="0d6984f4-4de2-408b-9ee3-92edac203b52" Aug 19 08:17:21.047583 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2189517319.mount: Deactivated successfully. 
Aug 19 08:17:21.624592 containerd[1722]: time="2025-08-19T08:17:21.624553956Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:21.627140 containerd[1722]: time="2025-08-19T08:17:21.627109287Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.2: active requests=0, bytes read=35233364" Aug 19 08:17:21.630670 containerd[1722]: time="2025-08-19T08:17:21.630631709Z" level=info msg="ImageCreate event name:\"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:21.637139 containerd[1722]: time="2025-08-19T08:17:21.637090607Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:21.637521 containerd[1722]: time="2025-08-19T08:17:21.637425958Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.2\" with image id \"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\", size \"35233218\" in 3.221099545s" Aug 19 08:17:21.637521 containerd[1722]: time="2025-08-19T08:17:21.637454254Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\" returns image reference \"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\"" Aug 19 08:17:21.638460 containerd[1722]: time="2025-08-19T08:17:21.638392910Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\"" Aug 19 08:17:21.650185 containerd[1722]: time="2025-08-19T08:17:21.649698058Z" level=info msg="CreateContainer within sandbox \"4a3f4ec70f43f596d2582b1d9653002d170614e79b6907f6d5ec409b77274b06\" for 
container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Aug 19 08:17:21.683005 containerd[1722]: time="2025-08-19T08:17:21.682980507Z" level=info msg="Container c73491457fa946960f90cef229a869c9538ec1e7f88350abdf256bff4fbe68f3: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:17:21.713557 containerd[1722]: time="2025-08-19T08:17:21.713533980Z" level=info msg="CreateContainer within sandbox \"4a3f4ec70f43f596d2582b1d9653002d170614e79b6907f6d5ec409b77274b06\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"c73491457fa946960f90cef229a869c9538ec1e7f88350abdf256bff4fbe68f3\"" Aug 19 08:17:21.713972 containerd[1722]: time="2025-08-19T08:17:21.713911116Z" level=info msg="StartContainer for \"c73491457fa946960f90cef229a869c9538ec1e7f88350abdf256bff4fbe68f3\"" Aug 19 08:17:21.714991 containerd[1722]: time="2025-08-19T08:17:21.714935725Z" level=info msg="connecting to shim c73491457fa946960f90cef229a869c9538ec1e7f88350abdf256bff4fbe68f3" address="unix:///run/containerd/s/3184d7ad6ed51d29ac401c7fa8301190274c20d8fb5edcdbc1b0bba63533f127" protocol=ttrpc version=3 Aug 19 08:17:21.730194 systemd[1]: Started cri-containerd-c73491457fa946960f90cef229a869c9538ec1e7f88350abdf256bff4fbe68f3.scope - libcontainer container c73491457fa946960f90cef229a869c9538ec1e7f88350abdf256bff4fbe68f3. 
Aug 19 08:17:21.772096 containerd[1722]: time="2025-08-19T08:17:21.772054904Z" level=info msg="StartContainer for \"c73491457fa946960f90cef229a869c9538ec1e7f88350abdf256bff4fbe68f3\" returns successfully" Aug 19 08:17:21.953899 kubelet[3115]: E0819 08:17:21.953175 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:21.953899 kubelet[3115]: W0819 08:17:21.953195 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:21.953899 kubelet[3115]: E0819 08:17:21.953215 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:21.953899 kubelet[3115]: E0819 08:17:21.953373 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:21.953899 kubelet[3115]: W0819 08:17:21.953379 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:21.953899 kubelet[3115]: E0819 08:17:21.953386 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:21.953899 kubelet[3115]: E0819 08:17:21.953840 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:21.953899 kubelet[3115]: W0819 08:17:21.953852 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:21.953899 kubelet[3115]: E0819 08:17:21.953866 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:21.954614 kubelet[3115]: E0819 08:17:21.953989 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:21.954614 kubelet[3115]: W0819 08:17:21.953993 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:21.954614 kubelet[3115]: E0819 08:17:21.954001 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:21.954614 kubelet[3115]: E0819 08:17:21.954117 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:21.954614 kubelet[3115]: W0819 08:17:21.954122 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:21.954614 kubelet[3115]: E0819 08:17:21.954129 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:21.954614 kubelet[3115]: E0819 08:17:21.954209 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:21.954614 kubelet[3115]: W0819 08:17:21.954214 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:21.954614 kubelet[3115]: E0819 08:17:21.954221 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:21.954614 kubelet[3115]: E0819 08:17:21.954304 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:21.954823 kubelet[3115]: W0819 08:17:21.954309 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:21.954823 kubelet[3115]: E0819 08:17:21.954314 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:21.954823 kubelet[3115]: E0819 08:17:21.954393 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:21.954823 kubelet[3115]: W0819 08:17:21.954398 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:21.954823 kubelet[3115]: E0819 08:17:21.954404 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:21.954823 kubelet[3115]: E0819 08:17:21.954494 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:21.954823 kubelet[3115]: W0819 08:17:21.954500 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:21.954823 kubelet[3115]: E0819 08:17:21.954506 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:21.954823 kubelet[3115]: E0819 08:17:21.954588 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:21.954823 kubelet[3115]: W0819 08:17:21.954593 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:21.955027 kubelet[3115]: E0819 08:17:21.954599 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:21.956177 kubelet[3115]: E0819 08:17:21.956158 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:21.956177 kubelet[3115]: W0819 08:17:21.956177 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:21.956279 kubelet[3115]: E0819 08:17:21.956193 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:21.956978 kubelet[3115]: E0819 08:17:21.956328 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:21.956978 kubelet[3115]: W0819 08:17:21.956335 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:21.956978 kubelet[3115]: E0819 08:17:21.956343 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:21.956978 kubelet[3115]: E0819 08:17:21.956443 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:21.956978 kubelet[3115]: W0819 08:17:21.956448 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:21.956978 kubelet[3115]: E0819 08:17:21.956454 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:21.956978 kubelet[3115]: E0819 08:17:21.956534 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:21.956978 kubelet[3115]: W0819 08:17:21.956538 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:21.956978 kubelet[3115]: E0819 08:17:21.956544 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:21.956978 kubelet[3115]: E0819 08:17:21.956626 3115 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:21.957381 kubelet[3115]: W0819 08:17:21.956630 3115 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:21.957381 kubelet[3115]: E0819 08:17:21.956636 3115 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:22.817436 kubelet[3115]: E0819 08:17:22.817121 3115 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rw2mz" podUID="0d6984f4-4de2-408b-9ee3-92edac203b52" Aug 19 08:17:22.893187 kubelet[3115]: I0819 08:17:22.893157 3115 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 19 08:17:23.087496 containerd[1722]: time="2025-08-19T08:17:23.087411479Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:23.146686 containerd[1722]: time="2025-08-19T08:17:23.146643995Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2: active requests=0, bytes read=4446956" Aug 19 08:17:23.150022 containerd[1722]: time="2025-08-19T08:17:23.149958151Z" level=info msg="ImageCreate event name:\"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:23.154162 containerd[1722]: time="2025-08-19T08:17:23.154109085Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:23.154787 containerd[1722]: time="2025-08-19T08:17:23.154491528Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" with image id \"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\", size \"5939619\" in 1.516071262s" Aug 19 08:17:23.154787 containerd[1722]: time="2025-08-19T08:17:23.154530114Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" returns image reference \"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\"" Aug 19 08:17:23.156237 containerd[1722]: time="2025-08-19T08:17:23.156215841Z" level=info msg="CreateContainer within sandbox \"4564ecc3ba0262c7733e22741c54ac0740479f5e66f53c77add88d66c61ceaeb\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Aug 19 08:17:23.175344 containerd[1722]: time="2025-08-19T08:17:23.175323464Z" level=info msg="Container 26927cbbb623bb1546a95186d306b362b03c8eea9e811e38b70087c32bae4420: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:17:23.204270 containerd[1722]: time="2025-08-19T08:17:23.204191484Z" level=info msg="CreateContainer within sandbox \"4564ecc3ba0262c7733e22741c54ac0740479f5e66f53c77add88d66c61ceaeb\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"26927cbbb623bb1546a95186d306b362b03c8eea9e811e38b70087c32bae4420\"" Aug 19 08:17:23.205070 containerd[1722]: time="2025-08-19T08:17:23.204726067Z" level=info msg="StartContainer for \"26927cbbb623bb1546a95186d306b362b03c8eea9e811e38b70087c32bae4420\"" Aug 19 08:17:23.209372 containerd[1722]: time="2025-08-19T08:17:23.209345274Z" level=info msg="connecting to shim 26927cbbb623bb1546a95186d306b362b03c8eea9e811e38b70087c32bae4420" address="unix:///run/containerd/s/cfd0587e3d2bd234b94da1980c1b17db3d83728164009b414766d8363fc44a26" protocol=ttrpc version=3 Aug 19 08:17:23.236196 systemd[1]: Started cri-containerd-26927cbbb623bb1546a95186d306b362b03c8eea9e811e38b70087c32bae4420.scope - libcontainer container 
26927cbbb623bb1546a95186d306b362b03c8eea9e811e38b70087c32bae4420. Aug 19 08:17:23.268139 kubelet[3115]: I0819 08:17:23.268020 3115 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-7fcb6746f-p9t8c" podStartSLOduration=3.045907573 podStartE2EDuration="6.268002992s" podCreationTimestamp="2025-08-19 08:17:17 +0000 UTC" firstStartedPulling="2025-08-19 08:17:18.416031329 +0000 UTC m=+17.677314845" lastFinishedPulling="2025-08-19 08:17:21.638126741 +0000 UTC m=+20.899410264" observedRunningTime="2025-08-19 08:17:21.916565274 +0000 UTC m=+21.177848796" watchObservedRunningTime="2025-08-19 08:17:23.268002992 +0000 UTC m=+22.529286518" Aug 19 08:17:23.273746 containerd[1722]: time="2025-08-19T08:17:23.273720579Z" level=info msg="StartContainer for \"26927cbbb623bb1546a95186d306b362b03c8eea9e811e38b70087c32bae4420\" returns successfully" Aug 19 08:17:23.278806 systemd[1]: cri-containerd-26927cbbb623bb1546a95186d306b362b03c8eea9e811e38b70087c32bae4420.scope: Deactivated successfully. Aug 19 08:17:23.283676 containerd[1722]: time="2025-08-19T08:17:23.283652025Z" level=info msg="received exit event container_id:\"26927cbbb623bb1546a95186d306b362b03c8eea9e811e38b70087c32bae4420\" id:\"26927cbbb623bb1546a95186d306b362b03c8eea9e811e38b70087c32bae4420\" pid:3841 exited_at:{seconds:1755591443 nanos:283310525}" Aug 19 08:17:23.284446 containerd[1722]: time="2025-08-19T08:17:23.284418147Z" level=info msg="TaskExit event in podsandbox handler container_id:\"26927cbbb623bb1546a95186d306b362b03c8eea9e811e38b70087c32bae4420\" id:\"26927cbbb623bb1546a95186d306b362b03c8eea9e811e38b70087c32bae4420\" pid:3841 exited_at:{seconds:1755591443 nanos:283310525}" Aug 19 08:17:23.305298 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-26927cbbb623bb1546a95186d306b362b03c8eea9e811e38b70087c32bae4420-rootfs.mount: Deactivated successfully. 
Aug 19 08:17:24.817378 kubelet[3115]: E0819 08:17:24.817275 3115 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rw2mz" podUID="0d6984f4-4de2-408b-9ee3-92edac203b52" Aug 19 08:17:25.901654 containerd[1722]: time="2025-08-19T08:17:25.901579323Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\"" Aug 19 08:17:26.818064 kubelet[3115]: E0819 08:17:26.817392 3115 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rw2mz" podUID="0d6984f4-4de2-408b-9ee3-92edac203b52" Aug 19 08:17:28.817934 kubelet[3115]: E0819 08:17:28.817672 3115 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rw2mz" podUID="0d6984f4-4de2-408b-9ee3-92edac203b52" Aug 19 08:17:29.004720 containerd[1722]: time="2025-08-19T08:17:29.004682012Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:29.008333 containerd[1722]: time="2025-08-19T08:17:29.008301458Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.2: active requests=0, bytes read=70436221" Aug 19 08:17:29.014201 containerd[1722]: time="2025-08-19T08:17:29.014159282Z" level=info msg="ImageCreate event name:\"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:29.024002 containerd[1722]: 
time="2025-08-19T08:17:29.023957040Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:29.024414 containerd[1722]: time="2025-08-19T08:17:29.024394674Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.2\" with image id \"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\", size \"71928924\" in 3.122312777s" Aug 19 08:17:29.024482 containerd[1722]: time="2025-08-19T08:17:29.024471317Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\" returns image reference \"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\"" Aug 19 08:17:29.026218 containerd[1722]: time="2025-08-19T08:17:29.026185682Z" level=info msg="CreateContainer within sandbox \"4564ecc3ba0262c7733e22741c54ac0740479f5e66f53c77add88d66c61ceaeb\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Aug 19 08:17:29.051775 containerd[1722]: time="2025-08-19T08:17:29.051749324Z" level=info msg="Container 7b49f89a3016cf458a0f349ee5c71365ad5adaaa3cd63e69fc390b8e2e36e887: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:17:29.090086 containerd[1722]: time="2025-08-19T08:17:29.089974091Z" level=info msg="CreateContainer within sandbox \"4564ecc3ba0262c7733e22741c54ac0740479f5e66f53c77add88d66c61ceaeb\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"7b49f89a3016cf458a0f349ee5c71365ad5adaaa3cd63e69fc390b8e2e36e887\"" Aug 19 08:17:29.090707 containerd[1722]: time="2025-08-19T08:17:29.090669152Z" level=info msg="StartContainer for \"7b49f89a3016cf458a0f349ee5c71365ad5adaaa3cd63e69fc390b8e2e36e887\"" Aug 19 08:17:29.092318 containerd[1722]: time="2025-08-19T08:17:29.092292277Z" 
level=info msg="connecting to shim 7b49f89a3016cf458a0f349ee5c71365ad5adaaa3cd63e69fc390b8e2e36e887" address="unix:///run/containerd/s/cfd0587e3d2bd234b94da1980c1b17db3d83728164009b414766d8363fc44a26" protocol=ttrpc version=3 Aug 19 08:17:29.111187 systemd[1]: Started cri-containerd-7b49f89a3016cf458a0f349ee5c71365ad5adaaa3cd63e69fc390b8e2e36e887.scope - libcontainer container 7b49f89a3016cf458a0f349ee5c71365ad5adaaa3cd63e69fc390b8e2e36e887. Aug 19 08:17:29.143974 containerd[1722]: time="2025-08-19T08:17:29.143944314Z" level=info msg="StartContainer for \"7b49f89a3016cf458a0f349ee5c71365ad5adaaa3cd63e69fc390b8e2e36e887\" returns successfully" Aug 19 08:17:30.402713 containerd[1722]: time="2025-08-19T08:17:30.402666217Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Aug 19 08:17:30.404311 systemd[1]: cri-containerd-7b49f89a3016cf458a0f349ee5c71365ad5adaaa3cd63e69fc390b8e2e36e887.scope: Deactivated successfully. Aug 19 08:17:30.404845 systemd[1]: cri-containerd-7b49f89a3016cf458a0f349ee5c71365ad5adaaa3cd63e69fc390b8e2e36e887.scope: Consumed 387ms CPU time, 190.7M memory peak, 171.2M written to disk. 
Aug 19 08:17:30.406488 containerd[1722]: time="2025-08-19T08:17:30.406339090Z" level=info msg="received exit event container_id:\"7b49f89a3016cf458a0f349ee5c71365ad5adaaa3cd63e69fc390b8e2e36e887\" id:\"7b49f89a3016cf458a0f349ee5c71365ad5adaaa3cd63e69fc390b8e2e36e887\" pid:3906 exited_at:{seconds:1755591450 nanos:405852264}" Aug 19 08:17:30.406488 containerd[1722]: time="2025-08-19T08:17:30.406374607Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7b49f89a3016cf458a0f349ee5c71365ad5adaaa3cd63e69fc390b8e2e36e887\" id:\"7b49f89a3016cf458a0f349ee5c71365ad5adaaa3cd63e69fc390b8e2e36e887\" pid:3906 exited_at:{seconds:1755591450 nanos:405852264}" Aug 19 08:17:30.407502 kubelet[3115]: I0819 08:17:30.407241 3115 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Aug 19 08:17:30.431838 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-7b49f89a3016cf458a0f349ee5c71365ad5adaaa3cd63e69fc390b8e2e36e887-rootfs.mount: Deactivated successfully. Aug 19 08:17:30.454247 systemd[1]: Created slice kubepods-burstable-pod0b186b0f_8823_43c8_9fad_d97a1d8bc250.slice - libcontainer container kubepods-burstable-pod0b186b0f_8823_43c8_9fad_d97a1d8bc250.slice. Aug 19 08:17:30.463217 systemd[1]: Created slice kubepods-burstable-pod6637040d_4229_4a0c_9583_c7b1e340ca6c.slice - libcontainer container kubepods-burstable-pod6637040d_4229_4a0c_9583_c7b1e340ca6c.slice. 
Aug 19 08:17:30.466661 kubelet[3115]: W0819 08:17:30.466541 3115 reflector.go:561] object-"calico-system"/"goldmane-ca-bundle": failed to list *v1.ConfigMap: configmaps "goldmane-ca-bundle" is forbidden: User "system:node:ci-4426.0.0-a-5588c1b4cf" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'ci-4426.0.0-a-5588c1b4cf' and this object Aug 19 08:17:30.469162 kubelet[3115]: E0819 08:17:30.468866 3115 reflector.go:158] "Unhandled Error" err="object-\"calico-system\"/\"goldmane-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"goldmane-ca-bundle\" is forbidden: User \"system:node:ci-4426.0.0-a-5588c1b4cf\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4426.0.0-a-5588c1b4cf' and this object" logger="UnhandledError" Aug 19 08:17:30.472576 systemd[1]: Created slice kubepods-besteffort-pod9c9ca255_d68b_47ca_8107_0f46f9e9d40c.slice - libcontainer container kubepods-besteffort-pod9c9ca255_d68b_47ca_8107_0f46f9e9d40c.slice. Aug 19 08:17:30.488223 systemd[1]: Created slice kubepods-besteffort-podafc774c5_998d_4b55_896a_2b3ba14bd2bf.slice - libcontainer container kubepods-besteffort-podafc774c5_998d_4b55_896a_2b3ba14bd2bf.slice. Aug 19 08:17:30.495877 systemd[1]: Created slice kubepods-besteffort-pod2bb59b89_ca2b_49e1_bc22_00af9770adb2.slice - libcontainer container kubepods-besteffort-pod2bb59b89_ca2b_49e1_bc22_00af9770adb2.slice. Aug 19 08:17:30.501783 systemd[1]: Created slice kubepods-besteffort-pod5acb51bf_65bc_4f86_93a4_1cc021ca500f.slice - libcontainer container kubepods-besteffort-pod5acb51bf_65bc_4f86_93a4_1cc021ca500f.slice. Aug 19 08:17:30.508355 systemd[1]: Created slice kubepods-besteffort-pod1fb141e5_61fb_48ee_bab2_be40ed742a47.slice - libcontainer container kubepods-besteffort-pod1fb141e5_61fb_48ee_bab2_be40ed742a47.slice. 
Aug 19 08:17:30.513164 systemd[1]: Created slice kubepods-besteffort-pod977f07be_ddc3_40de_8dad_97e5d4d17950.slice - libcontainer container kubepods-besteffort-pod977f07be_ddc3_40de_8dad_97e5d4d17950.slice. Aug 19 08:17:30.527176 kubelet[3115]: I0819 08:17:30.527152 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnddt\" (UniqueName: \"kubernetes.io/projected/afc774c5-998d-4b55-896a-2b3ba14bd2bf-kube-api-access-lnddt\") pod \"calico-kube-controllers-86896c85c-k2466\" (UID: \"afc774c5-998d-4b55-896a-2b3ba14bd2bf\") " pod="calico-system/calico-kube-controllers-86896c85c-k2466" Aug 19 08:17:30.527272 kubelet[3115]: I0819 08:17:30.527257 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klwwc\" (UniqueName: \"kubernetes.io/projected/5acb51bf-65bc-4f86-93a4-1cc021ca500f-kube-api-access-klwwc\") pod \"calico-apiserver-655b7bdffd-hjd48\" (UID: \"5acb51bf-65bc-4f86-93a4-1cc021ca500f\") " pod="calico-apiserver/calico-apiserver-655b7bdffd-hjd48" Aug 19 08:17:30.527309 kubelet[3115]: I0819 08:17:30.527282 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/afc774c5-998d-4b55-896a-2b3ba14bd2bf-tigera-ca-bundle\") pod \"calico-kube-controllers-86896c85c-k2466\" (UID: \"afc774c5-998d-4b55-896a-2b3ba14bd2bf\") " pod="calico-system/calico-kube-controllers-86896c85c-k2466" Aug 19 08:17:30.527365 kubelet[3115]: I0819 08:17:30.527312 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/9c9ca255-d68b-47ca-8107-0f46f9e9d40c-calico-apiserver-certs\") pod \"calico-apiserver-598d747447-gjlqd\" (UID: \"9c9ca255-d68b-47ca-8107-0f46f9e9d40c\") " pod="calico-apiserver/calico-apiserver-598d747447-gjlqd" Aug 19 08:17:30.527365 kubelet[3115]: 
I0819 08:17:30.527330 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/977f07be-ddc3-40de-8dad-97e5d4d17950-calico-apiserver-certs\") pod \"calico-apiserver-655b7bdffd-n6vc8\" (UID: \"977f07be-ddc3-40de-8dad-97e5d4d17950\") " pod="calico-apiserver/calico-apiserver-655b7bdffd-n6vc8" Aug 19 08:17:30.527365 kubelet[3115]: I0819 08:17:30.527347 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/2bb59b89-ca2b-49e1-bc22-00af9770adb2-goldmane-key-pair\") pod \"goldmane-58fd7646b9-h982d\" (UID: \"2bb59b89-ca2b-49e1-bc22-00af9770adb2\") " pod="calico-system/goldmane-58fd7646b9-h982d" Aug 19 08:17:30.527438 kubelet[3115]: I0819 08:17:30.527366 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85m2j\" (UniqueName: \"kubernetes.io/projected/1fb141e5-61fb-48ee-bab2-be40ed742a47-kube-api-access-85m2j\") pod \"whisker-5d584f45c9-8d4g5\" (UID: \"1fb141e5-61fb-48ee-bab2-be40ed742a47\") " pod="calico-system/whisker-5d584f45c9-8d4g5" Aug 19 08:17:30.527629 kubelet[3115]: I0819 08:17:30.527561 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b186b0f-8823-43c8-9fad-d97a1d8bc250-config-volume\") pod \"coredns-7c65d6cfc9-7mkbh\" (UID: \"0b186b0f-8823-43c8-9fad-d97a1d8bc250\") " pod="kube-system/coredns-7c65d6cfc9-7mkbh" Aug 19 08:17:30.527629 kubelet[3115]: I0819 08:17:30.527583 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-477br\" (UniqueName: \"kubernetes.io/projected/6637040d-4229-4a0c-9583-c7b1e340ca6c-kube-api-access-477br\") pod \"coredns-7c65d6cfc9-wkmf2\" (UID: \"6637040d-4229-4a0c-9583-c7b1e340ca6c\") " 
pod="kube-system/coredns-7c65d6cfc9-wkmf2" Aug 19 08:17:30.527629 kubelet[3115]: I0819 08:17:30.527604 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5kk9\" (UniqueName: \"kubernetes.io/projected/977f07be-ddc3-40de-8dad-97e5d4d17950-kube-api-access-b5kk9\") pod \"calico-apiserver-655b7bdffd-n6vc8\" (UID: \"977f07be-ddc3-40de-8dad-97e5d4d17950\") " pod="calico-apiserver/calico-apiserver-655b7bdffd-n6vc8" Aug 19 08:17:30.527801 kubelet[3115]: I0819 08:17:30.527768 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bb59b89-ca2b-49e1-bc22-00af9770adb2-config\") pod \"goldmane-58fd7646b9-h982d\" (UID: \"2bb59b89-ca2b-49e1-bc22-00af9770adb2\") " pod="calico-system/goldmane-58fd7646b9-h982d" Aug 19 08:17:30.527932 kubelet[3115]: I0819 08:17:30.527873 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2bb59b89-ca2b-49e1-bc22-00af9770adb2-goldmane-ca-bundle\") pod \"goldmane-58fd7646b9-h982d\" (UID: \"2bb59b89-ca2b-49e1-bc22-00af9770adb2\") " pod="calico-system/goldmane-58fd7646b9-h982d" Aug 19 08:17:30.527932 kubelet[3115]: I0819 08:17:30.527897 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/1fb141e5-61fb-48ee-bab2-be40ed742a47-whisker-backend-key-pair\") pod \"whisker-5d584f45c9-8d4g5\" (UID: \"1fb141e5-61fb-48ee-bab2-be40ed742a47\") " pod="calico-system/whisker-5d584f45c9-8d4g5" Aug 19 08:17:30.527932 kubelet[3115]: I0819 08:17:30.527914 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1fb141e5-61fb-48ee-bab2-be40ed742a47-whisker-ca-bundle\") pod 
\"whisker-5d584f45c9-8d4g5\" (UID: \"1fb141e5-61fb-48ee-bab2-be40ed742a47\") " pod="calico-system/whisker-5d584f45c9-8d4g5" Aug 19 08:17:30.528234 kubelet[3115]: I0819 08:17:30.528119 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g27d7\" (UniqueName: \"kubernetes.io/projected/9c9ca255-d68b-47ca-8107-0f46f9e9d40c-kube-api-access-g27d7\") pod \"calico-apiserver-598d747447-gjlqd\" (UID: \"9c9ca255-d68b-47ca-8107-0f46f9e9d40c\") " pod="calico-apiserver/calico-apiserver-598d747447-gjlqd" Aug 19 08:17:30.528234 kubelet[3115]: I0819 08:17:30.528158 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6637040d-4229-4a0c-9583-c7b1e340ca6c-config-volume\") pod \"coredns-7c65d6cfc9-wkmf2\" (UID: \"6637040d-4229-4a0c-9583-c7b1e340ca6c\") " pod="kube-system/coredns-7c65d6cfc9-wkmf2" Aug 19 08:17:30.528469 kubelet[3115]: I0819 08:17:30.528185 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/5acb51bf-65bc-4f86-93a4-1cc021ca500f-calico-apiserver-certs\") pod \"calico-apiserver-655b7bdffd-hjd48\" (UID: \"5acb51bf-65bc-4f86-93a4-1cc021ca500f\") " pod="calico-apiserver/calico-apiserver-655b7bdffd-hjd48" Aug 19 08:17:30.528469 kubelet[3115]: I0819 08:17:30.528346 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q58n9\" (UniqueName: \"kubernetes.io/projected/2bb59b89-ca2b-49e1-bc22-00af9770adb2-kube-api-access-q58n9\") pod \"goldmane-58fd7646b9-h982d\" (UID: \"2bb59b89-ca2b-49e1-bc22-00af9770adb2\") " pod="calico-system/goldmane-58fd7646b9-h982d" Aug 19 08:17:30.528469 kubelet[3115]: I0819 08:17:30.528366 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qth4\" 
(UniqueName: \"kubernetes.io/projected/0b186b0f-8823-43c8-9fad-d97a1d8bc250-kube-api-access-2qth4\") pod \"coredns-7c65d6cfc9-7mkbh\" (UID: \"0b186b0f-8823-43c8-9fad-d97a1d8bc250\") " pod="kube-system/coredns-7c65d6cfc9-7mkbh" Aug 19 08:17:30.762342 containerd[1722]: time="2025-08-19T08:17:30.762308695Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-7mkbh,Uid:0b186b0f-8823-43c8-9fad-d97a1d8bc250,Namespace:kube-system,Attempt:0,}" Aug 19 08:17:30.770926 containerd[1722]: time="2025-08-19T08:17:30.770900297Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-wkmf2,Uid:6637040d-4229-4a0c-9583-c7b1e340ca6c,Namespace:kube-system,Attempt:0,}" Aug 19 08:17:30.783915 containerd[1722]: time="2025-08-19T08:17:30.783880320Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-598d747447-gjlqd,Uid:9c9ca255-d68b-47ca-8107-0f46f9e9d40c,Namespace:calico-apiserver,Attempt:0,}" Aug 19 08:17:30.793762 containerd[1722]: time="2025-08-19T08:17:30.793734597Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86896c85c-k2466,Uid:afc774c5-998d-4b55-896a-2b3ba14bd2bf,Namespace:calico-system,Attempt:0,}" Aug 19 08:17:30.804583 containerd[1722]: time="2025-08-19T08:17:30.804559390Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-655b7bdffd-hjd48,Uid:5acb51bf-65bc-4f86-93a4-1cc021ca500f,Namespace:calico-apiserver,Attempt:0,}" Aug 19 08:17:30.811336 containerd[1722]: time="2025-08-19T08:17:30.811247685Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5d584f45c9-8d4g5,Uid:1fb141e5-61fb-48ee-bab2-be40ed742a47,Namespace:calico-system,Attempt:0,}" Aug 19 08:17:30.815943 containerd[1722]: time="2025-08-19T08:17:30.815920002Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-655b7bdffd-n6vc8,Uid:977f07be-ddc3-40de-8dad-97e5d4d17950,Namespace:calico-apiserver,Attempt:0,}" Aug 19 08:17:30.821839 systemd[1]: 
Created slice kubepods-besteffort-pod0d6984f4_4de2_408b_9ee3_92edac203b52.slice - libcontainer container kubepods-besteffort-pod0d6984f4_4de2_408b_9ee3_92edac203b52.slice. Aug 19 08:17:30.823441 containerd[1722]: time="2025-08-19T08:17:30.823419848Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-rw2mz,Uid:0d6984f4-4de2-408b-9ee3-92edac203b52,Namespace:calico-system,Attempt:0,}" Aug 19 08:17:31.538890 containerd[1722]: time="2025-08-19T08:17:31.538845710Z" level=error msg="Failed to destroy network for sandbox \"7d52f5f1c85ee62435237b7cc77b79eadbfdbde1f4fcf60901eddb7348cc0773\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:17:31.596057 containerd[1722]: time="2025-08-19T08:17:31.595971524Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-655b7bdffd-hjd48,Uid:5acb51bf-65bc-4f86-93a4-1cc021ca500f,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7d52f5f1c85ee62435237b7cc77b79eadbfdbde1f4fcf60901eddb7348cc0773\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:17:31.596281 kubelet[3115]: E0819 08:17:31.596237 3115 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7d52f5f1c85ee62435237b7cc77b79eadbfdbde1f4fcf60901eddb7348cc0773\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:17:31.597243 kubelet[3115]: E0819 08:17:31.596314 3115 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = 
failed to setup network for sandbox \"7d52f5f1c85ee62435237b7cc77b79eadbfdbde1f4fcf60901eddb7348cc0773\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-655b7bdffd-hjd48" Aug 19 08:17:31.597243 kubelet[3115]: E0819 08:17:31.596337 3115 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7d52f5f1c85ee62435237b7cc77b79eadbfdbde1f4fcf60901eddb7348cc0773\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-655b7bdffd-hjd48" Aug 19 08:17:31.597243 kubelet[3115]: E0819 08:17:31.596380 3115 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-655b7bdffd-hjd48_calico-apiserver(5acb51bf-65bc-4f86-93a4-1cc021ca500f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-655b7bdffd-hjd48_calico-apiserver(5acb51bf-65bc-4f86-93a4-1cc021ca500f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7d52f5f1c85ee62435237b7cc77b79eadbfdbde1f4fcf60901eddb7348cc0773\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-655b7bdffd-hjd48" podUID="5acb51bf-65bc-4f86-93a4-1cc021ca500f" Aug 19 08:17:31.627334 containerd[1722]: time="2025-08-19T08:17:31.627244138Z" level=error msg="Failed to destroy network for sandbox \"4b0cedc820a4c657dd348eddc7d6544d92b1765daf28d89d00a6ed984cc9202e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check 
that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:17:31.633589 containerd[1722]: time="2025-08-19T08:17:31.633481036Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-7mkbh,Uid:0b186b0f-8823-43c8-9fad-d97a1d8bc250,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4b0cedc820a4c657dd348eddc7d6544d92b1765daf28d89d00a6ed984cc9202e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:17:31.633702 kubelet[3115]: E0819 08:17:31.633680 3115 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4b0cedc820a4c657dd348eddc7d6544d92b1765daf28d89d00a6ed984cc9202e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:17:31.633746 kubelet[3115]: E0819 08:17:31.633729 3115 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4b0cedc820a4c657dd348eddc7d6544d92b1765daf28d89d00a6ed984cc9202e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-7mkbh" Aug 19 08:17:31.633771 kubelet[3115]: E0819 08:17:31.633749 3115 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4b0cedc820a4c657dd348eddc7d6544d92b1765daf28d89d00a6ed984cc9202e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running 
and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-7mkbh" Aug 19 08:17:31.633885 kubelet[3115]: E0819 08:17:31.633788 3115 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-7mkbh_kube-system(0b186b0f-8823-43c8-9fad-d97a1d8bc250)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-7mkbh_kube-system(0b186b0f-8823-43c8-9fad-d97a1d8bc250)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4b0cedc820a4c657dd348eddc7d6544d92b1765daf28d89d00a6ed984cc9202e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-7mkbh" podUID="0b186b0f-8823-43c8-9fad-d97a1d8bc250" Aug 19 08:17:31.637128 kubelet[3115]: E0819 08:17:31.637101 3115 configmap.go:193] Couldn't get configMap calico-system/goldmane-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Aug 19 08:17:31.637208 kubelet[3115]: E0819 08:17:31.637190 3115 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2bb59b89-ca2b-49e1-bc22-00af9770adb2-goldmane-ca-bundle podName:2bb59b89-ca2b-49e1-bc22-00af9770adb2 nodeName:}" failed. No retries permitted until 2025-08-19 08:17:32.137167502 +0000 UTC m=+31.398451026 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "goldmane-ca-bundle" (UniqueName: "kubernetes.io/configmap/2bb59b89-ca2b-49e1-bc22-00af9770adb2-goldmane-ca-bundle") pod "goldmane-58fd7646b9-h982d" (UID: "2bb59b89-ca2b-49e1-bc22-00af9770adb2") : failed to sync configmap cache: timed out waiting for the condition Aug 19 08:17:31.689210 containerd[1722]: time="2025-08-19T08:17:31.689091186Z" level=error msg="Failed to destroy network for sandbox \"69e7401da35cb1df2944aa2a0f9e71f9ee7718f2427ef81fd58a38c764ccbcfb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:17:31.693115 containerd[1722]: time="2025-08-19T08:17:31.693022606Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-wkmf2,Uid:6637040d-4229-4a0c-9583-c7b1e340ca6c,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"69e7401da35cb1df2944aa2a0f9e71f9ee7718f2427ef81fd58a38c764ccbcfb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:17:31.693433 kubelet[3115]: E0819 08:17:31.693405 3115 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"69e7401da35cb1df2944aa2a0f9e71f9ee7718f2427ef81fd58a38c764ccbcfb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:17:31.693585 kubelet[3115]: E0819 08:17:31.693456 3115 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"69e7401da35cb1df2944aa2a0f9e71f9ee7718f2427ef81fd58a38c764ccbcfb\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-wkmf2" Aug 19 08:17:31.693585 kubelet[3115]: E0819 08:17:31.693477 3115 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"69e7401da35cb1df2944aa2a0f9e71f9ee7718f2427ef81fd58a38c764ccbcfb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-wkmf2" Aug 19 08:17:31.693585 kubelet[3115]: E0819 08:17:31.693520 3115 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-wkmf2_kube-system(6637040d-4229-4a0c-9583-c7b1e340ca6c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-wkmf2_kube-system(6637040d-4229-4a0c-9583-c7b1e340ca6c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"69e7401da35cb1df2944aa2a0f9e71f9ee7718f2427ef81fd58a38c764ccbcfb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-wkmf2" podUID="6637040d-4229-4a0c-9583-c7b1e340ca6c" Aug 19 08:17:31.714681 containerd[1722]: time="2025-08-19T08:17:31.714579507Z" level=error msg="Failed to destroy network for sandbox \"b1ab7b2ab625e4c39e2dd3fc3c8c51d1d814f88dcacc100e37bb6a5c40f78bf9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:17:31.716821 containerd[1722]: time="2025-08-19T08:17:31.716789139Z" level=error msg="Failed to 
destroy network for sandbox \"8c3eb90196b40a838920819c406a08133ef5a2bad919fe5f96a24a6ed77d5e42\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:17:31.720535 containerd[1722]: time="2025-08-19T08:17:31.720461533Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86896c85c-k2466,Uid:afc774c5-998d-4b55-896a-2b3ba14bd2bf,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b1ab7b2ab625e4c39e2dd3fc3c8c51d1d814f88dcacc100e37bb6a5c40f78bf9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:17:31.720773 kubelet[3115]: E0819 08:17:31.720748 3115 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b1ab7b2ab625e4c39e2dd3fc3c8c51d1d814f88dcacc100e37bb6a5c40f78bf9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:17:31.720827 kubelet[3115]: E0819 08:17:31.720794 3115 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b1ab7b2ab625e4c39e2dd3fc3c8c51d1d814f88dcacc100e37bb6a5c40f78bf9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-86896c85c-k2466" Aug 19 08:17:31.720827 kubelet[3115]: E0819 08:17:31.720817 3115 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = 
failed to setup network for sandbox \"b1ab7b2ab625e4c39e2dd3fc3c8c51d1d814f88dcacc100e37bb6a5c40f78bf9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-86896c85c-k2466" Aug 19 08:17:31.720877 kubelet[3115]: E0819 08:17:31.720849 3115 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-86896c85c-k2466_calico-system(afc774c5-998d-4b55-896a-2b3ba14bd2bf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-86896c85c-k2466_calico-system(afc774c5-998d-4b55-896a-2b3ba14bd2bf)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b1ab7b2ab625e4c39e2dd3fc3c8c51d1d814f88dcacc100e37bb6a5c40f78bf9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86896c85c-k2466" podUID="afc774c5-998d-4b55-896a-2b3ba14bd2bf" Aug 19 08:17:31.724459 containerd[1722]: time="2025-08-19T08:17:31.724023062Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-655b7bdffd-n6vc8,Uid:977f07be-ddc3-40de-8dad-97e5d4d17950,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8c3eb90196b40a838920819c406a08133ef5a2bad919fe5f96a24a6ed77d5e42\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:17:31.724853 kubelet[3115]: E0819 08:17:31.724557 3115 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"8c3eb90196b40a838920819c406a08133ef5a2bad919fe5f96a24a6ed77d5e42\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:17:31.724853 kubelet[3115]: E0819 08:17:31.724598 3115 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8c3eb90196b40a838920819c406a08133ef5a2bad919fe5f96a24a6ed77d5e42\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-655b7bdffd-n6vc8" Aug 19 08:17:31.724853 kubelet[3115]: E0819 08:17:31.724624 3115 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8c3eb90196b40a838920819c406a08133ef5a2bad919fe5f96a24a6ed77d5e42\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-655b7bdffd-n6vc8" Aug 19 08:17:31.724968 kubelet[3115]: E0819 08:17:31.724709 3115 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-655b7bdffd-n6vc8_calico-apiserver(977f07be-ddc3-40de-8dad-97e5d4d17950)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-655b7bdffd-n6vc8_calico-apiserver(977f07be-ddc3-40de-8dad-97e5d4d17950)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8c3eb90196b40a838920819c406a08133ef5a2bad919fe5f96a24a6ed77d5e42\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-apiserver/calico-apiserver-655b7bdffd-n6vc8" podUID="977f07be-ddc3-40de-8dad-97e5d4d17950" Aug 19 08:17:31.731973 containerd[1722]: time="2025-08-19T08:17:31.731598120Z" level=error msg="Failed to destroy network for sandbox \"ba45b623955aff0fab6d7720b372bbf11d894a5a432774ba3e5d966720f2e639\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:17:31.731973 containerd[1722]: time="2025-08-19T08:17:31.731819353Z" level=error msg="Failed to destroy network for sandbox \"81452a5047bcb9f8723b9de70567c17d88cb3467493fd7769a4c71d8cdd6d96e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:17:31.733824 containerd[1722]: time="2025-08-19T08:17:31.733799391Z" level=error msg="Failed to destroy network for sandbox \"cbbbf35e45be7e1e8ae558a21a0e2f798c745b48fcceada1714642cdf20e3280\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:17:31.734563 containerd[1722]: time="2025-08-19T08:17:31.734540874Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-598d747447-gjlqd,Uid:9c9ca255-d68b-47ca-8107-0f46f9e9d40c,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ba45b623955aff0fab6d7720b372bbf11d894a5a432774ba3e5d966720f2e639\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:17:31.734741 kubelet[3115]: E0819 08:17:31.734720 3115 log.go:32] "RunPodSandbox from runtime service failed" err="rpc 
error: code = Unknown desc = failed to setup network for sandbox \"ba45b623955aff0fab6d7720b372bbf11d894a5a432774ba3e5d966720f2e639\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:17:31.734783 kubelet[3115]: E0819 08:17:31.734761 3115 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ba45b623955aff0fab6d7720b372bbf11d894a5a432774ba3e5d966720f2e639\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-598d747447-gjlqd" Aug 19 08:17:31.734807 kubelet[3115]: E0819 08:17:31.734778 3115 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ba45b623955aff0fab6d7720b372bbf11d894a5a432774ba3e5d966720f2e639\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-598d747447-gjlqd" Aug 19 08:17:31.734827 kubelet[3115]: E0819 08:17:31.734811 3115 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-598d747447-gjlqd_calico-apiserver(9c9ca255-d68b-47ca-8107-0f46f9e9d40c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-598d747447-gjlqd_calico-apiserver(9c9ca255-d68b-47ca-8107-0f46f9e9d40c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ba45b623955aff0fab6d7720b372bbf11d894a5a432774ba3e5d966720f2e639\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-598d747447-gjlqd" podUID="9c9ca255-d68b-47ca-8107-0f46f9e9d40c" Aug 19 08:17:31.738699 containerd[1722]: time="2025-08-19T08:17:31.738631889Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-rw2mz,Uid:0d6984f4-4de2-408b-9ee3-92edac203b52,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"81452a5047bcb9f8723b9de70567c17d88cb3467493fd7769a4c71d8cdd6d96e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:17:31.738792 kubelet[3115]: E0819 08:17:31.738773 3115 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"81452a5047bcb9f8723b9de70567c17d88cb3467493fd7769a4c71d8cdd6d96e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:17:31.738837 kubelet[3115]: E0819 08:17:31.738822 3115 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"81452a5047bcb9f8723b9de70567c17d88cb3467493fd7769a4c71d8cdd6d96e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-rw2mz" Aug 19 08:17:31.738890 kubelet[3115]: E0819 08:17:31.738841 3115 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"81452a5047bcb9f8723b9de70567c17d88cb3467493fd7769a4c71d8cdd6d96e\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-rw2mz" Aug 19 08:17:31.738922 kubelet[3115]: E0819 08:17:31.738879 3115 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-rw2mz_calico-system(0d6984f4-4de2-408b-9ee3-92edac203b52)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-rw2mz_calico-system(0d6984f4-4de2-408b-9ee3-92edac203b52)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"81452a5047bcb9f8723b9de70567c17d88cb3467493fd7769a4c71d8cdd6d96e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-rw2mz" podUID="0d6984f4-4de2-408b-9ee3-92edac203b52" Aug 19 08:17:31.744141 containerd[1722]: time="2025-08-19T08:17:31.744110497Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5d584f45c9-8d4g5,Uid:1fb141e5-61fb-48ee-bab2-be40ed742a47,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"cbbbf35e45be7e1e8ae558a21a0e2f798c745b48fcceada1714642cdf20e3280\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:17:31.744276 kubelet[3115]: E0819 08:17:31.744251 3115 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cbbbf35e45be7e1e8ae558a21a0e2f798c745b48fcceada1714642cdf20e3280\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 
08:17:31.744320 kubelet[3115]: E0819 08:17:31.744293 3115 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cbbbf35e45be7e1e8ae558a21a0e2f798c745b48fcceada1714642cdf20e3280\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5d584f45c9-8d4g5" Aug 19 08:17:31.744320 kubelet[3115]: E0819 08:17:31.744311 3115 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cbbbf35e45be7e1e8ae558a21a0e2f798c745b48fcceada1714642cdf20e3280\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5d584f45c9-8d4g5" Aug 19 08:17:31.744381 kubelet[3115]: E0819 08:17:31.744346 3115 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-5d584f45c9-8d4g5_calico-system(1fb141e5-61fb-48ee-bab2-be40ed742a47)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-5d584f45c9-8d4g5_calico-system(1fb141e5-61fb-48ee-bab2-be40ed742a47)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cbbbf35e45be7e1e8ae558a21a0e2f798c745b48fcceada1714642cdf20e3280\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-5d584f45c9-8d4g5" podUID="1fb141e5-61fb-48ee-bab2-be40ed742a47" Aug 19 08:17:31.915397 containerd[1722]: time="2025-08-19T08:17:31.914328440Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\"" Aug 19 08:17:32.299757 containerd[1722]: 
time="2025-08-19T08:17:32.299705043Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-h982d,Uid:2bb59b89-ca2b-49e1-bc22-00af9770adb2,Namespace:calico-system,Attempt:0,}" Aug 19 08:17:32.385885 containerd[1722]: time="2025-08-19T08:17:32.385836319Z" level=error msg="Failed to destroy network for sandbox \"5faceb0f32fe15ae9f1729bfe1743ca95a18f544d9fd3ad149ba97bab7ee8523\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:17:32.408590 containerd[1722]: time="2025-08-19T08:17:32.408540478Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-h982d,Uid:2bb59b89-ca2b-49e1-bc22-00af9770adb2,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5faceb0f32fe15ae9f1729bfe1743ca95a18f544d9fd3ad149ba97bab7ee8523\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:17:32.408833 kubelet[3115]: E0819 08:17:32.408797 3115 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5faceb0f32fe15ae9f1729bfe1743ca95a18f544d9fd3ad149ba97bab7ee8523\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:17:32.408893 kubelet[3115]: E0819 08:17:32.408858 3115 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5faceb0f32fe15ae9f1729bfe1743ca95a18f544d9fd3ad149ba97bab7ee8523\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-58fd7646b9-h982d" Aug 19 08:17:32.408893 kubelet[3115]: E0819 08:17:32.408881 3115 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5faceb0f32fe15ae9f1729bfe1743ca95a18f544d9fd3ad149ba97bab7ee8523\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-58fd7646b9-h982d" Aug 19 08:17:32.408956 kubelet[3115]: E0819 08:17:32.408928 3115 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-58fd7646b9-h982d_calico-system(2bb59b89-ca2b-49e1-bc22-00af9770adb2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-58fd7646b9-h982d_calico-system(2bb59b89-ca2b-49e1-bc22-00af9770adb2)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5faceb0f32fe15ae9f1729bfe1743ca95a18f544d9fd3ad149ba97bab7ee8523\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-58fd7646b9-h982d" podUID="2bb59b89-ca2b-49e1-bc22-00af9770adb2" Aug 19 08:17:32.431775 systemd[1]: run-netns-cni\x2df2fa55b9\x2d0ab8\x2dc611\x2d8297\x2dd6a07e9acfb8.mount: Deactivated successfully. Aug 19 08:17:32.431862 systemd[1]: run-netns-cni\x2d494df2df\x2d228f\x2dcbfe\x2db755\x2dcbebfd560c86.mount: Deactivated successfully. Aug 19 08:17:32.431917 systemd[1]: run-netns-cni\x2dd9f092be\x2dd190\x2da059\x2d7959\x2d92c80186967e.mount: Deactivated successfully. Aug 19 08:17:32.431963 systemd[1]: run-netns-cni\x2dd9580788\x2d704a\x2d4913\x2d6d44\x2de2d07c457edb.mount: Deactivated successfully. 
Aug 19 08:17:32.432008 systemd[1]: run-netns-cni\x2d0c0676a8\x2d4111\x2daee5\x2dd38d\x2d33145f9af8d5.mount: Deactivated successfully. Aug 19 08:17:32.432064 systemd[1]: run-netns-cni\x2da87094e5\x2d2a92\x2d7508\x2d3a90\x2d797c84457d19.mount: Deactivated successfully. Aug 19 08:17:36.889864 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount128784351.mount: Deactivated successfully. Aug 19 08:17:36.947114 containerd[1722]: time="2025-08-19T08:17:36.947074272Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:36.949689 containerd[1722]: time="2025-08-19T08:17:36.949652313Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.2: active requests=0, bytes read=158500163" Aug 19 08:17:36.953831 containerd[1722]: time="2025-08-19T08:17:36.953772842Z" level=info msg="ImageCreate event name:\"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:36.958001 containerd[1722]: time="2025-08-19T08:17:36.957959625Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:36.958333 containerd[1722]: time="2025-08-19T08:17:36.958276783Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.2\" with image id \"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\", size \"158500025\" in 5.043911074s" Aug 19 08:17:36.958333 containerd[1722]: time="2025-08-19T08:17:36.958304956Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\" returns image reference 
\"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\"" Aug 19 08:17:36.965921 containerd[1722]: time="2025-08-19T08:17:36.965848792Z" level=info msg="CreateContainer within sandbox \"4564ecc3ba0262c7733e22741c54ac0740479f5e66f53c77add88d66c61ceaeb\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Aug 19 08:17:37.100208 containerd[1722]: time="2025-08-19T08:17:37.100175326Z" level=info msg="Container 2a3b4c3fb59c2bf61f5bd254bbdef7c5306b759c292237119aa5a823b515ed90: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:17:37.176479 containerd[1722]: time="2025-08-19T08:17:37.176339627Z" level=info msg="CreateContainer within sandbox \"4564ecc3ba0262c7733e22741c54ac0740479f5e66f53c77add88d66c61ceaeb\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"2a3b4c3fb59c2bf61f5bd254bbdef7c5306b759c292237119aa5a823b515ed90\"" Aug 19 08:17:37.177000 containerd[1722]: time="2025-08-19T08:17:37.176950165Z" level=info msg="StartContainer for \"2a3b4c3fb59c2bf61f5bd254bbdef7c5306b759c292237119aa5a823b515ed90\"" Aug 19 08:17:37.178222 containerd[1722]: time="2025-08-19T08:17:37.178190873Z" level=info msg="connecting to shim 2a3b4c3fb59c2bf61f5bd254bbdef7c5306b759c292237119aa5a823b515ed90" address="unix:///run/containerd/s/cfd0587e3d2bd234b94da1980c1b17db3d83728164009b414766d8363fc44a26" protocol=ttrpc version=3 Aug 19 08:17:37.196221 systemd[1]: Started cri-containerd-2a3b4c3fb59c2bf61f5bd254bbdef7c5306b759c292237119aa5a823b515ed90.scope - libcontainer container 2a3b4c3fb59c2bf61f5bd254bbdef7c5306b759c292237119aa5a823b515ed90. Aug 19 08:17:37.298696 containerd[1722]: time="2025-08-19T08:17:37.298662273Z" level=info msg="StartContainer for \"2a3b4c3fb59c2bf61f5bd254bbdef7c5306b759c292237119aa5a823b515ed90\" returns successfully" Aug 19 08:17:37.685515 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Aug 19 08:17:37.685635 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . 
All Rights Reserved. Aug 19 08:17:37.873181 kubelet[3115]: I0819 08:17:37.873147 3115 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1fb141e5-61fb-48ee-bab2-be40ed742a47-whisker-ca-bundle\") pod \"1fb141e5-61fb-48ee-bab2-be40ed742a47\" (UID: \"1fb141e5-61fb-48ee-bab2-be40ed742a47\") " Aug 19 08:17:37.873498 kubelet[3115]: I0819 08:17:37.873192 3115 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/1fb141e5-61fb-48ee-bab2-be40ed742a47-whisker-backend-key-pair\") pod \"1fb141e5-61fb-48ee-bab2-be40ed742a47\" (UID: \"1fb141e5-61fb-48ee-bab2-be40ed742a47\") " Aug 19 08:17:37.873498 kubelet[3115]: I0819 08:17:37.873211 3115 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85m2j\" (UniqueName: \"kubernetes.io/projected/1fb141e5-61fb-48ee-bab2-be40ed742a47-kube-api-access-85m2j\") pod \"1fb141e5-61fb-48ee-bab2-be40ed742a47\" (UID: \"1fb141e5-61fb-48ee-bab2-be40ed742a47\") " Aug 19 08:17:37.876164 kubelet[3115]: I0819 08:17:37.873707 3115 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fb141e5-61fb-48ee-bab2-be40ed742a47-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "1fb141e5-61fb-48ee-bab2-be40ed742a47" (UID: "1fb141e5-61fb-48ee-bab2-be40ed742a47"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Aug 19 08:17:37.877501 kubelet[3115]: I0819 08:17:37.877474 3115 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fb141e5-61fb-48ee-bab2-be40ed742a47-kube-api-access-85m2j" (OuterVolumeSpecName: "kube-api-access-85m2j") pod "1fb141e5-61fb-48ee-bab2-be40ed742a47" (UID: "1fb141e5-61fb-48ee-bab2-be40ed742a47"). InnerVolumeSpecName "kube-api-access-85m2j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Aug 19 08:17:37.879291 kubelet[3115]: I0819 08:17:37.879264 3115 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fb141e5-61fb-48ee-bab2-be40ed742a47-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "1fb141e5-61fb-48ee-bab2-be40ed742a47" (UID: "1fb141e5-61fb-48ee-bab2-be40ed742a47"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGidValue "" Aug 19 08:17:37.890763 systemd[1]: var-lib-kubelet-pods-1fb141e5\x2d61fb\x2d48ee\x2dbab2\x2dbe40ed742a47-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d85m2j.mount: Deactivated successfully. Aug 19 08:17:37.890859 systemd[1]: var-lib-kubelet-pods-1fb141e5\x2d61fb\x2d48ee\x2dbab2\x2dbe40ed742a47-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Aug 19 08:17:37.940168 systemd[1]: Removed slice kubepods-besteffort-pod1fb141e5_61fb_48ee_bab2_be40ed742a47.slice - libcontainer container kubepods-besteffort-pod1fb141e5_61fb_48ee_bab2_be40ed742a47.slice. 
Aug 19 08:17:37.958286 kubelet[3115]: I0819 08:17:37.957551 3115 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-n2h2x" podStartSLOduration=1.664973646 podStartE2EDuration="19.957532397s" podCreationTimestamp="2025-08-19 08:17:18 +0000 UTC" firstStartedPulling="2025-08-19 08:17:18.666382154 +0000 UTC m=+17.927665684" lastFinishedPulling="2025-08-19 08:17:36.95894091 +0000 UTC m=+36.220224435" observedRunningTime="2025-08-19 08:17:37.95609752 +0000 UTC m=+37.217381048" watchObservedRunningTime="2025-08-19 08:17:37.957532397 +0000 UTC m=+37.218815923" Aug 19 08:17:37.976095 kubelet[3115]: I0819 08:17:37.976074 3115 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1fb141e5-61fb-48ee-bab2-be40ed742a47-whisker-ca-bundle\") on node \"ci-4426.0.0-a-5588c1b4cf\" DevicePath \"\"" Aug 19 08:17:37.976199 kubelet[3115]: I0819 08:17:37.976189 3115 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/1fb141e5-61fb-48ee-bab2-be40ed742a47-whisker-backend-key-pair\") on node \"ci-4426.0.0-a-5588c1b4cf\" DevicePath \"\"" Aug 19 08:17:37.976479 kubelet[3115]: I0819 08:17:37.976238 3115 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85m2j\" (UniqueName: \"kubernetes.io/projected/1fb141e5-61fb-48ee-bab2-be40ed742a47-kube-api-access-85m2j\") on node \"ci-4426.0.0-a-5588c1b4cf\" DevicePath \"\"" Aug 19 08:17:38.012523 containerd[1722]: time="2025-08-19T08:17:38.012488708Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a3b4c3fb59c2bf61f5bd254bbdef7c5306b759c292237119aa5a823b515ed90\" id:\"e950fefbff68b9343cdc96b0f9b7b425e5540fd528c009770751c3bd9fc08bf9\" pid:4262 exit_status:1 exited_at:{seconds:1755591458 nanos:12078068}" Aug 19 08:17:38.033618 systemd[1]: Created slice kubepods-besteffort-pod171f0308_b0fd_48b5_9f50_3c849f2cbe23.slice - libcontainer container 
kubepods-besteffort-pod171f0308_b0fd_48b5_9f50_3c849f2cbe23.slice. Aug 19 08:17:38.076906 kubelet[3115]: I0819 08:17:38.076877 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k77tp\" (UniqueName: \"kubernetes.io/projected/171f0308-b0fd-48b5-9f50-3c849f2cbe23-kube-api-access-k77tp\") pod \"whisker-6fc69bf457-pzdkm\" (UID: \"171f0308-b0fd-48b5-9f50-3c849f2cbe23\") " pod="calico-system/whisker-6fc69bf457-pzdkm" Aug 19 08:17:38.077004 kubelet[3115]: I0819 08:17:38.076922 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/171f0308-b0fd-48b5-9f50-3c849f2cbe23-whisker-backend-key-pair\") pod \"whisker-6fc69bf457-pzdkm\" (UID: \"171f0308-b0fd-48b5-9f50-3c849f2cbe23\") " pod="calico-system/whisker-6fc69bf457-pzdkm" Aug 19 08:17:38.077004 kubelet[3115]: I0819 08:17:38.076947 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/171f0308-b0fd-48b5-9f50-3c849f2cbe23-whisker-ca-bundle\") pod \"whisker-6fc69bf457-pzdkm\" (UID: \"171f0308-b0fd-48b5-9f50-3c849f2cbe23\") " pod="calico-system/whisker-6fc69bf457-pzdkm" Aug 19 08:17:38.337755 containerd[1722]: time="2025-08-19T08:17:38.337627671Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6fc69bf457-pzdkm,Uid:171f0308-b0fd-48b5-9f50-3c849f2cbe23,Namespace:calico-system,Attempt:0,}" Aug 19 08:17:38.537667 systemd-networkd[1370]: calia5b458c1219: Link UP Aug 19 08:17:38.538400 systemd-networkd[1370]: calia5b458c1219: Gained carrier Aug 19 08:17:38.554864 containerd[1722]: 2025-08-19 08:17:38.429 [INFO][4289] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Aug 19 08:17:38.554864 containerd[1722]: 2025-08-19 08:17:38.436 [INFO][4289] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: 
&{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.0.0--a--5588c1b4cf-k8s-whisker--6fc69bf457--pzdkm-eth0 whisker-6fc69bf457- calico-system 171f0308-b0fd-48b5-9f50-3c849f2cbe23 913 0 2025-08-19 08:17:38 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:6fc69bf457 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4426.0.0-a-5588c1b4cf whisker-6fc69bf457-pzdkm eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calia5b458c1219 [] [] }} ContainerID="6e070ac6bac63459aef9d4ccec2fbffa0ea356b1b16157268eb3ca16703a0db3" Namespace="calico-system" Pod="whisker-6fc69bf457-pzdkm" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-whisker--6fc69bf457--pzdkm-" Aug 19 08:17:38.554864 containerd[1722]: 2025-08-19 08:17:38.437 [INFO][4289] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6e070ac6bac63459aef9d4ccec2fbffa0ea356b1b16157268eb3ca16703a0db3" Namespace="calico-system" Pod="whisker-6fc69bf457-pzdkm" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-whisker--6fc69bf457--pzdkm-eth0" Aug 19 08:17:38.554864 containerd[1722]: 2025-08-19 08:17:38.456 [INFO][4301] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6e070ac6bac63459aef9d4ccec2fbffa0ea356b1b16157268eb3ca16703a0db3" HandleID="k8s-pod-network.6e070ac6bac63459aef9d4ccec2fbffa0ea356b1b16157268eb3ca16703a0db3" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-whisker--6fc69bf457--pzdkm-eth0" Aug 19 08:17:38.555137 containerd[1722]: 2025-08-19 08:17:38.457 [INFO][4301] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="6e070ac6bac63459aef9d4ccec2fbffa0ea356b1b16157268eb3ca16703a0db3" HandleID="k8s-pod-network.6e070ac6bac63459aef9d4ccec2fbffa0ea356b1b16157268eb3ca16703a0db3" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-whisker--6fc69bf457--pzdkm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f1a0), 
Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4426.0.0-a-5588c1b4cf", "pod":"whisker-6fc69bf457-pzdkm", "timestamp":"2025-08-19 08:17:38.456951521 +0000 UTC"}, Hostname:"ci-4426.0.0-a-5588c1b4cf", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 08:17:38.555137 containerd[1722]: 2025-08-19 08:17:38.457 [INFO][4301] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 08:17:38.555137 containerd[1722]: 2025-08-19 08:17:38.457 [INFO][4301] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 08:17:38.555137 containerd[1722]: 2025-08-19 08:17:38.457 [INFO][4301] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.0.0-a-5588c1b4cf' Aug 19 08:17:38.555137 containerd[1722]: 2025-08-19 08:17:38.462 [INFO][4301] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6e070ac6bac63459aef9d4ccec2fbffa0ea356b1b16157268eb3ca16703a0db3" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:38.555137 containerd[1722]: 2025-08-19 08:17:38.465 [INFO][4301] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:38.555137 containerd[1722]: 2025-08-19 08:17:38.468 [INFO][4301] ipam/ipam.go 511: Trying affinity for 192.168.10.192/26 host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:38.555137 containerd[1722]: 2025-08-19 08:17:38.469 [INFO][4301] ipam/ipam.go 158: Attempting to load block cidr=192.168.10.192/26 host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:38.555137 containerd[1722]: 2025-08-19 08:17:38.471 [INFO][4301] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.10.192/26 host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:38.555438 containerd[1722]: 2025-08-19 08:17:38.471 [INFO][4301] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.10.192/26 
handle="k8s-pod-network.6e070ac6bac63459aef9d4ccec2fbffa0ea356b1b16157268eb3ca16703a0db3" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:38.555438 containerd[1722]: 2025-08-19 08:17:38.472 [INFO][4301] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.6e070ac6bac63459aef9d4ccec2fbffa0ea356b1b16157268eb3ca16703a0db3 Aug 19 08:17:38.555438 containerd[1722]: 2025-08-19 08:17:38.478 [INFO][4301] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.10.192/26 handle="k8s-pod-network.6e070ac6bac63459aef9d4ccec2fbffa0ea356b1b16157268eb3ca16703a0db3" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:38.555438 containerd[1722]: 2025-08-19 08:17:38.482 [INFO][4301] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.10.193/26] block=192.168.10.192/26 handle="k8s-pod-network.6e070ac6bac63459aef9d4ccec2fbffa0ea356b1b16157268eb3ca16703a0db3" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:38.555438 containerd[1722]: 2025-08-19 08:17:38.482 [INFO][4301] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.10.193/26] handle="k8s-pod-network.6e070ac6bac63459aef9d4ccec2fbffa0ea356b1b16157268eb3ca16703a0db3" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:38.555438 containerd[1722]: 2025-08-19 08:17:38.482 [INFO][4301] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 19 08:17:38.555438 containerd[1722]: 2025-08-19 08:17:38.482 [INFO][4301] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.10.193/26] IPv6=[] ContainerID="6e070ac6bac63459aef9d4ccec2fbffa0ea356b1b16157268eb3ca16703a0db3" HandleID="k8s-pod-network.6e070ac6bac63459aef9d4ccec2fbffa0ea356b1b16157268eb3ca16703a0db3" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-whisker--6fc69bf457--pzdkm-eth0" Aug 19 08:17:38.555677 containerd[1722]: 2025-08-19 08:17:38.484 [INFO][4289] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6e070ac6bac63459aef9d4ccec2fbffa0ea356b1b16157268eb3ca16703a0db3" Namespace="calico-system" Pod="whisker-6fc69bf457-pzdkm" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-whisker--6fc69bf457--pzdkm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--a--5588c1b4cf-k8s-whisker--6fc69bf457--pzdkm-eth0", GenerateName:"whisker-6fc69bf457-", Namespace:"calico-system", SelfLink:"", UID:"171f0308-b0fd-48b5-9f50-3c849f2cbe23", ResourceVersion:"913", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 17, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6fc69bf457", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-a-5588c1b4cf", ContainerID:"", Pod:"whisker-6fc69bf457-pzdkm", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.10.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.whisker"}, InterfaceName:"calia5b458c1219", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:17:38.555677 containerd[1722]: 2025-08-19 08:17:38.485 [INFO][4289] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.10.193/32] ContainerID="6e070ac6bac63459aef9d4ccec2fbffa0ea356b1b16157268eb3ca16703a0db3" Namespace="calico-system" Pod="whisker-6fc69bf457-pzdkm" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-whisker--6fc69bf457--pzdkm-eth0" Aug 19 08:17:38.555839 containerd[1722]: 2025-08-19 08:17:38.485 [INFO][4289] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia5b458c1219 ContainerID="6e070ac6bac63459aef9d4ccec2fbffa0ea356b1b16157268eb3ca16703a0db3" Namespace="calico-system" Pod="whisker-6fc69bf457-pzdkm" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-whisker--6fc69bf457--pzdkm-eth0" Aug 19 08:17:38.555839 containerd[1722]: 2025-08-19 08:17:38.538 [INFO][4289] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6e070ac6bac63459aef9d4ccec2fbffa0ea356b1b16157268eb3ca16703a0db3" Namespace="calico-system" Pod="whisker-6fc69bf457-pzdkm" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-whisker--6fc69bf457--pzdkm-eth0" Aug 19 08:17:38.555932 containerd[1722]: 2025-08-19 08:17:38.539 [INFO][4289] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6e070ac6bac63459aef9d4ccec2fbffa0ea356b1b16157268eb3ca16703a0db3" Namespace="calico-system" Pod="whisker-6fc69bf457-pzdkm" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-whisker--6fc69bf457--pzdkm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--a--5588c1b4cf-k8s-whisker--6fc69bf457--pzdkm-eth0", GenerateName:"whisker-6fc69bf457-", Namespace:"calico-system", SelfLink:"", 
UID:"171f0308-b0fd-48b5-9f50-3c849f2cbe23", ResourceVersion:"913", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 17, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6fc69bf457", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-a-5588c1b4cf", ContainerID:"6e070ac6bac63459aef9d4ccec2fbffa0ea356b1b16157268eb3ca16703a0db3", Pod:"whisker-6fc69bf457-pzdkm", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.10.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calia5b458c1219", MAC:"56:87:f7:54:4a:b3", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:17:38.556015 containerd[1722]: 2025-08-19 08:17:38.553 [INFO][4289] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6e070ac6bac63459aef9d4ccec2fbffa0ea356b1b16157268eb3ca16703a0db3" Namespace="calico-system" Pod="whisker-6fc69bf457-pzdkm" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-whisker--6fc69bf457--pzdkm-eth0" Aug 19 08:17:38.702333 containerd[1722]: time="2025-08-19T08:17:38.702235222Z" level=info msg="connecting to shim 6e070ac6bac63459aef9d4ccec2fbffa0ea356b1b16157268eb3ca16703a0db3" address="unix:///run/containerd/s/f36a905b2093727cfb97afc1d380d4eec02c4bc4b51490d8548012e42122ba74" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:17:38.722168 systemd[1]: Started 
cri-containerd-6e070ac6bac63459aef9d4ccec2fbffa0ea356b1b16157268eb3ca16703a0db3.scope - libcontainer container 6e070ac6bac63459aef9d4ccec2fbffa0ea356b1b16157268eb3ca16703a0db3. Aug 19 08:17:38.765524 containerd[1722]: time="2025-08-19T08:17:38.765499204Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6fc69bf457-pzdkm,Uid:171f0308-b0fd-48b5-9f50-3c849f2cbe23,Namespace:calico-system,Attempt:0,} returns sandbox id \"6e070ac6bac63459aef9d4ccec2fbffa0ea356b1b16157268eb3ca16703a0db3\"" Aug 19 08:17:38.766670 containerd[1722]: time="2025-08-19T08:17:38.766500078Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\"" Aug 19 08:17:38.819348 kubelet[3115]: I0819 08:17:38.819317 3115 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fb141e5-61fb-48ee-bab2-be40ed742a47" path="/var/lib/kubelet/pods/1fb141e5-61fb-48ee-bab2-be40ed742a47/volumes" Aug 19 08:17:38.993471 containerd[1722]: time="2025-08-19T08:17:38.993354064Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a3b4c3fb59c2bf61f5bd254bbdef7c5306b759c292237119aa5a823b515ed90\" id:\"ff2681f91ecb61139c95680cb796a18982fe10ad95e34e1480865579c7ebcf33\" pid:4373 exit_status:1 exited_at:{seconds:1755591458 nanos:993124453}" Aug 19 08:17:39.758251 systemd-networkd[1370]: vxlan.calico: Link UP Aug 19 08:17:39.758260 systemd-networkd[1370]: vxlan.calico: Gained carrier Aug 19 08:17:39.782347 systemd-networkd[1370]: calia5b458c1219: Gained IPv6LL Aug 19 08:17:40.748898 containerd[1722]: time="2025-08-19T08:17:40.748859260Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:40.757576 containerd[1722]: time="2025-08-19T08:17:40.757538492Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.2: active requests=0, bytes read=4661207" Aug 19 08:17:40.762121 containerd[1722]: time="2025-08-19T08:17:40.762080257Z" level=info 
msg="ImageCreate event name:\"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:40.765875 containerd[1722]: time="2025-08-19T08:17:40.765835659Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:40.766338 containerd[1722]: time="2025-08-19T08:17:40.766223874Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.2\" with image id \"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\", size \"6153902\" in 1.999697382s" Aug 19 08:17:40.766338 containerd[1722]: time="2025-08-19T08:17:40.766254643Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\" returns image reference \"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\"" Aug 19 08:17:40.768472 containerd[1722]: time="2025-08-19T08:17:40.768175558Z" level=info msg="CreateContainer within sandbox \"6e070ac6bac63459aef9d4ccec2fbffa0ea356b1b16157268eb3ca16703a0db3\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Aug 19 08:17:40.809630 containerd[1722]: time="2025-08-19T08:17:40.808159895Z" level=info msg="Container 47e916151e4778498a5c2fecd19c83c05e8294ca721cfcf84e472c5f1ad9b541: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:17:40.831431 containerd[1722]: time="2025-08-19T08:17:40.831406078Z" level=info msg="CreateContainer within sandbox \"6e070ac6bac63459aef9d4ccec2fbffa0ea356b1b16157268eb3ca16703a0db3\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"47e916151e4778498a5c2fecd19c83c05e8294ca721cfcf84e472c5f1ad9b541\"" Aug 19 08:17:40.832824 containerd[1722]: 
time="2025-08-19T08:17:40.831846192Z" level=info msg="StartContainer for \"47e916151e4778498a5c2fecd19c83c05e8294ca721cfcf84e472c5f1ad9b541\"" Aug 19 08:17:40.832971 containerd[1722]: time="2025-08-19T08:17:40.832950825Z" level=info msg="connecting to shim 47e916151e4778498a5c2fecd19c83c05e8294ca721cfcf84e472c5f1ad9b541" address="unix:///run/containerd/s/f36a905b2093727cfb97afc1d380d4eec02c4bc4b51490d8548012e42122ba74" protocol=ttrpc version=3 Aug 19 08:17:40.852191 systemd[1]: Started cri-containerd-47e916151e4778498a5c2fecd19c83c05e8294ca721cfcf84e472c5f1ad9b541.scope - libcontainer container 47e916151e4778498a5c2fecd19c83c05e8294ca721cfcf84e472c5f1ad9b541. Aug 19 08:17:40.895992 containerd[1722]: time="2025-08-19T08:17:40.895966644Z" level=info msg="StartContainer for \"47e916151e4778498a5c2fecd19c83c05e8294ca721cfcf84e472c5f1ad9b541\" returns successfully" Aug 19 08:17:40.898089 containerd[1722]: time="2025-08-19T08:17:40.897812690Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\"" Aug 19 08:17:40.997154 systemd-networkd[1370]: vxlan.calico: Gained IPv6LL Aug 19 08:17:42.818315 containerd[1722]: time="2025-08-19T08:17:42.818253554Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-7mkbh,Uid:0b186b0f-8823-43c8-9fad-d97a1d8bc250,Namespace:kube-system,Attempt:0,}" Aug 19 08:17:42.910926 systemd-networkd[1370]: calif6f27f0e65b: Link UP Aug 19 08:17:42.911135 systemd-networkd[1370]: calif6f27f0e65b: Gained carrier Aug 19 08:17:42.931379 containerd[1722]: 2025-08-19 08:17:42.854 [INFO][4617] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.0.0--a--5588c1b4cf-k8s-coredns--7c65d6cfc9--7mkbh-eth0 coredns-7c65d6cfc9- kube-system 0b186b0f-8823-43c8-9fad-d97a1d8bc250 845 0 2025-08-19 08:17:05 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s 
projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4426.0.0-a-5588c1b4cf coredns-7c65d6cfc9-7mkbh eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calif6f27f0e65b [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="17065d654f2e648a7881747f7ca6b96b7e7ff6c7c902eed8a75617c78b40002f" Namespace="kube-system" Pod="coredns-7c65d6cfc9-7mkbh" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-coredns--7c65d6cfc9--7mkbh-" Aug 19 08:17:42.931379 containerd[1722]: 2025-08-19 08:17:42.854 [INFO][4617] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="17065d654f2e648a7881747f7ca6b96b7e7ff6c7c902eed8a75617c78b40002f" Namespace="kube-system" Pod="coredns-7c65d6cfc9-7mkbh" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-coredns--7c65d6cfc9--7mkbh-eth0" Aug 19 08:17:42.931379 containerd[1722]: 2025-08-19 08:17:42.876 [INFO][4629] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="17065d654f2e648a7881747f7ca6b96b7e7ff6c7c902eed8a75617c78b40002f" HandleID="k8s-pod-network.17065d654f2e648a7881747f7ca6b96b7e7ff6c7c902eed8a75617c78b40002f" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-coredns--7c65d6cfc9--7mkbh-eth0" Aug 19 08:17:42.931576 containerd[1722]: 2025-08-19 08:17:42.876 [INFO][4629] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="17065d654f2e648a7881747f7ca6b96b7e7ff6c7c902eed8a75617c78b40002f" HandleID="k8s-pod-network.17065d654f2e648a7881747f7ca6b96b7e7ff6c7c902eed8a75617c78b40002f" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-coredns--7c65d6cfc9--7mkbh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024eff0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4426.0.0-a-5588c1b4cf", "pod":"coredns-7c65d6cfc9-7mkbh", "timestamp":"2025-08-19 08:17:42.876644591 +0000 UTC"}, Hostname:"ci-4426.0.0-a-5588c1b4cf", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), 
HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 08:17:42.931576 containerd[1722]: 2025-08-19 08:17:42.876 [INFO][4629] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 08:17:42.931576 containerd[1722]: 2025-08-19 08:17:42.876 [INFO][4629] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 08:17:42.931576 containerd[1722]: 2025-08-19 08:17:42.876 [INFO][4629] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.0.0-a-5588c1b4cf' Aug 19 08:17:42.931576 containerd[1722]: 2025-08-19 08:17:42.881 [INFO][4629] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.17065d654f2e648a7881747f7ca6b96b7e7ff6c7c902eed8a75617c78b40002f" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:42.931576 containerd[1722]: 2025-08-19 08:17:42.884 [INFO][4629] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:42.931576 containerd[1722]: 2025-08-19 08:17:42.887 [INFO][4629] ipam/ipam.go 511: Trying affinity for 192.168.10.192/26 host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:42.931576 containerd[1722]: 2025-08-19 08:17:42.889 [INFO][4629] ipam/ipam.go 158: Attempting to load block cidr=192.168.10.192/26 host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:42.931576 containerd[1722]: 2025-08-19 08:17:42.890 [INFO][4629] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.10.192/26 host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:42.931777 containerd[1722]: 2025-08-19 08:17:42.890 [INFO][4629] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.10.192/26 handle="k8s-pod-network.17065d654f2e648a7881747f7ca6b96b7e7ff6c7c902eed8a75617c78b40002f" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:42.931777 containerd[1722]: 2025-08-19 08:17:42.891 [INFO][4629] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.17065d654f2e648a7881747f7ca6b96b7e7ff6c7c902eed8a75617c78b40002f Aug 19 08:17:42.931777 
containerd[1722]: 2025-08-19 08:17:42.895 [INFO][4629] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.10.192/26 handle="k8s-pod-network.17065d654f2e648a7881747f7ca6b96b7e7ff6c7c902eed8a75617c78b40002f" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:42.931777 containerd[1722]: 2025-08-19 08:17:42.905 [INFO][4629] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.10.194/26] block=192.168.10.192/26 handle="k8s-pod-network.17065d654f2e648a7881747f7ca6b96b7e7ff6c7c902eed8a75617c78b40002f" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:42.931777 containerd[1722]: 2025-08-19 08:17:42.905 [INFO][4629] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.10.194/26] handle="k8s-pod-network.17065d654f2e648a7881747f7ca6b96b7e7ff6c7c902eed8a75617c78b40002f" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:42.931777 containerd[1722]: 2025-08-19 08:17:42.905 [INFO][4629] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 19 08:17:42.931777 containerd[1722]: 2025-08-19 08:17:42.905 [INFO][4629] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.10.194/26] IPv6=[] ContainerID="17065d654f2e648a7881747f7ca6b96b7e7ff6c7c902eed8a75617c78b40002f" HandleID="k8s-pod-network.17065d654f2e648a7881747f7ca6b96b7e7ff6c7c902eed8a75617c78b40002f" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-coredns--7c65d6cfc9--7mkbh-eth0" Aug 19 08:17:42.931925 containerd[1722]: 2025-08-19 08:17:42.907 [INFO][4617] cni-plugin/k8s.go 418: Populated endpoint ContainerID="17065d654f2e648a7881747f7ca6b96b7e7ff6c7c902eed8a75617c78b40002f" Namespace="kube-system" Pod="coredns-7c65d6cfc9-7mkbh" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-coredns--7c65d6cfc9--7mkbh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--a--5588c1b4cf-k8s-coredns--7c65d6cfc9--7mkbh-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", 
UID:"0b186b0f-8823-43c8-9fad-d97a1d8bc250", ResourceVersion:"845", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 17, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-a-5588c1b4cf", ContainerID:"", Pod:"coredns-7c65d6cfc9-7mkbh", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.10.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif6f27f0e65b", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:17:42.931925 containerd[1722]: 2025-08-19 08:17:42.907 [INFO][4617] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.10.194/32] ContainerID="17065d654f2e648a7881747f7ca6b96b7e7ff6c7c902eed8a75617c78b40002f" Namespace="kube-system" Pod="coredns-7c65d6cfc9-7mkbh" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-coredns--7c65d6cfc9--7mkbh-eth0" Aug 19 08:17:42.931925 containerd[1722]: 2025-08-19 08:17:42.907 [INFO][4617] cni-plugin/dataplane_linux.go 69: Setting the 
host side veth name to calif6f27f0e65b ContainerID="17065d654f2e648a7881747f7ca6b96b7e7ff6c7c902eed8a75617c78b40002f" Namespace="kube-system" Pod="coredns-7c65d6cfc9-7mkbh" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-coredns--7c65d6cfc9--7mkbh-eth0" Aug 19 08:17:42.931925 containerd[1722]: 2025-08-19 08:17:42.912 [INFO][4617] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="17065d654f2e648a7881747f7ca6b96b7e7ff6c7c902eed8a75617c78b40002f" Namespace="kube-system" Pod="coredns-7c65d6cfc9-7mkbh" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-coredns--7c65d6cfc9--7mkbh-eth0" Aug 19 08:17:42.931925 containerd[1722]: 2025-08-19 08:17:42.912 [INFO][4617] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="17065d654f2e648a7881747f7ca6b96b7e7ff6c7c902eed8a75617c78b40002f" Namespace="kube-system" Pod="coredns-7c65d6cfc9-7mkbh" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-coredns--7c65d6cfc9--7mkbh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--a--5588c1b4cf-k8s-coredns--7c65d6cfc9--7mkbh-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"0b186b0f-8823-43c8-9fad-d97a1d8bc250", ResourceVersion:"845", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 17, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-a-5588c1b4cf", 
ContainerID:"17065d654f2e648a7881747f7ca6b96b7e7ff6c7c902eed8a75617c78b40002f", Pod:"coredns-7c65d6cfc9-7mkbh", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.10.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif6f27f0e65b", MAC:"66:da:16:6b:a6:85", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:17:42.931925 containerd[1722]: 2025-08-19 08:17:42.928 [INFO][4617] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="17065d654f2e648a7881747f7ca6b96b7e7ff6c7c902eed8a75617c78b40002f" Namespace="kube-system" Pod="coredns-7c65d6cfc9-7mkbh" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-coredns--7c65d6cfc9--7mkbh-eth0" Aug 19 08:17:42.985750 containerd[1722]: time="2025-08-19T08:17:42.985669035Z" level=info msg="connecting to shim 17065d654f2e648a7881747f7ca6b96b7e7ff6c7c902eed8a75617c78b40002f" address="unix:///run/containerd/s/8e25c8da8f2087bdc204b589e878f540e4364e9ab267ca3274e168a15a86a495" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:17:43.006235 systemd[1]: Started cri-containerd-17065d654f2e648a7881747f7ca6b96b7e7ff6c7c902eed8a75617c78b40002f.scope - libcontainer container 17065d654f2e648a7881747f7ca6b96b7e7ff6c7c902eed8a75617c78b40002f. 
Aug 19 08:17:43.054054 containerd[1722]: time="2025-08-19T08:17:43.053900119Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-7mkbh,Uid:0b186b0f-8823-43c8-9fad-d97a1d8bc250,Namespace:kube-system,Attempt:0,} returns sandbox id \"17065d654f2e648a7881747f7ca6b96b7e7ff6c7c902eed8a75617c78b40002f\"" Aug 19 08:17:43.057337 containerd[1722]: time="2025-08-19T08:17:43.057307341Z" level=info msg="CreateContainer within sandbox \"17065d654f2e648a7881747f7ca6b96b7e7ff6c7c902eed8a75617c78b40002f\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Aug 19 08:17:43.093060 containerd[1722]: time="2025-08-19T08:17:43.092968702Z" level=info msg="Container e136f16e942c8a8c40123a7ddfb27a537c96e1e858bb9688910e86e136e4d766: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:17:43.116545 containerd[1722]: time="2025-08-19T08:17:43.116520431Z" level=info msg="CreateContainer within sandbox \"17065d654f2e648a7881747f7ca6b96b7e7ff6c7c902eed8a75617c78b40002f\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"e136f16e942c8a8c40123a7ddfb27a537c96e1e858bb9688910e86e136e4d766\"" Aug 19 08:17:43.116975 containerd[1722]: time="2025-08-19T08:17:43.116859107Z" level=info msg="StartContainer for \"e136f16e942c8a8c40123a7ddfb27a537c96e1e858bb9688910e86e136e4d766\"" Aug 19 08:17:43.117899 containerd[1722]: time="2025-08-19T08:17:43.117836456Z" level=info msg="connecting to shim e136f16e942c8a8c40123a7ddfb27a537c96e1e858bb9688910e86e136e4d766" address="unix:///run/containerd/s/8e25c8da8f2087bdc204b589e878f540e4364e9ab267ca3274e168a15a86a495" protocol=ttrpc version=3 Aug 19 08:17:43.138177 systemd[1]: Started cri-containerd-e136f16e942c8a8c40123a7ddfb27a537c96e1e858bb9688910e86e136e4d766.scope - libcontainer container e136f16e942c8a8c40123a7ddfb27a537c96e1e858bb9688910e86e136e4d766. 
Aug 19 08:17:43.171509 containerd[1722]: time="2025-08-19T08:17:43.171481156Z" level=info msg="StartContainer for \"e136f16e942c8a8c40123a7ddfb27a537c96e1e858bb9688910e86e136e4d766\" returns successfully" Aug 19 08:17:43.806159 containerd[1722]: time="2025-08-19T08:17:43.806115016Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:43.808996 containerd[1722]: time="2025-08-19T08:17:43.808954945Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.2: active requests=0, bytes read=33083477" Aug 19 08:17:43.812973 containerd[1722]: time="2025-08-19T08:17:43.812927887Z" level=info msg="ImageCreate event name:\"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:43.818282 containerd[1722]: time="2025-08-19T08:17:43.818018531Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-655b7bdffd-hjd48,Uid:5acb51bf-65bc-4f86-93a4-1cc021ca500f,Namespace:calico-apiserver,Attempt:0,}" Aug 19 08:17:43.818461 containerd[1722]: time="2025-08-19T08:17:43.818427815Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:43.818996 containerd[1722]: time="2025-08-19T08:17:43.818969668Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" with image id \"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\", size \"33083307\" in 2.921113627s" Aug 19 08:17:43.819064 containerd[1722]: time="2025-08-19T08:17:43.818997028Z" level=info 
msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" returns image reference \"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\"" Aug 19 08:17:43.819431 containerd[1722]: time="2025-08-19T08:17:43.819344561Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86896c85c-k2466,Uid:afc774c5-998d-4b55-896a-2b3ba14bd2bf,Namespace:calico-system,Attempt:0,}" Aug 19 08:17:43.819490 containerd[1722]: time="2025-08-19T08:17:43.819472842Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-rw2mz,Uid:0d6984f4-4de2-408b-9ee3-92edac203b52,Namespace:calico-system,Attempt:0,}" Aug 19 08:17:43.821091 containerd[1722]: time="2025-08-19T08:17:43.820793812Z" level=info msg="CreateContainer within sandbox \"6e070ac6bac63459aef9d4ccec2fbffa0ea356b1b16157268eb3ca16703a0db3\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Aug 19 08:17:43.930846 containerd[1722]: time="2025-08-19T08:17:43.930814488Z" level=info msg="Container 22ba3f8ae30963783e380367cfae74617167cec78935dbc0cb30e607f51b5235: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:17:43.940897 systemd-networkd[1370]: cali3c4179ea93f: Link UP Aug 19 08:17:43.941834 systemd-networkd[1370]: cali3c4179ea93f: Gained carrier Aug 19 08:17:43.964436 containerd[1722]: 2025-08-19 08:17:43.863 [INFO][4729] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--hjd48-eth0 calico-apiserver-655b7bdffd- calico-apiserver 5acb51bf-65bc-4f86-93a4-1cc021ca500f 854 0 2025-08-19 08:17:14 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:655b7bdffd projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4426.0.0-a-5588c1b4cf calico-apiserver-655b7bdffd-hjd48 eth0 
calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali3c4179ea93f [] [] }} ContainerID="9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" Namespace="calico-apiserver" Pod="calico-apiserver-655b7bdffd-hjd48" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--hjd48-" Aug 19 08:17:43.964436 containerd[1722]: 2025-08-19 08:17:43.863 [INFO][4729] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" Namespace="calico-apiserver" Pod="calico-apiserver-655b7bdffd-hjd48" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--hjd48-eth0" Aug 19 08:17:43.964436 containerd[1722]: 2025-08-19 08:17:43.882 [INFO][4743] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" HandleID="k8s-pod-network.9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--hjd48-eth0" Aug 19 08:17:43.964436 containerd[1722]: 2025-08-19 08:17:43.882 [INFO][4743] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" HandleID="k8s-pod-network.9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--hjd48-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d58d0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4426.0.0-a-5588c1b4cf", "pod":"calico-apiserver-655b7bdffd-hjd48", "timestamp":"2025-08-19 08:17:43.88264821 +0000 UTC"}, Hostname:"ci-4426.0.0-a-5588c1b4cf", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), 
IntendedUse:"Workload"} Aug 19 08:17:43.964436 containerd[1722]: 2025-08-19 08:17:43.882 [INFO][4743] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 08:17:43.964436 containerd[1722]: 2025-08-19 08:17:43.882 [INFO][4743] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 08:17:43.964436 containerd[1722]: 2025-08-19 08:17:43.882 [INFO][4743] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.0.0-a-5588c1b4cf' Aug 19 08:17:43.964436 containerd[1722]: 2025-08-19 08:17:43.887 [INFO][4743] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:43.964436 containerd[1722]: 2025-08-19 08:17:43.895 [INFO][4743] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:43.964436 containerd[1722]: 2025-08-19 08:17:43.900 [INFO][4743] ipam/ipam.go 511: Trying affinity for 192.168.10.192/26 host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:43.964436 containerd[1722]: 2025-08-19 08:17:43.902 [INFO][4743] ipam/ipam.go 158: Attempting to load block cidr=192.168.10.192/26 host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:43.964436 containerd[1722]: 2025-08-19 08:17:43.905 [INFO][4743] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.10.192/26 host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:43.964436 containerd[1722]: 2025-08-19 08:17:43.905 [INFO][4743] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.10.192/26 handle="k8s-pod-network.9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:43.964436 containerd[1722]: 2025-08-19 08:17:43.908 [INFO][4743] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40 Aug 19 08:17:43.964436 containerd[1722]: 2025-08-19 08:17:43.914 [INFO][4743] 
ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.10.192/26 handle="k8s-pod-network.9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:43.964436 containerd[1722]: 2025-08-19 08:17:43.926 [INFO][4743] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.10.195/26] block=192.168.10.192/26 handle="k8s-pod-network.9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:43.964436 containerd[1722]: 2025-08-19 08:17:43.927 [INFO][4743] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.10.195/26] handle="k8s-pod-network.9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:43.964436 containerd[1722]: 2025-08-19 08:17:43.927 [INFO][4743] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 19 08:17:43.964436 containerd[1722]: 2025-08-19 08:17:43.927 [INFO][4743] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.10.195/26] IPv6=[] ContainerID="9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" HandleID="k8s-pod-network.9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--hjd48-eth0" Aug 19 08:17:43.966013 containerd[1722]: 2025-08-19 08:17:43.937 [INFO][4729] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" Namespace="calico-apiserver" Pod="calico-apiserver-655b7bdffd-hjd48" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--hjd48-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--hjd48-eth0", GenerateName:"calico-apiserver-655b7bdffd-", Namespace:"calico-apiserver", SelfLink:"", 
UID:"5acb51bf-65bc-4f86-93a4-1cc021ca500f", ResourceVersion:"854", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 17, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"655b7bdffd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-a-5588c1b4cf", ContainerID:"", Pod:"calico-apiserver-655b7bdffd-hjd48", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.10.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali3c4179ea93f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:17:43.966013 containerd[1722]: 2025-08-19 08:17:43.937 [INFO][4729] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.10.195/32] ContainerID="9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" Namespace="calico-apiserver" Pod="calico-apiserver-655b7bdffd-hjd48" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--hjd48-eth0" Aug 19 08:17:43.966013 containerd[1722]: 2025-08-19 08:17:43.937 [INFO][4729] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3c4179ea93f ContainerID="9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" Namespace="calico-apiserver" Pod="calico-apiserver-655b7bdffd-hjd48" 
WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--hjd48-eth0" Aug 19 08:17:43.966013 containerd[1722]: 2025-08-19 08:17:43.941 [INFO][4729] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" Namespace="calico-apiserver" Pod="calico-apiserver-655b7bdffd-hjd48" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--hjd48-eth0" Aug 19 08:17:43.966013 containerd[1722]: 2025-08-19 08:17:43.944 [INFO][4729] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" Namespace="calico-apiserver" Pod="calico-apiserver-655b7bdffd-hjd48" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--hjd48-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--hjd48-eth0", GenerateName:"calico-apiserver-655b7bdffd-", Namespace:"calico-apiserver", SelfLink:"", UID:"5acb51bf-65bc-4f86-93a4-1cc021ca500f", ResourceVersion:"854", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 17, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"655b7bdffd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-a-5588c1b4cf", 
ContainerID:"9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40", Pod:"calico-apiserver-655b7bdffd-hjd48", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.10.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali3c4179ea93f", MAC:"e2:d7:ed:ae:c0:d0", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:17:43.966013 containerd[1722]: 2025-08-19 08:17:43.961 [INFO][4729] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" Namespace="calico-apiserver" Pod="calico-apiserver-655b7bdffd-hjd48" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--hjd48-eth0" Aug 19 08:17:43.969052 containerd[1722]: time="2025-08-19T08:17:43.968001553Z" level=info msg="CreateContainer within sandbox \"6e070ac6bac63459aef9d4ccec2fbffa0ea356b1b16157268eb3ca16703a0db3\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"22ba3f8ae30963783e380367cfae74617167cec78935dbc0cb30e607f51b5235\"" Aug 19 08:17:43.970190 containerd[1722]: time="2025-08-19T08:17:43.970168123Z" level=info msg="StartContainer for \"22ba3f8ae30963783e380367cfae74617167cec78935dbc0cb30e607f51b5235\"" Aug 19 08:17:43.972863 containerd[1722]: time="2025-08-19T08:17:43.972838986Z" level=info msg="connecting to shim 22ba3f8ae30963783e380367cfae74617167cec78935dbc0cb30e607f51b5235" address="unix:///run/containerd/s/f36a905b2093727cfb97afc1d380d4eec02c4bc4b51490d8548012e42122ba74" protocol=ttrpc version=3 Aug 19 08:17:43.979142 kubelet[3115]: I0819 08:17:43.978778 3115 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-7mkbh" podStartSLOduration=38.978761543 podStartE2EDuration="38.978761543s" 
podCreationTimestamp="2025-08-19 08:17:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 08:17:43.978597962 +0000 UTC m=+43.239881484" watchObservedRunningTime="2025-08-19 08:17:43.978761543 +0000 UTC m=+43.240045065" Aug 19 08:17:44.014314 systemd[1]: Started cri-containerd-22ba3f8ae30963783e380367cfae74617167cec78935dbc0cb30e607f51b5235.scope - libcontainer container 22ba3f8ae30963783e380367cfae74617167cec78935dbc0cb30e607f51b5235. Aug 19 08:17:44.088261 containerd[1722]: time="2025-08-19T08:17:44.088172236Z" level=info msg="connecting to shim 9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" address="unix:///run/containerd/s/c53ced3041dd12da7b07e2646ffd8160261e55acef3631166bd2bb79768d0c30" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:17:44.124743 systemd-networkd[1370]: cali7e43da7444d: Link UP Aug 19 08:17:44.128730 systemd-networkd[1370]: cali7e43da7444d: Gained carrier Aug 19 08:17:44.136357 systemd[1]: Started cri-containerd-9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40.scope - libcontainer container 9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40. 
Aug 19 08:17:44.150805 containerd[1722]: 2025-08-19 08:17:43.965 [INFO][4749] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.0.0--a--5588c1b4cf-k8s-calico--kube--controllers--86896c85c--k2466-eth0 calico-kube-controllers-86896c85c- calico-system afc774c5-998d-4b55-896a-2b3ba14bd2bf 850 0 2025-08-19 08:17:18 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:86896c85c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4426.0.0-a-5588c1b4cf calico-kube-controllers-86896c85c-k2466 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali7e43da7444d [] [] }} ContainerID="f9355cf740c793b107413e98fb3e5a7e8e3ec18ec5831ad86da2d123844e618c" Namespace="calico-system" Pod="calico-kube-controllers-86896c85c-k2466" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--kube--controllers--86896c85c--k2466-" Aug 19 08:17:44.150805 containerd[1722]: 2025-08-19 08:17:43.965 [INFO][4749] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f9355cf740c793b107413e98fb3e5a7e8e3ec18ec5831ad86da2d123844e618c" Namespace="calico-system" Pod="calico-kube-controllers-86896c85c-k2466" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--kube--controllers--86896c85c--k2466-eth0" Aug 19 08:17:44.150805 containerd[1722]: 2025-08-19 08:17:44.047 [INFO][4783] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f9355cf740c793b107413e98fb3e5a7e8e3ec18ec5831ad86da2d123844e618c" HandleID="k8s-pod-network.f9355cf740c793b107413e98fb3e5a7e8e3ec18ec5831ad86da2d123844e618c" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-calico--kube--controllers--86896c85c--k2466-eth0" Aug 19 08:17:44.150805 containerd[1722]: 2025-08-19 08:17:44.047 [INFO][4783] ipam/ipam_plugin.go 265: Auto 
assigning IP ContainerID="f9355cf740c793b107413e98fb3e5a7e8e3ec18ec5831ad86da2d123844e618c" HandleID="k8s-pod-network.f9355cf740c793b107413e98fb3e5a7e8e3ec18ec5831ad86da2d123844e618c" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-calico--kube--controllers--86896c85c--k2466-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f620), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4426.0.0-a-5588c1b4cf", "pod":"calico-kube-controllers-86896c85c-k2466", "timestamp":"2025-08-19 08:17:44.046797236 +0000 UTC"}, Hostname:"ci-4426.0.0-a-5588c1b4cf", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 08:17:44.150805 containerd[1722]: 2025-08-19 08:17:44.047 [INFO][4783] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 08:17:44.150805 containerd[1722]: 2025-08-19 08:17:44.047 [INFO][4783] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 19 08:17:44.150805 containerd[1722]: 2025-08-19 08:17:44.048 [INFO][4783] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.0.0-a-5588c1b4cf' Aug 19 08:17:44.150805 containerd[1722]: 2025-08-19 08:17:44.055 [INFO][4783] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f9355cf740c793b107413e98fb3e5a7e8e3ec18ec5831ad86da2d123844e618c" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:44.150805 containerd[1722]: 2025-08-19 08:17:44.067 [INFO][4783] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:44.150805 containerd[1722]: 2025-08-19 08:17:44.072 [INFO][4783] ipam/ipam.go 511: Trying affinity for 192.168.10.192/26 host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:44.150805 containerd[1722]: 2025-08-19 08:17:44.074 [INFO][4783] ipam/ipam.go 158: Attempting to load block cidr=192.168.10.192/26 host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:44.150805 containerd[1722]: 2025-08-19 08:17:44.079 [INFO][4783] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.10.192/26 host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:44.150805 containerd[1722]: 2025-08-19 08:17:44.080 [INFO][4783] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.10.192/26 handle="k8s-pod-network.f9355cf740c793b107413e98fb3e5a7e8e3ec18ec5831ad86da2d123844e618c" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:44.150805 containerd[1722]: 2025-08-19 08:17:44.082 [INFO][4783] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.f9355cf740c793b107413e98fb3e5a7e8e3ec18ec5831ad86da2d123844e618c Aug 19 08:17:44.150805 containerd[1722]: 2025-08-19 08:17:44.091 [INFO][4783] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.10.192/26 handle="k8s-pod-network.f9355cf740c793b107413e98fb3e5a7e8e3ec18ec5831ad86da2d123844e618c" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:44.150805 containerd[1722]: 2025-08-19 08:17:44.104 [INFO][4783] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.10.196/26] block=192.168.10.192/26 handle="k8s-pod-network.f9355cf740c793b107413e98fb3e5a7e8e3ec18ec5831ad86da2d123844e618c" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:44.150805 containerd[1722]: 2025-08-19 08:17:44.104 [INFO][4783] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.10.196/26] handle="k8s-pod-network.f9355cf740c793b107413e98fb3e5a7e8e3ec18ec5831ad86da2d123844e618c" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:44.150805 containerd[1722]: 2025-08-19 08:17:44.104 [INFO][4783] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 19 08:17:44.150805 containerd[1722]: 2025-08-19 08:17:44.104 [INFO][4783] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.10.196/26] IPv6=[] ContainerID="f9355cf740c793b107413e98fb3e5a7e8e3ec18ec5831ad86da2d123844e618c" HandleID="k8s-pod-network.f9355cf740c793b107413e98fb3e5a7e8e3ec18ec5831ad86da2d123844e618c" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-calico--kube--controllers--86896c85c--k2466-eth0" Aug 19 08:17:44.151974 containerd[1722]: 2025-08-19 08:17:44.110 [INFO][4749] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f9355cf740c793b107413e98fb3e5a7e8e3ec18ec5831ad86da2d123844e618c" Namespace="calico-system" Pod="calico-kube-controllers-86896c85c-k2466" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--kube--controllers--86896c85c--k2466-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--a--5588c1b4cf-k8s-calico--kube--controllers--86896c85c--k2466-eth0", GenerateName:"calico-kube-controllers-86896c85c-", Namespace:"calico-system", SelfLink:"", UID:"afc774c5-998d-4b55-896a-2b3ba14bd2bf", ResourceVersion:"850", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 17, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"86896c85c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-a-5588c1b4cf", ContainerID:"", Pod:"calico-kube-controllers-86896c85c-k2466", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.10.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali7e43da7444d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:17:44.151974 containerd[1722]: 2025-08-19 08:17:44.111 [INFO][4749] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.10.196/32] ContainerID="f9355cf740c793b107413e98fb3e5a7e8e3ec18ec5831ad86da2d123844e618c" Namespace="calico-system" Pod="calico-kube-controllers-86896c85c-k2466" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--kube--controllers--86896c85c--k2466-eth0" Aug 19 08:17:44.151974 containerd[1722]: 2025-08-19 08:17:44.112 [INFO][4749] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7e43da7444d ContainerID="f9355cf740c793b107413e98fb3e5a7e8e3ec18ec5831ad86da2d123844e618c" Namespace="calico-system" Pod="calico-kube-controllers-86896c85c-k2466" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--kube--controllers--86896c85c--k2466-eth0" Aug 19 08:17:44.151974 containerd[1722]: 2025-08-19 08:17:44.130 [INFO][4749] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="f9355cf740c793b107413e98fb3e5a7e8e3ec18ec5831ad86da2d123844e618c" Namespace="calico-system" Pod="calico-kube-controllers-86896c85c-k2466" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--kube--controllers--86896c85c--k2466-eth0" Aug 19 08:17:44.151974 containerd[1722]: 2025-08-19 08:17:44.134 [INFO][4749] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f9355cf740c793b107413e98fb3e5a7e8e3ec18ec5831ad86da2d123844e618c" Namespace="calico-system" Pod="calico-kube-controllers-86896c85c-k2466" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--kube--controllers--86896c85c--k2466-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--a--5588c1b4cf-k8s-calico--kube--controllers--86896c85c--k2466-eth0", GenerateName:"calico-kube-controllers-86896c85c-", Namespace:"calico-system", SelfLink:"", UID:"afc774c5-998d-4b55-896a-2b3ba14bd2bf", ResourceVersion:"850", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 17, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"86896c85c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-a-5588c1b4cf", ContainerID:"f9355cf740c793b107413e98fb3e5a7e8e3ec18ec5831ad86da2d123844e618c", Pod:"calico-kube-controllers-86896c85c-k2466", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.10.196/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali7e43da7444d", MAC:"a6:8d:23:29:2a:80", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:17:44.151974 containerd[1722]: 2025-08-19 08:17:44.147 [INFO][4749] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f9355cf740c793b107413e98fb3e5a7e8e3ec18ec5831ad86da2d123844e618c" Namespace="calico-system" Pod="calico-kube-controllers-86896c85c-k2466" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--kube--controllers--86896c85c--k2466-eth0" Aug 19 08:17:44.196839 containerd[1722]: time="2025-08-19T08:17:44.196809187Z" level=info msg="StartContainer for \"22ba3f8ae30963783e380367cfae74617167cec78935dbc0cb30e607f51b5235\" returns successfully" Aug 19 08:17:44.222480 systemd-networkd[1370]: cali2f1af18d8ee: Link UP Aug 19 08:17:44.223546 systemd-networkd[1370]: cali2f1af18d8ee: Gained carrier Aug 19 08:17:44.242607 containerd[1722]: 2025-08-19 08:17:43.993 [INFO][4760] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.0.0--a--5588c1b4cf-k8s-csi--node--driver--rw2mz-eth0 csi-node-driver- calico-system 0d6984f4-4de2-408b-9ee3-92edac203b52 734 0 2025-08-19 08:17:18 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:57bd658777 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4426.0.0-a-5588c1b4cf csi-node-driver-rw2mz eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali2f1af18d8ee [] [] }} ContainerID="e60ff164a264955253704abfb2d7b0b421f52723bc30c20f4517a9cef1071ad3" Namespace="calico-system" Pod="csi-node-driver-rw2mz" 
WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-csi--node--driver--rw2mz-" Aug 19 08:17:44.242607 containerd[1722]: 2025-08-19 08:17:43.994 [INFO][4760] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e60ff164a264955253704abfb2d7b0b421f52723bc30c20f4517a9cef1071ad3" Namespace="calico-system" Pod="csi-node-driver-rw2mz" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-csi--node--driver--rw2mz-eth0" Aug 19 08:17:44.242607 containerd[1722]: 2025-08-19 08:17:44.057 [INFO][4801] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e60ff164a264955253704abfb2d7b0b421f52723bc30c20f4517a9cef1071ad3" HandleID="k8s-pod-network.e60ff164a264955253704abfb2d7b0b421f52723bc30c20f4517a9cef1071ad3" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-csi--node--driver--rw2mz-eth0" Aug 19 08:17:44.242607 containerd[1722]: 2025-08-19 08:17:44.057 [INFO][4801] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e60ff164a264955253704abfb2d7b0b421f52723bc30c20f4517a9cef1071ad3" HandleID="k8s-pod-network.e60ff164a264955253704abfb2d7b0b421f52723bc30c20f4517a9cef1071ad3" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-csi--node--driver--rw2mz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5710), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4426.0.0-a-5588c1b4cf", "pod":"csi-node-driver-rw2mz", "timestamp":"2025-08-19 08:17:44.057172374 +0000 UTC"}, Hostname:"ci-4426.0.0-a-5588c1b4cf", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 08:17:44.242607 containerd[1722]: 2025-08-19 08:17:44.057 [INFO][4801] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 08:17:44.242607 containerd[1722]: 2025-08-19 08:17:44.104 [INFO][4801] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 19 08:17:44.242607 containerd[1722]: 2025-08-19 08:17:44.104 [INFO][4801] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.0.0-a-5588c1b4cf' Aug 19 08:17:44.242607 containerd[1722]: 2025-08-19 08:17:44.158 [INFO][4801] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e60ff164a264955253704abfb2d7b0b421f52723bc30c20f4517a9cef1071ad3" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:44.242607 containerd[1722]: 2025-08-19 08:17:44.170 [INFO][4801] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:44.242607 containerd[1722]: 2025-08-19 08:17:44.183 [INFO][4801] ipam/ipam.go 511: Trying affinity for 192.168.10.192/26 host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:44.242607 containerd[1722]: 2025-08-19 08:17:44.185 [INFO][4801] ipam/ipam.go 158: Attempting to load block cidr=192.168.10.192/26 host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:44.242607 containerd[1722]: 2025-08-19 08:17:44.189 [INFO][4801] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.10.192/26 host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:44.242607 containerd[1722]: 2025-08-19 08:17:44.189 [INFO][4801] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.10.192/26 handle="k8s-pod-network.e60ff164a264955253704abfb2d7b0b421f52723bc30c20f4517a9cef1071ad3" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:44.242607 containerd[1722]: 2025-08-19 08:17:44.191 [INFO][4801] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.e60ff164a264955253704abfb2d7b0b421f52723bc30c20f4517a9cef1071ad3 Aug 19 08:17:44.242607 containerd[1722]: 2025-08-19 08:17:44.201 [INFO][4801] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.10.192/26 handle="k8s-pod-network.e60ff164a264955253704abfb2d7b0b421f52723bc30c20f4517a9cef1071ad3" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:44.242607 containerd[1722]: 2025-08-19 08:17:44.217 [INFO][4801] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.10.197/26] block=192.168.10.192/26 handle="k8s-pod-network.e60ff164a264955253704abfb2d7b0b421f52723bc30c20f4517a9cef1071ad3" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:44.242607 containerd[1722]: 2025-08-19 08:17:44.217 [INFO][4801] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.10.197/26] handle="k8s-pod-network.e60ff164a264955253704abfb2d7b0b421f52723bc30c20f4517a9cef1071ad3" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:44.242607 containerd[1722]: 2025-08-19 08:17:44.217 [INFO][4801] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 19 08:17:44.242607 containerd[1722]: 2025-08-19 08:17:44.217 [INFO][4801] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.10.197/26] IPv6=[] ContainerID="e60ff164a264955253704abfb2d7b0b421f52723bc30c20f4517a9cef1071ad3" HandleID="k8s-pod-network.e60ff164a264955253704abfb2d7b0b421f52723bc30c20f4517a9cef1071ad3" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-csi--node--driver--rw2mz-eth0" Aug 19 08:17:44.243768 containerd[1722]: 2025-08-19 08:17:44.219 [INFO][4760] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e60ff164a264955253704abfb2d7b0b421f52723bc30c20f4517a9cef1071ad3" Namespace="calico-system" Pod="csi-node-driver-rw2mz" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-csi--node--driver--rw2mz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--a--5588c1b4cf-k8s-csi--node--driver--rw2mz-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"0d6984f4-4de2-408b-9ee3-92edac203b52", ResourceVersion:"734", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 17, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"57bd658777", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-a-5588c1b4cf", ContainerID:"", Pod:"csi-node-driver-rw2mz", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.10.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali2f1af18d8ee", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:17:44.243768 containerd[1722]: 2025-08-19 08:17:44.219 [INFO][4760] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.10.197/32] ContainerID="e60ff164a264955253704abfb2d7b0b421f52723bc30c20f4517a9cef1071ad3" Namespace="calico-system" Pod="csi-node-driver-rw2mz" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-csi--node--driver--rw2mz-eth0" Aug 19 08:17:44.243768 containerd[1722]: 2025-08-19 08:17:44.219 [INFO][4760] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2f1af18d8ee ContainerID="e60ff164a264955253704abfb2d7b0b421f52723bc30c20f4517a9cef1071ad3" Namespace="calico-system" Pod="csi-node-driver-rw2mz" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-csi--node--driver--rw2mz-eth0" Aug 19 08:17:44.243768 containerd[1722]: 2025-08-19 08:17:44.222 [INFO][4760] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e60ff164a264955253704abfb2d7b0b421f52723bc30c20f4517a9cef1071ad3" Namespace="calico-system" Pod="csi-node-driver-rw2mz" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-csi--node--driver--rw2mz-eth0" Aug 19 08:17:44.243768 
containerd[1722]: 2025-08-19 08:17:44.222 [INFO][4760] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e60ff164a264955253704abfb2d7b0b421f52723bc30c20f4517a9cef1071ad3" Namespace="calico-system" Pod="csi-node-driver-rw2mz" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-csi--node--driver--rw2mz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--a--5588c1b4cf-k8s-csi--node--driver--rw2mz-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"0d6984f4-4de2-408b-9ee3-92edac203b52", ResourceVersion:"734", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 17, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"57bd658777", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-a-5588c1b4cf", ContainerID:"e60ff164a264955253704abfb2d7b0b421f52723bc30c20f4517a9cef1071ad3", Pod:"csi-node-driver-rw2mz", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.10.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali2f1af18d8ee", MAC:"da:73:da:9f:fc:f6", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:17:44.243768 containerd[1722]: 
2025-08-19 08:17:44.237 [INFO][4760] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e60ff164a264955253704abfb2d7b0b421f52723bc30c20f4517a9cef1071ad3" Namespace="calico-system" Pod="csi-node-driver-rw2mz" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-csi--node--driver--rw2mz-eth0" Aug 19 08:17:44.249802 containerd[1722]: time="2025-08-19T08:17:44.249415077Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-655b7bdffd-hjd48,Uid:5acb51bf-65bc-4f86-93a4-1cc021ca500f,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40\"" Aug 19 08:17:44.250946 containerd[1722]: time="2025-08-19T08:17:44.250916505Z" level=info msg="connecting to shim f9355cf740c793b107413e98fb3e5a7e8e3ec18ec5831ad86da2d123844e618c" address="unix:///run/containerd/s/ae595532f4b3ccfb12f72c71509e4362d8fe0505799440addeceee5c7fcdae92" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:17:44.253311 containerd[1722]: time="2025-08-19T08:17:44.253288930Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Aug 19 08:17:44.279213 systemd[1]: Started cri-containerd-f9355cf740c793b107413e98fb3e5a7e8e3ec18ec5831ad86da2d123844e618c.scope - libcontainer container f9355cf740c793b107413e98fb3e5a7e8e3ec18ec5831ad86da2d123844e618c. 
Aug 19 08:17:44.304288 containerd[1722]: time="2025-08-19T08:17:44.304234468Z" level=info msg="connecting to shim e60ff164a264955253704abfb2d7b0b421f52723bc30c20f4517a9cef1071ad3" address="unix:///run/containerd/s/cca107b4abea69eec5f359c875ce378134f20f85076b9a8a44260c371503f93f" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:17:44.325520 containerd[1722]: time="2025-08-19T08:17:44.325497665Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86896c85c-k2466,Uid:afc774c5-998d-4b55-896a-2b3ba14bd2bf,Namespace:calico-system,Attempt:0,} returns sandbox id \"f9355cf740c793b107413e98fb3e5a7e8e3ec18ec5831ad86da2d123844e618c\"" Aug 19 08:17:44.329164 systemd[1]: Started cri-containerd-e60ff164a264955253704abfb2d7b0b421f52723bc30c20f4517a9cef1071ad3.scope - libcontainer container e60ff164a264955253704abfb2d7b0b421f52723bc30c20f4517a9cef1071ad3. Aug 19 08:17:44.351412 containerd[1722]: time="2025-08-19T08:17:44.351345768Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-rw2mz,Uid:0d6984f4-4de2-408b-9ee3-92edac203b52,Namespace:calico-system,Attempt:0,} returns sandbox id \"e60ff164a264955253704abfb2d7b0b421f52723bc30c20f4517a9cef1071ad3\"" Aug 19 08:17:44.818608 containerd[1722]: time="2025-08-19T08:17:44.818351138Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-655b7bdffd-n6vc8,Uid:977f07be-ddc3-40de-8dad-97e5d4d17950,Namespace:calico-apiserver,Attempt:0,}" Aug 19 08:17:44.837977 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3374517449.mount: Deactivated successfully. 
Aug 19 08:17:44.916280 systemd-networkd[1370]: calif951dd5a7dd: Link UP Aug 19 08:17:44.916919 systemd-networkd[1370]: calif951dd5a7dd: Gained carrier Aug 19 08:17:44.931195 containerd[1722]: 2025-08-19 08:17:44.860 [INFO][4996] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--n6vc8-eth0 calico-apiserver-655b7bdffd- calico-apiserver 977f07be-ddc3-40de-8dad-97e5d4d17950 855 0 2025-08-19 08:17:14 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:655b7bdffd projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4426.0.0-a-5588c1b4cf calico-apiserver-655b7bdffd-n6vc8 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calif951dd5a7dd [] [] }} ContainerID="43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" Namespace="calico-apiserver" Pod="calico-apiserver-655b7bdffd-n6vc8" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--n6vc8-" Aug 19 08:17:44.931195 containerd[1722]: 2025-08-19 08:17:44.860 [INFO][4996] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" Namespace="calico-apiserver" Pod="calico-apiserver-655b7bdffd-n6vc8" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--n6vc8-eth0" Aug 19 08:17:44.931195 containerd[1722]: 2025-08-19 08:17:44.880 [INFO][5007] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" HandleID="k8s-pod-network.43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--n6vc8-eth0" Aug 19 
08:17:44.931195 containerd[1722]: 2025-08-19 08:17:44.880 [INFO][5007] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" HandleID="k8s-pod-network.43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--n6vc8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5860), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4426.0.0-a-5588c1b4cf", "pod":"calico-apiserver-655b7bdffd-n6vc8", "timestamp":"2025-08-19 08:17:44.880562834 +0000 UTC"}, Hostname:"ci-4426.0.0-a-5588c1b4cf", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 08:17:44.931195 containerd[1722]: 2025-08-19 08:17:44.880 [INFO][5007] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 08:17:44.931195 containerd[1722]: 2025-08-19 08:17:44.880 [INFO][5007] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 19 08:17:44.931195 containerd[1722]: 2025-08-19 08:17:44.880 [INFO][5007] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.0.0-a-5588c1b4cf' Aug 19 08:17:44.931195 containerd[1722]: 2025-08-19 08:17:44.890 [INFO][5007] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:44.931195 containerd[1722]: 2025-08-19 08:17:44.895 [INFO][5007] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:44.931195 containerd[1722]: 2025-08-19 08:17:44.899 [INFO][5007] ipam/ipam.go 511: Trying affinity for 192.168.10.192/26 host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:44.931195 containerd[1722]: 2025-08-19 08:17:44.900 [INFO][5007] ipam/ipam.go 158: Attempting to load block cidr=192.168.10.192/26 host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:44.931195 containerd[1722]: 2025-08-19 08:17:44.902 [INFO][5007] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.10.192/26 host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:44.931195 containerd[1722]: 2025-08-19 08:17:44.902 [INFO][5007] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.10.192/26 handle="k8s-pod-network.43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:44.931195 containerd[1722]: 2025-08-19 08:17:44.902 [INFO][5007] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b Aug 19 08:17:44.931195 containerd[1722]: 2025-08-19 08:17:44.906 [INFO][5007] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.10.192/26 handle="k8s-pod-network.43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:44.931195 containerd[1722]: 2025-08-19 08:17:44.913 [INFO][5007] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.10.198/26] block=192.168.10.192/26 handle="k8s-pod-network.43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:44.931195 containerd[1722]: 2025-08-19 08:17:44.913 [INFO][5007] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.10.198/26] handle="k8s-pod-network.43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:44.931195 containerd[1722]: 2025-08-19 08:17:44.913 [INFO][5007] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 19 08:17:44.931195 containerd[1722]: 2025-08-19 08:17:44.913 [INFO][5007] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.10.198/26] IPv6=[] ContainerID="43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" HandleID="k8s-pod-network.43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--n6vc8-eth0" Aug 19 08:17:44.932242 containerd[1722]: 2025-08-19 08:17:44.914 [INFO][4996] cni-plugin/k8s.go 418: Populated endpoint ContainerID="43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" Namespace="calico-apiserver" Pod="calico-apiserver-655b7bdffd-n6vc8" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--n6vc8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--n6vc8-eth0", GenerateName:"calico-apiserver-655b7bdffd-", Namespace:"calico-apiserver", SelfLink:"", UID:"977f07be-ddc3-40de-8dad-97e5d4d17950", ResourceVersion:"855", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 17, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"655b7bdffd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-a-5588c1b4cf", ContainerID:"", Pod:"calico-apiserver-655b7bdffd-n6vc8", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.10.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif951dd5a7dd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:17:44.932242 containerd[1722]: 2025-08-19 08:17:44.914 [INFO][4996] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.10.198/32] ContainerID="43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" Namespace="calico-apiserver" Pod="calico-apiserver-655b7bdffd-n6vc8" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--n6vc8-eth0" Aug 19 08:17:44.932242 containerd[1722]: 2025-08-19 08:17:44.914 [INFO][4996] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif951dd5a7dd ContainerID="43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" Namespace="calico-apiserver" Pod="calico-apiserver-655b7bdffd-n6vc8" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--n6vc8-eth0" Aug 19 08:17:44.932242 containerd[1722]: 2025-08-19 08:17:44.917 [INFO][4996] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" Namespace="calico-apiserver" 
Pod="calico-apiserver-655b7bdffd-n6vc8" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--n6vc8-eth0" Aug 19 08:17:44.932242 containerd[1722]: 2025-08-19 08:17:44.917 [INFO][4996] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" Namespace="calico-apiserver" Pod="calico-apiserver-655b7bdffd-n6vc8" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--n6vc8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--n6vc8-eth0", GenerateName:"calico-apiserver-655b7bdffd-", Namespace:"calico-apiserver", SelfLink:"", UID:"977f07be-ddc3-40de-8dad-97e5d4d17950", ResourceVersion:"855", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 17, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"655b7bdffd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-a-5588c1b4cf", ContainerID:"43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b", Pod:"calico-apiserver-655b7bdffd-n6vc8", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.10.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, 
InterfaceName:"calif951dd5a7dd", MAC:"ca:86:1d:11:a6:19", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:17:44.932242 containerd[1722]: 2025-08-19 08:17:44.927 [INFO][4996] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" Namespace="calico-apiserver" Pod="calico-apiserver-655b7bdffd-n6vc8" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--n6vc8-eth0" Aug 19 08:17:44.965138 systemd-networkd[1370]: calif6f27f0e65b: Gained IPv6LL Aug 19 08:17:44.974109 kubelet[3115]: I0819 08:17:44.973651 3115 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-6fc69bf457-pzdkm" podStartSLOduration=1.920189054 podStartE2EDuration="6.973632908s" podCreationTimestamp="2025-08-19 08:17:38 +0000 UTC" firstStartedPulling="2025-08-19 08:17:38.766328341 +0000 UTC m=+38.027611864" lastFinishedPulling="2025-08-19 08:17:43.819772194 +0000 UTC m=+43.081055718" observedRunningTime="2025-08-19 08:17:44.972131956 +0000 UTC m=+44.233415475" watchObservedRunningTime="2025-08-19 08:17:44.973632908 +0000 UTC m=+44.234916429" Aug 19 08:17:45.042440 containerd[1722]: time="2025-08-19T08:17:45.042399013Z" level=info msg="connecting to shim 43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" address="unix:///run/containerd/s/0c5ec1815d4add3f3ed901bd3e7010483a7845e9ba686cc2a32b0b9afe66f8ee" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:17:45.065175 systemd[1]: Started cri-containerd-43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b.scope - libcontainer container 43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b. 
Aug 19 08:17:45.109253 containerd[1722]: time="2025-08-19T08:17:45.109225875Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-655b7bdffd-n6vc8,Uid:977f07be-ddc3-40de-8dad-97e5d4d17950,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b\"" Aug 19 08:17:45.477157 systemd-networkd[1370]: cali3c4179ea93f: Gained IPv6LL Aug 19 08:17:45.541219 systemd-networkd[1370]: cali2f1af18d8ee: Gained IPv6LL Aug 19 08:17:45.541616 systemd-networkd[1370]: cali7e43da7444d: Gained IPv6LL Aug 19 08:17:45.819498 containerd[1722]: time="2025-08-19T08:17:45.819389192Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-wkmf2,Uid:6637040d-4229-4a0c-9583-c7b1e340ca6c,Namespace:kube-system,Attempt:0,}" Aug 19 08:17:45.820821 containerd[1722]: time="2025-08-19T08:17:45.819389192Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-h982d,Uid:2bb59b89-ca2b-49e1-bc22-00af9770adb2,Namespace:calico-system,Attempt:0,}" Aug 19 08:17:45.989767 systemd-networkd[1370]: cali09a91dd12be: Link UP Aug 19 08:17:45.991803 systemd-networkd[1370]: cali09a91dd12be: Gained carrier Aug 19 08:17:46.009818 containerd[1722]: 2025-08-19 08:17:45.903 [INFO][5075] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.0.0--a--5588c1b4cf-k8s-coredns--7c65d6cfc9--wkmf2-eth0 coredns-7c65d6cfc9- kube-system 6637040d-4229-4a0c-9583-c7b1e340ca6c 858 0 2025-08-19 08:17:05 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4426.0.0-a-5588c1b4cf coredns-7c65d6cfc9-wkmf2 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali09a91dd12be [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} 
ContainerID="17d9ea4654124337cf77332c8f428396baa685047f9000f65f0e1b46bb1e319c" Namespace="kube-system" Pod="coredns-7c65d6cfc9-wkmf2" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-coredns--7c65d6cfc9--wkmf2-" Aug 19 08:17:46.009818 containerd[1722]: 2025-08-19 08:17:45.903 [INFO][5075] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="17d9ea4654124337cf77332c8f428396baa685047f9000f65f0e1b46bb1e319c" Namespace="kube-system" Pod="coredns-7c65d6cfc9-wkmf2" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-coredns--7c65d6cfc9--wkmf2-eth0" Aug 19 08:17:46.009818 containerd[1722]: 2025-08-19 08:17:45.944 [INFO][5101] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="17d9ea4654124337cf77332c8f428396baa685047f9000f65f0e1b46bb1e319c" HandleID="k8s-pod-network.17d9ea4654124337cf77332c8f428396baa685047f9000f65f0e1b46bb1e319c" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-coredns--7c65d6cfc9--wkmf2-eth0" Aug 19 08:17:46.009818 containerd[1722]: 2025-08-19 08:17:45.944 [INFO][5101] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="17d9ea4654124337cf77332c8f428396baa685047f9000f65f0e1b46bb1e319c" HandleID="k8s-pod-network.17d9ea4654124337cf77332c8f428396baa685047f9000f65f0e1b46bb1e319c" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-coredns--7c65d6cfc9--wkmf2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024fa20), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4426.0.0-a-5588c1b4cf", "pod":"coredns-7c65d6cfc9-wkmf2", "timestamp":"2025-08-19 08:17:45.944140096 +0000 UTC"}, Hostname:"ci-4426.0.0-a-5588c1b4cf", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 08:17:46.009818 containerd[1722]: 2025-08-19 08:17:45.944 [INFO][5101] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Aug 19 08:17:46.009818 containerd[1722]: 2025-08-19 08:17:45.944 [INFO][5101] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 08:17:46.009818 containerd[1722]: 2025-08-19 08:17:45.944 [INFO][5101] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.0.0-a-5588c1b4cf' Aug 19 08:17:46.009818 containerd[1722]: 2025-08-19 08:17:45.951 [INFO][5101] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.17d9ea4654124337cf77332c8f428396baa685047f9000f65f0e1b46bb1e319c" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:46.009818 containerd[1722]: 2025-08-19 08:17:45.955 [INFO][5101] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:46.009818 containerd[1722]: 2025-08-19 08:17:45.958 [INFO][5101] ipam/ipam.go 511: Trying affinity for 192.168.10.192/26 host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:46.009818 containerd[1722]: 2025-08-19 08:17:45.960 [INFO][5101] ipam/ipam.go 158: Attempting to load block cidr=192.168.10.192/26 host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:46.009818 containerd[1722]: 2025-08-19 08:17:45.961 [INFO][5101] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.10.192/26 host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:46.009818 containerd[1722]: 2025-08-19 08:17:45.961 [INFO][5101] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.10.192/26 handle="k8s-pod-network.17d9ea4654124337cf77332c8f428396baa685047f9000f65f0e1b46bb1e319c" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:46.009818 containerd[1722]: 2025-08-19 08:17:45.962 [INFO][5101] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.17d9ea4654124337cf77332c8f428396baa685047f9000f65f0e1b46bb1e319c Aug 19 08:17:46.009818 containerd[1722]: 2025-08-19 08:17:45.967 [INFO][5101] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.10.192/26 handle="k8s-pod-network.17d9ea4654124337cf77332c8f428396baa685047f9000f65f0e1b46bb1e319c" 
host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:46.009818 containerd[1722]: 2025-08-19 08:17:45.975 [INFO][5101] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.10.199/26] block=192.168.10.192/26 handle="k8s-pod-network.17d9ea4654124337cf77332c8f428396baa685047f9000f65f0e1b46bb1e319c" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:46.009818 containerd[1722]: 2025-08-19 08:17:45.975 [INFO][5101] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.10.199/26] handle="k8s-pod-network.17d9ea4654124337cf77332c8f428396baa685047f9000f65f0e1b46bb1e319c" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:46.009818 containerd[1722]: 2025-08-19 08:17:45.975 [INFO][5101] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 19 08:17:46.009818 containerd[1722]: 2025-08-19 08:17:45.975 [INFO][5101] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.10.199/26] IPv6=[] ContainerID="17d9ea4654124337cf77332c8f428396baa685047f9000f65f0e1b46bb1e319c" HandleID="k8s-pod-network.17d9ea4654124337cf77332c8f428396baa685047f9000f65f0e1b46bb1e319c" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-coredns--7c65d6cfc9--wkmf2-eth0" Aug 19 08:17:46.010558 containerd[1722]: 2025-08-19 08:17:45.980 [INFO][5075] cni-plugin/k8s.go 418: Populated endpoint ContainerID="17d9ea4654124337cf77332c8f428396baa685047f9000f65f0e1b46bb1e319c" Namespace="kube-system" Pod="coredns-7c65d6cfc9-wkmf2" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-coredns--7c65d6cfc9--wkmf2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--a--5588c1b4cf-k8s-coredns--7c65d6cfc9--wkmf2-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"6637040d-4229-4a0c-9583-c7b1e340ca6c", ResourceVersion:"858", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 17, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-a-5588c1b4cf", ContainerID:"", Pod:"coredns-7c65d6cfc9-wkmf2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.10.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali09a91dd12be", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:17:46.010558 containerd[1722]: 2025-08-19 08:17:45.980 [INFO][5075] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.10.199/32] ContainerID="17d9ea4654124337cf77332c8f428396baa685047f9000f65f0e1b46bb1e319c" Namespace="kube-system" Pod="coredns-7c65d6cfc9-wkmf2" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-coredns--7c65d6cfc9--wkmf2-eth0" Aug 19 08:17:46.010558 containerd[1722]: 2025-08-19 08:17:45.980 [INFO][5075] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali09a91dd12be ContainerID="17d9ea4654124337cf77332c8f428396baa685047f9000f65f0e1b46bb1e319c" Namespace="kube-system" Pod="coredns-7c65d6cfc9-wkmf2" 
WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-coredns--7c65d6cfc9--wkmf2-eth0" Aug 19 08:17:46.010558 containerd[1722]: 2025-08-19 08:17:45.992 [INFO][5075] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="17d9ea4654124337cf77332c8f428396baa685047f9000f65f0e1b46bb1e319c" Namespace="kube-system" Pod="coredns-7c65d6cfc9-wkmf2" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-coredns--7c65d6cfc9--wkmf2-eth0" Aug 19 08:17:46.010558 containerd[1722]: 2025-08-19 08:17:45.993 [INFO][5075] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="17d9ea4654124337cf77332c8f428396baa685047f9000f65f0e1b46bb1e319c" Namespace="kube-system" Pod="coredns-7c65d6cfc9-wkmf2" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-coredns--7c65d6cfc9--wkmf2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--a--5588c1b4cf-k8s-coredns--7c65d6cfc9--wkmf2-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"6637040d-4229-4a0c-9583-c7b1e340ca6c", ResourceVersion:"858", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 17, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-a-5588c1b4cf", ContainerID:"17d9ea4654124337cf77332c8f428396baa685047f9000f65f0e1b46bb1e319c", Pod:"coredns-7c65d6cfc9-wkmf2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.10.199/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali09a91dd12be", MAC:"42:22:53:51:2d:d5", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:17:46.010558 containerd[1722]: 2025-08-19 08:17:46.006 [INFO][5075] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="17d9ea4654124337cf77332c8f428396baa685047f9000f65f0e1b46bb1e319c" Namespace="kube-system" Pod="coredns-7c65d6cfc9-wkmf2" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-coredns--7c65d6cfc9--wkmf2-eth0" Aug 19 08:17:46.080865 containerd[1722]: time="2025-08-19T08:17:46.080186671Z" level=info msg="connecting to shim 17d9ea4654124337cf77332c8f428396baa685047f9000f65f0e1b46bb1e319c" address="unix:///run/containerd/s/c6583227a6178409e1767962583a9b204b540bc75b9ea2cd2b3ccdae61518fc4" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:17:46.105726 systemd-networkd[1370]: cali774e7bcece9: Link UP Aug 19 08:17:46.105842 systemd-networkd[1370]: cali774e7bcece9: Gained carrier Aug 19 08:17:46.117625 systemd[1]: Started cri-containerd-17d9ea4654124337cf77332c8f428396baa685047f9000f65f0e1b46bb1e319c.scope - libcontainer container 17d9ea4654124337cf77332c8f428396baa685047f9000f65f0e1b46bb1e319c. 
Aug 19 08:17:46.141004 containerd[1722]: 2025-08-19 08:17:45.908 [INFO][5081] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.0.0--a--5588c1b4cf-k8s-goldmane--58fd7646b9--h982d-eth0 goldmane-58fd7646b9- calico-system 2bb59b89-ca2b-49e1-bc22-00af9770adb2 852 0 2025-08-19 08:17:17 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:58fd7646b9 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4426.0.0-a-5588c1b4cf goldmane-58fd7646b9-h982d eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali774e7bcece9 [] [] }} ContainerID="aa2bab2773ef40ef77e04feb51c9919a9c1abee5a59f11f25fd48b589efb4f3b" Namespace="calico-system" Pod="goldmane-58fd7646b9-h982d" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-goldmane--58fd7646b9--h982d-" Aug 19 08:17:46.141004 containerd[1722]: 2025-08-19 08:17:45.908 [INFO][5081] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="aa2bab2773ef40ef77e04feb51c9919a9c1abee5a59f11f25fd48b589efb4f3b" Namespace="calico-system" Pod="goldmane-58fd7646b9-h982d" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-goldmane--58fd7646b9--h982d-eth0" Aug 19 08:17:46.141004 containerd[1722]: 2025-08-19 08:17:45.950 [INFO][5106] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="aa2bab2773ef40ef77e04feb51c9919a9c1abee5a59f11f25fd48b589efb4f3b" HandleID="k8s-pod-network.aa2bab2773ef40ef77e04feb51c9919a9c1abee5a59f11f25fd48b589efb4f3b" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-goldmane--58fd7646b9--h982d-eth0" Aug 19 08:17:46.141004 containerd[1722]: 2025-08-19 08:17:45.951 [INFO][5106] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="aa2bab2773ef40ef77e04feb51c9919a9c1abee5a59f11f25fd48b589efb4f3b" HandleID="k8s-pod-network.aa2bab2773ef40ef77e04feb51c9919a9c1abee5a59f11f25fd48b589efb4f3b" 
Workload="ci--4426.0.0--a--5588c1b4cf-k8s-goldmane--58fd7646b9--h982d-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5010), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4426.0.0-a-5588c1b4cf", "pod":"goldmane-58fd7646b9-h982d", "timestamp":"2025-08-19 08:17:45.950863104 +0000 UTC"}, Hostname:"ci-4426.0.0-a-5588c1b4cf", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 08:17:46.141004 containerd[1722]: 2025-08-19 08:17:45.951 [INFO][5106] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 08:17:46.141004 containerd[1722]: 2025-08-19 08:17:45.975 [INFO][5106] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 08:17:46.141004 containerd[1722]: 2025-08-19 08:17:45.975 [INFO][5106] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.0.0-a-5588c1b4cf' Aug 19 08:17:46.141004 containerd[1722]: 2025-08-19 08:17:46.054 [INFO][5106] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.aa2bab2773ef40ef77e04feb51c9919a9c1abee5a59f11f25fd48b589efb4f3b" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:46.141004 containerd[1722]: 2025-08-19 08:17:46.057 [INFO][5106] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:46.141004 containerd[1722]: 2025-08-19 08:17:46.061 [INFO][5106] ipam/ipam.go 511: Trying affinity for 192.168.10.192/26 host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:46.141004 containerd[1722]: 2025-08-19 08:17:46.063 [INFO][5106] ipam/ipam.go 158: Attempting to load block cidr=192.168.10.192/26 host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:46.141004 containerd[1722]: 2025-08-19 08:17:46.067 [INFO][5106] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.10.192/26 host="ci-4426.0.0-a-5588c1b4cf" Aug 19 
08:17:46.141004 containerd[1722]: 2025-08-19 08:17:46.067 [INFO][5106] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.10.192/26 handle="k8s-pod-network.aa2bab2773ef40ef77e04feb51c9919a9c1abee5a59f11f25fd48b589efb4f3b" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:46.141004 containerd[1722]: 2025-08-19 08:17:46.070 [INFO][5106] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.aa2bab2773ef40ef77e04feb51c9919a9c1abee5a59f11f25fd48b589efb4f3b Aug 19 08:17:46.141004 containerd[1722]: 2025-08-19 08:17:46.080 [INFO][5106] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.10.192/26 handle="k8s-pod-network.aa2bab2773ef40ef77e04feb51c9919a9c1abee5a59f11f25fd48b589efb4f3b" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:46.141004 containerd[1722]: 2025-08-19 08:17:46.096 [INFO][5106] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.10.200/26] block=192.168.10.192/26 handle="k8s-pod-network.aa2bab2773ef40ef77e04feb51c9919a9c1abee5a59f11f25fd48b589efb4f3b" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:46.141004 containerd[1722]: 2025-08-19 08:17:46.096 [INFO][5106] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.10.200/26] handle="k8s-pod-network.aa2bab2773ef40ef77e04feb51c9919a9c1abee5a59f11f25fd48b589efb4f3b" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:46.141004 containerd[1722]: 2025-08-19 08:17:46.096 [INFO][5106] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 19 08:17:46.141004 containerd[1722]: 2025-08-19 08:17:46.096 [INFO][5106] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.10.200/26] IPv6=[] ContainerID="aa2bab2773ef40ef77e04feb51c9919a9c1abee5a59f11f25fd48b589efb4f3b" HandleID="k8s-pod-network.aa2bab2773ef40ef77e04feb51c9919a9c1abee5a59f11f25fd48b589efb4f3b" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-goldmane--58fd7646b9--h982d-eth0" Aug 19 08:17:46.141878 containerd[1722]: 2025-08-19 08:17:46.100 [INFO][5081] cni-plugin/k8s.go 418: Populated endpoint ContainerID="aa2bab2773ef40ef77e04feb51c9919a9c1abee5a59f11f25fd48b589efb4f3b" Namespace="calico-system" Pod="goldmane-58fd7646b9-h982d" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-goldmane--58fd7646b9--h982d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--a--5588c1b4cf-k8s-goldmane--58fd7646b9--h982d-eth0", GenerateName:"goldmane-58fd7646b9-", Namespace:"calico-system", SelfLink:"", UID:"2bb59b89-ca2b-49e1-bc22-00af9770adb2", ResourceVersion:"852", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 17, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"58fd7646b9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-a-5588c1b4cf", ContainerID:"", Pod:"goldmane-58fd7646b9-h982d", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.10.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.goldmane"}, InterfaceName:"cali774e7bcece9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:17:46.141878 containerd[1722]: 2025-08-19 08:17:46.101 [INFO][5081] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.10.200/32] ContainerID="aa2bab2773ef40ef77e04feb51c9919a9c1abee5a59f11f25fd48b589efb4f3b" Namespace="calico-system" Pod="goldmane-58fd7646b9-h982d" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-goldmane--58fd7646b9--h982d-eth0" Aug 19 08:17:46.141878 containerd[1722]: 2025-08-19 08:17:46.101 [INFO][5081] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali774e7bcece9 ContainerID="aa2bab2773ef40ef77e04feb51c9919a9c1abee5a59f11f25fd48b589efb4f3b" Namespace="calico-system" Pod="goldmane-58fd7646b9-h982d" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-goldmane--58fd7646b9--h982d-eth0" Aug 19 08:17:46.141878 containerd[1722]: 2025-08-19 08:17:46.106 [INFO][5081] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="aa2bab2773ef40ef77e04feb51c9919a9c1abee5a59f11f25fd48b589efb4f3b" Namespace="calico-system" Pod="goldmane-58fd7646b9-h982d" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-goldmane--58fd7646b9--h982d-eth0" Aug 19 08:17:46.141878 containerd[1722]: 2025-08-19 08:17:46.106 [INFO][5081] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="aa2bab2773ef40ef77e04feb51c9919a9c1abee5a59f11f25fd48b589efb4f3b" Namespace="calico-system" Pod="goldmane-58fd7646b9-h982d" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-goldmane--58fd7646b9--h982d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--a--5588c1b4cf-k8s-goldmane--58fd7646b9--h982d-eth0", GenerateName:"goldmane-58fd7646b9-", Namespace:"calico-system", SelfLink:"", 
UID:"2bb59b89-ca2b-49e1-bc22-00af9770adb2", ResourceVersion:"852", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 17, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"58fd7646b9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-a-5588c1b4cf", ContainerID:"aa2bab2773ef40ef77e04feb51c9919a9c1abee5a59f11f25fd48b589efb4f3b", Pod:"goldmane-58fd7646b9-h982d", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.10.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali774e7bcece9", MAC:"02:07:ae:3a:f5:0b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:17:46.141878 containerd[1722]: 2025-08-19 08:17:46.137 [INFO][5081] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="aa2bab2773ef40ef77e04feb51c9919a9c1abee5a59f11f25fd48b589efb4f3b" Namespace="calico-system" Pod="goldmane-58fd7646b9-h982d" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-goldmane--58fd7646b9--h982d-eth0" Aug 19 08:17:46.309220 systemd-networkd[1370]: calif951dd5a7dd: Gained IPv6LL Aug 19 08:17:46.334724 containerd[1722]: time="2025-08-19T08:17:46.334621575Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-wkmf2,Uid:6637040d-4229-4a0c-9583-c7b1e340ca6c,Namespace:kube-system,Attempt:0,} returns sandbox id \"17d9ea4654124337cf77332c8f428396baa685047f9000f65f0e1b46bb1e319c\"" Aug 19 
08:17:46.337932 containerd[1722]: time="2025-08-19T08:17:46.337870962Z" level=info msg="CreateContainer within sandbox \"17d9ea4654124337cf77332c8f428396baa685047f9000f65f0e1b46bb1e319c\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Aug 19 08:17:46.516396 containerd[1722]: time="2025-08-19T08:17:46.516351071Z" level=info msg="Container 17177fedae3d0b1e6b8c7dc24414174790fd3f14d578f26ce250333a1228c29c: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:17:46.538003 containerd[1722]: time="2025-08-19T08:17:46.537966335Z" level=info msg="connecting to shim aa2bab2773ef40ef77e04feb51c9919a9c1abee5a59f11f25fd48b589efb4f3b" address="unix:///run/containerd/s/e88f5fa20e6c73a383904297fcb71968127dddb240187f1ebaf17778d065a60d" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:17:46.539163 containerd[1722]: time="2025-08-19T08:17:46.539135331Z" level=info msg="CreateContainer within sandbox \"17d9ea4654124337cf77332c8f428396baa685047f9000f65f0e1b46bb1e319c\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"17177fedae3d0b1e6b8c7dc24414174790fd3f14d578f26ce250333a1228c29c\"" Aug 19 08:17:46.539880 containerd[1722]: time="2025-08-19T08:17:46.539791292Z" level=info msg="StartContainer for \"17177fedae3d0b1e6b8c7dc24414174790fd3f14d578f26ce250333a1228c29c\"" Aug 19 08:17:46.542159 containerd[1722]: time="2025-08-19T08:17:46.542020493Z" level=info msg="connecting to shim 17177fedae3d0b1e6b8c7dc24414174790fd3f14d578f26ce250333a1228c29c" address="unix:///run/containerd/s/c6583227a6178409e1767962583a9b204b540bc75b9ea2cd2b3ccdae61518fc4" protocol=ttrpc version=3 Aug 19 08:17:46.565209 systemd[1]: Started cri-containerd-17177fedae3d0b1e6b8c7dc24414174790fd3f14d578f26ce250333a1228c29c.scope - libcontainer container 17177fedae3d0b1e6b8c7dc24414174790fd3f14d578f26ce250333a1228c29c. 
Aug 19 08:17:46.576315 systemd[1]: Started cri-containerd-aa2bab2773ef40ef77e04feb51c9919a9c1abee5a59f11f25fd48b589efb4f3b.scope - libcontainer container aa2bab2773ef40ef77e04feb51c9919a9c1abee5a59f11f25fd48b589efb4f3b. Aug 19 08:17:46.643982 containerd[1722]: time="2025-08-19T08:17:46.643888011Z" level=info msg="StartContainer for \"17177fedae3d0b1e6b8c7dc24414174790fd3f14d578f26ce250333a1228c29c\" returns successfully" Aug 19 08:17:46.751828 containerd[1722]: time="2025-08-19T08:17:46.751507444Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-h982d,Uid:2bb59b89-ca2b-49e1-bc22-00af9770adb2,Namespace:calico-system,Attempt:0,} returns sandbox id \"aa2bab2773ef40ef77e04feb51c9919a9c1abee5a59f11f25fd48b589efb4f3b\"" Aug 19 08:17:46.817595 containerd[1722]: time="2025-08-19T08:17:46.817567238Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-598d747447-gjlqd,Uid:9c9ca255-d68b-47ca-8107-0f46f9e9d40c,Namespace:calico-apiserver,Attempt:0,}" Aug 19 08:17:46.952017 systemd-networkd[1370]: cali7b009a29bef: Link UP Aug 19 08:17:46.953693 systemd-networkd[1370]: cali7b009a29bef: Gained carrier Aug 19 08:17:46.977010 containerd[1722]: 2025-08-19 08:17:46.872 [INFO][5262] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--598d747447--gjlqd-eth0 calico-apiserver-598d747447- calico-apiserver 9c9ca255-d68b-47ca-8107-0f46f9e9d40c 853 0 2025-08-19 08:17:15 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:598d747447 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4426.0.0-a-5588c1b4cf calico-apiserver-598d747447-gjlqd eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali7b009a29bef [] [] }} 
ContainerID="a2a70aaabdb42fb1db62890959193b8833487ac9d55b4de82a652d769613ecd5" Namespace="calico-apiserver" Pod="calico-apiserver-598d747447-gjlqd" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--598d747447--gjlqd-" Aug 19 08:17:46.977010 containerd[1722]: 2025-08-19 08:17:46.872 [INFO][5262] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a2a70aaabdb42fb1db62890959193b8833487ac9d55b4de82a652d769613ecd5" Namespace="calico-apiserver" Pod="calico-apiserver-598d747447-gjlqd" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--598d747447--gjlqd-eth0" Aug 19 08:17:46.977010 containerd[1722]: 2025-08-19 08:17:46.906 [INFO][5275] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a2a70aaabdb42fb1db62890959193b8833487ac9d55b4de82a652d769613ecd5" HandleID="k8s-pod-network.a2a70aaabdb42fb1db62890959193b8833487ac9d55b4de82a652d769613ecd5" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--598d747447--gjlqd-eth0" Aug 19 08:17:46.977010 containerd[1722]: 2025-08-19 08:17:46.906 [INFO][5275] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a2a70aaabdb42fb1db62890959193b8833487ac9d55b4de82a652d769613ecd5" HandleID="k8s-pod-network.a2a70aaabdb42fb1db62890959193b8833487ac9d55b4de82a652d769613ecd5" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--598d747447--gjlqd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002e9a50), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4426.0.0-a-5588c1b4cf", "pod":"calico-apiserver-598d747447-gjlqd", "timestamp":"2025-08-19 08:17:46.906386087 +0000 UTC"}, Hostname:"ci-4426.0.0-a-5588c1b4cf", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 08:17:46.977010 containerd[1722]: 2025-08-19 08:17:46.906 [INFO][5275] ipam/ipam_plugin.go 
353: About to acquire host-wide IPAM lock. Aug 19 08:17:46.977010 containerd[1722]: 2025-08-19 08:17:46.906 [INFO][5275] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 08:17:46.977010 containerd[1722]: 2025-08-19 08:17:46.907 [INFO][5275] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.0.0-a-5588c1b4cf' Aug 19 08:17:46.977010 containerd[1722]: 2025-08-19 08:17:46.914 [INFO][5275] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a2a70aaabdb42fb1db62890959193b8833487ac9d55b4de82a652d769613ecd5" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:46.977010 containerd[1722]: 2025-08-19 08:17:46.917 [INFO][5275] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:46.977010 containerd[1722]: 2025-08-19 08:17:46.921 [INFO][5275] ipam/ipam.go 511: Trying affinity for 192.168.10.192/26 host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:46.977010 containerd[1722]: 2025-08-19 08:17:46.922 [INFO][5275] ipam/ipam.go 158: Attempting to load block cidr=192.168.10.192/26 host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:46.977010 containerd[1722]: 2025-08-19 08:17:46.925 [INFO][5275] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.10.192/26 host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:46.977010 containerd[1722]: 2025-08-19 08:17:46.925 [INFO][5275] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.10.192/26 handle="k8s-pod-network.a2a70aaabdb42fb1db62890959193b8833487ac9d55b4de82a652d769613ecd5" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:46.977010 containerd[1722]: 2025-08-19 08:17:46.928 [INFO][5275] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a2a70aaabdb42fb1db62890959193b8833487ac9d55b4de82a652d769613ecd5 Aug 19 08:17:46.977010 containerd[1722]: 2025-08-19 08:17:46.934 [INFO][5275] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.10.192/26 
handle="k8s-pod-network.a2a70aaabdb42fb1db62890959193b8833487ac9d55b4de82a652d769613ecd5" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:46.977010 containerd[1722]: 2025-08-19 08:17:46.943 [INFO][5275] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.10.201/26] block=192.168.10.192/26 handle="k8s-pod-network.a2a70aaabdb42fb1db62890959193b8833487ac9d55b4de82a652d769613ecd5" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:46.977010 containerd[1722]: 2025-08-19 08:17:46.943 [INFO][5275] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.10.201/26] handle="k8s-pod-network.a2a70aaabdb42fb1db62890959193b8833487ac9d55b4de82a652d769613ecd5" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:17:46.977010 containerd[1722]: 2025-08-19 08:17:46.943 [INFO][5275] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 19 08:17:46.977010 containerd[1722]: 2025-08-19 08:17:46.943 [INFO][5275] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.10.201/26] IPv6=[] ContainerID="a2a70aaabdb42fb1db62890959193b8833487ac9d55b4de82a652d769613ecd5" HandleID="k8s-pod-network.a2a70aaabdb42fb1db62890959193b8833487ac9d55b4de82a652d769613ecd5" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--598d747447--gjlqd-eth0" Aug 19 08:17:46.978257 containerd[1722]: 2025-08-19 08:17:46.946 [INFO][5262] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a2a70aaabdb42fb1db62890959193b8833487ac9d55b4de82a652d769613ecd5" Namespace="calico-apiserver" Pod="calico-apiserver-598d747447-gjlqd" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--598d747447--gjlqd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--598d747447--gjlqd-eth0", GenerateName:"calico-apiserver-598d747447-", Namespace:"calico-apiserver", SelfLink:"", UID:"9c9ca255-d68b-47ca-8107-0f46f9e9d40c", ResourceVersion:"853", Generation:0, 
CreationTimestamp:time.Date(2025, time.August, 19, 8, 17, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"598d747447", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-a-5588c1b4cf", ContainerID:"", Pod:"calico-apiserver-598d747447-gjlqd", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.10.201/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali7b009a29bef", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:17:46.978257 containerd[1722]: 2025-08-19 08:17:46.946 [INFO][5262] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.10.201/32] ContainerID="a2a70aaabdb42fb1db62890959193b8833487ac9d55b4de82a652d769613ecd5" Namespace="calico-apiserver" Pod="calico-apiserver-598d747447-gjlqd" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--598d747447--gjlqd-eth0" Aug 19 08:17:46.978257 containerd[1722]: 2025-08-19 08:17:46.946 [INFO][5262] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7b009a29bef ContainerID="a2a70aaabdb42fb1db62890959193b8833487ac9d55b4de82a652d769613ecd5" Namespace="calico-apiserver" Pod="calico-apiserver-598d747447-gjlqd" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--598d747447--gjlqd-eth0" Aug 19 08:17:46.978257 containerd[1722]: 2025-08-19 08:17:46.955 
[INFO][5262] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a2a70aaabdb42fb1db62890959193b8833487ac9d55b4de82a652d769613ecd5" Namespace="calico-apiserver" Pod="calico-apiserver-598d747447-gjlqd" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--598d747447--gjlqd-eth0" Aug 19 08:17:46.978257 containerd[1722]: 2025-08-19 08:17:46.956 [INFO][5262] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a2a70aaabdb42fb1db62890959193b8833487ac9d55b4de82a652d769613ecd5" Namespace="calico-apiserver" Pod="calico-apiserver-598d747447-gjlqd" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--598d747447--gjlqd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--598d747447--gjlqd-eth0", GenerateName:"calico-apiserver-598d747447-", Namespace:"calico-apiserver", SelfLink:"", UID:"9c9ca255-d68b-47ca-8107-0f46f9e9d40c", ResourceVersion:"853", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 17, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"598d747447", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-a-5588c1b4cf", ContainerID:"a2a70aaabdb42fb1db62890959193b8833487ac9d55b4de82a652d769613ecd5", Pod:"calico-apiserver-598d747447-gjlqd", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.10.201/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali7b009a29bef", MAC:"2e:dc:0c:ea:63:97", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:17:46.978257 containerd[1722]: 2025-08-19 08:17:46.972 [INFO][5262] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a2a70aaabdb42fb1db62890959193b8833487ac9d55b4de82a652d769613ecd5" Namespace="calico-apiserver" Pod="calico-apiserver-598d747447-gjlqd" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--598d747447--gjlqd-eth0" Aug 19 08:17:47.034765 kubelet[3115]: I0819 08:17:47.034525 3115 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-wkmf2" podStartSLOduration=42.034508268 podStartE2EDuration="42.034508268s" podCreationTimestamp="2025-08-19 08:17:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 08:17:47.006792468 +0000 UTC m=+46.268076014" watchObservedRunningTime="2025-08-19 08:17:47.034508268 +0000 UTC m=+46.295791791" Aug 19 08:17:47.142390 systemd-networkd[1370]: cali09a91dd12be: Gained IPv6LL Aug 19 08:17:47.603624 containerd[1722]: time="2025-08-19T08:17:47.603572874Z" level=info msg="connecting to shim a2a70aaabdb42fb1db62890959193b8833487ac9d55b4de82a652d769613ecd5" address="unix:///run/containerd/s/5763f841ccc5d5b779509511e0477abbe952bbe2513cc0f408f776c8a3efaa37" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:17:47.626193 systemd[1]: Started cri-containerd-a2a70aaabdb42fb1db62890959193b8833487ac9d55b4de82a652d769613ecd5.scope - libcontainer container a2a70aaabdb42fb1db62890959193b8833487ac9d55b4de82a652d769613ecd5. 
Aug 19 08:17:47.653327 systemd-networkd[1370]: cali774e7bcece9: Gained IPv6LL Aug 19 08:17:47.669580 containerd[1722]: time="2025-08-19T08:17:47.669551839Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-598d747447-gjlqd,Uid:9c9ca255-d68b-47ca-8107-0f46f9e9d40c,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"a2a70aaabdb42fb1db62890959193b8833487ac9d55b4de82a652d769613ecd5\"" Aug 19 08:17:47.676229 containerd[1722]: time="2025-08-19T08:17:47.676190014Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:47.679775 containerd[1722]: time="2025-08-19T08:17:47.679743171Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=47317977" Aug 19 08:17:47.682641 containerd[1722]: time="2025-08-19T08:17:47.682607226Z" level=info msg="ImageCreate event name:\"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:47.686211 containerd[1722]: time="2025-08-19T08:17:47.686168802Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:47.686681 containerd[1722]: time="2025-08-19T08:17:47.686561451Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"48810696\" in 3.433113769s" Aug 19 08:17:47.686681 containerd[1722]: time="2025-08-19T08:17:47.686589976Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image 
reference \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\"" Aug 19 08:17:47.687507 containerd[1722]: time="2025-08-19T08:17:47.687488102Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\"" Aug 19 08:17:47.688686 containerd[1722]: time="2025-08-19T08:17:47.688660394Z" level=info msg="CreateContainer within sandbox \"9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Aug 19 08:17:47.712341 containerd[1722]: time="2025-08-19T08:17:47.712316664Z" level=info msg="Container 30892f63af98db988069c40c83318f9f8a6882b6337ad3329aeaa4dc696ea7aa: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:17:47.785065 containerd[1722]: time="2025-08-19T08:17:47.785023377Z" level=info msg="CreateContainer within sandbox \"9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"30892f63af98db988069c40c83318f9f8a6882b6337ad3329aeaa4dc696ea7aa\"" Aug 19 08:17:47.785704 containerd[1722]: time="2025-08-19T08:17:47.785681747Z" level=info msg="StartContainer for \"30892f63af98db988069c40c83318f9f8a6882b6337ad3329aeaa4dc696ea7aa\"" Aug 19 08:17:47.787417 containerd[1722]: time="2025-08-19T08:17:47.787391186Z" level=info msg="connecting to shim 30892f63af98db988069c40c83318f9f8a6882b6337ad3329aeaa4dc696ea7aa" address="unix:///run/containerd/s/c53ced3041dd12da7b07e2646ffd8160261e55acef3631166bd2bb79768d0c30" protocol=ttrpc version=3 Aug 19 08:17:47.815552 systemd[1]: Started cri-containerd-30892f63af98db988069c40c83318f9f8a6882b6337ad3329aeaa4dc696ea7aa.scope - libcontainer container 30892f63af98db988069c40c83318f9f8a6882b6337ad3329aeaa4dc696ea7aa. 
Aug 19 08:17:47.903401 containerd[1722]: time="2025-08-19T08:17:47.903195173Z" level=info msg="StartContainer for \"30892f63af98db988069c40c83318f9f8a6882b6337ad3329aeaa4dc696ea7aa\" returns successfully" Aug 19 08:17:48.846055 kubelet[3115]: I0819 08:17:48.845489 3115 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-655b7bdffd-hjd48" podStartSLOduration=31.410687994 podStartE2EDuration="34.845470744s" podCreationTimestamp="2025-08-19 08:17:14 +0000 UTC" firstStartedPulling="2025-08-19 08:17:44.252574134 +0000 UTC m=+43.513857658" lastFinishedPulling="2025-08-19 08:17:47.687356887 +0000 UTC m=+46.948640408" observedRunningTime="2025-08-19 08:17:48.003007531 +0000 UTC m=+47.264291057" watchObservedRunningTime="2025-08-19 08:17:48.845470744 +0000 UTC m=+48.106754268" Aug 19 08:17:48.933160 systemd-networkd[1370]: cali7b009a29bef: Gained IPv6LL Aug 19 08:17:51.794782 containerd[1722]: time="2025-08-19T08:17:51.794727962Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:51.802410 containerd[1722]: time="2025-08-19T08:17:51.802367295Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.2: active requests=0, bytes read=51276688" Aug 19 08:17:51.805249 containerd[1722]: time="2025-08-19T08:17:51.805206902Z" level=info msg="ImageCreate event name:\"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:52.132144 containerd[1722]: time="2025-08-19T08:17:52.131649812Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:52.132513 containerd[1722]: time="2025-08-19T08:17:52.132493308Z" level=info msg="Pulled 
image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" with image id \"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\", size \"52769359\" in 4.444978811s" Aug 19 08:17:52.132581 containerd[1722]: time="2025-08-19T08:17:52.132569702Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" returns image reference \"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\"" Aug 19 08:17:52.134141 containerd[1722]: time="2025-08-19T08:17:52.134030074Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\"" Aug 19 08:17:52.142904 containerd[1722]: time="2025-08-19T08:17:52.142879252Z" level=info msg="CreateContainer within sandbox \"f9355cf740c793b107413e98fb3e5a7e8e3ec18ec5831ad86da2d123844e618c\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Aug 19 08:17:52.246073 containerd[1722]: time="2025-08-19T08:17:52.245842002Z" level=info msg="Container f599e7db3c65ad463341da373ac1987f482e9ab53da9f8c401d29d4bbe8336e0: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:17:52.354289 containerd[1722]: time="2025-08-19T08:17:52.354252058Z" level=info msg="CreateContainer within sandbox \"f9355cf740c793b107413e98fb3e5a7e8e3ec18ec5831ad86da2d123844e618c\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"f599e7db3c65ad463341da373ac1987f482e9ab53da9f8c401d29d4bbe8336e0\"" Aug 19 08:17:52.355152 containerd[1722]: time="2025-08-19T08:17:52.354914857Z" level=info msg="StartContainer for \"f599e7db3c65ad463341da373ac1987f482e9ab53da9f8c401d29d4bbe8336e0\"" Aug 19 08:17:52.356559 containerd[1722]: time="2025-08-19T08:17:52.356478873Z" level=info msg="connecting to shim f599e7db3c65ad463341da373ac1987f482e9ab53da9f8c401d29d4bbe8336e0" 
address="unix:///run/containerd/s/ae595532f4b3ccfb12f72c71509e4362d8fe0505799440addeceee5c7fcdae92" protocol=ttrpc version=3 Aug 19 08:17:52.383193 systemd[1]: Started cri-containerd-f599e7db3c65ad463341da373ac1987f482e9ab53da9f8c401d29d4bbe8336e0.scope - libcontainer container f599e7db3c65ad463341da373ac1987f482e9ab53da9f8c401d29d4bbe8336e0. Aug 19 08:17:52.454573 containerd[1722]: time="2025-08-19T08:17:52.454533003Z" level=info msg="StartContainer for \"f599e7db3c65ad463341da373ac1987f482e9ab53da9f8c401d29d4bbe8336e0\" returns successfully" Aug 19 08:17:53.048976 containerd[1722]: time="2025-08-19T08:17:53.048921569Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f599e7db3c65ad463341da373ac1987f482e9ab53da9f8c401d29d4bbe8336e0\" id:\"4e3188fd850e1c7760f9fe2b3c41c584a169d390a672300e18d679ae4d117627\" pid:5451 exited_at:{seconds:1755591473 nanos:48617647}" Aug 19 08:17:53.061566 kubelet[3115]: I0819 08:17:53.061146 3115 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-86896c85c-k2466" podStartSLOduration=27.25428355 podStartE2EDuration="35.061128648s" podCreationTimestamp="2025-08-19 08:17:18 +0000 UTC" firstStartedPulling="2025-08-19 08:17:44.326603187 +0000 UTC m=+43.587886707" lastFinishedPulling="2025-08-19 08:17:52.133448284 +0000 UTC m=+51.394731805" observedRunningTime="2025-08-19 08:17:53.023625003 +0000 UTC m=+52.284908527" watchObservedRunningTime="2025-08-19 08:17:53.061128648 +0000 UTC m=+52.322412176" Aug 19 08:17:54.536751 containerd[1722]: time="2025-08-19T08:17:54.536377513Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:54.779056 containerd[1722]: time="2025-08-19T08:17:54.778925259Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.2: active requests=0, bytes read=8759190" Aug 19 08:17:54.936447 containerd[1722]: 
time="2025-08-19T08:17:54.936274026Z" level=info msg="ImageCreate event name:\"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:55.077531 containerd[1722]: time="2025-08-19T08:17:55.077468609Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:55.078170 containerd[1722]: time="2025-08-19T08:17:55.078022148Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.2\" with image id \"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\", size \"10251893\" in 2.94393354s" Aug 19 08:17:55.078170 containerd[1722]: time="2025-08-19T08:17:55.078067043Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\" returns image reference \"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\"" Aug 19 08:17:55.078914 containerd[1722]: time="2025-08-19T08:17:55.078888490Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Aug 19 08:17:55.080876 containerd[1722]: time="2025-08-19T08:17:55.080222550Z" level=info msg="CreateContainer within sandbox \"e60ff164a264955253704abfb2d7b0b421f52723bc30c20f4517a9cef1071ad3\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Aug 19 08:17:55.107187 containerd[1722]: time="2025-08-19T08:17:55.107155048Z" level=info msg="Container 902093a884c8507bc74a7d4de5037ef5f73e10a23bb8a650939e426d3c1f6b4b: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:17:55.127987 containerd[1722]: time="2025-08-19T08:17:55.127957409Z" level=info msg="CreateContainer within sandbox \"e60ff164a264955253704abfb2d7b0b421f52723bc30c20f4517a9cef1071ad3\" for 
&ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"902093a884c8507bc74a7d4de5037ef5f73e10a23bb8a650939e426d3c1f6b4b\"" Aug 19 08:17:55.128473 containerd[1722]: time="2025-08-19T08:17:55.128428609Z" level=info msg="StartContainer for \"902093a884c8507bc74a7d4de5037ef5f73e10a23bb8a650939e426d3c1f6b4b\"" Aug 19 08:17:55.129846 containerd[1722]: time="2025-08-19T08:17:55.129817898Z" level=info msg="connecting to shim 902093a884c8507bc74a7d4de5037ef5f73e10a23bb8a650939e426d3c1f6b4b" address="unix:///run/containerd/s/cca107b4abea69eec5f359c875ce378134f20f85076b9a8a44260c371503f93f" protocol=ttrpc version=3 Aug 19 08:17:55.150241 systemd[1]: Started cri-containerd-902093a884c8507bc74a7d4de5037ef5f73e10a23bb8a650939e426d3c1f6b4b.scope - libcontainer container 902093a884c8507bc74a7d4de5037ef5f73e10a23bb8a650939e426d3c1f6b4b. Aug 19 08:17:55.180552 containerd[1722]: time="2025-08-19T08:17:55.180534151Z" level=info msg="StartContainer for \"902093a884c8507bc74a7d4de5037ef5f73e10a23bb8a650939e426d3c1f6b4b\" returns successfully" Aug 19 08:17:55.451176 containerd[1722]: time="2025-08-19T08:17:55.451124313Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:55.455599 containerd[1722]: time="2025-08-19T08:17:55.455546805Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=77" Aug 19 08:17:55.457013 containerd[1722]: time="2025-08-19T08:17:55.456987702Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"48810696\" in 377.964613ms" Aug 19 08:17:55.457101 containerd[1722]: time="2025-08-19T08:17:55.457015662Z" 
level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\"" Aug 19 08:17:55.459225 containerd[1722]: time="2025-08-19T08:17:55.458026111Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\"" Aug 19 08:17:55.459336 containerd[1722]: time="2025-08-19T08:17:55.459310439Z" level=info msg="CreateContainer within sandbox \"43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Aug 19 08:17:55.481322 containerd[1722]: time="2025-08-19T08:17:55.480679094Z" level=info msg="Container e12bb563c344d0792606eafe1ff2072b00d4c104d85257ce3010aa0c4f3cf46b: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:17:55.502245 containerd[1722]: time="2025-08-19T08:17:55.502218292Z" level=info msg="CreateContainer within sandbox \"43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"e12bb563c344d0792606eafe1ff2072b00d4c104d85257ce3010aa0c4f3cf46b\"" Aug 19 08:17:55.502651 containerd[1722]: time="2025-08-19T08:17:55.502571791Z" level=info msg="StartContainer for \"e12bb563c344d0792606eafe1ff2072b00d4c104d85257ce3010aa0c4f3cf46b\"" Aug 19 08:17:55.503685 containerd[1722]: time="2025-08-19T08:17:55.503637762Z" level=info msg="connecting to shim e12bb563c344d0792606eafe1ff2072b00d4c104d85257ce3010aa0c4f3cf46b" address="unix:///run/containerd/s/0c5ec1815d4add3f3ed901bd3e7010483a7845e9ba686cc2a32b0b9afe66f8ee" protocol=ttrpc version=3 Aug 19 08:17:55.524341 systemd[1]: Started cri-containerd-e12bb563c344d0792606eafe1ff2072b00d4c104d85257ce3010aa0c4f3cf46b.scope - libcontainer container e12bb563c344d0792606eafe1ff2072b00d4c104d85257ce3010aa0c4f3cf46b. 
Aug 19 08:17:55.566977 containerd[1722]: time="2025-08-19T08:17:55.566936625Z" level=info msg="StartContainer for \"e12bb563c344d0792606eafe1ff2072b00d4c104d85257ce3010aa0c4f3cf46b\" returns successfully" Aug 19 08:17:57.016808 kubelet[3115]: I0819 08:17:57.016776 3115 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 19 08:17:58.267128 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2113963711.mount: Deactivated successfully. Aug 19 08:17:58.864896 containerd[1722]: time="2025-08-19T08:17:58.864853686Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:58.869029 containerd[1722]: time="2025-08-19T08:17:58.868991071Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.2: active requests=0, bytes read=66352308" Aug 19 08:17:58.872819 containerd[1722]: time="2025-08-19T08:17:58.872775752Z" level=info msg="ImageCreate event name:\"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:58.886393 containerd[1722]: time="2025-08-19T08:17:58.886341089Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:58.886961 containerd[1722]: time="2025-08-19T08:17:58.886855049Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" with image id \"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\", size \"66352154\" in 3.428774381s" Aug 19 08:17:58.886961 containerd[1722]: time="2025-08-19T08:17:58.886883515Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/goldmane:v3.30.2\" returns image reference \"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\"" Aug 19 08:17:58.888296 containerd[1722]: time="2025-08-19T08:17:58.888017387Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Aug 19 08:17:58.888836 containerd[1722]: time="2025-08-19T08:17:58.888805039Z" level=info msg="CreateContainer within sandbox \"aa2bab2773ef40ef77e04feb51c9919a9c1abee5a59f11f25fd48b589efb4f3b\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Aug 19 08:17:58.980194 containerd[1722]: time="2025-08-19T08:17:58.979802063Z" level=info msg="Container 460d48267778ad71add22b67856c2b6df04bfc3f085fac253ac1ccbdc04e9bd0: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:17:58.995617 containerd[1722]: time="2025-08-19T08:17:58.995585523Z" level=info msg="CreateContainer within sandbox \"aa2bab2773ef40ef77e04feb51c9919a9c1abee5a59f11f25fd48b589efb4f3b\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"460d48267778ad71add22b67856c2b6df04bfc3f085fac253ac1ccbdc04e9bd0\"" Aug 19 08:17:58.996300 containerd[1722]: time="2025-08-19T08:17:58.996070089Z" level=info msg="StartContainer for \"460d48267778ad71add22b67856c2b6df04bfc3f085fac253ac1ccbdc04e9bd0\"" Aug 19 08:17:58.997341 containerd[1722]: time="2025-08-19T08:17:58.997312705Z" level=info msg="connecting to shim 460d48267778ad71add22b67856c2b6df04bfc3f085fac253ac1ccbdc04e9bd0" address="unix:///run/containerd/s/e88f5fa20e6c73a383904297fcb71968127dddb240187f1ebaf17778d065a60d" protocol=ttrpc version=3 Aug 19 08:17:59.026158 systemd[1]: Started cri-containerd-460d48267778ad71add22b67856c2b6df04bfc3f085fac253ac1ccbdc04e9bd0.scope - libcontainer container 460d48267778ad71add22b67856c2b6df04bfc3f085fac253ac1ccbdc04e9bd0. 
Aug 19 08:17:59.082686 containerd[1722]: time="2025-08-19T08:17:59.082606186Z" level=info msg="StartContainer for \"460d48267778ad71add22b67856c2b6df04bfc3f085fac253ac1ccbdc04e9bd0\" returns successfully" Aug 19 08:17:59.318239 containerd[1722]: time="2025-08-19T08:17:59.318195591Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:59.320497 containerd[1722]: time="2025-08-19T08:17:59.320464138Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=77" Aug 19 08:17:59.321596 containerd[1722]: time="2025-08-19T08:17:59.321572930Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"48810696\" in 433.502818ms" Aug 19 08:17:59.321653 containerd[1722]: time="2025-08-19T08:17:59.321602372Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\"" Aug 19 08:17:59.322932 containerd[1722]: time="2025-08-19T08:17:59.322851492Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\"" Aug 19 08:17:59.324639 containerd[1722]: time="2025-08-19T08:17:59.324611358Z" level=info msg="CreateContainer within sandbox \"a2a70aaabdb42fb1db62890959193b8833487ac9d55b4de82a652d769613ecd5\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Aug 19 08:17:59.346059 containerd[1722]: time="2025-08-19T08:17:59.344510317Z" level=info msg="Container 7122a2e466a2f238686cc4f6e2f331a3b58aa599b8995b2f4e532bf3f966a3cb: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:17:59.370109 
containerd[1722]: time="2025-08-19T08:17:59.370083230Z" level=info msg="CreateContainer within sandbox \"a2a70aaabdb42fb1db62890959193b8833487ac9d55b4de82a652d769613ecd5\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"7122a2e466a2f238686cc4f6e2f331a3b58aa599b8995b2f4e532bf3f966a3cb\"" Aug 19 08:17:59.370790 containerd[1722]: time="2025-08-19T08:17:59.370514140Z" level=info msg="StartContainer for \"7122a2e466a2f238686cc4f6e2f331a3b58aa599b8995b2f4e532bf3f966a3cb\"" Aug 19 08:17:59.371742 containerd[1722]: time="2025-08-19T08:17:59.371668232Z" level=info msg="connecting to shim 7122a2e466a2f238686cc4f6e2f331a3b58aa599b8995b2f4e532bf3f966a3cb" address="unix:///run/containerd/s/5763f841ccc5d5b779509511e0477abbe952bbe2513cc0f408f776c8a3efaa37" protocol=ttrpc version=3 Aug 19 08:17:59.390217 systemd[1]: Started cri-containerd-7122a2e466a2f238686cc4f6e2f331a3b58aa599b8995b2f4e532bf3f966a3cb.scope - libcontainer container 7122a2e466a2f238686cc4f6e2f331a3b58aa599b8995b2f4e532bf3f966a3cb. 
Aug 19 08:17:59.435942 containerd[1722]: time="2025-08-19T08:17:59.435909721Z" level=info msg="StartContainer for \"7122a2e466a2f238686cc4f6e2f331a3b58aa599b8995b2f4e532bf3f966a3cb\" returns successfully" Aug 19 08:18:00.048818 kubelet[3115]: I0819 08:18:00.048464 3115 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-655b7bdffd-n6vc8" podStartSLOduration=35.700782536 podStartE2EDuration="46.048445271s" podCreationTimestamp="2025-08-19 08:17:14 +0000 UTC" firstStartedPulling="2025-08-19 08:17:45.109997201 +0000 UTC m=+44.371280711" lastFinishedPulling="2025-08-19 08:17:55.45765992 +0000 UTC m=+54.718943446" observedRunningTime="2025-08-19 08:17:56.035534668 +0000 UTC m=+55.296818195" watchObservedRunningTime="2025-08-19 08:18:00.048445271 +0000 UTC m=+59.309728795" Aug 19 08:18:00.069274 kubelet[3115]: I0819 08:18:00.069095 3115 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-598d747447-gjlqd" podStartSLOduration=33.417226091 podStartE2EDuration="45.069077185s" podCreationTimestamp="2025-08-19 08:17:15 +0000 UTC" firstStartedPulling="2025-08-19 08:17:47.670516533 +0000 UTC m=+46.931800050" lastFinishedPulling="2025-08-19 08:17:59.322367625 +0000 UTC m=+58.583651144" observedRunningTime="2025-08-19 08:18:00.047092318 +0000 UTC m=+59.308375844" watchObservedRunningTime="2025-08-19 08:18:00.069077185 +0000 UTC m=+59.330360769" Aug 19 08:18:00.070507 kubelet[3115]: I0819 08:18:00.070463 3115 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-58fd7646b9-h982d" podStartSLOduration=30.93709037 podStartE2EDuration="43.070447995s" podCreationTimestamp="2025-08-19 08:17:17 +0000 UTC" firstStartedPulling="2025-08-19 08:17:46.754274947 +0000 UTC m=+46.015558472" lastFinishedPulling="2025-08-19 08:17:58.88763257 +0000 UTC m=+58.148916097" observedRunningTime="2025-08-19 08:18:00.067968334 +0000 UTC m=+59.329251857" 
watchObservedRunningTime="2025-08-19 08:18:00.070447995 +0000 UTC m=+59.331731519" Aug 19 08:18:00.146055 containerd[1722]: time="2025-08-19T08:18:00.146007959Z" level=info msg="TaskExit event in podsandbox handler container_id:\"460d48267778ad71add22b67856c2b6df04bfc3f085fac253ac1ccbdc04e9bd0\" id:\"0614c720a40139de700532fe5f2811d0ceecd76c4ba156b4b74492e52b965c89\" pid:5631 exited_at:{seconds:1755591480 nanos:144835486}" Aug 19 08:18:01.032834 kubelet[3115]: I0819 08:18:01.032772 3115 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 19 08:18:01.314675 containerd[1722]: time="2025-08-19T08:18:01.314568212Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:18:01.318633 containerd[1722]: time="2025-08-19T08:18:01.318602023Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2: active requests=0, bytes read=14703784" Aug 19 08:18:01.322626 containerd[1722]: time="2025-08-19T08:18:01.322583027Z" level=info msg="ImageCreate event name:\"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:18:01.327070 containerd[1722]: time="2025-08-19T08:18:01.327009111Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:18:01.327533 containerd[1722]: time="2025-08-19T08:18:01.327380538Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" with image id \"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\", repo digest 
\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\", size \"16196439\" in 2.004496673s" Aug 19 08:18:01.327533 containerd[1722]: time="2025-08-19T08:18:01.327413310Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" returns image reference \"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\"" Aug 19 08:18:01.329667 containerd[1722]: time="2025-08-19T08:18:01.329638117Z" level=info msg="CreateContainer within sandbox \"e60ff164a264955253704abfb2d7b0b421f52723bc30c20f4517a9cef1071ad3\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Aug 19 08:18:01.345055 containerd[1722]: time="2025-08-19T08:18:01.344928365Z" level=info msg="Container eb7e6f74dd6d12e559bcbcdf60cdc4f03a4808ebaee89a12c983572b06eff86d: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:18:01.370966 containerd[1722]: time="2025-08-19T08:18:01.370936503Z" level=info msg="CreateContainer within sandbox \"e60ff164a264955253704abfb2d7b0b421f52723bc30c20f4517a9cef1071ad3\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"eb7e6f74dd6d12e559bcbcdf60cdc4f03a4808ebaee89a12c983572b06eff86d\"" Aug 19 08:18:01.371357 containerd[1722]: time="2025-08-19T08:18:01.371298400Z" level=info msg="StartContainer for \"eb7e6f74dd6d12e559bcbcdf60cdc4f03a4808ebaee89a12c983572b06eff86d\"" Aug 19 08:18:01.373015 containerd[1722]: time="2025-08-19T08:18:01.372987286Z" level=info msg="connecting to shim eb7e6f74dd6d12e559bcbcdf60cdc4f03a4808ebaee89a12c983572b06eff86d" address="unix:///run/containerd/s/cca107b4abea69eec5f359c875ce378134f20f85076b9a8a44260c371503f93f" protocol=ttrpc version=3 Aug 19 08:18:01.392284 systemd[1]: Started cri-containerd-eb7e6f74dd6d12e559bcbcdf60cdc4f03a4808ebaee89a12c983572b06eff86d.scope - libcontainer container eb7e6f74dd6d12e559bcbcdf60cdc4f03a4808ebaee89a12c983572b06eff86d. 
Aug 19 08:18:01.430095 containerd[1722]: time="2025-08-19T08:18:01.430076144Z" level=info msg="StartContainer for \"eb7e6f74dd6d12e559bcbcdf60cdc4f03a4808ebaee89a12c983572b06eff86d\" returns successfully" Aug 19 08:18:01.906783 kubelet[3115]: I0819 08:18:01.906741 3115 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Aug 19 08:18:01.906783 kubelet[3115]: I0819 08:18:01.906780 3115 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Aug 19 08:18:02.048555 kubelet[3115]: I0819 08:18:02.047952 3115 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-rw2mz" podStartSLOduration=27.072523044 podStartE2EDuration="44.047935377s" podCreationTimestamp="2025-08-19 08:17:18 +0000 UTC" firstStartedPulling="2025-08-19 08:17:44.352736403 +0000 UTC m=+43.614019926" lastFinishedPulling="2025-08-19 08:18:01.328148735 +0000 UTC m=+60.589432259" observedRunningTime="2025-08-19 08:18:02.047481195 +0000 UTC m=+61.308764722" watchObservedRunningTime="2025-08-19 08:18:02.047935377 +0000 UTC m=+61.309218903" Aug 19 08:18:02.374058 containerd[1722]: time="2025-08-19T08:18:02.373988279Z" level=info msg="TaskExit event in podsandbox handler container_id:\"460d48267778ad71add22b67856c2b6df04bfc3f085fac253ac1ccbdc04e9bd0\" id:\"a2e6dd298e4dbc74f7c6e6786d72de7970844f00434e7fd175b253f56de17e30\" pid:5703 exited_at:{seconds:1755591482 nanos:373611870}" Aug 19 08:18:06.738406 containerd[1722]: time="2025-08-19T08:18:06.738362855Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a3b4c3fb59c2bf61f5bd254bbdef7c5306b759c292237119aa5a823b515ed90\" id:\"56dc9fb35cf932c46a7bbeb59a47eba267931cb3035ce24d533ca25b63439a41\" pid:5726 exited_at:{seconds:1755591486 nanos:738107431}" Aug 19 08:18:10.219564 containerd[1722]: 
time="2025-08-19T08:18:10.219515788Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f599e7db3c65ad463341da373ac1987f482e9ab53da9f8c401d29d4bbe8336e0\" id:\"af7bd4f5439cb0ae6259526a08bf634ab4ede83d0a78594f7475800b5ac87160\" pid:5752 exited_at:{seconds:1755591490 nanos:219326243}" Aug 19 08:18:14.719482 containerd[1722]: time="2025-08-19T08:18:14.719438985Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f599e7db3c65ad463341da373ac1987f482e9ab53da9f8c401d29d4bbe8336e0\" id:\"b6fe0d6881b81df66d78f9df381b402f8dbd0ad19c2ddd4b332801b8e576c6d4\" pid:5775 exited_at:{seconds:1755591494 nanos:719207770}" Aug 19 08:18:19.774995 containerd[1722]: time="2025-08-19T08:18:19.774954227Z" level=info msg="TaskExit event in podsandbox handler container_id:\"460d48267778ad71add22b67856c2b6df04bfc3f085fac253ac1ccbdc04e9bd0\" id:\"c0422eb283a36d72dcd72f8c444611dbaa085f73e028865c5af9ed09996a026a\" pid:5798 exited_at:{seconds:1755591499 nanos:774711136}" Aug 19 08:18:26.527060 kubelet[3115]: I0819 08:18:26.525982 3115 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 19 08:18:28.728074 kubelet[3115]: I0819 08:18:28.727919 3115 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 19 08:18:28.792535 containerd[1722]: time="2025-08-19T08:18:28.792495497Z" level=info msg="StopContainer for \"e12bb563c344d0792606eafe1ff2072b00d4c104d85257ce3010aa0c4f3cf46b\" with timeout 30 (s)" Aug 19 08:18:28.794247 containerd[1722]: time="2025-08-19T08:18:28.794173293Z" level=info msg="Stop container \"e12bb563c344d0792606eafe1ff2072b00d4c104d85257ce3010aa0c4f3cf46b\" with signal terminated" Aug 19 08:18:28.830440 systemd[1]: cri-containerd-e12bb563c344d0792606eafe1ff2072b00d4c104d85257ce3010aa0c4f3cf46b.scope: Deactivated successfully. 
Aug 19 08:18:28.835071 containerd[1722]: time="2025-08-19T08:18:28.834513410Z" level=info msg="received exit event container_id:\"e12bb563c344d0792606eafe1ff2072b00d4c104d85257ce3010aa0c4f3cf46b\" id:\"e12bb563c344d0792606eafe1ff2072b00d4c104d85257ce3010aa0c4f3cf46b\" pid:5506 exit_status:1 exited_at:{seconds:1755591508 nanos:834229180}" Aug 19 08:18:28.836735 containerd[1722]: time="2025-08-19T08:18:28.835708828Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e12bb563c344d0792606eafe1ff2072b00d4c104d85257ce3010aa0c4f3cf46b\" id:\"e12bb563c344d0792606eafe1ff2072b00d4c104d85257ce3010aa0c4f3cf46b\" pid:5506 exit_status:1 exited_at:{seconds:1755591508 nanos:834229180}" Aug 19 08:18:28.840407 systemd[1]: Created slice kubepods-besteffort-pod79f6449b_0337_4f7e_b7ac_147fd7febdea.slice - libcontainer container kubepods-besteffort-pod79f6449b_0337_4f7e_b7ac_147fd7febdea.slice. Aug 19 08:18:28.866486 kubelet[3115]: I0819 08:18:28.866298 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/79f6449b-0337-4f7e-b7ac-147fd7febdea-calico-apiserver-certs\") pod \"calico-apiserver-598d747447-hrpxs\" (UID: \"79f6449b-0337-4f7e-b7ac-147fd7febdea\") " pod="calico-apiserver/calico-apiserver-598d747447-hrpxs" Aug 19 08:18:28.866987 kubelet[3115]: I0819 08:18:28.866877 3115 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntb89\" (UniqueName: \"kubernetes.io/projected/79f6449b-0337-4f7e-b7ac-147fd7febdea-kube-api-access-ntb89\") pod \"calico-apiserver-598d747447-hrpxs\" (UID: \"79f6449b-0337-4f7e-b7ac-147fd7febdea\") " pod="calico-apiserver/calico-apiserver-598d747447-hrpxs" Aug 19 08:18:28.872203 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e12bb563c344d0792606eafe1ff2072b00d4c104d85257ce3010aa0c4f3cf46b-rootfs.mount: Deactivated successfully. 
Aug 19 08:18:29.146486 containerd[1722]: time="2025-08-19T08:18:29.146004414Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-598d747447-hrpxs,Uid:79f6449b-0337-4f7e-b7ac-147fd7febdea,Namespace:calico-apiserver,Attempt:0,}" Aug 19 08:18:29.546894 systemd-networkd[1370]: cali68812061865: Link UP Aug 19 08:18:29.548209 systemd-networkd[1370]: cali68812061865: Gained carrier Aug 19 08:18:29.567586 containerd[1722]: 2025-08-19 08:18:29.470 [INFO][5845] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--598d747447--hrpxs-eth0 calico-apiserver-598d747447- calico-apiserver 79f6449b-0337-4f7e-b7ac-147fd7febdea 1199 0 2025-08-19 08:18:28 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:598d747447 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4426.0.0-a-5588c1b4cf calico-apiserver-598d747447-hrpxs eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali68812061865 [] [] }} ContainerID="977b594ea57bf02ca098f460389222a27ab72252eabef556ab6d0b2159f99b53" Namespace="calico-apiserver" Pod="calico-apiserver-598d747447-hrpxs" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--598d747447--hrpxs-" Aug 19 08:18:29.567586 containerd[1722]: 2025-08-19 08:18:29.470 [INFO][5845] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="977b594ea57bf02ca098f460389222a27ab72252eabef556ab6d0b2159f99b53" Namespace="calico-apiserver" Pod="calico-apiserver-598d747447-hrpxs" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--598d747447--hrpxs-eth0" Aug 19 08:18:29.567586 containerd[1722]: 2025-08-19 08:18:29.503 [INFO][5856] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="977b594ea57bf02ca098f460389222a27ab72252eabef556ab6d0b2159f99b53" HandleID="k8s-pod-network.977b594ea57bf02ca098f460389222a27ab72252eabef556ab6d0b2159f99b53" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--598d747447--hrpxs-eth0" Aug 19 08:18:29.567586 containerd[1722]: 2025-08-19 08:18:29.503 [INFO][5856] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="977b594ea57bf02ca098f460389222a27ab72252eabef556ab6d0b2159f99b53" HandleID="k8s-pod-network.977b594ea57bf02ca098f460389222a27ab72252eabef556ab6d0b2159f99b53" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--598d747447--hrpxs-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024eff0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4426.0.0-a-5588c1b4cf", "pod":"calico-apiserver-598d747447-hrpxs", "timestamp":"2025-08-19 08:18:29.503488684 +0000 UTC"}, Hostname:"ci-4426.0.0-a-5588c1b4cf", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 08:18:29.567586 containerd[1722]: 2025-08-19 08:18:29.503 [INFO][5856] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 08:18:29.567586 containerd[1722]: 2025-08-19 08:18:29.503 [INFO][5856] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 19 08:18:29.567586 containerd[1722]: 2025-08-19 08:18:29.503 [INFO][5856] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.0.0-a-5588c1b4cf' Aug 19 08:18:29.567586 containerd[1722]: 2025-08-19 08:18:29.509 [INFO][5856] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.977b594ea57bf02ca098f460389222a27ab72252eabef556ab6d0b2159f99b53" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:18:29.567586 containerd[1722]: 2025-08-19 08:18:29.513 [INFO][5856] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:18:29.567586 containerd[1722]: 2025-08-19 08:18:29.516 [INFO][5856] ipam/ipam.go 511: Trying affinity for 192.168.10.192/26 host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:18:29.567586 containerd[1722]: 2025-08-19 08:18:29.518 [INFO][5856] ipam/ipam.go 158: Attempting to load block cidr=192.168.10.192/26 host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:18:29.567586 containerd[1722]: 2025-08-19 08:18:29.521 [INFO][5856] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.10.192/26 host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:18:29.567586 containerd[1722]: 2025-08-19 08:18:29.521 [INFO][5856] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.10.192/26 handle="k8s-pod-network.977b594ea57bf02ca098f460389222a27ab72252eabef556ab6d0b2159f99b53" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:18:29.567586 containerd[1722]: 2025-08-19 08:18:29.524 [INFO][5856] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.977b594ea57bf02ca098f460389222a27ab72252eabef556ab6d0b2159f99b53 Aug 19 08:18:29.567586 containerd[1722]: 2025-08-19 08:18:29.531 [INFO][5856] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.10.192/26 handle="k8s-pod-network.977b594ea57bf02ca098f460389222a27ab72252eabef556ab6d0b2159f99b53" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:18:29.567586 containerd[1722]: 2025-08-19 08:18:29.540 [INFO][5856] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.10.202/26] block=192.168.10.192/26 handle="k8s-pod-network.977b594ea57bf02ca098f460389222a27ab72252eabef556ab6d0b2159f99b53" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:18:29.567586 containerd[1722]: 2025-08-19 08:18:29.540 [INFO][5856] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.10.202/26] handle="k8s-pod-network.977b594ea57bf02ca098f460389222a27ab72252eabef556ab6d0b2159f99b53" host="ci-4426.0.0-a-5588c1b4cf" Aug 19 08:18:29.567586 containerd[1722]: 2025-08-19 08:18:29.540 [INFO][5856] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 19 08:18:29.567586 containerd[1722]: 2025-08-19 08:18:29.540 [INFO][5856] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.10.202/26] IPv6=[] ContainerID="977b594ea57bf02ca098f460389222a27ab72252eabef556ab6d0b2159f99b53" HandleID="k8s-pod-network.977b594ea57bf02ca098f460389222a27ab72252eabef556ab6d0b2159f99b53" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--598d747447--hrpxs-eth0" Aug 19 08:18:29.569884 containerd[1722]: 2025-08-19 08:18:29.542 [INFO][5845] cni-plugin/k8s.go 418: Populated endpoint ContainerID="977b594ea57bf02ca098f460389222a27ab72252eabef556ab6d0b2159f99b53" Namespace="calico-apiserver" Pod="calico-apiserver-598d747447-hrpxs" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--598d747447--hrpxs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--598d747447--hrpxs-eth0", GenerateName:"calico-apiserver-598d747447-", Namespace:"calico-apiserver", SelfLink:"", UID:"79f6449b-0337-4f7e-b7ac-147fd7febdea", ResourceVersion:"1199", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 18, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"598d747447", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-a-5588c1b4cf", ContainerID:"", Pod:"calico-apiserver-598d747447-hrpxs", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.10.202/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali68812061865", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:18:29.569884 containerd[1722]: 2025-08-19 08:18:29.542 [INFO][5845] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.10.202/32] ContainerID="977b594ea57bf02ca098f460389222a27ab72252eabef556ab6d0b2159f99b53" Namespace="calico-apiserver" Pod="calico-apiserver-598d747447-hrpxs" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--598d747447--hrpxs-eth0" Aug 19 08:18:29.569884 containerd[1722]: 2025-08-19 08:18:29.543 [INFO][5845] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali68812061865 ContainerID="977b594ea57bf02ca098f460389222a27ab72252eabef556ab6d0b2159f99b53" Namespace="calico-apiserver" Pod="calico-apiserver-598d747447-hrpxs" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--598d747447--hrpxs-eth0" Aug 19 08:18:29.569884 containerd[1722]: 2025-08-19 08:18:29.548 [INFO][5845] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="977b594ea57bf02ca098f460389222a27ab72252eabef556ab6d0b2159f99b53" Namespace="calico-apiserver" 
Pod="calico-apiserver-598d747447-hrpxs" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--598d747447--hrpxs-eth0" Aug 19 08:18:29.569884 containerd[1722]: 2025-08-19 08:18:29.549 [INFO][5845] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="977b594ea57bf02ca098f460389222a27ab72252eabef556ab6d0b2159f99b53" Namespace="calico-apiserver" Pod="calico-apiserver-598d747447-hrpxs" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--598d747447--hrpxs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--598d747447--hrpxs-eth0", GenerateName:"calico-apiserver-598d747447-", Namespace:"calico-apiserver", SelfLink:"", UID:"79f6449b-0337-4f7e-b7ac-147fd7febdea", ResourceVersion:"1199", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 18, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"598d747447", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-a-5588c1b4cf", ContainerID:"977b594ea57bf02ca098f460389222a27ab72252eabef556ab6d0b2159f99b53", Pod:"calico-apiserver-598d747447-hrpxs", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.10.202/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, 
InterfaceName:"cali68812061865", MAC:"6e:30:46:c2:78:34", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:18:29.569884 containerd[1722]: 2025-08-19 08:18:29.563 [INFO][5845] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="977b594ea57bf02ca098f460389222a27ab72252eabef556ab6d0b2159f99b53" Namespace="calico-apiserver" Pod="calico-apiserver-598d747447-hrpxs" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--598d747447--hrpxs-eth0" Aug 19 08:18:31.341390 containerd[1722]: time="2025-08-19T08:18:31.341293639Z" level=info msg="StopContainer for \"e12bb563c344d0792606eafe1ff2072b00d4c104d85257ce3010aa0c4f3cf46b\" returns successfully" Aug 19 08:18:31.343240 containerd[1722]: time="2025-08-19T08:18:31.343196153Z" level=info msg="StopPodSandbox for \"43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b\"" Aug 19 08:18:31.343417 containerd[1722]: time="2025-08-19T08:18:31.343402843Z" level=info msg="Container to stop \"e12bb563c344d0792606eafe1ff2072b00d4c104d85257ce3010aa0c4f3cf46b\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Aug 19 08:18:31.351689 containerd[1722]: time="2025-08-19T08:18:31.351655402Z" level=info msg="connecting to shim 977b594ea57bf02ca098f460389222a27ab72252eabef556ab6d0b2159f99b53" address="unix:///run/containerd/s/a0e97e2316b6d403a10979906e4d977f64978a9585b0969e6f8881e6663913bc" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:18:31.366211 systemd-networkd[1370]: cali68812061865: Gained IPv6LL Aug 19 08:18:31.379895 systemd[1]: cri-containerd-43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b.scope: Deactivated successfully. 
Aug 19 08:18:31.388293 containerd[1722]: time="2025-08-19T08:18:31.388232515Z" level=info msg="TaskExit event in podsandbox handler container_id:\"43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b\" id:\"43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b\" pid:5059 exit_status:137 exited_at:{seconds:1755591511 nanos:379821438}" Aug 19 08:18:31.406488 systemd[1]: Started cri-containerd-977b594ea57bf02ca098f460389222a27ab72252eabef556ab6d0b2159f99b53.scope - libcontainer container 977b594ea57bf02ca098f460389222a27ab72252eabef556ab6d0b2159f99b53. Aug 19 08:18:31.432066 containerd[1722]: time="2025-08-19T08:18:31.431229409Z" level=info msg="shim disconnected" id=43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b namespace=k8s.io Aug 19 08:18:31.432224 containerd[1722]: time="2025-08-19T08:18:31.432175243Z" level=warning msg="cleaning up after shim disconnected" id=43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b namespace=k8s.io Aug 19 08:18:31.432457 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b-rootfs.mount: Deactivated successfully. 
Aug 19 08:18:31.433690 containerd[1722]: time="2025-08-19T08:18:31.432195147Z" level=info msg="cleaning up dead shim" namespace=k8s.io Aug 19 08:18:31.433843 containerd[1722]: time="2025-08-19T08:18:31.432335324Z" level=error msg="Failed to handle event container_id:\"43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b\" id:\"43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b\" pid:5059 exit_status:137 exited_at:{seconds:1755591511 nanos:379821438} for 43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" error="failed to handle container TaskExit event: failed to stop sandbox: ttrpc: closed" Aug 19 08:18:31.471921 containerd[1722]: time="2025-08-19T08:18:31.471879908Z" level=info msg="received exit event sandbox_id:\"43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b\" exit_status:137 exited_at:{seconds:1755591511 nanos:379821438}" Aug 19 08:18:31.474769 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b-shm.mount: Deactivated successfully. Aug 19 08:18:31.553305 systemd-networkd[1370]: calif951dd5a7dd: Link DOWN Aug 19 08:18:31.553313 systemd-networkd[1370]: calif951dd5a7dd: Lost carrier Aug 19 08:18:31.641648 containerd[1722]: 2025-08-19 08:18:31.550 [INFO][5948] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" Aug 19 08:18:31.641648 containerd[1722]: 2025-08-19 08:18:31.551 [INFO][5948] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" iface="eth0" netns="/var/run/netns/cni-bcd41ddb-9084-48ee-4495-6f013520bdc0" Aug 19 08:18:31.641648 containerd[1722]: 2025-08-19 08:18:31.551 [INFO][5948] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. 
ContainerID="43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" iface="eth0" netns="/var/run/netns/cni-bcd41ddb-9084-48ee-4495-6f013520bdc0" Aug 19 08:18:31.641648 containerd[1722]: 2025-08-19 08:18:31.559 [INFO][5948] cni-plugin/dataplane_linux.go 604: Deleted device in netns. ContainerID="43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" after=7.485327ms iface="eth0" netns="/var/run/netns/cni-bcd41ddb-9084-48ee-4495-6f013520bdc0" Aug 19 08:18:31.641648 containerd[1722]: 2025-08-19 08:18:31.559 [INFO][5948] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" Aug 19 08:18:31.641648 containerd[1722]: 2025-08-19 08:18:31.559 [INFO][5948] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" Aug 19 08:18:31.641648 containerd[1722]: 2025-08-19 08:18:31.591 [INFO][5961] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" HandleID="k8s-pod-network.43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--n6vc8-eth0" Aug 19 08:18:31.641648 containerd[1722]: 2025-08-19 08:18:31.592 [INFO][5961] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 08:18:31.641648 containerd[1722]: 2025-08-19 08:18:31.592 [INFO][5961] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 19 08:18:31.641648 containerd[1722]: 2025-08-19 08:18:31.635 [INFO][5961] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" HandleID="k8s-pod-network.43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--n6vc8-eth0" Aug 19 08:18:31.641648 containerd[1722]: 2025-08-19 08:18:31.635 [INFO][5961] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" HandleID="k8s-pod-network.43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--n6vc8-eth0" Aug 19 08:18:31.641648 containerd[1722]: 2025-08-19 08:18:31.636 [INFO][5961] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 19 08:18:31.641648 containerd[1722]: 2025-08-19 08:18:31.639 [INFO][5948] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" Aug 19 08:18:31.645840 systemd[1]: run-netns-cni\x2dbcd41ddb\x2d9084\x2d48ee\x2d4495\x2d6f013520bdc0.mount: Deactivated successfully. 
Aug 19 08:18:31.646804 containerd[1722]: time="2025-08-19T08:18:31.646776971Z" level=info msg="TearDown network for sandbox \"43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b\" successfully" Aug 19 08:18:31.646878 containerd[1722]: time="2025-08-19T08:18:31.646868137Z" level=info msg="StopPodSandbox for \"43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b\" returns successfully" Aug 19 08:18:31.685235 kubelet[3115]: I0819 08:18:31.685131 3115 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5kk9\" (UniqueName: \"kubernetes.io/projected/977f07be-ddc3-40de-8dad-97e5d4d17950-kube-api-access-b5kk9\") pod \"977f07be-ddc3-40de-8dad-97e5d4d17950\" (UID: \"977f07be-ddc3-40de-8dad-97e5d4d17950\") " Aug 19 08:18:31.686864 kubelet[3115]: I0819 08:18:31.686533 3115 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/977f07be-ddc3-40de-8dad-97e5d4d17950-calico-apiserver-certs\") pod \"977f07be-ddc3-40de-8dad-97e5d4d17950\" (UID: \"977f07be-ddc3-40de-8dad-97e5d4d17950\") " Aug 19 08:18:31.687876 containerd[1722]: time="2025-08-19T08:18:31.687773480Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-598d747447-hrpxs,Uid:79f6449b-0337-4f7e-b7ac-147fd7febdea,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"977b594ea57bf02ca098f460389222a27ab72252eabef556ab6d0b2159f99b53\"" Aug 19 08:18:31.692713 containerd[1722]: time="2025-08-19T08:18:31.692176849Z" level=info msg="CreateContainer within sandbox \"977b594ea57bf02ca098f460389222a27ab72252eabef556ab6d0b2159f99b53\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Aug 19 08:18:31.692892 kubelet[3115]: I0819 08:18:31.692873 3115 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/977f07be-ddc3-40de-8dad-97e5d4d17950-kube-api-access-b5kk9" (OuterVolumeSpecName: 
"kube-api-access-b5kk9") pod "977f07be-ddc3-40de-8dad-97e5d4d17950" (UID: "977f07be-ddc3-40de-8dad-97e5d4d17950"). InnerVolumeSpecName "kube-api-access-b5kk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Aug 19 08:18:31.693776 kubelet[3115]: I0819 08:18:31.693757 3115 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/977f07be-ddc3-40de-8dad-97e5d4d17950-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "977f07be-ddc3-40de-8dad-97e5d4d17950" (UID: "977f07be-ddc3-40de-8dad-97e5d4d17950"). InnerVolumeSpecName "calico-apiserver-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Aug 19 08:18:31.787525 kubelet[3115]: I0819 08:18:31.787505 3115 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5kk9\" (UniqueName: \"kubernetes.io/projected/977f07be-ddc3-40de-8dad-97e5d4d17950-kube-api-access-b5kk9\") on node \"ci-4426.0.0-a-5588c1b4cf\" DevicePath \"\"" Aug 19 08:18:31.787626 kubelet[3115]: I0819 08:18:31.787617 3115 reconciler_common.go:293] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/977f07be-ddc3-40de-8dad-97e5d4d17950-calico-apiserver-certs\") on node \"ci-4426.0.0-a-5588c1b4cf\" DevicePath \"\"" Aug 19 08:18:32.104884 kubelet[3115]: I0819 08:18:32.104772 3115 scope.go:117] "RemoveContainer" containerID="e12bb563c344d0792606eafe1ff2072b00d4c104d85257ce3010aa0c4f3cf46b" Aug 19 08:18:32.108295 containerd[1722]: time="2025-08-19T08:18:32.108194970Z" level=info msg="RemoveContainer for \"e12bb563c344d0792606eafe1ff2072b00d4c104d85257ce3010aa0c4f3cf46b\"" Aug 19 08:18:32.111841 systemd[1]: Removed slice kubepods-besteffort-pod977f07be_ddc3_40de_8dad_97e5d4d17950.slice - libcontainer container kubepods-besteffort-pod977f07be_ddc3_40de_8dad_97e5d4d17950.slice. 
Aug 19 08:18:32.335771 systemd[1]: var-lib-kubelet-pods-977f07be\x2dddc3\x2d40de\x2d8dad\x2d97e5d4d17950-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2db5kk9.mount: Deactivated successfully. Aug 19 08:18:32.335874 systemd[1]: var-lib-kubelet-pods-977f07be\x2dddc3\x2d40de\x2d8dad\x2d97e5d4d17950-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully. Aug 19 08:18:32.371198 containerd[1722]: time="2025-08-19T08:18:32.371052485Z" level=info msg="TaskExit event in podsandbox handler container_id:\"460d48267778ad71add22b67856c2b6df04bfc3f085fac253ac1ccbdc04e9bd0\" id:\"ebac42d38c03ca2cc41b4f07878318f6462d72a51f4b38d40745f1aabe5a8cc9\" pid:5997 exited_at:{seconds:1755591512 nanos:370783828}" Aug 19 08:18:32.469491 containerd[1722]: time="2025-08-19T08:18:32.469453569Z" level=info msg="RemoveContainer for \"e12bb563c344d0792606eafe1ff2072b00d4c104d85257ce3010aa0c4f3cf46b\" returns successfully" Aug 19 08:18:32.469966 kubelet[3115]: I0819 08:18:32.469948 3115 scope.go:117] "RemoveContainer" containerID="e12bb563c344d0792606eafe1ff2072b00d4c104d85257ce3010aa0c4f3cf46b" Aug 19 08:18:32.470383 containerd[1722]: time="2025-08-19T08:18:32.470327315Z" level=error msg="ContainerStatus for \"e12bb563c344d0792606eafe1ff2072b00d4c104d85257ce3010aa0c4f3cf46b\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"e12bb563c344d0792606eafe1ff2072b00d4c104d85257ce3010aa0c4f3cf46b\": not found" Aug 19 08:18:32.470576 kubelet[3115]: E0819 08:18:32.470550 3115 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"e12bb563c344d0792606eafe1ff2072b00d4c104d85257ce3010aa0c4f3cf46b\": not found" containerID="e12bb563c344d0792606eafe1ff2072b00d4c104d85257ce3010aa0c4f3cf46b" Aug 19 08:18:32.470725 kubelet[3115]: I0819 08:18:32.470649 3115 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"containerd","ID":"e12bb563c344d0792606eafe1ff2072b00d4c104d85257ce3010aa0c4f3cf46b"} err="failed to get container status \"e12bb563c344d0792606eafe1ff2072b00d4c104d85257ce3010aa0c4f3cf46b\": rpc error: code = NotFound desc = an error occurred when try to find container \"e12bb563c344d0792606eafe1ff2072b00d4c104d85257ce3010aa0c4f3cf46b\": not found" Aug 19 08:18:32.473777 containerd[1722]: time="2025-08-19T08:18:32.473067322Z" level=info msg="Container 712aee2e3b786c10890bcdfabdbd0535ee510244a51a3339d402bdf533b7f287: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:18:32.489004 containerd[1722]: time="2025-08-19T08:18:32.488947065Z" level=info msg="CreateContainer within sandbox \"977b594ea57bf02ca098f460389222a27ab72252eabef556ab6d0b2159f99b53\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"712aee2e3b786c10890bcdfabdbd0535ee510244a51a3339d402bdf533b7f287\"" Aug 19 08:18:32.490132 containerd[1722]: time="2025-08-19T08:18:32.489448489Z" level=info msg="StartContainer for \"712aee2e3b786c10890bcdfabdbd0535ee510244a51a3339d402bdf533b7f287\"" Aug 19 08:18:32.490615 containerd[1722]: time="2025-08-19T08:18:32.490573242Z" level=info msg="connecting to shim 712aee2e3b786c10890bcdfabdbd0535ee510244a51a3339d402bdf533b7f287" address="unix:///run/containerd/s/a0e97e2316b6d403a10979906e4d977f64978a9585b0969e6f8881e6663913bc" protocol=ttrpc version=3 Aug 19 08:18:32.513305 systemd[1]: Started cri-containerd-712aee2e3b786c10890bcdfabdbd0535ee510244a51a3339d402bdf533b7f287.scope - libcontainer container 712aee2e3b786c10890bcdfabdbd0535ee510244a51a3339d402bdf533b7f287. 
Aug 19 08:18:32.570183 containerd[1722]: time="2025-08-19T08:18:32.570139228Z" level=info msg="StartContainer for \"712aee2e3b786c10890bcdfabdbd0535ee510244a51a3339d402bdf533b7f287\" returns successfully" Aug 19 08:18:32.822008 kubelet[3115]: I0819 08:18:32.821952 3115 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="977f07be-ddc3-40de-8dad-97e5d4d17950" path="/var/lib/kubelet/pods/977f07be-ddc3-40de-8dad-97e5d4d17950/volumes" Aug 19 08:18:32.972464 containerd[1722]: time="2025-08-19T08:18:32.972335850Z" level=info msg="TaskExit event in podsandbox handler container_id:\"43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b\" id:\"43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b\" pid:5059 exit_status:137 exited_at:{seconds:1755591511 nanos:379821438}" Aug 19 08:18:33.130208 kubelet[3115]: I0819 08:18:33.129778 3115 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-598d747447-hrpxs" podStartSLOduration=5.129740195 podStartE2EDuration="5.129740195s" podCreationTimestamp="2025-08-19 08:18:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 08:18:33.122347821 +0000 UTC m=+92.383631342" watchObservedRunningTime="2025-08-19 08:18:33.129740195 +0000 UTC m=+92.391023721" Aug 19 08:18:34.013804 containerd[1722]: time="2025-08-19T08:18:34.013747526Z" level=info msg="StopContainer for \"30892f63af98db988069c40c83318f9f8a6882b6337ad3329aeaa4dc696ea7aa\" with timeout 30 (s)" Aug 19 08:18:34.014736 containerd[1722]: time="2025-08-19T08:18:34.014711091Z" level=info msg="Stop container \"30892f63af98db988069c40c83318f9f8a6882b6337ad3329aeaa4dc696ea7aa\" with signal terminated" Aug 19 08:18:34.048245 systemd[1]: cri-containerd-30892f63af98db988069c40c83318f9f8a6882b6337ad3329aeaa4dc696ea7aa.scope: Deactivated successfully. 
Aug 19 08:18:34.054579 containerd[1722]: time="2025-08-19T08:18:34.054550476Z" level=info msg="received exit event container_id:\"30892f63af98db988069c40c83318f9f8a6882b6337ad3329aeaa4dc696ea7aa\" id:\"30892f63af98db988069c40c83318f9f8a6882b6337ad3329aeaa4dc696ea7aa\" pid:5357 exit_status:1 exited_at:{seconds:1755591514 nanos:52996432}" Aug 19 08:18:34.055763 containerd[1722]: time="2025-08-19T08:18:34.054769165Z" level=info msg="TaskExit event in podsandbox handler container_id:\"30892f63af98db988069c40c83318f9f8a6882b6337ad3329aeaa4dc696ea7aa\" id:\"30892f63af98db988069c40c83318f9f8a6882b6337ad3329aeaa4dc696ea7aa\" pid:5357 exit_status:1 exited_at:{seconds:1755591514 nanos:52996432}" Aug 19 08:18:34.097327 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-30892f63af98db988069c40c83318f9f8a6882b6337ad3329aeaa4dc696ea7aa-rootfs.mount: Deactivated successfully. Aug 19 08:18:34.138353 containerd[1722]: time="2025-08-19T08:18:34.138273400Z" level=info msg="StopContainer for \"30892f63af98db988069c40c83318f9f8a6882b6337ad3329aeaa4dc696ea7aa\" returns successfully" Aug 19 08:18:34.139803 containerd[1722]: time="2025-08-19T08:18:34.139114567Z" level=info msg="StopPodSandbox for \"9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40\"" Aug 19 08:18:34.139949 containerd[1722]: time="2025-08-19T08:18:34.139927963Z" level=info msg="Container to stop \"30892f63af98db988069c40c83318f9f8a6882b6337ad3329aeaa4dc696ea7aa\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Aug 19 08:18:34.148555 systemd[1]: cri-containerd-9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40.scope: Deactivated successfully. 
Aug 19 08:18:34.151265 containerd[1722]: time="2025-08-19T08:18:34.151234825Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40\" id:\"9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40\" pid:4862 exit_status:137 exited_at:{seconds:1755591514 nanos:150579208}" Aug 19 08:18:34.189606 containerd[1722]: time="2025-08-19T08:18:34.189476983Z" level=info msg="shim disconnected" id=9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40 namespace=k8s.io Aug 19 08:18:34.189606 containerd[1722]: time="2025-08-19T08:18:34.189499893Z" level=warning msg="cleaning up after shim disconnected" id=9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40 namespace=k8s.io Aug 19 08:18:34.190128 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40-rootfs.mount: Deactivated successfully. Aug 19 08:18:34.190412 containerd[1722]: time="2025-08-19T08:18:34.189507243Z" level=info msg="cleaning up dead shim" namespace=k8s.io Aug 19 08:18:34.221454 containerd[1722]: time="2025-08-19T08:18:34.221420611Z" level=info msg="received exit event sandbox_id:\"9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40\" exit_status:137 exited_at:{seconds:1755591514 nanos:150579208}" Aug 19 08:18:34.221975 containerd[1722]: time="2025-08-19T08:18:34.221949240Z" level=error msg="Failed to handle event container_id:\"9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40\" id:\"9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40\" pid:4862 exit_status:137 exited_at:{seconds:1755591514 nanos:150579208} for 9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" error="failed to handle container TaskExit event: failed to stop sandbox: failed to delete task: ttrpc: closed" Aug 19 08:18:34.227277 systemd[1]: 
run-containerd-io.containerd.grpc.v1.cri-sandboxes-9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40-shm.mount: Deactivated successfully. Aug 19 08:18:34.313753 systemd-networkd[1370]: cali3c4179ea93f: Link DOWN Aug 19 08:18:34.313759 systemd-networkd[1370]: cali3c4179ea93f: Lost carrier Aug 19 08:18:34.315957 systemd-resolved[1633]: cali3c4179ea93f: Failed to determine whether the interface is managed, ignoring: No such file or directory Aug 19 08:18:34.431203 containerd[1722]: 2025-08-19 08:18:34.312 [INFO][6112] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" Aug 19 08:18:34.431203 containerd[1722]: 2025-08-19 08:18:34.312 [INFO][6112] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" iface="eth0" netns="/var/run/netns/cni-44c5aede-a899-2536-8bc0-c976aa613029" Aug 19 08:18:34.431203 containerd[1722]: 2025-08-19 08:18:34.313 [INFO][6112] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" iface="eth0" netns="/var/run/netns/cni-44c5aede-a899-2536-8bc0-c976aa613029" Aug 19 08:18:34.431203 containerd[1722]: 2025-08-19 08:18:34.320 [INFO][6112] cni-plugin/dataplane_linux.go 604: Deleted device in netns. 
ContainerID="9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" after=7.034411ms iface="eth0" netns="/var/run/netns/cni-44c5aede-a899-2536-8bc0-c976aa613029" Aug 19 08:18:34.431203 containerd[1722]: 2025-08-19 08:18:34.320 [INFO][6112] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" Aug 19 08:18:34.431203 containerd[1722]: 2025-08-19 08:18:34.320 [INFO][6112] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" Aug 19 08:18:34.431203 containerd[1722]: 2025-08-19 08:18:34.358 [INFO][6120] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" HandleID="k8s-pod-network.9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--hjd48-eth0" Aug 19 08:18:34.431203 containerd[1722]: 2025-08-19 08:18:34.359 [INFO][6120] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 08:18:34.431203 containerd[1722]: 2025-08-19 08:18:34.359 [INFO][6120] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 19 08:18:34.431203 containerd[1722]: 2025-08-19 08:18:34.425 [INFO][6120] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" HandleID="k8s-pod-network.9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--hjd48-eth0" Aug 19 08:18:34.431203 containerd[1722]: 2025-08-19 08:18:34.425 [INFO][6120] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" HandleID="k8s-pod-network.9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--hjd48-eth0" Aug 19 08:18:34.431203 containerd[1722]: 2025-08-19 08:18:34.427 [INFO][6120] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 19 08:18:34.431203 containerd[1722]: 2025-08-19 08:18:34.429 [INFO][6112] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" Aug 19 08:18:34.435238 containerd[1722]: time="2025-08-19T08:18:34.435199890Z" level=info msg="TearDown network for sandbox \"9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40\" successfully" Aug 19 08:18:34.435238 containerd[1722]: time="2025-08-19T08:18:34.435238848Z" level=info msg="StopPodSandbox for \"9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40\" returns successfully" Aug 19 08:18:34.436031 systemd[1]: run-netns-cni\x2d44c5aede\x2da899\x2d2536\x2d8bc0\x2dc976aa613029.mount: Deactivated successfully. 
Aug 19 08:18:34.516136 kubelet[3115]: I0819 08:18:34.516109 3115 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/5acb51bf-65bc-4f86-93a4-1cc021ca500f-calico-apiserver-certs\") pod \"5acb51bf-65bc-4f86-93a4-1cc021ca500f\" (UID: \"5acb51bf-65bc-4f86-93a4-1cc021ca500f\") " Aug 19 08:18:34.517376 kubelet[3115]: I0819 08:18:34.516148 3115 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klwwc\" (UniqueName: \"kubernetes.io/projected/5acb51bf-65bc-4f86-93a4-1cc021ca500f-kube-api-access-klwwc\") pod \"5acb51bf-65bc-4f86-93a4-1cc021ca500f\" (UID: \"5acb51bf-65bc-4f86-93a4-1cc021ca500f\") " Aug 19 08:18:34.523065 kubelet[3115]: I0819 08:18:34.520163 3115 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5acb51bf-65bc-4f86-93a4-1cc021ca500f-kube-api-access-klwwc" (OuterVolumeSpecName: "kube-api-access-klwwc") pod "5acb51bf-65bc-4f86-93a4-1cc021ca500f" (UID: "5acb51bf-65bc-4f86-93a4-1cc021ca500f"). InnerVolumeSpecName "kube-api-access-klwwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Aug 19 08:18:34.521932 systemd[1]: var-lib-kubelet-pods-5acb51bf\x2d65bc\x2d4f86\x2d93a4\x2d1cc021ca500f-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dklwwc.mount: Deactivated successfully. Aug 19 08:18:34.523789 kubelet[3115]: I0819 08:18:34.523764 3115 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5acb51bf-65bc-4f86-93a4-1cc021ca500f-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "5acb51bf-65bc-4f86-93a4-1cc021ca500f" (UID: "5acb51bf-65bc-4f86-93a4-1cc021ca500f"). InnerVolumeSpecName "calico-apiserver-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Aug 19 08:18:34.617199 kubelet[3115]: I0819 08:18:34.617110 3115 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klwwc\" (UniqueName: \"kubernetes.io/projected/5acb51bf-65bc-4f86-93a4-1cc021ca500f-kube-api-access-klwwc\") on node \"ci-4426.0.0-a-5588c1b4cf\" DevicePath \"\"" Aug 19 08:18:34.617199 kubelet[3115]: I0819 08:18:34.617142 3115 reconciler_common.go:293] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/5acb51bf-65bc-4f86-93a4-1cc021ca500f-calico-apiserver-certs\") on node \"ci-4426.0.0-a-5588c1b4cf\" DevicePath \"\"" Aug 19 08:18:34.828323 systemd[1]: Removed slice kubepods-besteffort-pod5acb51bf_65bc_4f86_93a4_1cc021ca500f.slice - libcontainer container kubepods-besteffort-pod5acb51bf_65bc_4f86_93a4_1cc021ca500f.slice. Aug 19 08:18:35.096416 systemd[1]: var-lib-kubelet-pods-5acb51bf\x2d65bc\x2d4f86\x2d93a4\x2d1cc021ca500f-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully. 
Aug 19 08:18:35.120657 kubelet[3115]: I0819 08:18:35.120625 3115 scope.go:117] "RemoveContainer" containerID="30892f63af98db988069c40c83318f9f8a6882b6337ad3329aeaa4dc696ea7aa" Aug 19 08:18:35.124361 containerd[1722]: time="2025-08-19T08:18:35.124293093Z" level=info msg="RemoveContainer for \"30892f63af98db988069c40c83318f9f8a6882b6337ad3329aeaa4dc696ea7aa\"" Aug 19 08:18:35.143969 containerd[1722]: time="2025-08-19T08:18:35.143928564Z" level=info msg="RemoveContainer for \"30892f63af98db988069c40c83318f9f8a6882b6337ad3329aeaa4dc696ea7aa\" returns successfully" Aug 19 08:18:35.144153 kubelet[3115]: I0819 08:18:35.144136 3115 scope.go:117] "RemoveContainer" containerID="30892f63af98db988069c40c83318f9f8a6882b6337ad3329aeaa4dc696ea7aa" Aug 19 08:18:35.144400 containerd[1722]: time="2025-08-19T08:18:35.144371723Z" level=error msg="ContainerStatus for \"30892f63af98db988069c40c83318f9f8a6882b6337ad3329aeaa4dc696ea7aa\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"30892f63af98db988069c40c83318f9f8a6882b6337ad3329aeaa4dc696ea7aa\": not found" Aug 19 08:18:35.144512 kubelet[3115]: E0819 08:18:35.144491 3115 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"30892f63af98db988069c40c83318f9f8a6882b6337ad3329aeaa4dc696ea7aa\": not found" containerID="30892f63af98db988069c40c83318f9f8a6882b6337ad3329aeaa4dc696ea7aa" Aug 19 08:18:35.144648 kubelet[3115]: I0819 08:18:35.144532 3115 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"30892f63af98db988069c40c83318f9f8a6882b6337ad3329aeaa4dc696ea7aa"} err="failed to get container status \"30892f63af98db988069c40c83318f9f8a6882b6337ad3329aeaa4dc696ea7aa\": rpc error: code = NotFound desc = an error occurred when try to find container \"30892f63af98db988069c40c83318f9f8a6882b6337ad3329aeaa4dc696ea7aa\": not found" Aug 19 08:18:35.973061 
containerd[1722]: time="2025-08-19T08:18:35.971756268Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40\" id:\"9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40\" pid:4862 exit_status:137 exited_at:{seconds:1755591514 nanos:150579208}" Aug 19 08:18:36.741505 containerd[1722]: time="2025-08-19T08:18:36.741444079Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a3b4c3fb59c2bf61f5bd254bbdef7c5306b759c292237119aa5a823b515ed90\" id:\"115ff8cf3e6e748549e49545822004f49b5b20226f8711dfa550c1da0d6ffb57\" pid:6150 exited_at:{seconds:1755591516 nanos:741175345}" Aug 19 08:18:36.820095 kubelet[3115]: I0819 08:18:36.820026 3115 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5acb51bf-65bc-4f86-93a4-1cc021ca500f" path="/var/lib/kubelet/pods/5acb51bf-65bc-4f86-93a4-1cc021ca500f/volumes" Aug 19 08:18:44.718263 containerd[1722]: time="2025-08-19T08:18:44.718157865Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f599e7db3c65ad463341da373ac1987f482e9ab53da9f8c401d29d4bbe8336e0\" id:\"9846771f7d1064c3e37832b3cc5dc19395c934eb99ee1f86fc97c390466a5b08\" pid:6179 exited_at:{seconds:1755591524 nanos:717895170}" Aug 19 08:19:00.818378 containerd[1722]: time="2025-08-19T08:19:00.818293325Z" level=info msg="StopPodSandbox for \"43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b\"" Aug 19 08:19:00.873521 containerd[1722]: 2025-08-19 08:19:00.845 [WARNING][6206] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--n6vc8-eth0" Aug 19 08:19:00.873521 containerd[1722]: 2025-08-19 08:19:00.846 [INFO][6206] cni-plugin/k8s.go 640: Cleaning up netns 
ContainerID="43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" Aug 19 08:19:00.873521 containerd[1722]: 2025-08-19 08:19:00.846 [INFO][6206] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" iface="eth0" netns="" Aug 19 08:19:00.873521 containerd[1722]: 2025-08-19 08:19:00.846 [INFO][6206] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" Aug 19 08:19:00.873521 containerd[1722]: 2025-08-19 08:19:00.846 [INFO][6206] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" Aug 19 08:19:00.873521 containerd[1722]: 2025-08-19 08:19:00.866 [INFO][6213] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" HandleID="k8s-pod-network.43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--n6vc8-eth0" Aug 19 08:19:00.873521 containerd[1722]: 2025-08-19 08:19:00.866 [INFO][6213] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 08:19:00.873521 containerd[1722]: 2025-08-19 08:19:00.866 [INFO][6213] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 08:19:00.873521 containerd[1722]: 2025-08-19 08:19:00.870 [WARNING][6213] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" HandleID="k8s-pod-network.43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--n6vc8-eth0" Aug 19 08:19:00.873521 containerd[1722]: 2025-08-19 08:19:00.870 [INFO][6213] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" HandleID="k8s-pod-network.43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--n6vc8-eth0" Aug 19 08:19:00.873521 containerd[1722]: 2025-08-19 08:19:00.871 [INFO][6213] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 19 08:19:00.873521 containerd[1722]: 2025-08-19 08:19:00.872 [INFO][6206] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" Aug 19 08:19:00.873911 containerd[1722]: time="2025-08-19T08:19:00.873551573Z" level=info msg="TearDown network for sandbox \"43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b\" successfully" Aug 19 08:19:00.873911 containerd[1722]: time="2025-08-19T08:19:00.873580299Z" level=info msg="StopPodSandbox for \"43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b\" returns successfully" Aug 19 08:19:00.874125 containerd[1722]: time="2025-08-19T08:19:00.874094763Z" level=info msg="RemovePodSandbox for \"43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b\"" Aug 19 08:19:00.874185 containerd[1722]: time="2025-08-19T08:19:00.874130763Z" level=info msg="Forcibly stopping sandbox \"43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b\"" Aug 19 08:19:00.928506 containerd[1722]: 2025-08-19 08:19:00.905 [WARNING][6228] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up 
ContainerID="43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--n6vc8-eth0" Aug 19 08:19:00.928506 containerd[1722]: 2025-08-19 08:19:00.905 [INFO][6228] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" Aug 19 08:19:00.928506 containerd[1722]: 2025-08-19 08:19:00.905 [INFO][6228] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" iface="eth0" netns="" Aug 19 08:19:00.928506 containerd[1722]: 2025-08-19 08:19:00.905 [INFO][6228] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" Aug 19 08:19:00.928506 containerd[1722]: 2025-08-19 08:19:00.905 [INFO][6228] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" Aug 19 08:19:00.928506 containerd[1722]: 2025-08-19 08:19:00.921 [INFO][6235] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" HandleID="k8s-pod-network.43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--n6vc8-eth0" Aug 19 08:19:00.928506 containerd[1722]: 2025-08-19 08:19:00.921 [INFO][6235] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 08:19:00.928506 containerd[1722]: 2025-08-19 08:19:00.921 [INFO][6235] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 08:19:00.928506 containerd[1722]: 2025-08-19 08:19:00.925 [WARNING][6235] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" HandleID="k8s-pod-network.43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--n6vc8-eth0" Aug 19 08:19:00.928506 containerd[1722]: 2025-08-19 08:19:00.926 [INFO][6235] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" HandleID="k8s-pod-network.43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--n6vc8-eth0" Aug 19 08:19:00.928506 containerd[1722]: 2025-08-19 08:19:00.926 [INFO][6235] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 19 08:19:00.928506 containerd[1722]: 2025-08-19 08:19:00.927 [INFO][6228] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b" Aug 19 08:19:00.928830 containerd[1722]: time="2025-08-19T08:19:00.928505107Z" level=info msg="TearDown network for sandbox \"43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b\" successfully" Aug 19 08:19:00.929932 containerd[1722]: time="2025-08-19T08:19:00.929908758Z" level=info msg="Ensure that sandbox 43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b in task-service has been cleanup successfully" Aug 19 08:19:00.944214 containerd[1722]: time="2025-08-19T08:19:00.944125913Z" level=info msg="RemovePodSandbox \"43999031b487dddba4eb667fa40981d67a9eef28f6b4025a55b8c2f36c725a8b\" returns successfully" Aug 19 08:19:00.944499 containerd[1722]: time="2025-08-19T08:19:00.944472490Z" level=info msg="StopPodSandbox for \"9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40\"" Aug 19 08:19:00.987750 containerd[1722]: 2025-08-19 08:19:00.966 [WARNING][6249] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with 
the clean up ContainerID="9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--hjd48-eth0" Aug 19 08:19:00.987750 containerd[1722]: 2025-08-19 08:19:00.967 [INFO][6249] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" Aug 19 08:19:00.987750 containerd[1722]: 2025-08-19 08:19:00.967 [INFO][6249] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" iface="eth0" netns="" Aug 19 08:19:00.987750 containerd[1722]: 2025-08-19 08:19:00.967 [INFO][6249] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" Aug 19 08:19:00.987750 containerd[1722]: 2025-08-19 08:19:00.967 [INFO][6249] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" Aug 19 08:19:00.987750 containerd[1722]: 2025-08-19 08:19:00.981 [INFO][6256] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" HandleID="k8s-pod-network.9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--hjd48-eth0" Aug 19 08:19:00.987750 containerd[1722]: 2025-08-19 08:19:00.981 [INFO][6256] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 08:19:00.987750 containerd[1722]: 2025-08-19 08:19:00.981 [INFO][6256] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 08:19:00.987750 containerd[1722]: 2025-08-19 08:19:00.985 [WARNING][6256] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" HandleID="k8s-pod-network.9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--hjd48-eth0" Aug 19 08:19:00.987750 containerd[1722]: 2025-08-19 08:19:00.985 [INFO][6256] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" HandleID="k8s-pod-network.9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--hjd48-eth0" Aug 19 08:19:00.987750 containerd[1722]: 2025-08-19 08:19:00.986 [INFO][6256] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 19 08:19:00.987750 containerd[1722]: 2025-08-19 08:19:00.987 [INFO][6249] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" Aug 19 08:19:00.988137 containerd[1722]: time="2025-08-19T08:19:00.987789758Z" level=info msg="TearDown network for sandbox \"9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40\" successfully" Aug 19 08:19:00.988137 containerd[1722]: time="2025-08-19T08:19:00.987810643Z" level=info msg="StopPodSandbox for \"9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40\" returns successfully" Aug 19 08:19:00.988211 containerd[1722]: time="2025-08-19T08:19:00.988193040Z" level=info msg="RemovePodSandbox for \"9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40\"" Aug 19 08:19:00.988240 containerd[1722]: time="2025-08-19T08:19:00.988222442Z" level=info msg="Forcibly stopping sandbox \"9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40\"" Aug 19 08:19:01.031725 containerd[1722]: 2025-08-19 08:19:01.010 [WARNING][6270] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up 
ContainerID="9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" WorkloadEndpoint="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--hjd48-eth0" Aug 19 08:19:01.031725 containerd[1722]: 2025-08-19 08:19:01.010 [INFO][6270] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" Aug 19 08:19:01.031725 containerd[1722]: 2025-08-19 08:19:01.010 [INFO][6270] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" iface="eth0" netns="" Aug 19 08:19:01.031725 containerd[1722]: 2025-08-19 08:19:01.010 [INFO][6270] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" Aug 19 08:19:01.031725 containerd[1722]: 2025-08-19 08:19:01.010 [INFO][6270] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" Aug 19 08:19:01.031725 containerd[1722]: 2025-08-19 08:19:01.024 [INFO][6277] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" HandleID="k8s-pod-network.9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--hjd48-eth0" Aug 19 08:19:01.031725 containerd[1722]: 2025-08-19 08:19:01.024 [INFO][6277] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 08:19:01.031725 containerd[1722]: 2025-08-19 08:19:01.024 [INFO][6277] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 08:19:01.031725 containerd[1722]: 2025-08-19 08:19:01.028 [WARNING][6277] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" HandleID="k8s-pod-network.9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--hjd48-eth0" Aug 19 08:19:01.031725 containerd[1722]: 2025-08-19 08:19:01.028 [INFO][6277] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" HandleID="k8s-pod-network.9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" Workload="ci--4426.0.0--a--5588c1b4cf-k8s-calico--apiserver--655b7bdffd--hjd48-eth0" Aug 19 08:19:01.031725 containerd[1722]: 2025-08-19 08:19:01.030 [INFO][6277] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 19 08:19:01.031725 containerd[1722]: 2025-08-19 08:19:01.030 [INFO][6270] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40" Aug 19 08:19:01.032062 containerd[1722]: time="2025-08-19T08:19:01.031749956Z" level=info msg="TearDown network for sandbox \"9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40\" successfully" Aug 19 08:19:01.033253 containerd[1722]: time="2025-08-19T08:19:01.033230939Z" level=info msg="Ensure that sandbox 9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40 in task-service has been cleanup successfully" Aug 19 08:19:01.048043 containerd[1722]: time="2025-08-19T08:19:01.048015065Z" level=info msg="RemovePodSandbox \"9453318845cf108d5cb1e2cbc9ce1dc6e36c6abfaf55885822bc5aa233890f40\" returns successfully" Aug 19 08:19:02.364279 containerd[1722]: time="2025-08-19T08:19:02.364211948Z" level=info msg="TaskExit event in podsandbox handler container_id:\"460d48267778ad71add22b67856c2b6df04bfc3f085fac253ac1ccbdc04e9bd0\" id:\"8f73db3bbb47caa2c4c88d445ccf89c90012de1dc76bad42a8651d85bd07f143\" pid:6295 exited_at:{seconds:1755591542 nanos:363896849}" Aug 19 
08:19:06.743060 containerd[1722]: time="2025-08-19T08:19:06.743000787Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a3b4c3fb59c2bf61f5bd254bbdef7c5306b759c292237119aa5a823b515ed90\" id:\"e9ab9748a593428eed9814f393cd2b3978daf29c7d78476bdd5957064738e05b\" pid:6319 exited_at:{seconds:1755591546 nanos:742769470}" Aug 19 08:19:10.223707 containerd[1722]: time="2025-08-19T08:19:10.223653552Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f599e7db3c65ad463341da373ac1987f482e9ab53da9f8c401d29d4bbe8336e0\" id:\"69667578459e3a275a9bdec77f81c4d9a172fc977720b28464567660837c4148\" pid:6346 exited_at:{seconds:1755591550 nanos:223480567}" Aug 19 08:19:14.717086 containerd[1722]: time="2025-08-19T08:19:14.716966007Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f599e7db3c65ad463341da373ac1987f482e9ab53da9f8c401d29d4bbe8336e0\" id:\"cac308f640fede589b93a6806fd6f9507e5092a594e4adec98d7abc558485d66\" pid:6391 exited_at:{seconds:1755591554 nanos:716734959}" Aug 19 08:19:18.992032 systemd[1]: Started sshd@7-10.200.8.40:22-10.200.16.10:52886.service - OpenSSH per-connection server daemon (10.200.16.10:52886). Aug 19 08:19:19.627933 sshd[6406]: Accepted publickey for core from 10.200.16.10 port 52886 ssh2: RSA SHA256:rwMkWSJkyuzYLtFMdKgQVeSj1yB6THB2GYbfaCk21Xc Aug 19 08:19:19.630158 sshd-session[6406]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:19:19.635225 systemd-logind[1706]: New session 10 of user core. Aug 19 08:19:19.641177 systemd[1]: Started session-10.scope - Session 10 of User core. 
Aug 19 08:19:19.680930 containerd[1722]: time="2025-08-19T08:19:19.680894850Z" level=info msg="TaskExit event in podsandbox handler container_id:\"460d48267778ad71add22b67856c2b6df04bfc3f085fac253ac1ccbdc04e9bd0\" id:\"814553a3d54015c8772b972a2e6cc8349beb0810a9a9df55b5cf09114720a959\" pid:6422 exited_at:{seconds:1755591559 nanos:680651605}" Aug 19 08:19:20.148727 sshd[6428]: Connection closed by 10.200.16.10 port 52886 Aug 19 08:19:20.149120 sshd-session[6406]: pam_unix(sshd:session): session closed for user core Aug 19 08:19:20.152462 systemd[1]: sshd@7-10.200.8.40:22-10.200.16.10:52886.service: Deactivated successfully. Aug 19 08:19:20.154186 systemd[1]: session-10.scope: Deactivated successfully. Aug 19 08:19:20.154953 systemd-logind[1706]: Session 10 logged out. Waiting for processes to exit. Aug 19 08:19:20.156000 systemd-logind[1706]: Removed session 10. Aug 19 08:19:25.265300 systemd[1]: Started sshd@8-10.200.8.40:22-10.200.16.10:36604.service - OpenSSH per-connection server daemon (10.200.16.10:36604). Aug 19 08:19:25.915199 sshd[6446]: Accepted publickey for core from 10.200.16.10 port 36604 ssh2: RSA SHA256:rwMkWSJkyuzYLtFMdKgQVeSj1yB6THB2GYbfaCk21Xc Aug 19 08:19:25.917151 sshd-session[6446]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:19:25.922482 systemd-logind[1706]: New session 11 of user core. Aug 19 08:19:25.930480 systemd[1]: Started session-11.scope - Session 11 of User core. Aug 19 08:19:26.496316 sshd[6449]: Connection closed by 10.200.16.10 port 36604 Aug 19 08:19:26.496938 sshd-session[6446]: pam_unix(sshd:session): session closed for user core Aug 19 08:19:26.500384 systemd[1]: sshd@8-10.200.8.40:22-10.200.16.10:36604.service: Deactivated successfully. Aug 19 08:19:26.502310 systemd[1]: session-11.scope: Deactivated successfully. Aug 19 08:19:26.503060 systemd-logind[1706]: Session 11 logged out. Waiting for processes to exit. Aug 19 08:19:26.504554 systemd-logind[1706]: Removed session 11. 
Aug 19 08:19:31.616117 systemd[1]: Started sshd@9-10.200.8.40:22-10.200.16.10:52188.service - OpenSSH per-connection server daemon (10.200.16.10:52188). Aug 19 08:19:32.254674 sshd[6461]: Accepted publickey for core from 10.200.16.10 port 52188 ssh2: RSA SHA256:rwMkWSJkyuzYLtFMdKgQVeSj1yB6THB2GYbfaCk21Xc Aug 19 08:19:32.255926 sshd-session[6461]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:19:32.260379 systemd-logind[1706]: New session 12 of user core. Aug 19 08:19:32.266183 systemd[1]: Started session-12.scope - Session 12 of User core. Aug 19 08:19:32.363489 containerd[1722]: time="2025-08-19T08:19:32.363448148Z" level=info msg="TaskExit event in podsandbox handler container_id:\"460d48267778ad71add22b67856c2b6df04bfc3f085fac253ac1ccbdc04e9bd0\" id:\"47e005c8981a414cc71200be4abfcefd9d449f2b89c16ccd920a15ddc3922eb9\" pid:6478 exited_at:{seconds:1755591572 nanos:363069677}" Aug 19 08:19:32.751851 sshd[6464]: Connection closed by 10.200.16.10 port 52188 Aug 19 08:19:32.752364 sshd-session[6461]: pam_unix(sshd:session): session closed for user core Aug 19 08:19:32.755760 systemd[1]: sshd@9-10.200.8.40:22-10.200.16.10:52188.service: Deactivated successfully. Aug 19 08:19:32.757666 systemd[1]: session-12.scope: Deactivated successfully. Aug 19 08:19:32.758620 systemd-logind[1706]: Session 12 logged out. Waiting for processes to exit. Aug 19 08:19:32.759740 systemd-logind[1706]: Removed session 12. Aug 19 08:19:32.873997 systemd[1]: Started sshd@10-10.200.8.40:22-10.200.16.10:52192.service - OpenSSH per-connection server daemon (10.200.16.10:52192). Aug 19 08:19:33.509969 sshd[6501]: Accepted publickey for core from 10.200.16.10 port 52192 ssh2: RSA SHA256:rwMkWSJkyuzYLtFMdKgQVeSj1yB6THB2GYbfaCk21Xc Aug 19 08:19:33.511175 sshd-session[6501]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:19:33.515141 systemd-logind[1706]: New session 13 of user core. 
Aug 19 08:19:33.519190 systemd[1]: Started session-13.scope - Session 13 of User core. Aug 19 08:19:34.053948 sshd[6504]: Connection closed by 10.200.16.10 port 52192 Aug 19 08:19:34.054545 sshd-session[6501]: pam_unix(sshd:session): session closed for user core Aug 19 08:19:34.057324 systemd[1]: sshd@10-10.200.8.40:22-10.200.16.10:52192.service: Deactivated successfully. Aug 19 08:19:34.059293 systemd[1]: session-13.scope: Deactivated successfully. Aug 19 08:19:34.061256 systemd-logind[1706]: Session 13 logged out. Waiting for processes to exit. Aug 19 08:19:34.062203 systemd-logind[1706]: Removed session 13. Aug 19 08:19:34.165947 systemd[1]: Started sshd@11-10.200.8.40:22-10.200.16.10:52202.service - OpenSSH per-connection server daemon (10.200.16.10:52202). Aug 19 08:19:34.801609 sshd[6514]: Accepted publickey for core from 10.200.16.10 port 52202 ssh2: RSA SHA256:rwMkWSJkyuzYLtFMdKgQVeSj1yB6THB2GYbfaCk21Xc Aug 19 08:19:34.802778 sshd-session[6514]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:19:34.807114 systemd-logind[1706]: New session 14 of user core. Aug 19 08:19:34.814189 systemd[1]: Started session-14.scope - Session 14 of User core. Aug 19 08:19:35.298904 sshd[6517]: Connection closed by 10.200.16.10 port 52202 Aug 19 08:19:35.299504 sshd-session[6514]: pam_unix(sshd:session): session closed for user core Aug 19 08:19:35.302342 systemd[1]: sshd@11-10.200.8.40:22-10.200.16.10:52202.service: Deactivated successfully. Aug 19 08:19:35.304167 systemd[1]: session-14.scope: Deactivated successfully. Aug 19 08:19:35.305522 systemd-logind[1706]: Session 14 logged out. Waiting for processes to exit. Aug 19 08:19:35.306568 systemd-logind[1706]: Removed session 14. 
Aug 19 08:19:36.739029 containerd[1722]: time="2025-08-19T08:19:36.738981038Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a3b4c3fb59c2bf61f5bd254bbdef7c5306b759c292237119aa5a823b515ed90\" id:\"ba06926d8a8fd332d613e1ca236680e8a9cc511164407cdd291540365ade13cc\" pid:6542 exited_at:{seconds:1755591576 nanos:738749344}" Aug 19 08:19:40.416022 systemd[1]: Started sshd@12-10.200.8.40:22-10.200.16.10:58432.service - OpenSSH per-connection server daemon (10.200.16.10:58432). Aug 19 08:19:41.049624 sshd[6558]: Accepted publickey for core from 10.200.16.10 port 58432 ssh2: RSA SHA256:rwMkWSJkyuzYLtFMdKgQVeSj1yB6THB2GYbfaCk21Xc Aug 19 08:19:41.050787 sshd-session[6558]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:19:41.055133 systemd-logind[1706]: New session 15 of user core. Aug 19 08:19:41.059171 systemd[1]: Started session-15.scope - Session 15 of User core. Aug 19 08:19:41.558862 sshd[6561]: Connection closed by 10.200.16.10 port 58432 Aug 19 08:19:41.559387 sshd-session[6558]: pam_unix(sshd:session): session closed for user core Aug 19 08:19:41.562200 systemd[1]: sshd@12-10.200.8.40:22-10.200.16.10:58432.service: Deactivated successfully. Aug 19 08:19:41.563803 systemd[1]: session-15.scope: Deactivated successfully. Aug 19 08:19:41.565027 systemd-logind[1706]: Session 15 logged out. Waiting for processes to exit. Aug 19 08:19:41.566125 systemd-logind[1706]: Removed session 15. Aug 19 08:19:44.716410 containerd[1722]: time="2025-08-19T08:19:44.716339392Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f599e7db3c65ad463341da373ac1987f482e9ab53da9f8c401d29d4bbe8336e0\" id:\"3dfa5fc592ae5ae2c89a6edf9622cb393b891b38b14e82801c9beed297bce629\" pid:6583 exited_at:{seconds:1755591584 nanos:716102529}" Aug 19 08:19:46.672825 systemd[1]: Started sshd@13-10.200.8.40:22-10.200.16.10:58440.service - OpenSSH per-connection server daemon (10.200.16.10:58440). 
Aug 19 08:19:47.316504 sshd[6593]: Accepted publickey for core from 10.200.16.10 port 58440 ssh2: RSA SHA256:rwMkWSJkyuzYLtFMdKgQVeSj1yB6THB2GYbfaCk21Xc
Aug 19 08:19:47.317676 sshd-session[6593]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 19 08:19:47.321653 systemd-logind[1706]: New session 16 of user core.
Aug 19 08:19:47.324192 systemd[1]: Started session-16.scope - Session 16 of User core.
Aug 19 08:19:47.818859 sshd[6596]: Connection closed by 10.200.16.10 port 58440
Aug 19 08:19:47.819408 sshd-session[6593]: pam_unix(sshd:session): session closed for user core
Aug 19 08:19:47.822631 systemd[1]: sshd@13-10.200.8.40:22-10.200.16.10:58440.service: Deactivated successfully.
Aug 19 08:19:47.824699 systemd[1]: session-16.scope: Deactivated successfully.
Aug 19 08:19:47.825571 systemd-logind[1706]: Session 16 logged out. Waiting for processes to exit.
Aug 19 08:19:47.826739 systemd-logind[1706]: Removed session 16.
Aug 19 08:19:52.937283 systemd[1]: Started sshd@14-10.200.8.40:22-10.200.16.10:51142.service - OpenSSH per-connection server daemon (10.200.16.10:51142).
Aug 19 08:19:53.576879 sshd[6608]: Accepted publickey for core from 10.200.16.10 port 51142 ssh2: RSA SHA256:rwMkWSJkyuzYLtFMdKgQVeSj1yB6THB2GYbfaCk21Xc
Aug 19 08:19:53.579578 sshd-session[6608]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 19 08:19:53.590096 systemd-logind[1706]: New session 17 of user core.
Aug 19 08:19:53.594205 systemd[1]: Started session-17.scope - Session 17 of User core.
Aug 19 08:19:54.167950 sshd[6611]: Connection closed by 10.200.16.10 port 51142
Aug 19 08:19:54.169542 sshd-session[6608]: pam_unix(sshd:session): session closed for user core
Aug 19 08:19:54.173605 systemd-logind[1706]: Session 17 logged out. Waiting for processes to exit.
Aug 19 08:19:54.175593 systemd[1]: sshd@14-10.200.8.40:22-10.200.16.10:51142.service: Deactivated successfully.
Aug 19 08:19:54.178878 systemd[1]: session-17.scope: Deactivated successfully.
Aug 19 08:19:54.181640 systemd-logind[1706]: Removed session 17.
Aug 19 08:19:54.283089 systemd[1]: Started sshd@15-10.200.8.40:22-10.200.16.10:51150.service - OpenSSH per-connection server daemon (10.200.16.10:51150).
Aug 19 08:19:54.933869 sshd[6623]: Accepted publickey for core from 10.200.16.10 port 51150 ssh2: RSA SHA256:rwMkWSJkyuzYLtFMdKgQVeSj1yB6THB2GYbfaCk21Xc
Aug 19 08:19:54.935419 sshd-session[6623]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 19 08:19:54.940088 systemd-logind[1706]: New session 18 of user core.
Aug 19 08:19:54.947213 systemd[1]: Started session-18.scope - Session 18 of User core.
Aug 19 08:19:55.533907 sshd[6628]: Connection closed by 10.200.16.10 port 51150
Aug 19 08:19:55.534541 sshd-session[6623]: pam_unix(sshd:session): session closed for user core
Aug 19 08:19:55.538064 systemd[1]: sshd@15-10.200.8.40:22-10.200.16.10:51150.service: Deactivated successfully.
Aug 19 08:19:55.539695 systemd[1]: session-18.scope: Deactivated successfully.
Aug 19 08:19:55.540524 systemd-logind[1706]: Session 18 logged out. Waiting for processes to exit.
Aug 19 08:19:55.541633 systemd-logind[1706]: Removed session 18.
Aug 19 08:19:55.656493 systemd[1]: Started sshd@16-10.200.8.40:22-10.200.16.10:51158.service - OpenSSH per-connection server daemon (10.200.16.10:51158).
Aug 19 08:19:56.291191 sshd[6638]: Accepted publickey for core from 10.200.16.10 port 51158 ssh2: RSA SHA256:rwMkWSJkyuzYLtFMdKgQVeSj1yB6THB2GYbfaCk21Xc
Aug 19 08:19:56.292345 sshd-session[6638]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 19 08:19:56.297963 systemd-logind[1706]: New session 19 of user core.
Aug 19 08:19:56.302195 systemd[1]: Started session-19.scope - Session 19 of User core.
Aug 19 08:19:58.236340 sshd[6641]: Connection closed by 10.200.16.10 port 51158
Aug 19 08:19:58.236649 sshd-session[6638]: pam_unix(sshd:session): session closed for user core
Aug 19 08:19:58.239585 systemd[1]: sshd@16-10.200.8.40:22-10.200.16.10:51158.service: Deactivated successfully.
Aug 19 08:19:58.241448 systemd[1]: session-19.scope: Deactivated successfully.
Aug 19 08:19:58.241637 systemd[1]: session-19.scope: Consumed 444ms CPU time, 76.1M memory peak.
Aug 19 08:19:58.243274 systemd-logind[1706]: Session 19 logged out. Waiting for processes to exit.
Aug 19 08:19:58.244348 systemd-logind[1706]: Removed session 19.
Aug 19 08:19:58.352521 systemd[1]: Started sshd@17-10.200.8.40:22-10.200.16.10:51166.service - OpenSSH per-connection server daemon (10.200.16.10:51166).
Aug 19 08:19:58.988489 sshd[6658]: Accepted publickey for core from 10.200.16.10 port 51166 ssh2: RSA SHA256:rwMkWSJkyuzYLtFMdKgQVeSj1yB6THB2GYbfaCk21Xc
Aug 19 08:19:58.989591 sshd-session[6658]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 19 08:19:58.993759 systemd-logind[1706]: New session 20 of user core.
Aug 19 08:19:58.998202 systemd[1]: Started session-20.scope - Session 20 of User core.
Aug 19 08:19:59.581725 sshd[6661]: Connection closed by 10.200.16.10 port 51166
Aug 19 08:19:59.583182 sshd-session[6658]: pam_unix(sshd:session): session closed for user core
Aug 19 08:19:59.586370 systemd[1]: sshd@17-10.200.8.40:22-10.200.16.10:51166.service: Deactivated successfully.
Aug 19 08:19:59.587971 systemd[1]: session-20.scope: Deactivated successfully.
Aug 19 08:19:59.588807 systemd-logind[1706]: Session 20 logged out. Waiting for processes to exit.
Aug 19 08:19:59.590008 systemd-logind[1706]: Removed session 20.
Aug 19 08:19:59.693974 systemd[1]: Started sshd@18-10.200.8.40:22-10.200.16.10:51168.service - OpenSSH per-connection server daemon (10.200.16.10:51168).
Aug 19 08:20:00.337581 sshd[6671]: Accepted publickey for core from 10.200.16.10 port 51168 ssh2: RSA SHA256:rwMkWSJkyuzYLtFMdKgQVeSj1yB6THB2GYbfaCk21Xc
Aug 19 08:20:00.343453 sshd-session[6671]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 19 08:20:00.347671 systemd-logind[1706]: New session 21 of user core.
Aug 19 08:20:00.350181 systemd[1]: Started session-21.scope - Session 21 of User core.
Aug 19 08:20:00.838976 sshd[6674]: Connection closed by 10.200.16.10 port 51168
Aug 19 08:20:00.839525 sshd-session[6671]: pam_unix(sshd:session): session closed for user core
Aug 19 08:20:00.842709 systemd[1]: sshd@18-10.200.8.40:22-10.200.16.10:51168.service: Deactivated successfully.
Aug 19 08:20:00.844645 systemd[1]: session-21.scope: Deactivated successfully.
Aug 19 08:20:00.845649 systemd-logind[1706]: Session 21 logged out. Waiting for processes to exit.
Aug 19 08:20:00.846945 systemd-logind[1706]: Removed session 21.
Aug 19 08:20:02.364483 containerd[1722]: time="2025-08-19T08:20:02.364439045Z" level=info msg="TaskExit event in podsandbox handler container_id:\"460d48267778ad71add22b67856c2b6df04bfc3f085fac253ac1ccbdc04e9bd0\" id:\"fdfe4b8092d0c244185f6ccb550592352cfcbd5f4164cc02cfb06e0595902270\" pid:6700 exited_at:{seconds:1755591602 nanos:364161654}"
Aug 19 08:20:05.956518 systemd[1]: Started sshd@19-10.200.8.40:22-10.200.16.10:40342.service - OpenSSH per-connection server daemon (10.200.16.10:40342).
Aug 19 08:20:06.591171 sshd[6715]: Accepted publickey for core from 10.200.16.10 port 40342 ssh2: RSA SHA256:rwMkWSJkyuzYLtFMdKgQVeSj1yB6THB2GYbfaCk21Xc
Aug 19 08:20:06.592367 sshd-session[6715]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 19 08:20:06.596210 systemd-logind[1706]: New session 22 of user core.
Aug 19 08:20:06.601182 systemd[1]: Started session-22.scope - Session 22 of User core.
Aug 19 08:20:06.738335 containerd[1722]: time="2025-08-19T08:20:06.738288355Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a3b4c3fb59c2bf61f5bd254bbdef7c5306b759c292237119aa5a823b515ed90\" id:\"be91616ba1e8e104cb46be75fe3998a561c4e0e3487ca987b33fdf129e719639\" pid:6733 exited_at:{seconds:1755591606 nanos:738078840}"
Aug 19 08:20:07.096528 sshd[6720]: Connection closed by 10.200.16.10 port 40342
Aug 19 08:20:07.097162 sshd-session[6715]: pam_unix(sshd:session): session closed for user core
Aug 19 08:20:07.100871 systemd[1]: sshd@19-10.200.8.40:22-10.200.16.10:40342.service: Deactivated successfully.
Aug 19 08:20:07.102809 systemd[1]: session-22.scope: Deactivated successfully.
Aug 19 08:20:07.103568 systemd-logind[1706]: Session 22 logged out. Waiting for processes to exit.
Aug 19 08:20:07.105065 systemd-logind[1706]: Removed session 22.
Aug 19 08:20:10.217625 containerd[1722]: time="2025-08-19T08:20:10.217582005Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f599e7db3c65ad463341da373ac1987f482e9ab53da9f8c401d29d4bbe8336e0\" id:\"2994e73f2e60b18ee8afc185ae2dd824619001ba61bc1374754000cec41cc199\" pid:6767 exited_at:{seconds:1755591610 nanos:217391838}"
Aug 19 08:20:12.223480 systemd[1]: Started sshd@20-10.200.8.40:22-10.200.16.10:39048.service - OpenSSH per-connection server daemon (10.200.16.10:39048).
Aug 19 08:20:12.865183 sshd[6778]: Accepted publickey for core from 10.200.16.10 port 39048 ssh2: RSA SHA256:rwMkWSJkyuzYLtFMdKgQVeSj1yB6THB2GYbfaCk21Xc
Aug 19 08:20:12.867391 sshd-session[6778]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 19 08:20:12.872087 systemd-logind[1706]: New session 23 of user core.
Aug 19 08:20:12.879364 systemd[1]: Started session-23.scope - Session 23 of User core.
Aug 19 08:20:13.409895 sshd[6781]: Connection closed by 10.200.16.10 port 39048
Aug 19 08:20:13.413221 sshd-session[6778]: pam_unix(sshd:session): session closed for user core
Aug 19 08:20:13.416716 systemd-logind[1706]: Session 23 logged out. Waiting for processes to exit.
Aug 19 08:20:13.418801 systemd[1]: sshd@20-10.200.8.40:22-10.200.16.10:39048.service: Deactivated successfully.
Aug 19 08:20:13.424632 systemd[1]: session-23.scope: Deactivated successfully.
Aug 19 08:20:13.427526 systemd-logind[1706]: Removed session 23.
Aug 19 08:20:14.740960 containerd[1722]: time="2025-08-19T08:20:14.740914185Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f599e7db3c65ad463341da373ac1987f482e9ab53da9f8c401d29d4bbe8336e0\" id:\"8a2a606667ac1551295a9b66b7b2835ff59d6d30b06e3bc04050092384f8a406\" pid:6804 exited_at:{seconds:1755591614 nanos:740505310}"
Aug 19 08:20:18.524003 systemd[1]: Started sshd@21-10.200.8.40:22-10.200.16.10:39052.service - OpenSSH per-connection server daemon (10.200.16.10:39052).
Aug 19 08:20:19.159847 sshd[6815]: Accepted publickey for core from 10.200.16.10 port 39052 ssh2: RSA SHA256:rwMkWSJkyuzYLtFMdKgQVeSj1yB6THB2GYbfaCk21Xc
Aug 19 08:20:19.161008 sshd-session[6815]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 19 08:20:19.164923 systemd-logind[1706]: New session 24 of user core.
Aug 19 08:20:19.170202 systemd[1]: Started session-24.scope - Session 24 of User core.
Aug 19 08:20:19.652764 sshd[6818]: Connection closed by 10.200.16.10 port 39052
Aug 19 08:20:19.654620 sshd-session[6815]: pam_unix(sshd:session): session closed for user core
Aug 19 08:20:19.658363 systemd-logind[1706]: Session 24 logged out. Waiting for processes to exit.
Aug 19 08:20:19.659530 systemd[1]: sshd@21-10.200.8.40:22-10.200.16.10:39052.service: Deactivated successfully.
Aug 19 08:20:19.662423 systemd[1]: session-24.scope: Deactivated successfully.
Aug 19 08:20:19.665017 systemd-logind[1706]: Removed session 24.
Aug 19 08:20:19.676316 containerd[1722]: time="2025-08-19T08:20:19.676283124Z" level=info msg="TaskExit event in podsandbox handler container_id:\"460d48267778ad71add22b67856c2b6df04bfc3f085fac253ac1ccbdc04e9bd0\" id:\"d83811136d6f845ad75eb292d89ea2675ce808a77dca2924d48bb57180a83ad7\" pid:6839 exited_at:{seconds:1755591619 nanos:675983041}"
Aug 19 08:20:24.780289 systemd[1]: Started sshd@22-10.200.8.40:22-10.200.16.10:35724.service - OpenSSH per-connection server daemon (10.200.16.10:35724).
Aug 19 08:20:25.425764 sshd[6861]: Accepted publickey for core from 10.200.16.10 port 35724 ssh2: RSA SHA256:rwMkWSJkyuzYLtFMdKgQVeSj1yB6THB2GYbfaCk21Xc
Aug 19 08:20:25.426952 sshd-session[6861]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 19 08:20:25.430651 systemd-logind[1706]: New session 25 of user core.
Aug 19 08:20:25.436191 systemd[1]: Started session-25.scope - Session 25 of User core.
Aug 19 08:20:25.956068 sshd[6864]: Connection closed by 10.200.16.10 port 35724
Aug 19 08:20:25.956713 sshd-session[6861]: pam_unix(sshd:session): session closed for user core
Aug 19 08:20:25.960791 systemd[1]: sshd@22-10.200.8.40:22-10.200.16.10:35724.service: Deactivated successfully.
Aug 19 08:20:25.964832 systemd[1]: session-25.scope: Deactivated successfully.
Aug 19 08:20:25.967678 systemd-logind[1706]: Session 25 logged out. Waiting for processes to exit.
Aug 19 08:20:25.969932 systemd-logind[1706]: Removed session 25.
Aug 19 08:20:31.073330 systemd[1]: Started sshd@23-10.200.8.40:22-10.200.16.10:55002.service - OpenSSH per-connection server daemon (10.200.16.10:55002).
Aug 19 08:20:31.714556 sshd[6876]: Accepted publickey for core from 10.200.16.10 port 55002 ssh2: RSA SHA256:rwMkWSJkyuzYLtFMdKgQVeSj1yB6THB2GYbfaCk21Xc
Aug 19 08:20:31.715756 sshd-session[6876]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 19 08:20:31.720256 systemd-logind[1706]: New session 26 of user core.
Aug 19 08:20:31.725199 systemd[1]: Started session-26.scope - Session 26 of User core.
Aug 19 08:20:32.281952 sshd[6879]: Connection closed by 10.200.16.10 port 55002
Aug 19 08:20:32.282527 sshd-session[6876]: pam_unix(sshd:session): session closed for user core
Aug 19 08:20:32.285690 systemd[1]: sshd@23-10.200.8.40:22-10.200.16.10:55002.service: Deactivated successfully.
Aug 19 08:20:32.287269 systemd[1]: session-26.scope: Deactivated successfully.
Aug 19 08:20:32.287920 systemd-logind[1706]: Session 26 logged out. Waiting for processes to exit.
Aug 19 08:20:32.289033 systemd-logind[1706]: Removed session 26.
Aug 19 08:20:32.364675 containerd[1722]: time="2025-08-19T08:20:32.364616466Z" level=info msg="TaskExit event in podsandbox handler container_id:\"460d48267778ad71add22b67856c2b6df04bfc3f085fac253ac1ccbdc04e9bd0\" id:\"a20a9f3d48b2361e9d1cf6c7a489db6230e93f35c5c4167b3c7dc3eac3345bba\" pid:6903 exited_at:{seconds:1755591632 nanos:364346060}"
Aug 19 08:20:36.738594 containerd[1722]: time="2025-08-19T08:20:36.738549393Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a3b4c3fb59c2bf61f5bd254bbdef7c5306b759c292237119aa5a823b515ed90\" id:\"73b7e93c74299dea66e4fbffbc08db76204b18a52b9016adf602e5aa4cb88edf\" pid:6927 exited_at:{seconds:1755591636 nanos:738289147}"