Sep 12 22:59:51.916414 kernel: Linux version 6.12.47-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Fri Sep 12 20:38:35 -00 2025
Sep 12 22:59:51.916435 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlyprintk=ttyS0,115200 flatcar.first_boot=detected flatcar.oem.id=azure flatcar.autologin verity.usrhash=8e60d6befc710e967d67e9a1d87ced7416895090c99a765b3a00e66a62f49e40
Sep 12 22:59:51.916452 kernel: BIOS-provided physical RAM map:
Sep 12 22:59:51.916459 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable
Sep 12 22:59:51.916465 kernel: BIOS-e820: [mem 0x00000000000c0000-0x00000000000fffff] reserved
Sep 12 22:59:51.916471 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000044fdfff] usable
Sep 12 22:59:51.916477 kernel: BIOS-e820: [mem 0x00000000044fe000-0x00000000048fdfff] reserved
Sep 12 22:59:51.916484 kernel: BIOS-e820: [mem 0x00000000048fe000-0x000000003ff1efff] usable
Sep 12 22:59:51.916490 kernel: BIOS-e820: [mem 0x000000003ff1f000-0x000000003ffc8fff] reserved
Sep 12 22:59:51.916496 kernel: BIOS-e820: [mem 0x000000003ffc9000-0x000000003fffafff] ACPI data
Sep 12 22:59:51.916501 kernel: BIOS-e820: [mem 0x000000003fffb000-0x000000003fffefff] ACPI NVS
Sep 12 22:59:51.916507 kernel: BIOS-e820: [mem 0x000000003ffff000-0x000000003fffffff] usable
Sep 12 22:59:51.916513 kernel: BIOS-e820: [mem 0x0000000100000000-0x00000002bfffffff] usable
Sep 12 22:59:51.916520 kernel: printk: legacy bootconsole [earlyser0] enabled
Sep 12 22:59:51.916531 kernel: NX (Execute Disable) protection: active
Sep 12 22:59:51.916539 kernel: APIC: Static calls initialized
Sep 12 22:59:51.916546 kernel: efi: EFI v2.7 by Microsoft
Sep 12 22:59:51.916553 kernel: efi: ACPI=0x3fffa000 ACPI 2.0=0x3fffa014 SMBIOS=0x3ff88000 SMBIOS 3.0=0x3ff86000 MEMATTR=0x3ead5518 RNG=0x3ffd2018
Sep 12 22:59:51.916562 kernel: random: crng init done
Sep 12 22:59:51.916568 kernel: secureboot: Secure boot disabled
Sep 12 22:59:51.916574 kernel: SMBIOS 3.1.0 present.
Sep 12 22:59:51.916581 kernel: DMI: Microsoft Corporation Virtual Machine/Virtual Machine, BIOS Hyper-V UEFI Release v4.1 01/28/2025
Sep 12 22:59:51.916587 kernel: DMI: Memory slots populated: 2/2
Sep 12 22:59:51.916595 kernel: Hypervisor detected: Microsoft Hyper-V
Sep 12 22:59:51.916602 kernel: Hyper-V: privilege flags low 0xae7f, high 0x3b8030, hints 0x9e4e24, misc 0xe0bed7b2
Sep 12 22:59:51.916608 kernel: Hyper-V: Nested features: 0x3e0101
Sep 12 22:59:51.916614 kernel: Hyper-V: LAPIC Timer Frequency: 0x30d40
Sep 12 22:59:51.916621 kernel: Hyper-V: Using hypercall for remote TLB flush
Sep 12 22:59:51.916629 kernel: clocksource: hyperv_clocksource_tsc_page: mask: 0xffffffffffffffff max_cycles: 0x24e6a1710, max_idle_ns: 440795202120 ns
Sep 12 22:59:51.916636 kernel: clocksource: hyperv_clocksource_msr: mask: 0xffffffffffffffff max_cycles: 0x24e6a1710, max_idle_ns: 440795202120 ns
Sep 12 22:59:51.916642 kernel: tsc: Detected 2299.999 MHz processor
Sep 12 22:59:51.916649 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Sep 12 22:59:51.916658 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Sep 12 22:59:51.916666 kernel: last_pfn = 0x2c0000 max_arch_pfn = 0x10000000000
Sep 12 22:59:51.916679 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs
Sep 12 22:59:51.916686 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Sep 12 22:59:51.916693 kernel: e820: update [mem 0x48000000-0xffffffff] usable ==> reserved
Sep 12 22:59:51.916700 kernel: last_pfn = 0x40000 max_arch_pfn = 0x10000000000
Sep 12 22:59:51.916708 kernel: Using GB pages for direct mapping
Sep 12 22:59:51.916715 kernel: ACPI: Early table checksum verification disabled
Sep 12 22:59:51.916728 kernel: ACPI: RSDP 0x000000003FFFA014 000024 (v02 VRTUAL)
Sep 12 22:59:51.916738 kernel: ACPI: XSDT 0x000000003FFF90E8 00005C (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 12 22:59:51.916745 kernel: ACPI: FACP 0x000000003FFF8000 000114 (v06 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 12 22:59:51.916753 kernel: ACPI: DSDT 0x000000003FFD6000 01E27A (v02 MSFTVM DSDT01 00000001 INTL 20230628)
Sep 12 22:59:51.916760 kernel: ACPI: FACS 0x000000003FFFE000 000040
Sep 12 22:59:51.916768 kernel: ACPI: OEM0 0x000000003FFF7000 000064 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 12 22:59:51.916774 kernel: ACPI: SPCR 0x000000003FFF6000 000050 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 12 22:59:51.916784 kernel: ACPI: WAET 0x000000003FFF5000 000028 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 12 22:59:51.916792 kernel: ACPI: APIC 0x000000003FFD5000 000052 (v05 HVLITE HVLITETB 00000000 MSHV 00000000)
Sep 12 22:59:51.916799 kernel: ACPI: SRAT 0x000000003FFD4000 0000A0 (v03 HVLITE HVLITETB 00000000 MSHV 00000000)
Sep 12 22:59:51.916806 kernel: ACPI: BGRT 0x000000003FFD3000 000038 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 12 22:59:51.916815 kernel: ACPI: Reserving FACP table memory at [mem 0x3fff8000-0x3fff8113]
Sep 12 22:59:51.916822 kernel: ACPI: Reserving DSDT table memory at [mem 0x3ffd6000-0x3fff4279]
Sep 12 22:59:51.916830 kernel: ACPI: Reserving FACS table memory at [mem 0x3fffe000-0x3fffe03f]
Sep 12 22:59:51.916838 kernel: ACPI: Reserving OEM0 table memory at [mem 0x3fff7000-0x3fff7063]
Sep 12 22:59:51.916847 kernel: ACPI: Reserving SPCR table memory at [mem 0x3fff6000-0x3fff604f]
Sep 12 22:59:51.916857 kernel: ACPI: Reserving WAET table memory at [mem 0x3fff5000-0x3fff5027]
Sep 12 22:59:51.916865 kernel: ACPI: Reserving APIC table memory at [mem 0x3ffd5000-0x3ffd5051]
Sep 12 22:59:51.916872 kernel: ACPI: Reserving SRAT table memory at [mem 0x3ffd4000-0x3ffd409f]
Sep 12 22:59:51.916879 kernel: ACPI: Reserving BGRT table memory at [mem 0x3ffd3000-0x3ffd3037]
Sep 12 22:59:51.916887 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x3fffffff]
Sep 12 22:59:51.916894 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x2bfffffff]
Sep 12 22:59:51.916902 kernel: NUMA: Node 0 [mem 0x00001000-0x3fffffff] + [mem 0x100000000-0x2bfffffff] -> [mem 0x00001000-0x2bfffffff]
Sep 12 22:59:51.916909 kernel: NODE_DATA(0) allocated [mem 0x2bfff8dc0-0x2bfffffff]
Sep 12 22:59:51.916916 kernel: Zone ranges:
Sep 12 22:59:51.916926 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Sep 12 22:59:51.916934 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff]
Sep 12 22:59:51.916941 kernel: Normal [mem 0x0000000100000000-0x00000002bfffffff]
Sep 12 22:59:51.916949 kernel: Device empty
Sep 12 22:59:51.916956 kernel: Movable zone start for each node
Sep 12 22:59:51.916963 kernel: Early memory node ranges
Sep 12 22:59:51.916971 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff]
Sep 12 22:59:51.916979 kernel: node 0: [mem 0x0000000000100000-0x00000000044fdfff]
Sep 12 22:59:51.916985 kernel: node 0: [mem 0x00000000048fe000-0x000000003ff1efff]
Sep 12 22:59:51.916999 kernel: node 0: [mem 0x000000003ffff000-0x000000003fffffff]
Sep 12 22:59:51.917005 kernel: node 0: [mem 0x0000000100000000-0x00000002bfffffff]
Sep 12 22:59:51.917012 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x00000002bfffffff]
Sep 12 22:59:51.917018 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Sep 12 22:59:51.917025 kernel: On node 0, zone DMA: 96 pages in unavailable ranges
Sep 12 22:59:51.917032 kernel: On node 0, zone DMA32: 1024 pages in unavailable ranges
Sep 12 22:59:51.917039 kernel: On node 0, zone DMA32: 224 pages in unavailable ranges
Sep 12 22:59:51.917045 kernel: ACPI: PM-Timer IO Port: 0x408
Sep 12 22:59:51.917052 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Sep 12 22:59:51.917061 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Sep 12 22:59:51.917068 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Sep 12 22:59:51.917075 kernel: ACPI: SPCR: console: uart,io,0x3f8,115200
Sep 12 22:59:51.917081 kernel: TSC deadline timer available
Sep 12 22:59:51.917088 kernel: CPU topo: Max. logical packages: 1
Sep 12 22:59:51.917095 kernel: CPU topo: Max. logical dies: 1
Sep 12 22:59:51.917102 kernel: CPU topo: Max. dies per package: 1
Sep 12 22:59:51.917108 kernel: CPU topo: Max. threads per core: 2
Sep 12 22:59:51.917115 kernel: CPU topo: Num. cores per package: 1
Sep 12 22:59:51.917124 kernel: CPU topo: Num. threads per package: 2
Sep 12 22:59:51.917130 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs
Sep 12 22:59:51.917137 kernel: [mem 0x40000000-0xffffffff] available for PCI devices
Sep 12 22:59:51.917144 kernel: Booting paravirtualized kernel on Hyper-V
Sep 12 22:59:51.917151 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Sep 12 22:59:51.917158 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Sep 12 22:59:51.917164 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576
Sep 12 22:59:51.917171 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152
Sep 12 22:59:51.917178 kernel: pcpu-alloc: [0] 0 1
Sep 12 22:59:51.917186 kernel: Hyper-V: PV spinlocks enabled
Sep 12 22:59:51.917193 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Sep 12 22:59:51.917202 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlyprintk=ttyS0,115200 flatcar.first_boot=detected flatcar.oem.id=azure flatcar.autologin verity.usrhash=8e60d6befc710e967d67e9a1d87ced7416895090c99a765b3a00e66a62f49e40
Sep 12 22:59:51.917210 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 12 22:59:51.917217 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Sep 12 22:59:51.917224 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Sep 12 22:59:51.917231 kernel: Fallback order for Node 0: 0
Sep 12 22:59:51.917238 kernel: Built 1 zonelists, mobility grouping on. Total pages: 2095807
Sep 12 22:59:51.917246 kernel: Policy zone: Normal
Sep 12 22:59:51.917253 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 12 22:59:51.917260 kernel: software IO TLB: area num 2.
Sep 12 22:59:51.917267 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Sep 12 22:59:51.917274 kernel: ftrace: allocating 40125 entries in 157 pages
Sep 12 22:59:51.917281 kernel: ftrace: allocated 157 pages with 5 groups
Sep 12 22:59:51.917288 kernel: Dynamic Preempt: voluntary
Sep 12 22:59:51.917295 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 12 22:59:51.917304 kernel: rcu: RCU event tracing is enabled.
Sep 12 22:59:51.917317 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Sep 12 22:59:51.917325 kernel: Trampoline variant of Tasks RCU enabled.
Sep 12 22:59:51.917333 kernel: Rude variant of Tasks RCU enabled.
Sep 12 22:59:51.917342 kernel: Tracing variant of Tasks RCU enabled.
Sep 12 22:59:51.917349 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 12 22:59:51.917357 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Sep 12 22:59:51.917365 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 12 22:59:51.917373 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 12 22:59:51.917380 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 12 22:59:51.917388 kernel: Using NULL legacy PIC
Sep 12 22:59:51.917397 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 0
Sep 12 22:59:51.917405 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 12 22:59:51.917413 kernel: Console: colour dummy device 80x25
Sep 12 22:59:51.917420 kernel: printk: legacy console [tty1] enabled
Sep 12 22:59:51.917428 kernel: printk: legacy console [ttyS0] enabled
Sep 12 22:59:51.917436 kernel: printk: legacy bootconsole [earlyser0] disabled
Sep 12 22:59:51.917457 kernel: ACPI: Core revision 20240827
Sep 12 22:59:51.917466 kernel: Failed to register legacy timer interrupt
Sep 12 22:59:51.917478 kernel: APIC: Switch to symmetric I/O mode setup
Sep 12 22:59:51.917485 kernel: x2apic enabled
Sep 12 22:59:51.917491 kernel: APIC: Switched APIC routing to: physical x2apic
Sep 12 22:59:51.917498 kernel: Hyper-V: Host Build 10.0.26100.1293-1-0
Sep 12 22:59:51.917504 kernel: Hyper-V: enabling crash_kexec_post_notifiers
Sep 12 22:59:51.917510 kernel: Hyper-V: Disabling IBT because of Hyper-V bug
Sep 12 22:59:51.917516 kernel: Hyper-V: Using IPI hypercalls
Sep 12 22:59:51.917523 kernel: APIC: send_IPI() replaced with hv_send_ipi()
Sep 12 22:59:51.917531 kernel: APIC: send_IPI_mask() replaced with hv_send_ipi_mask()
Sep 12 22:59:51.917537 kernel: APIC: send_IPI_mask_allbutself() replaced with hv_send_ipi_mask_allbutself()
Sep 12 22:59:51.917544 kernel: APIC: send_IPI_allbutself() replaced with hv_send_ipi_allbutself()
Sep 12 22:59:51.917552 kernel: APIC: send_IPI_all() replaced with hv_send_ipi_all()
Sep 12 22:59:51.917559 kernel: APIC: send_IPI_self() replaced with hv_send_ipi_self()
Sep 12 22:59:51.917566 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x2127345424d, max_idle_ns: 440795318347 ns
Sep 12 22:59:51.917573 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 4599.99 BogoMIPS (lpj=2299999)
Sep 12 22:59:51.917581 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Sep 12 22:59:51.917587 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0
Sep 12 22:59:51.917595 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0
Sep 12 22:59:51.917602 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Sep 12 22:59:51.917609 kernel: Spectre V2 : Mitigation: Retpolines
Sep 12 22:59:51.917615 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Sep 12 22:59:51.917622 kernel: RETBleed: WARNING: Spectre v2 mitigation leaves CPU vulnerable to RETBleed attacks, data leaks possible!
Sep 12 22:59:51.917629 kernel: RETBleed: Vulnerable
Sep 12 22:59:51.917636 kernel: Speculative Store Bypass: Vulnerable
Sep 12 22:59:51.917642 kernel: active return thunk: its_return_thunk
Sep 12 22:59:51.917649 kernel: ITS: Mitigation: Aligned branch/return thunks
Sep 12 22:59:51.917656 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Sep 12 22:59:51.917663 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Sep 12 22:59:51.917670 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Sep 12 22:59:51.917676 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask'
Sep 12 22:59:51.917682 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256'
Sep 12 22:59:51.917689 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256'
Sep 12 22:59:51.917695 kernel: x86/fpu: Supporting XSAVE feature 0x800: 'Control-flow User registers'
Sep 12 22:59:51.917701 kernel: x86/fpu: Supporting XSAVE feature 0x20000: 'AMX Tile config'
Sep 12 22:59:51.917707 kernel: x86/fpu: Supporting XSAVE feature 0x40000: 'AMX Tile data'
Sep 12 22:59:51.917713 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Sep 12 22:59:51.917720 kernel: x86/fpu: xstate_offset[5]: 832, xstate_sizes[5]: 64
Sep 12 22:59:51.917726 kernel: x86/fpu: xstate_offset[6]: 896, xstate_sizes[6]: 512
Sep 12 22:59:51.917733 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 1024
Sep 12 22:59:51.917740 kernel: x86/fpu: xstate_offset[11]: 2432, xstate_sizes[11]: 16
Sep 12 22:59:51.917746 kernel: x86/fpu: xstate_offset[17]: 2496, xstate_sizes[17]: 64
Sep 12 22:59:51.917753 kernel: x86/fpu: xstate_offset[18]: 2560, xstate_sizes[18]: 8192
Sep 12 22:59:51.917760 kernel: x86/fpu: Enabled xstate features 0x608e7, context size is 10752 bytes, using 'compacted' format.
Sep 12 22:59:51.917766 kernel: Freeing SMP alternatives memory: 32K
Sep 12 22:59:51.917773 kernel: pid_max: default: 32768 minimum: 301
Sep 12 22:59:51.917780 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Sep 12 22:59:51.917787 kernel: landlock: Up and running.
Sep 12 22:59:51.917794 kernel: SELinux: Initializing.
Sep 12 22:59:51.917801 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Sep 12 22:59:51.917807 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Sep 12 22:59:51.917815 kernel: smpboot: CPU0: Intel INTEL(R) XEON(R) PLATINUM 8573C (family: 0x6, model: 0xcf, stepping: 0x2)
Sep 12 22:59:51.917822 kernel: Performance Events: unsupported p6 CPU model 207 no PMU driver, software events only.
Sep 12 22:59:51.917829 kernel: signal: max sigframe size: 11952
Sep 12 22:59:51.917836 kernel: rcu: Hierarchical SRCU implementation.
Sep 12 22:59:51.917844 kernel: rcu: Max phase no-delay instances is 400.
Sep 12 22:59:51.917852 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Sep 12 22:59:51.917859 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Sep 12 22:59:51.917866 kernel: smp: Bringing up secondary CPUs ...
Sep 12 22:59:51.917873 kernel: smpboot: x86: Booting SMP configuration:
Sep 12 22:59:51.917881 kernel: .... node #0, CPUs: #1
Sep 12 22:59:51.917888 kernel: smp: Brought up 1 node, 2 CPUs
Sep 12 22:59:51.917895 kernel: smpboot: Total of 2 processors activated (9199.99 BogoMIPS)
Sep 12 22:59:51.917903 kernel: Memory: 8077032K/8383228K available (14336K kernel code, 2432K rwdata, 9992K rodata, 54084K init, 2880K bss, 299988K reserved, 0K cma-reserved)
Sep 12 22:59:51.917910 kernel: devtmpfs: initialized
Sep 12 22:59:51.917918 kernel: x86/mm: Memory block size: 128MB
Sep 12 22:59:51.917925 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x3fffb000-0x3fffefff] (16384 bytes)
Sep 12 22:59:51.917932 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 12 22:59:51.917939 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Sep 12 22:59:51.917948 kernel: pinctrl core: initialized pinctrl subsystem
Sep 12 22:59:51.917955 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 12 22:59:51.917962 kernel: audit: initializing netlink subsys (disabled)
Sep 12 22:59:51.917969 kernel: audit: type=2000 audit(1757717988.028:1): state=initialized audit_enabled=0 res=1
Sep 12 22:59:51.917977 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 12 22:59:51.917983 kernel: thermal_sys: Registered thermal governor 'user_space'
Sep 12 22:59:51.917990 kernel: cpuidle: using governor menu
Sep 12 22:59:51.917998 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 12 22:59:51.918005 kernel: dca service started, version 1.12.1
Sep 12 22:59:51.918013 kernel: e820: reserve RAM buffer [mem 0x044fe000-0x07ffffff]
Sep 12 22:59:51.918020 kernel: e820: reserve RAM buffer [mem 0x3ff1f000-0x3fffffff]
Sep 12 22:59:51.918027 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Sep 12 22:59:51.918034 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 12 22:59:51.918042 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Sep 12 22:59:51.918049 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 12 22:59:51.918056 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Sep 12 22:59:51.918063 kernel: ACPI: Added _OSI(Module Device)
Sep 12 22:59:51.918070 kernel: ACPI: Added _OSI(Processor Device)
Sep 12 22:59:51.918079 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 12 22:59:51.918086 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 12 22:59:51.918093 kernel: ACPI: Interpreter enabled
Sep 12 22:59:51.918100 kernel: ACPI: PM: (supports S0 S5)
Sep 12 22:59:51.918107 kernel: ACPI: Using IOAPIC for interrupt routing
Sep 12 22:59:51.918114 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Sep 12 22:59:51.918121 kernel: PCI: Ignoring E820 reservations for host bridge windows
Sep 12 22:59:51.918129 kernel: ACPI: Enabled 1 GPEs in block 00 to 0F
Sep 12 22:59:51.918136 kernel: iommu: Default domain type: Translated
Sep 12 22:59:51.918144 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Sep 12 22:59:51.918151 kernel: efivars: Registered efivars operations
Sep 12 22:59:51.918158 kernel: PCI: Using ACPI for IRQ routing
Sep 12 22:59:51.918166 kernel: PCI: System does not support PCI
Sep 12 22:59:51.918172 kernel: vgaarb: loaded
Sep 12 22:59:51.918180 kernel: clocksource: Switched to clocksource tsc-early
Sep 12 22:59:51.918187 kernel: VFS: Disk quotas dquot_6.6.0
Sep 12 22:59:51.918194 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 12 22:59:51.918201 kernel: pnp: PnP ACPI init
Sep 12 22:59:51.918209 kernel: pnp: PnP ACPI: found 3 devices
Sep 12 22:59:51.918217 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Sep 12 22:59:51.918224 kernel: NET: Registered PF_INET protocol family
Sep 12 22:59:51.918231 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Sep 12 22:59:51.918239 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Sep 12 22:59:51.918245 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 12 22:59:51.918253 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Sep 12 22:59:51.918260 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear)
Sep 12 22:59:51.918267 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Sep 12 22:59:51.918275 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Sep 12 22:59:51.918283 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Sep 12 22:59:51.918290 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 12 22:59:51.918297 kernel: NET: Registered PF_XDP protocol family
Sep 12 22:59:51.918304 kernel: PCI: CLS 0 bytes, default 64
Sep 12 22:59:51.918311 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Sep 12 22:59:51.918318 kernel: software IO TLB: mapped [mem 0x000000003a9d3000-0x000000003e9d3000] (64MB)
Sep 12 22:59:51.918325 kernel: RAPL PMU: API unit is 2^-32 Joules, 1 fixed counters, 10737418240 ms ovfl timer
Sep 12 22:59:51.918333 kernel: RAPL PMU: hw unit of domain psys 2^-0 Joules
Sep 12 22:59:51.918341 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x2127345424d, max_idle_ns: 440795318347 ns
Sep 12 22:59:51.918348 kernel: clocksource: Switched to clocksource tsc
Sep 12 22:59:51.918355 kernel: Initialise system trusted keyrings
Sep 12 22:59:51.918362 kernel: workingset: timestamp_bits=39 max_order=21 bucket_order=0
Sep 12 22:59:51.918369 kernel: Key type asymmetric registered
Sep 12 22:59:51.918376 kernel: Asymmetric key parser 'x509' registered
Sep 12 22:59:51.918383 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
Sep 12 22:59:51.918390 kernel: io scheduler mq-deadline registered
Sep 12 22:59:51.918397 kernel: io scheduler kyber registered
Sep 12 22:59:51.918406 kernel: io scheduler bfq registered
Sep 12 22:59:51.918413 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Sep 12 22:59:51.918420 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Sep 12 22:59:51.918427 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Sep 12 22:59:51.918435 kernel: 00:01: ttyS1 at I/O 0x2f8 (irq = 3, base_baud = 115200) is a 16550A
Sep 12 22:59:51.918442 kernel: serial8250: ttyS2 at I/O 0x3e8 (irq = 4, base_baud = 115200) is a 16550A
Sep 12 22:59:51.919341 kernel: i8042: PNP: No PS/2 controller found.
Sep 12 22:59:51.919477 kernel: rtc_cmos 00:02: registered as rtc0
Sep 12 22:59:51.919548 kernel: rtc_cmos 00:02: setting system clock to 2025-09-12T22:59:51 UTC (1757717991)
Sep 12 22:59:51.919608 kernel: rtc_cmos 00:02: alarms up to one month, 114 bytes nvram
Sep 12 22:59:51.919617 kernel: intel_pstate: Intel P-state driver initializing
Sep 12 22:59:51.919624 kernel: efifb: probing for efifb
Sep 12 22:59:51.919632 kernel: efifb: framebuffer at 0x40000000, using 3072k, total 3072k
Sep 12 22:59:51.919639 kernel: efifb: mode is 1024x768x32, linelength=4096, pages=1
Sep 12 22:59:51.919646 kernel: efifb: scrolling: redraw
Sep 12 22:59:51.919653 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0
Sep 12 22:59:51.919662 kernel: Console: switching to colour frame buffer device 128x48
Sep 12 22:59:51.919670 kernel: fb0: EFI VGA frame buffer device
Sep 12 22:59:51.919677 kernel: pstore: Using crash dump compression: deflate
Sep 12 22:59:51.919684 kernel: pstore: Registered efi_pstore as persistent store backend
Sep 12 22:59:51.919692 kernel: NET: Registered PF_INET6 protocol family
Sep 12 22:59:51.919699 kernel: Segment Routing with IPv6
Sep 12 22:59:51.919706 kernel: In-situ OAM (IOAM) with IPv6
Sep 12 22:59:51.919713 kernel: NET: Registered PF_PACKET protocol family
Sep 12 22:59:51.919720 kernel: Key type dns_resolver registered
Sep 12 22:59:51.919727 kernel: IPI shorthand broadcast: enabled
Sep 12 22:59:51.919735 kernel: sched_clock: Marking stable (2689003559, 89206964)->(3064748858, -286538335)
Sep 12 22:59:51.919743 kernel: registered taskstats version 1
Sep 12 22:59:51.919750 kernel: Loading compiled-in X.509 certificates
Sep 12 22:59:51.919757 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.47-flatcar: c3297a5801573420030c321362a802da1fd49c4e'
Sep 12 22:59:51.919764 kernel: Demotion targets for Node 0: null
Sep 12 22:59:51.919771 kernel: Key type .fscrypt registered
Sep 12 22:59:51.919778 kernel: Key type fscrypt-provisioning registered
Sep 12 22:59:51.919785 kernel: ima: No TPM chip found, activating TPM-bypass!
Sep 12 22:59:51.919793 kernel: ima: Allocated hash algorithm: sha1
Sep 12 22:59:51.919800 kernel: ima: No architecture policies found
Sep 12 22:59:51.919807 kernel: clk: Disabling unused clocks
Sep 12 22:59:51.919814 kernel: Warning: unable to open an initial console.
Sep 12 22:59:51.919822 kernel: Freeing unused kernel image (initmem) memory: 54084K
Sep 12 22:59:51.919829 kernel: Write protecting the kernel read-only data: 24576k
Sep 12 22:59:51.919836 kernel: Freeing unused kernel image (rodata/data gap) memory: 248K
Sep 12 22:59:51.919843 kernel: Run /init as init process
Sep 12 22:59:51.919849 kernel: with arguments:
Sep 12 22:59:51.919857 kernel: /init
Sep 12 22:59:51.919864 kernel: with environment:
Sep 12 22:59:51.919870 kernel: HOME=/
Sep 12 22:59:51.919877 kernel: TERM=linux
Sep 12 22:59:51.919884 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Sep 12 22:59:51.919893 systemd[1]: Successfully made /usr/ read-only.
Sep 12 22:59:51.919903 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 12 22:59:51.919911 systemd[1]: Detected virtualization microsoft.
Sep 12 22:59:51.919920 systemd[1]: Detected architecture x86-64.
Sep 12 22:59:51.919928 systemd[1]: Running in initrd.
Sep 12 22:59:51.919936 systemd[1]: No hostname configured, using default hostname.
Sep 12 22:59:51.919944 systemd[1]: Hostname set to .
Sep 12 22:59:51.919952 systemd[1]: Initializing machine ID from random generator.
Sep 12 22:59:51.919960 systemd[1]: Queued start job for default target initrd.target.
Sep 12 22:59:51.919969 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 12 22:59:51.919977 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 12 22:59:51.919988 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Sep 12 22:59:51.919996 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 12 22:59:51.920003 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Sep 12 22:59:51.920011 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Sep 12 22:59:51.920019 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Sep 12 22:59:51.920027 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Sep 12 22:59:51.920035 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 12 22:59:51.920044 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 12 22:59:51.920051 systemd[1]: Reached target paths.target - Path Units.
Sep 12 22:59:51.920058 systemd[1]: Reached target slices.target - Slice Units.
Sep 12 22:59:51.920066 systemd[1]: Reached target swap.target - Swaps.
Sep 12 22:59:51.920075 systemd[1]: Reached target timers.target - Timer Units.
Sep 12 22:59:51.920083 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Sep 12 22:59:51.920091 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 12 22:59:51.920098 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Sep 12 22:59:51.920107 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Sep 12 22:59:51.920116 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 12 22:59:51.920124 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 12 22:59:51.920131 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 12 22:59:51.920139 systemd[1]: Reached target sockets.target - Socket Units.
Sep 12 22:59:51.920147 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Sep 12 22:59:51.920154 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 12 22:59:51.920162 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Sep 12 22:59:51.920170 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Sep 12 22:59:51.920180 systemd[1]: Starting systemd-fsck-usr.service...
Sep 12 22:59:51.920188 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 12 22:59:51.920203 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 12 22:59:51.920227 systemd-journald[205]: Collecting audit messages is disabled.
Sep 12 22:59:51.920249 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 12 22:59:51.920259 systemd-journald[205]: Journal started
Sep 12 22:59:51.920279 systemd-journald[205]: Runtime Journal (/run/log/journal/8274b376e8514a5e86721f66eb64b69a) is 8M, max 158.9M, 150.9M free.
Sep 12 22:59:51.923472 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 12 22:59:51.929713 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Sep 12 22:59:51.931604 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 12 22:59:51.934597 systemd-modules-load[207]: Inserted module 'overlay'
Sep 12 22:59:51.940864 systemd[1]: Finished systemd-fsck-usr.service.
Sep 12 22:59:51.945425 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 12 22:59:51.952124 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 12 22:59:51.969524 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Sep 12 22:59:51.969079 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 22:59:51.975224 kernel: Bridge firewalling registered
Sep 12 22:59:51.972296 systemd-tmpfiles[217]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Sep 12 22:59:51.974542 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 12 22:59:51.977255 systemd-modules-load[207]: Inserted module 'br_netfilter'
Sep 12 22:59:51.987530 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 12 22:59:51.990215 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 12 22:59:51.995930 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 22:59:52.000200 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 12 22:59:52.004031 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Sep 12 22:59:52.007553 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 12 22:59:52.012763 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 12 22:59:52.023353 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 12 22:59:52.026715 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 12 22:59:52.035041 dracut-cmdline[236]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlyprintk=ttyS0,115200 flatcar.first_boot=detected flatcar.oem.id=azure flatcar.autologin verity.usrhash=8e60d6befc710e967d67e9a1d87ced7416895090c99a765b3a00e66a62f49e40 Sep 12 22:59:52.048769 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 22:59:52.066645 systemd-resolved[247]: Positive Trust Anchors:
Sep 12 22:59:52.068267 systemd-resolved[247]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 12 22:59:52.068304 systemd-resolved[247]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 12 22:59:52.070985 systemd-resolved[247]: Defaulting to hostname 'linux'. Sep 12 22:59:52.088545 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 12 22:59:52.092976 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 12 22:59:52.114460 kernel: SCSI subsystem initialized Sep 12 22:59:52.120461 kernel: Loading iSCSI transport class v2.0-870. Sep 12 22:59:52.128469 kernel: iscsi: registered transport (tcp) Sep 12 22:59:52.143587 kernel: iscsi: registered transport (qla4xxx) Sep 12 22:59:52.143622 kernel: QLogic iSCSI HBA Driver Sep 12 22:59:52.154764 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 12 22:59:52.175312 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 12 22:59:52.176353 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 12 22:59:52.206021 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Sep 12 22:59:52.209437 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Sep 12 22:59:52.263463 kernel: raid6: avx512x4 gen() 45203 MB/s Sep 12 22:59:52.280457 kernel: raid6: avx512x2 gen() 44770 MB/s Sep 12 22:59:52.297455 kernel: raid6: avx512x1 gen() 30152 MB/s Sep 12 22:59:52.315456 kernel: raid6: avx2x4 gen() 43129 MB/s Sep 12 22:59:52.333455 kernel: raid6: avx2x2 gen() 43809 MB/s Sep 12 22:59:52.350972 kernel: raid6: avx2x1 gen() 32165 MB/s Sep 12 22:59:52.350991 kernel: raid6: using algorithm avx512x4 gen() 45203 MB/s Sep 12 22:59:52.368928 kernel: raid6: .... xor() 8162 MB/s, rmw enabled Sep 12 22:59:52.369015 kernel: raid6: using avx512x2 recovery algorithm Sep 12 22:59:52.385460 kernel: xor: automatically using best checksumming function avx Sep 12 22:59:52.488465 kernel: Btrfs loaded, zoned=no, fsverity=no Sep 12 22:59:52.491870 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Sep 12 22:59:52.495109 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 12 22:59:52.511062 systemd-udevd[453]: Using default interface naming scheme 'v255'. Sep 12 22:59:52.514629 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 22:59:52.516733 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Sep 12 22:59:52.542705 dracut-pre-trigger[459]: rd.md=0: removing MD RAID activation Sep 12 22:59:52.558428 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Sep 12 22:59:52.561552 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 12 22:59:52.588731 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 12 22:59:52.596536 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Sep 12 22:59:52.631487 kernel: cryptd: max_cpu_qlen set to 1000 Sep 12 22:59:52.638466 kernel: AES CTR mode by8 optimization enabled Sep 12 22:59:52.641810 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. 
Sep 12 22:59:52.644133 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 22:59:52.648884 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 22:59:52.657629 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 22:59:52.671856 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 22:59:52.671924 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 22:59:52.683286 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 22:59:52.690588 kernel: hv_vmbus: Vmbus version:5.3 Sep 12 22:59:52.697912 kernel: pps_core: LinuxPPS API ver. 1 registered Sep 12 22:59:52.697939 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Sep 12 22:59:52.711504 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 22:59:52.719331 kernel: PTP clock support registered Sep 12 22:59:52.719360 kernel: hv_vmbus: registering driver hv_netvsc Sep 12 22:59:52.724751 kernel: hv_vmbus: registering driver hyperv_keyboard Sep 12 22:59:52.724784 kernel: hv_vmbus: registering driver hv_pci Sep 12 22:59:52.733402 kernel: hv_netvsc f8615163-0000-1000-2000-7ced8d40ac7c (unnamed net_device) (uninitialized): VF slot 1 added Sep 12 22:59:52.733943 kernel: input: AT Translated Set 2 keyboard as /devices/LNXSYSTM:00/LNXSYBUS:00/ACPI0004:00/MSFT1000:00/d34b2567-b9b6-42b9-8778-0a4ec0b955bf/serio0/input/input0 Sep 12 22:59:52.733960 kernel: hid: raw HID events driver (C) Jiri Kosina Sep 12 22:59:52.738582 kernel: hv_pci 7ad35d50-c05b-47ab-b3a0-56a9a845852b: PCI VMBus probing: Using version 0x10004 Sep 12 22:59:52.738912 kernel: hv_utils: Registering HyperV Utility Driver Sep 12 22:59:52.739607 kernel: hv_vmbus: registering driver hv_utils Sep 12 22:59:52.742494 kernel: hv_utils: Shutdown IC version 3.2 Sep 12 22:59:52.747537 kernel: hv_utils: TimeSync IC version 4.0
Sep 12 22:59:52.747604 kernel: hv_utils: Heartbeat IC version 3.0 Sep 12 22:59:52.704979 systemd-resolved[247]: Clock change detected. Flushing caches. Sep 12 22:59:52.714716 kernel: hv_vmbus: registering driver hv_storvsc Sep 12 22:59:52.714735 systemd-journald[205]: Time jumped backwards, rotating. Sep 12 22:59:52.714787 kernel: hv_vmbus: registering driver hid_hyperv Sep 12 22:59:52.719889 kernel: input: Microsoft Vmbus HID-compliant Mouse as /devices/0006:045E:0621.0001/input/input1 Sep 12 22:59:52.720031 kernel: hv_pci 7ad35d50-c05b-47ab-b3a0-56a9a845852b: PCI host bridge to bus c05b:00 Sep 12 22:59:52.720397 kernel: hid-hyperv 0006:045E:0621.0001: input: VIRTUAL HID v0.01 Mouse [Microsoft Vmbus HID-compliant Mouse] on Sep 12 22:59:52.720534 kernel: pci_bus c05b:00: root bus resource [mem 0xfc0000000-0xfc007ffff window] Sep 12 22:59:52.722973 kernel: scsi host0: storvsc_host_t Sep 12 22:59:52.725092 kernel: scsi 0:0:0:2: CD-ROM Msft Virtual DVD-ROM 1.0 PQ: 0 ANSI: 5 Sep 12 22:59:52.731224 kernel: pci_bus c05b:00: No busn resource found for root bus, will use [bus 00-ff] Sep 12 22:59:52.736969 kernel: pci c05b:00:00.0: [1414:00a9] type 00 class 0x010802 PCIe Endpoint Sep 12 22:59:52.737009 kernel: pci c05b:00:00.0: BAR 0 [mem 0xfc0000000-0xfc007ffff 64bit] Sep 12 22:59:52.738939 kernel: sr 0:0:0:2: [sr0] scsi-1 drive Sep 12 22:59:52.739094 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Sep 12 22:59:52.741509 kernel: sr 0:0:0:2: Attached scsi CD-ROM sr0 Sep 12 22:59:52.753151 kernel: pci_bus c05b:00: busn_res: [bus 00-ff] end is updated to 00 Sep 12 22:59:52.753276 kernel: pci c05b:00:00.0: BAR 0 [mem 0xfc0000000-0xfc007ffff 64bit]: assigned Sep 12 22:59:52.758506 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#59 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001 Sep 12 22:59:52.769438 kernel: nvme nvme0: pci function c05b:00:00.0 Sep 12 22:59:52.769620 kernel: nvme c05b:00:00.0: enabling device (0000 -> 0002)
Sep 12 22:59:52.778506 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#298 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001 Sep 12 22:59:52.920530 kernel: nvme nvme0: 2/0/0 default/read/poll queues Sep 12 22:59:52.924521 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 Sep 12 22:59:53.196509 kernel: nvme nvme0: using unchecked data buffer Sep 12 22:59:53.392551 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - MSFT NVMe Accelerator v1.0 EFI-SYSTEM. Sep 12 22:59:53.415897 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - MSFT NVMe Accelerator v1.0 OEM. Sep 12 22:59:53.429954 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - MSFT NVMe Accelerator v1.0 ROOT. Sep 12 22:59:53.431231 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 12 22:59:53.439609 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - MSFT NVMe Accelerator v1.0 USR-A. Sep 12 22:59:53.440043 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - MSFT NVMe Accelerator v1.0 USR-A. Sep 12 22:59:53.456754 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 12 22:59:53.458232 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 22:59:53.458252 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 12 22:59:53.458750 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 12 22:59:53.461592 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 12 22:59:53.482931 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Sep 12 22:59:53.521506 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 Sep 12 22:59:53.527508 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 Sep 12 22:59:53.719827 kernel: hv_pci 00000001-7870-47b5-b203-907d12ca697e: PCI VMBus probing: Using version 0x10004 Sep 12 22:59:53.719986 kernel: hv_pci 00000001-7870-47b5-b203-907d12ca697e: PCI host bridge to bus 7870:00 Sep 12 22:59:53.722585 kernel: pci_bus 7870:00: root bus resource [mem 0xfc2000000-0xfc4007fff window] Sep 12 22:59:53.724031 kernel: pci_bus 7870:00: No busn resource found for root bus, will use [bus 00-ff] Sep 12 22:59:53.728604 kernel: pci 7870:00:00.0: [1414:00ba] type 00 class 0x020000 PCIe Endpoint Sep 12 22:59:53.731577 kernel: pci 7870:00:00.0: BAR 0 [mem 0xfc2000000-0xfc3ffffff 64bit pref] Sep 12 22:59:53.736729 kernel: pci 7870:00:00.0: BAR 4 [mem 0xfc4000000-0xfc4007fff 64bit pref] Sep 12 22:59:53.736751 kernel: pci 7870:00:00.0: enabling Extended Tags Sep 12 22:59:53.750221 kernel: pci_bus 7870:00: busn_res: [bus 00-ff] end is updated to 00 Sep 12 22:59:53.750389 kernel: pci 7870:00:00.0: BAR 0 [mem 0xfc2000000-0xfc3ffffff 64bit pref]: assigned Sep 12 22:59:53.754527 kernel: pci 7870:00:00.0: BAR 4 [mem 0xfc4000000-0xfc4007fff 64bit pref]: assigned Sep 12 22:59:53.757693 kernel: mana 7870:00:00.0: enabling device (0000 -> 0002) Sep 12 22:59:53.766510 kernel: mana 7870:00:00.0: Microsoft Azure Network Adapter protocol version: 0.1.1 Sep 12 22:59:53.766746 kernel: hv_netvsc f8615163-0000-1000-2000-7ced8d40ac7c eth0: VF registering: eth1 Sep 12 22:59:53.768858 kernel: mana 7870:00:00.0 eth1: joined to eth0 Sep 12 22:59:53.772509 kernel: mana 7870:00:00.0 enP30832s1: renamed from eth1 Sep 12 22:59:54.532422 disk-uuid[675]: The operation has completed successfully. Sep 12 22:59:54.535681 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 Sep 12 22:59:54.589163 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 12 22:59:54.589234 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. 
Sep 12 22:59:54.609858 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 12 22:59:54.620340 sh[713]: Success Sep 12 22:59:54.646778 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Sep 12 22:59:54.646819 kernel: device-mapper: uevent: version 1.0.3 Sep 12 22:59:54.648074 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 12 22:59:54.655520 kernel: device-mapper: verity: sha256 using shash "sha256-ni" Sep 12 22:59:54.870131 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 12 22:59:54.873728 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 12 22:59:54.884253 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Sep 12 22:59:54.893511 kernel: BTRFS: device fsid 5d2ab445-1154-4e47-9d7e-ff4b81d84474 devid 1 transid 37 /dev/mapper/usr (254:0) scanned by mount (726) Sep 12 22:59:54.893542 kernel: BTRFS info (device dm-0): first mount of filesystem 5d2ab445-1154-4e47-9d7e-ff4b81d84474 Sep 12 22:59:54.895746 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Sep 12 22:59:55.189192 kernel: BTRFS info (device dm-0): enabling ssd optimizations Sep 12 22:59:55.189229 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 12 22:59:55.190871 kernel: BTRFS info (device dm-0): enabling free space tree Sep 12 22:59:55.221104 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 12 22:59:55.221520 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Sep 12 22:59:55.224644 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 12 22:59:55.225172 systemd[1]: Starting ignition-setup.service - Ignition (setup)... 
Sep 12 22:59:55.228599 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Sep 12 22:59:55.252506 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/nvme0n1p6 (259:5) scanned by mount (761) Sep 12 22:59:55.257592 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem fd5cdc72-255e-4ed2-8d25-c5e581a08827 Sep 12 22:59:55.257623 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-intel) checksum algorithm Sep 12 22:59:55.294644 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 12 22:59:55.297514 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 12 22:59:55.305796 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations Sep 12 22:59:55.305886 kernel: BTRFS info (device nvme0n1p6): turning on async discard Sep 12 22:59:55.305903 kernel: BTRFS info (device nvme0n1p6): enabling free space tree Sep 12 22:59:55.311511 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem fd5cdc72-255e-4ed2-8d25-c5e581a08827 Sep 12 22:59:55.312282 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 12 22:59:55.314586 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 12 22:59:55.331030 systemd-networkd[889]: lo: Link UP Sep 12 22:59:55.331036 systemd-networkd[889]: lo: Gained carrier Sep 12 22:59:55.337318 kernel: mana 7870:00:00.0 enP30832s1: Configured vPort 0 PD 18 DB 16 Sep 12 22:59:55.337456 kernel: mana 7870:00:00.0 enP30832s1: Configured steering vPort 0 entries 64 Sep 12 22:59:55.333116 systemd-networkd[889]: Enumeration completed Sep 12 22:59:55.333426 systemd-networkd[889]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
Sep 12 22:59:55.343312 kernel: hv_netvsc f8615163-0000-1000-2000-7ced8d40ac7c eth0: Data path switched to VF: enP30832s1 Sep 12 22:59:55.333429 systemd-networkd[889]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 12 22:59:55.334690 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 12 22:59:55.334841 systemd[1]: Reached target network.target - Network. Sep 12 22:59:55.341534 systemd-networkd[889]: enP30832s1: Link UP Sep 12 22:59:55.341600 systemd-networkd[889]: eth0: Link UP Sep 12 22:59:55.341730 systemd-networkd[889]: eth0: Gained carrier Sep 12 22:59:55.341739 systemd-networkd[889]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 22:59:55.344783 systemd-networkd[889]: enP30832s1: Gained carrier Sep 12 22:59:55.356862 systemd-networkd[889]: eth0: DHCPv4 address 10.200.8.17/24, gateway 10.200.8.1 acquired from 168.63.129.16 Sep 12 22:59:56.358236 ignition[896]: Ignition 2.22.0 Sep 12 22:59:56.358246 ignition[896]: Stage: fetch-offline Sep 12 22:59:56.360567 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Sep 12 22:59:56.358331 ignition[896]: no configs at "/usr/lib/ignition/base.d" Sep 12 22:59:56.364708 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
Sep 12 22:59:56.358337 ignition[896]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Sep 12 22:59:56.358404 ignition[896]: parsed url from cmdline: "" Sep 12 22:59:56.358407 ignition[896]: no config URL provided Sep 12 22:59:56.358410 ignition[896]: reading system config file "/usr/lib/ignition/user.ign" Sep 12 22:59:56.358415 ignition[896]: no config at "/usr/lib/ignition/user.ign" Sep 12 22:59:56.358420 ignition[896]: failed to fetch config: resource requires networking Sep 12 22:59:56.358616 ignition[896]: Ignition finished successfully Sep 12 22:59:56.394578 ignition[904]: Ignition 2.22.0 Sep 12 22:59:56.394587 ignition[904]: Stage: fetch Sep 12 22:59:56.394773 ignition[904]: no configs at "/usr/lib/ignition/base.d" Sep 12 22:59:56.394780 ignition[904]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Sep 12 22:59:56.394848 ignition[904]: parsed url from cmdline: "" Sep 12 22:59:56.394851 ignition[904]: no config URL provided Sep 12 22:59:56.394859 ignition[904]: reading system config file "/usr/lib/ignition/user.ign" Sep 12 22:59:56.394864 ignition[904]: no config at "/usr/lib/ignition/user.ign" Sep 12 22:59:56.394882 ignition[904]: GET http://169.254.169.254/metadata/instance/compute/userData?api-version=2021-01-01&format=text: attempt #1 Sep 12 22:59:56.481111 ignition[904]: GET result: OK Sep 12 22:59:56.481197 ignition[904]: config has been read from IMDS userdata Sep 12 22:59:56.481232 ignition[904]: parsing config with SHA512: 768db1e7c6884ebecdd2bfddce01c081833f8e6c1ba43dd41eedbfe2b3b1825f91d6fbaeba16c72b6ddf797fc1826ca082773d482a55b78b6aa947f3461c064c Sep 12 22:59:56.485027 unknown[904]: fetched base config from "system" Sep 12 22:59:56.485034 unknown[904]: fetched base config from "system" Sep 12 22:59:56.485342 ignition[904]: fetch: fetch complete Sep 12 22:59:56.485038 unknown[904]: fetched user config from "azure" Sep 12 22:59:56.485345 ignition[904]: fetch: fetch passed
Sep 12 22:59:56.487281 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Sep 12 22:59:56.485370 ignition[904]: Ignition finished successfully Sep 12 22:59:56.493409 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Sep 12 22:59:56.512633 ignition[911]: Ignition 2.22.0 Sep 12 22:59:56.512641 ignition[911]: Stage: kargs Sep 12 22:59:56.512813 ignition[911]: no configs at "/usr/lib/ignition/base.d" Sep 12 22:59:56.515284 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 12 22:59:56.512819 ignition[911]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Sep 12 22:59:56.517440 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Sep 12 22:59:56.513435 ignition[911]: kargs: kargs passed Sep 12 22:59:56.513464 ignition[911]: Ignition finished successfully Sep 12 22:59:56.541954 ignition[918]: Ignition 2.22.0 Sep 12 22:59:56.541963 ignition[918]: Stage: disks Sep 12 22:59:56.543680 systemd[1]: Finished ignition-disks.service - Ignition (disks). Sep 12 22:59:56.542117 ignition[918]: no configs at "/usr/lib/ignition/base.d" Sep 12 22:59:56.542123 ignition[918]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Sep 12 22:59:56.542874 ignition[918]: disks: disks passed Sep 12 22:59:56.542904 ignition[918]: Ignition finished successfully Sep 12 22:59:56.551298 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Sep 12 22:59:56.556755 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 12 22:59:56.559291 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 12 22:59:56.561506 systemd[1]: Reached target sysinit.target - System Initialization. Sep 12 22:59:56.564528 systemd[1]: Reached target basic.target - Basic System. Sep 12 22:59:56.567587 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Sep 12 22:59:56.630967 systemd-fsck[926]: ROOT: clean, 15/7326000 files, 477845/7359488 blocks Sep 12 22:59:56.634254 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 12 22:59:56.638744 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 12 22:59:57.315645 systemd-networkd[889]: eth0: Gained IPv6LL Sep 12 22:59:58.794506 kernel: EXT4-fs (nvme0n1p9): mounted filesystem d027afc5-396a-49bf-a5be-60ddd42cb089 r/w with ordered data mode. Quota mode: none. Sep 12 22:59:58.794777 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 12 22:59:58.797841 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 12 22:59:58.825584 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 12 22:59:58.844206 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 12 22:59:58.850207 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Sep 12 22:59:58.855531 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/nvme0n1p6 (259:5) scanned by mount (935) Sep 12 22:59:58.855685 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 12 22:59:58.863440 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem fd5cdc72-255e-4ed2-8d25-c5e581a08827 Sep 12 22:59:58.863457 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-intel) checksum algorithm Sep 12 22:59:58.858709 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 12 22:59:58.861883 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Sep 12 22:59:58.866598 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
Sep 12 22:59:58.879216 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations Sep 12 22:59:58.879262 kernel: BTRFS info (device nvme0n1p6): turning on async discard Sep 12 22:59:58.879274 kernel: BTRFS info (device nvme0n1p6): enabling free space tree Sep 12 22:59:58.880383 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 12 22:59:59.323854 coreos-metadata[937]: Sep 12 22:59:59.323 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1 Sep 12 22:59:59.327569 coreos-metadata[937]: Sep 12 22:59:59.326 INFO Fetch successful Sep 12 22:59:59.327569 coreos-metadata[937]: Sep 12 22:59:59.327 INFO Fetching http://169.254.169.254/metadata/instance/compute/name?api-version=2017-08-01&format=text: Attempt #1 Sep 12 22:59:59.337447 coreos-metadata[937]: Sep 12 22:59:59.337 INFO Fetch successful Sep 12 22:59:59.353039 coreos-metadata[937]: Sep 12 22:59:59.353 INFO wrote hostname ci-4459.0.0-a-36add7270c to /sysroot/etc/hostname Sep 12 22:59:59.356041 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Sep 12 22:59:59.687281 initrd-setup-root[965]: cut: /sysroot/etc/passwd: No such file or directory Sep 12 22:59:59.717653 initrd-setup-root[972]: cut: /sysroot/etc/group: No such file or directory Sep 12 22:59:59.748892 initrd-setup-root[979]: cut: /sysroot/etc/shadow: No such file or directory Sep 12 22:59:59.780083 initrd-setup-root[986]: cut: /sysroot/etc/gshadow: No such file or directory Sep 12 23:00:01.011244 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 12 23:00:01.016586 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Sep 12 23:00:01.019878 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 12 23:00:01.037947 systemd[1]: sysroot-oem.mount: Deactivated successfully. 
Sep 12 23:00:01.041641 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem fd5cdc72-255e-4ed2-8d25-c5e581a08827 Sep 12 23:00:01.055420 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Sep 12 23:00:01.064740 ignition[1054]: INFO : Ignition 2.22.0 Sep 12 23:00:01.064740 ignition[1054]: INFO : Stage: mount Sep 12 23:00:01.069591 ignition[1054]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 23:00:01.069591 ignition[1054]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Sep 12 23:00:01.069591 ignition[1054]: INFO : mount: mount passed Sep 12 23:00:01.069591 ignition[1054]: INFO : Ignition finished successfully Sep 12 23:00:01.066679 systemd[1]: Finished ignition-mount.service - Ignition (mount). Sep 12 23:00:01.069569 systemd[1]: Starting ignition-files.service - Ignition (files)... Sep 12 23:00:01.079600 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 12 23:00:01.095122 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/nvme0n1p6 (259:5) scanned by mount (1065) Sep 12 23:00:01.095151 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem fd5cdc72-255e-4ed2-8d25-c5e581a08827 Sep 12 23:00:01.096528 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-intel) checksum algorithm Sep 12 23:00:01.099644 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations Sep 12 23:00:01.099743 kernel: BTRFS info (device nvme0n1p6): turning on async discard Sep 12 23:00:01.100933 kernel: BTRFS info (device nvme0n1p6): enabling free space tree Sep 12 23:00:01.102121 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Sep 12 23:00:01.128228 ignition[1082]: INFO : Ignition 2.22.0 Sep 12 23:00:01.128228 ignition[1082]: INFO : Stage: files Sep 12 23:00:01.130275 ignition[1082]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 23:00:01.130275 ignition[1082]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Sep 12 23:00:01.130275 ignition[1082]: DEBUG : files: compiled without relabeling support, skipping Sep 12 23:00:01.159319 ignition[1082]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Sep 12 23:00:01.159319 ignition[1082]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Sep 12 23:00:01.207401 ignition[1082]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Sep 12 23:00:01.210580 ignition[1082]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Sep 12 23:00:01.210580 ignition[1082]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Sep 12 23:00:01.207724 unknown[1082]: wrote ssh authorized keys file for user: core Sep 12 23:00:01.274559 ignition[1082]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Sep 12 23:00:01.279578 ignition[1082]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1 Sep 12 23:00:01.319013 ignition[1082]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Sep 12 23:00:01.371024 ignition[1082]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Sep 12 23:00:01.374548 ignition[1082]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Sep 12 23:00:01.374548 ignition[1082]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" 
Sep 12 23:00:01.374548 ignition[1082]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Sep 12 23:00:01.374548 ignition[1082]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Sep 12 23:00:01.374548 ignition[1082]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 12 23:00:01.374548 ignition[1082]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 12 23:00:01.374548 ignition[1082]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 12 23:00:01.374548 ignition[1082]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 12 23:00:01.393503 ignition[1082]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Sep 12 23:00:01.393503 ignition[1082]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Sep 12 23:00:01.393503 ignition[1082]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Sep 12 23:00:01.393503 ignition[1082]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Sep 12 23:00:01.393503 ignition[1082]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw"
Sep 12 23:00:01.393503 ignition[1082]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-x86-64.raw: attempt #1 Sep 12 23:00:01.928661 ignition[1082]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Sep 12 23:00:02.537583 ignition[1082]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Sep 12 23:00:02.537583 ignition[1082]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Sep 12 23:00:02.565782 ignition[1082]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 12 23:00:02.571535 ignition[1082]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 12 23:00:02.571535 ignition[1082]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Sep 12 23:00:02.571535 ignition[1082]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Sep 12 23:00:02.575835 ignition[1082]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Sep 12 23:00:02.575835 ignition[1082]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Sep 12 23:00:02.575835 ignition[1082]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Sep 12 23:00:02.575835 ignition[1082]: INFO : files: files passed Sep 12 23:00:02.575835 ignition[1082]: INFO : Ignition finished successfully Sep 12 23:00:02.575105 systemd[1]: Finished ignition-files.service - Ignition (files). Sep 12 23:00:02.589169 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Sep 12 23:00:02.597860 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Sep 12 23:00:02.602202 systemd[1]: ignition-quench.service: Deactivated successfully. Sep 12 23:00:02.602705 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Sep 12 23:00:02.621201 initrd-setup-root-after-ignition[1111]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 12 23:00:02.621201 initrd-setup-root-after-ignition[1111]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 12 23:00:02.628651 initrd-setup-root-after-ignition[1115]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 12 23:00:02.624013 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 12 23:00:02.624690 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Sep 12 23:00:02.634918 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Sep 12 23:00:02.683693 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Sep 12 23:00:02.683772 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Sep 12 23:00:02.684219 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Sep 12 23:00:02.684671 systemd[1]: Reached target initrd.target - Initrd Default Target. Sep 12 23:00:02.684829 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Sep 12 23:00:02.686607 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Sep 12 23:00:02.697687 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 12 23:00:02.701598 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Sep 12 23:00:02.722272 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Sep 12 23:00:02.722755 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. 
Sep 12 23:00:02.722981 systemd[1]: Stopped target timers.target - Timer Units. Sep 12 23:00:02.723097 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Sep 12 23:00:02.723177 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 12 23:00:02.730844 systemd[1]: Stopped target initrd.target - Initrd Default Target. Sep 12 23:00:02.733189 systemd[1]: Stopped target basic.target - Basic System. Sep 12 23:00:02.735305 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Sep 12 23:00:02.737636 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Sep 12 23:00:02.740215 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Sep 12 23:00:02.743132 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Sep 12 23:00:02.745629 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Sep 12 23:00:02.747868 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Sep 12 23:00:02.751655 systemd[1]: Stopped target sysinit.target - System Initialization. Sep 12 23:00:02.754247 systemd[1]: Stopped target local-fs.target - Local File Systems. Sep 12 23:00:02.757129 systemd[1]: Stopped target swap.target - Swaps. Sep 12 23:00:02.759413 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Sep 12 23:00:02.759530 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Sep 12 23:00:02.763816 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Sep 12 23:00:02.767640 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 23:00:02.771597 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Sep 12 23:00:02.772406 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 12 23:00:02.775598 systemd[1]: dracut-initqueue.service: Deactivated successfully. 
Sep 12 23:00:02.775715 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Sep 12 23:00:02.779886 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Sep 12 23:00:02.780007 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 12 23:00:02.783661 systemd[1]: ignition-files.service: Deactivated successfully. Sep 12 23:00:02.783770 systemd[1]: Stopped ignition-files.service - Ignition (files). Sep 12 23:00:02.783920 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Sep 12 23:00:02.784012 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Sep 12 23:00:02.785571 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Sep 12 23:00:02.821021 ignition[1135]: INFO : Ignition 2.22.0 Sep 12 23:00:02.821021 ignition[1135]: INFO : Stage: umount Sep 12 23:00:02.821021 ignition[1135]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 23:00:02.821021 ignition[1135]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Sep 12 23:00:02.821021 ignition[1135]: INFO : umount: umount passed Sep 12 23:00:02.821021 ignition[1135]: INFO : Ignition finished successfully Sep 12 23:00:02.796395 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Sep 12 23:00:02.817137 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Sep 12 23:00:02.817282 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Sep 12 23:00:02.819732 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Sep 12 23:00:02.821108 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Sep 12 23:00:02.828101 systemd[1]: ignition-mount.service: Deactivated successfully. Sep 12 23:00:02.828180 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Sep 12 23:00:02.833853 systemd[1]: ignition-disks.service: Deactivated successfully. 
Sep 12 23:00:02.834072 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Sep 12 23:00:02.836530 systemd[1]: ignition-kargs.service: Deactivated successfully. Sep 12 23:00:02.836576 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Sep 12 23:00:02.840568 systemd[1]: ignition-fetch.service: Deactivated successfully. Sep 12 23:00:02.840608 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Sep 12 23:00:02.844555 systemd[1]: Stopped target network.target - Network. Sep 12 23:00:02.844731 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Sep 12 23:00:02.844768 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Sep 12 23:00:02.845016 systemd[1]: Stopped target paths.target - Path Units. Sep 12 23:00:02.845035 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Sep 12 23:00:02.846864 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 23:00:02.856438 systemd[1]: Stopped target slices.target - Slice Units. Sep 12 23:00:02.858987 systemd[1]: Stopped target sockets.target - Socket Units. Sep 12 23:00:02.875572 systemd[1]: iscsid.socket: Deactivated successfully. Sep 12 23:00:02.875615 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Sep 12 23:00:02.880581 systemd[1]: iscsiuio.socket: Deactivated successfully. Sep 12 23:00:02.880623 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 12 23:00:02.882772 systemd[1]: ignition-setup.service: Deactivated successfully. Sep 12 23:00:02.882820 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Sep 12 23:00:02.885419 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Sep 12 23:00:02.885452 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Sep 12 23:00:02.889656 systemd[1]: Stopping systemd-networkd.service - Network Configuration... 
Sep 12 23:00:02.894539 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Sep 12 23:00:02.900641 systemd[1]: sysroot-boot.mount: Deactivated successfully. Sep 12 23:00:02.901194 systemd[1]: systemd-resolved.service: Deactivated successfully. Sep 12 23:00:02.901273 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Sep 12 23:00:02.906860 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Sep 12 23:00:02.907042 systemd[1]: systemd-networkd.service: Deactivated successfully. Sep 12 23:00:02.907134 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Sep 12 23:00:02.912790 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Sep 12 23:00:02.912981 systemd[1]: initrd-cleanup.service: Deactivated successfully. Sep 12 23:00:02.913043 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Sep 12 23:00:02.916402 systemd[1]: Stopped target network-pre.target - Preparation for Network. Sep 12 23:00:02.917076 systemd[1]: systemd-networkd.socket: Deactivated successfully. Sep 12 23:00:02.917118 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Sep 12 23:00:02.924359 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Sep 12 23:00:02.931888 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Sep 12 23:00:02.931984 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 12 23:00:02.935909 systemd[1]: systemd-sysctl.service: Deactivated successfully. Sep 12 23:00:02.935941 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Sep 12 23:00:02.943273 systemd[1]: systemd-modules-load.service: Deactivated successfully. Sep 12 23:00:02.943310 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Sep 12 23:00:02.945137 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. 
Sep 12 23:00:02.945168 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 23:00:02.954989 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 12 23:00:02.965377 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Sep 12 23:00:02.965432 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Sep 12 23:00:02.969891 systemd[1]: systemd-udevd.service: Deactivated successfully. Sep 12 23:00:02.974618 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 23:00:02.978296 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Sep 12 23:00:02.985365 kernel: hv_netvsc f8615163-0000-1000-2000-7ced8d40ac7c eth0: Data path switched from VF: enP30832s1 Sep 12 23:00:02.985487 kernel: mana 7870:00:00.0 enP30832s1: Configured steering vPort 0 entries 64 Sep 12 23:00:02.978359 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Sep 12 23:00:02.982278 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Sep 12 23:00:02.982314 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 23:00:02.985583 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Sep 12 23:00:02.985633 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Sep 12 23:00:02.994673 systemd[1]: dracut-cmdline.service: Deactivated successfully. Sep 12 23:00:02.994724 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Sep 12 23:00:02.997926 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 12 23:00:02.997974 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 12 23:00:03.002237 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... 
Sep 12 23:00:03.005068 systemd[1]: systemd-network-generator.service: Deactivated successfully. Sep 12 23:00:03.005118 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Sep 12 23:00:03.011593 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Sep 12 23:00:03.011638 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 23:00:03.016795 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Sep 12 23:00:03.016840 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 12 23:00:03.019227 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Sep 12 23:00:03.019268 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 23:00:03.025039 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 23:00:03.026448 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 23:00:03.030586 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. Sep 12 23:00:03.030641 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully. Sep 12 23:00:03.030668 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Sep 12 23:00:03.030694 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 12 23:00:03.030946 systemd[1]: network-cleanup.service: Deactivated successfully. Sep 12 23:00:03.031019 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Sep 12 23:00:03.036946 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Sep 12 23:00:03.037019 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. 
Sep 12 23:00:03.279239 systemd[1]: sysroot-boot.service: Deactivated successfully. Sep 12 23:00:03.279324 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Sep 12 23:00:03.282794 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Sep 12 23:00:03.283156 systemd[1]: initrd-setup-root.service: Deactivated successfully. Sep 12 23:00:03.283200 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Sep 12 23:00:03.283935 systemd[1]: Starting initrd-switch-root.service - Switch Root... Sep 12 23:00:03.319045 systemd[1]: Switching root. Sep 12 23:00:03.415181 systemd-journald[205]: Journal stopped Sep 12 23:00:10.815752 systemd-journald[205]: Received SIGTERM from PID 1 (systemd). Sep 12 23:00:10.815786 kernel: SELinux: policy capability network_peer_controls=1 Sep 12 23:00:10.815797 kernel: SELinux: policy capability open_perms=1 Sep 12 23:00:10.815806 kernel: SELinux: policy capability extended_socket_class=1 Sep 12 23:00:10.815813 kernel: SELinux: policy capability always_check_network=0 Sep 12 23:00:10.815821 kernel: SELinux: policy capability cgroup_seclabel=1 Sep 12 23:00:10.815829 kernel: SELinux: policy capability nnp_nosuid_transition=1 Sep 12 23:00:10.815838 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Sep 12 23:00:10.815845 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Sep 12 23:00:10.815852 kernel: SELinux: policy capability userspace_initial_context=0 Sep 12 23:00:10.815860 kernel: audit: type=1403 audit(1757718004.632:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Sep 12 23:00:10.815868 systemd[1]: Successfully loaded SELinux policy in 181.569ms. Sep 12 23:00:10.815878 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 3.394ms. 
Sep 12 23:00:10.815888 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 12 23:00:10.815898 systemd[1]: Detected virtualization microsoft. Sep 12 23:00:10.815906 systemd[1]: Detected architecture x86-64. Sep 12 23:00:10.815914 systemd[1]: Detected first boot. Sep 12 23:00:10.815922 systemd[1]: Hostname set to . Sep 12 23:00:10.815932 systemd[1]: Initializing machine ID from random generator. Sep 12 23:00:10.815941 zram_generator::config[1180]: No configuration found. Sep 12 23:00:10.815949 kernel: Guest personality initialized and is inactive Sep 12 23:00:10.815957 kernel: VMCI host device registered (name=vmci, major=10, minor=124) Sep 12 23:00:10.815965 kernel: Initialized host personality Sep 12 23:00:10.815972 kernel: NET: Registered PF_VSOCK protocol family Sep 12 23:00:10.815981 systemd[1]: Populated /etc with preset unit settings. Sep 12 23:00:10.815991 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Sep 12 23:00:10.815999 systemd[1]: initrd-switch-root.service: Deactivated successfully. Sep 12 23:00:10.816007 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Sep 12 23:00:10.816016 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Sep 12 23:00:10.816024 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Sep 12 23:00:10.816033 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Sep 12 23:00:10.816041 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Sep 12 23:00:10.816048 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. 
Sep 12 23:00:10.816058 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Sep 12 23:00:10.816066 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Sep 12 23:00:10.816074 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Sep 12 23:00:10.816083 systemd[1]: Created slice user.slice - User and Session Slice. Sep 12 23:00:10.816091 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 12 23:00:10.816099 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 23:00:10.816108 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Sep 12 23:00:10.816118 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Sep 12 23:00:10.816129 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Sep 12 23:00:10.816137 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 12 23:00:10.816146 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Sep 12 23:00:10.816154 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 23:00:10.816163 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 12 23:00:10.816172 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Sep 12 23:00:10.816180 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Sep 12 23:00:10.816191 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Sep 12 23:00:10.816199 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Sep 12 23:00:10.816207 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. 
Sep 12 23:00:10.816216 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 12 23:00:10.816224 systemd[1]: Reached target slices.target - Slice Units. Sep 12 23:00:10.816232 systemd[1]: Reached target swap.target - Swaps. Sep 12 23:00:10.816241 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Sep 12 23:00:10.816249 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Sep 12 23:00:10.816260 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Sep 12 23:00:10.816269 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 12 23:00:10.816277 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 12 23:00:10.816285 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 23:00:10.816294 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Sep 12 23:00:10.816304 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Sep 12 23:00:10.816312 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Sep 12 23:00:10.816321 systemd[1]: Mounting media.mount - External Media Directory... Sep 12 23:00:10.816329 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 23:00:10.816338 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Sep 12 23:00:10.816346 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Sep 12 23:00:10.816355 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Sep 12 23:00:10.816364 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Sep 12 23:00:10.816372 systemd[1]: Reached target machines.target - Containers. 
Sep 12 23:00:10.816382 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Sep 12 23:00:10.816391 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 23:00:10.816399 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 12 23:00:10.816408 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Sep 12 23:00:10.816416 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 12 23:00:10.816424 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 12 23:00:10.816433 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 12 23:00:10.816441 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Sep 12 23:00:10.816451 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 12 23:00:10.816460 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Sep 12 23:00:10.816468 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Sep 12 23:00:10.816477 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Sep 12 23:00:10.816485 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Sep 12 23:00:10.816507 systemd[1]: Stopped systemd-fsck-usr.service. Sep 12 23:00:10.816517 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 23:00:10.816525 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 12 23:00:10.816536 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... 
Sep 12 23:00:10.816545 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 12 23:00:10.816554 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Sep 12 23:00:10.816562 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Sep 12 23:00:10.816571 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 12 23:00:10.816579 systemd[1]: verity-setup.service: Deactivated successfully. Sep 12 23:00:10.816588 systemd[1]: Stopped verity-setup.service. Sep 12 23:00:10.816596 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 23:00:10.816607 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Sep 12 23:00:10.816615 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Sep 12 23:00:10.816624 systemd[1]: Mounted media.mount - External Media Directory. Sep 12 23:00:10.816632 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Sep 12 23:00:10.816640 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Sep 12 23:00:10.816648 kernel: fuse: init (API version 7.41) Sep 12 23:00:10.816673 systemd-journald[1263]: Collecting audit messages is disabled. Sep 12 23:00:10.816694 kernel: loop: module loaded Sep 12 23:00:10.816702 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Sep 12 23:00:10.816711 systemd-journald[1263]: Journal started Sep 12 23:00:10.816730 systemd-journald[1263]: Runtime Journal (/run/log/journal/9fc3bce2102246ce9e520fb70ff16b68) is 8M, max 158.9M, 150.9M free. Sep 12 23:00:10.279471 systemd[1]: Queued start job for default target multi-user.target. Sep 12 23:00:10.286856 systemd[1]: Unnecessary job was removed for dev-nvme0n1p6.device - /dev/nvme0n1p6. 
Sep 12 23:00:10.287152 systemd[1]: systemd-journald.service: Deactivated successfully. Sep 12 23:00:10.819571 systemd[1]: Started systemd-journald.service - Journal Service. Sep 12 23:00:10.823125 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 23:00:10.826956 systemd[1]: modprobe@configfs.service: Deactivated successfully. Sep 12 23:00:10.827090 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Sep 12 23:00:10.829968 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 12 23:00:10.830089 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 12 23:00:10.834776 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 12 23:00:10.834901 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 12 23:00:10.836898 systemd[1]: modprobe@fuse.service: Deactivated successfully. Sep 12 23:00:10.837107 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Sep 12 23:00:10.839763 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 12 23:00:10.840675 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 12 23:00:10.844349 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Sep 12 23:00:10.847798 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 12 23:00:10.849339 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 12 23:00:10.850990 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Sep 12 23:00:10.857332 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 12 23:00:10.861145 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Sep 12 23:00:10.875481 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... 
Sep 12 23:00:10.878574 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Sep 12 23:00:10.878602 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 12 23:00:10.882272 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Sep 12 23:00:10.889601 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Sep 12 23:00:10.893639 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 23:00:10.907166 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Sep 12 23:00:10.914782 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Sep 12 23:00:10.919586 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 12 23:00:10.924756 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Sep 12 23:00:10.927277 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 12 23:00:10.928012 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 12 23:00:10.930925 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Sep 12 23:00:10.937450 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 12 23:00:10.941321 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Sep 12 23:00:10.944595 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 12 23:00:10.947705 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. 
Sep 12 23:00:10.950483 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Sep 12 23:00:10.963129 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Sep 12 23:00:10.974994 systemd-journald[1263]: Time spent on flushing to /var/log/journal/9fc3bce2102246ce9e520fb70ff16b68 is 54.492ms for 994 entries. Sep 12 23:00:10.974994 systemd-journald[1263]: System Journal (/var/log/journal/9fc3bce2102246ce9e520fb70ff16b68) is 11.8M, max 2.6G, 2.6G free. Sep 12 23:00:11.205682 systemd-journald[1263]: Received client request to flush runtime journal. Sep 12 23:00:11.205714 kernel: ACPI: bus type drm_connector registered Sep 12 23:00:11.205726 systemd-journald[1263]: /var/log/journal/9fc3bce2102246ce9e520fb70ff16b68/system.journal: Realtime clock jumped backwards relative to last journal entry, rotating. Sep 12 23:00:11.205744 systemd-journald[1263]: Rotating system journal. Sep 12 23:00:11.205758 kernel: loop0: detected capacity change from 0 to 27936 Sep 12 23:00:10.965615 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Sep 12 23:00:10.971575 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Sep 12 23:00:10.981437 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 12 23:00:10.984104 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 12 23:00:11.066760 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 12 23:00:11.132999 systemd-tmpfiles[1321]: ACLs are not supported, ignoring. Sep 12 23:00:11.133008 systemd-tmpfiles[1321]: ACLs are not supported, ignoring. Sep 12 23:00:11.134934 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 12 23:00:11.139761 systemd[1]: Starting systemd-sysusers.service - Create System Users... Sep 12 23:00:11.206425 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. 
Sep 12 23:00:11.370071 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Sep 12 23:00:11.370564 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Sep 12 23:00:11.461509 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Sep 12 23:00:11.567522 kernel: loop1: detected capacity change from 0 to 128016 Sep 12 23:00:11.715768 systemd[1]: Finished systemd-sysusers.service - Create System Users. Sep 12 23:00:11.719320 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 12 23:00:11.732536 systemd-tmpfiles[1343]: ACLs are not supported, ignoring. Sep 12 23:00:11.732549 systemd-tmpfiles[1343]: ACLs are not supported, ignoring. Sep 12 23:00:11.734438 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 23:00:12.081510 kernel: loop2: detected capacity change from 0 to 110984 Sep 12 23:00:12.578515 kernel: loop3: detected capacity change from 0 to 229808 Sep 12 23:00:12.616518 kernel: loop4: detected capacity change from 0 to 27936 Sep 12 23:00:12.627517 kernel: loop5: detected capacity change from 0 to 128016 Sep 12 23:00:12.637903 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Sep 12 23:00:12.638509 kernel: loop6: detected capacity change from 0 to 110984 Sep 12 23:00:12.640869 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 12 23:00:12.658521 kernel: loop7: detected capacity change from 0 to 229808 Sep 12 23:00:12.663586 systemd-udevd[1351]: Using default interface naming scheme 'v255'. Sep 12 23:00:12.684287 (sd-merge)[1349]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-azure'. Sep 12 23:00:12.684612 (sd-merge)[1349]: Merged extensions into '/usr'. Sep 12 23:00:12.687489 systemd[1]: Reload requested from client PID 1319 ('systemd-sysext') (unit systemd-sysext.service)... 
Sep 12 23:00:12.687508 systemd[1]: Reloading... Sep 12 23:00:12.736588 zram_generator::config[1376]: No configuration found. Sep 12 23:00:12.902983 systemd[1]: Reloading finished in 215 ms. Sep 12 23:00:12.924269 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Sep 12 23:00:12.934229 systemd[1]: Starting ensure-sysext.service... Sep 12 23:00:12.937418 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 12 23:00:12.981039 systemd[1]: Reload requested from client PID 1435 ('systemctl') (unit ensure-sysext.service)... Sep 12 23:00:12.981048 systemd[1]: Reloading... Sep 12 23:00:12.987423 systemd-tmpfiles[1436]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Sep 12 23:00:13.000455 systemd-tmpfiles[1436]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Sep 12 23:00:13.000809 systemd-tmpfiles[1436]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Sep 12 23:00:13.001051 systemd-tmpfiles[1436]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Sep 12 23:00:13.001736 systemd-tmpfiles[1436]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Sep 12 23:00:13.002009 systemd-tmpfiles[1436]: ACLs are not supported, ignoring. Sep 12 23:00:13.002166 systemd-tmpfiles[1436]: ACLs are not supported, ignoring. Sep 12 23:00:13.029558 zram_generator::config[1464]: No configuration found. Sep 12 23:00:13.082072 systemd-tmpfiles[1436]: Detected autofs mount point /boot during canonicalization of boot. Sep 12 23:00:13.082080 systemd-tmpfiles[1436]: Skipping /boot Sep 12 23:00:13.087156 systemd-tmpfiles[1436]: Detected autofs mount point /boot during canonicalization of boot. 
Sep 12 23:00:13.087171 systemd-tmpfiles[1436]: Skipping /boot Sep 12 23:00:13.168141 systemd[1]: Reloading finished in 186 ms. Sep 12 23:00:13.185243 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 23:00:13.191376 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 12 23:00:13.219870 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Sep 12 23:00:13.223278 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Sep 12 23:00:13.233848 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 12 23:00:13.238691 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Sep 12 23:00:13.243934 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 23:00:13.244274 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 23:00:13.251166 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 12 23:00:13.255710 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 12 23:00:13.260387 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 12 23:00:13.263679 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 23:00:13.263788 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 23:00:13.263875 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). 
Sep 12 23:00:13.264933 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 12 23:00:13.265064 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 12 23:00:13.267061 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 12 23:00:13.270790 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 12 23:00:13.275318 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 23:00:13.275517 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 23:00:13.276366 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 12 23:00:13.283209 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 12 23:00:13.284835 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 23:00:13.284946 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 23:00:13.285025 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 23:00:13.285435 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 12 23:00:13.288712 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 12 23:00:13.291275 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 12 23:00:13.291389 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 12 23:00:13.295127 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. 
Sep 12 23:00:13.295322 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 12 23:00:13.304814 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Sep 12 23:00:13.308886 systemd[1]: Finished ensure-sysext.service. Sep 12 23:00:13.311642 systemd[1]: Expecting device dev-ptp_hyperv.device - /dev/ptp_hyperv... Sep 12 23:00:13.312930 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 23:00:13.313077 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 23:00:13.314722 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 12 23:00:13.322738 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 12 23:00:13.327404 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 12 23:00:13.330739 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 12 23:00:13.332942 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 23:00:13.332979 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 23:00:13.333036 systemd[1]: Reached target time-set.target - System Time Set. Sep 12 23:00:13.336051 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 23:00:13.336385 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 12 23:00:13.336551 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. 
Sep 12 23:00:13.339007 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 12 23:00:13.339127 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 12 23:00:13.340660 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 12 23:00:13.340773 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 12 23:00:13.343711 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 23:00:13.346803 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 12 23:00:13.346930 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 12 23:00:13.357812 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 12 23:00:13.360575 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 12 23:00:13.360627 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 12 23:00:13.363662 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Sep 12 23:00:13.442091 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Sep 12 23:00:13.443124 systemd[1]: Started systemd-userdbd.service - User Database Manager. Sep 12 23:00:13.545514 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#36 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001 Sep 12 23:00:13.545571 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. 
Sep 12 23:00:13.554512 kernel: hv_vmbus: registering driver hyperv_fb Sep 12 23:00:13.558993 kernel: hyperv_fb: Synthvid Version major 3, minor 5 Sep 12 23:00:13.559040 kernel: hyperv_fb: Screen resolution: 1024x768, Color depth: 32, Frame buffer size: 8388608 Sep 12 23:00:13.562399 kernel: Console: switching to colour dummy device 80x25 Sep 12 23:00:13.566516 kernel: hv_vmbus: registering driver hv_balloon Sep 12 23:00:13.567522 kernel: Console: switching to colour frame buffer device 128x48 Sep 12 23:00:13.570524 kernel: hv_balloon: Using Dynamic Memory protocol version 2.0 Sep 12 23:00:13.589322 augenrules[1637]: No rules Sep 12 23:00:13.590773 systemd[1]: Condition check resulted in dev-ptp_hyperv.device - /dev/ptp_hyperv being skipped. Sep 12 23:00:13.591085 systemd[1]: audit-rules.service: Deactivated successfully. Sep 12 23:00:13.591614 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 12 23:00:13.604007 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 23:00:13.615787 kernel: mousedev: PS/2 mouse device common for all mice Sep 12 23:00:13.615256 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 23:00:13.615402 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 23:00:13.623183 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 23:00:13.659045 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 23:00:13.659243 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 23:00:13.663020 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 23:00:13.781182 systemd-resolved[1527]: Positive Trust Anchors: Sep 12 23:00:13.781192 systemd-resolved[1527]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 12 23:00:13.781225 systemd-resolved[1527]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 12 23:00:13.797349 systemd-networkd[1575]: lo: Link UP Sep 12 23:00:13.797353 systemd-networkd[1575]: lo: Gained carrier Sep 12 23:00:13.800264 systemd-networkd[1575]: Enumeration completed Sep 12 23:00:13.800366 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 12 23:00:13.805347 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Sep 12 23:00:13.808540 systemd-networkd[1575]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 23:00:13.808548 systemd-networkd[1575]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 12 23:00:13.810697 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 12 23:00:13.811593 kernel: mana 7870:00:00.0 enP30832s1: Configured vPort 0 PD 18 DB 16 Sep 12 23:00:13.811736 systemd-resolved[1527]: Using system hostname 'ci-4459.0.0-a-36add7270c'. 
Sep 12 23:00:13.823516 kernel: mana 7870:00:00.0 enP30832s1: Configured steering vPort 0 entries 64 Sep 12 23:00:13.829516 kernel: hv_netvsc f8615163-0000-1000-2000-7ced8d40ac7c eth0: Data path switched to VF: enP30832s1 Sep 12 23:00:13.834701 systemd-networkd[1575]: enP30832s1: Link UP Sep 12 23:00:13.834781 systemd-networkd[1575]: eth0: Link UP Sep 12 23:00:13.834788 systemd-networkd[1575]: eth0: Gained carrier Sep 12 23:00:13.834802 systemd-networkd[1575]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 23:00:13.836176 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 12 23:00:13.844792 systemd-networkd[1575]: enP30832s1: Gained carrier Sep 12 23:00:13.851861 systemd[1]: Reached target network.target - Network. Sep 12 23:00:13.853656 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 12 23:00:13.856067 systemd-networkd[1575]: eth0: DHCPv4 address 10.200.8.17/24, gateway 10.200.8.1 acquired from 168.63.129.16 Sep 12 23:00:13.859084 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - MSFT NVMe Accelerator v1.0 OEM. Sep 12 23:00:13.861706 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 12 23:00:13.876718 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Sep 12 23:00:13.895515 kernel: kvm_intel: Using Hyper-V Enlightened VMCS Sep 12 23:00:13.941833 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 12 23:00:15.016319 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 23:00:15.363602 systemd-networkd[1575]: eth0: Gained IPv6LL Sep 12 23:00:15.364927 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. 
Sep 12 23:00:15.367681 systemd[1]: Reached target network-online.target - Network is Online. Sep 12 23:00:15.761169 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Sep 12 23:00:15.764676 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 12 23:00:19.075762 ldconfig[1314]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Sep 12 23:00:19.123653 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Sep 12 23:00:19.128553 systemd[1]: Starting systemd-update-done.service - Update is Completed... Sep 12 23:00:19.157051 systemd[1]: Finished systemd-update-done.service - Update is Completed. Sep 12 23:00:19.160721 systemd[1]: Reached target sysinit.target - System Initialization. Sep 12 23:00:19.162068 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Sep 12 23:00:19.163527 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 12 23:00:19.166550 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Sep 12 23:00:19.167997 systemd[1]: Started logrotate.timer - Daily rotation of log files. Sep 12 23:00:19.170615 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 12 23:00:19.174551 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Sep 12 23:00:19.175876 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Sep 12 23:00:19.175903 systemd[1]: Reached target paths.target - Path Units. Sep 12 23:00:19.178552 systemd[1]: Reached target timers.target - Timer Units. 
Sep 12 23:00:19.195288 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 12 23:00:19.199377 systemd[1]: Starting docker.socket - Docker Socket for the API... Sep 12 23:00:19.202919 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Sep 12 23:00:19.204467 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Sep 12 23:00:19.207582 systemd[1]: Reached target ssh-access.target - SSH Access Available. Sep 12 23:00:19.209985 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Sep 12 23:00:19.212776 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Sep 12 23:00:19.215979 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 12 23:00:19.219162 systemd[1]: Reached target sockets.target - Socket Units. Sep 12 23:00:19.220180 systemd[1]: Reached target basic.target - Basic System. Sep 12 23:00:19.222565 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 12 23:00:19.222588 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 12 23:00:19.237225 systemd[1]: Starting chronyd.service - NTP client/server... Sep 12 23:00:19.250161 systemd[1]: Starting containerd.service - containerd container runtime... Sep 12 23:00:19.261966 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Sep 12 23:00:19.264642 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Sep 12 23:00:19.268025 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Sep 12 23:00:19.271865 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 12 23:00:19.279668 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... 
Sep 12 23:00:19.282098 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 12 23:00:19.283615 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Sep 12 23:00:19.286690 systemd[1]: hv_fcopy_uio_daemon.service - Hyper-V FCOPY UIO daemon was skipped because of an unmet condition check (ConditionPathExists=/sys/bus/vmbus/devices/eb765408-105f-49b6-b4aa-c123b64d17d4/uio). Sep 12 23:00:19.289645 systemd[1]: Started hv_kvp_daemon.service - Hyper-V KVP daemon. Sep 12 23:00:19.292023 systemd[1]: hv_vss_daemon.service - Hyper-V VSS daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/vmbus/hv_vss). Sep 12 23:00:19.293955 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 23:00:19.299576 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 12 23:00:19.303326 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 12 23:00:19.306908 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 12 23:00:19.311689 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Sep 12 23:00:19.316881 KVP[1694]: KVP starting; pid is:1694 Sep 12 23:00:19.317823 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Sep 12 23:00:19.329685 kernel: hv_utils: KVP IC version 4.0 Sep 12 23:00:19.325551 KVP[1694]: KVP LIC Version: 3.1 Sep 12 23:00:19.329799 jq[1688]: false Sep 12 23:00:19.324988 systemd[1]: Starting systemd-logind.service - User Login Management... Sep 12 23:00:19.327231 chronyd[1683]: chronyd version 4.7 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG) Sep 12 23:00:19.327895 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). 
Sep 12 23:00:19.328257 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 12 23:00:19.329708 systemd[1]: Starting update-engine.service - Update Engine... Sep 12 23:00:19.336355 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 12 23:00:19.343729 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Sep 12 23:00:19.346605 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 12 23:00:19.346779 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 12 23:00:19.347942 extend-filesystems[1689]: Found /dev/nvme0n1p6 Sep 12 23:00:19.352825 google_oslogin_nss_cache[1693]: oslogin_cache_refresh[1693]: Refreshing passwd entry cache Sep 12 23:00:19.350685 oslogin_cache_refresh[1693]: Refreshing passwd entry cache Sep 12 23:00:19.358004 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 12 23:00:19.358188 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Sep 12 23:00:19.370518 jq[1707]: true Sep 12 23:00:19.370699 extend-filesystems[1689]: Found /dev/nvme0n1p9 Sep 12 23:00:19.376284 extend-filesystems[1689]: Checking size of /dev/nvme0n1p9 Sep 12 23:00:19.377025 chronyd[1683]: Timezone right/UTC failed leap second check, ignoring Sep 12 23:00:19.377222 systemd[1]: Started chronyd.service - NTP client/server. Sep 12 23:00:19.382631 google_oslogin_nss_cache[1693]: oslogin_cache_refresh[1693]: Failure getting users, quitting Sep 12 23:00:19.382631 google_oslogin_nss_cache[1693]: oslogin_cache_refresh[1693]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. 
Sep 12 23:00:19.382631 google_oslogin_nss_cache[1693]: oslogin_cache_refresh[1693]: Refreshing group entry cache Sep 12 23:00:19.377147 chronyd[1683]: Loaded seccomp filter (level 2) Sep 12 23:00:19.379227 oslogin_cache_refresh[1693]: Failure getting users, quitting Sep 12 23:00:19.379241 oslogin_cache_refresh[1693]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 12 23:00:19.379277 oslogin_cache_refresh[1693]: Refreshing group entry cache Sep 12 23:00:19.390428 google_oslogin_nss_cache[1693]: oslogin_cache_refresh[1693]: Failure getting groups, quitting Sep 12 23:00:19.390428 google_oslogin_nss_cache[1693]: oslogin_cache_refresh[1693]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 12 23:00:19.389733 (ntainerd)[1728]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 12 23:00:19.389548 oslogin_cache_refresh[1693]: Failure getting groups, quitting Sep 12 23:00:19.389555 oslogin_cache_refresh[1693]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 12 23:00:19.391956 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Sep 12 23:00:19.392119 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Sep 12 23:00:19.395213 jq[1725]: true Sep 12 23:00:19.404832 systemd[1]: motdgen.service: Deactivated successfully. Sep 12 23:00:19.404985 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 12 23:00:19.418101 extend-filesystems[1689]: Old size kept for /dev/nvme0n1p9 Sep 12 23:00:19.414896 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 12 23:00:19.415063 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 12 23:00:19.434570 update_engine[1706]: I20250912 23:00:19.434504 1706 main.cc:92] Flatcar Update Engine starting Sep 12 23:00:19.435784 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. 
Sep 12 23:00:19.492907 tar[1712]: linux-amd64/LICENSE Sep 12 23:00:19.493187 tar[1712]: linux-amd64/helm Sep 12 23:00:19.537136 systemd-logind[1704]: New seat seat0. Sep 12 23:00:19.543530 systemd-logind[1704]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Sep 12 23:00:19.546220 systemd[1]: Started systemd-logind.service - User Login Management. Sep 12 23:00:19.572465 bash[1763]: Updated "/home/core/.ssh/authorized_keys" Sep 12 23:00:19.574158 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 12 23:00:19.578045 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. Sep 12 23:00:19.763913 sshd_keygen[1738]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 12 23:00:19.807441 dbus-daemon[1686]: [system] SELinux support is enabled Sep 12 23:00:19.807951 systemd[1]: Started dbus.service - D-Bus System Message Bus. Sep 12 23:00:19.815919 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 12 23:00:19.821768 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 12 23:00:19.824173 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 12 23:00:19.824205 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Sep 12 23:00:19.827755 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 12 23:00:19.827773 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. 
Sep 12 23:00:19.831531 update_engine[1706]: I20250912 23:00:19.831351 1706 update_check_scheduler.cc:74] Next update check in 3m54s Sep 12 23:00:19.835383 systemd[1]: Starting waagent.service - Microsoft Azure Linux Agent... Sep 12 23:00:19.837596 systemd[1]: Started update-engine.service - Update Engine. Sep 12 23:00:19.840020 dbus-daemon[1686]: [system] Successfully activated service 'org.freedesktop.systemd1' Sep 12 23:00:19.844935 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 12 23:00:19.858277 systemd[1]: issuegen.service: Deactivated successfully. Sep 12 23:00:19.858438 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 12 23:00:19.864089 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 12 23:00:19.889280 systemd[1]: Started waagent.service - Microsoft Azure Linux Agent. Sep 12 23:00:19.911280 coreos-metadata[1685]: Sep 12 23:00:19.911 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1 Sep 12 23:00:19.916185 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 12 23:00:19.919302 coreos-metadata[1685]: Sep 12 23:00:19.916 INFO Fetch successful Sep 12 23:00:19.919302 coreos-metadata[1685]: Sep 12 23:00:19.916 INFO Fetching http://168.63.129.16/machine/?comp=goalstate: Attempt #1 Sep 12 23:00:19.919775 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 12 23:00:19.921940 coreos-metadata[1685]: Sep 12 23:00:19.920 INFO Fetch successful Sep 12 23:00:19.922516 coreos-metadata[1685]: Sep 12 23:00:19.922 INFO Fetching http://168.63.129.16/machine/aaf60527-731b-457f-8374-7e435e48b665/84a90827%2Dc2bb%2D4444%2D9641%2Dfd2196a2be16.%5Fci%2D4459.0.0%2Da%2D36add7270c?comp=config&type=sharedConfig&incarnation=1: Attempt #1 Sep 12 23:00:19.923992 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Sep 12 23:00:19.927980 systemd[1]: Reached target getty.target - Login Prompts. 
Sep 12 23:00:19.965780 coreos-metadata[1685]: Sep 12 23:00:19.965 INFO Fetch successful Sep 12 23:00:19.965780 coreos-metadata[1685]: Sep 12 23:00:19.965 INFO Fetching http://169.254.169.254/metadata/instance/compute/vmSize?api-version=2017-08-01&format=text: Attempt #1 Sep 12 23:00:19.974040 coreos-metadata[1685]: Sep 12 23:00:19.973 INFO Fetch successful Sep 12 23:00:20.015456 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Sep 12 23:00:20.017230 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 12 23:00:20.040651 tar[1712]: linux-amd64/README.md Sep 12 23:00:20.055433 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 12 23:00:20.063071 locksmithd[1805]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 12 23:00:20.214390 containerd[1728]: time="2025-09-12T23:00:20Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Sep 12 23:00:20.216526 containerd[1728]: time="2025-09-12T23:00:20.215573592Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Sep 12 23:00:20.223500 containerd[1728]: time="2025-09-12T23:00:20.223469404Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="6.547µs" Sep 12 23:00:20.223569 containerd[1728]: time="2025-09-12T23:00:20.223507542Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Sep 12 23:00:20.223569 containerd[1728]: time="2025-09-12T23:00:20.223523462Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Sep 12 23:00:20.223637 containerd[1728]: time="2025-09-12T23:00:20.223626176Z" level=info msg="loading plugin" 
id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Sep 12 23:00:20.223691 containerd[1728]: time="2025-09-12T23:00:20.223674821Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Sep 12 23:00:20.223718 containerd[1728]: time="2025-09-12T23:00:20.223698322Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 12 23:00:20.223750 containerd[1728]: time="2025-09-12T23:00:20.223741098Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 12 23:00:20.223769 containerd[1728]: time="2025-09-12T23:00:20.223751697Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 12 23:00:20.223918 containerd[1728]: time="2025-09-12T23:00:20.223905849Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 12 23:00:20.223956 containerd[1728]: time="2025-09-12T23:00:20.223949634Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 12 23:00:20.223988 containerd[1728]: time="2025-09-12T23:00:20.223980602Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 12 23:00:20.224022 containerd[1728]: time="2025-09-12T23:00:20.224016130Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Sep 12 23:00:20.224094 containerd[1728]: time="2025-09-12T23:00:20.224088092Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Sep 
12 23:00:20.224252 containerd[1728]: time="2025-09-12T23:00:20.224244779Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 12 23:00:20.224307 containerd[1728]: time="2025-09-12T23:00:20.224288817Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 12 23:00:20.224307 containerd[1728]: time="2025-09-12T23:00:20.224301680Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Sep 12 23:00:20.224348 containerd[1728]: time="2025-09-12T23:00:20.224331690Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Sep 12 23:00:20.224565 containerd[1728]: time="2025-09-12T23:00:20.224550334Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Sep 12 23:00:20.224615 containerd[1728]: time="2025-09-12T23:00:20.224604566Z" level=info msg="metadata content store policy set" policy=shared Sep 12 23:00:20.244093 containerd[1728]: time="2025-09-12T23:00:20.244068011Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Sep 12 23:00:20.244234 containerd[1728]: time="2025-09-12T23:00:20.244184595Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Sep 12 23:00:20.244234 containerd[1728]: time="2025-09-12T23:00:20.244203713Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Sep 12 23:00:20.244234 containerd[1728]: time="2025-09-12T23:00:20.244216716Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Sep 12 23:00:20.244344 containerd[1728]: time="2025-09-12T23:00:20.244335783Z" level=info msg="loading plugin" 
id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Sep 12 23:00:20.244454 containerd[1728]: time="2025-09-12T23:00:20.244374915Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Sep 12 23:00:20.244454 containerd[1728]: time="2025-09-12T23:00:20.244390685Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Sep 12 23:00:20.244454 containerd[1728]: time="2025-09-12T23:00:20.244402940Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Sep 12 23:00:20.244454 containerd[1728]: time="2025-09-12T23:00:20.244413281Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Sep 12 23:00:20.244454 containerd[1728]: time="2025-09-12T23:00:20.244422785Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Sep 12 23:00:20.244644 containerd[1728]: time="2025-09-12T23:00:20.244431260Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Sep 12 23:00:20.244644 containerd[1728]: time="2025-09-12T23:00:20.244586099Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Sep 12 23:00:20.244743 containerd[1728]: time="2025-09-12T23:00:20.244735314Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Sep 12 23:00:20.244799 containerd[1728]: time="2025-09-12T23:00:20.244793713Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Sep 12 23:00:20.244829 containerd[1728]: time="2025-09-12T23:00:20.244824834Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Sep 12 23:00:20.244933 containerd[1728]: time="2025-09-12T23:00:20.244860425Z" level=info msg="loading plugin" 
id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 12 23:00:20.244933 containerd[1728]: time="2025-09-12T23:00:20.244874254Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 12 23:00:20.244933 containerd[1728]: time="2025-09-12T23:00:20.244881912Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Sep 12 23:00:20.244933 containerd[1728]: time="2025-09-12T23:00:20.244888681Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 12 23:00:20.244933 containerd[1728]: time="2025-09-12T23:00:20.244895546Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 12 23:00:20.244933 containerd[1728]: time="2025-09-12T23:00:20.244906185Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 12 23:00:20.244933 containerd[1728]: time="2025-09-12T23:00:20.244913792Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 12 23:00:20.244933 containerd[1728]: time="2025-09-12T23:00:20.244920185Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 12 23:00:20.245163 containerd[1728]: time="2025-09-12T23:00:20.245132772Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 12 23:00:20.245163 containerd[1728]: time="2025-09-12T23:00:20.245146428Z" level=info msg="Start snapshots syncer" Sep 12 23:00:20.245272 containerd[1728]: time="2025-09-12T23:00:20.245222377Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 12 23:00:20.245566 containerd[1728]: time="2025-09-12T23:00:20.245538525Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 12 23:00:20.245839 containerd[1728]: time="2025-09-12T23:00:20.245706472Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 12 23:00:20.245839 containerd[1728]: time="2025-09-12T23:00:20.245803417Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Sep 12 23:00:20.246037 containerd[1728]: time="2025-09-12T23:00:20.245975690Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Sep 12 23:00:20.246037 containerd[1728]: time="2025-09-12T23:00:20.245995524Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 12 23:00:20.246037 containerd[1728]: time="2025-09-12T23:00:20.246005539Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 12 23:00:20.246037 containerd[1728]: time="2025-09-12T23:00:20.246015654Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 12 23:00:20.246037 containerd[1728]: time="2025-09-12T23:00:20.246026431Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 12 23:00:20.246211 containerd[1728]: time="2025-09-12T23:00:20.246143296Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 12 23:00:20.246211 containerd[1728]: time="2025-09-12T23:00:20.246155956Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 12 23:00:20.246211 containerd[1728]: time="2025-09-12T23:00:20.246183681Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 12 23:00:20.246211 containerd[1728]: time="2025-09-12T23:00:20.246194155Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 12 23:00:20.246365 containerd[1728]: time="2025-09-12T23:00:20.246202778Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 12 23:00:20.246365 containerd[1728]: time="2025-09-12T23:00:20.246319767Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 12 23:00:20.246365 containerd[1728]: time="2025-09-12T23:00:20.246332719Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 12 23:00:20.246365 containerd[1728]: time="2025-09-12T23:00:20.246340485Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 12 23:00:20.246365 containerd[1728]: time="2025-09-12T23:00:20.246348925Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 12 23:00:20.246592 containerd[1728]: time="2025-09-12T23:00:20.246355792Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 12 23:00:20.246592 containerd[1728]: time="2025-09-12T23:00:20.246530313Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 12 23:00:20.246592 containerd[1728]: time="2025-09-12T23:00:20.246539920Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 12 23:00:20.246592 containerd[1728]: time="2025-09-12T23:00:20.246559740Z" level=info msg="runtime interface created" Sep 12 23:00:20.246592 containerd[1728]: time="2025-09-12T23:00:20.246562926Z" level=info msg="created NRI interface" Sep 12 23:00:20.246592 containerd[1728]: time="2025-09-12T23:00:20.246568054Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 12 23:00:20.246592 containerd[1728]: time="2025-09-12T23:00:20.246576146Z" level=info msg="Connect containerd service" Sep 12 23:00:20.246755 containerd[1728]: time="2025-09-12T23:00:20.246687814Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 12 23:00:20.247369 
containerd[1728]: time="2025-09-12T23:00:20.247348951Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 12 23:00:20.454169 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 23:00:20.586025 (kubelet)[1840]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 23:00:20.997996 containerd[1728]: time="2025-09-12T23:00:20.997740454Z" level=info msg="Start subscribing containerd event" Sep 12 23:00:20.997996 containerd[1728]: time="2025-09-12T23:00:20.997779721Z" level=info msg="Start recovering state" Sep 12 23:00:20.998136 containerd[1728]: time="2025-09-12T23:00:20.998100620Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 12 23:00:20.998207 containerd[1728]: time="2025-09-12T23:00:20.998118820Z" level=info msg="Start event monitor" Sep 12 23:00:20.998244 containerd[1728]: time="2025-09-12T23:00:20.998238723Z" level=info msg="Start cni network conf syncer for default" Sep 12 23:00:20.998341 containerd[1728]: time="2025-09-12T23:00:20.998328098Z" level=info msg="Start streaming server" Sep 12 23:00:20.998385 containerd[1728]: time="2025-09-12T23:00:20.998378203Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 12 23:00:20.998418 containerd[1728]: time="2025-09-12T23:00:20.998412629Z" level=info msg="runtime interface starting up..." Sep 12 23:00:20.998464 containerd[1728]: time="2025-09-12T23:00:20.998439683Z" level=info msg="starting plugins..." Sep 12 23:00:20.998606 containerd[1728]: time="2025-09-12T23:00:20.998596872Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 12 23:00:20.998645 containerd[1728]: time="2025-09-12T23:00:20.998295432Z" level=info msg=serving... 
address=/run/containerd/containerd.sock Sep 12 23:00:20.998801 systemd[1]: Started containerd.service - containerd container runtime. Sep 12 23:00:21.001406 containerd[1728]: time="2025-09-12T23:00:21.001377088Z" level=info msg="containerd successfully booted in 0.787347s" Sep 12 23:00:21.001912 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 12 23:00:21.004897 systemd[1]: Startup finished in 2.808s (kernel) + 12.797s (initrd) + 16.551s (userspace) = 32.157s. Sep 12 23:00:21.041499 kubelet[1840]: E0912 23:00:21.041466 1840 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 23:00:21.044113 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 23:00:21.044321 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 23:00:21.044672 systemd[1]: kubelet.service: Consumed 810ms CPU time, 267.8M memory peak. Sep 12 23:00:21.617708 login[1819]: pam_lastlog(login:session): file /var/log/lastlog is locked/write, retrying Sep 12 23:00:21.617873 login[1818]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Sep 12 23:00:21.622443 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 12 23:00:21.623257 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 12 23:00:21.629206 systemd-logind[1704]: New session 1 of user core. Sep 12 23:00:21.647551 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 12 23:00:21.649364 systemd[1]: Starting user@500.service - User Manager for UID 500... 
Sep 12 23:00:21.668825 (systemd)[1865]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 12 23:00:21.670457 systemd-logind[1704]: New session c1 of user core. Sep 12 23:00:21.868864 waagent[1816]: 2025-09-12T23:00:21.868783Z INFO Daemon Daemon Azure Linux Agent Version: 2.12.0.4 Sep 12 23:00:21.872906 waagent[1816]: 2025-09-12T23:00:21.869320Z INFO Daemon Daemon OS: flatcar 4459.0.0 Sep 12 23:00:21.872906 waagent[1816]: 2025-09-12T23:00:21.869774Z INFO Daemon Daemon Python: 3.11.13 Sep 12 23:00:21.872906 waagent[1816]: 2025-09-12T23:00:21.870193Z INFO Daemon Daemon Run daemon Sep 12 23:00:21.872906 waagent[1816]: 2025-09-12T23:00:21.870387Z INFO Daemon Daemon No RDMA handler exists for distro='Flatcar Container Linux by Kinvolk' version='4459.0.0' Sep 12 23:00:21.872906 waagent[1816]: 2025-09-12T23:00:21.870557Z INFO Daemon Daemon Using waagent for provisioning Sep 12 23:00:21.872906 waagent[1816]: 2025-09-12T23:00:21.870695Z INFO Daemon Daemon Activate resource disk Sep 12 23:00:21.872906 waagent[1816]: 2025-09-12T23:00:21.870838Z INFO Daemon Daemon Searching gen1 prefix 00000000-0001 or gen2 f8b3781a-1e82-4818-a1c3-63d806ec15bb Sep 12 23:00:21.872906 waagent[1816]: 2025-09-12T23:00:21.872196Z INFO Daemon Daemon Found device: None Sep 12 23:00:21.872906 waagent[1816]: 2025-09-12T23:00:21.872276Z ERROR Daemon Daemon Failed to mount resource disk [ResourceDiskError] unable to detect disk topology Sep 12 23:00:21.872906 waagent[1816]: 2025-09-12T23:00:21.872769Z ERROR Daemon Daemon Event: name=WALinuxAgent, op=ActivateResourceDisk, message=[ResourceDiskError] unable to detect disk topology, duration=0 Sep 12 23:00:21.873225 waagent[1816]: 2025-09-12T23:00:21.873189Z INFO Daemon Daemon Clean protocol and wireserver endpoint Sep 12 23:00:21.873328 waagent[1816]: 2025-09-12T23:00:21.873307Z INFO Daemon Daemon Running default provisioning handler Sep 12 23:00:21.877935 waagent[1816]: 2025-09-12T23:00:21.877687Z INFO Daemon Daemon 
Unable to get cloud-init enabled status from systemctl: Command '['systemctl', 'is-enabled', 'cloud-init-local.service']' returned non-zero exit status 4. Sep 12 23:00:21.878241 waagent[1816]: 2025-09-12T23:00:21.878211Z INFO Daemon Daemon Unable to get cloud-init enabled status from service: [Errno 2] No such file or directory: 'service' Sep 12 23:00:21.878322 waagent[1816]: 2025-09-12T23:00:21.878304Z INFO Daemon Daemon cloud-init is enabled: False Sep 12 23:00:21.878384 waagent[1816]: 2025-09-12T23:00:21.878369Z INFO Daemon Daemon Copying ovf-env.xml Sep 12 23:00:22.001009 systemd[1865]: Queued start job for default target default.target. Sep 12 23:00:22.010177 systemd[1865]: Created slice app.slice - User Application Slice. Sep 12 23:00:22.010576 systemd[1865]: Reached target paths.target - Paths. Sep 12 23:00:22.010660 systemd[1865]: Reached target timers.target - Timers. Sep 12 23:00:22.013590 systemd[1865]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 12 23:00:22.018144 waagent[1816]: 2025-09-12T23:00:22.018099Z INFO Daemon Daemon Successfully mounted dvd Sep 12 23:00:22.024685 systemd[1865]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 12 23:00:22.024853 systemd[1865]: Reached target sockets.target - Sockets. Sep 12 23:00:22.024962 systemd[1865]: Reached target basic.target - Basic System. Sep 12 23:00:22.025129 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 12 23:00:22.025203 systemd[1865]: Reached target default.target - Main User Target. Sep 12 23:00:22.025269 systemd[1865]: Startup finished in 350ms. Sep 12 23:00:22.032633 systemd[1]: Started session-1.scope - Session 1 of User core. 
Sep 12 23:00:22.046004 waagent[1816]: 2025-09-12T23:00:22.042391Z INFO Daemon Daemon Detect protocol endpoint Sep 12 23:00:22.046004 waagent[1816]: 2025-09-12T23:00:22.043968Z INFO Daemon Daemon Clean protocol and wireserver endpoint Sep 12 23:00:22.046004 waagent[1816]: 2025-09-12T23:00:22.044193Z INFO Daemon Daemon WireServer endpoint is not found. Rerun dhcp handler Sep 12 23:00:22.046004 waagent[1816]: 2025-09-12T23:00:22.044410Z INFO Daemon Daemon Test for route to 168.63.129.16 Sep 12 23:00:22.046004 waagent[1816]: 2025-09-12T23:00:22.044548Z INFO Daemon Daemon Route to 168.63.129.16 exists Sep 12 23:00:22.046004 waagent[1816]: 2025-09-12T23:00:22.044697Z INFO Daemon Daemon Wire server endpoint:168.63.129.16 Sep 12 23:00:22.042654 systemd[1]: mnt-cdrom-secure.mount: Deactivated successfully. Sep 12 23:00:22.062737 waagent[1816]: 2025-09-12T23:00:22.062712Z INFO Daemon Daemon Fabric preferred wire protocol version:2015-04-05 Sep 12 23:00:22.063733 waagent[1816]: 2025-09-12T23:00:22.063223Z INFO Daemon Daemon Wire protocol version:2012-11-30 Sep 12 23:00:22.063733 waagent[1816]: 2025-09-12T23:00:22.063393Z INFO Daemon Daemon Server preferred version:2015-04-05 Sep 12 23:00:22.181733 waagent[1816]: 2025-09-12T23:00:22.181660Z INFO Daemon Daemon Initializing goal state during protocol detection Sep 12 23:00:22.183860 waagent[1816]: 2025-09-12T23:00:22.181896Z INFO Daemon Daemon Forcing an update of the goal state. 
Sep 12 23:00:22.186235 waagent[1816]: 2025-09-12T23:00:22.186204Z INFO Daemon Fetched a new incarnation for the WireServer goal state [incarnation 1] Sep 12 23:00:22.197416 waagent[1816]: 2025-09-12T23:00:22.197389Z INFO Daemon Daemon HostGAPlugin version: 1.0.8.175 Sep 12 23:00:22.201999 waagent[1816]: 2025-09-12T23:00:22.198214Z INFO Daemon Sep 12 23:00:22.201999 waagent[1816]: 2025-09-12T23:00:22.198288Z INFO Daemon Fetched new vmSettings [HostGAPlugin correlation ID: fd659319-1d88-477e-b4ae-b85781953214 eTag: 13936071761020036050 source: Fabric] Sep 12 23:00:22.201999 waagent[1816]: 2025-09-12T23:00:22.198721Z INFO Daemon The vmSettings originated via Fabric; will ignore them. Sep 12 23:00:22.201999 waagent[1816]: 2025-09-12T23:00:22.198969Z INFO Daemon Sep 12 23:00:22.201999 waagent[1816]: 2025-09-12T23:00:22.199178Z INFO Daemon Fetching full goal state from the WireServer [incarnation 1] Sep 12 23:00:22.206090 waagent[1816]: 2025-09-12T23:00:22.206066Z INFO Daemon Daemon Downloading artifacts profile blob Sep 12 23:00:22.350869 waagent[1816]: 2025-09-12T23:00:22.350826Z INFO Daemon Downloaded certificate {'thumbprint': '8C894D66BF58F955AC61CFE8A8E6C719D5F8F0D6', 'hasPrivateKey': True} Sep 12 23:00:22.352965 waagent[1816]: 2025-09-12T23:00:22.352932Z INFO Daemon Fetch goal state completed Sep 12 23:00:22.397288 waagent[1816]: 2025-09-12T23:00:22.397226Z INFO Daemon Daemon Starting provisioning Sep 12 23:00:22.397678 waagent[1816]: 2025-09-12T23:00:22.397459Z INFO Daemon Daemon Handle ovf-env.xml. 
Sep 12 23:00:22.398973 waagent[1816]: 2025-09-12T23:00:22.398951Z INFO Daemon Daemon Set hostname [ci-4459.0.0-a-36add7270c] Sep 12 23:00:22.443319 waagent[1816]: 2025-09-12T23:00:22.443282Z INFO Daemon Daemon Publish hostname [ci-4459.0.0-a-36add7270c] Sep 12 23:00:22.444688 waagent[1816]: 2025-09-12T23:00:22.444657Z INFO Daemon Daemon Examine /proc/net/route for primary interface Sep 12 23:00:22.446206 waagent[1816]: 2025-09-12T23:00:22.446178Z INFO Daemon Daemon Primary interface is [eth0] Sep 12 23:00:22.452050 systemd-networkd[1575]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 23:00:22.452055 systemd-networkd[1575]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 12 23:00:22.452076 systemd-networkd[1575]: eth0: DHCP lease lost Sep 12 23:00:22.452763 waagent[1816]: 2025-09-12T23:00:22.452723Z INFO Daemon Daemon Create user account if not exists Sep 12 23:00:22.454873 waagent[1816]: 2025-09-12T23:00:22.453235Z INFO Daemon Daemon User core already exists, skip useradd Sep 12 23:00:22.454873 waagent[1816]: 2025-09-12T23:00:22.453444Z INFO Daemon Daemon Configure sudoer Sep 12 23:00:22.472517 systemd-networkd[1575]: eth0: DHCPv4 address 10.200.8.17/24, gateway 10.200.8.1 acquired from 168.63.129.16 Sep 12 23:00:22.492284 waagent[1816]: 2025-09-12T23:00:22.492237Z INFO Daemon Daemon Configure sshd Sep 12 23:00:22.497228 waagent[1816]: 2025-09-12T23:00:22.497185Z INFO Daemon Daemon Added a configuration snippet disabling SSH password-based authentication methods. It also configures SSH client probing to keep connections alive. Sep 12 23:00:22.497927 waagent[1816]: 2025-09-12T23:00:22.497657Z INFO Daemon Daemon Deploy ssh public key. Sep 12 23:00:22.618978 login[1819]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Sep 12 23:00:22.622480 systemd-logind[1704]: New session 2 of user core. 
Sep 12 23:00:22.628635 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 12 23:00:23.619426 waagent[1816]: 2025-09-12T23:00:23.619397Z INFO Daemon Daemon Provisioning complete Sep 12 23:00:23.628779 waagent[1816]: 2025-09-12T23:00:23.628754Z INFO Daemon Daemon RDMA capabilities are not enabled, skipping Sep 12 23:00:23.631959 waagent[1816]: 2025-09-12T23:00:23.629085Z INFO Daemon Daemon End of log to /dev/console. The agent will now check for updates and then will process extensions. Sep 12 23:00:23.631959 waagent[1816]: 2025-09-12T23:00:23.629302Z INFO Daemon Daemon Installed Agent WALinuxAgent-2.12.0.4 is the most current agent Sep 12 23:00:23.718843 waagent[1914]: 2025-09-12T23:00:23.718780Z INFO ExtHandler ExtHandler Azure Linux Agent (Goal State Agent version 2.12.0.4) Sep 12 23:00:23.719027 waagent[1914]: 2025-09-12T23:00:23.718865Z INFO ExtHandler ExtHandler OS: flatcar 4459.0.0 Sep 12 23:00:23.719027 waagent[1914]: 2025-09-12T23:00:23.718903Z INFO ExtHandler ExtHandler Python: 3.11.13 Sep 12 23:00:23.719027 waagent[1914]: 2025-09-12T23:00:23.718939Z INFO ExtHandler ExtHandler CPU Arch: x86_64 Sep 12 23:00:23.766983 waagent[1914]: 2025-09-12T23:00:23.766934Z INFO ExtHandler ExtHandler Distro: flatcar-4459.0.0; OSUtil: FlatcarUtil; AgentService: waagent; Python: 3.11.13; Arch: x86_64; systemd: True; LISDrivers: Absent; logrotate: logrotate 3.22.0; Sep 12 23:00:23.767122 waagent[1914]: 2025-09-12T23:00:23.767094Z INFO ExtHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Sep 12 23:00:23.767185 waagent[1914]: 2025-09-12T23:00:23.767147Z INFO ExtHandler ExtHandler Wire server endpoint:168.63.129.16 Sep 12 23:00:23.775175 waagent[1914]: 2025-09-12T23:00:23.775134Z INFO ExtHandler Fetched a new incarnation for the WireServer goal state [incarnation 1] Sep 12 23:00:23.790354 waagent[1914]: 2025-09-12T23:00:23.790326Z INFO ExtHandler ExtHandler HostGAPlugin version: 1.0.8.175 Sep 12 23:00:23.790671 waagent[1914]: 
2025-09-12T23:00:23.790646Z INFO ExtHandler Sep 12 23:00:23.790719 waagent[1914]: 2025-09-12T23:00:23.790694Z INFO ExtHandler Fetched new vmSettings [HostGAPlugin correlation ID: c59e40cb-b099-4418-aaff-d5e218f830e8 eTag: 13936071761020036050 source: Fabric] Sep 12 23:00:23.790899 waagent[1914]: 2025-09-12T23:00:23.790876Z INFO ExtHandler The vmSettings originated via Fabric; will ignore them. Sep 12 23:00:23.791181 waagent[1914]: 2025-09-12T23:00:23.791156Z INFO ExtHandler Sep 12 23:00:23.791212 waagent[1914]: 2025-09-12T23:00:23.791192Z INFO ExtHandler Fetching full goal state from the WireServer [incarnation 1] Sep 12 23:00:23.797746 waagent[1914]: 2025-09-12T23:00:23.797721Z INFO ExtHandler ExtHandler Downloading artifacts profile blob Sep 12 23:00:23.852808 waagent[1914]: 2025-09-12T23:00:23.852765Z INFO ExtHandler Downloaded certificate {'thumbprint': '8C894D66BF58F955AC61CFE8A8E6C719D5F8F0D6', 'hasPrivateKey': True} Sep 12 23:00:23.853083 waagent[1914]: 2025-09-12T23:00:23.853059Z INFO ExtHandler Fetch goal state completed Sep 12 23:00:23.862997 waagent[1914]: 2025-09-12T23:00:23.862957Z INFO ExtHandler ExtHandler OpenSSL version: OpenSSL 3.4.2 1 Jul 2025 (Library: OpenSSL 3.4.2 1 Jul 2025) Sep 12 23:00:23.866419 waagent[1914]: 2025-09-12T23:00:23.866373Z INFO ExtHandler ExtHandler WALinuxAgent-2.12.0.4 running as process 1914 Sep 12 23:00:23.866534 waagent[1914]: 2025-09-12T23:00:23.866475Z INFO ExtHandler ExtHandler ******** AutoUpdate.Enabled is set to False, not processing the operation ******** Sep 12 23:00:23.866740 waagent[1914]: 2025-09-12T23:00:23.866717Z INFO ExtHandler ExtHandler ******** AutoUpdate.UpdateToLatestVersion is set to False, not processing the operation ******** Sep 12 23:00:23.867619 waagent[1914]: 2025-09-12T23:00:23.867594Z INFO ExtHandler ExtHandler [CGI] Cgroup monitoring is not supported on ['flatcar', '4459.0.0', '', 'Flatcar Container Linux by Kinvolk'] Sep 12 23:00:23.867865 waagent[1914]: 2025-09-12T23:00:23.867843Z INFO 
ExtHandler ExtHandler [CGI] Agent will reset the quotas in case distro: ['flatcar', '4459.0.0', '', 'Flatcar Container Linux by Kinvolk'] went from supported to unsupported Sep 12 23:00:23.867949 waagent[1914]: 2025-09-12T23:00:23.867932Z INFO ExtHandler ExtHandler [CGI] Agent cgroups enabled: False Sep 12 23:00:23.868280 waagent[1914]: 2025-09-12T23:00:23.868261Z INFO ExtHandler ExtHandler Starting setup for Persistent firewall rules Sep 12 23:00:23.939984 waagent[1914]: 2025-09-12T23:00:23.939926Z INFO ExtHandler ExtHandler Firewalld service not running/unavailable, trying to set up waagent-network-setup.service Sep 12 23:00:23.940084 waagent[1914]: 2025-09-12T23:00:23.940060Z INFO ExtHandler ExtHandler Successfully updated the Binary file /var/lib/waagent/waagent-network-setup.py for firewall setup Sep 12 23:00:23.944511 waagent[1914]: 2025-09-12T23:00:23.944271Z INFO ExtHandler ExtHandler Service: waagent-network-setup.service not enabled. Adding it now Sep 12 23:00:23.948399 systemd[1]: Reload requested from client PID 1929 ('systemctl') (unit waagent.service)... Sep 12 23:00:23.948425 systemd[1]: Reloading... Sep 12 23:00:24.016511 zram_generator::config[1968]: No configuration found. Sep 12 23:00:24.074805 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#256 cmd 0x4a status: scsi 0x0 srb 0x20 hv 0xc0000001 Sep 12 23:00:24.175094 systemd[1]: Reloading finished in 226 ms. Sep 12 23:00:24.190191 waagent[1914]: 2025-09-12T23:00:24.189637Z INFO ExtHandler ExtHandler Successfully added and enabled the waagent-network-setup.service Sep 12 23:00:24.190191 waagent[1914]: 2025-09-12T23:00:24.189717Z INFO ExtHandler ExtHandler Persistent firewall rules setup successfully Sep 12 23:00:24.553013 waagent[1914]: 2025-09-12T23:00:24.552942Z INFO ExtHandler ExtHandler DROP rule is not available which implies no firewall rules are set yet. Environment thread will set it up. 
Sep 12 23:00:24.553172 waagent[1914]: 2025-09-12T23:00:24.553150Z INFO ExtHandler ExtHandler Checking if log collection is allowed at this time [False]. All three conditions must be met: 1. configuration enabled [True], 2. cgroups v1 enabled [False] OR cgroups v2 is in use and v2 resource limiting configuration enabled [False], 3. python supported: [True] Sep 12 23:00:24.553702 waagent[1914]: 2025-09-12T23:00:24.553674Z INFO MonitorHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Sep 12 23:00:24.553859 waagent[1914]: 2025-09-12T23:00:24.553705Z INFO ExtHandler ExtHandler Starting env monitor service. Sep 12 23:00:24.553859 waagent[1914]: 2025-09-12T23:00:24.553820Z INFO MonitorHandler ExtHandler Wire server endpoint:168.63.129.16 Sep 12 23:00:24.554056 waagent[1914]: 2025-09-12T23:00:24.554031Z INFO ExtHandler ExtHandler Start SendTelemetryHandler service. Sep 12 23:00:24.554229 waagent[1914]: 2025-09-12T23:00:24.554201Z INFO SendTelemetryHandler ExtHandler Successfully started the SendTelemetryHandler thread Sep 12 23:00:24.554504 waagent[1914]: 2025-09-12T23:00:24.554466Z INFO MonitorHandler ExtHandler Monitor.NetworkConfigurationChanges is disabled. Sep 12 23:00:24.554574 waagent[1914]: 2025-09-12T23:00:24.554540Z INFO EnvHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Sep 12 23:00:24.554629 waagent[1914]: 2025-09-12T23:00:24.554587Z INFO ExtHandler ExtHandler Start Extension Telemetry service. 
Sep 12 23:00:24.554921 waagent[1914]: 2025-09-12T23:00:24.554889Z INFO TelemetryEventsCollector ExtHandler Extension Telemetry pipeline enabled: True Sep 12 23:00:24.554957 waagent[1914]: 2025-09-12T23:00:24.554932Z INFO EnvHandler ExtHandler Wire server endpoint:168.63.129.16 Sep 12 23:00:24.555046 waagent[1914]: 2025-09-12T23:00:24.555028Z INFO EnvHandler ExtHandler Configure routes Sep 12 23:00:24.555141 waagent[1914]: 2025-09-12T23:00:24.555126Z INFO EnvHandler ExtHandler Gateway:None Sep 12 23:00:24.555141 waagent[1914]: 2025-09-12T23:00:24.555088Z INFO ExtHandler ExtHandler Goal State Period: 6 sec. This indicates how often the agent checks for new goal states and reports status. Sep 12 23:00:24.555326 waagent[1914]: 2025-09-12T23:00:24.555310Z INFO EnvHandler ExtHandler Routes:None Sep 12 23:00:24.555326 waagent[1914]: 2025-09-12T23:00:24.555364Z INFO TelemetryEventsCollector ExtHandler Successfully started the TelemetryEventsCollector thread Sep 12 23:00:24.555883 waagent[1914]: 2025-09-12T23:00:24.555861Z INFO MonitorHandler ExtHandler Routing table from /proc/net/route: Sep 12 23:00:24.555883 waagent[1914]: Iface Destination Gateway Flags RefCnt Use Metric Mask MTU Window IRTT Sep 12 23:00:24.555883 waagent[1914]: eth0 00000000 0108C80A 0003 0 0 1024 00000000 0 0 0 Sep 12 23:00:24.555883 waagent[1914]: eth0 0008C80A 00000000 0001 0 0 1024 00FFFFFF 0 0 0 Sep 12 23:00:24.555883 waagent[1914]: eth0 0108C80A 00000000 0005 0 0 1024 FFFFFFFF 0 0 0 Sep 12 23:00:24.555883 waagent[1914]: eth0 10813FA8 0108C80A 0007 0 0 1024 FFFFFFFF 0 0 0 Sep 12 23:00:24.555883 waagent[1914]: eth0 FEA9FEA9 0108C80A 0007 0 0 1024 FFFFFFFF 0 0 0 Sep 12 23:00:24.560640 waagent[1914]: 2025-09-12T23:00:24.560611Z INFO ExtHandler ExtHandler Sep 12 23:00:24.560706 waagent[1914]: 2025-09-12T23:00:24.560664Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState started [incarnation_1 channel: WireServer source: Fabric activity: 7cb4e6a7-20ee-449c-b484-52cff518aa39 correlation 
9d275785-1a22-460c-be60-cbc9a5ed4701 created: 2025-09-12T22:59:12.069916Z] Sep 12 23:00:24.560889 waagent[1914]: 2025-09-12T23:00:24.560868Z INFO ExtHandler ExtHandler No extension handlers found, not processing anything. Sep 12 23:00:24.561196 waagent[1914]: 2025-09-12T23:00:24.561178Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState completed [incarnation_1 0 ms] Sep 12 23:00:24.589264 waagent[1914]: 2025-09-12T23:00:24.588910Z WARNING ExtHandler ExtHandler Failed to get firewall packets: 'iptables -w -t security -L OUTPUT --zero OUTPUT -nxv' failed: 2 (iptables v1.8.11 (nf_tables): Illegal option `--numeric' with this command Sep 12 23:00:24.589264 waagent[1914]: Try `iptables -h' or 'iptables --help' for more information.) Sep 12 23:00:24.589264 waagent[1914]: 2025-09-12T23:00:24.589214Z INFO ExtHandler ExtHandler [HEARTBEAT] Agent WALinuxAgent-2.12.0.4 is running as the goal state agent [DEBUG HeartbeatCounter: 0;HeartbeatId: F47E563E-538B-4826-9D65-18CCFACEAAED;DroppedPackets: -1;UpdateGSErrors: 0;AutoUpdate: 0;UpdateMode: SelfUpdate;] Sep 12 23:00:24.608444 waagent[1914]: 2025-09-12T23:00:24.608405Z INFO MonitorHandler ExtHandler Network interfaces: Sep 12 23:00:24.608444 waagent[1914]: Executing ['ip', '-a', '-o', 'link']: Sep 12 23:00:24.608444 waagent[1914]: 1: lo: mtu 65536 qdisc noqueue state UNKNOWN mode DEFAULT group default qlen 1000\ link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 Sep 12 23:00:24.608444 waagent[1914]: 2: eth0: mtu 1500 qdisc mq state UP mode DEFAULT group default qlen 1000\ link/ether 7c:ed:8d:40:ac:7c brd ff:ff:ff:ff:ff:ff\ alias Network Device Sep 12 23:00:24.608444 waagent[1914]: 3: enP30832s1: mtu 1500 qdisc mq master eth0 state UP mode DEFAULT group default qlen 1000\ link/ether 7c:ed:8d:40:ac:7c brd ff:ff:ff:ff:ff:ff\ altname enP30832p0s0 Sep 12 23:00:24.608444 waagent[1914]: Executing ['ip', '-4', '-a', '-o', 'address']: Sep 12 23:00:24.608444 waagent[1914]: 1: lo inet 127.0.0.1/8 scope host lo\ valid_lft forever 
preferred_lft forever Sep 12 23:00:24.608444 waagent[1914]: 2: eth0 inet 10.200.8.17/24 metric 1024 brd 10.200.8.255 scope global eth0\ valid_lft forever preferred_lft forever Sep 12 23:00:24.608444 waagent[1914]: Executing ['ip', '-6', '-a', '-o', 'address']: Sep 12 23:00:24.608444 waagent[1914]: 1: lo inet6 ::1/128 scope host noprefixroute \ valid_lft forever preferred_lft forever Sep 12 23:00:24.608444 waagent[1914]: 2: eth0 inet6 fe80::7eed:8dff:fe40:ac7c/64 scope link proto kernel_ll \ valid_lft forever preferred_lft forever Sep 12 23:00:24.709284 waagent[1914]: 2025-09-12T23:00:24.709242Z INFO EnvHandler ExtHandler Created firewall rules for the Azure Fabric: Sep 12 23:00:24.709284 waagent[1914]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes) Sep 12 23:00:24.709284 waagent[1914]: pkts bytes target prot opt in out source destination Sep 12 23:00:24.709284 waagent[1914]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes) Sep 12 23:00:24.709284 waagent[1914]: pkts bytes target prot opt in out source destination Sep 12 23:00:24.709284 waagent[1914]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes) Sep 12 23:00:24.709284 waagent[1914]: pkts bytes target prot opt in out source destination Sep 12 23:00:24.709284 waagent[1914]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53 Sep 12 23:00:24.709284 waagent[1914]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0 Sep 12 23:00:24.709284 waagent[1914]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW Sep 12 23:00:24.711563 waagent[1914]: 2025-09-12T23:00:24.711487Z INFO EnvHandler ExtHandler Current Firewall rules: Sep 12 23:00:24.711563 waagent[1914]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes) Sep 12 23:00:24.711563 waagent[1914]: pkts bytes target prot opt in out source destination Sep 12 23:00:24.711563 waagent[1914]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes) Sep 12 23:00:24.711563 waagent[1914]: pkts bytes target prot opt in out source destination Sep 12 23:00:24.711563 
waagent[1914]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes) Sep 12 23:00:24.711563 waagent[1914]: pkts bytes target prot opt in out source destination Sep 12 23:00:24.711563 waagent[1914]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53 Sep 12 23:00:24.711563 waagent[1914]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0 Sep 12 23:00:24.711563 waagent[1914]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW Sep 12 23:00:28.849561 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 12 23:00:28.850416 systemd[1]: Started sshd@0-10.200.8.17:22-10.200.16.10:55702.service - OpenSSH per-connection server daemon (10.200.16.10:55702). Sep 12 23:00:29.634137 sshd[2059]: Accepted publickey for core from 10.200.16.10 port 55702 ssh2: RSA SHA256:iICXlNsXx8loOwZfJIFtboVHRCxNi5XgLL8FA7cZeOk Sep 12 23:00:29.634969 sshd-session[2059]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:00:29.638851 systemd-logind[1704]: New session 3 of user core. Sep 12 23:00:29.645613 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 12 23:00:30.177383 systemd[1]: Started sshd@1-10.200.8.17:22-10.200.16.10:45498.service - OpenSSH per-connection server daemon (10.200.16.10:45498). Sep 12 23:00:30.804791 sshd[2065]: Accepted publickey for core from 10.200.16.10 port 45498 ssh2: RSA SHA256:iICXlNsXx8loOwZfJIFtboVHRCxNi5XgLL8FA7cZeOk Sep 12 23:00:30.805608 sshd-session[2065]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:00:30.809246 systemd-logind[1704]: New session 4 of user core. Sep 12 23:00:30.814642 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 12 23:00:31.085763 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 12 23:00:31.086720 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
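The MonitorHandler routing-table dump earlier in the log is the raw /proc/net/route format, in which Destination, Gateway, and Mask are little-endian hexadecimal IPv4 values. A small stdlib-only decoder (not part of waagent) for those fields:

```python
import socket
import struct

def decode_route_hex(hex_field: str) -> str:
    """Convert a little-endian hex field from /proc/net/route
    (e.g. '0108C80A') to dotted-quad notation."""
    return socket.inet_ntoa(struct.pack("<I", int(hex_field, 16)))
```

Applied to the dump above: the default route's gateway `0108C80A` decodes to 10.200.8.1, and the mask `00FFFFFF` to 255.255.255.0, matching the eth0 address 10.200.8.17/24 reported later by `ip -4 -o address`.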
Sep 12 23:00:31.243314 sshd[2068]: Connection closed by 10.200.16.10 port 45498 Sep 12 23:00:31.243654 sshd-session[2065]: pam_unix(sshd:session): session closed for user core Sep 12 23:00:31.246007 systemd[1]: sshd@1-10.200.8.17:22-10.200.16.10:45498.service: Deactivated successfully. Sep 12 23:00:31.247141 systemd[1]: session-4.scope: Deactivated successfully. Sep 12 23:00:31.247725 systemd-logind[1704]: Session 4 logged out. Waiting for processes to exit. Sep 12 23:00:31.248769 systemd-logind[1704]: Removed session 4. Sep 12 23:00:31.358232 systemd[1]: Started sshd@2-10.200.8.17:22-10.200.16.10:45514.service - OpenSSH per-connection server daemon (10.200.16.10:45514). Sep 12 23:00:31.709193 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 23:00:31.717739 (kubelet)[2085]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 23:00:31.747137 kubelet[2085]: E0912 23:00:31.747107 2085 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 23:00:31.749700 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 23:00:31.749806 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 23:00:31.750024 systemd[1]: kubelet.service: Consumed 111ms CPU time, 108.5M memory peak. Sep 12 23:00:31.971935 sshd[2077]: Accepted publickey for core from 10.200.16.10 port 45514 ssh2: RSA SHA256:iICXlNsXx8loOwZfJIFtboVHRCxNi5XgLL8FA7cZeOk Sep 12 23:00:31.972805 sshd-session[2077]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:00:31.976270 systemd-logind[1704]: New session 5 of user core. 
Sep 12 23:00:31.982624 systemd[1]: Started session-5.scope - Session 5 of User core. Sep 12 23:00:32.408375 sshd[2092]: Connection closed by 10.200.16.10 port 45514 Sep 12 23:00:32.408712 sshd-session[2077]: pam_unix(sshd:session): session closed for user core Sep 12 23:00:32.410891 systemd[1]: sshd@2-10.200.8.17:22-10.200.16.10:45514.service: Deactivated successfully. Sep 12 23:00:32.411964 systemd[1]: session-5.scope: Deactivated successfully. Sep 12 23:00:32.412580 systemd-logind[1704]: Session 5 logged out. Waiting for processes to exit. Sep 12 23:00:32.413358 systemd-logind[1704]: Removed session 5. Sep 12 23:00:32.520248 systemd[1]: Started sshd@3-10.200.8.17:22-10.200.16.10:45518.service - OpenSSH per-connection server daemon (10.200.16.10:45518). Sep 12 23:00:33.143711 sshd[2098]: Accepted publickey for core from 10.200.16.10 port 45518 ssh2: RSA SHA256:iICXlNsXx8loOwZfJIFtboVHRCxNi5XgLL8FA7cZeOk Sep 12 23:00:33.144399 sshd-session[2098]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:00:33.147863 systemd-logind[1704]: New session 6 of user core. Sep 12 23:00:33.157645 systemd[1]: Started session-6.scope - Session 6 of User core. Sep 12 23:00:33.582856 sshd[2101]: Connection closed by 10.200.16.10 port 45518 Sep 12 23:00:33.583342 sshd-session[2098]: pam_unix(sshd:session): session closed for user core Sep 12 23:00:33.585579 systemd[1]: sshd@3-10.200.8.17:22-10.200.16.10:45518.service: Deactivated successfully. Sep 12 23:00:33.586617 systemd[1]: session-6.scope: Deactivated successfully. Sep 12 23:00:33.587368 systemd-logind[1704]: Session 6 logged out. Waiting for processes to exit. Sep 12 23:00:33.588075 systemd-logind[1704]: Removed session 6. Sep 12 23:00:33.691195 systemd[1]: Started sshd@4-10.200.8.17:22-10.200.16.10:45526.service - OpenSSH per-connection server daemon (10.200.16.10:45526). 
Sep 12 23:00:34.319669 sshd[2107]: Accepted publickey for core from 10.200.16.10 port 45526 ssh2: RSA SHA256:iICXlNsXx8loOwZfJIFtboVHRCxNi5XgLL8FA7cZeOk Sep 12 23:00:34.320337 sshd-session[2107]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:00:34.323700 systemd-logind[1704]: New session 7 of user core. Sep 12 23:00:34.334641 systemd[1]: Started session-7.scope - Session 7 of User core. Sep 12 23:00:34.830561 sudo[2111]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 12 23:00:34.830747 sudo[2111]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 23:00:34.855057 sudo[2111]: pam_unix(sudo:session): session closed for user root Sep 12 23:00:34.954653 sshd[2110]: Connection closed by 10.200.16.10 port 45526 Sep 12 23:00:34.955066 sshd-session[2107]: pam_unix(sshd:session): session closed for user core Sep 12 23:00:34.957304 systemd[1]: sshd@4-10.200.8.17:22-10.200.16.10:45526.service: Deactivated successfully. Sep 12 23:00:34.958305 systemd[1]: session-7.scope: Deactivated successfully. Sep 12 23:00:34.959216 systemd-logind[1704]: Session 7 logged out. Waiting for processes to exit. Sep 12 23:00:34.960129 systemd-logind[1704]: Removed session 7. Sep 12 23:00:35.067276 systemd[1]: Started sshd@5-10.200.8.17:22-10.200.16.10:45542.service - OpenSSH per-connection server daemon (10.200.16.10:45542). Sep 12 23:00:35.689710 sshd[2117]: Accepted publickey for core from 10.200.16.10 port 45542 ssh2: RSA SHA256:iICXlNsXx8loOwZfJIFtboVHRCxNi5XgLL8FA7cZeOk Sep 12 23:00:35.690548 sshd-session[2117]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:00:35.693535 systemd-logind[1704]: New session 8 of user core. Sep 12 23:00:35.704626 systemd[1]: Started session-8.scope - Session 8 of User core. 
Sep 12 23:00:36.030214 sudo[2122]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 12 23:00:36.030395 sudo[2122]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 23:00:36.035246 sudo[2122]: pam_unix(sudo:session): session closed for user root Sep 12 23:00:36.038460 sudo[2121]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Sep 12 23:00:36.038650 sudo[2121]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 23:00:36.044632 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 12 23:00:36.072975 augenrules[2144]: No rules Sep 12 23:00:36.073782 systemd[1]: audit-rules.service: Deactivated successfully. Sep 12 23:00:36.073933 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 12 23:00:36.074785 sudo[2121]: pam_unix(sudo:session): session closed for user root Sep 12 23:00:36.173730 sshd[2120]: Connection closed by 10.200.16.10 port 45542 Sep 12 23:00:36.174031 sshd-session[2117]: pam_unix(sshd:session): session closed for user core Sep 12 23:00:36.175910 systemd[1]: sshd@5-10.200.8.17:22-10.200.16.10:45542.service: Deactivated successfully. Sep 12 23:00:36.176948 systemd[1]: session-8.scope: Deactivated successfully. Sep 12 23:00:36.177838 systemd-logind[1704]: Session 8 logged out. Waiting for processes to exit. Sep 12 23:00:36.178774 systemd-logind[1704]: Removed session 8. Sep 12 23:00:36.282349 systemd[1]: Started sshd@6-10.200.8.17:22-10.200.16.10:45546.service - OpenSSH per-connection server daemon (10.200.16.10:45546). 
Sep 12 23:00:36.903603 sshd[2153]: Accepted publickey for core from 10.200.16.10 port 45546 ssh2: RSA SHA256:iICXlNsXx8loOwZfJIFtboVHRCxNi5XgLL8FA7cZeOk Sep 12 23:00:36.904275 sshd-session[2153]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:00:36.907654 systemd-logind[1704]: New session 9 of user core. Sep 12 23:00:36.913641 systemd[1]: Started session-9.scope - Session 9 of User core. Sep 12 23:00:37.243747 sudo[2157]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 12 23:00:37.243928 sudo[2157]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 23:00:38.614190 systemd[1]: Starting docker.service - Docker Application Container Engine... Sep 12 23:00:38.622776 (dockerd)[2176]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 12 23:00:39.648184 dockerd[2176]: time="2025-09-12T23:00:39.648134337Z" level=info msg="Starting up" Sep 12 23:00:39.648747 dockerd[2176]: time="2025-09-12T23:00:39.648724619Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Sep 12 23:00:39.657668 dockerd[2176]: time="2025-09-12T23:00:39.657637175Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Sep 12 23:00:39.816351 dockerd[2176]: time="2025-09-12T23:00:39.816326318Z" level=info msg="Loading containers: start." Sep 12 23:00:39.875528 kernel: Initializing XFRM netlink socket Sep 12 23:00:40.287968 systemd-networkd[1575]: docker0: Link UP Sep 12 23:00:40.300888 dockerd[2176]: time="2025-09-12T23:00:40.300866727Z" level=info msg="Loading containers: done." 
Sep 12 23:00:40.329957 dockerd[2176]: time="2025-09-12T23:00:40.329933215Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 12 23:00:40.330054 dockerd[2176]: time="2025-09-12T23:00:40.329987754Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Sep 12 23:00:40.330054 dockerd[2176]: time="2025-09-12T23:00:40.330043430Z" level=info msg="Initializing buildkit" Sep 12 23:00:40.370005 dockerd[2176]: time="2025-09-12T23:00:40.369982029Z" level=info msg="Completed buildkit initialization" Sep 12 23:00:40.375169 dockerd[2176]: time="2025-09-12T23:00:40.375140839Z" level=info msg="Daemon has completed initialization" Sep 12 23:00:40.375319 dockerd[2176]: time="2025-09-12T23:00:40.375182545Z" level=info msg="API listen on /run/docker.sock" Sep 12 23:00:40.375451 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 12 23:00:41.439656 containerd[1728]: time="2025-09-12T23:00:41.439625064Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.5\"" Sep 12 23:00:41.835838 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Sep 12 23:00:41.836930 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 23:00:42.262236 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Sep 12 23:00:42.266746 (kubelet)[2393]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 23:00:42.310847 kubelet[2393]: E0912 23:00:42.310818 2393 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 23:00:42.312284 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 23:00:42.312373 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 23:00:42.312756 systemd[1]: kubelet.service: Consumed 109ms CPU time, 110.7M memory peak. Sep 12 23:00:42.547738 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1283214183.mount: Deactivated successfully. Sep 12 23:00:43.157835 chronyd[1683]: Selected source PHC0 Sep 12 23:00:43.588425 containerd[1728]: time="2025-09-12T23:00:43.587598952Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:00:43.591506 containerd[1728]: time="2025-09-12T23:00:43.591470952Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.5: active requests=0, bytes read=30114901" Sep 12 23:00:43.597671 containerd[1728]: time="2025-09-12T23:00:43.597648687Z" level=info msg="ImageCreate event name:\"sha256:b7335a56022aba291f5df653c01b7ab98d64fb5cab221378617f4a1236e06a62\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:00:43.601725 containerd[1728]: time="2025-09-12T23:00:43.601688024Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:1b9c6c00bc1fe86860e72efb8e4148f9e436a132eba4ca636ca4f48d61d6dfb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 
23:00:43.602381 containerd[1728]: time="2025-09-12T23:00:43.602359278Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.5\" with image id \"sha256:b7335a56022aba291f5df653c01b7ab98d64fb5cab221378617f4a1236e06a62\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.5\", repo digest \"registry.k8s.io/kube-apiserver@sha256:1b9c6c00bc1fe86860e72efb8e4148f9e436a132eba4ca636ca4f48d61d6dfb4\", size \"30111492\" in 2.162702005s" Sep 12 23:00:43.602425 containerd[1728]: time="2025-09-12T23:00:43.602384165Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.5\" returns image reference \"sha256:b7335a56022aba291f5df653c01b7ab98d64fb5cab221378617f4a1236e06a62\"" Sep 12 23:00:43.602869 containerd[1728]: time="2025-09-12T23:00:43.602848565Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.5\"" Sep 12 23:00:44.852267 containerd[1728]: time="2025-09-12T23:00:44.852237085Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:00:44.856680 containerd[1728]: time="2025-09-12T23:00:44.856652312Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.5: active requests=0, bytes read=26020852" Sep 12 23:00:44.862120 containerd[1728]: time="2025-09-12T23:00:44.862089211Z" level=info msg="ImageCreate event name:\"sha256:8bb43160a0df4d7d34c89d9edbc48735bc2f830771e4b501937338221be0f668\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:00:44.866119 containerd[1728]: time="2025-09-12T23:00:44.866080882Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:1082a6ab67fb46397314dd36b36cb197ba4a4c5365033e9ad22bc7edaaaabd5c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:00:44.866863 containerd[1728]: time="2025-09-12T23:00:44.866590191Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.5\" with 
image id \"sha256:8bb43160a0df4d7d34c89d9edbc48735bc2f830771e4b501937338221be0f668\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.5\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:1082a6ab67fb46397314dd36b36cb197ba4a4c5365033e9ad22bc7edaaaabd5c\", size \"27681301\" in 1.263717235s" Sep 12 23:00:44.866863 containerd[1728]: time="2025-09-12T23:00:44.866617486Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.5\" returns image reference \"sha256:8bb43160a0df4d7d34c89d9edbc48735bc2f830771e4b501937338221be0f668\"" Sep 12 23:00:44.866991 containerd[1728]: time="2025-09-12T23:00:44.866977377Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.5\"" Sep 12 23:00:45.871603 containerd[1728]: time="2025-09-12T23:00:45.871572615Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:00:45.874588 containerd[1728]: time="2025-09-12T23:00:45.874567089Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.5: active requests=0, bytes read=20155576" Sep 12 23:00:45.877187 containerd[1728]: time="2025-09-12T23:00:45.877153677Z" level=info msg="ImageCreate event name:\"sha256:33b680aadf474b7e5e73957fc00c6af86dd0484c699c8461ba33ee656d1823bf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:00:45.881647 containerd[1728]: time="2025-09-12T23:00:45.881621705Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:3e7b57c9d9f06b77f0064e5be7f3df61e0151101160acd5fdecce911df28a189\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:00:45.882183 containerd[1728]: time="2025-09-12T23:00:45.882162839Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.5\" with image id \"sha256:33b680aadf474b7e5e73957fc00c6af86dd0484c699c8461ba33ee656d1823bf\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.5\", repo digest 
\"registry.k8s.io/kube-scheduler@sha256:3e7b57c9d9f06b77f0064e5be7f3df61e0151101160acd5fdecce911df28a189\", size \"21816043\" in 1.015164826s" Sep 12 23:00:45.882222 containerd[1728]: time="2025-09-12T23:00:45.882182762Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.5\" returns image reference \"sha256:33b680aadf474b7e5e73957fc00c6af86dd0484c699c8461ba33ee656d1823bf\"" Sep 12 23:00:45.882631 containerd[1728]: time="2025-09-12T23:00:45.882611916Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.5\"" Sep 12 23:00:46.810371 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2340613140.mount: Deactivated successfully. Sep 12 23:00:47.138694 containerd[1728]: time="2025-09-12T23:00:47.138661630Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:00:47.141745 containerd[1728]: time="2025-09-12T23:00:47.141717931Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.5: active requests=0, bytes read=31929477" Sep 12 23:00:47.144442 containerd[1728]: time="2025-09-12T23:00:47.144408805Z" level=info msg="ImageCreate event name:\"sha256:2844ee7bb56c2c194e1f4adafb9e7b60b9ed16aa4d07ab8ad1f019362e2efab3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:00:47.147776 containerd[1728]: time="2025-09-12T23:00:47.147742104Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:71445ec84ad98bd52a7784865a9d31b1b50b56092d3f7699edc39eefd71befe1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:00:47.148049 containerd[1728]: time="2025-09-12T23:00:47.148030096Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.5\" with image id \"sha256:2844ee7bb56c2c194e1f4adafb9e7b60b9ed16aa4d07ab8ad1f019362e2efab3\", repo tag \"registry.k8s.io/kube-proxy:v1.33.5\", repo digest 
\"registry.k8s.io/kube-proxy@sha256:71445ec84ad98bd52a7784865a9d31b1b50b56092d3f7699edc39eefd71befe1\", size \"31928488\" in 1.265396744s" Sep 12 23:00:47.148082 containerd[1728]: time="2025-09-12T23:00:47.148054322Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.5\" returns image reference \"sha256:2844ee7bb56c2c194e1f4adafb9e7b60b9ed16aa4d07ab8ad1f019362e2efab3\"" Sep 12 23:00:47.148371 containerd[1728]: time="2025-09-12T23:00:47.148354802Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" Sep 12 23:00:47.764913 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1002142790.mount: Deactivated successfully. Sep 12 23:00:48.594365 containerd[1728]: time="2025-09-12T23:00:48.594335566Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:00:48.597385 containerd[1728]: time="2025-09-12T23:00:48.597357460Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=20942246" Sep 12 23:00:48.600322 containerd[1728]: time="2025-09-12T23:00:48.600285311Z" level=info msg="ImageCreate event name:\"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:00:48.604561 containerd[1728]: time="2025-09-12T23:00:48.604527706Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:00:48.605139 containerd[1728]: time="2025-09-12T23:00:48.605055931Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest 
\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"20939036\" in 1.456676465s" Sep 12 23:00:48.605139 containerd[1728]: time="2025-09-12T23:00:48.605082859Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\"" Sep 12 23:00:48.605680 containerd[1728]: time="2025-09-12T23:00:48.605663721Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Sep 12 23:00:49.128255 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2649980388.mount: Deactivated successfully. Sep 12 23:00:49.155908 containerd[1728]: time="2025-09-12T23:00:49.155882548Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 23:00:49.160956 containerd[1728]: time="2025-09-12T23:00:49.160926990Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321146" Sep 12 23:00:49.165179 containerd[1728]: time="2025-09-12T23:00:49.165145186Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 23:00:49.173874 containerd[1728]: time="2025-09-12T23:00:49.173839671Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 23:00:49.174446 containerd[1728]: time="2025-09-12T23:00:49.174195741Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag 
\"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 568.509541ms" Sep 12 23:00:49.174446 containerd[1728]: time="2025-09-12T23:00:49.174216926Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Sep 12 23:00:49.174613 containerd[1728]: time="2025-09-12T23:00:49.174593276Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\"" Sep 12 23:00:49.716297 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2208832897.mount: Deactivated successfully. Sep 12 23:00:51.208801 containerd[1728]: time="2025-09-12T23:00:51.208772001Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:00:51.211513 containerd[1728]: time="2025-09-12T23:00:51.211482072Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=58378441" Sep 12 23:00:51.214329 containerd[1728]: time="2025-09-12T23:00:51.214294231Z" level=info msg="ImageCreate event name:\"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:00:51.218118 containerd[1728]: time="2025-09-12T23:00:51.217926831Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:00:51.218544 containerd[1728]: time="2025-09-12T23:00:51.218525137Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size 
\"58938593\" in 2.043886411s" Sep 12 23:00:51.218582 containerd[1728]: time="2025-09-12T23:00:51.218550461Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\"" Sep 12 23:00:52.335910 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Sep 12 23:00:52.339662 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 23:00:52.719208 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 23:00:52.727677 (kubelet)[2615]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 23:00:52.762679 kubelet[2615]: E0912 23:00:52.762645 2615 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 23:00:52.764230 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 23:00:52.764337 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 23:00:52.765534 systemd[1]: kubelet.service: Consumed 128ms CPU time, 108.2M memory peak. Sep 12 23:00:53.614261 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 23:00:53.614419 systemd[1]: kubelet.service: Consumed 128ms CPU time, 108.2M memory peak. Sep 12 23:00:53.616120 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 23:00:53.636039 systemd[1]: Reload requested from client PID 2630 ('systemctl') (unit session-9.scope)... Sep 12 23:00:53.636052 systemd[1]: Reloading... Sep 12 23:00:53.726522 zram_generator::config[2683]: No configuration found. 
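Each containerd "Pulled image" entry above reports a size in bytes and a wall-clock duration, so rough pull throughput can be derived directly from the log. A back-of-envelope helper (not part of containerd):

```python
def pull_rate_mib_s(size_bytes: int, seconds: float) -> float:
    """Average image-pull throughput in MiB/s from a containerd
    'Pulled image ... size X in Ys' log entry."""
    return size_bytes / seconds / (1024 * 1024)
```

For example, the kube-apiserver pull (30111492 bytes in 2.162702005 s) works out to roughly 13.3 MiB/s, and the etcd pull (58938593 bytes in 2.043886411 s) to roughly 27.5 MiB/s.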
Sep 12 23:00:53.945655 systemd[1]: Reloading finished in 309 ms. Sep 12 23:00:53.993392 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Sep 12 23:00:53.993454 systemd[1]: kubelet.service: Failed with result 'signal'. Sep 12 23:00:53.993671 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 23:00:53.993702 systemd[1]: kubelet.service: Consumed 63ms CPU time, 74.4M memory peak. Sep 12 23:00:53.995130 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 23:00:54.478241 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 23:00:54.484719 (kubelet)[2744]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 12 23:00:54.514985 kubelet[2744]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 23:00:54.514985 kubelet[2744]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 12 23:00:54.514985 kubelet[2744]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 12 23:00:54.515212 kubelet[2744]: I0912 23:00:54.515016 2744 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 12 23:00:54.616082 kubelet[2744]: I0912 23:00:54.616055 2744 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Sep 12 23:00:54.617219 kubelet[2744]: I0912 23:00:54.616156 2744 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 12 23:00:54.617219 kubelet[2744]: I0912 23:00:54.616513 2744 server.go:956] "Client rotation is on, will bootstrap in background" Sep 12 23:00:54.641856 kubelet[2744]: I0912 23:00:54.641841 2744 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 12 23:00:54.642177 kubelet[2744]: E0912 23:00:54.642157 2744 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.200.8.17:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.200.8.17:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Sep 12 23:00:54.646645 kubelet[2744]: I0912 23:00:54.646631 2744 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 12 23:00:54.648702 kubelet[2744]: I0912 23:00:54.648685 2744 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 12 23:00:54.648852 kubelet[2744]: I0912 23:00:54.648829 2744 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 12 23:00:54.648962 kubelet[2744]: I0912 23:00:54.648850 2744 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459.0.0-a-36add7270c","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 12 23:00:54.649070 kubelet[2744]: I0912 23:00:54.648966 2744 topology_manager.go:138] "Creating topology manager with none policy" Sep 12 
23:00:54.649070 kubelet[2744]: I0912 23:00:54.648975 2744 container_manager_linux.go:303] "Creating device plugin manager" Sep 12 23:00:54.649070 kubelet[2744]: I0912 23:00:54.649061 2744 state_mem.go:36] "Initialized new in-memory state store" Sep 12 23:00:54.651512 kubelet[2744]: I0912 23:00:54.651067 2744 kubelet.go:480] "Attempting to sync node with API server" Sep 12 23:00:54.651512 kubelet[2744]: I0912 23:00:54.651086 2744 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 12 23:00:54.651512 kubelet[2744]: I0912 23:00:54.651146 2744 kubelet.go:386] "Adding apiserver pod source" Sep 12 23:00:54.653090 kubelet[2744]: I0912 23:00:54.652967 2744 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 12 23:00:54.658239 kubelet[2744]: E0912 23:00:54.658220 2744 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.200.8.17:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459.0.0-a-36add7270c&limit=500&resourceVersion=0\": dial tcp 10.200.8.17:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Sep 12 23:00:54.659053 kubelet[2744]: E0912 23:00:54.659036 2744 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.200.8.17:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.8.17:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Sep 12 23:00:54.659196 kubelet[2744]: I0912 23:00:54.659187 2744 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 12 23:00:54.659742 kubelet[2744]: I0912 23:00:54.659701 2744 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Sep 12 
23:00:54.660163 kubelet[2744]: W0912 23:00:54.660144 2744 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Sep 12 23:00:54.662934 kubelet[2744]: I0912 23:00:54.662107 2744 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 12 23:00:54.662934 kubelet[2744]: I0912 23:00:54.662144 2744 server.go:1289] "Started kubelet" Sep 12 23:00:54.666342 kubelet[2744]: I0912 23:00:54.666325 2744 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 12 23:00:54.670520 kubelet[2744]: I0912 23:00:54.670356 2744 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Sep 12 23:00:54.673251 kubelet[2744]: I0912 23:00:54.673236 2744 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 12 23:00:54.676288 kubelet[2744]: I0912 23:00:54.673558 2744 server.go:317] "Adding debug handlers to kubelet server" Sep 12 23:00:54.676288 kubelet[2744]: I0912 23:00:54.673583 2744 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 12 23:00:54.676418 kubelet[2744]: I0912 23:00:54.673591 2744 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 12 23:00:54.676441 kubelet[2744]: E0912 23:00:54.673689 2744 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4459.0.0-a-36add7270c\" not found" Sep 12 23:00:54.676441 kubelet[2744]: I0912 23:00:54.675090 2744 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 12 23:00:54.676589 kubelet[2744]: I0912 23:00:54.676578 2744 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 12 23:00:54.676640 kubelet[2744]: I0912 23:00:54.676633 2744 reconciler.go:26] "Reconciler: start to sync state" Sep 12 23:00:54.676873 kubelet[2744]: E0912 23:00:54.676857 2744 
reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.200.8.17:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.200.8.17:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Sep 12 23:00:54.676935 kubelet[2744]: E0912 23:00:54.676910 2744 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.17:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459.0.0-a-36add7270c?timeout=10s\": dial tcp 10.200.8.17:6443: connect: connection refused" interval="200ms" Sep 12 23:00:54.678585 kubelet[2744]: E0912 23:00:54.677228 2744 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.200.8.17:6443/api/v1/namespaces/default/events\": dial tcp 10.200.8.17:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459.0.0-a-36add7270c.1864ab42076d2957 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459.0.0-a-36add7270c,UID:ci-4459.0.0-a-36add7270c,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459.0.0-a-36add7270c,},FirstTimestamp:2025-09-12 23:00:54.662121815 +0000 UTC m=+0.173555679,LastTimestamp:2025-09-12 23:00:54.662121815 +0000 UTC m=+0.173555679,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459.0.0-a-36add7270c,}" Sep 12 23:00:54.678831 kubelet[2744]: I0912 23:00:54.678814 2744 factory.go:223] Registration of the systemd container factory successfully Sep 12 23:00:54.678890 kubelet[2744]: I0912 23:00:54.678871 2744 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file 
or directory Sep 12 23:00:54.679904 kubelet[2744]: I0912 23:00:54.679888 2744 factory.go:223] Registration of the containerd container factory successfully Sep 12 23:00:54.701082 kubelet[2744]: E0912 23:00:54.701069 2744 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 12 23:00:54.704505 kubelet[2744]: I0912 23:00:54.704471 2744 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 12 23:00:54.704505 kubelet[2744]: I0912 23:00:54.704489 2744 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 12 23:00:54.704597 kubelet[2744]: I0912 23:00:54.704517 2744 state_mem.go:36] "Initialized new in-memory state store" Sep 12 23:00:54.708785 kubelet[2744]: I0912 23:00:54.708772 2744 policy_none.go:49] "None policy: Start" Sep 12 23:00:54.708785 kubelet[2744]: I0912 23:00:54.708787 2744 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 12 23:00:54.708842 kubelet[2744]: I0912 23:00:54.708795 2744 state_mem.go:35] "Initializing new in-memory state store" Sep 12 23:00:54.715190 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 12 23:00:54.722894 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 12 23:00:54.725279 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 12 23:00:54.730969 kubelet[2744]: I0912 23:00:54.730458 2744 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Sep 12 23:00:54.732651 kubelet[2744]: E0912 23:00:54.732043 2744 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Sep 12 23:00:54.732651 kubelet[2744]: I0912 23:00:54.732166 2744 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 12 23:00:54.732651 kubelet[2744]: I0912 23:00:54.732175 2744 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 12 23:00:54.732992 kubelet[2744]: I0912 23:00:54.732980 2744 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 12 23:00:54.734944 kubelet[2744]: I0912 23:00:54.734932 2744 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Sep 12 23:00:54.735027 kubelet[2744]: I0912 23:00:54.735022 2744 status_manager.go:230] "Starting to sync pod status with apiserver" Sep 12 23:00:54.735060 kubelet[2744]: I0912 23:00:54.735056 2744 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Sep 12 23:00:54.735096 kubelet[2744]: I0912 23:00:54.735093 2744 kubelet.go:2436] "Starting kubelet main sync loop" Sep 12 23:00:54.735196 kubelet[2744]: E0912 23:00:54.735190 2744 kubelet.go:2460] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Sep 12 23:00:54.735641 kubelet[2744]: E0912 23:00:54.735630 2744 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Sep 12 23:00:54.736767 kubelet[2744]: E0912 23:00:54.736725 2744 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4459.0.0-a-36add7270c\" not found" Sep 12 23:00:54.737151 kubelet[2744]: E0912 23:00:54.737103 2744 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.200.8.17:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.8.17:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Sep 12 23:00:54.833858 kubelet[2744]: I0912 23:00:54.833844 2744 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.0.0-a-36add7270c" Sep 12 23:00:54.834076 kubelet[2744]: E0912 23:00:54.834054 2744 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.8.17:6443/api/v1/nodes\": dial tcp 10.200.8.17:6443: connect: connection refused" node="ci-4459.0.0-a-36add7270c" Sep 12 23:00:54.844882 systemd[1]: Created slice kubepods-burstable-pode8016fb57e83d679925c70c3c1794cb8.slice - libcontainer container kubepods-burstable-pode8016fb57e83d679925c70c3c1794cb8.slice. Sep 12 23:00:54.849949 kubelet[2744]: E0912 23:00:54.849928 2744 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.0.0-a-36add7270c\" not found" node="ci-4459.0.0-a-36add7270c" Sep 12 23:00:54.854007 systemd[1]: Created slice kubepods-burstable-pod31cb79564ad638bd49a962cbaae04b7b.slice - libcontainer container kubepods-burstable-pod31cb79564ad638bd49a962cbaae04b7b.slice. 
Sep 12 23:00:54.855959 kubelet[2744]: E0912 23:00:54.855940 2744 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.0.0-a-36add7270c\" not found" node="ci-4459.0.0-a-36add7270c" Sep 12 23:00:54.857840 systemd[1]: Created slice kubepods-burstable-pod2a92b6f367fc35354ecb9c93e926a6a8.slice - libcontainer container kubepods-burstable-pod2a92b6f367fc35354ecb9c93e926a6a8.slice. Sep 12 23:00:54.859121 kubelet[2744]: E0912 23:00:54.859101 2744 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.0.0-a-36add7270c\" not found" node="ci-4459.0.0-a-36add7270c" Sep 12 23:00:54.877731 kubelet[2744]: I0912 23:00:54.877554 2744 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/e8016fb57e83d679925c70c3c1794cb8-ca-certs\") pod \"kube-apiserver-ci-4459.0.0-a-36add7270c\" (UID: \"e8016fb57e83d679925c70c3c1794cb8\") " pod="kube-system/kube-apiserver-ci-4459.0.0-a-36add7270c" Sep 12 23:00:54.877731 kubelet[2744]: I0912 23:00:54.877580 2744 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/e8016fb57e83d679925c70c3c1794cb8-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459.0.0-a-36add7270c\" (UID: \"e8016fb57e83d679925c70c3c1794cb8\") " pod="kube-system/kube-apiserver-ci-4459.0.0-a-36add7270c" Sep 12 23:00:54.877731 kubelet[2744]: I0912 23:00:54.877597 2744 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/2a92b6f367fc35354ecb9c93e926a6a8-kubeconfig\") pod \"kube-scheduler-ci-4459.0.0-a-36add7270c\" (UID: \"2a92b6f367fc35354ecb9c93e926a6a8\") " pod="kube-system/kube-scheduler-ci-4459.0.0-a-36add7270c" Sep 12 23:00:54.877731 kubelet[2744]: I0912 
23:00:54.877610 2744 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/e8016fb57e83d679925c70c3c1794cb8-k8s-certs\") pod \"kube-apiserver-ci-4459.0.0-a-36add7270c\" (UID: \"e8016fb57e83d679925c70c3c1794cb8\") " pod="kube-system/kube-apiserver-ci-4459.0.0-a-36add7270c" Sep 12 23:00:54.877731 kubelet[2744]: I0912 23:00:54.877624 2744 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/31cb79564ad638bd49a962cbaae04b7b-ca-certs\") pod \"kube-controller-manager-ci-4459.0.0-a-36add7270c\" (UID: \"31cb79564ad638bd49a962cbaae04b7b\") " pod="kube-system/kube-controller-manager-ci-4459.0.0-a-36add7270c" Sep 12 23:00:54.877862 kubelet[2744]: I0912 23:00:54.877638 2744 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/31cb79564ad638bd49a962cbaae04b7b-flexvolume-dir\") pod \"kube-controller-manager-ci-4459.0.0-a-36add7270c\" (UID: \"31cb79564ad638bd49a962cbaae04b7b\") " pod="kube-system/kube-controller-manager-ci-4459.0.0-a-36add7270c" Sep 12 23:00:54.877862 kubelet[2744]: I0912 23:00:54.877652 2744 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/31cb79564ad638bd49a962cbaae04b7b-k8s-certs\") pod \"kube-controller-manager-ci-4459.0.0-a-36add7270c\" (UID: \"31cb79564ad638bd49a962cbaae04b7b\") " pod="kube-system/kube-controller-manager-ci-4459.0.0-a-36add7270c" Sep 12 23:00:54.877862 kubelet[2744]: I0912 23:00:54.877665 2744 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/31cb79564ad638bd49a962cbaae04b7b-kubeconfig\") pod \"kube-controller-manager-ci-4459.0.0-a-36add7270c\" (UID: 
\"31cb79564ad638bd49a962cbaae04b7b\") " pod="kube-system/kube-controller-manager-ci-4459.0.0-a-36add7270c" Sep 12 23:00:54.877862 kubelet[2744]: I0912 23:00:54.877681 2744 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/31cb79564ad638bd49a962cbaae04b7b-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459.0.0-a-36add7270c\" (UID: \"31cb79564ad638bd49a962cbaae04b7b\") " pod="kube-system/kube-controller-manager-ci-4459.0.0-a-36add7270c" Sep 12 23:00:54.877862 kubelet[2744]: E0912 23:00:54.877688 2744 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.17:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459.0.0-a-36add7270c?timeout=10s\": dial tcp 10.200.8.17:6443: connect: connection refused" interval="400ms" Sep 12 23:00:55.035746 kubelet[2744]: I0912 23:00:55.035697 2744 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.0.0-a-36add7270c" Sep 12 23:00:55.035893 kubelet[2744]: E0912 23:00:55.035875 2744 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.8.17:6443/api/v1/nodes\": dial tcp 10.200.8.17:6443: connect: connection refused" node="ci-4459.0.0-a-36add7270c" Sep 12 23:00:55.151148 containerd[1728]: time="2025-09-12T23:00:55.151119907Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459.0.0-a-36add7270c,Uid:e8016fb57e83d679925c70c3c1794cb8,Namespace:kube-system,Attempt:0,}" Sep 12 23:00:55.156883 containerd[1728]: time="2025-09-12T23:00:55.156858880Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459.0.0-a-36add7270c,Uid:31cb79564ad638bd49a962cbaae04b7b,Namespace:kube-system,Attempt:0,}" Sep 12 23:00:55.160489 containerd[1728]: time="2025-09-12T23:00:55.160467479Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-scheduler-ci-4459.0.0-a-36add7270c,Uid:2a92b6f367fc35354ecb9c93e926a6a8,Namespace:kube-system,Attempt:0,}" Sep 12 23:00:55.208687 containerd[1728]: time="2025-09-12T23:00:55.208645765Z" level=info msg="connecting to shim 4d2d41972cb53d8fb7aee4d22f7fad1a7d2e43d27bd6182c6b1dc80f3ba49ae3" address="unix:///run/containerd/s/989614d5b23a0ecec65e0b04cea465c0518437278cb920f41666ee8851221f4e" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:00:55.234636 systemd[1]: Started cri-containerd-4d2d41972cb53d8fb7aee4d22f7fad1a7d2e43d27bd6182c6b1dc80f3ba49ae3.scope - libcontainer container 4d2d41972cb53d8fb7aee4d22f7fad1a7d2e43d27bd6182c6b1dc80f3ba49ae3. Sep 12 23:00:55.236532 containerd[1728]: time="2025-09-12T23:00:55.236473420Z" level=info msg="connecting to shim be7c94d2575a31959a7cb2689b72f9838d8efaba33e81f15d4d78f7c1afad7b0" address="unix:///run/containerd/s/85533475207c611ed6efe2c8d148ea43e0db16cbe01b7270c89e82232d1285dd" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:00:55.263416 containerd[1728]: time="2025-09-12T23:00:55.263356117Z" level=info msg="connecting to shim 0ce46f0adb453ddd300801e22f10bfa888ac56ec4c0e7562968283e6e3aaa95d" address="unix:///run/containerd/s/1beeda832b2696faed6af145576a4ee77ae8673b19df380618dcb2156022178a" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:00:55.275083 systemd[1]: Started cri-containerd-be7c94d2575a31959a7cb2689b72f9838d8efaba33e81f15d4d78f7c1afad7b0.scope - libcontainer container be7c94d2575a31959a7cb2689b72f9838d8efaba33e81f15d4d78f7c1afad7b0. 
Sep 12 23:00:55.278538 kubelet[2744]: E0912 23:00:55.278490 2744 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.17:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459.0.0-a-36add7270c?timeout=10s\": dial tcp 10.200.8.17:6443: connect: connection refused" interval="800ms" Sep 12 23:00:55.293587 systemd[1]: Started cri-containerd-0ce46f0adb453ddd300801e22f10bfa888ac56ec4c0e7562968283e6e3aaa95d.scope - libcontainer container 0ce46f0adb453ddd300801e22f10bfa888ac56ec4c0e7562968283e6e3aaa95d. Sep 12 23:00:55.305256 containerd[1728]: time="2025-09-12T23:00:55.305236877Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459.0.0-a-36add7270c,Uid:e8016fb57e83d679925c70c3c1794cb8,Namespace:kube-system,Attempt:0,} returns sandbox id \"4d2d41972cb53d8fb7aee4d22f7fad1a7d2e43d27bd6182c6b1dc80f3ba49ae3\"" Sep 12 23:00:55.316423 containerd[1728]: time="2025-09-12T23:00:55.316403654Z" level=info msg="CreateContainer within sandbox \"4d2d41972cb53d8fb7aee4d22f7fad1a7d2e43d27bd6182c6b1dc80f3ba49ae3\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 12 23:00:55.342323 containerd[1728]: time="2025-09-12T23:00:55.342288578Z" level=info msg="Container f8e4f6989c12e0ed6f09b37d78201dd4514c986c29247d53db4bdf8d799e2e17: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:00:55.345123 containerd[1728]: time="2025-09-12T23:00:55.345097236Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459.0.0-a-36add7270c,Uid:2a92b6f367fc35354ecb9c93e926a6a8,Namespace:kube-system,Attempt:0,} returns sandbox id \"be7c94d2575a31959a7cb2689b72f9838d8efaba33e81f15d4d78f7c1afad7b0\"" Sep 12 23:00:55.349766 containerd[1728]: time="2025-09-12T23:00:55.349746170Z" level=info msg="CreateContainer within sandbox \"be7c94d2575a31959a7cb2689b72f9838d8efaba33e81f15d4d78f7c1afad7b0\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 12 23:00:55.351161 
containerd[1728]: time="2025-09-12T23:00:55.351140209Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459.0.0-a-36add7270c,Uid:31cb79564ad638bd49a962cbaae04b7b,Namespace:kube-system,Attempt:0,} returns sandbox id \"0ce46f0adb453ddd300801e22f10bfa888ac56ec4c0e7562968283e6e3aaa95d\"" Sep 12 23:00:55.356702 containerd[1728]: time="2025-09-12T23:00:55.356681425Z" level=info msg="CreateContainer within sandbox \"0ce46f0adb453ddd300801e22f10bfa888ac56ec4c0e7562968283e6e3aaa95d\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 12 23:00:55.371143 containerd[1728]: time="2025-09-12T23:00:55.371119609Z" level=info msg="CreateContainer within sandbox \"4d2d41972cb53d8fb7aee4d22f7fad1a7d2e43d27bd6182c6b1dc80f3ba49ae3\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"f8e4f6989c12e0ed6f09b37d78201dd4514c986c29247d53db4bdf8d799e2e17\"" Sep 12 23:00:55.371598 containerd[1728]: time="2025-09-12T23:00:55.371577282Z" level=info msg="StartContainer for \"f8e4f6989c12e0ed6f09b37d78201dd4514c986c29247d53db4bdf8d799e2e17\"" Sep 12 23:00:55.372339 containerd[1728]: time="2025-09-12T23:00:55.372318604Z" level=info msg="connecting to shim f8e4f6989c12e0ed6f09b37d78201dd4514c986c29247d53db4bdf8d799e2e17" address="unix:///run/containerd/s/989614d5b23a0ecec65e0b04cea465c0518437278cb920f41666ee8851221f4e" protocol=ttrpc version=3 Sep 12 23:00:55.388622 systemd[1]: Started cri-containerd-f8e4f6989c12e0ed6f09b37d78201dd4514c986c29247d53db4bdf8d799e2e17.scope - libcontainer container f8e4f6989c12e0ed6f09b37d78201dd4514c986c29247d53db4bdf8d799e2e17. 
Sep 12 23:00:55.399576 containerd[1728]: time="2025-09-12T23:00:55.398835112Z" level=info msg="Container fe135d827c497d7e524584938939eb17d59eeac129cd0e3eef7c8a5c494d934b: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:00:55.406754 containerd[1728]: time="2025-09-12T23:00:55.406729354Z" level=info msg="Container bb193a7de44698e3bbf6d01ca4cf32f4468e6678b56353d215d6225489371abe: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:00:55.417810 containerd[1728]: time="2025-09-12T23:00:55.417552082Z" level=info msg="CreateContainer within sandbox \"be7c94d2575a31959a7cb2689b72f9838d8efaba33e81f15d4d78f7c1afad7b0\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"fe135d827c497d7e524584938939eb17d59eeac129cd0e3eef7c8a5c494d934b\"" Sep 12 23:00:55.418237 containerd[1728]: time="2025-09-12T23:00:55.418215602Z" level=info msg="StartContainer for \"fe135d827c497d7e524584938939eb17d59eeac129cd0e3eef7c8a5c494d934b\"" Sep 12 23:00:55.421319 containerd[1728]: time="2025-09-12T23:00:55.421094152Z" level=info msg="connecting to shim fe135d827c497d7e524584938939eb17d59eeac129cd0e3eef7c8a5c494d934b" address="unix:///run/containerd/s/85533475207c611ed6efe2c8d148ea43e0db16cbe01b7270c89e82232d1285dd" protocol=ttrpc version=3 Sep 12 23:00:55.429979 containerd[1728]: time="2025-09-12T23:00:55.429952078Z" level=info msg="CreateContainer within sandbox \"0ce46f0adb453ddd300801e22f10bfa888ac56ec4c0e7562968283e6e3aaa95d\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"bb193a7de44698e3bbf6d01ca4cf32f4468e6678b56353d215d6225489371abe\"" Sep 12 23:00:55.430334 containerd[1728]: time="2025-09-12T23:00:55.430318649Z" level=info msg="StartContainer for \"bb193a7de44698e3bbf6d01ca4cf32f4468e6678b56353d215d6225489371abe\"" Sep 12 23:00:55.431318 containerd[1728]: time="2025-09-12T23:00:55.431292492Z" level=info msg="connecting to shim bb193a7de44698e3bbf6d01ca4cf32f4468e6678b56353d215d6225489371abe" 
address="unix:///run/containerd/s/1beeda832b2696faed6af145576a4ee77ae8673b19df380618dcb2156022178a" protocol=ttrpc version=3 Sep 12 23:00:55.438362 kubelet[2744]: I0912 23:00:55.438347 2744 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.0.0-a-36add7270c" Sep 12 23:00:55.439092 kubelet[2744]: E0912 23:00:55.438687 2744 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.8.17:6443/api/v1/nodes\": dial tcp 10.200.8.17:6443: connect: connection refused" node="ci-4459.0.0-a-36add7270c" Sep 12 23:00:55.443719 systemd[1]: Started cri-containerd-fe135d827c497d7e524584938939eb17d59eeac129cd0e3eef7c8a5c494d934b.scope - libcontainer container fe135d827c497d7e524584938939eb17d59eeac129cd0e3eef7c8a5c494d934b. Sep 12 23:00:55.449232 containerd[1728]: time="2025-09-12T23:00:55.449208726Z" level=info msg="StartContainer for \"f8e4f6989c12e0ed6f09b37d78201dd4514c986c29247d53db4bdf8d799e2e17\" returns successfully" Sep 12 23:00:55.455742 systemd[1]: Started cri-containerd-bb193a7de44698e3bbf6d01ca4cf32f4468e6678b56353d215d6225489371abe.scope - libcontainer container bb193a7de44698e3bbf6d01ca4cf32f4468e6678b56353d215d6225489371abe. 
Sep 12 23:00:55.521328 containerd[1728]: time="2025-09-12T23:00:55.521303093Z" level=info msg="StartContainer for \"bb193a7de44698e3bbf6d01ca4cf32f4468e6678b56353d215d6225489371abe\" returns successfully" Sep 12 23:00:55.572814 containerd[1728]: time="2025-09-12T23:00:55.572155383Z" level=info msg="StartContainer for \"fe135d827c497d7e524584938939eb17d59eeac129cd0e3eef7c8a5c494d934b\" returns successfully" Sep 12 23:00:55.742125 kubelet[2744]: E0912 23:00:55.742107 2744 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.0.0-a-36add7270c\" not found" node="ci-4459.0.0-a-36add7270c" Sep 12 23:00:55.746718 kubelet[2744]: E0912 23:00:55.746701 2744 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.0.0-a-36add7270c\" not found" node="ci-4459.0.0-a-36add7270c" Sep 12 23:00:55.747753 kubelet[2744]: E0912 23:00:55.747737 2744 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.0.0-a-36add7270c\" not found" node="ci-4459.0.0-a-36add7270c" Sep 12 23:00:56.240456 kubelet[2744]: I0912 23:00:56.240442 2744 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.0.0-a-36add7270c" Sep 12 23:00:56.750040 kubelet[2744]: E0912 23:00:56.750020 2744 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.0.0-a-36add7270c\" not found" node="ci-4459.0.0-a-36add7270c" Sep 12 23:00:56.750635 kubelet[2744]: E0912 23:00:56.750621 2744 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.0.0-a-36add7270c\" not found" node="ci-4459.0.0-a-36add7270c" Sep 12 23:00:56.750883 kubelet[2744]: E0912 23:00:56.750873 2744 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.0.0-a-36add7270c\" not 
found" node="ci-4459.0.0-a-36add7270c" Sep 12 23:00:56.961242 kubelet[2744]: E0912 23:00:56.961218 2744 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4459.0.0-a-36add7270c\" not found" node="ci-4459.0.0-a-36add7270c" Sep 12 23:00:57.015206 kubelet[2744]: I0912 23:00:57.014977 2744 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459.0.0-a-36add7270c" Sep 12 23:00:57.015206 kubelet[2744]: E0912 23:00:57.015003 2744 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"ci-4459.0.0-a-36add7270c\": node \"ci-4459.0.0-a-36add7270c\" not found" Sep 12 23:00:57.074568 kubelet[2744]: I0912 23:00:57.074549 2744 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459.0.0-a-36add7270c" Sep 12 23:00:57.081505 kubelet[2744]: E0912 23:00:57.081478 2744 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459.0.0-a-36add7270c\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459.0.0-a-36add7270c" Sep 12 23:00:57.081654 kubelet[2744]: I0912 23:00:57.081589 2744 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459.0.0-a-36add7270c" Sep 12 23:00:57.083016 kubelet[2744]: E0912 23:00:57.082989 2744 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459.0.0-a-36add7270c\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4459.0.0-a-36add7270c" Sep 12 23:00:57.083016 kubelet[2744]: I0912 23:00:57.083004 2744 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459.0.0-a-36add7270c" Sep 12 23:00:57.085753 kubelet[2744]: E0912 23:00:57.085620 2744 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459.0.0-a-36add7270c\" is 
forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4459.0.0-a-36add7270c" Sep 12 23:00:57.661014 kubelet[2744]: I0912 23:00:57.660995 2744 apiserver.go:52] "Watching apiserver" Sep 12 23:00:57.676993 kubelet[2744]: I0912 23:00:57.676973 2744 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 12 23:00:58.924476 systemd[1]: Reload requested from client PID 3022 ('systemctl') (unit session-9.scope)... Sep 12 23:00:58.924487 systemd[1]: Reloading... Sep 12 23:00:59.018520 zram_generator::config[3072]: No configuration found. Sep 12 23:00:59.175101 systemd[1]: Reloading finished in 250 ms. Sep 12 23:00:59.194600 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 23:00:59.217401 systemd[1]: kubelet.service: Deactivated successfully. Sep 12 23:00:59.218624 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 23:00:59.218698 systemd[1]: kubelet.service: Consumed 426ms CPU time, 129.2M memory peak. Sep 12 23:00:59.220356 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 23:00:59.718731 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 23:00:59.724818 (kubelet)[3136]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 12 23:00:59.757212 kubelet[3136]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 23:00:59.757212 kubelet[3136]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. 
Sep 12 23:00:59.757212 kubelet[3136]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 23:00:59.757440 kubelet[3136]: I0912 23:00:59.757258 3136 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 12 23:00:59.766910 kubelet[3136]: I0912 23:00:59.766643 3136 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Sep 12 23:00:59.766910 kubelet[3136]: I0912 23:00:59.766661 3136 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 12 23:00:59.766910 kubelet[3136]: I0912 23:00:59.766877 3136 server.go:956] "Client rotation is on, will bootstrap in background" Sep 12 23:00:59.767948 kubelet[3136]: I0912 23:00:59.767932 3136 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Sep 12 23:00:59.769924 kubelet[3136]: I0912 23:00:59.769616 3136 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 12 23:00:59.772750 kubelet[3136]: I0912 23:00:59.772729 3136 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 12 23:00:59.774419 kubelet[3136]: I0912 23:00:59.774400 3136 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 12 23:00:59.774945 kubelet[3136]: I0912 23:00:59.774662 3136 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 12 23:00:59.774945 kubelet[3136]: I0912 23:00:59.774681 3136 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459.0.0-a-36add7270c","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 12 23:00:59.774945 kubelet[3136]: I0912 23:00:59.774776 3136 topology_manager.go:138] "Creating topology manager with none policy" Sep 12 
23:00:59.774945 kubelet[3136]: I0912 23:00:59.774783 3136 container_manager_linux.go:303] "Creating device plugin manager" Sep 12 23:00:59.774945 kubelet[3136]: I0912 23:00:59.774810 3136 state_mem.go:36] "Initialized new in-memory state store" Sep 12 23:00:59.775592 kubelet[3136]: I0912 23:00:59.775577 3136 kubelet.go:480] "Attempting to sync node with API server" Sep 12 23:00:59.775653 kubelet[3136]: I0912 23:00:59.775598 3136 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 12 23:00:59.775653 kubelet[3136]: I0912 23:00:59.775617 3136 kubelet.go:386] "Adding apiserver pod source" Sep 12 23:00:59.775653 kubelet[3136]: I0912 23:00:59.775630 3136 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 12 23:00:59.776479 kubelet[3136]: I0912 23:00:59.776388 3136 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 12 23:00:59.776902 kubelet[3136]: I0912 23:00:59.776826 3136 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Sep 12 23:00:59.778851 kubelet[3136]: I0912 23:00:59.778835 3136 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 12 23:00:59.778907 kubelet[3136]: I0912 23:00:59.778869 3136 server.go:1289] "Started kubelet" Sep 12 23:00:59.784132 kubelet[3136]: I0912 23:00:59.781793 3136 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 12 23:00:59.785368 kubelet[3136]: I0912 23:00:59.785265 3136 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Sep 12 23:00:59.787961 kubelet[3136]: I0912 23:00:59.786257 3136 server.go:317] "Adding debug handlers to kubelet server" Sep 12 23:00:59.791537 kubelet[3136]: I0912 23:00:59.791284 3136 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 12 23:00:59.791537 kubelet[3136]: I0912 23:00:59.791457 3136 
server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 12 23:00:59.791750 kubelet[3136]: I0912 23:00:59.791735 3136 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 12 23:00:59.792629 kubelet[3136]: I0912 23:00:59.792431 3136 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 12 23:00:59.792629 kubelet[3136]: E0912 23:00:59.792621 3136 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4459.0.0-a-36add7270c\" not found" Sep 12 23:00:59.793163 kubelet[3136]: I0912 23:00:59.793150 3136 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 12 23:00:59.793423 kubelet[3136]: I0912 23:00:59.793233 3136 reconciler.go:26] "Reconciler: start to sync state" Sep 12 23:00:59.809935 kubelet[3136]: I0912 23:00:59.809208 3136 factory.go:223] Registration of the systemd container factory successfully Sep 12 23:00:59.809935 kubelet[3136]: I0912 23:00:59.809284 3136 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 12 23:00:59.810145 kubelet[3136]: I0912 23:00:59.810107 3136 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Sep 12 23:00:59.811062 kubelet[3136]: I0912 23:00:59.811047 3136 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Sep 12 23:00:59.811134 kubelet[3136]: I0912 23:00:59.811130 3136 status_manager.go:230] "Starting to sync pod status with apiserver" Sep 12 23:00:59.811177 kubelet[3136]: I0912 23:00:59.811172 3136 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Sep 12 23:00:59.811207 kubelet[3136]: I0912 23:00:59.811203 3136 kubelet.go:2436] "Starting kubelet main sync loop" Sep 12 23:00:59.811271 kubelet[3136]: E0912 23:00:59.811260 3136 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 12 23:00:59.819802 kubelet[3136]: I0912 23:00:59.819790 3136 factory.go:223] Registration of the containerd container factory successfully Sep 12 23:00:59.837756 kubelet[3136]: E0912 23:00:59.837742 3136 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 12 23:00:59.859761 kubelet[3136]: I0912 23:00:59.859750 3136 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 12 23:00:59.860046 kubelet[3136]: I0912 23:00:59.860039 3136 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 12 23:00:59.860082 kubelet[3136]: I0912 23:00:59.860079 3136 state_mem.go:36] "Initialized new in-memory state store" Sep 12 23:00:59.860176 kubelet[3136]: I0912 23:00:59.860170 3136 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 12 23:00:59.860203 kubelet[3136]: I0912 23:00:59.860195 3136 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 12 23:00:59.860224 kubelet[3136]: I0912 23:00:59.860222 3136 policy_none.go:49] "None policy: Start" Sep 12 23:00:59.860246 kubelet[3136]: I0912 23:00:59.860243 3136 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 12 23:00:59.860267 kubelet[3136]: I0912 23:00:59.860264 3136 state_mem.go:35] "Initializing new in-memory state store" Sep 12 23:00:59.860330 kubelet[3136]: I0912 23:00:59.860327 3136 state_mem.go:75] "Updated machine memory state" Sep 12 23:00:59.862671 kubelet[3136]: E0912 23:00:59.862661 3136 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Sep 12 23:00:59.863116 
kubelet[3136]: I0912 23:00:59.863108 3136 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 12 23:00:59.863232 kubelet[3136]: I0912 23:00:59.863213 3136 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 12 23:00:59.863950 kubelet[3136]: I0912 23:00:59.863631 3136 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 12 23:00:59.865177 kubelet[3136]: E0912 23:00:59.865165 3136 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Sep 12 23:00:59.912817 kubelet[3136]: I0912 23:00:59.912633 3136 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459.0.0-a-36add7270c" Sep 12 23:00:59.912980 kubelet[3136]: I0912 23:00:59.912633 3136 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459.0.0-a-36add7270c" Sep 12 23:00:59.913163 kubelet[3136]: I0912 23:00:59.913154 3136 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459.0.0-a-36add7270c" Sep 12 23:00:59.921677 kubelet[3136]: I0912 23:00:59.921644 3136 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Sep 12 23:00:59.924607 kubelet[3136]: I0912 23:00:59.924588 3136 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Sep 12 23:00:59.924761 kubelet[3136]: I0912 23:00:59.924739 3136 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Sep 12 23:00:59.967289 kubelet[3136]: I0912 23:00:59.966855 3136 kubelet_node_status.go:75] "Attempting to register 
node" node="ci-4459.0.0-a-36add7270c" Sep 12 23:00:59.976742 kubelet[3136]: I0912 23:00:59.976640 3136 kubelet_node_status.go:124] "Node was previously registered" node="ci-4459.0.0-a-36add7270c" Sep 12 23:00:59.976742 kubelet[3136]: I0912 23:00:59.976686 3136 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459.0.0-a-36add7270c" Sep 12 23:00:59.994980 kubelet[3136]: I0912 23:00:59.994953 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/31cb79564ad638bd49a962cbaae04b7b-k8s-certs\") pod \"kube-controller-manager-ci-4459.0.0-a-36add7270c\" (UID: \"31cb79564ad638bd49a962cbaae04b7b\") " pod="kube-system/kube-controller-manager-ci-4459.0.0-a-36add7270c" Sep 12 23:00:59.995474 kubelet[3136]: I0912 23:00:59.994993 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/31cb79564ad638bd49a962cbaae04b7b-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459.0.0-a-36add7270c\" (UID: \"31cb79564ad638bd49a962cbaae04b7b\") " pod="kube-system/kube-controller-manager-ci-4459.0.0-a-36add7270c" Sep 12 23:00:59.995474 kubelet[3136]: I0912 23:00:59.995017 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/2a92b6f367fc35354ecb9c93e926a6a8-kubeconfig\") pod \"kube-scheduler-ci-4459.0.0-a-36add7270c\" (UID: \"2a92b6f367fc35354ecb9c93e926a6a8\") " pod="kube-system/kube-scheduler-ci-4459.0.0-a-36add7270c" Sep 12 23:00:59.995474 kubelet[3136]: I0912 23:00:59.995034 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/e8016fb57e83d679925c70c3c1794cb8-k8s-certs\") pod \"kube-apiserver-ci-4459.0.0-a-36add7270c\" (UID: 
\"e8016fb57e83d679925c70c3c1794cb8\") " pod="kube-system/kube-apiserver-ci-4459.0.0-a-36add7270c" Sep 12 23:00:59.995474 kubelet[3136]: I0912 23:00:59.995047 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/31cb79564ad638bd49a962cbaae04b7b-ca-certs\") pod \"kube-controller-manager-ci-4459.0.0-a-36add7270c\" (UID: \"31cb79564ad638bd49a962cbaae04b7b\") " pod="kube-system/kube-controller-manager-ci-4459.0.0-a-36add7270c" Sep 12 23:00:59.995474 kubelet[3136]: I0912 23:00:59.995077 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/31cb79564ad638bd49a962cbaae04b7b-flexvolume-dir\") pod \"kube-controller-manager-ci-4459.0.0-a-36add7270c\" (UID: \"31cb79564ad638bd49a962cbaae04b7b\") " pod="kube-system/kube-controller-manager-ci-4459.0.0-a-36add7270c" Sep 12 23:00:59.995593 kubelet[3136]: I0912 23:00:59.995094 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/31cb79564ad638bd49a962cbaae04b7b-kubeconfig\") pod \"kube-controller-manager-ci-4459.0.0-a-36add7270c\" (UID: \"31cb79564ad638bd49a962cbaae04b7b\") " pod="kube-system/kube-controller-manager-ci-4459.0.0-a-36add7270c" Sep 12 23:00:59.995593 kubelet[3136]: I0912 23:00:59.995107 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/e8016fb57e83d679925c70c3c1794cb8-ca-certs\") pod \"kube-apiserver-ci-4459.0.0-a-36add7270c\" (UID: \"e8016fb57e83d679925c70c3c1794cb8\") " pod="kube-system/kube-apiserver-ci-4459.0.0-a-36add7270c" Sep 12 23:00:59.995593 kubelet[3136]: I0912 23:00:59.995134 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" 
(UniqueName: \"kubernetes.io/host-path/e8016fb57e83d679925c70c3c1794cb8-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459.0.0-a-36add7270c\" (UID: \"e8016fb57e83d679925c70c3c1794cb8\") " pod="kube-system/kube-apiserver-ci-4459.0.0-a-36add7270c" Sep 12 23:01:00.776894 kubelet[3136]: I0912 23:01:00.776869 3136 apiserver.go:52] "Watching apiserver" Sep 12 23:01:00.793844 kubelet[3136]: I0912 23:01:00.793819 3136 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 12 23:01:00.847530 kubelet[3136]: I0912 23:01:00.847258 3136 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459.0.0-a-36add7270c" Sep 12 23:01:00.847788 kubelet[3136]: I0912 23:01:00.847778 3136 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459.0.0-a-36add7270c" Sep 12 23:01:00.858842 kubelet[3136]: I0912 23:01:00.858825 3136 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Sep 12 23:01:00.858907 kubelet[3136]: E0912 23:01:00.858861 3136 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459.0.0-a-36add7270c\" already exists" pod="kube-system/kube-apiserver-ci-4459.0.0-a-36add7270c" Sep 12 23:01:00.865511 kubelet[3136]: I0912 23:01:00.865432 3136 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Sep 12 23:01:00.866150 kubelet[3136]: E0912 23:01:00.865688 3136 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459.0.0-a-36add7270c\" already exists" pod="kube-system/kube-scheduler-ci-4459.0.0-a-36add7270c" Sep 12 23:01:00.879237 kubelet[3136]: I0912 23:01:00.879189 3136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/kube-apiserver-ci-4459.0.0-a-36add7270c" podStartSLOduration=1.87917865 podStartE2EDuration="1.87917865s" podCreationTimestamp="2025-09-12 23:00:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 23:01:00.866599975 +0000 UTC m=+1.138226765" watchObservedRunningTime="2025-09-12 23:01:00.87917865 +0000 UTC m=+1.150805444" Sep 12 23:01:00.887815 kubelet[3136]: I0912 23:01:00.887778 3136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4459.0.0-a-36add7270c" podStartSLOduration=1.887769408 podStartE2EDuration="1.887769408s" podCreationTimestamp="2025-09-12 23:00:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 23:01:00.879284717 +0000 UTC m=+1.150911817" watchObservedRunningTime="2025-09-12 23:01:00.887769408 +0000 UTC m=+1.159396213" Sep 12 23:01:00.896505 kubelet[3136]: I0912 23:01:00.896318 3136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4459.0.0-a-36add7270c" podStartSLOduration=1.89631021 podStartE2EDuration="1.89631021s" podCreationTimestamp="2025-09-12 23:00:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 23:01:00.888138371 +0000 UTC m=+1.159765162" watchObservedRunningTime="2025-09-12 23:01:00.89631021 +0000 UTC m=+1.167937002" Sep 12 23:01:01.678507 kernel: hv_balloon: Max. 
dynamic memory size: 8192 MB Sep 12 23:01:04.065296 kubelet[3136]: I0912 23:01:04.065268 3136 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 12 23:01:04.065656 containerd[1728]: time="2025-09-12T23:01:04.065617847Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 12 23:01:04.066052 kubelet[3136]: I0912 23:01:04.065778 3136 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 12 23:01:04.344349 systemd[1]: Created slice kubepods-besteffort-pod98220bdd_a986_40f4_b641_f54c22ecd8c9.slice - libcontainer container kubepods-besteffort-pod98220bdd_a986_40f4_b641_f54c22ecd8c9.slice. Sep 12 23:01:04.423398 kubelet[3136]: I0912 23:01:04.423378 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/98220bdd-a986-40f4-b641-f54c22ecd8c9-kube-proxy\") pod \"kube-proxy-ncdw2\" (UID: \"98220bdd-a986-40f4-b641-f54c22ecd8c9\") " pod="kube-system/kube-proxy-ncdw2" Sep 12 23:01:04.423485 kubelet[3136]: I0912 23:01:04.423403 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/98220bdd-a986-40f4-b641-f54c22ecd8c9-xtables-lock\") pod \"kube-proxy-ncdw2\" (UID: \"98220bdd-a986-40f4-b641-f54c22ecd8c9\") " pod="kube-system/kube-proxy-ncdw2" Sep 12 23:01:04.423485 kubelet[3136]: I0912 23:01:04.423419 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/98220bdd-a986-40f4-b641-f54c22ecd8c9-lib-modules\") pod \"kube-proxy-ncdw2\" (UID: \"98220bdd-a986-40f4-b641-f54c22ecd8c9\") " pod="kube-system/kube-proxy-ncdw2" Sep 12 23:01:04.423485 kubelet[3136]: I0912 23:01:04.423440 3136 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldn8w\" (UniqueName: \"kubernetes.io/projected/98220bdd-a986-40f4-b641-f54c22ecd8c9-kube-api-access-ldn8w\") pod \"kube-proxy-ncdw2\" (UID: \"98220bdd-a986-40f4-b641-f54c22ecd8c9\") " pod="kube-system/kube-proxy-ncdw2" Sep 12 23:01:04.527723 kubelet[3136]: E0912 23:01:04.527699 3136 projected.go:289] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Sep 12 23:01:04.527723 kubelet[3136]: E0912 23:01:04.527718 3136 projected.go:194] Error preparing data for projected volume kube-api-access-ldn8w for pod kube-system/kube-proxy-ncdw2: configmap "kube-root-ca.crt" not found Sep 12 23:01:04.527808 kubelet[3136]: E0912 23:01:04.527769 3136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/98220bdd-a986-40f4-b641-f54c22ecd8c9-kube-api-access-ldn8w podName:98220bdd-a986-40f4-b641-f54c22ecd8c9 nodeName:}" failed. No retries permitted until 2025-09-12 23:01:05.027750621 +0000 UTC m=+5.299377412 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-ldn8w" (UniqueName: "kubernetes.io/projected/98220bdd-a986-40f4-b641-f54c22ecd8c9-kube-api-access-ldn8w") pod "kube-proxy-ncdw2" (UID: "98220bdd-a986-40f4-b641-f54c22ecd8c9") : configmap "kube-root-ca.crt" not found Sep 12 23:01:04.847596 update_engine[1706]: I20250912 23:01:04.847545 1706 update_attempter.cc:509] Updating boot flags... Sep 12 23:01:05.209396 systemd[1]: Created slice kubepods-besteffort-pod2fd25a20_1212_4ae6_9b8c_db12172e9eb9.slice - libcontainer container kubepods-besteffort-pod2fd25a20_1212_4ae6_9b8c_db12172e9eb9.slice. 
Sep 12 23:01:05.229713 kubelet[3136]: I0912 23:01:05.229582 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/2fd25a20-1212-4ae6-9b8c-db12172e9eb9-var-lib-calico\") pod \"tigera-operator-755d956888-brplc\" (UID: \"2fd25a20-1212-4ae6-9b8c-db12172e9eb9\") " pod="tigera-operator/tigera-operator-755d956888-brplc" Sep 12 23:01:05.229968 kubelet[3136]: I0912 23:01:05.229862 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmsxr\" (UniqueName: \"kubernetes.io/projected/2fd25a20-1212-4ae6-9b8c-db12172e9eb9-kube-api-access-mmsxr\") pod \"tigera-operator-755d956888-brplc\" (UID: \"2fd25a20-1212-4ae6-9b8c-db12172e9eb9\") " pod="tigera-operator/tigera-operator-755d956888-brplc" Sep 12 23:01:05.254042 containerd[1728]: time="2025-09-12T23:01:05.254011004Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-ncdw2,Uid:98220bdd-a986-40f4-b641-f54c22ecd8c9,Namespace:kube-system,Attempt:0,}" Sep 12 23:01:05.297817 containerd[1728]: time="2025-09-12T23:01:05.297788815Z" level=info msg="connecting to shim 2bee056c9810363d145f6c5dae01958a230facd483faa45f82896f091eff5c57" address="unix:///run/containerd/s/0092844c0fd7d2107fc20904823c327d32291413d2b92ba77009e4185420d284" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:01:05.317633 systemd[1]: Started cri-containerd-2bee056c9810363d145f6c5dae01958a230facd483faa45f82896f091eff5c57.scope - libcontainer container 2bee056c9810363d145f6c5dae01958a230facd483faa45f82896f091eff5c57. 
Sep 12 23:01:05.346423 containerd[1728]: time="2025-09-12T23:01:05.346335325Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-ncdw2,Uid:98220bdd-a986-40f4-b641-f54c22ecd8c9,Namespace:kube-system,Attempt:0,} returns sandbox id \"2bee056c9810363d145f6c5dae01958a230facd483faa45f82896f091eff5c57\"" Sep 12 23:01:05.353055 containerd[1728]: time="2025-09-12T23:01:05.353031474Z" level=info msg="CreateContainer within sandbox \"2bee056c9810363d145f6c5dae01958a230facd483faa45f82896f091eff5c57\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 12 23:01:05.373458 containerd[1728]: time="2025-09-12T23:01:05.373437835Z" level=info msg="Container d106b3df64821039f96e96c8ae7f73e83c8af7d56efcc431781f6c94902d507e: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:01:05.390138 containerd[1728]: time="2025-09-12T23:01:05.390116783Z" level=info msg="CreateContainer within sandbox \"2bee056c9810363d145f6c5dae01958a230facd483faa45f82896f091eff5c57\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"d106b3df64821039f96e96c8ae7f73e83c8af7d56efcc431781f6c94902d507e\"" Sep 12 23:01:05.391550 containerd[1728]: time="2025-09-12T23:01:05.390727940Z" level=info msg="StartContainer for \"d106b3df64821039f96e96c8ae7f73e83c8af7d56efcc431781f6c94902d507e\"" Sep 12 23:01:05.391985 containerd[1728]: time="2025-09-12T23:01:05.391963018Z" level=info msg="connecting to shim d106b3df64821039f96e96c8ae7f73e83c8af7d56efcc431781f6c94902d507e" address="unix:///run/containerd/s/0092844c0fd7d2107fc20904823c327d32291413d2b92ba77009e4185420d284" protocol=ttrpc version=3 Sep 12 23:01:05.407633 systemd[1]: Started cri-containerd-d106b3df64821039f96e96c8ae7f73e83c8af7d56efcc431781f6c94902d507e.scope - libcontainer container d106b3df64821039f96e96c8ae7f73e83c8af7d56efcc431781f6c94902d507e. 
Sep 12 23:01:05.435265 containerd[1728]: time="2025-09-12T23:01:05.435204307Z" level=info msg="StartContainer for \"d106b3df64821039f96e96c8ae7f73e83c8af7d56efcc431781f6c94902d507e\" returns successfully"
Sep 12 23:01:05.515989 containerd[1728]: time="2025-09-12T23:01:05.515927955Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-brplc,Uid:2fd25a20-1212-4ae6-9b8c-db12172e9eb9,Namespace:tigera-operator,Attempt:0,}"
Sep 12 23:01:05.558507 containerd[1728]: time="2025-09-12T23:01:05.558458375Z" level=info msg="connecting to shim de4c190e6f5dad10a81a4d2702e190f096daa0bcbf9d12e4f1cde18a2d3e2698" address="unix:///run/containerd/s/30e6e81e64a13db1c117f377f0159f3257ffc07f91d87aa45307bd4b8ed75819" namespace=k8s.io protocol=ttrpc version=3
Sep 12 23:01:05.576650 systemd[1]: Started cri-containerd-de4c190e6f5dad10a81a4d2702e190f096daa0bcbf9d12e4f1cde18a2d3e2698.scope - libcontainer container de4c190e6f5dad10a81a4d2702e190f096daa0bcbf9d12e4f1cde18a2d3e2698.
Sep 12 23:01:05.615030 containerd[1728]: time="2025-09-12T23:01:05.615008226Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-brplc,Uid:2fd25a20-1212-4ae6-9b8c-db12172e9eb9,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"de4c190e6f5dad10a81a4d2702e190f096daa0bcbf9d12e4f1cde18a2d3e2698\""
Sep 12 23:01:05.616123 containerd[1728]: time="2025-09-12T23:01:05.616080739Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\""
Sep 12 23:01:05.866729 kubelet[3136]: I0912 23:01:05.866682 3136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-ncdw2" podStartSLOduration=1.866670068 podStartE2EDuration="1.866670068s" podCreationTimestamp="2025-09-12 23:01:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 23:01:05.866533247 +0000 UTC m=+6.138160040" watchObservedRunningTime="2025-09-12 23:01:05.866670068 +0000 UTC m=+6.138296859"
Sep 12 23:01:07.310541 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3844432231.mount: Deactivated successfully.
Sep 12 23:01:07.692108 containerd[1728]: time="2025-09-12T23:01:07.692080604Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 23:01:07.694861 containerd[1728]: time="2025-09-12T23:01:07.694776906Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609"
Sep 12 23:01:07.697190 containerd[1728]: time="2025-09-12T23:01:07.697166285Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 23:01:07.700132 containerd[1728]: time="2025-09-12T23:01:07.700109216Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 23:01:07.700514 containerd[1728]: time="2025-09-12T23:01:07.700405162Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 2.084287564s"
Sep 12 23:01:07.700514 containerd[1728]: time="2025-09-12T23:01:07.700429354Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\""
Sep 12 23:01:07.705582 containerd[1728]: time="2025-09-12T23:01:07.705560317Z" level=info msg="CreateContainer within sandbox \"de4c190e6f5dad10a81a4d2702e190f096daa0bcbf9d12e4f1cde18a2d3e2698\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Sep 12 23:01:07.725606 containerd[1728]: time="2025-09-12T23:01:07.725575944Z" level=info msg="Container 5466d3e964a06bac446d74cc5536203f9860ee04cfad17ad7823fe952ff54859: CDI devices from CRI Config.CDIDevices: []"
Sep 12 23:01:07.726946 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2356952691.mount: Deactivated successfully.
Sep 12 23:01:07.736782 containerd[1728]: time="2025-09-12T23:01:07.736759965Z" level=info msg="CreateContainer within sandbox \"de4c190e6f5dad10a81a4d2702e190f096daa0bcbf9d12e4f1cde18a2d3e2698\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"5466d3e964a06bac446d74cc5536203f9860ee04cfad17ad7823fe952ff54859\""
Sep 12 23:01:07.737203 containerd[1728]: time="2025-09-12T23:01:07.737180687Z" level=info msg="StartContainer for \"5466d3e964a06bac446d74cc5536203f9860ee04cfad17ad7823fe952ff54859\""
Sep 12 23:01:07.738005 containerd[1728]: time="2025-09-12T23:01:07.737979095Z" level=info msg="connecting to shim 5466d3e964a06bac446d74cc5536203f9860ee04cfad17ad7823fe952ff54859" address="unix:///run/containerd/s/30e6e81e64a13db1c117f377f0159f3257ffc07f91d87aa45307bd4b8ed75819" protocol=ttrpc version=3
Sep 12 23:01:07.761652 systemd[1]: Started cri-containerd-5466d3e964a06bac446d74cc5536203f9860ee04cfad17ad7823fe952ff54859.scope - libcontainer container 5466d3e964a06bac446d74cc5536203f9860ee04cfad17ad7823fe952ff54859.
Sep 12 23:01:07.787384 containerd[1728]: time="2025-09-12T23:01:07.787360523Z" level=info msg="StartContainer for \"5466d3e964a06bac446d74cc5536203f9860ee04cfad17ad7823fe952ff54859\" returns successfully"
Sep 12 23:01:11.778517 kubelet[3136]: I0912 23:01:11.777966 3136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-brplc" podStartSLOduration=4.692762681 podStartE2EDuration="6.777952102s" podCreationTimestamp="2025-09-12 23:01:05 +0000 UTC" firstStartedPulling="2025-09-12 23:01:05.615772419 +0000 UTC m=+5.887399204" lastFinishedPulling="2025-09-12 23:01:07.700961837 +0000 UTC m=+7.972588625" observedRunningTime="2025-09-12 23:01:07.868340691 +0000 UTC m=+8.139967488" watchObservedRunningTime="2025-09-12 23:01:11.777952102 +0000 UTC m=+12.049578896"
Sep 12 23:01:12.940663 sudo[2157]: pam_unix(sudo:session): session closed for user root
Sep 12 23:01:13.040206 sshd[2156]: Connection closed by 10.200.16.10 port 45546
Sep 12 23:01:13.041642 sshd-session[2153]: pam_unix(sshd:session): session closed for user core
Sep 12 23:01:13.044572 systemd[1]: sshd@6-10.200.8.17:22-10.200.16.10:45546.service: Deactivated successfully.
Sep 12 23:01:13.047102 systemd[1]: session-9.scope: Deactivated successfully.
Sep 12 23:01:13.047467 systemd[1]: session-9.scope: Consumed 3.184s CPU time, 232.1M memory peak.
Sep 12 23:01:13.051254 systemd-logind[1704]: Session 9 logged out. Waiting for processes to exit.
Sep 12 23:01:13.054693 systemd-logind[1704]: Removed session 9.
Sep 12 23:01:16.227533 systemd[1]: Created slice kubepods-besteffort-pod6c5c41c7_56b8_437c_bc7a_bcffbe13800b.slice - libcontainer container kubepods-besteffort-pod6c5c41c7_56b8_437c_bc7a_bcffbe13800b.slice.
Sep 12 23:01:16.304819 kubelet[3136]: I0912 23:01:16.304793 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn9vs\" (UniqueName: \"kubernetes.io/projected/6c5c41c7-56b8-437c-bc7a-bcffbe13800b-kube-api-access-cn9vs\") pod \"calico-typha-78ff968d76-9r7r6\" (UID: \"6c5c41c7-56b8-437c-bc7a-bcffbe13800b\") " pod="calico-system/calico-typha-78ff968d76-9r7r6"
Sep 12 23:01:16.304819 kubelet[3136]: I0912 23:01:16.304824 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c5c41c7-56b8-437c-bc7a-bcffbe13800b-tigera-ca-bundle\") pod \"calico-typha-78ff968d76-9r7r6\" (UID: \"6c5c41c7-56b8-437c-bc7a-bcffbe13800b\") " pod="calico-system/calico-typha-78ff968d76-9r7r6"
Sep 12 23:01:16.305117 kubelet[3136]: I0912 23:01:16.304843 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/6c5c41c7-56b8-437c-bc7a-bcffbe13800b-typha-certs\") pod \"calico-typha-78ff968d76-9r7r6\" (UID: \"6c5c41c7-56b8-437c-bc7a-bcffbe13800b\") " pod="calico-system/calico-typha-78ff968d76-9r7r6"
Sep 12 23:01:16.536229 containerd[1728]: time="2025-09-12T23:01:16.536016600Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-78ff968d76-9r7r6,Uid:6c5c41c7-56b8-437c-bc7a-bcffbe13800b,Namespace:calico-system,Attempt:0,}"
Sep 12 23:01:16.585534 containerd[1728]: time="2025-09-12T23:01:16.585114716Z" level=info msg="connecting to shim ce976123c27c8cfaec40771568814b27dc7f860945188a2bf55fd7c0b6953f64" address="unix:///run/containerd/s/5f9dcf6be35d2878c4eb5767982ff2835800e0552d864cdec95b462446d21309" namespace=k8s.io protocol=ttrpc version=3
Sep 12 23:01:16.607672 kubelet[3136]: I0912 23:01:16.607608 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/278d829c-f434-4eee-b3bd-6fbb76f9bbec-lib-modules\") pod \"calico-node-87txg\" (UID: \"278d829c-f434-4eee-b3bd-6fbb76f9bbec\") " pod="calico-system/calico-node-87txg"
Sep 12 23:01:16.608295 systemd[1]: Created slice kubepods-besteffort-pod278d829c_f434_4eee_b3bd_6fbb76f9bbec.slice - libcontainer container kubepods-besteffort-pod278d829c_f434_4eee_b3bd_6fbb76f9bbec.slice.
Sep 12 23:01:16.610810 kubelet[3136]: I0912 23:01:16.610315 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/278d829c-f434-4eee-b3bd-6fbb76f9bbec-policysync\") pod \"calico-node-87txg\" (UID: \"278d829c-f434-4eee-b3bd-6fbb76f9bbec\") " pod="calico-system/calico-node-87txg"
Sep 12 23:01:16.610810 kubelet[3136]: I0912 23:01:16.610352 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/278d829c-f434-4eee-b3bd-6fbb76f9bbec-cni-log-dir\") pod \"calico-node-87txg\" (UID: \"278d829c-f434-4eee-b3bd-6fbb76f9bbec\") " pod="calico-system/calico-node-87txg"
Sep 12 23:01:16.610810 kubelet[3136]: I0912 23:01:16.610369 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/278d829c-f434-4eee-b3bd-6fbb76f9bbec-node-certs\") pod \"calico-node-87txg\" (UID: \"278d829c-f434-4eee-b3bd-6fbb76f9bbec\") " pod="calico-system/calico-node-87txg"
Sep 12 23:01:16.610810 kubelet[3136]: I0912 23:01:16.610386 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/278d829c-f434-4eee-b3bd-6fbb76f9bbec-tigera-ca-bundle\") pod \"calico-node-87txg\" (UID: \"278d829c-f434-4eee-b3bd-6fbb76f9bbec\") " pod="calico-system/calico-node-87txg"
Sep 12 23:01:16.610810 kubelet[3136]: I0912 23:01:16.610402 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/278d829c-f434-4eee-b3bd-6fbb76f9bbec-cni-net-dir\") pod \"calico-node-87txg\" (UID: \"278d829c-f434-4eee-b3bd-6fbb76f9bbec\") " pod="calico-system/calico-node-87txg"
Sep 12 23:01:16.610951 kubelet[3136]: I0912 23:01:16.610421 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/278d829c-f434-4eee-b3bd-6fbb76f9bbec-var-lib-calico\") pod \"calico-node-87txg\" (UID: \"278d829c-f434-4eee-b3bd-6fbb76f9bbec\") " pod="calico-system/calico-node-87txg"
Sep 12 23:01:16.610951 kubelet[3136]: I0912 23:01:16.610436 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/278d829c-f434-4eee-b3bd-6fbb76f9bbec-xtables-lock\") pod \"calico-node-87txg\" (UID: \"278d829c-f434-4eee-b3bd-6fbb76f9bbec\") " pod="calico-system/calico-node-87txg"
Sep 12 23:01:16.610951 kubelet[3136]: I0912 23:01:16.610464 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/278d829c-f434-4eee-b3bd-6fbb76f9bbec-var-run-calico\") pod \"calico-node-87txg\" (UID: \"278d829c-f434-4eee-b3bd-6fbb76f9bbec\") " pod="calico-system/calico-node-87txg"
Sep 12 23:01:16.610951 kubelet[3136]: I0912 23:01:16.610509 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/278d829c-f434-4eee-b3bd-6fbb76f9bbec-cni-bin-dir\") pod \"calico-node-87txg\" (UID: \"278d829c-f434-4eee-b3bd-6fbb76f9bbec\") " pod="calico-system/calico-node-87txg"
Sep 12 23:01:16.610951 kubelet[3136]: I0912 23:01:16.610533 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/278d829c-f434-4eee-b3bd-6fbb76f9bbec-flexvol-driver-host\") pod \"calico-node-87txg\" (UID: \"278d829c-f434-4eee-b3bd-6fbb76f9bbec\") " pod="calico-system/calico-node-87txg"
Sep 12 23:01:16.611034 kubelet[3136]: I0912 23:01:16.610545 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4mnf\" (UniqueName: \"kubernetes.io/projected/278d829c-f434-4eee-b3bd-6fbb76f9bbec-kube-api-access-p4mnf\") pod \"calico-node-87txg\" (UID: \"278d829c-f434-4eee-b3bd-6fbb76f9bbec\") " pod="calico-system/calico-node-87txg"
Sep 12 23:01:16.627767 systemd[1]: Started cri-containerd-ce976123c27c8cfaec40771568814b27dc7f860945188a2bf55fd7c0b6953f64.scope - libcontainer container ce976123c27c8cfaec40771568814b27dc7f860945188a2bf55fd7c0b6953f64.
Sep 12 23:01:16.665489 containerd[1728]: time="2025-09-12T23:01:16.665463864Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-78ff968d76-9r7r6,Uid:6c5c41c7-56b8-437c-bc7a-bcffbe13800b,Namespace:calico-system,Attempt:0,} returns sandbox id \"ce976123c27c8cfaec40771568814b27dc7f860945188a2bf55fd7c0b6953f64\""
Sep 12 23:01:16.666582 containerd[1728]: time="2025-09-12T23:01:16.666562179Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\""
Sep 12 23:01:16.712002 kubelet[3136]: E0912 23:01:16.711836 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 23:01:16.712002 kubelet[3136]: W0912 23:01:16.711857 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 23:01:16.712002 kubelet[3136]: E0912 23:01:16.711885 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 23:01:16.712002 kubelet[3136]: E0912 23:01:16.712002 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 23:01:16.712002 kubelet[3136]: W0912 23:01:16.712007 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 23:01:16.712153 kubelet[3136]: E0912 23:01:16.712015 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 23:01:16.712153 kubelet[3136]: E0912 23:01:16.712113 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 23:01:16.712153 kubelet[3136]: W0912 23:01:16.712118 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 23:01:16.712153 kubelet[3136]: E0912 23:01:16.712123 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 23:01:16.712233 kubelet[3136]: E0912 23:01:16.712226 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 23:01:16.712233 kubelet[3136]: W0912 23:01:16.712230 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 23:01:16.712273 kubelet[3136]: E0912 23:01:16.712235 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 23:01:16.713108 kubelet[3136]: E0912 23:01:16.712998 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 23:01:16.713108 kubelet[3136]: W0912 23:01:16.713012 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 23:01:16.713108 kubelet[3136]: E0912 23:01:16.713025 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 23:01:16.717014 kubelet[3136]: E0912 23:01:16.716192 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 23:01:16.717014 kubelet[3136]: W0912 23:01:16.716209 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 23:01:16.717014 kubelet[3136]: E0912 23:01:16.716222 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 23:01:16.721823 kubelet[3136]: E0912 23:01:16.721797 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 23:01:16.721823 kubelet[3136]: W0912 23:01:16.721819 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 23:01:16.721916 kubelet[3136]: E0912 23:01:16.721832 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 23:01:16.889360 kubelet[3136]: E0912 23:01:16.889332 3136 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7vb8z" podUID="1b987837-dd23-4076-8418-d77fb0bca3b7"
Sep 12 23:01:16.905966 kubelet[3136]: E0912 23:01:16.905948 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 23:01:16.905966 kubelet[3136]: W0912 23:01:16.905962 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 23:01:16.906085 kubelet[3136]: E0912 23:01:16.905974 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 23:01:16.906085 kubelet[3136]: E0912 23:01:16.906067 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 23:01:16.906085 kubelet[3136]: W0912 23:01:16.906072 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 23:01:16.906085 kubelet[3136]: E0912 23:01:16.906078 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 23:01:16.906176 kubelet[3136]: E0912 23:01:16.906159 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 23:01:16.906176 kubelet[3136]: W0912 23:01:16.906164 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 23:01:16.906176 kubelet[3136]: E0912 23:01:16.906169 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 23:01:16.906301 kubelet[3136]: E0912 23:01:16.906287 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 23:01:16.906301 kubelet[3136]: W0912 23:01:16.906298 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 23:01:16.906356 kubelet[3136]: E0912 23:01:16.906306 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 23:01:16.906426 kubelet[3136]: E0912 23:01:16.906411 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 23:01:16.906426 kubelet[3136]: W0912 23:01:16.906424 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 23:01:16.906470 kubelet[3136]: E0912 23:01:16.906430 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 23:01:16.906543 kubelet[3136]: E0912 23:01:16.906531 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 23:01:16.906543 kubelet[3136]: W0912 23:01:16.906540 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 23:01:16.906595 kubelet[3136]: E0912 23:01:16.906546 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 23:01:16.906626 kubelet[3136]: E0912 23:01:16.906618 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 23:01:16.906626 kubelet[3136]: W0912 23:01:16.906624 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 23:01:16.906708 kubelet[3136]: E0912 23:01:16.906630 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 23:01:16.906733 kubelet[3136]: E0912 23:01:16.906710 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 23:01:16.906733 kubelet[3136]: W0912 23:01:16.906714 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 23:01:16.906733 kubelet[3136]: E0912 23:01:16.906719 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 23:01:16.906799 kubelet[3136]: E0912 23:01:16.906791 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 23:01:16.906799 kubelet[3136]: W0912 23:01:16.906797 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 23:01:16.906873 kubelet[3136]: E0912 23:01:16.906802 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 23:01:16.906873 kubelet[3136]: E0912 23:01:16.906869 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 23:01:16.906921 kubelet[3136]: W0912 23:01:16.906874 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 23:01:16.906921 kubelet[3136]: E0912 23:01:16.906879 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 23:01:16.906960 kubelet[3136]: E0912 23:01:16.906950 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 23:01:16.906960 kubelet[3136]: W0912 23:01:16.906954 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 23:01:16.906960 kubelet[3136]: E0912 23:01:16.906959 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 23:01:16.907049 kubelet[3136]: E0912 23:01:16.907022 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 23:01:16.907049 kubelet[3136]: W0912 23:01:16.907026 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 23:01:16.907049 kubelet[3136]: E0912 23:01:16.907031 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 23:01:16.907107 kubelet[3136]: E0912 23:01:16.907100 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 23:01:16.907107 kubelet[3136]: W0912 23:01:16.907103 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 23:01:16.907146 kubelet[3136]: E0912 23:01:16.907108 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 23:01:16.907180 kubelet[3136]: E0912 23:01:16.907170 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 23:01:16.907180 kubelet[3136]: W0912 23:01:16.907176 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 23:01:16.907256 kubelet[3136]: E0912 23:01:16.907182 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 23:01:16.907256 kubelet[3136]: E0912 23:01:16.907246 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 23:01:16.907256 kubelet[3136]: W0912 23:01:16.907250 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 23:01:16.907256 kubelet[3136]: E0912 23:01:16.907255 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 23:01:16.907344 kubelet[3136]: E0912 23:01:16.907327 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 23:01:16.907344 kubelet[3136]: W0912 23:01:16.907331 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 23:01:16.907344 kubelet[3136]: E0912 23:01:16.907335 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 23:01:16.907452 kubelet[3136]: E0912 23:01:16.907417 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 23:01:16.907452 kubelet[3136]: W0912 23:01:16.907422 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 23:01:16.907452 kubelet[3136]: E0912 23:01:16.907426 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 23:01:16.907538 kubelet[3136]: E0912 23:01:16.907518 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 23:01:16.907538 kubelet[3136]: W0912 23:01:16.907523 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 23:01:16.907538 kubelet[3136]: E0912 23:01:16.907527 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 23:01:16.907606 kubelet[3136]: E0912 23:01:16.907595 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 23:01:16.907606 kubelet[3136]: W0912 23:01:16.907599 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 23:01:16.907685 kubelet[3136]: E0912 23:01:16.907605 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 23:01:16.907685 kubelet[3136]: E0912 23:01:16.907685 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 23:01:16.907724 kubelet[3136]: W0912 23:01:16.907688 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 23:01:16.907724 kubelet[3136]: E0912 23:01:16.907693 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 23:01:16.913280 kubelet[3136]: E0912 23:01:16.912915 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 23:01:16.913280 kubelet[3136]: W0912 23:01:16.912930 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 23:01:16.913280 kubelet[3136]: E0912 23:01:16.912941 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 23:01:16.913280 kubelet[3136]: I0912 23:01:16.912958 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/1b987837-dd23-4076-8418-d77fb0bca3b7-varrun\") pod \"csi-node-driver-7vb8z\" (UID: \"1b987837-dd23-4076-8418-d77fb0bca3b7\") " pod="calico-system/csi-node-driver-7vb8z"
Sep 12 23:01:16.913280 kubelet[3136]: E0912 23:01:16.913087 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 23:01:16.913280 kubelet[3136]: W0912 23:01:16.913093 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 23:01:16.913280 kubelet[3136]: E0912 23:01:16.913100 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 23:01:16.913280 kubelet[3136]: I0912 23:01:16.913113 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1b987837-dd23-4076-8418-d77fb0bca3b7-socket-dir\") pod \"csi-node-driver-7vb8z\" (UID: \"1b987837-dd23-4076-8418-d77fb0bca3b7\") " pod="calico-system/csi-node-driver-7vb8z"
Sep 12 23:01:16.913280 kubelet[3136]: E0912 23:01:16.913252 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 23:01:16.913512 containerd[1728]: time="2025-09-12T23:01:16.913112131Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-87txg,Uid:278d829c-f434-4eee-b3bd-6fbb76f9bbec,Namespace:calico-system,Attempt:0,}"
Sep 12 23:01:16.913818 kubelet[3136]: W0912 23:01:16.913258 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 23:01:16.913818 kubelet[3136]: E0912 23:01:16.913265 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 23:01:16.913818 kubelet[3136]: I0912 23:01:16.913277 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1b987837-dd23-4076-8418-d77fb0bca3b7-kubelet-dir\") pod \"csi-node-driver-7vb8z\" (UID: \"1b987837-dd23-4076-8418-d77fb0bca3b7\") " pod="calico-system/csi-node-driver-7vb8z"
Sep 12 23:01:16.913818 kubelet[3136]: E0912 23:01:16.913408 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 23:01:16.913818 kubelet[3136]: W0912 23:01:16.913413 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 23:01:16.913818 kubelet[3136]: E0912 23:01:16.913419 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 23:01:16.913818 kubelet[3136]: I0912 23:01:16.913431 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdxqz\" (UniqueName: \"kubernetes.io/projected/1b987837-dd23-4076-8418-d77fb0bca3b7-kube-api-access-sdxqz\") pod \"csi-node-driver-7vb8z\" (UID: \"1b987837-dd23-4076-8418-d77fb0bca3b7\") " pod="calico-system/csi-node-driver-7vb8z"
Sep 12 23:01:16.913818 kubelet[3136]: E0912 23:01:16.913548 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 23:01:16.913968 kubelet[3136]: W0912 23:01:16.913553 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 23:01:16.913968 kubelet[3136]: E0912 23:01:16.913559 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Sep 12 23:01:16.913968 kubelet[3136]: I0912 23:01:16.913571 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1b987837-dd23-4076-8418-d77fb0bca3b7-registration-dir\") pod \"csi-node-driver-7vb8z\" (UID: \"1b987837-dd23-4076-8418-d77fb0bca3b7\") " pod="calico-system/csi-node-driver-7vb8z" Sep 12 23:01:16.913968 kubelet[3136]: E0912 23:01:16.913704 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:16.913968 kubelet[3136]: W0912 23:01:16.913723 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:16.913968 kubelet[3136]: E0912 23:01:16.913730 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:16.914123 kubelet[3136]: E0912 23:01:16.914100 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:16.914123 kubelet[3136]: W0912 23:01:16.914121 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:16.914185 kubelet[3136]: E0912 23:01:16.914128 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:16.914330 kubelet[3136]: E0912 23:01:16.914264 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:16.914330 kubelet[3136]: W0912 23:01:16.914285 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:16.914330 kubelet[3136]: E0912 23:01:16.914292 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:16.914525 kubelet[3136]: E0912 23:01:16.914391 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:16.914525 kubelet[3136]: W0912 23:01:16.914395 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:16.914525 kubelet[3136]: E0912 23:01:16.914401 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:16.914636 kubelet[3136]: E0912 23:01:16.914613 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:16.914678 kubelet[3136]: W0912 23:01:16.914643 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:16.914678 kubelet[3136]: E0912 23:01:16.914650 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:16.914826 kubelet[3136]: E0912 23:01:16.914811 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:16.914826 kubelet[3136]: W0912 23:01:16.914820 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:16.914878 kubelet[3136]: E0912 23:01:16.914826 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:16.914948 kubelet[3136]: E0912 23:01:16.914934 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:16.914948 kubelet[3136]: W0912 23:01:16.914945 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:16.915067 kubelet[3136]: E0912 23:01:16.914951 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:16.915067 kubelet[3136]: E0912 23:01:16.915046 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:16.915067 kubelet[3136]: W0912 23:01:16.915050 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:16.915067 kubelet[3136]: E0912 23:01:16.915055 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:16.915234 kubelet[3136]: E0912 23:01:16.915165 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:16.915234 kubelet[3136]: W0912 23:01:16.915169 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:16.915234 kubelet[3136]: E0912 23:01:16.915175 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:16.915341 kubelet[3136]: E0912 23:01:16.915318 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:16.915341 kubelet[3136]: W0912 23:01:16.915338 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:16.915385 kubelet[3136]: E0912 23:01:16.915344 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:16.960836 containerd[1728]: time="2025-09-12T23:01:16.960802047Z" level=info msg="connecting to shim 925ee34414ade99e003b569b5dbdeaec73472d97e57027dcd74387e78af3c99e" address="unix:///run/containerd/s/f52e3c3063f16947553ccb3e5d8f75d33fb73b442af7e471106b0e3b87e763ad" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:01:16.984656 systemd[1]: Started cri-containerd-925ee34414ade99e003b569b5dbdeaec73472d97e57027dcd74387e78af3c99e.scope - libcontainer container 925ee34414ade99e003b569b5dbdeaec73472d97e57027dcd74387e78af3c99e. 
Sep 12 23:01:17.014025 kubelet[3136]: E0912 23:01:17.013970 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:17.014025 kubelet[3136]: W0912 23:01:17.013981 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:17.014025 kubelet[3136]: E0912 23:01:17.013989 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:17.014243 kubelet[3136]: E0912 23:01:17.014203 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:17.014243 kubelet[3136]: W0912 23:01:17.014208 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:17.014243 kubelet[3136]: E0912 23:01:17.014214 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:17.014337 kubelet[3136]: E0912 23:01:17.014331 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:17.014337 kubelet[3136]: W0912 23:01:17.014339 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:17.014388 kubelet[3136]: E0912 23:01:17.014349 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:17.014534 kubelet[3136]: E0912 23:01:17.014471 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:17.014534 kubelet[3136]: W0912 23:01:17.014477 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:17.014623 kubelet[3136]: E0912 23:01:17.014483 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:17.014778 kubelet[3136]: E0912 23:01:17.014748 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:17.014778 kubelet[3136]: W0912 23:01:17.014757 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:17.014926 kubelet[3136]: E0912 23:01:17.014765 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:17.015101 kubelet[3136]: E0912 23:01:17.015092 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:17.015235 kubelet[3136]: W0912 23:01:17.015142 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:17.015235 kubelet[3136]: E0912 23:01:17.015165 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:17.015387 kubelet[3136]: E0912 23:01:17.015379 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:17.015387 kubelet[3136]: W0912 23:01:17.015398 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:17.015387 kubelet[3136]: E0912 23:01:17.015409 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:17.015589 kubelet[3136]: E0912 23:01:17.015549 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:17.015589 kubelet[3136]: W0912 23:01:17.015556 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:17.015589 kubelet[3136]: E0912 23:01:17.015565 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:17.015867 kubelet[3136]: E0912 23:01:17.015852 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:17.015913 kubelet[3136]: W0912 23:01:17.015882 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:17.015913 kubelet[3136]: E0912 23:01:17.015891 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:17.016260 kubelet[3136]: E0912 23:01:17.016249 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:17.016260 kubelet[3136]: W0912 23:01:17.016260 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:17.016544 kubelet[3136]: E0912 23:01:17.016528 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:17.018377 kubelet[3136]: E0912 23:01:17.017049 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:17.018377 kubelet[3136]: W0912 23:01:17.017057 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:17.018377 kubelet[3136]: E0912 23:01:17.017066 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:17.018377 kubelet[3136]: E0912 23:01:17.017186 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:17.018377 kubelet[3136]: W0912 23:01:17.017189 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:17.018377 kubelet[3136]: E0912 23:01:17.017194 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:17.018377 kubelet[3136]: E0912 23:01:17.017292 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:17.018377 kubelet[3136]: W0912 23:01:17.017295 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:17.018377 kubelet[3136]: E0912 23:01:17.017299 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:17.018377 kubelet[3136]: E0912 23:01:17.017747 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:17.018610 kubelet[3136]: W0912 23:01:17.017754 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:17.018610 kubelet[3136]: E0912 23:01:17.017761 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:17.018610 kubelet[3136]: E0912 23:01:17.018401 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:17.018610 kubelet[3136]: W0912 23:01:17.018411 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:17.018610 kubelet[3136]: E0912 23:01:17.018423 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:17.019572 kubelet[3136]: E0912 23:01:17.019555 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:17.019572 kubelet[3136]: W0912 23:01:17.019568 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:17.019659 kubelet[3136]: E0912 23:01:17.019579 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:17.019902 kubelet[3136]: E0912 23:01:17.019885 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:17.019902 kubelet[3136]: W0912 23:01:17.019897 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:17.019902 kubelet[3136]: E0912 23:01:17.019908 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:17.020051 kubelet[3136]: E0912 23:01:17.020042 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:17.020051 kubelet[3136]: W0912 23:01:17.020049 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:17.020113 kubelet[3136]: E0912 23:01:17.020056 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:17.020193 kubelet[3136]: E0912 23:01:17.020182 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:17.020193 kubelet[3136]: W0912 23:01:17.020189 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:17.020248 kubelet[3136]: E0912 23:01:17.020196 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:17.020335 kubelet[3136]: E0912 23:01:17.020324 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:17.020335 kubelet[3136]: W0912 23:01:17.020331 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:17.020393 kubelet[3136]: E0912 23:01:17.020337 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:17.020458 kubelet[3136]: E0912 23:01:17.020453 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:17.020483 kubelet[3136]: W0912 23:01:17.020459 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:17.020483 kubelet[3136]: E0912 23:01:17.020465 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:17.021008 kubelet[3136]: E0912 23:01:17.020991 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:17.021077 kubelet[3136]: W0912 23:01:17.021064 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:17.021113 kubelet[3136]: E0912 23:01:17.021080 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:17.021626 kubelet[3136]: E0912 23:01:17.021610 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:17.021626 kubelet[3136]: W0912 23:01:17.021622 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:17.021705 kubelet[3136]: E0912 23:01:17.021632 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:17.022080 kubelet[3136]: E0912 23:01:17.022065 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:17.022134 kubelet[3136]: W0912 23:01:17.022081 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:17.022134 kubelet[3136]: E0912 23:01:17.022092 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:17.023143 kubelet[3136]: E0912 23:01:17.023118 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:17.023143 kubelet[3136]: W0912 23:01:17.023141 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:17.023143 kubelet[3136]: E0912 23:01:17.023152 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:17.031070 kubelet[3136]: E0912 23:01:17.031034 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:17.031070 kubelet[3136]: W0912 23:01:17.031044 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:17.031070 kubelet[3136]: E0912 23:01:17.031052 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:17.036131 containerd[1728]: time="2025-09-12T23:01:17.036086149Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-87txg,Uid:278d829c-f434-4eee-b3bd-6fbb76f9bbec,Namespace:calico-system,Attempt:0,} returns sandbox id \"925ee34414ade99e003b569b5dbdeaec73472d97e57027dcd74387e78af3c99e\"" Sep 12 23:01:18.071870 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1135093534.mount: Deactivated successfully. 
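The block of repeated kubelet errors above all comes from one source: kubelet's FlexVolume dynamic plugin probe finds the directory `nodeagent~uds` under `/opt/libexec/kubernetes/kubelet-plugins/volume/exec/`, tries to exec the `uds` binary with the `init` command, gets no output because the binary is not present, and then fails to parse the empty string as the JSON status object a FlexVolume driver is expected to return. The following is a minimal illustrative sketch of that call pattern (hypothetical Python, not kubelet's actual Go source; the function name and error wording merely mirror the log messages):

```python
import json
import subprocess

def call_flexvolume_driver(executable, args):
    """Illustrative sketch of the FlexVolume driver-call pattern behind the
    errors above: exec the driver binary and parse its stdout as JSON.
    A real driver would print something like {"status": "Success"}."""
    try:
        result = subprocess.run(
            [executable, *args],
            capture_output=True, text=True, check=False,
        )
        output = result.stdout
    except FileNotFoundError:
        # Corresponds to the W-level log line:
        # "driver call failed: ... error: executable file not found in $PATH, output: ''"
        output = ""
    try:
        return json.loads(output)
    except json.JSONDecodeError:
        # Empty output cannot be unmarshaled, corresponding to the E-level log line:
        # "Failed to unmarshal output for command: init ... unexpected end of JSON input"
        raise RuntimeError(
            "Failed to unmarshal output for command: %s, output: %r" % (args[0], output)
        )
```

Because the probe re-runs on every reconcile pass, the same pair of warning and error lines recurs with fresh timestamps, which is why the log above is dominated by near-identical entries.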
Sep 12 23:01:18.811884 kubelet[3136]: E0912 23:01:18.811823 3136 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7vb8z" podUID="1b987837-dd23-4076-8418-d77fb0bca3b7" Sep 12 23:01:18.998995 containerd[1728]: time="2025-09-12T23:01:18.998964014Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:19.003428 containerd[1728]: time="2025-09-12T23:01:19.003404206Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389" Sep 12 23:01:19.006295 containerd[1728]: time="2025-09-12T23:01:19.005752596Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:19.010572 containerd[1728]: time="2025-09-12T23:01:19.010547305Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:19.010957 containerd[1728]: time="2025-09-12T23:01:19.010938631Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 2.34434877s" Sep 12 23:01:19.011017 containerd[1728]: time="2025-09-12T23:01:19.011008386Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference 
\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\"" Sep 12 23:01:19.011712 containerd[1728]: time="2025-09-12T23:01:19.011694205Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 12 23:01:19.027328 containerd[1728]: time="2025-09-12T23:01:19.027301905Z" level=info msg="CreateContainer within sandbox \"ce976123c27c8cfaec40771568814b27dc7f860945188a2bf55fd7c0b6953f64\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 12 23:01:19.045046 containerd[1728]: time="2025-09-12T23:01:19.045016346Z" level=info msg="Container 7ad84856582c5bc5d4691e30e49ebad841368631df29bb825826d8b12e3c23b8: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:01:19.049640 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount218222585.mount: Deactivated successfully. Sep 12 23:01:19.061774 containerd[1728]: time="2025-09-12T23:01:19.061744454Z" level=info msg="CreateContainer within sandbox \"ce976123c27c8cfaec40771568814b27dc7f860945188a2bf55fd7c0b6953f64\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"7ad84856582c5bc5d4691e30e49ebad841368631df29bb825826d8b12e3c23b8\"" Sep 12 23:01:19.062392 containerd[1728]: time="2025-09-12T23:01:19.062065289Z" level=info msg="StartContainer for \"7ad84856582c5bc5d4691e30e49ebad841368631df29bb825826d8b12e3c23b8\"" Sep 12 23:01:19.063231 containerd[1728]: time="2025-09-12T23:01:19.063196528Z" level=info msg="connecting to shim 7ad84856582c5bc5d4691e30e49ebad841368631df29bb825826d8b12e3c23b8" address="unix:///run/containerd/s/5f9dcf6be35d2878c4eb5767982ff2835800e0552d864cdec95b462446d21309" protocol=ttrpc version=3 Sep 12 23:01:19.083648 systemd[1]: Started cri-containerd-7ad84856582c5bc5d4691e30e49ebad841368631df29bb825826d8b12e3c23b8.scope - libcontainer container 7ad84856582c5bc5d4691e30e49ebad841368631df29bb825826d8b12e3c23b8. 
Sep 12 23:01:19.129482 containerd[1728]: time="2025-09-12T23:01:19.129463233Z" level=info msg="StartContainer for \"7ad84856582c5bc5d4691e30e49ebad841368631df29bb825826d8b12e3c23b8\" returns successfully" Sep 12 23:01:19.926911 kubelet[3136]: E0912 23:01:19.926888 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:19.926911 kubelet[3136]: W0912 23:01:19.926910 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:19.927196 kubelet[3136]: E0912 23:01:19.926924 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:19.927196 kubelet[3136]: E0912 23:01:19.927050 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:19.927196 kubelet[3136]: W0912 23:01:19.927055 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:19.927196 kubelet[3136]: E0912 23:01:19.927062 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:19.927196 kubelet[3136]: E0912 23:01:19.927156 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:19.927196 kubelet[3136]: W0912 23:01:19.927160 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:19.927196 kubelet[3136]: E0912 23:01:19.927166 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:19.927345 kubelet[3136]: E0912 23:01:19.927252 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:19.927345 kubelet[3136]: W0912 23:01:19.927256 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:19.927345 kubelet[3136]: E0912 23:01:19.927262 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:19.927407 kubelet[3136]: E0912 23:01:19.927347 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:19.927407 kubelet[3136]: W0912 23:01:19.927351 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:19.927407 kubelet[3136]: E0912 23:01:19.927356 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:19.927473 kubelet[3136]: E0912 23:01:19.927437 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:19.927473 kubelet[3136]: W0912 23:01:19.927441 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:19.927473 kubelet[3136]: E0912 23:01:19.927446 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:19.927557 kubelet[3136]: E0912 23:01:19.927535 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:19.927557 kubelet[3136]: W0912 23:01:19.927540 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:19.927557 kubelet[3136]: E0912 23:01:19.927546 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:19.927621 kubelet[3136]: E0912 23:01:19.927618 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:19.927643 kubelet[3136]: W0912 23:01:19.927622 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:19.927643 kubelet[3136]: E0912 23:01:19.927627 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:19.927751 kubelet[3136]: E0912 23:01:19.927733 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:19.927751 kubelet[3136]: W0912 23:01:19.927742 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:19.927751 kubelet[3136]: E0912 23:01:19.927750 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:19.927860 kubelet[3136]: E0912 23:01:19.927835 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:19.927860 kubelet[3136]: W0912 23:01:19.927839 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:19.927860 kubelet[3136]: E0912 23:01:19.927845 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:19.927971 kubelet[3136]: E0912 23:01:19.927937 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:19.927971 kubelet[3136]: W0912 23:01:19.927941 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:19.927971 kubelet[3136]: E0912 23:01:19.927947 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:19.928069 kubelet[3136]: E0912 23:01:19.928043 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:19.928069 kubelet[3136]: W0912 23:01:19.928047 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:19.928069 kubelet[3136]: E0912 23:01:19.928052 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:19.928194 kubelet[3136]: E0912 23:01:19.928171 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:19.928194 kubelet[3136]: W0912 23:01:19.928192 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:19.928248 kubelet[3136]: E0912 23:01:19.928198 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:19.928305 kubelet[3136]: E0912 23:01:19.928298 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:19.928305 kubelet[3136]: W0912 23:01:19.928303 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:19.928352 kubelet[3136]: E0912 23:01:19.928309 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:19.928408 kubelet[3136]: E0912 23:01:19.928386 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:19.928408 kubelet[3136]: W0912 23:01:19.928406 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:19.928450 kubelet[3136]: E0912 23:01:19.928411 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:19.936698 kubelet[3136]: E0912 23:01:19.936684 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:19.936698 kubelet[3136]: W0912 23:01:19.936695 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:19.936795 kubelet[3136]: E0912 23:01:19.936706 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:19.936823 kubelet[3136]: E0912 23:01:19.936821 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:19.936846 kubelet[3136]: W0912 23:01:19.936825 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:19.936846 kubelet[3136]: E0912 23:01:19.936843 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:19.936949 kubelet[3136]: E0912 23:01:19.936936 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:19.936949 kubelet[3136]: W0912 23:01:19.936946 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:19.937001 kubelet[3136]: E0912 23:01:19.936954 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:19.937128 kubelet[3136]: E0912 23:01:19.937105 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:19.937128 kubelet[3136]: W0912 23:01:19.937126 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:19.937181 kubelet[3136]: E0912 23:01:19.937132 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:19.937240 kubelet[3136]: E0912 23:01:19.937231 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:19.937240 kubelet[3136]: W0912 23:01:19.937237 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:19.937284 kubelet[3136]: E0912 23:01:19.937243 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:19.937373 kubelet[3136]: E0912 23:01:19.937349 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:19.937373 kubelet[3136]: W0912 23:01:19.937370 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:19.937422 kubelet[3136]: E0912 23:01:19.937377 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:19.937511 kubelet[3136]: E0912 23:01:19.937503 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:19.937531 kubelet[3136]: W0912 23:01:19.937511 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:19.937531 kubelet[3136]: E0912 23:01:19.937517 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:19.937841 kubelet[3136]: E0912 23:01:19.937812 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:19.937841 kubelet[3136]: W0912 23:01:19.937837 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:19.937909 kubelet[3136]: E0912 23:01:19.937847 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:19.937964 kubelet[3136]: E0912 23:01:19.937942 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:19.937964 kubelet[3136]: W0912 23:01:19.937962 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:19.938009 kubelet[3136]: E0912 23:01:19.937968 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:19.938078 kubelet[3136]: E0912 23:01:19.938069 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:19.938078 kubelet[3136]: W0912 23:01:19.938075 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:19.938124 kubelet[3136]: E0912 23:01:19.938081 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:19.938230 kubelet[3136]: E0912 23:01:19.938207 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:19.938230 kubelet[3136]: W0912 23:01:19.938227 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:19.938275 kubelet[3136]: E0912 23:01:19.938233 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:19.938412 kubelet[3136]: E0912 23:01:19.938403 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:19.938412 kubelet[3136]: W0912 23:01:19.938410 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:19.938459 kubelet[3136]: E0912 23:01:19.938415 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:19.938532 kubelet[3136]: E0912 23:01:19.938522 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:19.938532 kubelet[3136]: W0912 23:01:19.938530 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:19.938579 kubelet[3136]: E0912 23:01:19.938535 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:19.938672 kubelet[3136]: E0912 23:01:19.938657 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:19.938672 kubelet[3136]: W0912 23:01:19.938670 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:19.938725 kubelet[3136]: E0912 23:01:19.938678 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:19.938832 kubelet[3136]: E0912 23:01:19.938815 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:19.938832 kubelet[3136]: W0912 23:01:19.938830 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:19.938877 kubelet[3136]: E0912 23:01:19.938836 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:19.938966 kubelet[3136]: E0912 23:01:19.938943 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:19.938966 kubelet[3136]: W0912 23:01:19.938964 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:19.939031 kubelet[3136]: E0912 23:01:19.938970 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:19.939096 kubelet[3136]: E0912 23:01:19.939086 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:19.939096 kubelet[3136]: W0912 23:01:19.939092 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:19.939137 kubelet[3136]: E0912 23:01:19.939097 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:19.939382 kubelet[3136]: E0912 23:01:19.939358 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:19.939382 kubelet[3136]: W0912 23:01:19.939379 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:19.939455 kubelet[3136]: E0912 23:01:19.939385 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:20.812465 kubelet[3136]: E0912 23:01:20.812433 3136 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7vb8z" podUID="1b987837-dd23-4076-8418-d77fb0bca3b7" Sep 12 23:01:20.883871 kubelet[3136]: I0912 23:01:20.883855 3136 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 23:01:20.934680 kubelet[3136]: E0912 23:01:20.934655 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:20.934680 kubelet[3136]: W0912 23:01:20.934677 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:20.934920 kubelet[3136]: E0912 23:01:20.934689 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:20.934920 kubelet[3136]: E0912 23:01:20.934790 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:20.934920 kubelet[3136]: W0912 23:01:20.934795 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:20.934920 kubelet[3136]: E0912 23:01:20.934802 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:20.934920 kubelet[3136]: E0912 23:01:20.934881 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:20.934920 kubelet[3136]: W0912 23:01:20.934885 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:20.934920 kubelet[3136]: E0912 23:01:20.934890 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:20.935065 kubelet[3136]: E0912 23:01:20.935000 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:20.935065 kubelet[3136]: W0912 23:01:20.935003 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:20.935065 kubelet[3136]: E0912 23:01:20.935014 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:20.935125 kubelet[3136]: E0912 23:01:20.935092 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:20.935125 kubelet[3136]: W0912 23:01:20.935096 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:20.935125 kubelet[3136]: E0912 23:01:20.935102 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:20.935185 kubelet[3136]: E0912 23:01:20.935169 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:20.935185 kubelet[3136]: W0912 23:01:20.935172 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:20.935185 kubelet[3136]: E0912 23:01:20.935177 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:20.935256 kubelet[3136]: E0912 23:01:20.935244 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:20.935256 kubelet[3136]: W0912 23:01:20.935251 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:20.935301 kubelet[3136]: E0912 23:01:20.935257 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:20.935351 kubelet[3136]: E0912 23:01:20.935330 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:20.935351 kubelet[3136]: W0912 23:01:20.935348 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:20.935388 kubelet[3136]: E0912 23:01:20.935353 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:20.935484 kubelet[3136]: E0912 23:01:20.935458 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:20.935484 kubelet[3136]: W0912 23:01:20.935480 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:20.935558 kubelet[3136]: E0912 23:01:20.935488 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:20.935596 kubelet[3136]: E0912 23:01:20.935586 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:20.935596 kubelet[3136]: W0912 23:01:20.935593 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:20.935647 kubelet[3136]: E0912 23:01:20.935599 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:20.935675 kubelet[3136]: E0912 23:01:20.935667 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:20.935675 kubelet[3136]: W0912 23:01:20.935672 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:20.935724 kubelet[3136]: E0912 23:01:20.935677 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:20.935752 kubelet[3136]: E0912 23:01:20.935745 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:20.935752 kubelet[3136]: W0912 23:01:20.935749 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:20.935797 kubelet[3136]: E0912 23:01:20.935754 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:20.935837 kubelet[3136]: E0912 23:01:20.935826 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:20.935837 kubelet[3136]: W0912 23:01:20.935832 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:20.935889 kubelet[3136]: E0912 23:01:20.935838 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:20.935916 kubelet[3136]: E0912 23:01:20.935907 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:20.935916 kubelet[3136]: W0912 23:01:20.935912 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:20.935966 kubelet[3136]: E0912 23:01:20.935917 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:20.935993 kubelet[3136]: E0912 23:01:20.935982 3136 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:20.935993 kubelet[3136]: W0912 23:01:20.935986 3136 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:20.936041 kubelet[3136]: E0912 23:01:20.935991 3136 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:22.151468 containerd[1728]: time="2025-09-12T23:01:22.151439311Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:22.155396 containerd[1728]: time="2025-09-12T23:01:22.155365078Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660" Sep 12 23:01:22.158378 containerd[1728]: time="2025-09-12T23:01:22.158344018Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:22.161960 containerd[1728]: time="2025-09-12T23:01:22.161918721Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:22.162423 containerd[1728]: time="2025-09-12T23:01:22.162188100Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 3.150467946s" Sep 12 23:01:22.162423 containerd[1728]: time="2025-09-12T23:01:22.162214169Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\"" Sep 12 23:01:22.170462 containerd[1728]: time="2025-09-12T23:01:22.170435921Z" level=info msg="CreateContainer within sandbox \"925ee34414ade99e003b569b5dbdeaec73472d97e57027dcd74387e78af3c99e\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 12 23:01:22.191012 containerd[1728]: time="2025-09-12T23:01:22.190669092Z" level=info msg="Container 535c771f7f9b284e3a1c1272febbbe3df33b3d2049a7a83b7e2f4213a32195b7: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:01:22.207481 containerd[1728]: time="2025-09-12T23:01:22.207457555Z" level=info msg="CreateContainer within sandbox \"925ee34414ade99e003b569b5dbdeaec73472d97e57027dcd74387e78af3c99e\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"535c771f7f9b284e3a1c1272febbbe3df33b3d2049a7a83b7e2f4213a32195b7\"" Sep 12 23:01:22.208572 containerd[1728]: time="2025-09-12T23:01:22.207842096Z" level=info msg="StartContainer for \"535c771f7f9b284e3a1c1272febbbe3df33b3d2049a7a83b7e2f4213a32195b7\"" Sep 12 23:01:22.209170 containerd[1728]: time="2025-09-12T23:01:22.209145407Z" level=info msg="connecting to shim 535c771f7f9b284e3a1c1272febbbe3df33b3d2049a7a83b7e2f4213a32195b7" address="unix:///run/containerd/s/f52e3c3063f16947553ccb3e5d8f75d33fb73b442af7e471106b0e3b87e763ad" protocol=ttrpc version=3 Sep 12 23:01:22.229630 systemd[1]: Started cri-containerd-535c771f7f9b284e3a1c1272febbbe3df33b3d2049a7a83b7e2f4213a32195b7.scope - libcontainer container 535c771f7f9b284e3a1c1272febbbe3df33b3d2049a7a83b7e2f4213a32195b7. Sep 12 23:01:22.264377 containerd[1728]: time="2025-09-12T23:01:22.264286467Z" level=info msg="StartContainer for \"535c771f7f9b284e3a1c1272febbbe3df33b3d2049a7a83b7e2f4213a32195b7\" returns successfully" Sep 12 23:01:22.266751 systemd[1]: cri-containerd-535c771f7f9b284e3a1c1272febbbe3df33b3d2049a7a83b7e2f4213a32195b7.scope: Deactivated successfully. 
Sep 12 23:01:22.270094 containerd[1728]: time="2025-09-12T23:01:22.270071460Z" level=info msg="received exit event container_id:\"535c771f7f9b284e3a1c1272febbbe3df33b3d2049a7a83b7e2f4213a32195b7\" id:\"535c771f7f9b284e3a1c1272febbbe3df33b3d2049a7a83b7e2f4213a32195b7\" pid:3877 exited_at:{seconds:1757718082 nanos:269776280}" Sep 12 23:01:22.271245 containerd[1728]: time="2025-09-12T23:01:22.271223346Z" level=info msg="TaskExit event in podsandbox handler container_id:\"535c771f7f9b284e3a1c1272febbbe3df33b3d2049a7a83b7e2f4213a32195b7\" id:\"535c771f7f9b284e3a1c1272febbbe3df33b3d2049a7a83b7e2f4213a32195b7\" pid:3877 exited_at:{seconds:1757718082 nanos:269776280}" Sep 12 23:01:22.283612 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-535c771f7f9b284e3a1c1272febbbe3df33b3d2049a7a83b7e2f4213a32195b7-rootfs.mount: Deactivated successfully. Sep 12 23:01:22.812032 kubelet[3136]: E0912 23:01:22.812006 3136 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7vb8z" podUID="1b987837-dd23-4076-8418-d77fb0bca3b7" Sep 12 23:01:22.903278 kubelet[3136]: I0912 23:01:22.902709 3136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-78ff968d76-9r7r6" podStartSLOduration=4.5573895570000005 podStartE2EDuration="6.902697244s" podCreationTimestamp="2025-09-12 23:01:16 +0000 UTC" firstStartedPulling="2025-09-12 23:01:16.666274088 +0000 UTC m=+16.937900878" lastFinishedPulling="2025-09-12 23:01:19.011581773 +0000 UTC m=+19.283208565" observedRunningTime="2025-09-12 23:01:19.893975974 +0000 UTC m=+20.165602765" watchObservedRunningTime="2025-09-12 23:01:22.902697244 +0000 UTC m=+23.174324040" Sep 12 23:01:24.812341 kubelet[3136]: E0912 23:01:24.812314 3136 pod_workers.go:1301] "Error syncing pod, skipping" err="network 
is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7vb8z" podUID="1b987837-dd23-4076-8418-d77fb0bca3b7" Sep 12 23:01:24.894929 containerd[1728]: time="2025-09-12T23:01:24.894582131Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 12 23:01:25.948745 kubelet[3136]: I0912 23:01:25.948690 3136 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 23:01:26.812506 kubelet[3136]: E0912 23:01:26.812471 3136 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7vb8z" podUID="1b987837-dd23-4076-8418-d77fb0bca3b7" Sep 12 23:01:27.377496 containerd[1728]: time="2025-09-12T23:01:27.377468104Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:27.381823 containerd[1728]: time="2025-09-12T23:01:27.381794367Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613" Sep 12 23:01:27.384751 containerd[1728]: time="2025-09-12T23:01:27.384708810Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:27.387987 containerd[1728]: time="2025-09-12T23:01:27.387949896Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:27.388453 containerd[1728]: time="2025-09-12T23:01:27.388233709Z" level=info msg="Pulled image 
\"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 2.493618044s" Sep 12 23:01:27.388453 containerd[1728]: time="2025-09-12T23:01:27.388256652Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\"" Sep 12 23:01:27.395450 containerd[1728]: time="2025-09-12T23:01:27.395426017Z" level=info msg="CreateContainer within sandbox \"925ee34414ade99e003b569b5dbdeaec73472d97e57027dcd74387e78af3c99e\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 12 23:01:27.436585 containerd[1728]: time="2025-09-12T23:01:27.436382469Z" level=info msg="Container f9a068b5208b680cc09159c639cc8237d4fb6dd594884cec1afe5eca8c28d3a7: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:01:27.452168 containerd[1728]: time="2025-09-12T23:01:27.452132410Z" level=info msg="CreateContainer within sandbox \"925ee34414ade99e003b569b5dbdeaec73472d97e57027dcd74387e78af3c99e\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"f9a068b5208b680cc09159c639cc8237d4fb6dd594884cec1afe5eca8c28d3a7\"" Sep 12 23:01:27.452685 containerd[1728]: time="2025-09-12T23:01:27.452534587Z" level=info msg="StartContainer for \"f9a068b5208b680cc09159c639cc8237d4fb6dd594884cec1afe5eca8c28d3a7\"" Sep 12 23:01:27.453898 containerd[1728]: time="2025-09-12T23:01:27.453860625Z" level=info msg="connecting to shim f9a068b5208b680cc09159c639cc8237d4fb6dd594884cec1afe5eca8c28d3a7" address="unix:///run/containerd/s/f52e3c3063f16947553ccb3e5d8f75d33fb73b442af7e471106b0e3b87e763ad" protocol=ttrpc version=3 Sep 12 23:01:27.480632 systemd[1]: Started 
cri-containerd-f9a068b5208b680cc09159c639cc8237d4fb6dd594884cec1afe5eca8c28d3a7.scope - libcontainer container f9a068b5208b680cc09159c639cc8237d4fb6dd594884cec1afe5eca8c28d3a7. Sep 12 23:01:27.509033 containerd[1728]: time="2025-09-12T23:01:27.508971834Z" level=info msg="StartContainer for \"f9a068b5208b680cc09159c639cc8237d4fb6dd594884cec1afe5eca8c28d3a7\" returns successfully" Sep 12 23:01:28.601911 containerd[1728]: time="2025-09-12T23:01:28.601763742Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 12 23:01:28.603576 systemd[1]: cri-containerd-f9a068b5208b680cc09159c639cc8237d4fb6dd594884cec1afe5eca8c28d3a7.scope: Deactivated successfully. Sep 12 23:01:28.603817 systemd[1]: cri-containerd-f9a068b5208b680cc09159c639cc8237d4fb6dd594884cec1afe5eca8c28d3a7.scope: Consumed 335ms CPU time, 195.6M memory peak, 171.3M written to disk. Sep 12 23:01:28.605918 containerd[1728]: time="2025-09-12T23:01:28.605895575Z" level=info msg="received exit event container_id:\"f9a068b5208b680cc09159c639cc8237d4fb6dd594884cec1afe5eca8c28d3a7\" id:\"f9a068b5208b680cc09159c639cc8237d4fb6dd594884cec1afe5eca8c28d3a7\" pid:3937 exited_at:{seconds:1757718088 nanos:605753388}" Sep 12 23:01:28.606177 containerd[1728]: time="2025-09-12T23:01:28.606155286Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f9a068b5208b680cc09159c639cc8237d4fb6dd594884cec1afe5eca8c28d3a7\" id:\"f9a068b5208b680cc09159c639cc8237d4fb6dd594884cec1afe5eca8c28d3a7\" pid:3937 exited_at:{seconds:1757718088 nanos:605753388}" Sep 12 23:01:28.620761 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f9a068b5208b680cc09159c639cc8237d4fb6dd594884cec1afe5eca8c28d3a7-rootfs.mount: Deactivated successfully. 
Sep 12 23:01:28.673595 kubelet[3136]: I0912 23:01:28.673581 3136 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Sep 12 23:01:28.886398 systemd[1]: Created slice kubepods-burstable-podb48c1214_e762_4786_bd68_58bad958ec5e.slice - libcontainer container kubepods-burstable-podb48c1214_e762_4786_bd68_58bad958ec5e.slice. Sep 12 23:01:28.903657 kubelet[3136]: I0912 23:01:28.903624 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b48c1214-e762-4786-bd68-58bad958ec5e-config-volume\") pod \"coredns-674b8bbfcf-jchqr\" (UID: \"b48c1214-e762-4786-bd68-58bad958ec5e\") " pod="kube-system/coredns-674b8bbfcf-jchqr" Sep 12 23:01:28.903657 kubelet[3136]: I0912 23:01:28.903651 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5czm2\" (UniqueName: \"kubernetes.io/projected/b48c1214-e762-4786-bd68-58bad958ec5e-kube-api-access-5czm2\") pod \"coredns-674b8bbfcf-jchqr\" (UID: \"b48c1214-e762-4786-bd68-58bad958ec5e\") " pod="kube-system/coredns-674b8bbfcf-jchqr" Sep 12 23:01:29.035111 systemd[1]: Created slice kubepods-besteffort-pod0cc5f576_7283_4e9c_b7f6_17b22bad935a.slice - libcontainer container kubepods-besteffort-pod0cc5f576_7283_4e9c_b7f6_17b22bad935a.slice. 
Sep 12 23:01:29.104251 kubelet[3136]: I0912 23:01:29.104210 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/62ac665b-4fbe-4cc1-8333-eb18d722c81d-config-volume\") pod \"coredns-674b8bbfcf-vkg6h\" (UID: \"62ac665b-4fbe-4cc1-8333-eb18d722c81d\") " pod="kube-system/coredns-674b8bbfcf-vkg6h" Sep 12 23:01:29.104251 kubelet[3136]: I0912 23:01:29.104238 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0cc5f576-7283-4e9c-b7f6-17b22bad935a-whisker-ca-bundle\") pod \"whisker-748687d77f-g7ttg\" (UID: \"0cc5f576-7283-4e9c-b7f6-17b22bad935a\") " pod="calico-system/whisker-748687d77f-g7ttg" Sep 12 23:01:29.104380 kubelet[3136]: I0912 23:01:29.104261 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9s6l\" (UniqueName: \"kubernetes.io/projected/62ac665b-4fbe-4cc1-8333-eb18d722c81d-kube-api-access-m9s6l\") pod \"coredns-674b8bbfcf-vkg6h\" (UID: \"62ac665b-4fbe-4cc1-8333-eb18d722c81d\") " pod="kube-system/coredns-674b8bbfcf-vkg6h" Sep 12 23:01:29.104380 kubelet[3136]: I0912 23:01:29.104277 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/0cc5f576-7283-4e9c-b7f6-17b22bad935a-whisker-backend-key-pair\") pod \"whisker-748687d77f-g7ttg\" (UID: \"0cc5f576-7283-4e9c-b7f6-17b22bad935a\") " pod="calico-system/whisker-748687d77f-g7ttg" Sep 12 23:01:29.104380 kubelet[3136]: I0912 23:01:29.104293 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-744xn\" (UniqueName: \"kubernetes.io/projected/0cc5f576-7283-4e9c-b7f6-17b22bad935a-kube-api-access-744xn\") pod \"whisker-748687d77f-g7ttg\" (UID: 
\"0cc5f576-7283-4e9c-b7f6-17b22bad935a\") " pod="calico-system/whisker-748687d77f-g7ttg" Sep 12 23:01:29.189822 containerd[1728]: time="2025-09-12T23:01:29.189762754Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-jchqr,Uid:b48c1214-e762-4786-bd68-58bad958ec5e,Namespace:kube-system,Attempt:0,}" Sep 12 23:01:29.335535 systemd[1]: Created slice kubepods-besteffort-podf39f69e4_a9bf_404f_8a5c_8ebad50034e9.slice - libcontainer container kubepods-besteffort-podf39f69e4_a9bf_404f_8a5c_8ebad50034e9.slice. Sep 12 23:01:29.339213 systemd[1]: Created slice kubepods-besteffort-pod1b987837_dd23_4076_8418_d77fb0bca3b7.slice - libcontainer container kubepods-besteffort-pod1b987837_dd23_4076_8418_d77fb0bca3b7.slice. Sep 12 23:01:29.343151 containerd[1728]: time="2025-09-12T23:01:29.343126286Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7vb8z,Uid:1b987837-dd23-4076-8418-d77fb0bca3b7,Namespace:calico-system,Attempt:0,}" Sep 12 23:01:29.384268 containerd[1728]: time="2025-09-12T23:01:29.384232883Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-748687d77f-g7ttg,Uid:0cc5f576-7283-4e9c-b7f6-17b22bad935a,Namespace:calico-system,Attempt:0,}" Sep 12 23:01:29.406918 kubelet[3136]: I0912 23:01:29.406849 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4j4f\" (UniqueName: \"kubernetes.io/projected/f39f69e4-a9bf-404f-8a5c-8ebad50034e9-kube-api-access-w4j4f\") pod \"calico-kube-controllers-565597cb9f-xh8s4\" (UID: \"f39f69e4-a9bf-404f-8a5c-8ebad50034e9\") " pod="calico-system/calico-kube-controllers-565597cb9f-xh8s4" Sep 12 23:01:29.407006 kubelet[3136]: I0912 23:01:29.406883 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f39f69e4-a9bf-404f-8a5c-8ebad50034e9-tigera-ca-bundle\") pod \"calico-kube-controllers-565597cb9f-xh8s4\" (UID: 
\"f39f69e4-a9bf-404f-8a5c-8ebad50034e9\") " pod="calico-system/calico-kube-controllers-565597cb9f-xh8s4" Sep 12 23:01:29.465436 containerd[1728]: time="2025-09-12T23:01:29.465155358Z" level=error msg="Failed to destroy network for sandbox \"1c049ad613f78b2a478facff5ccb5f7fa5ddee928978a1ad6c215d86546f7ad4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:29.487678 containerd[1728]: time="2025-09-12T23:01:29.487392659Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-jchqr,Uid:b48c1214-e762-4786-bd68-58bad958ec5e,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1c049ad613f78b2a478facff5ccb5f7fa5ddee928978a1ad6c215d86546f7ad4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:29.488457 kubelet[3136]: E0912 23:01:29.488398 3136 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1c049ad613f78b2a478facff5ccb5f7fa5ddee928978a1ad6c215d86546f7ad4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:29.489028 kubelet[3136]: E0912 23:01:29.488669 3136 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1c049ad613f78b2a478facff5ccb5f7fa5ddee928978a1ad6c215d86546f7ad4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="kube-system/coredns-674b8bbfcf-jchqr" Sep 12 23:01:29.489028 kubelet[3136]: E0912 23:01:29.488691 3136 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1c049ad613f78b2a478facff5ccb5f7fa5ddee928978a1ad6c215d86546f7ad4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-jchqr" Sep 12 23:01:29.489028 kubelet[3136]: E0912 23:01:29.488816 3136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-jchqr_kube-system(b48c1214-e762-4786-bd68-58bad958ec5e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-jchqr_kube-system(b48c1214-e762-4786-bd68-58bad958ec5e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1c049ad613f78b2a478facff5ccb5f7fa5ddee928978a1ad6c215d86546f7ad4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-jchqr" podUID="b48c1214-e762-4786-bd68-58bad958ec5e" Sep 12 23:01:29.491167 systemd[1]: Created slice kubepods-burstable-pod62ac665b_4fbe_4cc1_8333_eb18d722c81d.slice - libcontainer container kubepods-burstable-pod62ac665b_4fbe_4cc1_8333_eb18d722c81d.slice. Sep 12 23:01:29.500685 containerd[1728]: time="2025-09-12T23:01:29.498354257Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-vkg6h,Uid:62ac665b-4fbe-4cc1-8333-eb18d722c81d,Namespace:kube-system,Attempt:0,}" Sep 12 23:01:29.501700 systemd[1]: Created slice kubepods-besteffort-pod5358e75c_8a39_437c_b59d_55ac7b3e156a.slice - libcontainer container kubepods-besteffort-pod5358e75c_8a39_437c_b59d_55ac7b3e156a.slice. 
Sep 12 23:01:29.508385 kubelet[3136]: I0912 23:01:29.508142 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjz4f\" (UniqueName: \"kubernetes.io/projected/c7bc1133-2e59-4613-a38f-a2f5f55d823e-kube-api-access-sjz4f\") pod \"calico-apiserver-746b8b545d-xm6wc\" (UID: \"c7bc1133-2e59-4613-a38f-a2f5f55d823e\") " pod="calico-apiserver/calico-apiserver-746b8b545d-xm6wc" Sep 12 23:01:29.508385 kubelet[3136]: I0912 23:01:29.508171 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/5358e75c-8a39-437c-b59d-55ac7b3e156a-calico-apiserver-certs\") pod \"calico-apiserver-746b8b545d-8247x\" (UID: \"5358e75c-8a39-437c-b59d-55ac7b3e156a\") " pod="calico-apiserver/calico-apiserver-746b8b545d-8247x" Sep 12 23:01:29.508385 kubelet[3136]: I0912 23:01:29.508207 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4nbm\" (UniqueName: \"kubernetes.io/projected/5358e75c-8a39-437c-b59d-55ac7b3e156a-kube-api-access-q4nbm\") pod \"calico-apiserver-746b8b545d-8247x\" (UID: \"5358e75c-8a39-437c-b59d-55ac7b3e156a\") " pod="calico-apiserver/calico-apiserver-746b8b545d-8247x" Sep 12 23:01:29.508385 kubelet[3136]: I0912 23:01:29.508236 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/c7bc1133-2e59-4613-a38f-a2f5f55d823e-calico-apiserver-certs\") pod \"calico-apiserver-746b8b545d-xm6wc\" (UID: \"c7bc1133-2e59-4613-a38f-a2f5f55d823e\") " pod="calico-apiserver/calico-apiserver-746b8b545d-xm6wc" Sep 12 23:01:29.516256 systemd[1]: Created slice kubepods-besteffort-podc7bc1133_2e59_4613_a38f_a2f5f55d823e.slice - libcontainer container kubepods-besteffort-podc7bc1133_2e59_4613_a38f_a2f5f55d823e.slice. 
Sep 12 23:01:29.526221 systemd[1]: Created slice kubepods-besteffort-podc92e1465_873d_4803_a773_cd505c1cec05.slice - libcontainer container kubepods-besteffort-podc92e1465_873d_4803_a773_cd505c1cec05.slice. Sep 12 23:01:29.584359 containerd[1728]: time="2025-09-12T23:01:29.584332237Z" level=error msg="Failed to destroy network for sandbox \"0ea706c67062933e350b09cd11905e3a7a13c507d6682b9e405561a26b3053ab\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:29.588659 containerd[1728]: time="2025-09-12T23:01:29.588579559Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7vb8z,Uid:1b987837-dd23-4076-8418-d77fb0bca3b7,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0ea706c67062933e350b09cd11905e3a7a13c507d6682b9e405561a26b3053ab\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:29.590857 kubelet[3136]: E0912 23:01:29.588836 3136 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0ea706c67062933e350b09cd11905e3a7a13c507d6682b9e405561a26b3053ab\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:29.590857 kubelet[3136]: E0912 23:01:29.590586 3136 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0ea706c67062933e350b09cd11905e3a7a13c507d6682b9e405561a26b3053ab\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7vb8z" Sep 12 23:01:29.590857 kubelet[3136]: E0912 23:01:29.590606 3136 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0ea706c67062933e350b09cd11905e3a7a13c507d6682b9e405561a26b3053ab\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7vb8z" Sep 12 23:01:29.591072 kubelet[3136]: E0912 23:01:29.590666 3136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-7vb8z_calico-system(1b987837-dd23-4076-8418-d77fb0bca3b7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-7vb8z_calico-system(1b987837-dd23-4076-8418-d77fb0bca3b7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0ea706c67062933e350b09cd11905e3a7a13c507d6682b9e405561a26b3053ab\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-7vb8z" podUID="1b987837-dd23-4076-8418-d77fb0bca3b7" Sep 12 23:01:29.602093 containerd[1728]: time="2025-09-12T23:01:29.602068213Z" level=error msg="Failed to destroy network for sandbox \"6a4078285c9c9a7a6779b5b311da6bbdf2ce91cb381a00f6e06e8a5bc10029d2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:29.605064 containerd[1728]: time="2025-09-12T23:01:29.604990448Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:whisker-748687d77f-g7ttg,Uid:0cc5f576-7283-4e9c-b7f6-17b22bad935a,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6a4078285c9c9a7a6779b5b311da6bbdf2ce91cb381a00f6e06e8a5bc10029d2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:29.605163 kubelet[3136]: E0912 23:01:29.605138 3136 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6a4078285c9c9a7a6779b5b311da6bbdf2ce91cb381a00f6e06e8a5bc10029d2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:29.605196 kubelet[3136]: E0912 23:01:29.605181 3136 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6a4078285c9c9a7a6779b5b311da6bbdf2ce91cb381a00f6e06e8a5bc10029d2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-748687d77f-g7ttg" Sep 12 23:01:29.605249 kubelet[3136]: E0912 23:01:29.605199 3136 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6a4078285c9c9a7a6779b5b311da6bbdf2ce91cb381a00f6e06e8a5bc10029d2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-748687d77f-g7ttg" Sep 12 23:01:29.605275 kubelet[3136]: E0912 23:01:29.605250 3136 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"CreatePodSandbox\" for \"whisker-748687d77f-g7ttg_calico-system(0cc5f576-7283-4e9c-b7f6-17b22bad935a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-748687d77f-g7ttg_calico-system(0cc5f576-7283-4e9c-b7f6-17b22bad935a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6a4078285c9c9a7a6779b5b311da6bbdf2ce91cb381a00f6e06e8a5bc10029d2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-748687d77f-g7ttg" podUID="0cc5f576-7283-4e9c-b7f6-17b22bad935a" Sep 12 23:01:29.608843 kubelet[3136]: I0912 23:01:29.608820 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q94hb\" (UniqueName: \"kubernetes.io/projected/c92e1465-873d-4803-a773-cd505c1cec05-kube-api-access-q94hb\") pod \"goldmane-54d579b49d-hzg8c\" (UID: \"c92e1465-873d-4803-a773-cd505c1cec05\") " pod="calico-system/goldmane-54d579b49d-hzg8c" Sep 12 23:01:29.608916 kubelet[3136]: I0912 23:01:29.608903 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c92e1465-873d-4803-a773-cd505c1cec05-config\") pod \"goldmane-54d579b49d-hzg8c\" (UID: \"c92e1465-873d-4803-a773-cd505c1cec05\") " pod="calico-system/goldmane-54d579b49d-hzg8c" Sep 12 23:01:29.608942 kubelet[3136]: I0912 23:01:29.608932 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c92e1465-873d-4803-a773-cd505c1cec05-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-hzg8c\" (UID: \"c92e1465-873d-4803-a773-cd505c1cec05\") " pod="calico-system/goldmane-54d579b49d-hzg8c" Sep 12 23:01:29.608963 kubelet[3136]: I0912 23:01:29.608949 3136 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/c92e1465-873d-4803-a773-cd505c1cec05-goldmane-key-pair\") pod \"goldmane-54d579b49d-hzg8c\" (UID: \"c92e1465-873d-4803-a773-cd505c1cec05\") " pod="calico-system/goldmane-54d579b49d-hzg8c" Sep 12 23:01:29.611543 containerd[1728]: time="2025-09-12T23:01:29.610834094Z" level=error msg="Failed to destroy network for sandbox \"4a61ec3e575eaef1aeb51216238865b9a2e8a0ed100380fa4db5256fba2b3fb9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:29.614817 containerd[1728]: time="2025-09-12T23:01:29.614783037Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-vkg6h,Uid:62ac665b-4fbe-4cc1-8333-eb18d722c81d,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4a61ec3e575eaef1aeb51216238865b9a2e8a0ed100380fa4db5256fba2b3fb9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:29.615594 kubelet[3136]: E0912 23:01:29.615568 3136 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4a61ec3e575eaef1aeb51216238865b9a2e8a0ed100380fa4db5256fba2b3fb9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:29.615651 kubelet[3136]: E0912 23:01:29.615607 3136 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"4a61ec3e575eaef1aeb51216238865b9a2e8a0ed100380fa4db5256fba2b3fb9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-vkg6h" Sep 12 23:01:29.615651 kubelet[3136]: E0912 23:01:29.615624 3136 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4a61ec3e575eaef1aeb51216238865b9a2e8a0ed100380fa4db5256fba2b3fb9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-vkg6h" Sep 12 23:01:29.615810 kubelet[3136]: E0912 23:01:29.615656 3136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-vkg6h_kube-system(62ac665b-4fbe-4cc1-8333-eb18d722c81d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-vkg6h_kube-system(62ac665b-4fbe-4cc1-8333-eb18d722c81d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4a61ec3e575eaef1aeb51216238865b9a2e8a0ed100380fa4db5256fba2b3fb9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-vkg6h" podUID="62ac665b-4fbe-4cc1-8333-eb18d722c81d" Sep 12 23:01:29.628135 systemd[1]: run-netns-cni\x2db150771c\x2d3a3d\x2d278d\x2d70c6\x2d1e7f9fc12be0.mount: Deactivated successfully. 
Sep 12 23:01:29.638391 containerd[1728]: time="2025-09-12T23:01:29.638367174Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-565597cb9f-xh8s4,Uid:f39f69e4-a9bf-404f-8a5c-8ebad50034e9,Namespace:calico-system,Attempt:0,}" Sep 12 23:01:29.678812 containerd[1728]: time="2025-09-12T23:01:29.678786974Z" level=error msg="Failed to destroy network for sandbox \"9ab768ad59a35bba96d8f7ea6cb6459c47e98a294b5cb6421e05b0d9537ee041\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:29.680422 systemd[1]: run-netns-cni\x2d58697d72\x2d573e\x2d29ee\x2d7f4f\x2dca2c68f44321.mount: Deactivated successfully. Sep 12 23:01:29.683240 containerd[1728]: time="2025-09-12T23:01:29.683213400Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-565597cb9f-xh8s4,Uid:f39f69e4-a9bf-404f-8a5c-8ebad50034e9,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9ab768ad59a35bba96d8f7ea6cb6459c47e98a294b5cb6421e05b0d9537ee041\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:29.683408 kubelet[3136]: E0912 23:01:29.683386 3136 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9ab768ad59a35bba96d8f7ea6cb6459c47e98a294b5cb6421e05b0d9537ee041\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:29.683658 kubelet[3136]: E0912 23:01:29.683420 3136 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"9ab768ad59a35bba96d8f7ea6cb6459c47e98a294b5cb6421e05b0d9537ee041\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-565597cb9f-xh8s4" Sep 12 23:01:29.683658 kubelet[3136]: E0912 23:01:29.683435 3136 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9ab768ad59a35bba96d8f7ea6cb6459c47e98a294b5cb6421e05b0d9537ee041\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-565597cb9f-xh8s4" Sep 12 23:01:29.683658 kubelet[3136]: E0912 23:01:29.683472 3136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-565597cb9f-xh8s4_calico-system(f39f69e4-a9bf-404f-8a5c-8ebad50034e9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-565597cb9f-xh8s4_calico-system(f39f69e4-a9bf-404f-8a5c-8ebad50034e9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9ab768ad59a35bba96d8f7ea6cb6459c47e98a294b5cb6421e05b0d9537ee041\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-565597cb9f-xh8s4" podUID="f39f69e4-a9bf-404f-8a5c-8ebad50034e9" Sep 12 23:01:29.809357 containerd[1728]: time="2025-09-12T23:01:29.809301357Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-746b8b545d-8247x,Uid:5358e75c-8a39-437c-b59d-55ac7b3e156a,Namespace:calico-apiserver,Attempt:0,}" Sep 12 23:01:29.824043 containerd[1728]: 
time="2025-09-12T23:01:29.824023001Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-746b8b545d-xm6wc,Uid:c7bc1133-2e59-4613-a38f-a2f5f55d823e,Namespace:calico-apiserver,Attempt:0,}" Sep 12 23:01:29.830539 containerd[1728]: time="2025-09-12T23:01:29.830519097Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-hzg8c,Uid:c92e1465-873d-4803-a773-cd505c1cec05,Namespace:calico-system,Attempt:0,}" Sep 12 23:01:29.879596 containerd[1728]: time="2025-09-12T23:01:29.879566079Z" level=error msg="Failed to destroy network for sandbox \"1e8bedf6d9dfe0aba22153c7d10b20b40428af916e9bde2b14c4334f54fbe481\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:29.886444 containerd[1728]: time="2025-09-12T23:01:29.886408426Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-746b8b545d-8247x,Uid:5358e75c-8a39-437c-b59d-55ac7b3e156a,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1e8bedf6d9dfe0aba22153c7d10b20b40428af916e9bde2b14c4334f54fbe481\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:29.886698 kubelet[3136]: E0912 23:01:29.886679 3136 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1e8bedf6d9dfe0aba22153c7d10b20b40428af916e9bde2b14c4334f54fbe481\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:29.886748 kubelet[3136]: E0912 23:01:29.886709 3136 kuberuntime_sandbox.go:70] "Failed to create sandbox for 
pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1e8bedf6d9dfe0aba22153c7d10b20b40428af916e9bde2b14c4334f54fbe481\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-746b8b545d-8247x" Sep 12 23:01:29.886748 kubelet[3136]: E0912 23:01:29.886724 3136 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1e8bedf6d9dfe0aba22153c7d10b20b40428af916e9bde2b14c4334f54fbe481\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-746b8b545d-8247x" Sep 12 23:01:29.886793 kubelet[3136]: E0912 23:01:29.886764 3136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-746b8b545d-8247x_calico-apiserver(5358e75c-8a39-437c-b59d-55ac7b3e156a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-746b8b545d-8247x_calico-apiserver(5358e75c-8a39-437c-b59d-55ac7b3e156a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1e8bedf6d9dfe0aba22153c7d10b20b40428af916e9bde2b14c4334f54fbe481\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-746b8b545d-8247x" podUID="5358e75c-8a39-437c-b59d-55ac7b3e156a" Sep 12 23:01:29.888820 containerd[1728]: time="2025-09-12T23:01:29.888783525Z" level=error msg="Failed to destroy network for sandbox \"d6f10bdf2a324c00bbb1babb58b2c6a21db16f9ce6f4a4fce65fa1314a9d7c28\"" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:29.894131 containerd[1728]: time="2025-09-12T23:01:29.894064945Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-746b8b545d-xm6wc,Uid:c7bc1133-2e59-4613-a38f-a2f5f55d823e,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d6f10bdf2a324c00bbb1babb58b2c6a21db16f9ce6f4a4fce65fa1314a9d7c28\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:29.894226 kubelet[3136]: E0912 23:01:29.894188 3136 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d6f10bdf2a324c00bbb1babb58b2c6a21db16f9ce6f4a4fce65fa1314a9d7c28\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:29.894226 kubelet[3136]: E0912 23:01:29.894220 3136 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d6f10bdf2a324c00bbb1babb58b2c6a21db16f9ce6f4a4fce65fa1314a9d7c28\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-746b8b545d-xm6wc" Sep 12 23:01:29.894287 kubelet[3136]: E0912 23:01:29.894237 3136 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d6f10bdf2a324c00bbb1babb58b2c6a21db16f9ce6f4a4fce65fa1314a9d7c28\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-746b8b545d-xm6wc" Sep 12 23:01:29.894311 kubelet[3136]: E0912 23:01:29.894279 3136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-746b8b545d-xm6wc_calico-apiserver(c7bc1133-2e59-4613-a38f-a2f5f55d823e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-746b8b545d-xm6wc_calico-apiserver(c7bc1133-2e59-4613-a38f-a2f5f55d823e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d6f10bdf2a324c00bbb1babb58b2c6a21db16f9ce6f4a4fce65fa1314a9d7c28\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-746b8b545d-xm6wc" podUID="c7bc1133-2e59-4613-a38f-a2f5f55d823e" Sep 12 23:01:29.896970 containerd[1728]: time="2025-09-12T23:01:29.896934169Z" level=error msg="Failed to destroy network for sandbox \"83ead29994a52d775f5aa5e65f990415125512ceb46067d3f69ec4a186781661\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:29.901926 containerd[1728]: time="2025-09-12T23:01:29.901527430Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-hzg8c,Uid:c92e1465-873d-4803-a773-cd505c1cec05,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"83ead29994a52d775f5aa5e65f990415125512ceb46067d3f69ec4a186781661\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Sep 12 23:01:29.902110 kubelet[3136]: E0912 23:01:29.901706 3136 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"83ead29994a52d775f5aa5e65f990415125512ceb46067d3f69ec4a186781661\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:29.902110 kubelet[3136]: E0912 23:01:29.901748 3136 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"83ead29994a52d775f5aa5e65f990415125512ceb46067d3f69ec4a186781661\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-hzg8c" Sep 12 23:01:29.902110 kubelet[3136]: E0912 23:01:29.901761 3136 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"83ead29994a52d775f5aa5e65f990415125512ceb46067d3f69ec4a186781661\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-hzg8c" Sep 12 23:01:29.902293 kubelet[3136]: E0912 23:01:29.901792 3136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-hzg8c_calico-system(c92e1465-873d-4803-a773-cd505c1cec05)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-hzg8c_calico-system(c92e1465-873d-4803-a773-cd505c1cec05)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"83ead29994a52d775f5aa5e65f990415125512ceb46067d3f69ec4a186781661\\\": plugin 
type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-hzg8c" podUID="c92e1465-873d-4803-a773-cd505c1cec05" Sep 12 23:01:29.906550 containerd[1728]: time="2025-09-12T23:01:29.906530284Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 12 23:01:36.631820 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3204029593.mount: Deactivated successfully. Sep 12 23:01:36.662404 containerd[1728]: time="2025-09-12T23:01:36.662372179Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:36.668646 containerd[1728]: time="2025-09-12T23:01:36.668618101Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 12 23:01:36.671344 containerd[1728]: time="2025-09-12T23:01:36.670874239Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:36.674164 containerd[1728]: time="2025-09-12T23:01:36.674140340Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:36.674424 containerd[1728]: time="2025-09-12T23:01:36.674407239Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 6.767851588s" Sep 12 23:01:36.674486 containerd[1728]: 
time="2025-09-12T23:01:36.674475570Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 12 23:01:36.688797 containerd[1728]: time="2025-09-12T23:01:36.688769380Z" level=info msg="CreateContainer within sandbox \"925ee34414ade99e003b569b5dbdeaec73472d97e57027dcd74387e78af3c99e\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 12 23:01:36.708469 containerd[1728]: time="2025-09-12T23:01:36.706980782Z" level=info msg="Container 22a2101fce323ca79525c588bae8cd4cd384e9dc81d6546421115528d3a2a5c2: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:01:36.722140 containerd[1728]: time="2025-09-12T23:01:36.722115462Z" level=info msg="CreateContainer within sandbox \"925ee34414ade99e003b569b5dbdeaec73472d97e57027dcd74387e78af3c99e\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"22a2101fce323ca79525c588bae8cd4cd384e9dc81d6546421115528d3a2a5c2\"" Sep 12 23:01:36.723775 containerd[1728]: time="2025-09-12T23:01:36.722668176Z" level=info msg="StartContainer for \"22a2101fce323ca79525c588bae8cd4cd384e9dc81d6546421115528d3a2a5c2\"" Sep 12 23:01:36.723959 containerd[1728]: time="2025-09-12T23:01:36.723932623Z" level=info msg="connecting to shim 22a2101fce323ca79525c588bae8cd4cd384e9dc81d6546421115528d3a2a5c2" address="unix:///run/containerd/s/f52e3c3063f16947553ccb3e5d8f75d33fb73b442af7e471106b0e3b87e763ad" protocol=ttrpc version=3 Sep 12 23:01:36.745660 systemd[1]: Started cri-containerd-22a2101fce323ca79525c588bae8cd4cd384e9dc81d6546421115528d3a2a5c2.scope - libcontainer container 22a2101fce323ca79525c588bae8cd4cd384e9dc81d6546421115528d3a2a5c2. 
Sep 12 23:01:36.775515 containerd[1728]: time="2025-09-12T23:01:36.775478742Z" level=info msg="StartContainer for \"22a2101fce323ca79525c588bae8cd4cd384e9dc81d6546421115528d3a2a5c2\" returns successfully" Sep 12 23:01:36.933680 kubelet[3136]: I0912 23:01:36.933096 3136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-87txg" podStartSLOduration=1.2949217499999999 podStartE2EDuration="20.933079771s" podCreationTimestamp="2025-09-12 23:01:16 +0000 UTC" firstStartedPulling="2025-09-12 23:01:17.036750434 +0000 UTC m=+17.308377220" lastFinishedPulling="2025-09-12 23:01:36.674908453 +0000 UTC m=+36.946535241" observedRunningTime="2025-09-12 23:01:36.932072739 +0000 UTC m=+37.203699535" watchObservedRunningTime="2025-09-12 23:01:36.933079771 +0000 UTC m=+37.204706554" Sep 12 23:01:37.134684 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 12 23:01:37.134748 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld. All Rights Reserved. 
Sep 12 23:01:37.346813 kubelet[3136]: I0912 23:01:37.346730 3136 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/0cc5f576-7283-4e9c-b7f6-17b22bad935a-whisker-backend-key-pair\") pod \"0cc5f576-7283-4e9c-b7f6-17b22bad935a\" (UID: \"0cc5f576-7283-4e9c-b7f6-17b22bad935a\") " Sep 12 23:01:37.346813 kubelet[3136]: I0912 23:01:37.346767 3136 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0cc5f576-7283-4e9c-b7f6-17b22bad935a-whisker-ca-bundle\") pod \"0cc5f576-7283-4e9c-b7f6-17b22bad935a\" (UID: \"0cc5f576-7283-4e9c-b7f6-17b22bad935a\") " Sep 12 23:01:37.346813 kubelet[3136]: I0912 23:01:37.346787 3136 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-744xn\" (UniqueName: \"kubernetes.io/projected/0cc5f576-7283-4e9c-b7f6-17b22bad935a-kube-api-access-744xn\") pod \"0cc5f576-7283-4e9c-b7f6-17b22bad935a\" (UID: \"0cc5f576-7283-4e9c-b7f6-17b22bad935a\") " Sep 12 23:01:37.348507 kubelet[3136]: I0912 23:01:37.347390 3136 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cc5f576-7283-4e9c-b7f6-17b22bad935a-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "0cc5f576-7283-4e9c-b7f6-17b22bad935a" (UID: "0cc5f576-7283-4e9c-b7f6-17b22bad935a"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Sep 12 23:01:37.355966 kubelet[3136]: I0912 23:01:37.355942 3136 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cc5f576-7283-4e9c-b7f6-17b22bad935a-kube-api-access-744xn" (OuterVolumeSpecName: "kube-api-access-744xn") pod "0cc5f576-7283-4e9c-b7f6-17b22bad935a" (UID: "0cc5f576-7283-4e9c-b7f6-17b22bad935a"). InnerVolumeSpecName "kube-api-access-744xn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 12 23:01:37.356584 kubelet[3136]: I0912 23:01:37.356554 3136 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cc5f576-7283-4e9c-b7f6-17b22bad935a-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "0cc5f576-7283-4e9c-b7f6-17b22bad935a" (UID: "0cc5f576-7283-4e9c-b7f6-17b22bad935a"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 12 23:01:37.449199 kubelet[3136]: I0912 23:01:37.449176 3136 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0cc5f576-7283-4e9c-b7f6-17b22bad935a-whisker-ca-bundle\") on node \"ci-4459.0.0-a-36add7270c\" DevicePath \"\"" Sep 12 23:01:37.449199 kubelet[3136]: I0912 23:01:37.449199 3136 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-744xn\" (UniqueName: \"kubernetes.io/projected/0cc5f576-7283-4e9c-b7f6-17b22bad935a-kube-api-access-744xn\") on node \"ci-4459.0.0-a-36add7270c\" DevicePath \"\"" Sep 12 23:01:37.449283 kubelet[3136]: I0912 23:01:37.449208 3136 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/0cc5f576-7283-4e9c-b7f6-17b22bad935a-whisker-backend-key-pair\") on node \"ci-4459.0.0-a-36add7270c\" DevicePath \"\"" Sep 12 23:01:37.631865 systemd[1]: var-lib-kubelet-pods-0cc5f576\x2d7283\x2d4e9c\x2db7f6\x2d17b22bad935a-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d744xn.mount: Deactivated successfully. Sep 12 23:01:37.631941 systemd[1]: var-lib-kubelet-pods-0cc5f576\x2d7283\x2d4e9c\x2db7f6\x2d17b22bad935a-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Sep 12 23:01:37.748687 containerd[1728]: time="2025-09-12T23:01:37.748664264Z" level=info msg="TaskExit event in podsandbox handler container_id:\"22a2101fce323ca79525c588bae8cd4cd384e9dc81d6546421115528d3a2a5c2\" id:\"922b5af66c6c08756bf5d528be8d6a59030f12dd3aa4f0134e9f4fcfa934f2d4\" pid:4266 exit_status:1 exited_at:{seconds:1757718097 nanos:748395966}" Sep 12 23:01:37.807851 containerd[1728]: time="2025-09-12T23:01:37.807813905Z" level=info msg="TaskExit event in podsandbox handler container_id:\"22a2101fce323ca79525c588bae8cd4cd384e9dc81d6546421115528d3a2a5c2\" id:\"4c7d8149f1ecd9bf70fa8b275382708790d7de78b7aacc23524c240fa4c138c1\" pid:4291 exit_status:1 exited_at:{seconds:1757718097 nanos:807673803}" Sep 12 23:01:37.816407 systemd[1]: Removed slice kubepods-besteffort-pod0cc5f576_7283_4e9c_b7f6_17b22bad935a.slice - libcontainer container kubepods-besteffort-pod0cc5f576_7283_4e9c_b7f6_17b22bad935a.slice. Sep 12 23:01:37.997708 systemd[1]: Created slice kubepods-besteffort-podde32dc27_37ab_4bac_82c1_4a34b2e55b40.slice - libcontainer container kubepods-besteffort-podde32dc27_37ab_4bac_82c1_4a34b2e55b40.slice. 
Sep 12 23:01:38.008724 containerd[1728]: time="2025-09-12T23:01:38.008700237Z" level=info msg="TaskExit event in podsandbox handler container_id:\"22a2101fce323ca79525c588bae8cd4cd384e9dc81d6546421115528d3a2a5c2\" id:\"5dc4b6727c6a39e61a41405a8439c242fbdb9ee8915d0d09bfc9bf1216347a71\" pid:4317 exit_status:1 exited_at:{seconds:1757718098 nanos:8533520}" Sep 12 23:01:38.052193 kubelet[3136]: I0912 23:01:38.052171 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de32dc27-37ab-4bac-82c1-4a34b2e55b40-whisker-ca-bundle\") pod \"whisker-7bfcb5cdc4-zzl9s\" (UID: \"de32dc27-37ab-4bac-82c1-4a34b2e55b40\") " pod="calico-system/whisker-7bfcb5cdc4-zzl9s" Sep 12 23:01:38.052398 kubelet[3136]: I0912 23:01:38.052203 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/de32dc27-37ab-4bac-82c1-4a34b2e55b40-whisker-backend-key-pair\") pod \"whisker-7bfcb5cdc4-zzl9s\" (UID: \"de32dc27-37ab-4bac-82c1-4a34b2e55b40\") " pod="calico-system/whisker-7bfcb5cdc4-zzl9s" Sep 12 23:01:38.052398 kubelet[3136]: I0912 23:01:38.052219 3136 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffl8z\" (UniqueName: \"kubernetes.io/projected/de32dc27-37ab-4bac-82c1-4a34b2e55b40-kube-api-access-ffl8z\") pod \"whisker-7bfcb5cdc4-zzl9s\" (UID: \"de32dc27-37ab-4bac-82c1-4a34b2e55b40\") " pod="calico-system/whisker-7bfcb5cdc4-zzl9s" Sep 12 23:01:38.301485 containerd[1728]: time="2025-09-12T23:01:38.301425041Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7bfcb5cdc4-zzl9s,Uid:de32dc27-37ab-4bac-82c1-4a34b2e55b40,Namespace:calico-system,Attempt:0,}" Sep 12 23:01:38.417053 systemd-networkd[1575]: cali2d4a44d73fc: Link UP Sep 12 23:01:38.418089 systemd-networkd[1575]: cali2d4a44d73fc: Gained carrier Sep 12 
23:01:38.429187 containerd[1728]: 2025-09-12 23:01:38.330 [INFO][4331] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 23:01:38.429187 containerd[1728]: 2025-09-12 23:01:38.336 [INFO][4331] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.0.0--a--36add7270c-k8s-whisker--7bfcb5cdc4--zzl9s-eth0 whisker-7bfcb5cdc4- calico-system de32dc27-37ab-4bac-82c1-4a34b2e55b40 899 0 2025-09-12 23:01:37 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:7bfcb5cdc4 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459.0.0-a-36add7270c whisker-7bfcb5cdc4-zzl9s eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali2d4a44d73fc [] [] }} ContainerID="75b604fe56602790e1765e8323f6d26c5901d6a54e5d928e047d4db211871dee" Namespace="calico-system" Pod="whisker-7bfcb5cdc4-zzl9s" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-whisker--7bfcb5cdc4--zzl9s-" Sep 12 23:01:38.429187 containerd[1728]: 2025-09-12 23:01:38.336 [INFO][4331] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="75b604fe56602790e1765e8323f6d26c5901d6a54e5d928e047d4db211871dee" Namespace="calico-system" Pod="whisker-7bfcb5cdc4-zzl9s" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-whisker--7bfcb5cdc4--zzl9s-eth0" Sep 12 23:01:38.429187 containerd[1728]: 2025-09-12 23:01:38.356 [INFO][4345] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="75b604fe56602790e1765e8323f6d26c5901d6a54e5d928e047d4db211871dee" HandleID="k8s-pod-network.75b604fe56602790e1765e8323f6d26c5901d6a54e5d928e047d4db211871dee" Workload="ci--4459.0.0--a--36add7270c-k8s-whisker--7bfcb5cdc4--zzl9s-eth0" Sep 12 23:01:38.429718 containerd[1728]: 2025-09-12 23:01:38.357 [INFO][4345] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="75b604fe56602790e1765e8323f6d26c5901d6a54e5d928e047d4db211871dee" HandleID="k8s-pod-network.75b604fe56602790e1765e8323f6d26c5901d6a54e5d928e047d4db211871dee" Workload="ci--4459.0.0--a--36add7270c-k8s-whisker--7bfcb5cdc4--zzl9s-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002cd6d0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.0.0-a-36add7270c", "pod":"whisker-7bfcb5cdc4-zzl9s", "timestamp":"2025-09-12 23:01:38.356929604 +0000 UTC"}, Hostname:"ci-4459.0.0-a-36add7270c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 23:01:38.429718 containerd[1728]: 2025-09-12 23:01:38.357 [INFO][4345] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:01:38.429718 containerd[1728]: 2025-09-12 23:01:38.357 [INFO][4345] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 23:01:38.429718 containerd[1728]: 2025-09-12 23:01:38.357 [INFO][4345] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.0.0-a-36add7270c' Sep 12 23:01:38.429718 containerd[1728]: 2025-09-12 23:01:38.361 [INFO][4345] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.75b604fe56602790e1765e8323f6d26c5901d6a54e5d928e047d4db211871dee" host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:38.429718 containerd[1728]: 2025-09-12 23:01:38.364 [INFO][4345] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:38.429718 containerd[1728]: 2025-09-12 23:01:38.367 [INFO][4345] ipam/ipam.go 511: Trying affinity for 192.168.35.192/26 host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:38.429718 containerd[1728]: 2025-09-12 23:01:38.368 [INFO][4345] ipam/ipam.go 158: Attempting to load block cidr=192.168.35.192/26 host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:38.429718 containerd[1728]: 2025-09-12 23:01:38.370 [INFO][4345] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.35.192/26 host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:38.430180 containerd[1728]: 2025-09-12 23:01:38.370 [INFO][4345] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.35.192/26 handle="k8s-pod-network.75b604fe56602790e1765e8323f6d26c5901d6a54e5d928e047d4db211871dee" host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:38.430180 containerd[1728]: 2025-09-12 23:01:38.371 [INFO][4345] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.75b604fe56602790e1765e8323f6d26c5901d6a54e5d928e047d4db211871dee Sep 12 23:01:38.430180 containerd[1728]: 2025-09-12 23:01:38.376 [INFO][4345] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.35.192/26 handle="k8s-pod-network.75b604fe56602790e1765e8323f6d26c5901d6a54e5d928e047d4db211871dee" host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:38.430180 containerd[1728]: 2025-09-12 23:01:38.383 [INFO][4345] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.35.193/26] block=192.168.35.192/26 handle="k8s-pod-network.75b604fe56602790e1765e8323f6d26c5901d6a54e5d928e047d4db211871dee" host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:38.430180 containerd[1728]: 2025-09-12 23:01:38.383 [INFO][4345] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.35.193/26] handle="k8s-pod-network.75b604fe56602790e1765e8323f6d26c5901d6a54e5d928e047d4db211871dee" host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:38.430180 containerd[1728]: 2025-09-12 23:01:38.383 [INFO][4345] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:01:38.430180 containerd[1728]: 2025-09-12 23:01:38.383 [INFO][4345] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.35.193/26] IPv6=[] ContainerID="75b604fe56602790e1765e8323f6d26c5901d6a54e5d928e047d4db211871dee" HandleID="k8s-pod-network.75b604fe56602790e1765e8323f6d26c5901d6a54e5d928e047d4db211871dee" Workload="ci--4459.0.0--a--36add7270c-k8s-whisker--7bfcb5cdc4--zzl9s-eth0" Sep 12 23:01:38.430386 containerd[1728]: 2025-09-12 23:01:38.385 [INFO][4331] cni-plugin/k8s.go 418: Populated endpoint ContainerID="75b604fe56602790e1765e8323f6d26c5901d6a54e5d928e047d4db211871dee" Namespace="calico-system" Pod="whisker-7bfcb5cdc4-zzl9s" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-whisker--7bfcb5cdc4--zzl9s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--a--36add7270c-k8s-whisker--7bfcb5cdc4--zzl9s-eth0", GenerateName:"whisker-7bfcb5cdc4-", Namespace:"calico-system", SelfLink:"", UID:"de32dc27-37ab-4bac-82c1-4a34b2e55b40", ResourceVersion:"899", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 1, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7bfcb5cdc4", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-a-36add7270c", ContainerID:"", Pod:"whisker-7bfcb5cdc4-zzl9s", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.35.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali2d4a44d73fc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:01:38.430386 containerd[1728]: 2025-09-12 23:01:38.385 [INFO][4331] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.35.193/32] ContainerID="75b604fe56602790e1765e8323f6d26c5901d6a54e5d928e047d4db211871dee" Namespace="calico-system" Pod="whisker-7bfcb5cdc4-zzl9s" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-whisker--7bfcb5cdc4--zzl9s-eth0" Sep 12 23:01:38.430480 containerd[1728]: 2025-09-12 23:01:38.385 [INFO][4331] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2d4a44d73fc ContainerID="75b604fe56602790e1765e8323f6d26c5901d6a54e5d928e047d4db211871dee" Namespace="calico-system" Pod="whisker-7bfcb5cdc4-zzl9s" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-whisker--7bfcb5cdc4--zzl9s-eth0" Sep 12 23:01:38.430480 containerd[1728]: 2025-09-12 23:01:38.417 [INFO][4331] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="75b604fe56602790e1765e8323f6d26c5901d6a54e5d928e047d4db211871dee" Namespace="calico-system" Pod="whisker-7bfcb5cdc4-zzl9s" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-whisker--7bfcb5cdc4--zzl9s-eth0" Sep 12 23:01:38.430556 containerd[1728]: 2025-09-12 23:01:38.417 [INFO][4331] cni-plugin/k8s.go 
446: Added Mac, interface name, and active container ID to endpoint ContainerID="75b604fe56602790e1765e8323f6d26c5901d6a54e5d928e047d4db211871dee" Namespace="calico-system" Pod="whisker-7bfcb5cdc4-zzl9s" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-whisker--7bfcb5cdc4--zzl9s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--a--36add7270c-k8s-whisker--7bfcb5cdc4--zzl9s-eth0", GenerateName:"whisker-7bfcb5cdc4-", Namespace:"calico-system", SelfLink:"", UID:"de32dc27-37ab-4bac-82c1-4a34b2e55b40", ResourceVersion:"899", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 1, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7bfcb5cdc4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-a-36add7270c", ContainerID:"75b604fe56602790e1765e8323f6d26c5901d6a54e5d928e047d4db211871dee", Pod:"whisker-7bfcb5cdc4-zzl9s", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.35.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali2d4a44d73fc", MAC:"de:b0:74:3f:c5:e9", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:01:38.430767 containerd[1728]: 2025-09-12 23:01:38.427 [INFO][4331] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="75b604fe56602790e1765e8323f6d26c5901d6a54e5d928e047d4db211871dee" Namespace="calico-system" Pod="whisker-7bfcb5cdc4-zzl9s" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-whisker--7bfcb5cdc4--zzl9s-eth0" Sep 12 23:01:38.476953 containerd[1728]: time="2025-09-12T23:01:38.476927170Z" level=info msg="connecting to shim 75b604fe56602790e1765e8323f6d26c5901d6a54e5d928e047d4db211871dee" address="unix:///run/containerd/s/fef7fb0853197bb05ec390f3db0535487ffbaebe197cf3c9d1f84696a1226f82" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:01:38.509566 systemd[1]: Started cri-containerd-75b604fe56602790e1765e8323f6d26c5901d6a54e5d928e047d4db211871dee.scope - libcontainer container 75b604fe56602790e1765e8323f6d26c5901d6a54e5d928e047d4db211871dee. Sep 12 23:01:38.592713 containerd[1728]: time="2025-09-12T23:01:38.592688890Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7bfcb5cdc4-zzl9s,Uid:de32dc27-37ab-4bac-82c1-4a34b2e55b40,Namespace:calico-system,Attempt:0,} returns sandbox id \"75b604fe56602790e1765e8323f6d26c5901d6a54e5d928e047d4db211871dee\"" Sep 12 23:01:38.594166 containerd[1728]: time="2025-09-12T23:01:38.594146657Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 12 23:01:39.033703 systemd-networkd[1575]: vxlan.calico: Link UP Sep 12 23:01:39.033710 systemd-networkd[1575]: vxlan.calico: Gained carrier Sep 12 23:01:39.715567 systemd-networkd[1575]: cali2d4a44d73fc: Gained IPv6LL Sep 12 23:01:39.813735 kubelet[3136]: I0912 23:01:39.813709 3136 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cc5f576-7283-4e9c-b7f6-17b22bad935a" path="/var/lib/kubelet/pods/0cc5f576-7283-4e9c-b7f6-17b22bad935a/volumes" Sep 12 23:01:39.966625 containerd[1728]: time="2025-09-12T23:01:39.966562727Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:39.968763 containerd[1728]: 
time="2025-09-12T23:01:39.968690995Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 12 23:01:39.971547 containerd[1728]: time="2025-09-12T23:01:39.971527639Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:39.975366 containerd[1728]: time="2025-09-12T23:01:39.975020266Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:39.975366 containerd[1728]: time="2025-09-12T23:01:39.975295335Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 1.381121604s" Sep 12 23:01:39.975366 containerd[1728]: time="2025-09-12T23:01:39.975315621Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 12 23:01:39.981719 containerd[1728]: time="2025-09-12T23:01:39.981698687Z" level=info msg="CreateContainer within sandbox \"75b604fe56602790e1765e8323f6d26c5901d6a54e5d928e047d4db211871dee\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 12 23:01:39.997989 containerd[1728]: time="2025-09-12T23:01:39.995607703Z" level=info msg="Container 0805c9c19579237d9abe8372c13f79a8b3b211a6929ec0f7299d813ebc5b056e: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:01:40.010807 containerd[1728]: time="2025-09-12T23:01:40.010785336Z" level=info msg="CreateContainer within sandbox 
\"75b604fe56602790e1765e8323f6d26c5901d6a54e5d928e047d4db211871dee\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"0805c9c19579237d9abe8372c13f79a8b3b211a6929ec0f7299d813ebc5b056e\"" Sep 12 23:01:40.011552 containerd[1728]: time="2025-09-12T23:01:40.011092226Z" level=info msg="StartContainer for \"0805c9c19579237d9abe8372c13f79a8b3b211a6929ec0f7299d813ebc5b056e\"" Sep 12 23:01:40.012214 containerd[1728]: time="2025-09-12T23:01:40.012168831Z" level=info msg="connecting to shim 0805c9c19579237d9abe8372c13f79a8b3b211a6929ec0f7299d813ebc5b056e" address="unix:///run/containerd/s/fef7fb0853197bb05ec390f3db0535487ffbaebe197cf3c9d1f84696a1226f82" protocol=ttrpc version=3 Sep 12 23:01:40.029623 systemd[1]: Started cri-containerd-0805c9c19579237d9abe8372c13f79a8b3b211a6929ec0f7299d813ebc5b056e.scope - libcontainer container 0805c9c19579237d9abe8372c13f79a8b3b211a6929ec0f7299d813ebc5b056e. Sep 12 23:01:40.074130 containerd[1728]: time="2025-09-12T23:01:40.074101202Z" level=info msg="StartContainer for \"0805c9c19579237d9abe8372c13f79a8b3b211a6929ec0f7299d813ebc5b056e\" returns successfully" Sep 12 23:01:40.075123 containerd[1728]: time="2025-09-12T23:01:40.075070538Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 12 23:01:40.419628 systemd-networkd[1575]: vxlan.calico: Gained IPv6LL Sep 12 23:01:40.812386 containerd[1728]: time="2025-09-12T23:01:40.812254379Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-jchqr,Uid:b48c1214-e762-4786-bd68-58bad958ec5e,Namespace:kube-system,Attempt:0,}" Sep 12 23:01:40.812625 containerd[1728]: time="2025-09-12T23:01:40.812254305Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7vb8z,Uid:1b987837-dd23-4076-8418-d77fb0bca3b7,Namespace:calico-system,Attempt:0,}" Sep 12 23:01:40.919766 systemd-networkd[1575]: cali3292449619f: Link UP Sep 12 23:01:40.920654 systemd-networkd[1575]: cali3292449619f: Gained carrier Sep 12 
23:01:40.933547 containerd[1728]: 2025-09-12 23:01:40.854 [INFO][4634] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.0.0--a--36add7270c-k8s-coredns--674b8bbfcf--jchqr-eth0 coredns-674b8bbfcf- kube-system b48c1214-e762-4786-bd68-58bad958ec5e 825 0 2025-09-12 23:01:05 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459.0.0-a-36add7270c coredns-674b8bbfcf-jchqr eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali3292449619f [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="b842d8e1c6d64bcd48b18c09c6f5c4d56ac44f562acec52789cee69de5ad4eb3" Namespace="kube-system" Pod="coredns-674b8bbfcf-jchqr" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-coredns--674b8bbfcf--jchqr-" Sep 12 23:01:40.933547 containerd[1728]: 2025-09-12 23:01:40.854 [INFO][4634] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b842d8e1c6d64bcd48b18c09c6f5c4d56ac44f562acec52789cee69de5ad4eb3" Namespace="kube-system" Pod="coredns-674b8bbfcf-jchqr" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-coredns--674b8bbfcf--jchqr-eth0" Sep 12 23:01:40.933547 containerd[1728]: 2025-09-12 23:01:40.886 [INFO][4660] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b842d8e1c6d64bcd48b18c09c6f5c4d56ac44f562acec52789cee69de5ad4eb3" HandleID="k8s-pod-network.b842d8e1c6d64bcd48b18c09c6f5c4d56ac44f562acec52789cee69de5ad4eb3" Workload="ci--4459.0.0--a--36add7270c-k8s-coredns--674b8bbfcf--jchqr-eth0" Sep 12 23:01:40.934461 containerd[1728]: 2025-09-12 23:01:40.886 [INFO][4660] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b842d8e1c6d64bcd48b18c09c6f5c4d56ac44f562acec52789cee69de5ad4eb3" HandleID="k8s-pod-network.b842d8e1c6d64bcd48b18c09c6f5c4d56ac44f562acec52789cee69de5ad4eb3" 
Workload="ci--4459.0.0--a--36add7270c-k8s-coredns--674b8bbfcf--jchqr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f100), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459.0.0-a-36add7270c", "pod":"coredns-674b8bbfcf-jchqr", "timestamp":"2025-09-12 23:01:40.8864421 +0000 UTC"}, Hostname:"ci-4459.0.0-a-36add7270c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 23:01:40.934461 containerd[1728]: 2025-09-12 23:01:40.886 [INFO][4660] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:01:40.934461 containerd[1728]: 2025-09-12 23:01:40.886 [INFO][4660] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 23:01:40.934461 containerd[1728]: 2025-09-12 23:01:40.886 [INFO][4660] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.0.0-a-36add7270c' Sep 12 23:01:40.934461 containerd[1728]: 2025-09-12 23:01:40.894 [INFO][4660] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b842d8e1c6d64bcd48b18c09c6f5c4d56ac44f562acec52789cee69de5ad4eb3" host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:40.934461 containerd[1728]: 2025-09-12 23:01:40.897 [INFO][4660] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:40.934461 containerd[1728]: 2025-09-12 23:01:40.902 [INFO][4660] ipam/ipam.go 511: Trying affinity for 192.168.35.192/26 host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:40.934461 containerd[1728]: 2025-09-12 23:01:40.903 [INFO][4660] ipam/ipam.go 158: Attempting to load block cidr=192.168.35.192/26 host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:40.934461 containerd[1728]: 2025-09-12 23:01:40.905 [INFO][4660] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.35.192/26 host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:40.934659 
containerd[1728]: 2025-09-12 23:01:40.905 [INFO][4660] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.35.192/26 handle="k8s-pod-network.b842d8e1c6d64bcd48b18c09c6f5c4d56ac44f562acec52789cee69de5ad4eb3" host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:40.934659 containerd[1728]: 2025-09-12 23:01:40.906 [INFO][4660] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.b842d8e1c6d64bcd48b18c09c6f5c4d56ac44f562acec52789cee69de5ad4eb3 Sep 12 23:01:40.934659 containerd[1728]: 2025-09-12 23:01:40.910 [INFO][4660] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.35.192/26 handle="k8s-pod-network.b842d8e1c6d64bcd48b18c09c6f5c4d56ac44f562acec52789cee69de5ad4eb3" host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:40.934659 containerd[1728]: 2025-09-12 23:01:40.914 [INFO][4660] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.35.194/26] block=192.168.35.192/26 handle="k8s-pod-network.b842d8e1c6d64bcd48b18c09c6f5c4d56ac44f562acec52789cee69de5ad4eb3" host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:40.934659 containerd[1728]: 2025-09-12 23:01:40.914 [INFO][4660] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.35.194/26] handle="k8s-pod-network.b842d8e1c6d64bcd48b18c09c6f5c4d56ac44f562acec52789cee69de5ad4eb3" host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:40.934659 containerd[1728]: 2025-09-12 23:01:40.914 [INFO][4660] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 23:01:40.934659 containerd[1728]: 2025-09-12 23:01:40.914 [INFO][4660] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.35.194/26] IPv6=[] ContainerID="b842d8e1c6d64bcd48b18c09c6f5c4d56ac44f562acec52789cee69de5ad4eb3" HandleID="k8s-pod-network.b842d8e1c6d64bcd48b18c09c6f5c4d56ac44f562acec52789cee69de5ad4eb3" Workload="ci--4459.0.0--a--36add7270c-k8s-coredns--674b8bbfcf--jchqr-eth0" Sep 12 23:01:40.934756 containerd[1728]: 2025-09-12 23:01:40.916 [INFO][4634] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b842d8e1c6d64bcd48b18c09c6f5c4d56ac44f562acec52789cee69de5ad4eb3" Namespace="kube-system" Pod="coredns-674b8bbfcf-jchqr" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-coredns--674b8bbfcf--jchqr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--a--36add7270c-k8s-coredns--674b8bbfcf--jchqr-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"b48c1214-e762-4786-bd68-58bad958ec5e", ResourceVersion:"825", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 1, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-a-36add7270c", ContainerID:"", Pod:"coredns-674b8bbfcf-jchqr", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.35.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"cali3292449619f", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:01:40.934756 containerd[1728]: 2025-09-12 23:01:40.916 [INFO][4634] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.35.194/32] ContainerID="b842d8e1c6d64bcd48b18c09c6f5c4d56ac44f562acec52789cee69de5ad4eb3" Namespace="kube-system" Pod="coredns-674b8bbfcf-jchqr" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-coredns--674b8bbfcf--jchqr-eth0" Sep 12 23:01:40.934756 containerd[1728]: 2025-09-12 23:01:40.916 [INFO][4634] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3292449619f ContainerID="b842d8e1c6d64bcd48b18c09c6f5c4d56ac44f562acec52789cee69de5ad4eb3" Namespace="kube-system" Pod="coredns-674b8bbfcf-jchqr" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-coredns--674b8bbfcf--jchqr-eth0" Sep 12 23:01:40.934756 containerd[1728]: 2025-09-12 23:01:40.918 [INFO][4634] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b842d8e1c6d64bcd48b18c09c6f5c4d56ac44f562acec52789cee69de5ad4eb3" Namespace="kube-system" Pod="coredns-674b8bbfcf-jchqr" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-coredns--674b8bbfcf--jchqr-eth0" Sep 12 23:01:40.934756 containerd[1728]: 2025-09-12 23:01:40.918 [INFO][4634] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b842d8e1c6d64bcd48b18c09c6f5c4d56ac44f562acec52789cee69de5ad4eb3" Namespace="kube-system" Pod="coredns-674b8bbfcf-jchqr" 
WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-coredns--674b8bbfcf--jchqr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--a--36add7270c-k8s-coredns--674b8bbfcf--jchqr-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"b48c1214-e762-4786-bd68-58bad958ec5e", ResourceVersion:"825", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 1, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-a-36add7270c", ContainerID:"b842d8e1c6d64bcd48b18c09c6f5c4d56ac44f562acec52789cee69de5ad4eb3", Pod:"coredns-674b8bbfcf-jchqr", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.35.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3292449619f", MAC:"a2:42:f9:2e:32:50", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:01:40.934756 
containerd[1728]: 2025-09-12 23:01:40.930 [INFO][4634] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b842d8e1c6d64bcd48b18c09c6f5c4d56ac44f562acec52789cee69de5ad4eb3" Namespace="kube-system" Pod="coredns-674b8bbfcf-jchqr" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-coredns--674b8bbfcf--jchqr-eth0" Sep 12 23:01:40.973512 containerd[1728]: time="2025-09-12T23:01:40.973240826Z" level=info msg="connecting to shim b842d8e1c6d64bcd48b18c09c6f5c4d56ac44f562acec52789cee69de5ad4eb3" address="unix:///run/containerd/s/9140ad2d5c111fae7ea78c4124b5ead07b4784a044641a9f1a9576798e3bb20b" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:01:40.987619 systemd[1]: Started cri-containerd-b842d8e1c6d64bcd48b18c09c6f5c4d56ac44f562acec52789cee69de5ad4eb3.scope - libcontainer container b842d8e1c6d64bcd48b18c09c6f5c4d56ac44f562acec52789cee69de5ad4eb3. Sep 12 23:01:41.034441 systemd-networkd[1575]: calif3e5f34bf0e: Link UP Sep 12 23:01:41.034615 systemd-networkd[1575]: calif3e5f34bf0e: Gained carrier Sep 12 23:01:41.049628 containerd[1728]: 2025-09-12 23:01:40.874 [INFO][4647] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.0.0--a--36add7270c-k8s-csi--node--driver--7vb8z-eth0 csi-node-driver- calico-system 1b987837-dd23-4076-8418-d77fb0bca3b7 713 0 2025-09-12 23:01:16 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4459.0.0-a-36add7270c csi-node-driver-7vb8z eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calif3e5f34bf0e [] [] }} ContainerID="a696b9ce369d85bb543ab81fb036f9da421b9fb1589f50bd8f8a0e96360f0e95" Namespace="calico-system" Pod="csi-node-driver-7vb8z" 
WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-csi--node--driver--7vb8z-" Sep 12 23:01:41.049628 containerd[1728]: 2025-09-12 23:01:40.874 [INFO][4647] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a696b9ce369d85bb543ab81fb036f9da421b9fb1589f50bd8f8a0e96360f0e95" Namespace="calico-system" Pod="csi-node-driver-7vb8z" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-csi--node--driver--7vb8z-eth0" Sep 12 23:01:41.049628 containerd[1728]: 2025-09-12 23:01:40.902 [INFO][4667] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a696b9ce369d85bb543ab81fb036f9da421b9fb1589f50bd8f8a0e96360f0e95" HandleID="k8s-pod-network.a696b9ce369d85bb543ab81fb036f9da421b9fb1589f50bd8f8a0e96360f0e95" Workload="ci--4459.0.0--a--36add7270c-k8s-csi--node--driver--7vb8z-eth0" Sep 12 23:01:41.049628 containerd[1728]: 2025-09-12 23:01:40.902 [INFO][4667] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a696b9ce369d85bb543ab81fb036f9da421b9fb1589f50bd8f8a0e96360f0e95" HandleID="k8s-pod-network.a696b9ce369d85bb543ab81fb036f9da421b9fb1589f50bd8f8a0e96360f0e95" Workload="ci--4459.0.0--a--36add7270c-k8s-csi--node--driver--7vb8z-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002cdcd0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.0.0-a-36add7270c", "pod":"csi-node-driver-7vb8z", "timestamp":"2025-09-12 23:01:40.902167795 +0000 UTC"}, Hostname:"ci-4459.0.0-a-36add7270c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 23:01:41.049628 containerd[1728]: 2025-09-12 23:01:40.902 [INFO][4667] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:01:41.049628 containerd[1728]: 2025-09-12 23:01:40.914 [INFO][4667] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 23:01:41.049628 containerd[1728]: 2025-09-12 23:01:40.914 [INFO][4667] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.0.0-a-36add7270c' Sep 12 23:01:41.049628 containerd[1728]: 2025-09-12 23:01:40.994 [INFO][4667] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a696b9ce369d85bb543ab81fb036f9da421b9fb1589f50bd8f8a0e96360f0e95" host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:41.049628 containerd[1728]: 2025-09-12 23:01:41.003 [INFO][4667] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:41.049628 containerd[1728]: 2025-09-12 23:01:41.007 [INFO][4667] ipam/ipam.go 511: Trying affinity for 192.168.35.192/26 host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:41.049628 containerd[1728]: 2025-09-12 23:01:41.008 [INFO][4667] ipam/ipam.go 158: Attempting to load block cidr=192.168.35.192/26 host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:41.049628 containerd[1728]: 2025-09-12 23:01:41.012 [INFO][4667] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.35.192/26 host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:41.049628 containerd[1728]: 2025-09-12 23:01:41.012 [INFO][4667] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.35.192/26 handle="k8s-pod-network.a696b9ce369d85bb543ab81fb036f9da421b9fb1589f50bd8f8a0e96360f0e95" host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:41.049628 containerd[1728]: 2025-09-12 23:01:41.013 [INFO][4667] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a696b9ce369d85bb543ab81fb036f9da421b9fb1589f50bd8f8a0e96360f0e95 Sep 12 23:01:41.049628 containerd[1728]: 2025-09-12 23:01:41.018 [INFO][4667] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.35.192/26 handle="k8s-pod-network.a696b9ce369d85bb543ab81fb036f9da421b9fb1589f50bd8f8a0e96360f0e95" host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:41.049628 containerd[1728]: 2025-09-12 23:01:41.027 [INFO][4667] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.35.195/26] block=192.168.35.192/26 handle="k8s-pod-network.a696b9ce369d85bb543ab81fb036f9da421b9fb1589f50bd8f8a0e96360f0e95" host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:41.049628 containerd[1728]: 2025-09-12 23:01:41.028 [INFO][4667] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.35.195/26] handle="k8s-pod-network.a696b9ce369d85bb543ab81fb036f9da421b9fb1589f50bd8f8a0e96360f0e95" host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:41.049628 containerd[1728]: 2025-09-12 23:01:41.028 [INFO][4667] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:01:41.049628 containerd[1728]: 2025-09-12 23:01:41.029 [INFO][4667] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.35.195/26] IPv6=[] ContainerID="a696b9ce369d85bb543ab81fb036f9da421b9fb1589f50bd8f8a0e96360f0e95" HandleID="k8s-pod-network.a696b9ce369d85bb543ab81fb036f9da421b9fb1589f50bd8f8a0e96360f0e95" Workload="ci--4459.0.0--a--36add7270c-k8s-csi--node--driver--7vb8z-eth0" Sep 12 23:01:41.050542 containerd[1728]: 2025-09-12 23:01:41.030 [INFO][4647] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a696b9ce369d85bb543ab81fb036f9da421b9fb1589f50bd8f8a0e96360f0e95" Namespace="calico-system" Pod="csi-node-driver-7vb8z" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-csi--node--driver--7vb8z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--a--36add7270c-k8s-csi--node--driver--7vb8z-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"1b987837-dd23-4076-8418-d77fb0bca3b7", ResourceVersion:"713", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 1, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-a-36add7270c", ContainerID:"", Pod:"csi-node-driver-7vb8z", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.35.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calif3e5f34bf0e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:01:41.050542 containerd[1728]: 2025-09-12 23:01:41.032 [INFO][4647] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.35.195/32] ContainerID="a696b9ce369d85bb543ab81fb036f9da421b9fb1589f50bd8f8a0e96360f0e95" Namespace="calico-system" Pod="csi-node-driver-7vb8z" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-csi--node--driver--7vb8z-eth0" Sep 12 23:01:41.050542 containerd[1728]: 2025-09-12 23:01:41.032 [INFO][4647] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif3e5f34bf0e ContainerID="a696b9ce369d85bb543ab81fb036f9da421b9fb1589f50bd8f8a0e96360f0e95" Namespace="calico-system" Pod="csi-node-driver-7vb8z" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-csi--node--driver--7vb8z-eth0" Sep 12 23:01:41.050542 containerd[1728]: 2025-09-12 23:01:41.034 [INFO][4647] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a696b9ce369d85bb543ab81fb036f9da421b9fb1589f50bd8f8a0e96360f0e95" Namespace="calico-system" Pod="csi-node-driver-7vb8z" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-csi--node--driver--7vb8z-eth0" Sep 12 23:01:41.050542 
containerd[1728]: 2025-09-12 23:01:41.035 [INFO][4647] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a696b9ce369d85bb543ab81fb036f9da421b9fb1589f50bd8f8a0e96360f0e95" Namespace="calico-system" Pod="csi-node-driver-7vb8z" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-csi--node--driver--7vb8z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--a--36add7270c-k8s-csi--node--driver--7vb8z-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"1b987837-dd23-4076-8418-d77fb0bca3b7", ResourceVersion:"713", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 1, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-a-36add7270c", ContainerID:"a696b9ce369d85bb543ab81fb036f9da421b9fb1589f50bd8f8a0e96360f0e95", Pod:"csi-node-driver-7vb8z", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.35.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calif3e5f34bf0e", MAC:"7e:5d:34:f2:ff:7b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:01:41.050542 containerd[1728]: 
2025-09-12 23:01:41.047 [INFO][4647] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a696b9ce369d85bb543ab81fb036f9da421b9fb1589f50bd8f8a0e96360f0e95" Namespace="calico-system" Pod="csi-node-driver-7vb8z" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-csi--node--driver--7vb8z-eth0" Sep 12 23:01:41.059151 containerd[1728]: time="2025-09-12T23:01:41.059125114Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-jchqr,Uid:b48c1214-e762-4786-bd68-58bad958ec5e,Namespace:kube-system,Attempt:0,} returns sandbox id \"b842d8e1c6d64bcd48b18c09c6f5c4d56ac44f562acec52789cee69de5ad4eb3\"" Sep 12 23:01:41.069611 containerd[1728]: time="2025-09-12T23:01:41.068956086Z" level=info msg="CreateContainer within sandbox \"b842d8e1c6d64bcd48b18c09c6f5c4d56ac44f562acec52789cee69de5ad4eb3\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 23:01:41.107389 containerd[1728]: time="2025-09-12T23:01:41.107369030Z" level=info msg="Container dfd4f0fe8eb50c2fb7da8b0dc1ec7ab6893c5d20bcd29d3928f919f3a532c91d: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:01:41.119252 containerd[1728]: time="2025-09-12T23:01:41.119155819Z" level=info msg="connecting to shim a696b9ce369d85bb543ab81fb036f9da421b9fb1589f50bd8f8a0e96360f0e95" address="unix:///run/containerd/s/9ef137b8e368b243c3152ed6324e2ecec0b0204389a2b695f05a48894e02acc7" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:01:41.124435 containerd[1728]: time="2025-09-12T23:01:41.124413063Z" level=info msg="CreateContainer within sandbox \"b842d8e1c6d64bcd48b18c09c6f5c4d56ac44f562acec52789cee69de5ad4eb3\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"dfd4f0fe8eb50c2fb7da8b0dc1ec7ab6893c5d20bcd29d3928f919f3a532c91d\"" Sep 12 23:01:41.125843 containerd[1728]: time="2025-09-12T23:01:41.125705992Z" level=info msg="StartContainer for \"dfd4f0fe8eb50c2fb7da8b0dc1ec7ab6893c5d20bcd29d3928f919f3a532c91d\"" Sep 12 23:01:41.126869 containerd[1728]: 
time="2025-09-12T23:01:41.126611144Z" level=info msg="connecting to shim dfd4f0fe8eb50c2fb7da8b0dc1ec7ab6893c5d20bcd29d3928f919f3a532c91d" address="unix:///run/containerd/s/9140ad2d5c111fae7ea78c4124b5ead07b4784a044641a9f1a9576798e3bb20b" protocol=ttrpc version=3 Sep 12 23:01:41.147622 systemd[1]: Started cri-containerd-a696b9ce369d85bb543ab81fb036f9da421b9fb1589f50bd8f8a0e96360f0e95.scope - libcontainer container a696b9ce369d85bb543ab81fb036f9da421b9fb1589f50bd8f8a0e96360f0e95. Sep 12 23:01:41.151010 systemd[1]: Started cri-containerd-dfd4f0fe8eb50c2fb7da8b0dc1ec7ab6893c5d20bcd29d3928f919f3a532c91d.scope - libcontainer container dfd4f0fe8eb50c2fb7da8b0dc1ec7ab6893c5d20bcd29d3928f919f3a532c91d. Sep 12 23:01:41.181281 containerd[1728]: time="2025-09-12T23:01:41.181257677Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7vb8z,Uid:1b987837-dd23-4076-8418-d77fb0bca3b7,Namespace:calico-system,Attempt:0,} returns sandbox id \"a696b9ce369d85bb543ab81fb036f9da421b9fb1589f50bd8f8a0e96360f0e95\"" Sep 12 23:01:41.184246 containerd[1728]: time="2025-09-12T23:01:41.183801794Z" level=info msg="StartContainer for \"dfd4f0fe8eb50c2fb7da8b0dc1ec7ab6893c5d20bcd29d3928f919f3a532c91d\" returns successfully" Sep 12 23:01:41.949804 kubelet[3136]: I0912 23:01:41.949749 3136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-jchqr" podStartSLOduration=36.949737477 podStartE2EDuration="36.949737477s" podCreationTimestamp="2025-09-12 23:01:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 23:01:41.94931541 +0000 UTC m=+42.220942204" watchObservedRunningTime="2025-09-12 23:01:41.949737477 +0000 UTC m=+42.221364257" Sep 12 23:01:42.295981 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3041934686.mount: Deactivated successfully. 
Sep 12 23:01:42.350603 containerd[1728]: time="2025-09-12T23:01:42.350572128Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:42.352967 containerd[1728]: time="2025-09-12T23:01:42.352872639Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Sep 12 23:01:42.355561 containerd[1728]: time="2025-09-12T23:01:42.355541450Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:42.359822 containerd[1728]: time="2025-09-12T23:01:42.359775439Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:42.360234 containerd[1728]: time="2025-09-12T23:01:42.360100282Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 2.28488834s" Sep 12 23:01:42.360234 containerd[1728]: time="2025-09-12T23:01:42.360124639Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Sep 12 23:01:42.361531 containerd[1728]: time="2025-09-12T23:01:42.361043647Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 12 23:01:42.365895 containerd[1728]: time="2025-09-12T23:01:42.365873692Z" level=info msg="CreateContainer within sandbox 
\"75b604fe56602790e1765e8323f6d26c5901d6a54e5d928e047d4db211871dee\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 12 23:01:42.382268 containerd[1728]: time="2025-09-12T23:01:42.381598849Z" level=info msg="Container 634c8888e1f831aaf1129949179781e08668a4400c0bbbf68e71c4c69aa73837: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:01:42.395885 containerd[1728]: time="2025-09-12T23:01:42.395861148Z" level=info msg="CreateContainer within sandbox \"75b604fe56602790e1765e8323f6d26c5901d6a54e5d928e047d4db211871dee\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"634c8888e1f831aaf1129949179781e08668a4400c0bbbf68e71c4c69aa73837\"" Sep 12 23:01:42.396389 containerd[1728]: time="2025-09-12T23:01:42.396363779Z" level=info msg="StartContainer for \"634c8888e1f831aaf1129949179781e08668a4400c0bbbf68e71c4c69aa73837\"" Sep 12 23:01:42.397455 containerd[1728]: time="2025-09-12T23:01:42.397413947Z" level=info msg="connecting to shim 634c8888e1f831aaf1129949179781e08668a4400c0bbbf68e71c4c69aa73837" address="unix:///run/containerd/s/fef7fb0853197bb05ec390f3db0535487ffbaebe197cf3c9d1f84696a1226f82" protocol=ttrpc version=3 Sep 12 23:01:42.411639 systemd[1]: Started cri-containerd-634c8888e1f831aaf1129949179781e08668a4400c0bbbf68e71c4c69aa73837.scope - libcontainer container 634c8888e1f831aaf1129949179781e08668a4400c0bbbf68e71c4c69aa73837. 
Sep 12 23:01:42.450479 containerd[1728]: time="2025-09-12T23:01:42.450454960Z" level=info msg="StartContainer for \"634c8888e1f831aaf1129949179781e08668a4400c0bbbf68e71c4c69aa73837\" returns successfully" Sep 12 23:01:42.723642 systemd-networkd[1575]: cali3292449619f: Gained IPv6LL Sep 12 23:01:42.812156 containerd[1728]: time="2025-09-12T23:01:42.811945752Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-vkg6h,Uid:62ac665b-4fbe-4cc1-8333-eb18d722c81d,Namespace:kube-system,Attempt:0,}" Sep 12 23:01:42.812156 containerd[1728]: time="2025-09-12T23:01:42.812106532Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-565597cb9f-xh8s4,Uid:f39f69e4-a9bf-404f-8a5c-8ebad50034e9,Namespace:calico-system,Attempt:0,}" Sep 12 23:01:42.812324 containerd[1728]: time="2025-09-12T23:01:42.812301582Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-746b8b545d-8247x,Uid:5358e75c-8a39-437c-b59d-55ac7b3e156a,Namespace:calico-apiserver,Attempt:0,}" Sep 12 23:01:42.980673 systemd-networkd[1575]: cali6f617236da8: Link UP Sep 12 23:01:42.981615 systemd-networkd[1575]: cali6f617236da8: Gained carrier Sep 12 23:01:42.994334 kubelet[3136]: I0912 23:01:42.994291 3136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-7bfcb5cdc4-zzl9s" podStartSLOduration=2.227256072 podStartE2EDuration="5.99427625s" podCreationTimestamp="2025-09-12 23:01:37 +0000 UTC" firstStartedPulling="2025-09-12 23:01:38.593718806 +0000 UTC m=+38.865345599" lastFinishedPulling="2025-09-12 23:01:42.360738985 +0000 UTC m=+42.632365777" observedRunningTime="2025-09-12 23:01:42.953737646 +0000 UTC m=+43.225364439" watchObservedRunningTime="2025-09-12 23:01:42.99427625 +0000 UTC m=+43.265903046" Sep 12 23:01:42.998555 containerd[1728]: 2025-09-12 23:01:42.896 [INFO][4883] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{ci--4459.0.0--a--36add7270c-k8s-calico--apiserver--746b8b545d--8247x-eth0 calico-apiserver-746b8b545d- calico-apiserver 5358e75c-8a39-437c-b59d-55ac7b3e156a 833 0 2025-09-12 23:01:14 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:746b8b545d projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459.0.0-a-36add7270c calico-apiserver-746b8b545d-8247x eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali6f617236da8 [] [] }} ContainerID="6ccffa12221de834d8fd44f276f525b8559e913cfc9668177abfb5f9ab4ff392" Namespace="calico-apiserver" Pod="calico-apiserver-746b8b545d-8247x" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-calico--apiserver--746b8b545d--8247x-" Sep 12 23:01:42.998555 containerd[1728]: 2025-09-12 23:01:42.896 [INFO][4883] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6ccffa12221de834d8fd44f276f525b8559e913cfc9668177abfb5f9ab4ff392" Namespace="calico-apiserver" Pod="calico-apiserver-746b8b545d-8247x" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-calico--apiserver--746b8b545d--8247x-eth0" Sep 12 23:01:42.998555 containerd[1728]: 2025-09-12 23:01:42.926 [INFO][4903] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6ccffa12221de834d8fd44f276f525b8559e913cfc9668177abfb5f9ab4ff392" HandleID="k8s-pod-network.6ccffa12221de834d8fd44f276f525b8559e913cfc9668177abfb5f9ab4ff392" Workload="ci--4459.0.0--a--36add7270c-k8s-calico--apiserver--746b8b545d--8247x-eth0" Sep 12 23:01:42.998555 containerd[1728]: 2025-09-12 23:01:42.926 [INFO][4903] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="6ccffa12221de834d8fd44f276f525b8559e913cfc9668177abfb5f9ab4ff392" HandleID="k8s-pod-network.6ccffa12221de834d8fd44f276f525b8559e913cfc9668177abfb5f9ab4ff392" 
Workload="ci--4459.0.0--a--36add7270c-k8s-calico--apiserver--746b8b545d--8247x-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5640), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459.0.0-a-36add7270c", "pod":"calico-apiserver-746b8b545d-8247x", "timestamp":"2025-09-12 23:01:42.92648452 +0000 UTC"}, Hostname:"ci-4459.0.0-a-36add7270c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 23:01:42.998555 containerd[1728]: 2025-09-12 23:01:42.926 [INFO][4903] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:01:42.998555 containerd[1728]: 2025-09-12 23:01:42.926 [INFO][4903] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 23:01:42.998555 containerd[1728]: 2025-09-12 23:01:42.926 [INFO][4903] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.0.0-a-36add7270c' Sep 12 23:01:42.998555 containerd[1728]: 2025-09-12 23:01:42.931 [INFO][4903] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6ccffa12221de834d8fd44f276f525b8559e913cfc9668177abfb5f9ab4ff392" host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:42.998555 containerd[1728]: 2025-09-12 23:01:42.935 [INFO][4903] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:42.998555 containerd[1728]: 2025-09-12 23:01:42.939 [INFO][4903] ipam/ipam.go 511: Trying affinity for 192.168.35.192/26 host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:42.998555 containerd[1728]: 2025-09-12 23:01:42.942 [INFO][4903] ipam/ipam.go 158: Attempting to load block cidr=192.168.35.192/26 host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:42.998555 containerd[1728]: 2025-09-12 23:01:42.944 [INFO][4903] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.35.192/26 host="ci-4459.0.0-a-36add7270c" 
Sep 12 23:01:42.998555 containerd[1728]: 2025-09-12 23:01:42.945 [INFO][4903] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.35.192/26 handle="k8s-pod-network.6ccffa12221de834d8fd44f276f525b8559e913cfc9668177abfb5f9ab4ff392" host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:42.998555 containerd[1728]: 2025-09-12 23:01:42.948 [INFO][4903] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.6ccffa12221de834d8fd44f276f525b8559e913cfc9668177abfb5f9ab4ff392 Sep 12 23:01:42.998555 containerd[1728]: 2025-09-12 23:01:42.955 [INFO][4903] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.35.192/26 handle="k8s-pod-network.6ccffa12221de834d8fd44f276f525b8559e913cfc9668177abfb5f9ab4ff392" host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:42.998555 containerd[1728]: 2025-09-12 23:01:42.969 [INFO][4903] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.35.196/26] block=192.168.35.192/26 handle="k8s-pod-network.6ccffa12221de834d8fd44f276f525b8559e913cfc9668177abfb5f9ab4ff392" host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:42.998555 containerd[1728]: 2025-09-12 23:01:42.969 [INFO][4903] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.35.196/26] handle="k8s-pod-network.6ccffa12221de834d8fd44f276f525b8559e913cfc9668177abfb5f9ab4ff392" host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:42.998555 containerd[1728]: 2025-09-12 23:01:42.970 [INFO][4903] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 23:01:42.998555 containerd[1728]: 2025-09-12 23:01:42.971 [INFO][4903] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.35.196/26] IPv6=[] ContainerID="6ccffa12221de834d8fd44f276f525b8559e913cfc9668177abfb5f9ab4ff392" HandleID="k8s-pod-network.6ccffa12221de834d8fd44f276f525b8559e913cfc9668177abfb5f9ab4ff392" Workload="ci--4459.0.0--a--36add7270c-k8s-calico--apiserver--746b8b545d--8247x-eth0" Sep 12 23:01:42.999023 containerd[1728]: 2025-09-12 23:01:42.974 [INFO][4883] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6ccffa12221de834d8fd44f276f525b8559e913cfc9668177abfb5f9ab4ff392" Namespace="calico-apiserver" Pod="calico-apiserver-746b8b545d-8247x" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-calico--apiserver--746b8b545d--8247x-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--a--36add7270c-k8s-calico--apiserver--746b8b545d--8247x-eth0", GenerateName:"calico-apiserver-746b8b545d-", Namespace:"calico-apiserver", SelfLink:"", UID:"5358e75c-8a39-437c-b59d-55ac7b3e156a", ResourceVersion:"833", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 1, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"746b8b545d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-a-36add7270c", ContainerID:"", Pod:"calico-apiserver-746b8b545d-8247x", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.35.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali6f617236da8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:01:42.999023 containerd[1728]: 2025-09-12 23:01:42.977 [INFO][4883] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.35.196/32] ContainerID="6ccffa12221de834d8fd44f276f525b8559e913cfc9668177abfb5f9ab4ff392" Namespace="calico-apiserver" Pod="calico-apiserver-746b8b545d-8247x" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-calico--apiserver--746b8b545d--8247x-eth0" Sep 12 23:01:42.999023 containerd[1728]: 2025-09-12 23:01:42.977 [INFO][4883] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6f617236da8 ContainerID="6ccffa12221de834d8fd44f276f525b8559e913cfc9668177abfb5f9ab4ff392" Namespace="calico-apiserver" Pod="calico-apiserver-746b8b545d-8247x" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-calico--apiserver--746b8b545d--8247x-eth0" Sep 12 23:01:42.999023 containerd[1728]: 2025-09-12 23:01:42.982 [INFO][4883] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6ccffa12221de834d8fd44f276f525b8559e913cfc9668177abfb5f9ab4ff392" Namespace="calico-apiserver" Pod="calico-apiserver-746b8b545d-8247x" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-calico--apiserver--746b8b545d--8247x-eth0" Sep 12 23:01:42.999023 containerd[1728]: 2025-09-12 23:01:42.983 [INFO][4883] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6ccffa12221de834d8fd44f276f525b8559e913cfc9668177abfb5f9ab4ff392" Namespace="calico-apiserver" Pod="calico-apiserver-746b8b545d-8247x" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-calico--apiserver--746b8b545d--8247x-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--a--36add7270c-k8s-calico--apiserver--746b8b545d--8247x-eth0", GenerateName:"calico-apiserver-746b8b545d-", Namespace:"calico-apiserver", SelfLink:"", UID:"5358e75c-8a39-437c-b59d-55ac7b3e156a", ResourceVersion:"833", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 1, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"746b8b545d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-a-36add7270c", ContainerID:"6ccffa12221de834d8fd44f276f525b8559e913cfc9668177abfb5f9ab4ff392", Pod:"calico-apiserver-746b8b545d-8247x", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.35.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali6f617236da8", MAC:"b2:95:e4:33:d4:c0", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:01:42.999023 containerd[1728]: 2025-09-12 23:01:42.995 [INFO][4883] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6ccffa12221de834d8fd44f276f525b8559e913cfc9668177abfb5f9ab4ff392" Namespace="calico-apiserver" Pod="calico-apiserver-746b8b545d-8247x" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-calico--apiserver--746b8b545d--8247x-eth0" Sep 12 23:01:43.044045 systemd-networkd[1575]: calif3e5f34bf0e: Gained IPv6LL Sep 12 
23:01:43.045508 containerd[1728]: time="2025-09-12T23:01:43.045468976Z" level=info msg="connecting to shim 6ccffa12221de834d8fd44f276f525b8559e913cfc9668177abfb5f9ab4ff392" address="unix:///run/containerd/s/e027a65fafef1fb9e30aef58a198510e8e176cb60b7dd55f0f3051a09e87b334" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:01:43.068618 systemd[1]: Started cri-containerd-6ccffa12221de834d8fd44f276f525b8559e913cfc9668177abfb5f9ab4ff392.scope - libcontainer container 6ccffa12221de834d8fd44f276f525b8559e913cfc9668177abfb5f9ab4ff392. Sep 12 23:01:43.088324 systemd-networkd[1575]: cali7f6b350f280: Link UP Sep 12 23:01:43.088511 systemd-networkd[1575]: cali7f6b350f280: Gained carrier Sep 12 23:01:43.104236 containerd[1728]: 2025-09-12 23:01:42.887 [INFO][4858] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.0.0--a--36add7270c-k8s-coredns--674b8bbfcf--vkg6h-eth0 coredns-674b8bbfcf- kube-system 62ac665b-4fbe-4cc1-8333-eb18d722c81d 832 0 2025-09-12 23:01:05 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459.0.0-a-36add7270c coredns-674b8bbfcf-vkg6h eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali7f6b350f280 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="2d55b6c2046f963b6fe82e25bcb60606142ec153312782f24d35296424ef8e67" Namespace="kube-system" Pod="coredns-674b8bbfcf-vkg6h" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-coredns--674b8bbfcf--vkg6h-" Sep 12 23:01:43.104236 containerd[1728]: 2025-09-12 23:01:42.888 [INFO][4858] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2d55b6c2046f963b6fe82e25bcb60606142ec153312782f24d35296424ef8e67" Namespace="kube-system" Pod="coredns-674b8bbfcf-vkg6h" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-coredns--674b8bbfcf--vkg6h-eth0" Sep 12 
23:01:43.104236 containerd[1728]: 2025-09-12 23:01:42.928 [INFO][4896] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2d55b6c2046f963b6fe82e25bcb60606142ec153312782f24d35296424ef8e67" HandleID="k8s-pod-network.2d55b6c2046f963b6fe82e25bcb60606142ec153312782f24d35296424ef8e67" Workload="ci--4459.0.0--a--36add7270c-k8s-coredns--674b8bbfcf--vkg6h-eth0" Sep 12 23:01:43.104236 containerd[1728]: 2025-09-12 23:01:42.929 [INFO][4896] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="2d55b6c2046f963b6fe82e25bcb60606142ec153312782f24d35296424ef8e67" HandleID="k8s-pod-network.2d55b6c2046f963b6fe82e25bcb60606142ec153312782f24d35296424ef8e67" Workload="ci--4459.0.0--a--36add7270c-k8s-coredns--674b8bbfcf--vkg6h-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5950), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459.0.0-a-36add7270c", "pod":"coredns-674b8bbfcf-vkg6h", "timestamp":"2025-09-12 23:01:42.928268976 +0000 UTC"}, Hostname:"ci-4459.0.0-a-36add7270c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 23:01:43.104236 containerd[1728]: 2025-09-12 23:01:42.929 [INFO][4896] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:01:43.104236 containerd[1728]: 2025-09-12 23:01:42.969 [INFO][4896] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 23:01:43.104236 containerd[1728]: 2025-09-12 23:01:42.969 [INFO][4896] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.0.0-a-36add7270c' Sep 12 23:01:43.104236 containerd[1728]: 2025-09-12 23:01:43.032 [INFO][4896] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.2d55b6c2046f963b6fe82e25bcb60606142ec153312782f24d35296424ef8e67" host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:43.104236 containerd[1728]: 2025-09-12 23:01:43.035 [INFO][4896] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:43.104236 containerd[1728]: 2025-09-12 23:01:43.042 [INFO][4896] ipam/ipam.go 511: Trying affinity for 192.168.35.192/26 host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:43.104236 containerd[1728]: 2025-09-12 23:01:43.044 [INFO][4896] ipam/ipam.go 158: Attempting to load block cidr=192.168.35.192/26 host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:43.104236 containerd[1728]: 2025-09-12 23:01:43.047 [INFO][4896] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.35.192/26 host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:43.104236 containerd[1728]: 2025-09-12 23:01:43.047 [INFO][4896] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.35.192/26 handle="k8s-pod-network.2d55b6c2046f963b6fe82e25bcb60606142ec153312782f24d35296424ef8e67" host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:43.104236 containerd[1728]: 2025-09-12 23:01:43.048 [INFO][4896] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.2d55b6c2046f963b6fe82e25bcb60606142ec153312782f24d35296424ef8e67 Sep 12 23:01:43.104236 containerd[1728]: 2025-09-12 23:01:43.072 [INFO][4896] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.35.192/26 handle="k8s-pod-network.2d55b6c2046f963b6fe82e25bcb60606142ec153312782f24d35296424ef8e67" host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:43.104236 containerd[1728]: 2025-09-12 23:01:43.077 [INFO][4896] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.35.197/26] block=192.168.35.192/26 handle="k8s-pod-network.2d55b6c2046f963b6fe82e25bcb60606142ec153312782f24d35296424ef8e67" host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:43.104236 containerd[1728]: 2025-09-12 23:01:43.077 [INFO][4896] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.35.197/26] handle="k8s-pod-network.2d55b6c2046f963b6fe82e25bcb60606142ec153312782f24d35296424ef8e67" host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:43.104236 containerd[1728]: 2025-09-12 23:01:43.077 [INFO][4896] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:01:43.104236 containerd[1728]: 2025-09-12 23:01:43.077 [INFO][4896] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.35.197/26] IPv6=[] ContainerID="2d55b6c2046f963b6fe82e25bcb60606142ec153312782f24d35296424ef8e67" HandleID="k8s-pod-network.2d55b6c2046f963b6fe82e25bcb60606142ec153312782f24d35296424ef8e67" Workload="ci--4459.0.0--a--36add7270c-k8s-coredns--674b8bbfcf--vkg6h-eth0" Sep 12 23:01:43.104965 containerd[1728]: 2025-09-12 23:01:43.079 [INFO][4858] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2d55b6c2046f963b6fe82e25bcb60606142ec153312782f24d35296424ef8e67" Namespace="kube-system" Pod="coredns-674b8bbfcf-vkg6h" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-coredns--674b8bbfcf--vkg6h-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--a--36add7270c-k8s-coredns--674b8bbfcf--vkg6h-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"62ac665b-4fbe-4cc1-8333-eb18d722c81d", ResourceVersion:"832", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 1, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-a-36add7270c", ContainerID:"", Pod:"coredns-674b8bbfcf-vkg6h", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.35.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali7f6b350f280", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:01:43.104965 containerd[1728]: 2025-09-12 23:01:43.079 [INFO][4858] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.35.197/32] ContainerID="2d55b6c2046f963b6fe82e25bcb60606142ec153312782f24d35296424ef8e67" Namespace="kube-system" Pod="coredns-674b8bbfcf-vkg6h" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-coredns--674b8bbfcf--vkg6h-eth0" Sep 12 23:01:43.104965 containerd[1728]: 2025-09-12 23:01:43.079 [INFO][4858] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7f6b350f280 ContainerID="2d55b6c2046f963b6fe82e25bcb60606142ec153312782f24d35296424ef8e67" Namespace="kube-system" Pod="coredns-674b8bbfcf-vkg6h" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-coredns--674b8bbfcf--vkg6h-eth0" Sep 12 23:01:43.104965 containerd[1728]: 2025-09-12 23:01:43.088 [INFO][4858] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2d55b6c2046f963b6fe82e25bcb60606142ec153312782f24d35296424ef8e67" Namespace="kube-system" Pod="coredns-674b8bbfcf-vkg6h" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-coredns--674b8bbfcf--vkg6h-eth0" Sep 12 23:01:43.104965 containerd[1728]: 2025-09-12 23:01:43.088 [INFO][4858] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="2d55b6c2046f963b6fe82e25bcb60606142ec153312782f24d35296424ef8e67" Namespace="kube-system" Pod="coredns-674b8bbfcf-vkg6h" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-coredns--674b8bbfcf--vkg6h-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--a--36add7270c-k8s-coredns--674b8bbfcf--vkg6h-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"62ac665b-4fbe-4cc1-8333-eb18d722c81d", ResourceVersion:"832", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 1, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-a-36add7270c", ContainerID:"2d55b6c2046f963b6fe82e25bcb60606142ec153312782f24d35296424ef8e67", Pod:"coredns-674b8bbfcf-vkg6h", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.35.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali7f6b350f280", 
MAC:"12:b6:7b:7f:00:40", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:01:43.104965 containerd[1728]: 2025-09-12 23:01:43.102 [INFO][4858] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="2d55b6c2046f963b6fe82e25bcb60606142ec153312782f24d35296424ef8e67" Namespace="kube-system" Pod="coredns-674b8bbfcf-vkg6h" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-coredns--674b8bbfcf--vkg6h-eth0" Sep 12 23:01:43.141730 containerd[1728]: time="2025-09-12T23:01:43.141648644Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-746b8b545d-8247x,Uid:5358e75c-8a39-437c-b59d-55ac7b3e156a,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"6ccffa12221de834d8fd44f276f525b8559e913cfc9668177abfb5f9ab4ff392\"" Sep 12 23:01:43.156664 containerd[1728]: time="2025-09-12T23:01:43.156628868Z" level=info msg="connecting to shim 2d55b6c2046f963b6fe82e25bcb60606142ec153312782f24d35296424ef8e67" address="unix:///run/containerd/s/245696cddf6f79079ca987a0c142d7138568e187a66091ab8dad1e42ce9af156" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:01:43.172235 systemd-networkd[1575]: cali9c88815ac94: Link UP Sep 12 23:01:43.172961 systemd-networkd[1575]: cali9c88815ac94: Gained carrier Sep 12 23:01:43.185741 systemd[1]: Started cri-containerd-2d55b6c2046f963b6fe82e25bcb60606142ec153312782f24d35296424ef8e67.scope - libcontainer container 2d55b6c2046f963b6fe82e25bcb60606142ec153312782f24d35296424ef8e67. 
Sep 12 23:01:43.190148 containerd[1728]: 2025-09-12 23:01:42.892 [INFO][4868] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.0.0--a--36add7270c-k8s-calico--kube--controllers--565597cb9f--xh8s4-eth0 calico-kube-controllers-565597cb9f- calico-system f39f69e4-a9bf-404f-8a5c-8ebad50034e9 828 0 2025-09-12 23:01:16 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:565597cb9f projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4459.0.0-a-36add7270c calico-kube-controllers-565597cb9f-xh8s4 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali9c88815ac94 [] [] }} ContainerID="65ce0cc185a79072b086137aa19a56d023a2d6f80dd3cfe29c392dd137f184e0" Namespace="calico-system" Pod="calico-kube-controllers-565597cb9f-xh8s4" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-calico--kube--controllers--565597cb9f--xh8s4-" Sep 12 23:01:43.190148 containerd[1728]: 2025-09-12 23:01:42.893 [INFO][4868] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="65ce0cc185a79072b086137aa19a56d023a2d6f80dd3cfe29c392dd137f184e0" Namespace="calico-system" Pod="calico-kube-controllers-565597cb9f-xh8s4" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-calico--kube--controllers--565597cb9f--xh8s4-eth0" Sep 12 23:01:43.190148 containerd[1728]: 2025-09-12 23:01:42.935 [INFO][4898] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="65ce0cc185a79072b086137aa19a56d023a2d6f80dd3cfe29c392dd137f184e0" HandleID="k8s-pod-network.65ce0cc185a79072b086137aa19a56d023a2d6f80dd3cfe29c392dd137f184e0" Workload="ci--4459.0.0--a--36add7270c-k8s-calico--kube--controllers--565597cb9f--xh8s4-eth0" Sep 12 23:01:43.190148 containerd[1728]: 2025-09-12 23:01:42.935 [INFO][4898] ipam/ipam_plugin.go 265: 
Auto assigning IP ContainerID="65ce0cc185a79072b086137aa19a56d023a2d6f80dd3cfe29c392dd137f184e0" HandleID="k8s-pod-network.65ce0cc185a79072b086137aa19a56d023a2d6f80dd3cfe29c392dd137f184e0" Workload="ci--4459.0.0--a--36add7270c-k8s-calico--kube--controllers--565597cb9f--xh8s4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002cd660), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.0.0-a-36add7270c", "pod":"calico-kube-controllers-565597cb9f-xh8s4", "timestamp":"2025-09-12 23:01:42.935111762 +0000 UTC"}, Hostname:"ci-4459.0.0-a-36add7270c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 23:01:43.190148 containerd[1728]: 2025-09-12 23:01:42.935 [INFO][4898] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:01:43.190148 containerd[1728]: 2025-09-12 23:01:43.077 [INFO][4898] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 23:01:43.190148 containerd[1728]: 2025-09-12 23:01:43.077 [INFO][4898] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.0.0-a-36add7270c' Sep 12 23:01:43.190148 containerd[1728]: 2025-09-12 23:01:43.132 [INFO][4898] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.65ce0cc185a79072b086137aa19a56d023a2d6f80dd3cfe29c392dd137f184e0" host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:43.190148 containerd[1728]: 2025-09-12 23:01:43.139 [INFO][4898] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:43.190148 containerd[1728]: 2025-09-12 23:01:43.142 [INFO][4898] ipam/ipam.go 511: Trying affinity for 192.168.35.192/26 host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:43.190148 containerd[1728]: 2025-09-12 23:01:43.146 [INFO][4898] ipam/ipam.go 158: Attempting to load block cidr=192.168.35.192/26 host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:43.190148 containerd[1728]: 2025-09-12 23:01:43.149 [INFO][4898] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.35.192/26 host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:43.190148 containerd[1728]: 2025-09-12 23:01:43.149 [INFO][4898] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.35.192/26 handle="k8s-pod-network.65ce0cc185a79072b086137aa19a56d023a2d6f80dd3cfe29c392dd137f184e0" host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:43.190148 containerd[1728]: 2025-09-12 23:01:43.152 [INFO][4898] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.65ce0cc185a79072b086137aa19a56d023a2d6f80dd3cfe29c392dd137f184e0 Sep 12 23:01:43.190148 containerd[1728]: 2025-09-12 23:01:43.157 [INFO][4898] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.35.192/26 handle="k8s-pod-network.65ce0cc185a79072b086137aa19a56d023a2d6f80dd3cfe29c392dd137f184e0" host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:43.190148 containerd[1728]: 2025-09-12 23:01:43.167 [INFO][4898] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.35.198/26] block=192.168.35.192/26 handle="k8s-pod-network.65ce0cc185a79072b086137aa19a56d023a2d6f80dd3cfe29c392dd137f184e0" host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:43.190148 containerd[1728]: 2025-09-12 23:01:43.167 [INFO][4898] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.35.198/26] handle="k8s-pod-network.65ce0cc185a79072b086137aa19a56d023a2d6f80dd3cfe29c392dd137f184e0" host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:43.190148 containerd[1728]: 2025-09-12 23:01:43.167 [INFO][4898] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:01:43.190148 containerd[1728]: 2025-09-12 23:01:43.167 [INFO][4898] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.35.198/26] IPv6=[] ContainerID="65ce0cc185a79072b086137aa19a56d023a2d6f80dd3cfe29c392dd137f184e0" HandleID="k8s-pod-network.65ce0cc185a79072b086137aa19a56d023a2d6f80dd3cfe29c392dd137f184e0" Workload="ci--4459.0.0--a--36add7270c-k8s-calico--kube--controllers--565597cb9f--xh8s4-eth0" Sep 12 23:01:43.191658 containerd[1728]: 2025-09-12 23:01:43.169 [INFO][4868] cni-plugin/k8s.go 418: Populated endpoint ContainerID="65ce0cc185a79072b086137aa19a56d023a2d6f80dd3cfe29c392dd137f184e0" Namespace="calico-system" Pod="calico-kube-controllers-565597cb9f-xh8s4" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-calico--kube--controllers--565597cb9f--xh8s4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--a--36add7270c-k8s-calico--kube--controllers--565597cb9f--xh8s4-eth0", GenerateName:"calico-kube-controllers-565597cb9f-", Namespace:"calico-system", SelfLink:"", UID:"f39f69e4-a9bf-404f-8a5c-8ebad50034e9", ResourceVersion:"828", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 1, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"565597cb9f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-a-36add7270c", ContainerID:"", Pod:"calico-kube-controllers-565597cb9f-xh8s4", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.35.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali9c88815ac94", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:01:43.191658 containerd[1728]: 2025-09-12 23:01:43.170 [INFO][4868] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.35.198/32] ContainerID="65ce0cc185a79072b086137aa19a56d023a2d6f80dd3cfe29c392dd137f184e0" Namespace="calico-system" Pod="calico-kube-controllers-565597cb9f-xh8s4" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-calico--kube--controllers--565597cb9f--xh8s4-eth0" Sep 12 23:01:43.191658 containerd[1728]: 2025-09-12 23:01:43.170 [INFO][4868] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9c88815ac94 ContainerID="65ce0cc185a79072b086137aa19a56d023a2d6f80dd3cfe29c392dd137f184e0" Namespace="calico-system" Pod="calico-kube-controllers-565597cb9f-xh8s4" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-calico--kube--controllers--565597cb9f--xh8s4-eth0" Sep 12 23:01:43.191658 containerd[1728]: 2025-09-12 23:01:43.174 [INFO][4868] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="65ce0cc185a79072b086137aa19a56d023a2d6f80dd3cfe29c392dd137f184e0" Namespace="calico-system" Pod="calico-kube-controllers-565597cb9f-xh8s4" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-calico--kube--controllers--565597cb9f--xh8s4-eth0" Sep 12 23:01:43.191658 containerd[1728]: 2025-09-12 23:01:43.174 [INFO][4868] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="65ce0cc185a79072b086137aa19a56d023a2d6f80dd3cfe29c392dd137f184e0" Namespace="calico-system" Pod="calico-kube-controllers-565597cb9f-xh8s4" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-calico--kube--controllers--565597cb9f--xh8s4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--a--36add7270c-k8s-calico--kube--controllers--565597cb9f--xh8s4-eth0", GenerateName:"calico-kube-controllers-565597cb9f-", Namespace:"calico-system", SelfLink:"", UID:"f39f69e4-a9bf-404f-8a5c-8ebad50034e9", ResourceVersion:"828", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 1, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"565597cb9f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-a-36add7270c", ContainerID:"65ce0cc185a79072b086137aa19a56d023a2d6f80dd3cfe29c392dd137f184e0", Pod:"calico-kube-controllers-565597cb9f-xh8s4", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.35.198/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali9c88815ac94", MAC:"ee:77:e3:fd:97:75", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:01:43.191658 containerd[1728]: 2025-09-12 23:01:43.186 [INFO][4868] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="65ce0cc185a79072b086137aa19a56d023a2d6f80dd3cfe29c392dd137f184e0" Namespace="calico-system" Pod="calico-kube-controllers-565597cb9f-xh8s4" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-calico--kube--controllers--565597cb9f--xh8s4-eth0" Sep 12 23:01:43.247944 containerd[1728]: time="2025-09-12T23:01:43.247865086Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-vkg6h,Uid:62ac665b-4fbe-4cc1-8333-eb18d722c81d,Namespace:kube-system,Attempt:0,} returns sandbox id \"2d55b6c2046f963b6fe82e25bcb60606142ec153312782f24d35296424ef8e67\"" Sep 12 23:01:43.255844 containerd[1728]: time="2025-09-12T23:01:43.255753091Z" level=info msg="CreateContainer within sandbox \"2d55b6c2046f963b6fe82e25bcb60606142ec153312782f24d35296424ef8e67\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 23:01:43.287776 containerd[1728]: time="2025-09-12T23:01:43.287743619Z" level=info msg="connecting to shim 65ce0cc185a79072b086137aa19a56d023a2d6f80dd3cfe29c392dd137f184e0" address="unix:///run/containerd/s/405229b982476da4ff081c158bc21c25f2b447ae08085196a466a005387b8239" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:01:43.292642 containerd[1728]: time="2025-09-12T23:01:43.292619783Z" level=info msg="Container b9748fa981d3dbf039c30930a85df51f0425d822b465fbe6636532c863441064: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:01:43.305657 systemd[1]: Started cri-containerd-65ce0cc185a79072b086137aa19a56d023a2d6f80dd3cfe29c392dd137f184e0.scope - libcontainer container 
65ce0cc185a79072b086137aa19a56d023a2d6f80dd3cfe29c392dd137f184e0. Sep 12 23:01:43.313774 containerd[1728]: time="2025-09-12T23:01:43.313753062Z" level=info msg="CreateContainer within sandbox \"2d55b6c2046f963b6fe82e25bcb60606142ec153312782f24d35296424ef8e67\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"b9748fa981d3dbf039c30930a85df51f0425d822b465fbe6636532c863441064\"" Sep 12 23:01:43.314423 containerd[1728]: time="2025-09-12T23:01:43.314391636Z" level=info msg="StartContainer for \"b9748fa981d3dbf039c30930a85df51f0425d822b465fbe6636532c863441064\"" Sep 12 23:01:43.316111 containerd[1728]: time="2025-09-12T23:01:43.316093286Z" level=info msg="connecting to shim b9748fa981d3dbf039c30930a85df51f0425d822b465fbe6636532c863441064" address="unix:///run/containerd/s/245696cddf6f79079ca987a0c142d7138568e187a66091ab8dad1e42ce9af156" protocol=ttrpc version=3 Sep 12 23:01:43.334633 systemd[1]: Started cri-containerd-b9748fa981d3dbf039c30930a85df51f0425d822b465fbe6636532c863441064.scope - libcontainer container b9748fa981d3dbf039c30930a85df51f0425d822b465fbe6636532c863441064. 
Sep 12 23:01:43.349842 containerd[1728]: time="2025-09-12T23:01:43.349814558Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-565597cb9f-xh8s4,Uid:f39f69e4-a9bf-404f-8a5c-8ebad50034e9,Namespace:calico-system,Attempt:0,} returns sandbox id \"65ce0cc185a79072b086137aa19a56d023a2d6f80dd3cfe29c392dd137f184e0\"" Sep 12 23:01:43.368959 containerd[1728]: time="2025-09-12T23:01:43.368874939Z" level=info msg="StartContainer for \"b9748fa981d3dbf039c30930a85df51f0425d822b465fbe6636532c863441064\" returns successfully" Sep 12 23:01:43.734342 containerd[1728]: time="2025-09-12T23:01:43.734318331Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:43.736752 containerd[1728]: time="2025-09-12T23:01:43.736731931Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527" Sep 12 23:01:43.739551 containerd[1728]: time="2025-09-12T23:01:43.739517737Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:43.747378 containerd[1728]: time="2025-09-12T23:01:43.747342721Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:43.747675 containerd[1728]: time="2025-09-12T23:01:43.747655556Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 1.386100769s" Sep 12 23:01:43.747710 containerd[1728]: 
time="2025-09-12T23:01:43.747680990Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\"" Sep 12 23:01:43.748452 containerd[1728]: time="2025-09-12T23:01:43.748422014Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 12 23:01:43.753529 containerd[1728]: time="2025-09-12T23:01:43.753507989Z" level=info msg="CreateContainer within sandbox \"a696b9ce369d85bb543ab81fb036f9da421b9fb1589f50bd8f8a0e96360f0e95\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 12 23:01:43.774654 containerd[1728]: time="2025-09-12T23:01:43.774631692Z" level=info msg="Container 5260885d217c932cfaab092a13ae8f4972c1d8ecd616ee939e488737f4968536: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:01:43.789571 containerd[1728]: time="2025-09-12T23:01:43.789546093Z" level=info msg="CreateContainer within sandbox \"a696b9ce369d85bb543ab81fb036f9da421b9fb1589f50bd8f8a0e96360f0e95\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"5260885d217c932cfaab092a13ae8f4972c1d8ecd616ee939e488737f4968536\"" Sep 12 23:01:43.791694 containerd[1728]: time="2025-09-12T23:01:43.790012764Z" level=info msg="StartContainer for \"5260885d217c932cfaab092a13ae8f4972c1d8ecd616ee939e488737f4968536\"" Sep 12 23:01:43.791694 containerd[1728]: time="2025-09-12T23:01:43.791179274Z" level=info msg="connecting to shim 5260885d217c932cfaab092a13ae8f4972c1d8ecd616ee939e488737f4968536" address="unix:///run/containerd/s/9ef137b8e368b243c3152ed6324e2ecec0b0204389a2b695f05a48894e02acc7" protocol=ttrpc version=3 Sep 12 23:01:43.810614 systemd[1]: Started cri-containerd-5260885d217c932cfaab092a13ae8f4972c1d8ecd616ee939e488737f4968536.scope - libcontainer container 5260885d217c932cfaab092a13ae8f4972c1d8ecd616ee939e488737f4968536. 
Sep 12 23:01:43.814768 containerd[1728]: time="2025-09-12T23:01:43.814748393Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-746b8b545d-xm6wc,Uid:c7bc1133-2e59-4613-a38f-a2f5f55d823e,Namespace:calico-apiserver,Attempt:0,}" Sep 12 23:01:43.846789 containerd[1728]: time="2025-09-12T23:01:43.846727019Z" level=info msg="StartContainer for \"5260885d217c932cfaab092a13ae8f4972c1d8ecd616ee939e488737f4968536\" returns successfully" Sep 12 23:01:43.903710 systemd-networkd[1575]: calide5ae8066f3: Link UP Sep 12 23:01:43.904400 systemd-networkd[1575]: calide5ae8066f3: Gained carrier Sep 12 23:01:43.914289 containerd[1728]: 2025-09-12 23:01:43.856 [INFO][5141] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.0.0--a--36add7270c-k8s-calico--apiserver--746b8b545d--xm6wc-eth0 calico-apiserver-746b8b545d- calico-apiserver c7bc1133-2e59-4613-a38f-a2f5f55d823e 834 0 2025-09-12 23:01:14 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:746b8b545d projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459.0.0-a-36add7270c calico-apiserver-746b8b545d-xm6wc eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calide5ae8066f3 [] [] }} ContainerID="dd51014c46c3687078470c38b0b85b8ea47c08a8d5d36868b55e25c6ff0491dd" Namespace="calico-apiserver" Pod="calico-apiserver-746b8b545d-xm6wc" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-calico--apiserver--746b8b545d--xm6wc-" Sep 12 23:01:43.914289 containerd[1728]: 2025-09-12 23:01:43.856 [INFO][5141] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="dd51014c46c3687078470c38b0b85b8ea47c08a8d5d36868b55e25c6ff0491dd" Namespace="calico-apiserver" Pod="calico-apiserver-746b8b545d-xm6wc" 
WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-calico--apiserver--746b8b545d--xm6wc-eth0" Sep 12 23:01:43.914289 containerd[1728]: 2025-09-12 23:01:43.873 [INFO][5164] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="dd51014c46c3687078470c38b0b85b8ea47c08a8d5d36868b55e25c6ff0491dd" HandleID="k8s-pod-network.dd51014c46c3687078470c38b0b85b8ea47c08a8d5d36868b55e25c6ff0491dd" Workload="ci--4459.0.0--a--36add7270c-k8s-calico--apiserver--746b8b545d--xm6wc-eth0" Sep 12 23:01:43.914289 containerd[1728]: 2025-09-12 23:01:43.873 [INFO][5164] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="dd51014c46c3687078470c38b0b85b8ea47c08a8d5d36868b55e25c6ff0491dd" HandleID="k8s-pod-network.dd51014c46c3687078470c38b0b85b8ea47c08a8d5d36868b55e25c6ff0491dd" Workload="ci--4459.0.0--a--36add7270c-k8s-calico--apiserver--746b8b545d--xm6wc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f2b0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459.0.0-a-36add7270c", "pod":"calico-apiserver-746b8b545d-xm6wc", "timestamp":"2025-09-12 23:01:43.873223585 +0000 UTC"}, Hostname:"ci-4459.0.0-a-36add7270c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 23:01:43.914289 containerd[1728]: 2025-09-12 23:01:43.873 [INFO][5164] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:01:43.914289 containerd[1728]: 2025-09-12 23:01:43.873 [INFO][5164] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 23:01:43.914289 containerd[1728]: 2025-09-12 23:01:43.873 [INFO][5164] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.0.0-a-36add7270c' Sep 12 23:01:43.914289 containerd[1728]: 2025-09-12 23:01:43.878 [INFO][5164] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.dd51014c46c3687078470c38b0b85b8ea47c08a8d5d36868b55e25c6ff0491dd" host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:43.914289 containerd[1728]: 2025-09-12 23:01:43.881 [INFO][5164] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:43.914289 containerd[1728]: 2025-09-12 23:01:43.883 [INFO][5164] ipam/ipam.go 511: Trying affinity for 192.168.35.192/26 host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:43.914289 containerd[1728]: 2025-09-12 23:01:43.884 [INFO][5164] ipam/ipam.go 158: Attempting to load block cidr=192.168.35.192/26 host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:43.914289 containerd[1728]: 2025-09-12 23:01:43.886 [INFO][5164] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.35.192/26 host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:43.914289 containerd[1728]: 2025-09-12 23:01:43.886 [INFO][5164] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.35.192/26 handle="k8s-pod-network.dd51014c46c3687078470c38b0b85b8ea47c08a8d5d36868b55e25c6ff0491dd" host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:43.914289 containerd[1728]: 2025-09-12 23:01:43.887 [INFO][5164] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.dd51014c46c3687078470c38b0b85b8ea47c08a8d5d36868b55e25c6ff0491dd Sep 12 23:01:43.914289 containerd[1728]: 2025-09-12 23:01:43.892 [INFO][5164] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.35.192/26 handle="k8s-pod-network.dd51014c46c3687078470c38b0b85b8ea47c08a8d5d36868b55e25c6ff0491dd" host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:43.914289 containerd[1728]: 2025-09-12 23:01:43.900 [INFO][5164] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.35.199/26] block=192.168.35.192/26 handle="k8s-pod-network.dd51014c46c3687078470c38b0b85b8ea47c08a8d5d36868b55e25c6ff0491dd" host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:43.914289 containerd[1728]: 2025-09-12 23:01:43.900 [INFO][5164] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.35.199/26] handle="k8s-pod-network.dd51014c46c3687078470c38b0b85b8ea47c08a8d5d36868b55e25c6ff0491dd" host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:43.914289 containerd[1728]: 2025-09-12 23:01:43.900 [INFO][5164] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:01:43.914289 containerd[1728]: 2025-09-12 23:01:43.900 [INFO][5164] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.35.199/26] IPv6=[] ContainerID="dd51014c46c3687078470c38b0b85b8ea47c08a8d5d36868b55e25c6ff0491dd" HandleID="k8s-pod-network.dd51014c46c3687078470c38b0b85b8ea47c08a8d5d36868b55e25c6ff0491dd" Workload="ci--4459.0.0--a--36add7270c-k8s-calico--apiserver--746b8b545d--xm6wc-eth0" Sep 12 23:01:43.915193 containerd[1728]: 2025-09-12 23:01:43.901 [INFO][5141] cni-plugin/k8s.go 418: Populated endpoint ContainerID="dd51014c46c3687078470c38b0b85b8ea47c08a8d5d36868b55e25c6ff0491dd" Namespace="calico-apiserver" Pod="calico-apiserver-746b8b545d-xm6wc" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-calico--apiserver--746b8b545d--xm6wc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--a--36add7270c-k8s-calico--apiserver--746b8b545d--xm6wc-eth0", GenerateName:"calico-apiserver-746b8b545d-", Namespace:"calico-apiserver", SelfLink:"", UID:"c7bc1133-2e59-4613-a38f-a2f5f55d823e", ResourceVersion:"834", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 1, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"746b8b545d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-a-36add7270c", ContainerID:"", Pod:"calico-apiserver-746b8b545d-xm6wc", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.35.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calide5ae8066f3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:01:43.915193 containerd[1728]: 2025-09-12 23:01:43.901 [INFO][5141] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.35.199/32] ContainerID="dd51014c46c3687078470c38b0b85b8ea47c08a8d5d36868b55e25c6ff0491dd" Namespace="calico-apiserver" Pod="calico-apiserver-746b8b545d-xm6wc" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-calico--apiserver--746b8b545d--xm6wc-eth0" Sep 12 23:01:43.915193 containerd[1728]: 2025-09-12 23:01:43.901 [INFO][5141] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calide5ae8066f3 ContainerID="dd51014c46c3687078470c38b0b85b8ea47c08a8d5d36868b55e25c6ff0491dd" Namespace="calico-apiserver" Pod="calico-apiserver-746b8b545d-xm6wc" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-calico--apiserver--746b8b545d--xm6wc-eth0" Sep 12 23:01:43.915193 containerd[1728]: 2025-09-12 23:01:43.902 [INFO][5141] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="dd51014c46c3687078470c38b0b85b8ea47c08a8d5d36868b55e25c6ff0491dd" Namespace="calico-apiserver" 
Pod="calico-apiserver-746b8b545d-xm6wc" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-calico--apiserver--746b8b545d--xm6wc-eth0" Sep 12 23:01:43.915193 containerd[1728]: 2025-09-12 23:01:43.902 [INFO][5141] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="dd51014c46c3687078470c38b0b85b8ea47c08a8d5d36868b55e25c6ff0491dd" Namespace="calico-apiserver" Pod="calico-apiserver-746b8b545d-xm6wc" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-calico--apiserver--746b8b545d--xm6wc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--a--36add7270c-k8s-calico--apiserver--746b8b545d--xm6wc-eth0", GenerateName:"calico-apiserver-746b8b545d-", Namespace:"calico-apiserver", SelfLink:"", UID:"c7bc1133-2e59-4613-a38f-a2f5f55d823e", ResourceVersion:"834", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 1, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"746b8b545d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-a-36add7270c", ContainerID:"dd51014c46c3687078470c38b0b85b8ea47c08a8d5d36868b55e25c6ff0491dd", Pod:"calico-apiserver-746b8b545d-xm6wc", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.35.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, 
InterfaceName:"calide5ae8066f3", MAC:"56:79:d7:62:27:83", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:01:43.915193 containerd[1728]: 2025-09-12 23:01:43.913 [INFO][5141] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="dd51014c46c3687078470c38b0b85b8ea47c08a8d5d36868b55e25c6ff0491dd" Namespace="calico-apiserver" Pod="calico-apiserver-746b8b545d-xm6wc" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-calico--apiserver--746b8b545d--xm6wc-eth0" Sep 12 23:01:43.950433 containerd[1728]: time="2025-09-12T23:01:43.949592931Z" level=info msg="connecting to shim dd51014c46c3687078470c38b0b85b8ea47c08a8d5d36868b55e25c6ff0491dd" address="unix:///run/containerd/s/2d8abca257625d0b98df91da572056e8fce2b59d93b32874f6e3947afb7ad076" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:01:43.964169 kubelet[3136]: I0912 23:01:43.964127 3136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-vkg6h" podStartSLOduration=38.964110999 podStartE2EDuration="38.964110999s" podCreationTimestamp="2025-09-12 23:01:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 23:01:43.962879362 +0000 UTC m=+44.234506156" watchObservedRunningTime="2025-09-12 23:01:43.964110999 +0000 UTC m=+44.235737828" Sep 12 23:01:43.973715 systemd[1]: Started cri-containerd-dd51014c46c3687078470c38b0b85b8ea47c08a8d5d36868b55e25c6ff0491dd.scope - libcontainer container dd51014c46c3687078470c38b0b85b8ea47c08a8d5d36868b55e25c6ff0491dd. 
Sep 12 23:01:44.039935 containerd[1728]: time="2025-09-12T23:01:44.039864257Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-746b8b545d-xm6wc,Uid:c7bc1133-2e59-4613-a38f-a2f5f55d823e,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"dd51014c46c3687078470c38b0b85b8ea47c08a8d5d36868b55e25c6ff0491dd\"" Sep 12 23:01:44.515940 systemd-networkd[1575]: cali9c88815ac94: Gained IPv6LL Sep 12 23:01:44.579817 systemd-networkd[1575]: cali6f617236da8: Gained IPv6LL Sep 12 23:01:44.643600 systemd-networkd[1575]: cali7f6b350f280: Gained IPv6LL Sep 12 23:01:44.813100 containerd[1728]: time="2025-09-12T23:01:44.812884861Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-hzg8c,Uid:c92e1465-873d-4803-a773-cd505c1cec05,Namespace:calico-system,Attempt:0,}" Sep 12 23:01:44.917861 systemd-networkd[1575]: calib3ec6b3e0e2: Link UP Sep 12 23:01:44.919257 systemd-networkd[1575]: calib3ec6b3e0e2: Gained carrier Sep 12 23:01:44.937667 containerd[1728]: 2025-09-12 23:01:44.853 [INFO][5231] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.0.0--a--36add7270c-k8s-goldmane--54d579b49d--hzg8c-eth0 goldmane-54d579b49d- calico-system c92e1465-873d-4803-a773-cd505c1cec05 835 0 2025-09-12 23:01:15 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4459.0.0-a-36add7270c goldmane-54d579b49d-hzg8c eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calib3ec6b3e0e2 [] [] }} ContainerID="23bf64179fcb30afab9a6890eff7a58f67fdbcf616e60badedf3ed66b049ff88" Namespace="calico-system" Pod="goldmane-54d579b49d-hzg8c" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-goldmane--54d579b49d--hzg8c-" Sep 12 23:01:44.937667 containerd[1728]: 2025-09-12 23:01:44.853 [INFO][5231] 
cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="23bf64179fcb30afab9a6890eff7a58f67fdbcf616e60badedf3ed66b049ff88" Namespace="calico-system" Pod="goldmane-54d579b49d-hzg8c" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-goldmane--54d579b49d--hzg8c-eth0" Sep 12 23:01:44.937667 containerd[1728]: 2025-09-12 23:01:44.878 [INFO][5244] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="23bf64179fcb30afab9a6890eff7a58f67fdbcf616e60badedf3ed66b049ff88" HandleID="k8s-pod-network.23bf64179fcb30afab9a6890eff7a58f67fdbcf616e60badedf3ed66b049ff88" Workload="ci--4459.0.0--a--36add7270c-k8s-goldmane--54d579b49d--hzg8c-eth0" Sep 12 23:01:44.937667 containerd[1728]: 2025-09-12 23:01:44.879 [INFO][5244] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="23bf64179fcb30afab9a6890eff7a58f67fdbcf616e60badedf3ed66b049ff88" HandleID="k8s-pod-network.23bf64179fcb30afab9a6890eff7a58f67fdbcf616e60badedf3ed66b049ff88" Workload="ci--4459.0.0--a--36add7270c-k8s-goldmane--54d579b49d--hzg8c-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f190), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.0.0-a-36add7270c", "pod":"goldmane-54d579b49d-hzg8c", "timestamp":"2025-09-12 23:01:44.87875555 +0000 UTC"}, Hostname:"ci-4459.0.0-a-36add7270c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 23:01:44.937667 containerd[1728]: 2025-09-12 23:01:44.879 [INFO][5244] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:01:44.937667 containerd[1728]: 2025-09-12 23:01:44.879 [INFO][5244] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 23:01:44.937667 containerd[1728]: 2025-09-12 23:01:44.879 [INFO][5244] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.0.0-a-36add7270c' Sep 12 23:01:44.937667 containerd[1728]: 2025-09-12 23:01:44.885 [INFO][5244] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.23bf64179fcb30afab9a6890eff7a58f67fdbcf616e60badedf3ed66b049ff88" host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:44.937667 containerd[1728]: 2025-09-12 23:01:44.889 [INFO][5244] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:44.937667 containerd[1728]: 2025-09-12 23:01:44.892 [INFO][5244] ipam/ipam.go 511: Trying affinity for 192.168.35.192/26 host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:44.937667 containerd[1728]: 2025-09-12 23:01:44.893 [INFO][5244] ipam/ipam.go 158: Attempting to load block cidr=192.168.35.192/26 host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:44.937667 containerd[1728]: 2025-09-12 23:01:44.896 [INFO][5244] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.35.192/26 host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:44.937667 containerd[1728]: 2025-09-12 23:01:44.896 [INFO][5244] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.35.192/26 handle="k8s-pod-network.23bf64179fcb30afab9a6890eff7a58f67fdbcf616e60badedf3ed66b049ff88" host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:44.937667 containerd[1728]: 2025-09-12 23:01:44.897 [INFO][5244] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.23bf64179fcb30afab9a6890eff7a58f67fdbcf616e60badedf3ed66b049ff88 Sep 12 23:01:44.937667 containerd[1728]: 2025-09-12 23:01:44.900 [INFO][5244] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.35.192/26 handle="k8s-pod-network.23bf64179fcb30afab9a6890eff7a58f67fdbcf616e60badedf3ed66b049ff88" host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:44.937667 containerd[1728]: 2025-09-12 23:01:44.911 [INFO][5244] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.35.200/26] block=192.168.35.192/26 handle="k8s-pod-network.23bf64179fcb30afab9a6890eff7a58f67fdbcf616e60badedf3ed66b049ff88" host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:44.937667 containerd[1728]: 2025-09-12 23:01:44.911 [INFO][5244] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.35.200/26] handle="k8s-pod-network.23bf64179fcb30afab9a6890eff7a58f67fdbcf616e60badedf3ed66b049ff88" host="ci-4459.0.0-a-36add7270c" Sep 12 23:01:44.937667 containerd[1728]: 2025-09-12 23:01:44.911 [INFO][5244] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:01:44.937667 containerd[1728]: 2025-09-12 23:01:44.911 [INFO][5244] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.35.200/26] IPv6=[] ContainerID="23bf64179fcb30afab9a6890eff7a58f67fdbcf616e60badedf3ed66b049ff88" HandleID="k8s-pod-network.23bf64179fcb30afab9a6890eff7a58f67fdbcf616e60badedf3ed66b049ff88" Workload="ci--4459.0.0--a--36add7270c-k8s-goldmane--54d579b49d--hzg8c-eth0" Sep 12 23:01:44.938244 containerd[1728]: 2025-09-12 23:01:44.912 [INFO][5231] cni-plugin/k8s.go 418: Populated endpoint ContainerID="23bf64179fcb30afab9a6890eff7a58f67fdbcf616e60badedf3ed66b049ff88" Namespace="calico-system" Pod="goldmane-54d579b49d-hzg8c" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-goldmane--54d579b49d--hzg8c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--a--36add7270c-k8s-goldmane--54d579b49d--hzg8c-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"c92e1465-873d-4803-a773-cd505c1cec05", ResourceVersion:"835", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 1, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-a-36add7270c", ContainerID:"", Pod:"goldmane-54d579b49d-hzg8c", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.35.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calib3ec6b3e0e2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:01:44.938244 containerd[1728]: 2025-09-12 23:01:44.912 [INFO][5231] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.35.200/32] ContainerID="23bf64179fcb30afab9a6890eff7a58f67fdbcf616e60badedf3ed66b049ff88" Namespace="calico-system" Pod="goldmane-54d579b49d-hzg8c" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-goldmane--54d579b49d--hzg8c-eth0" Sep 12 23:01:44.938244 containerd[1728]: 2025-09-12 23:01:44.912 [INFO][5231] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib3ec6b3e0e2 ContainerID="23bf64179fcb30afab9a6890eff7a58f67fdbcf616e60badedf3ed66b049ff88" Namespace="calico-system" Pod="goldmane-54d579b49d-hzg8c" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-goldmane--54d579b49d--hzg8c-eth0" Sep 12 23:01:44.938244 containerd[1728]: 2025-09-12 23:01:44.919 [INFO][5231] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="23bf64179fcb30afab9a6890eff7a58f67fdbcf616e60badedf3ed66b049ff88" Namespace="calico-system" Pod="goldmane-54d579b49d-hzg8c" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-goldmane--54d579b49d--hzg8c-eth0" Sep 12 23:01:44.938244 containerd[1728]: 2025-09-12 23:01:44.919 [INFO][5231] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="23bf64179fcb30afab9a6890eff7a58f67fdbcf616e60badedf3ed66b049ff88" Namespace="calico-system" Pod="goldmane-54d579b49d-hzg8c" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-goldmane--54d579b49d--hzg8c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.0.0--a--36add7270c-k8s-goldmane--54d579b49d--hzg8c-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"c92e1465-873d-4803-a773-cd505c1cec05", ResourceVersion:"835", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 1, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.0.0-a-36add7270c", ContainerID:"23bf64179fcb30afab9a6890eff7a58f67fdbcf616e60badedf3ed66b049ff88", Pod:"goldmane-54d579b49d-hzg8c", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.35.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calib3ec6b3e0e2", MAC:"b6:2e:1c:d4:23:52", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:01:44.938244 containerd[1728]: 2025-09-12 23:01:44.934 [INFO][5231] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="23bf64179fcb30afab9a6890eff7a58f67fdbcf616e60badedf3ed66b049ff88" Namespace="calico-system" Pod="goldmane-54d579b49d-hzg8c" WorkloadEndpoint="ci--4459.0.0--a--36add7270c-k8s-goldmane--54d579b49d--hzg8c-eth0" Sep 12 23:01:44.999403 containerd[1728]: time="2025-09-12T23:01:44.999348055Z" level=info msg="connecting to shim 23bf64179fcb30afab9a6890eff7a58f67fdbcf616e60badedf3ed66b049ff88" address="unix:///run/containerd/s/778f1b65f51f810e3ee303d4f2632c06cc7620d05f5be85675d9438bbe993569" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:01:45.027612 systemd-networkd[1575]: calide5ae8066f3: Gained IPv6LL Sep 12 23:01:45.044705 systemd[1]: Started cri-containerd-23bf64179fcb30afab9a6890eff7a58f67fdbcf616e60badedf3ed66b049ff88.scope - libcontainer container 23bf64179fcb30afab9a6890eff7a58f67fdbcf616e60badedf3ed66b049ff88. Sep 12 23:01:45.237519 containerd[1728]: time="2025-09-12T23:01:45.237481015Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-hzg8c,Uid:c92e1465-873d-4803-a773-cd505c1cec05,Namespace:calico-system,Attempt:0,} returns sandbox id \"23bf64179fcb30afab9a6890eff7a58f67fdbcf616e60badedf3ed66b049ff88\"" Sep 12 23:01:45.755556 containerd[1728]: time="2025-09-12T23:01:45.755518266Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:45.757981 containerd[1728]: time="2025-09-12T23:01:45.757959770Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864" Sep 12 23:01:45.760800 containerd[1728]: time="2025-09-12T23:01:45.760762949Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:45.764820 containerd[1728]: time="2025-09-12T23:01:45.764375058Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:45.764820 containerd[1728]: time="2025-09-12T23:01:45.764745901Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 2.016286197s" Sep 12 23:01:45.764820 containerd[1728]: time="2025-09-12T23:01:45.764767935Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 12 23:01:45.765850 containerd[1728]: time="2025-09-12T23:01:45.765829028Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 12 23:01:45.771616 containerd[1728]: time="2025-09-12T23:01:45.771304829Z" level=info msg="CreateContainer within sandbox \"6ccffa12221de834d8fd44f276f525b8559e913cfc9668177abfb5f9ab4ff392\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 23:01:45.805743 containerd[1728]: time="2025-09-12T23:01:45.804707150Z" level=info msg="Container 9ae568921475b8fdf9910284e3128c99fdce1c44523efd79cb43c61f9400c711: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:01:45.823421 containerd[1728]: time="2025-09-12T23:01:45.823382757Z" level=info msg="CreateContainer within sandbox \"6ccffa12221de834d8fd44f276f525b8559e913cfc9668177abfb5f9ab4ff392\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"9ae568921475b8fdf9910284e3128c99fdce1c44523efd79cb43c61f9400c711\"" Sep 12 23:01:45.823931 containerd[1728]: time="2025-09-12T23:01:45.823872780Z" level=info msg="StartContainer for 
\"9ae568921475b8fdf9910284e3128c99fdce1c44523efd79cb43c61f9400c711\"" Sep 12 23:01:45.824850 containerd[1728]: time="2025-09-12T23:01:45.824826437Z" level=info msg="connecting to shim 9ae568921475b8fdf9910284e3128c99fdce1c44523efd79cb43c61f9400c711" address="unix:///run/containerd/s/e027a65fafef1fb9e30aef58a198510e8e176cb60b7dd55f0f3051a09e87b334" protocol=ttrpc version=3 Sep 12 23:01:45.848606 systemd[1]: Started cri-containerd-9ae568921475b8fdf9910284e3128c99fdce1c44523efd79cb43c61f9400c711.scope - libcontainer container 9ae568921475b8fdf9910284e3128c99fdce1c44523efd79cb43c61f9400c711. Sep 12 23:01:45.886206 containerd[1728]: time="2025-09-12T23:01:45.886124549Z" level=info msg="StartContainer for \"9ae568921475b8fdf9910284e3128c99fdce1c44523efd79cb43c61f9400c711\" returns successfully" Sep 12 23:01:46.435639 systemd-networkd[1575]: calib3ec6b3e0e2: Gained IPv6LL Sep 12 23:01:46.967261 kubelet[3136]: I0912 23:01:46.967199 3136 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 23:01:48.206124 containerd[1728]: time="2025-09-12T23:01:48.206092764Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:48.209122 containerd[1728]: time="2025-09-12T23:01:48.209092769Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746" Sep 12 23:01:48.211623 containerd[1728]: time="2025-09-12T23:01:48.211483755Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:48.217876 containerd[1728]: time="2025-09-12T23:01:48.217846654Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" 
Sep 12 23:01:48.218316 containerd[1728]: time="2025-09-12T23:01:48.218235439Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 2.452377668s" Sep 12 23:01:48.218316 containerd[1728]: time="2025-09-12T23:01:48.218259956Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\"" Sep 12 23:01:48.220518 containerd[1728]: time="2025-09-12T23:01:48.219316030Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 12 23:01:48.236328 containerd[1728]: time="2025-09-12T23:01:48.236294373Z" level=info msg="CreateContainer within sandbox \"65ce0cc185a79072b086137aa19a56d023a2d6f80dd3cfe29c392dd137f184e0\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 12 23:01:48.260520 containerd[1728]: time="2025-09-12T23:01:48.259142496Z" level=info msg="Container 68aecf2c4346ec4bdc81cd3fbd3adcd7d806de594030f98a007bf20d21d60f45: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:01:48.280744 containerd[1728]: time="2025-09-12T23:01:48.280710246Z" level=info msg="CreateContainer within sandbox \"65ce0cc185a79072b086137aa19a56d023a2d6f80dd3cfe29c392dd137f184e0\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"68aecf2c4346ec4bdc81cd3fbd3adcd7d806de594030f98a007bf20d21d60f45\"" Sep 12 23:01:48.281225 containerd[1728]: time="2025-09-12T23:01:48.281206082Z" level=info msg="StartContainer for \"68aecf2c4346ec4bdc81cd3fbd3adcd7d806de594030f98a007bf20d21d60f45\"" Sep 12 23:01:48.282448 containerd[1728]: time="2025-09-12T23:01:48.282417200Z" 
level=info msg="connecting to shim 68aecf2c4346ec4bdc81cd3fbd3adcd7d806de594030f98a007bf20d21d60f45" address="unix:///run/containerd/s/405229b982476da4ff081c158bc21c25f2b447ae08085196a466a005387b8239" protocol=ttrpc version=3 Sep 12 23:01:48.303643 systemd[1]: Started cri-containerd-68aecf2c4346ec4bdc81cd3fbd3adcd7d806de594030f98a007bf20d21d60f45.scope - libcontainer container 68aecf2c4346ec4bdc81cd3fbd3adcd7d806de594030f98a007bf20d21d60f45. Sep 12 23:01:48.370333 containerd[1728]: time="2025-09-12T23:01:48.370310984Z" level=info msg="StartContainer for \"68aecf2c4346ec4bdc81cd3fbd3adcd7d806de594030f98a007bf20d21d60f45\" returns successfully" Sep 12 23:01:48.987718 kubelet[3136]: I0912 23:01:48.987672 3136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-565597cb9f-xh8s4" podStartSLOduration=28.119631669 podStartE2EDuration="32.987568838s" podCreationTimestamp="2025-09-12 23:01:16 +0000 UTC" firstStartedPulling="2025-09-12 23:01:43.350919607 +0000 UTC m=+43.622546406" lastFinishedPulling="2025-09-12 23:01:48.218856788 +0000 UTC m=+48.490483575" observedRunningTime="2025-09-12 23:01:48.987176793 +0000 UTC m=+49.258803585" watchObservedRunningTime="2025-09-12 23:01:48.987568838 +0000 UTC m=+49.259195671" Sep 12 23:01:48.990257 kubelet[3136]: I0912 23:01:48.990214 3136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-746b8b545d-8247x" podStartSLOduration=32.367654553 podStartE2EDuration="34.990194071s" podCreationTimestamp="2025-09-12 23:01:14 +0000 UTC" firstStartedPulling="2025-09-12 23:01:43.142878622 +0000 UTC m=+43.414505410" lastFinishedPulling="2025-09-12 23:01:45.765418141 +0000 UTC m=+46.037044928" observedRunningTime="2025-09-12 23:01:45.978990986 +0000 UTC m=+46.250617780" watchObservedRunningTime="2025-09-12 23:01:48.990194071 +0000 UTC m=+49.261820867" Sep 12 23:01:49.014626 containerd[1728]: time="2025-09-12T23:01:49.014604372Z" 
level=info msg="TaskExit event in podsandbox handler container_id:\"68aecf2c4346ec4bdc81cd3fbd3adcd7d806de594030f98a007bf20d21d60f45\" id:\"0a799c69e2a443b1150cebd82a291d8bde3856c044c6e3fd7fd27ed5f1128275\" pid:5412 exited_at:{seconds:1757718109 nanos:14200558}" Sep 12 23:01:49.850719 containerd[1728]: time="2025-09-12T23:01:49.850691886Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:49.856478 containerd[1728]: time="2025-09-12T23:01:49.856406077Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542" Sep 12 23:01:49.860462 containerd[1728]: time="2025-09-12T23:01:49.860441185Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:49.863608 containerd[1728]: time="2025-09-12T23:01:49.863583469Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:49.863916 containerd[1728]: time="2025-09-12T23:01:49.863897983Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 1.644556397s" Sep 12 23:01:49.863977 containerd[1728]: time="2025-09-12T23:01:49.863965482Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference 
\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\"" Sep 12 23:01:49.864675 containerd[1728]: time="2025-09-12T23:01:49.864656644Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 12 23:01:49.869555 containerd[1728]: time="2025-09-12T23:01:49.869534502Z" level=info msg="CreateContainer within sandbox \"a696b9ce369d85bb543ab81fb036f9da421b9fb1589f50bd8f8a0e96360f0e95\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 12 23:01:49.893944 containerd[1728]: time="2025-09-12T23:01:49.891725866Z" level=info msg="Container c53f0b6168770b5b48f32565743187e0af086229f1e1c2d1a516e3136890ecc2: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:01:49.910992 containerd[1728]: time="2025-09-12T23:01:49.910969374Z" level=info msg="CreateContainer within sandbox \"a696b9ce369d85bb543ab81fb036f9da421b9fb1589f50bd8f8a0e96360f0e95\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"c53f0b6168770b5b48f32565743187e0af086229f1e1c2d1a516e3136890ecc2\"" Sep 12 23:01:49.911600 containerd[1728]: time="2025-09-12T23:01:49.911457904Z" level=info msg="StartContainer for \"c53f0b6168770b5b48f32565743187e0af086229f1e1c2d1a516e3136890ecc2\"" Sep 12 23:01:49.912848 containerd[1728]: time="2025-09-12T23:01:49.912820515Z" level=info msg="connecting to shim c53f0b6168770b5b48f32565743187e0af086229f1e1c2d1a516e3136890ecc2" address="unix:///run/containerd/s/9ef137b8e368b243c3152ed6324e2ecec0b0204389a2b695f05a48894e02acc7" protocol=ttrpc version=3 Sep 12 23:01:49.933640 systemd[1]: Started cri-containerd-c53f0b6168770b5b48f32565743187e0af086229f1e1c2d1a516e3136890ecc2.scope - libcontainer container c53f0b6168770b5b48f32565743187e0af086229f1e1c2d1a516e3136890ecc2. 
Sep 12 23:01:49.975509 containerd[1728]: time="2025-09-12T23:01:49.974355405Z" level=info msg="StartContainer for \"c53f0b6168770b5b48f32565743187e0af086229f1e1c2d1a516e3136890ecc2\" returns successfully" Sep 12 23:01:50.223078 containerd[1728]: time="2025-09-12T23:01:50.223014493Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:50.226512 containerd[1728]: time="2025-09-12T23:01:50.225807386Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 12 23:01:50.227021 containerd[1728]: time="2025-09-12T23:01:50.226986292Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 362.295244ms" Sep 12 23:01:50.227141 containerd[1728]: time="2025-09-12T23:01:50.227023777Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 12 23:01:50.227881 containerd[1728]: time="2025-09-12T23:01:50.227786086Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 12 23:01:50.233755 containerd[1728]: time="2025-09-12T23:01:50.233733542Z" level=info msg="CreateContainer within sandbox \"dd51014c46c3687078470c38b0b85b8ea47c08a8d5d36868b55e25c6ff0491dd\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 23:01:50.252992 containerd[1728]: time="2025-09-12T23:01:50.251731018Z" level=info msg="Container f82b921967bb2a75cb0c3599b369bf85269033db434515c0bc79e0184524eb6b: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:01:50.271860 containerd[1728]: 
time="2025-09-12T23:01:50.271838851Z" level=info msg="CreateContainer within sandbox \"dd51014c46c3687078470c38b0b85b8ea47c08a8d5d36868b55e25c6ff0491dd\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"f82b921967bb2a75cb0c3599b369bf85269033db434515c0bc79e0184524eb6b\"" Sep 12 23:01:50.272349 containerd[1728]: time="2025-09-12T23:01:50.272174315Z" level=info msg="StartContainer for \"f82b921967bb2a75cb0c3599b369bf85269033db434515c0bc79e0184524eb6b\"" Sep 12 23:01:50.273325 containerd[1728]: time="2025-09-12T23:01:50.273287281Z" level=info msg="connecting to shim f82b921967bb2a75cb0c3599b369bf85269033db434515c0bc79e0184524eb6b" address="unix:///run/containerd/s/2d8abca257625d0b98df91da572056e8fce2b59d93b32874f6e3947afb7ad076" protocol=ttrpc version=3 Sep 12 23:01:50.290648 systemd[1]: Started cri-containerd-f82b921967bb2a75cb0c3599b369bf85269033db434515c0bc79e0184524eb6b.scope - libcontainer container f82b921967bb2a75cb0c3599b369bf85269033db434515c0bc79e0184524eb6b. 
Sep 12 23:01:50.329115 containerd[1728]: time="2025-09-12T23:01:50.329085969Z" level=info msg="StartContainer for \"f82b921967bb2a75cb0c3599b369bf85269033db434515c0bc79e0184524eb6b\" returns successfully" Sep 12 23:01:50.895172 kubelet[3136]: I0912 23:01:50.895137 3136 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 12 23:01:50.895741 kubelet[3136]: I0912 23:01:50.895267 3136 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 12 23:01:51.001223 kubelet[3136]: I0912 23:01:51.001157 3136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-746b8b545d-xm6wc" podStartSLOduration=30.814234483 podStartE2EDuration="37.001144166s" podCreationTimestamp="2025-09-12 23:01:14 +0000 UTC" firstStartedPulling="2025-09-12 23:01:44.040567391 +0000 UTC m=+44.312194186" lastFinishedPulling="2025-09-12 23:01:50.227477071 +0000 UTC m=+50.499103869" observedRunningTime="2025-09-12 23:01:51.000319802 +0000 UTC m=+51.271946594" watchObservedRunningTime="2025-09-12 23:01:51.001144166 +0000 UTC m=+51.272770955" Sep 12 23:01:51.016761 kubelet[3136]: I0912 23:01:51.016720 3136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-7vb8z" podStartSLOduration=26.335809952 podStartE2EDuration="35.01670712s" podCreationTimestamp="2025-09-12 23:01:16 +0000 UTC" firstStartedPulling="2025-09-12 23:01:41.183660379 +0000 UTC m=+41.455287166" lastFinishedPulling="2025-09-12 23:01:49.864557541 +0000 UTC m=+50.136184334" observedRunningTime="2025-09-12 23:01:51.016608464 +0000 UTC m=+51.288235261" watchObservedRunningTime="2025-09-12 23:01:51.01670712 +0000 UTC m=+51.288333911" Sep 12 23:01:51.242101 kubelet[3136]: I0912 23:01:51.241625 3136 prober_manager.go:312] "Failed to trigger 
a manual run" probe="Readiness" Sep 12 23:01:51.989205 kubelet[3136]: I0912 23:01:51.989122 3136 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 23:01:52.569023 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount360867666.mount: Deactivated successfully. Sep 12 23:01:53.358429 containerd[1728]: time="2025-09-12T23:01:53.358396574Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:53.361060 containerd[1728]: time="2025-09-12T23:01:53.361035675Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526" Sep 12 23:01:53.364234 containerd[1728]: time="2025-09-12T23:01:53.364002482Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:53.369652 containerd[1728]: time="2025-09-12T23:01:53.369630162Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:53.370097 containerd[1728]: time="2025-09-12T23:01:53.370076988Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 3.142264671s" Sep 12 23:01:53.370137 containerd[1728]: time="2025-09-12T23:01:53.370104118Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\"" Sep 12 23:01:53.377873 containerd[1728]: 
time="2025-09-12T23:01:53.377851568Z" level=info msg="CreateContainer within sandbox \"23bf64179fcb30afab9a6890eff7a58f67fdbcf616e60badedf3ed66b049ff88\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 12 23:01:53.397504 containerd[1728]: time="2025-09-12T23:01:53.395402107Z" level=info msg="Container e8cc8b77b610cc708923ac9dc5b7ed4be5655c8b77019871a88dbce61c25d491: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:01:53.418136 containerd[1728]: time="2025-09-12T23:01:53.418110961Z" level=info msg="CreateContainer within sandbox \"23bf64179fcb30afab9a6890eff7a58f67fdbcf616e60badedf3ed66b049ff88\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"e8cc8b77b610cc708923ac9dc5b7ed4be5655c8b77019871a88dbce61c25d491\"" Sep 12 23:01:53.418655 containerd[1728]: time="2025-09-12T23:01:53.418513072Z" level=info msg="StartContainer for \"e8cc8b77b610cc708923ac9dc5b7ed4be5655c8b77019871a88dbce61c25d491\"" Sep 12 23:01:53.419595 containerd[1728]: time="2025-09-12T23:01:53.419470660Z" level=info msg="connecting to shim e8cc8b77b610cc708923ac9dc5b7ed4be5655c8b77019871a88dbce61c25d491" address="unix:///run/containerd/s/778f1b65f51f810e3ee303d4f2632c06cc7620d05f5be85675d9438bbe993569" protocol=ttrpc version=3 Sep 12 23:01:53.442636 systemd[1]: Started cri-containerd-e8cc8b77b610cc708923ac9dc5b7ed4be5655c8b77019871a88dbce61c25d491.scope - libcontainer container e8cc8b77b610cc708923ac9dc5b7ed4be5655c8b77019871a88dbce61c25d491. 
Sep 12 23:01:53.488217 containerd[1728]: time="2025-09-12T23:01:53.488192562Z" level=info msg="StartContainer for \"e8cc8b77b610cc708923ac9dc5b7ed4be5655c8b77019871a88dbce61c25d491\" returns successfully" Sep 12 23:01:54.059137 containerd[1728]: time="2025-09-12T23:01:54.059080992Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e8cc8b77b610cc708923ac9dc5b7ed4be5655c8b77019871a88dbce61c25d491\" id:\"af56b5683a538c3312949854276a89d1af5bfd0478c7e225917342bffe8bf5f7\" pid:5561 exit_status:1 exited_at:{seconds:1757718114 nanos:58788414}" Sep 12 23:01:55.050038 containerd[1728]: time="2025-09-12T23:01:55.050012950Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e8cc8b77b610cc708923ac9dc5b7ed4be5655c8b77019871a88dbce61c25d491\" id:\"aaacaddfdc7744a3a65d80f2ffdd1bdfe334997408e7bca9dfabc94a933ecb11\" pid:5588 exit_status:1 exited_at:{seconds:1757718115 nanos:49864008}" Sep 12 23:02:03.923904 containerd[1728]: time="2025-09-12T23:02:03.923863207Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e8cc8b77b610cc708923ac9dc5b7ed4be5655c8b77019871a88dbce61c25d491\" id:\"7401aa4247d29ddc61bfaca3fdbfa5ea6ce46a79ea29b2f3ea2ff0d970277c9c\" pid:5618 exited_at:{seconds:1757718123 nanos:923520586}" Sep 12 23:02:07.981512 containerd[1728]: time="2025-09-12T23:02:07.981385822Z" level=info msg="TaskExit event in podsandbox handler container_id:\"22a2101fce323ca79525c588bae8cd4cd384e9dc81d6546421115528d3a2a5c2\" id:\"e43b5596b710e47f1a17b97794c0caf64f2e175e39d6221d814aa016201e4edc\" pid:5646 exited_at:{seconds:1757718127 nanos:981131078}" Sep 12 23:02:07.997835 kubelet[3136]: I0912 23:02:07.997784 3136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-hzg8c" podStartSLOduration=44.865209012 podStartE2EDuration="52.997770535s" podCreationTimestamp="2025-09-12 23:01:15 +0000 UTC" firstStartedPulling="2025-09-12 23:01:45.239219384 +0000 UTC m=+45.510846182" 
lastFinishedPulling="2025-09-12 23:01:53.371780912 +0000 UTC m=+53.643407705" observedRunningTime="2025-09-12 23:01:54.008369703 +0000 UTC m=+54.279996497" watchObservedRunningTime="2025-09-12 23:02:07.997770535 +0000 UTC m=+68.269397400" Sep 12 23:02:09.183064 containerd[1728]: time="2025-09-12T23:02:09.183036127Z" level=info msg="TaskExit event in podsandbox handler container_id:\"68aecf2c4346ec4bdc81cd3fbd3adcd7d806de594030f98a007bf20d21d60f45\" id:\"4e6f50f8167a28abcad33ff8bb05a0350a3dacde5f4a73f86bcc66c1498c5a5a\" pid:5671 exited_at:{seconds:1757718129 nanos:182843130}" Sep 12 23:02:18.797530 kubelet[3136]: I0912 23:02:18.797189 3136 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 23:02:19.011089 containerd[1728]: time="2025-09-12T23:02:19.011056305Z" level=info msg="TaskExit event in podsandbox handler container_id:\"68aecf2c4346ec4bdc81cd3fbd3adcd7d806de594030f98a007bf20d21d60f45\" id:\"5d2299b75b8a5f23b462446933470655e19c915cef8eaa7c1f0cb02cb00193d7\" pid:5695 exited_at:{seconds:1757718139 nanos:10689922}" Sep 12 23:02:25.054378 containerd[1728]: time="2025-09-12T23:02:25.054340585Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e8cc8b77b610cc708923ac9dc5b7ed4be5655c8b77019871a88dbce61c25d491\" id:\"a8b8ab270d7e37178a428c0e5ee0f3239a406c5edc9a147822424bb89f5f8296\" pid:5724 exited_at:{seconds:1757718145 nanos:54092922}" Sep 12 23:02:37.982617 containerd[1728]: time="2025-09-12T23:02:37.982479746Z" level=info msg="TaskExit event in podsandbox handler container_id:\"22a2101fce323ca79525c588bae8cd4cd384e9dc81d6546421115528d3a2a5c2\" id:\"5a6f11d8755b866b00e064ae75a7671c1643bf4b40af5affa1e5fc8a107b37c1\" pid:5751 exited_at:{seconds:1757718157 nanos:982254633}" Sep 12 23:02:49.019299 containerd[1728]: time="2025-09-12T23:02:49.019254192Z" level=info msg="TaskExit event in podsandbox handler container_id:\"68aecf2c4346ec4bdc81cd3fbd3adcd7d806de594030f98a007bf20d21d60f45\" 
id:\"eed0f8bc5101719d40034497174e06c9e397265a44a5caec7a033f63f87d39f3\" pid:5777 exited_at:{seconds:1757718169 nanos:19075359}" Sep 12 23:02:55.053666 containerd[1728]: time="2025-09-12T23:02:55.053542702Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e8cc8b77b610cc708923ac9dc5b7ed4be5655c8b77019871a88dbce61c25d491\" id:\"c586279d191afa36192fe5030f4e4957302d8ed45d3314dbdc4214815e9e5318\" pid:5799 exited_at:{seconds:1757718175 nanos:53322802}" Sep 12 23:03:03.925938 containerd[1728]: time="2025-09-12T23:03:03.925896747Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e8cc8b77b610cc708923ac9dc5b7ed4be5655c8b77019871a88dbce61c25d491\" id:\"cc029ac6e70289045eca4e51345ba18dbefdb7023c918aef6866b46930b20f9b\" pid:5829 exited_at:{seconds:1757718183 nanos:925717849}" Sep 12 23:03:07.978556 containerd[1728]: time="2025-09-12T23:03:07.978517273Z" level=info msg="TaskExit event in podsandbox handler container_id:\"22a2101fce323ca79525c588bae8cd4cd384e9dc81d6546421115528d3a2a5c2\" id:\"1cd9cb5c8cfb2c501fd7596ff5fe902673ad4869147287622a423ccd943ba89e\" pid:5856 exited_at:{seconds:1757718187 nanos:978262530}" Sep 12 23:03:09.184599 containerd[1728]: time="2025-09-12T23:03:09.184565864Z" level=info msg="TaskExit event in podsandbox handler container_id:\"68aecf2c4346ec4bdc81cd3fbd3adcd7d806de594030f98a007bf20d21d60f45\" id:\"54d6eacc90bd1894e1fd7bd002d3ffb32a6fdb34fae5ed2be2619481ee53b131\" pid:5879 exited_at:{seconds:1757718189 nanos:184372529}" Sep 12 23:03:18.104421 systemd[1]: Started sshd@7-10.200.8.17:22-10.200.16.10:34444.service - OpenSSH per-connection server daemon (10.200.16.10:34444). Sep 12 23:03:18.726533 sshd[5914]: Accepted publickey for core from 10.200.16.10 port 34444 ssh2: RSA SHA256:iICXlNsXx8loOwZfJIFtboVHRCxNi5XgLL8FA7cZeOk Sep 12 23:03:18.727365 sshd-session[5914]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:03:18.731187 systemd-logind[1704]: New session 10 of user core. 
Sep 12 23:03:18.733620 systemd[1]: Started session-10.scope - Session 10 of User core. Sep 12 23:03:19.009992 containerd[1728]: time="2025-09-12T23:03:19.009762116Z" level=info msg="TaskExit event in podsandbox handler container_id:\"68aecf2c4346ec4bdc81cd3fbd3adcd7d806de594030f98a007bf20d21d60f45\" id:\"ea22a60ba40c7ecbd2f03b460ef7db91dd445caab0ed4490e69e964938b373af\" pid:5931 exited_at:{seconds:1757718199 nanos:9390404}" Sep 12 23:03:19.288640 sshd[5917]: Connection closed by 10.200.16.10 port 34444 Sep 12 23:03:19.289143 sshd-session[5914]: pam_unix(sshd:session): session closed for user core Sep 12 23:03:19.291792 systemd-logind[1704]: Session 10 logged out. Waiting for processes to exit. Sep 12 23:03:19.291981 systemd[1]: sshd@7-10.200.8.17:22-10.200.16.10:34444.service: Deactivated successfully. Sep 12 23:03:19.293941 systemd[1]: session-10.scope: Deactivated successfully. Sep 12 23:03:19.295619 systemd-logind[1704]: Removed session 10. Sep 12 23:03:24.410747 systemd[1]: Started sshd@8-10.200.8.17:22-10.200.16.10:45052.service - OpenSSH per-connection server daemon (10.200.16.10:45052). Sep 12 23:03:25.054075 sshd[5953]: Accepted publickey for core from 10.200.16.10 port 45052 ssh2: RSA SHA256:iICXlNsXx8loOwZfJIFtboVHRCxNi5XgLL8FA7cZeOk Sep 12 23:03:25.055186 sshd-session[5953]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:03:25.065278 systemd-logind[1704]: New session 11 of user core. Sep 12 23:03:25.070648 systemd[1]: Started session-11.scope - Session 11 of User core. 
Sep 12 23:03:25.071765 containerd[1728]: time="2025-09-12T23:03:25.071624305Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e8cc8b77b610cc708923ac9dc5b7ed4be5655c8b77019871a88dbce61c25d491\" id:\"6e858de205ca184eefe47dc7c26110a3c35abb6674a76b2f7733f054d04b75e9\" pid:5968 exited_at:{seconds:1757718205 nanos:70298730}" Sep 12 23:03:25.545481 sshd[5978]: Connection closed by 10.200.16.10 port 45052 Sep 12 23:03:25.545833 sshd-session[5953]: pam_unix(sshd:session): session closed for user core Sep 12 23:03:25.548447 systemd[1]: sshd@8-10.200.8.17:22-10.200.16.10:45052.service: Deactivated successfully. Sep 12 23:03:25.549937 systemd[1]: session-11.scope: Deactivated successfully. Sep 12 23:03:25.550921 systemd-logind[1704]: Session 11 logged out. Waiting for processes to exit. Sep 12 23:03:25.551883 systemd-logind[1704]: Removed session 11. Sep 12 23:03:30.661253 systemd[1]: Started sshd@9-10.200.8.17:22-10.200.16.10:49456.service - OpenSSH per-connection server daemon (10.200.16.10:49456). Sep 12 23:03:31.280780 sshd[5991]: Accepted publickey for core from 10.200.16.10 port 49456 ssh2: RSA SHA256:iICXlNsXx8loOwZfJIFtboVHRCxNi5XgLL8FA7cZeOk Sep 12 23:03:31.281631 sshd-session[5991]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:03:31.285308 systemd-logind[1704]: New session 12 of user core. Sep 12 23:03:31.294618 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 12 23:03:31.769817 sshd[5994]: Connection closed by 10.200.16.10 port 49456 Sep 12 23:03:31.769060 sshd-session[5991]: pam_unix(sshd:session): session closed for user core Sep 12 23:03:31.772699 systemd[1]: sshd@9-10.200.8.17:22-10.200.16.10:49456.service: Deactivated successfully. Sep 12 23:03:31.774257 systemd[1]: session-12.scope: Deactivated successfully. Sep 12 23:03:31.775369 systemd-logind[1704]: Session 12 logged out. Waiting for processes to exit. Sep 12 23:03:31.776649 systemd-logind[1704]: Removed session 12. 
Sep 12 23:03:31.878951 systemd[1]: Started sshd@10-10.200.8.17:22-10.200.16.10:49460.service - OpenSSH per-connection server daemon (10.200.16.10:49460). Sep 12 23:03:32.497828 sshd[6007]: Accepted publickey for core from 10.200.16.10 port 49460 ssh2: RSA SHA256:iICXlNsXx8loOwZfJIFtboVHRCxNi5XgLL8FA7cZeOk Sep 12 23:03:32.498664 sshd-session[6007]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:03:32.502061 systemd-logind[1704]: New session 13 of user core. Sep 12 23:03:32.505610 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 12 23:03:32.997583 sshd[6011]: Connection closed by 10.200.16.10 port 49460 Sep 12 23:03:32.997921 sshd-session[6007]: pam_unix(sshd:session): session closed for user core Sep 12 23:03:33.000256 systemd-logind[1704]: Session 13 logged out. Waiting for processes to exit. Sep 12 23:03:33.000773 systemd[1]: sshd@10-10.200.8.17:22-10.200.16.10:49460.service: Deactivated successfully. Sep 12 23:03:33.002172 systemd[1]: session-13.scope: Deactivated successfully. Sep 12 23:03:33.003121 systemd-logind[1704]: Removed session 13. Sep 12 23:03:33.107581 systemd[1]: Started sshd@11-10.200.8.17:22-10.200.16.10:49472.service - OpenSSH per-connection server daemon (10.200.16.10:49472). Sep 12 23:03:33.733466 sshd[6021]: Accepted publickey for core from 10.200.16.10 port 49472 ssh2: RSA SHA256:iICXlNsXx8loOwZfJIFtboVHRCxNi5XgLL8FA7cZeOk Sep 12 23:03:33.734191 sshd-session[6021]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:03:33.737481 systemd-logind[1704]: New session 14 of user core. Sep 12 23:03:33.741613 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 12 23:03:34.244497 sshd[6024]: Connection closed by 10.200.16.10 port 49472 Sep 12 23:03:34.244810 sshd-session[6021]: pam_unix(sshd:session): session closed for user core Sep 12 23:03:34.247380 systemd-logind[1704]: Session 14 logged out. Waiting for processes to exit. 
Sep 12 23:03:34.247446 systemd[1]: sshd@11-10.200.8.17:22-10.200.16.10:49472.service: Deactivated successfully. Sep 12 23:03:34.248920 systemd[1]: session-14.scope: Deactivated successfully. Sep 12 23:03:34.250193 systemd-logind[1704]: Removed session 14. Sep 12 23:03:37.979402 containerd[1728]: time="2025-09-12T23:03:37.979351900Z" level=info msg="TaskExit event in podsandbox handler container_id:\"22a2101fce323ca79525c588bae8cd4cd384e9dc81d6546421115528d3a2a5c2\" id:\"8a1f7bbccbab11bc96b3adab2a08f6c6691bab670983aa023b5909b3ac754607\" pid:6054 exit_status:1 exited_at:{seconds:1757718217 nanos:979123861}" Sep 12 23:03:39.355920 systemd[1]: Started sshd@12-10.200.8.17:22-10.200.16.10:49484.service - OpenSSH per-connection server daemon (10.200.16.10:49484). Sep 12 23:03:39.981801 sshd[6069]: Accepted publickey for core from 10.200.16.10 port 49484 ssh2: RSA SHA256:iICXlNsXx8loOwZfJIFtboVHRCxNi5XgLL8FA7cZeOk Sep 12 23:03:39.982790 sshd-session[6069]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:03:39.986376 systemd-logind[1704]: New session 15 of user core. Sep 12 23:03:39.992616 systemd[1]: Started session-15.scope - Session 15 of User core. Sep 12 23:03:40.464604 sshd[6072]: Connection closed by 10.200.16.10 port 49484 Sep 12 23:03:40.464955 sshd-session[6069]: pam_unix(sshd:session): session closed for user core Sep 12 23:03:40.467355 systemd[1]: sshd@12-10.200.8.17:22-10.200.16.10:49484.service: Deactivated successfully. Sep 12 23:03:40.468881 systemd[1]: session-15.scope: Deactivated successfully. Sep 12 23:03:40.469553 systemd-logind[1704]: Session 15 logged out. Waiting for processes to exit. Sep 12 23:03:40.471087 systemd-logind[1704]: Removed session 15. Sep 12 23:03:45.578204 systemd[1]: Started sshd@13-10.200.8.17:22-10.200.16.10:41946.service - OpenSSH per-connection server daemon (10.200.16.10:41946). 
Sep 12 23:03:46.206420 sshd[6084]: Accepted publickey for core from 10.200.16.10 port 41946 ssh2: RSA SHA256:iICXlNsXx8loOwZfJIFtboVHRCxNi5XgLL8FA7cZeOk Sep 12 23:03:46.207407 sshd-session[6084]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:03:46.211156 systemd-logind[1704]: New session 16 of user core. Sep 12 23:03:46.213635 systemd[1]: Started session-16.scope - Session 16 of User core. Sep 12 23:03:46.690140 sshd[6087]: Connection closed by 10.200.16.10 port 41946 Sep 12 23:03:46.690480 sshd-session[6084]: pam_unix(sshd:session): session closed for user core Sep 12 23:03:46.692944 systemd[1]: sshd@13-10.200.8.17:22-10.200.16.10:41946.service: Deactivated successfully. Sep 12 23:03:46.694552 systemd[1]: session-16.scope: Deactivated successfully. Sep 12 23:03:46.695206 systemd-logind[1704]: Session 16 logged out. Waiting for processes to exit. Sep 12 23:03:46.696234 systemd-logind[1704]: Removed session 16. Sep 12 23:03:49.008144 containerd[1728]: time="2025-09-12T23:03:49.008111597Z" level=info msg="TaskExit event in podsandbox handler container_id:\"68aecf2c4346ec4bdc81cd3fbd3adcd7d806de594030f98a007bf20d21d60f45\" id:\"ec435f9830f5818e70b07b089cb95ab7f3a9c1d424e7dce3cbf1fe0927a438ab\" pid:6111 exited_at:{seconds:1757718229 nanos:7894839}" Sep 12 23:03:51.802098 systemd[1]: Started sshd@14-10.200.8.17:22-10.200.16.10:59454.service - OpenSSH per-connection server daemon (10.200.16.10:59454). Sep 12 23:03:52.428974 sshd[6122]: Accepted publickey for core from 10.200.16.10 port 59454 ssh2: RSA SHA256:iICXlNsXx8loOwZfJIFtboVHRCxNi5XgLL8FA7cZeOk Sep 12 23:03:52.431177 sshd-session[6122]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:03:52.435705 systemd-logind[1704]: New session 17 of user core. Sep 12 23:03:52.443183 systemd[1]: Started session-17.scope - Session 17 of User core. 
Sep 12 23:03:52.944372 sshd[6125]: Connection closed by 10.200.16.10 port 59454 Sep 12 23:03:52.945177 sshd-session[6122]: pam_unix(sshd:session): session closed for user core Sep 12 23:03:52.949340 systemd[1]: sshd@14-10.200.8.17:22-10.200.16.10:59454.service: Deactivated successfully. Sep 12 23:03:52.952695 systemd[1]: session-17.scope: Deactivated successfully. Sep 12 23:03:52.953603 systemd-logind[1704]: Session 17 logged out. Waiting for processes to exit. Sep 12 23:03:52.955631 systemd-logind[1704]: Removed session 17. Sep 12 23:03:53.056171 systemd[1]: Started sshd@15-10.200.8.17:22-10.200.16.10:59462.service - OpenSSH per-connection server daemon (10.200.16.10:59462). Sep 12 23:03:53.690041 sshd[6137]: Accepted publickey for core from 10.200.16.10 port 59462 ssh2: RSA SHA256:iICXlNsXx8loOwZfJIFtboVHRCxNi5XgLL8FA7cZeOk Sep 12 23:03:53.690430 sshd-session[6137]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:03:53.695940 systemd-logind[1704]: New session 18 of user core. Sep 12 23:03:53.703652 systemd[1]: Started session-18.scope - Session 18 of User core. Sep 12 23:03:54.243182 sshd[6140]: Connection closed by 10.200.16.10 port 59462 Sep 12 23:03:54.243068 sshd-session[6137]: pam_unix(sshd:session): session closed for user core Sep 12 23:03:54.246777 systemd[1]: sshd@15-10.200.8.17:22-10.200.16.10:59462.service: Deactivated successfully. Sep 12 23:03:54.247663 systemd-logind[1704]: Session 18 logged out. Waiting for processes to exit. Sep 12 23:03:54.248905 systemd[1]: session-18.scope: Deactivated successfully. Sep 12 23:03:54.251052 systemd-logind[1704]: Removed session 18. Sep 12 23:03:54.357681 systemd[1]: Started sshd@16-10.200.8.17:22-10.200.16.10:59470.service - OpenSSH per-connection server daemon (10.200.16.10:59470). 
Sep 12 23:03:54.979463 sshd[6150]: Accepted publickey for core from 10.200.16.10 port 59470 ssh2: RSA SHA256:iICXlNsXx8loOwZfJIFtboVHRCxNi5XgLL8FA7cZeOk Sep 12 23:03:54.980284 sshd-session[6150]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:03:54.983767 systemd-logind[1704]: New session 19 of user core. Sep 12 23:03:54.989658 systemd[1]: Started session-19.scope - Session 19 of User core. Sep 12 23:03:55.062378 containerd[1728]: time="2025-09-12T23:03:55.062346703Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e8cc8b77b610cc708923ac9dc5b7ed4be5655c8b77019871a88dbce61c25d491\" id:\"17402e78904af7cf4170de5d678a26575be28d8d0272cf286d776c0433798ca4\" pid:6165 exited_at:{seconds:1757718235 nanos:61976417}" Sep 12 23:03:56.022176 sshd[6153]: Connection closed by 10.200.16.10 port 59470 Sep 12 23:03:56.022640 sshd-session[6150]: pam_unix(sshd:session): session closed for user core Sep 12 23:03:56.026366 systemd-logind[1704]: Session 19 logged out. Waiting for processes to exit. Sep 12 23:03:56.027438 systemd[1]: sshd@16-10.200.8.17:22-10.200.16.10:59470.service: Deactivated successfully. Sep 12 23:03:56.031223 systemd[1]: session-19.scope: Deactivated successfully. Sep 12 23:03:56.034327 systemd-logind[1704]: Removed session 19. Sep 12 23:03:56.136798 systemd[1]: Started sshd@17-10.200.8.17:22-10.200.16.10:59474.service - OpenSSH per-connection server daemon (10.200.16.10:59474). Sep 12 23:03:56.757637 sshd[6191]: Accepted publickey for core from 10.200.16.10 port 59474 ssh2: RSA SHA256:iICXlNsXx8loOwZfJIFtboVHRCxNi5XgLL8FA7cZeOk Sep 12 23:03:56.758393 sshd-session[6191]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:03:56.761933 systemd-logind[1704]: New session 20 of user core. Sep 12 23:03:56.765629 systemd[1]: Started session-20.scope - Session 20 of User core. 
Sep 12 23:03:57.315437 sshd[6194]: Connection closed by 10.200.16.10 port 59474 Sep 12 23:03:57.315769 sshd-session[6191]: pam_unix(sshd:session): session closed for user core Sep 12 23:03:57.318024 systemd[1]: sshd@17-10.200.8.17:22-10.200.16.10:59474.service: Deactivated successfully. Sep 12 23:03:57.319601 systemd[1]: session-20.scope: Deactivated successfully. Sep 12 23:03:57.320282 systemd-logind[1704]: Session 20 logged out. Waiting for processes to exit. Sep 12 23:03:57.321442 systemd-logind[1704]: Removed session 20. Sep 12 23:03:57.431717 systemd[1]: Started sshd@18-10.200.8.17:22-10.200.16.10:59478.service - OpenSSH per-connection server daemon (10.200.16.10:59478). Sep 12 23:03:58.052356 sshd[6204]: Accepted publickey for core from 10.200.16.10 port 59478 ssh2: RSA SHA256:iICXlNsXx8loOwZfJIFtboVHRCxNi5XgLL8FA7cZeOk Sep 12 23:03:58.053149 sshd-session[6204]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:03:58.056641 systemd-logind[1704]: New session 21 of user core. Sep 12 23:03:58.062604 systemd[1]: Started session-21.scope - Session 21 of User core. Sep 12 23:03:58.531073 sshd[6207]: Connection closed by 10.200.16.10 port 59478 Sep 12 23:03:58.532133 sshd-session[6204]: pam_unix(sshd:session): session closed for user core Sep 12 23:03:58.534538 systemd-logind[1704]: Session 21 logged out. Waiting for processes to exit. Sep 12 23:03:58.534608 systemd[1]: sshd@18-10.200.8.17:22-10.200.16.10:59478.service: Deactivated successfully. Sep 12 23:03:58.535938 systemd[1]: session-21.scope: Deactivated successfully. Sep 12 23:03:58.537555 systemd-logind[1704]: Removed session 21. Sep 12 23:04:03.686057 systemd[1]: Started sshd@19-10.200.8.17:22-10.200.16.10:46666.service - OpenSSH per-connection server daemon (10.200.16.10:46666). 
Sep 12 23:04:03.930458 containerd[1728]: time="2025-09-12T23:04:03.930254577Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e8cc8b77b610cc708923ac9dc5b7ed4be5655c8b77019871a88dbce61c25d491\" id:\"2d507b23b2ad00c838e845a966404a0e65265eb8b05b2e9e45b816a730b09ad7\" pid:6239 exited_at:{seconds:1757718243 nanos:930025161}"
Sep 12 23:04:04.320759 sshd[6223]: Accepted publickey for core from 10.200.16.10 port 46666 ssh2: RSA SHA256:iICXlNsXx8loOwZfJIFtboVHRCxNi5XgLL8FA7cZeOk
Sep 12 23:04:04.321674 sshd-session[6223]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 23:04:04.324808 systemd-logind[1704]: New session 22 of user core.
Sep 12 23:04:04.330639 systemd[1]: Started session-22.scope - Session 22 of User core.
Sep 12 23:04:04.814035 sshd[6251]: Connection closed by 10.200.16.10 port 46666
Sep 12 23:04:04.814387 sshd-session[6223]: pam_unix(sshd:session): session closed for user core
Sep 12 23:04:04.817111 systemd-logind[1704]: Session 22 logged out. Waiting for processes to exit.
Sep 12 23:04:04.817209 systemd[1]: sshd@19-10.200.8.17:22-10.200.16.10:46666.service: Deactivated successfully.
Sep 12 23:04:04.818746 systemd[1]: session-22.scope: Deactivated successfully.
Sep 12 23:04:04.819961 systemd-logind[1704]: Removed session 22.
Sep 12 23:04:07.994292 containerd[1728]: time="2025-09-12T23:04:07.994251889Z" level=info msg="TaskExit event in podsandbox handler container_id:\"22a2101fce323ca79525c588bae8cd4cd384e9dc81d6546421115528d3a2a5c2\" id:\"da18737681503f637e469f5bdd3b92579619677d5f371f52ea232db4ad8b33f4\" pid:6276 exited_at:{seconds:1757718247 nanos:993935003}"
Sep 12 23:04:09.192218 containerd[1728]: time="2025-09-12T23:04:09.192175456Z" level=info msg="TaskExit event in podsandbox handler container_id:\"68aecf2c4346ec4bdc81cd3fbd3adcd7d806de594030f98a007bf20d21d60f45\" id:\"89bcb1b7ed376c94e6f6cb0b2d96793897284b848f2a8abdace422b7f162f2ac\" pid:6300 exited_at:{seconds:1757718249 nanos:191944058}"
Sep 12 23:04:09.928310 systemd[1]: Started sshd@20-10.200.8.17:22-10.200.16.10:33358.service - OpenSSH per-connection server daemon (10.200.16.10:33358).
Sep 12 23:04:10.551606 sshd[6310]: Accepted publickey for core from 10.200.16.10 port 33358 ssh2: RSA SHA256:iICXlNsXx8loOwZfJIFtboVHRCxNi5XgLL8FA7cZeOk
Sep 12 23:04:10.552458 sshd-session[6310]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 23:04:10.556321 systemd-logind[1704]: New session 23 of user core.
Sep 12 23:04:10.564625 systemd[1]: Started session-23.scope - Session 23 of User core.
Sep 12 23:04:11.044899 sshd[6313]: Connection closed by 10.200.16.10 port 33358
Sep 12 23:04:11.046635 sshd-session[6310]: pam_unix(sshd:session): session closed for user core
Sep 12 23:04:11.050693 systemd-logind[1704]: Session 23 logged out. Waiting for processes to exit.
Sep 12 23:04:11.053122 systemd[1]: sshd@20-10.200.8.17:22-10.200.16.10:33358.service: Deactivated successfully.
Sep 12 23:04:11.055980 systemd[1]: session-23.scope: Deactivated successfully.
Sep 12 23:04:11.058084 systemd-logind[1704]: Removed session 23.
Sep 12 23:04:13.849682 update_engine[1706]: I20250912 23:04:13.849640 1706 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs
Sep 12 23:04:13.849682 update_engine[1706]: I20250912 23:04:13.849678 1706 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs
Sep 12 23:04:13.850519 update_engine[1706]: I20250912 23:04:13.849803 1706 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs
Sep 12 23:04:13.850519 update_engine[1706]: I20250912 23:04:13.850095 1706 omaha_request_params.cc:62] Current group set to alpha
Sep 12 23:04:13.850519 update_engine[1706]: I20250912 23:04:13.850181 1706 update_attempter.cc:499] Already updated boot flags. Skipping.
Sep 12 23:04:13.850519 update_engine[1706]: I20250912 23:04:13.850188 1706 update_attempter.cc:643] Scheduling an action processor start.
Sep 12 23:04:13.850519 update_engine[1706]: I20250912 23:04:13.850206 1706 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
Sep 12 23:04:13.850519 update_engine[1706]: I20250912 23:04:13.850227 1706 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs
Sep 12 23:04:13.850519 update_engine[1706]: I20250912 23:04:13.850271 1706 omaha_request_action.cc:271] Posting an Omaha request to disabled
Sep 12 23:04:13.850519 update_engine[1706]: I20250912 23:04:13.850275 1706 omaha_request_action.cc:272] Request:
Sep 12 23:04:13.850519 update_engine[1706]:
Sep 12 23:04:13.850519 update_engine[1706]:
Sep 12 23:04:13.850519 update_engine[1706]:
Sep 12 23:04:13.850519 update_engine[1706]:
Sep 12 23:04:13.850519 update_engine[1706]:
Sep 12 23:04:13.850519 update_engine[1706]:
Sep 12 23:04:13.850519 update_engine[1706]:
Sep 12 23:04:13.850519 update_engine[1706]:
Sep 12 23:04:13.850519 update_engine[1706]: I20250912 23:04:13.850281 1706 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Sep 12 23:04:13.851083 locksmithd[1805]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0
Sep 12 23:04:13.851228 update_engine[1706]: I20250912 23:04:13.851171 1706 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Sep 12 23:04:13.851671 update_engine[1706]: I20250912 23:04:13.851649 1706 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Sep 12 23:04:13.883661 update_engine[1706]: E20250912 23:04:13.883630 1706 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Sep 12 23:04:13.883739 update_engine[1706]: I20250912 23:04:13.883706 1706 libcurl_http_fetcher.cc:283] No HTTP response, retry 1
Sep 12 23:04:16.161079 systemd[1]: Started sshd@21-10.200.8.17:22-10.200.16.10:33368.service - OpenSSH per-connection server daemon (10.200.16.10:33368).
Sep 12 23:04:16.782938 sshd[6326]: Accepted publickey for core from 10.200.16.10 port 33368 ssh2: RSA SHA256:iICXlNsXx8loOwZfJIFtboVHRCxNi5XgLL8FA7cZeOk
Sep 12 23:04:16.783850 sshd-session[6326]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 23:04:16.787630 systemd-logind[1704]: New session 24 of user core.
Sep 12 23:04:16.792863 systemd[1]: Started session-24.scope - Session 24 of User core.
Sep 12 23:04:17.263614 sshd[6329]: Connection closed by 10.200.16.10 port 33368
Sep 12 23:04:17.263947 sshd-session[6326]: pam_unix(sshd:session): session closed for user core
Sep 12 23:04:17.266392 systemd-logind[1704]: Session 24 logged out. Waiting for processes to exit.
Sep 12 23:04:17.266948 systemd[1]: sshd@21-10.200.8.17:22-10.200.16.10:33368.service: Deactivated successfully.
Sep 12 23:04:17.268363 systemd[1]: session-24.scope: Deactivated successfully.
Sep 12 23:04:17.270033 systemd-logind[1704]: Removed session 24.
Sep 12 23:04:19.005829 containerd[1728]: time="2025-09-12T23:04:19.005709406Z" level=info msg="TaskExit event in podsandbox handler container_id:\"68aecf2c4346ec4bdc81cd3fbd3adcd7d806de594030f98a007bf20d21d60f45\" id:\"0e7b61abd5061b8af0cf42553aae012ecec1001b0a2f689745e0f7c8be04e6d3\" pid:6352 exited_at:{seconds:1757718259 nanos:5454564}"
Sep 12 23:04:22.375527 systemd[1]: Started sshd@22-10.200.8.17:22-10.200.16.10:52926.service - OpenSSH per-connection server daemon (10.200.16.10:52926).
Sep 12 23:04:22.998345 sshd[6368]: Accepted publickey for core from 10.200.16.10 port 52926 ssh2: RSA SHA256:iICXlNsXx8loOwZfJIFtboVHRCxNi5XgLL8FA7cZeOk
Sep 12 23:04:22.999468 sshd-session[6368]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 23:04:23.003660 systemd-logind[1704]: New session 25 of user core.
Sep 12 23:04:23.008636 systemd[1]: Started session-25.scope - Session 25 of User core.
Sep 12 23:04:23.500283 sshd[6371]: Connection closed by 10.200.16.10 port 52926
Sep 12 23:04:23.500854 sshd-session[6368]: pam_unix(sshd:session): session closed for user core
Sep 12 23:04:23.504097 systemd-logind[1704]: Session 25 logged out. Waiting for processes to exit.
Sep 12 23:04:23.505582 systemd[1]: sshd@22-10.200.8.17:22-10.200.16.10:52926.service: Deactivated successfully.
Sep 12 23:04:23.509428 systemd[1]: session-25.scope: Deactivated successfully.
Sep 12 23:04:23.511321 systemd-logind[1704]: Removed session 25.
Sep 12 23:04:23.848564 update_engine[1706]: I20250912 23:04:23.848520 1706 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Sep 12 23:04:23.848808 update_engine[1706]: I20250912 23:04:23.848603 1706 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Sep 12 23:04:23.848931 update_engine[1706]: I20250912 23:04:23.848907 1706 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Sep 12 23:04:23.923977 update_engine[1706]: E20250912 23:04:23.923946 1706 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Sep 12 23:04:23.924063 update_engine[1706]: I20250912 23:04:23.924021 1706 libcurl_http_fetcher.cc:283] No HTTP response, retry 2
Sep 12 23:04:25.062103 containerd[1728]: time="2025-09-12T23:04:25.062066522Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e8cc8b77b610cc708923ac9dc5b7ed4be5655c8b77019871a88dbce61c25d491\" id:\"90c94a67ca7d2790c59eb94597504dbd948ff75bacf3a65ee3ef27a6c4c248a2\" pid:6397 exited_at:{seconds:1757718265 nanos:61883648}"
Sep 12 23:04:28.619231 systemd[1]: Started sshd@23-10.200.8.17:22-10.200.16.10:52942.service - OpenSSH per-connection server daemon (10.200.16.10:52942).
Sep 12 23:04:29.241776 sshd[6409]: Accepted publickey for core from 10.200.16.10 port 52942 ssh2: RSA SHA256:iICXlNsXx8loOwZfJIFtboVHRCxNi5XgLL8FA7cZeOk
Sep 12 23:04:29.242680 sshd-session[6409]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 23:04:29.247457 systemd-logind[1704]: New session 26 of user core.
Sep 12 23:04:29.252651 systemd[1]: Started session-26.scope - Session 26 of User core.
Sep 12 23:04:29.725081 sshd[6412]: Connection closed by 10.200.16.10 port 52942
Sep 12 23:04:29.726820 sshd-session[6409]: pam_unix(sshd:session): session closed for user core
Sep 12 23:04:29.729315 systemd-logind[1704]: Session 26 logged out. Waiting for processes to exit.
Sep 12 23:04:29.729479 systemd[1]: sshd@23-10.200.8.17:22-10.200.16.10:52942.service: Deactivated successfully.
Sep 12 23:04:29.731119 systemd[1]: session-26.scope: Deactivated successfully.
Sep 12 23:04:29.732122 systemd-logind[1704]: Removed session 26.
Sep 12 23:04:33.851693 update_engine[1706]: I20250912 23:04:33.851648 1706 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Sep 12 23:04:33.851941 update_engine[1706]: I20250912 23:04:33.851723 1706 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Sep 12 23:04:33.852069 update_engine[1706]: I20250912 23:04:33.852044 1706 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Sep 12 23:04:33.866906 update_engine[1706]: E20250912 23:04:33.866876 1706 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Sep 12 23:04:33.866968 update_engine[1706]: I20250912 23:04:33.866931 1706 libcurl_http_fetcher.cc:283] No HTTP response, retry 3