Sep 12 17:41:50.927143 kernel: Linux version 6.12.47-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Fri Sep 12 15:34:39 -00 2025 Sep 12 17:41:50.927186 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=ec2 modprobe.blacklist=xen_fbfront net.ifnames=0 nvme_core.io_timeout=4294967295 verity.usrhash=271a44cc8ea1639cfb6fdf777202a5f025fda0b3ce9b293cc4e0e7047aecb858 Sep 12 17:41:50.927202 kernel: BIOS-provided physical RAM map: Sep 12 17:41:50.927213 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable Sep 12 17:41:50.927223 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000786cdfff] usable Sep 12 17:41:50.927233 kernel: BIOS-e820: [mem 0x00000000786ce000-0x000000007894dfff] reserved Sep 12 17:41:50.927246 kernel: BIOS-e820: [mem 0x000000007894e000-0x000000007895dfff] ACPI data Sep 12 17:41:50.927258 kernel: BIOS-e820: [mem 0x000000007895e000-0x00000000789ddfff] ACPI NVS Sep 12 17:41:50.927271 kernel: BIOS-e820: [mem 0x00000000789de000-0x000000007c97bfff] usable Sep 12 17:41:50.927282 kernel: BIOS-e820: [mem 0x000000007c97c000-0x000000007c9fffff] reserved Sep 12 17:41:50.927293 kernel: NX (Execute Disable) protection: active Sep 12 17:41:50.927304 kernel: APIC: Static calls initialized Sep 12 17:41:50.927315 kernel: e820: update [mem 0x768c0018-0x768c8e57] usable ==> usable Sep 12 17:41:50.927327 kernel: extended physical RAM map: Sep 12 17:41:50.927344 kernel: reserve setup_data: [mem 0x0000000000000000-0x000000000009ffff] usable Sep 12 17:41:50.927356 kernel: reserve setup_data: [mem 0x0000000000100000-0x00000000768c0017] usable Sep 12 17:41:50.927369 kernel: reserve setup_data: [mem 0x00000000768c0018-0x00000000768c8e57] usable Sep 12 17:41:50.927381 kernel: reserve setup_data: [mem 0x00000000768c8e58-0x00000000786cdfff] usable Sep 12 17:41:50.927393 kernel: reserve setup_data: [mem 0x00000000786ce000-0x000000007894dfff] reserved Sep 12 17:41:50.927405 kernel: reserve setup_data: [mem 0x000000007894e000-0x000000007895dfff] ACPI data Sep 12 17:41:50.927418 kernel: reserve setup_data: [mem 0x000000007895e000-0x00000000789ddfff] ACPI NVS Sep 12 17:41:50.927430 kernel: reserve setup_data: [mem 0x00000000789de000-0x000000007c97bfff] usable Sep 12 17:41:50.927442 kernel: reserve setup_data: [mem 0x000000007c97c000-0x000000007c9fffff] reserved Sep 12 17:41:50.927454 kernel: efi: EFI v2.7 by EDK II Sep 12 17:41:50.927469 kernel: efi: SMBIOS=0x7886a000 ACPI=0x7895d000 ACPI 2.0=0x7895d014 MEMATTR=0x77003518 Sep 12 17:41:50.927481 kernel: secureboot: Secure boot disabled Sep 12 17:41:50.927493 kernel: SMBIOS 2.7 present. 
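The command line logged above carries Flatcar's dm-verity parameters for /usr (verity.usr=PARTUUID=..., verity.usrhash=...) alongside the usual root=, console= and first-boot flags. As a rough sketch (not Flatcar's own tooling), a command line of this shape can be split into key/value pairs for inspection; repeated keys such as console= simply keep the last value here:

    # Minimal sketch: parse a kernel command line like the one above into a dict.
    # Flag-only tokens map to True; duplicate keys keep the last value seen.
    import shlex

    def parse_cmdline(cmdline: str) -> dict:
        params = {}
        for token in shlex.split(cmdline):
            key, sep, value = token.partition("=")
            params[key] = value if sep else True
        return params

    with open("/proc/cmdline") as f:
        params = parse_cmdline(f.read())
    print(params.get("root"), params.get("verity.usrhash"))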
Sep 12 17:41:50.927505 kernel: DMI: Amazon EC2 t3.small/, BIOS 1.0 10/16/2017 Sep 12 17:41:50.927517 kernel: DMI: Memory slots populated: 1/1 Sep 12 17:41:50.927529 kernel: Hypervisor detected: KVM Sep 12 17:41:50.927541 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00 Sep 12 17:41:50.927554 kernel: kvm-clock: using sched offset of 5225041336 cycles Sep 12 17:41:50.927567 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns Sep 12 17:41:50.927580 kernel: tsc: Detected 2499.996 MHz processor Sep 12 17:41:50.927593 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Sep 12 17:41:50.927608 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Sep 12 17:41:50.927621 kernel: last_pfn = 0x7c97c max_arch_pfn = 0x400000000 Sep 12 17:41:50.927633 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs Sep 12 17:41:50.927645 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Sep 12 17:41:50.927658 kernel: Using GB pages for direct mapping Sep 12 17:41:50.927680 kernel: ACPI: Early table checksum verification disabled Sep 12 17:41:50.927704 kernel: ACPI: RSDP 0x000000007895D014 000024 (v02 AMAZON) Sep 12 17:41:50.927718 kernel: ACPI: XSDT 0x000000007895C0E8 00006C (v01 AMAZON AMZNFACP 00000001 01000013) Sep 12 17:41:50.927733 kernel: ACPI: FACP 0x0000000078955000 000114 (v01 AMAZON AMZNFACP 00000001 AMZN 00000001) Sep 12 17:41:50.927755 kernel: ACPI: DSDT 0x0000000078956000 00115A (v01 AMAZON AMZNDSDT 00000001 AMZN 00000001) Sep 12 17:41:50.927773 kernel: ACPI: FACS 0x00000000789D0000 000040 Sep 12 17:41:50.927788 kernel: ACPI: WAET 0x000000007895B000 000028 (v01 AMAZON AMZNWAET 00000001 AMZN 00000001) Sep 12 17:41:50.927803 kernel: ACPI: SLIT 0x000000007895A000 00006C (v01 AMAZON AMZNSLIT 00000001 AMZN 00000001) Sep 12 17:41:50.927818 kernel: ACPI: APIC 0x0000000078959000 000076 (v01 AMAZON AMZNAPIC 00000001 AMZN 00000001) Sep 12 17:41:50.927836 kernel: ACPI: SRAT 0x0000000078958000 0000A0 (v01 AMAZON AMZNSRAT 00000001 AMZN 00000001) Sep 12 17:41:50.927852 kernel: ACPI: HPET 0x0000000078954000 000038 (v01 AMAZON AMZNHPET 00000001 AMZN 00000001) Sep 12 17:41:50.927867 kernel: ACPI: SSDT 0x0000000078953000 000759 (v01 AMAZON AMZNSSDT 00000001 AMZN 00000001) Sep 12 17:41:50.927882 kernel: ACPI: SSDT 0x0000000078952000 00007F (v01 AMAZON AMZNSSDT 00000001 AMZN 00000001) Sep 12 17:41:50.927897 kernel: ACPI: BGRT 0x0000000078951000 000038 (v01 AMAZON AMAZON 00000002 01000013) Sep 12 17:41:50.927929 kernel: ACPI: Reserving FACP table memory at [mem 0x78955000-0x78955113] Sep 12 17:41:50.927945 kernel: ACPI: Reserving DSDT table memory at [mem 0x78956000-0x78957159] Sep 12 17:41:50.928995 kernel: ACPI: Reserving FACS table memory at [mem 0x789d0000-0x789d003f] Sep 12 17:41:50.929022 kernel: ACPI: Reserving WAET table memory at [mem 0x7895b000-0x7895b027] Sep 12 17:41:50.929038 kernel: ACPI: Reserving SLIT table memory at [mem 0x7895a000-0x7895a06b] Sep 12 17:41:50.929053 kernel: ACPI: Reserving APIC table memory at [mem 0x78959000-0x78959075] Sep 12 17:41:50.929069 kernel: ACPI: Reserving SRAT table memory at [mem 0x78958000-0x7895809f] Sep 12 17:41:50.929084 kernel: ACPI: Reserving HPET table memory at [mem 0x78954000-0x78954037] Sep 12 17:41:50.929099 kernel: ACPI: Reserving SSDT table memory at [mem 0x78953000-0x78953758] Sep 12 17:41:50.929115 kernel: ACPI: Reserving SSDT table memory at [mem 0x78952000-0x7895207e] Sep 12 17:41:50.929130 kernel: ACPI: Reserving BGRT table memory 
at [mem 0x78951000-0x78951037] Sep 12 17:41:50.929145 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x7fffffff] Sep 12 17:41:50.929160 kernel: NUMA: Initialized distance table, cnt=1 Sep 12 17:41:50.929178 kernel: NODE_DATA(0) allocated [mem 0x7a8eddc0-0x7a8f4fff] Sep 12 17:41:50.929193 kernel: Zone ranges: Sep 12 17:41:50.929208 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Sep 12 17:41:50.929223 kernel: DMA32 [mem 0x0000000001000000-0x000000007c97bfff] Sep 12 17:41:50.929238 kernel: Normal empty Sep 12 17:41:50.929253 kernel: Device empty Sep 12 17:41:50.929268 kernel: Movable zone start for each node Sep 12 17:41:50.929283 kernel: Early memory node ranges Sep 12 17:41:50.929298 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff] Sep 12 17:41:50.929316 kernel: node 0: [mem 0x0000000000100000-0x00000000786cdfff] Sep 12 17:41:50.929331 kernel: node 0: [mem 0x00000000789de000-0x000000007c97bfff] Sep 12 17:41:50.929346 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007c97bfff] Sep 12 17:41:50.929361 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Sep 12 17:41:50.929376 kernel: On node 0, zone DMA: 96 pages in unavailable ranges Sep 12 17:41:50.929392 kernel: On node 0, zone DMA32: 784 pages in unavailable ranges Sep 12 17:41:50.929407 kernel: On node 0, zone DMA32: 13956 pages in unavailable ranges Sep 12 17:41:50.929423 kernel: ACPI: PM-Timer IO Port: 0xb008 Sep 12 17:41:50.929438 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) Sep 12 17:41:50.929456 kernel: IOAPIC[0]: apic_id 0, version 32, address 0xfec00000, GSI 0-23 Sep 12 17:41:50.929471 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) Sep 12 17:41:50.929486 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Sep 12 17:41:50.929501 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) Sep 12 17:41:50.929516 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) Sep 12 17:41:50.929530 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Sep 12 17:41:50.929542 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000 Sep 12 17:41:50.929553 kernel: TSC deadline timer available Sep 12 17:41:50.929564 kernel: CPU topo: Max. logical packages: 1 Sep 12 17:41:50.929576 kernel: CPU topo: Max. logical dies: 1 Sep 12 17:41:50.929591 kernel: CPU topo: Max. dies per package: 1 Sep 12 17:41:50.929603 kernel: CPU topo: Max. threads per core: 2 Sep 12 17:41:50.929615 kernel: CPU topo: Num. cores per package: 1 Sep 12 17:41:50.929629 kernel: CPU topo: Num. 
threads per package: 2 Sep 12 17:41:50.929642 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs Sep 12 17:41:50.929657 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write() Sep 12 17:41:50.929671 kernel: [mem 0x7ca00000-0xffffffff] available for PCI devices Sep 12 17:41:50.929686 kernel: Booting paravirtualized kernel on KVM Sep 12 17:41:50.929700 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Sep 12 17:41:50.929718 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1 Sep 12 17:41:50.929731 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576 Sep 12 17:41:50.929746 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152 Sep 12 17:41:50.929760 kernel: pcpu-alloc: [0] 0 1 Sep 12 17:41:50.929774 kernel: kvm-guest: PV spinlocks enabled Sep 12 17:41:50.929789 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear) Sep 12 17:41:50.929806 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=ec2 modprobe.blacklist=xen_fbfront net.ifnames=0 nvme_core.io_timeout=4294967295 verity.usrhash=271a44cc8ea1639cfb6fdf777202a5f025fda0b3ce9b293cc4e0e7047aecb858 Sep 12 17:41:50.929821 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Sep 12 17:41:50.929837 kernel: random: crng init done Sep 12 17:41:50.929851 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Sep 12 17:41:50.929866 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear) Sep 12 17:41:50.929879 kernel: Fallback order for Node 0: 0 Sep 12 17:41:50.929894 kernel: Built 1 zonelists, mobility grouping on. Total pages: 509451 Sep 12 17:41:50.929908 kernel: Policy zone: DMA32 Sep 12 17:41:50.929933 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Sep 12 17:41:50.929950 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 Sep 12 17:41:50.930751 kernel: Kernel/User page tables isolation: enabled Sep 12 17:41:50.930772 kernel: ftrace: allocating 40125 entries in 157 pages Sep 12 17:41:50.930787 kernel: ftrace: allocated 157 pages with 5 groups Sep 12 17:41:50.930807 kernel: Dynamic Preempt: voluntary Sep 12 17:41:50.930822 kernel: rcu: Preemptible hierarchical RCU implementation. Sep 12 17:41:50.930838 kernel: rcu: RCU event tracing is enabled. Sep 12 17:41:50.930852 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. Sep 12 17:41:50.930867 kernel: Trampoline variant of Tasks RCU enabled. Sep 12 17:41:50.930883 kernel: Rude variant of Tasks RCU enabled. Sep 12 17:41:50.930900 kernel: Tracing variant of Tasks RCU enabled. Sep 12 17:41:50.930916 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Sep 12 17:41:50.930931 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 Sep 12 17:41:50.930948 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Sep 12 17:41:50.930984 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Sep 12 17:41:50.931000 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. 
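The hash-table lines above follow a simple pattern: the reported byte size is the entry count times one bucket pointer (8 bytes on x86-64, an assumption here), and the order is the log2 of that size in 4 KiB pages. Checking the dentry-cache line (262144 entries, order 9, 2097152 bytes):

    # Sanity-check "Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes)",
    # assuming one 8-byte bucket pointer per entry and 4 KiB pages.
    import math

    entries = 262144
    size_bytes = entries * 8                   # 2097152
    order = int(math.log2(size_bytes / 4096))  # 9
    print(size_bytes, order)

The inode-cache line (131072 entries, order 8, 1048576 bytes) fits the same pattern.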
Sep 12 17:41:50.931015 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16 Sep 12 17:41:50.931030 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Sep 12 17:41:50.931049 kernel: Console: colour dummy device 80x25 Sep 12 17:41:50.931064 kernel: printk: legacy console [tty0] enabled Sep 12 17:41:50.931079 kernel: printk: legacy console [ttyS0] enabled Sep 12 17:41:50.931094 kernel: ACPI: Core revision 20240827 Sep 12 17:41:50.931110 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 30580167144 ns Sep 12 17:41:50.931126 kernel: APIC: Switch to symmetric I/O mode setup Sep 12 17:41:50.931142 kernel: x2apic enabled Sep 12 17:41:50.931157 kernel: APIC: Switched APIC routing to: physical x2apic Sep 12 17:41:50.931171 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x24093623c91, max_idle_ns: 440795291220 ns Sep 12 17:41:50.931188 kernel: Calibrating delay loop (skipped) preset value.. 4999.99 BogoMIPS (lpj=2499996) Sep 12 17:41:50.931200 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8 Sep 12 17:41:50.931213 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4 Sep 12 17:41:50.931227 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Sep 12 17:41:50.931241 kernel: Spectre V2 : Mitigation: Retpolines Sep 12 17:41:50.931254 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT Sep 12 17:41:50.931268 kernel: RETBleed: WARNING: Spectre v2 mitigation leaves CPU vulnerable to RETBleed attacks, data leaks possible! Sep 12 17:41:50.931283 kernel: RETBleed: Vulnerable Sep 12 17:41:50.931296 kernel: Speculative Store Bypass: Vulnerable Sep 12 17:41:50.931310 kernel: MDS: Vulnerable: Clear CPU buffers attempted, no microcode Sep 12 17:41:50.931324 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode Sep 12 17:41:50.931340 kernel: GDS: Unknown: Dependent on hypervisor status Sep 12 17:41:50.931354 kernel: active return thunk: its_return_thunk Sep 12 17:41:50.931367 kernel: ITS: Mitigation: Aligned branch/return thunks Sep 12 17:41:50.931381 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Sep 12 17:41:50.931395 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Sep 12 17:41:50.931409 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Sep 12 17:41:50.931424 kernel: x86/fpu: Supporting XSAVE feature 0x008: 'MPX bounds registers' Sep 12 17:41:50.931438 kernel: x86/fpu: Supporting XSAVE feature 0x010: 'MPX CSR' Sep 12 17:41:50.931452 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask' Sep 12 17:41:50.931466 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256' Sep 12 17:41:50.931480 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256' Sep 12 17:41:50.931497 kernel: x86/fpu: Supporting XSAVE feature 0x200: 'Protection Keys User registers' Sep 12 17:41:50.931510 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Sep 12 17:41:50.931524 kernel: x86/fpu: xstate_offset[3]: 832, xstate_sizes[3]: 64 Sep 12 17:41:50.931538 kernel: x86/fpu: xstate_offset[4]: 896, xstate_sizes[4]: 64 Sep 12 17:41:50.931551 kernel: x86/fpu: xstate_offset[5]: 960, xstate_sizes[5]: 64 Sep 12 17:41:50.931566 kernel: x86/fpu: xstate_offset[6]: 1024, xstate_sizes[6]: 512 Sep 12 17:41:50.931579 kernel: x86/fpu: xstate_offset[7]: 1536, xstate_sizes[7]: 1024 Sep 12 17:41:50.931593 kernel: x86/fpu: xstate_offset[9]: 2560, xstate_sizes[9]: 8 Sep 12 
17:41:50.931606 kernel: x86/fpu: Enabled xstate features 0x2ff, context size is 2568 bytes, using 'compacted' format. Sep 12 17:41:50.931620 kernel: Freeing SMP alternatives memory: 32K Sep 12 17:41:50.931633 kernel: pid_max: default: 32768 minimum: 301 Sep 12 17:41:50.931650 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Sep 12 17:41:50.931665 kernel: landlock: Up and running. Sep 12 17:41:50.931678 kernel: SELinux: Initializing. Sep 12 17:41:50.931692 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Sep 12 17:41:50.931706 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Sep 12 17:41:50.931720 kernel: smpboot: CPU0: Intel(R) Xeon(R) Platinum 8259CL CPU @ 2.50GHz (family: 0x6, model: 0x55, stepping: 0x7) Sep 12 17:41:50.931734 kernel: Performance Events: unsupported p6 CPU model 85 no PMU driver, software events only. Sep 12 17:41:50.931748 kernel: signal: max sigframe size: 3632 Sep 12 17:41:50.931762 kernel: rcu: Hierarchical SRCU implementation. Sep 12 17:41:50.931777 kernel: rcu: Max phase no-delay instances is 400. Sep 12 17:41:50.931795 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level Sep 12 17:41:50.931809 kernel: NMI watchdog: Perf NMI watchdog permanently disabled Sep 12 17:41:50.931823 kernel: smp: Bringing up secondary CPUs ... Sep 12 17:41:50.931837 kernel: smpboot: x86: Booting SMP configuration: Sep 12 17:41:50.931851 kernel: .... node #0, CPUs: #1 Sep 12 17:41:50.931866 kernel: MDS CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/mds.html for more details. Sep 12 17:41:50.931881 kernel: MMIO Stale Data CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/processor_mmio_stale_data.html for more details. Sep 12 17:41:50.931895 kernel: smp: Brought up 1 node, 2 CPUs Sep 12 17:41:50.931909 kernel: smpboot: Total of 2 processors activated (9999.98 BogoMIPS) Sep 12 17:41:50.931926 kernel: Memory: 1908056K/2037804K available (14336K kernel code, 2432K rwdata, 9960K rodata, 54040K init, 2924K bss, 125192K reserved, 0K cma-reserved) Sep 12 17:41:50.931940 kernel: devtmpfs: initialized Sep 12 17:41:50.932989 kernel: x86/mm: Memory block size: 128MB Sep 12 17:41:50.933021 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7895e000-0x789ddfff] (524288 bytes) Sep 12 17:41:50.933043 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Sep 12 17:41:50.933056 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) Sep 12 17:41:50.933072 kernel: pinctrl core: initialized pinctrl subsystem Sep 12 17:41:50.933087 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Sep 12 17:41:50.933103 kernel: audit: initializing netlink subsys (disabled) Sep 12 17:41:50.933123 kernel: audit: type=2000 audit(1757698908.242:1): state=initialized audit_enabled=0 res=1 Sep 12 17:41:50.933137 kernel: thermal_sys: Registered thermal governor 'step_wise' Sep 12 17:41:50.933150 kernel: thermal_sys: Registered thermal governor 'user_space' Sep 12 17:41:50.933164 kernel: cpuidle: using governor menu Sep 12 17:41:50.933179 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Sep 12 17:41:50.933194 kernel: dca service started, version 1.12.1 Sep 12 17:41:50.933208 kernel: PCI: Using configuration type 1 for base access Sep 12 17:41:50.933223 kernel: kprobes: kprobe jump-optimization is enabled. 
All kprobes are optimized if possible. Sep 12 17:41:50.933236 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Sep 12 17:41:50.933254 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Sep 12 17:41:50.933267 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Sep 12 17:41:50.933283 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Sep 12 17:41:50.933301 kernel: ACPI: Added _OSI(Module Device) Sep 12 17:41:50.933315 kernel: ACPI: Added _OSI(Processor Device) Sep 12 17:41:50.933329 kernel: ACPI: Added _OSI(Processor Aggregator Device) Sep 12 17:41:50.933341 kernel: ACPI: 3 ACPI AML tables successfully acquired and loaded Sep 12 17:41:50.933354 kernel: ACPI: Interpreter enabled Sep 12 17:41:50.933366 kernel: ACPI: PM: (supports S0 S5) Sep 12 17:41:50.933382 kernel: ACPI: Using IOAPIC for interrupt routing Sep 12 17:41:50.933395 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Sep 12 17:41:50.933408 kernel: PCI: Using E820 reservations for host bridge windows Sep 12 17:41:50.933422 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F Sep 12 17:41:50.933437 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Sep 12 17:41:50.933667 kernel: acpi PNP0A03:00: _OSC: OS supports [ASPM ClockPM Segments MSI HPX-Type3] Sep 12 17:41:50.933804 kernel: acpi PNP0A03:00: _OSC: not requesting OS control; OS requires [ExtendedConfig ASPM ClockPM MSI] Sep 12 17:41:50.933944 kernel: acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended configuration space under this bridge Sep 12 17:41:50.933984 kernel: acpiphp: Slot [3] registered Sep 12 17:41:50.934000 kernel: acpiphp: Slot [4] registered Sep 12 17:41:50.934015 kernel: acpiphp: Slot [5] registered Sep 12 17:41:50.934029 kernel: acpiphp: Slot [6] registered Sep 12 17:41:50.934044 kernel: acpiphp: Slot [7] registered Sep 12 17:41:50.934059 kernel: acpiphp: Slot [8] registered Sep 12 17:41:50.934075 kernel: acpiphp: Slot [9] registered Sep 12 17:41:50.934089 kernel: acpiphp: Slot [10] registered Sep 12 17:41:50.934107 kernel: acpiphp: Slot [11] registered Sep 12 17:41:50.934120 kernel: acpiphp: Slot [12] registered Sep 12 17:41:50.934133 kernel: acpiphp: Slot [13] registered Sep 12 17:41:50.934146 kernel: acpiphp: Slot [14] registered Sep 12 17:41:50.934160 kernel: acpiphp: Slot [15] registered Sep 12 17:41:50.934174 kernel: acpiphp: Slot [16] registered Sep 12 17:41:50.934188 kernel: acpiphp: Slot [17] registered Sep 12 17:41:50.934202 kernel: acpiphp: Slot [18] registered Sep 12 17:41:50.934216 kernel: acpiphp: Slot [19] registered Sep 12 17:41:50.934231 kernel: acpiphp: Slot [20] registered Sep 12 17:41:50.934247 kernel: acpiphp: Slot [21] registered Sep 12 17:41:50.934260 kernel: acpiphp: Slot [22] registered Sep 12 17:41:50.934274 kernel: acpiphp: Slot [23] registered Sep 12 17:41:50.934286 kernel: acpiphp: Slot [24] registered Sep 12 17:41:50.934299 kernel: acpiphp: Slot [25] registered Sep 12 17:41:50.934312 kernel: acpiphp: Slot [26] registered Sep 12 17:41:50.934325 kernel: acpiphp: Slot [27] registered Sep 12 17:41:50.934339 kernel: acpiphp: Slot [28] registered Sep 12 17:41:50.934354 kernel: acpiphp: Slot [29] registered Sep 12 17:41:50.934371 kernel: acpiphp: Slot [30] registered Sep 12 17:41:50.934384 kernel: acpiphp: Slot [31] registered Sep 12 17:41:50.934399 kernel: PCI host bridge to bus 0000:00 Sep 12 17:41:50.934566 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Sep 12 
17:41:50.934693 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] Sep 12 17:41:50.934817 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Sep 12 17:41:50.934944 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xfebfffff window] Sep 12 17:41:50.937227 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x2000ffffffff window] Sep 12 17:41:50.937354 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Sep 12 17:41:50.937521 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint Sep 12 17:41:50.937686 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint Sep 12 17:41:50.937834 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x000000 conventional PCI endpoint Sep 12 17:41:50.937994 kernel: pci 0000:00:01.3: quirk: [io 0xb000-0xb03f] claimed by PIIX4 ACPI Sep 12 17:41:50.938133 kernel: pci 0000:00:01.3: PIIX4 devres E PIO at fff0-ffff Sep 12 17:41:50.938273 kernel: pci 0000:00:01.3: PIIX4 devres F MMIO at ffc00000-ffffffff Sep 12 17:41:50.938415 kernel: pci 0000:00:01.3: PIIX4 devres G PIO at fff0-ffff Sep 12 17:41:50.938560 kernel: pci 0000:00:01.3: PIIX4 devres H MMIO at ffc00000-ffffffff Sep 12 17:41:50.938708 kernel: pci 0000:00:01.3: PIIX4 devres I PIO at fff0-ffff Sep 12 17:41:50.938854 kernel: pci 0000:00:01.3: PIIX4 devres J PIO at fff0-ffff Sep 12 17:41:50.941081 kernel: pci 0000:00:03.0: [1d0f:1111] type 00 class 0x030000 conventional PCI endpoint Sep 12 17:41:50.941244 kernel: pci 0000:00:03.0: BAR 0 [mem 0x80000000-0x803fffff pref] Sep 12 17:41:50.941380 kernel: pci 0000:00:03.0: ROM [mem 0xffff0000-0xffffffff pref] Sep 12 17:41:50.941510 kernel: pci 0000:00:03.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Sep 12 17:41:50.941645 kernel: pci 0000:00:04.0: [1d0f:8061] type 00 class 0x010802 PCIe Endpoint Sep 12 17:41:50.941774 kernel: pci 0000:00:04.0: BAR 0 [mem 0x80404000-0x80407fff] Sep 12 17:41:50.941916 kernel: pci 0000:00:05.0: [1d0f:ec20] type 00 class 0x020000 PCIe Endpoint Sep 12 17:41:50.942059 kernel: pci 0000:00:05.0: BAR 0 [mem 0x80400000-0x80403fff] Sep 12 17:41:50.942081 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 Sep 12 17:41:50.942096 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 Sep 12 17:41:50.942110 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Sep 12 17:41:50.942124 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 Sep 12 17:41:50.942138 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9 Sep 12 17:41:50.942152 kernel: iommu: Default domain type: Translated Sep 12 17:41:50.942166 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Sep 12 17:41:50.942180 kernel: efivars: Registered efivars operations Sep 12 17:41:50.942194 kernel: PCI: Using ACPI for IRQ routing Sep 12 17:41:50.942212 kernel: PCI: pci_cache_line_size set to 64 bytes Sep 12 17:41:50.942226 kernel: e820: reserve RAM buffer [mem 0x768c0018-0x77ffffff] Sep 12 17:41:50.942239 kernel: e820: reserve RAM buffer [mem 0x786ce000-0x7bffffff] Sep 12 17:41:50.942254 kernel: e820: reserve RAM buffer [mem 0x7c97c000-0x7fffffff] Sep 12 17:41:50.942378 kernel: pci 0000:00:03.0: vgaarb: setting as boot VGA device Sep 12 17:41:50.942504 kernel: pci 0000:00:03.0: vgaarb: bridge control possible Sep 12 17:41:50.942629 kernel: pci 0000:00:03.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Sep 12 17:41:50.942648 kernel: vgaarb: loaded Sep 12 17:41:50.942668 kernel: hpet0: at 
MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0 Sep 12 17:41:50.942683 kernel: hpet0: 8 comparators, 32-bit 62.500000 MHz counter Sep 12 17:41:50.942699 kernel: clocksource: Switched to clocksource kvm-clock Sep 12 17:41:50.942714 kernel: VFS: Disk quotas dquot_6.6.0 Sep 12 17:41:50.942730 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Sep 12 17:41:50.942746 kernel: pnp: PnP ACPI init Sep 12 17:41:50.942762 kernel: pnp: PnP ACPI: found 5 devices Sep 12 17:41:50.942777 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Sep 12 17:41:50.942792 kernel: NET: Registered PF_INET protocol family Sep 12 17:41:50.942810 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear) Sep 12 17:41:50.942826 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear) Sep 12 17:41:50.942842 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Sep 12 17:41:50.942858 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear) Sep 12 17:41:50.942874 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear) Sep 12 17:41:50.942890 kernel: TCP: Hash tables configured (established 16384 bind 16384) Sep 12 17:41:50.942906 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear) Sep 12 17:41:50.942921 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear) Sep 12 17:41:50.942937 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Sep 12 17:41:50.944988 kernel: NET: Registered PF_XDP protocol family Sep 12 17:41:50.945165 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Sep 12 17:41:50.945291 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Sep 12 17:41:50.945413 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Sep 12 17:41:50.945534 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xfebfffff window] Sep 12 17:41:50.945654 kernel: pci_bus 0000:00: resource 8 [mem 0x100000000-0x2000ffffffff window] Sep 12 17:41:50.945799 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers Sep 12 17:41:50.945820 kernel: PCI: CLS 0 bytes, default 64 Sep 12 17:41:50.945843 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Sep 12 17:41:50.945860 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x24093623c91, max_idle_ns: 440795291220 ns Sep 12 17:41:50.945876 kernel: clocksource: Switched to clocksource tsc Sep 12 17:41:50.945892 kernel: Initialise system trusted keyrings Sep 12 17:41:50.945907 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Sep 12 17:41:50.945923 kernel: Key type asymmetric registered Sep 12 17:41:50.945939 kernel: Asymmetric key parser 'x509' registered Sep 12 17:41:50.945967 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Sep 12 17:41:50.947045 kernel: io scheduler mq-deadline registered Sep 12 17:41:50.947064 kernel: io scheduler kyber registered Sep 12 17:41:50.947077 kernel: io scheduler bfq registered Sep 12 17:41:50.947090 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Sep 12 17:41:50.947103 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Sep 12 17:41:50.947118 kernel: 00:04: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Sep 12 17:41:50.947133 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Sep 12 17:41:50.947148 kernel: i8042: Warning: Keylock active Sep 12 
17:41:50.947161 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Sep 12 17:41:50.947175 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Sep 12 17:41:50.947367 kernel: rtc_cmos 00:00: RTC can wake from S4 Sep 12 17:41:50.947511 kernel: rtc_cmos 00:00: registered as rtc0 Sep 12 17:41:50.947644 kernel: rtc_cmos 00:00: setting system clock to 2025-09-12T17:41:50 UTC (1757698910) Sep 12 17:41:50.947770 kernel: rtc_cmos 00:00: alarms up to one day, 114 bytes nvram Sep 12 17:41:50.947813 kernel: intel_pstate: CPU model not supported Sep 12 17:41:50.947834 kernel: efifb: probing for efifb Sep 12 17:41:50.947850 kernel: efifb: framebuffer at 0x80000000, using 1876k, total 1875k Sep 12 17:41:50.947868 kernel: efifb: mode is 800x600x32, linelength=3200, pages=1 Sep 12 17:41:50.947883 kernel: efifb: scrolling: redraw Sep 12 17:41:50.947899 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 Sep 12 17:41:50.947914 kernel: Console: switching to colour frame buffer device 100x37 Sep 12 17:41:50.947930 kernel: fb0: EFI VGA frame buffer device Sep 12 17:41:50.947946 kernel: pstore: Using crash dump compression: deflate Sep 12 17:41:50.948927 kernel: pstore: Registered efi_pstore as persistent store backend Sep 12 17:41:50.948948 kernel: NET: Registered PF_INET6 protocol family Sep 12 17:41:50.948977 kernel: Segment Routing with IPv6 Sep 12 17:41:50.948992 kernel: In-situ OAM (IOAM) with IPv6 Sep 12 17:41:50.949013 kernel: NET: Registered PF_PACKET protocol family Sep 12 17:41:50.949030 kernel: Key type dns_resolver registered Sep 12 17:41:50.949046 kernel: IPI shorthand broadcast: enabled Sep 12 17:41:50.949061 kernel: sched_clock: Marking stable (2658039911, 151222740)->(2903655965, -94393314) Sep 12 17:41:50.949078 kernel: registered taskstats version 1 Sep 12 17:41:50.949094 kernel: Loading compiled-in X.509 certificates Sep 12 17:41:50.949111 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.47-flatcar: f1ae8d6e9bfae84d90f4136cf098b0465b2a5bd7' Sep 12 17:41:50.949127 kernel: Demotion targets for Node 0: null Sep 12 17:41:50.949143 kernel: Key type .fscrypt registered Sep 12 17:41:50.949162 kernel: Key type fscrypt-provisioning registered Sep 12 17:41:50.949179 kernel: ima: No TPM chip found, activating TPM-bypass! Sep 12 17:41:50.949195 kernel: ima: Allocated hash algorithm: sha1 Sep 12 17:41:50.949212 kernel: ima: No architecture policies found Sep 12 17:41:50.949229 kernel: clk: Disabling unused clocks Sep 12 17:41:50.949247 kernel: Warning: unable to open an initial console. Sep 12 17:41:50.949263 kernel: Freeing unused kernel image (initmem) memory: 54040K Sep 12 17:41:50.949280 kernel: Write protecting the kernel read-only data: 24576k Sep 12 17:41:50.949300 kernel: Freeing unused kernel image (rodata/data gap) memory: 280K Sep 12 17:41:50.949319 kernel: Run /init as init process Sep 12 17:41:50.949335 kernel: with arguments: Sep 12 17:41:50.949352 kernel: /init Sep 12 17:41:50.949368 kernel: with environment: Sep 12 17:41:50.949387 kernel: HOME=/ Sep 12 17:41:50.949406 kernel: TERM=linux Sep 12 17:41:50.949423 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Sep 12 17:41:50.949442 systemd[1]: Successfully made /usr/ read-only. 
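The rtc_cmos line above pairs the wall-clock time with its Unix epoch value (1757698910). The correspondence is easy to confirm:

    # Confirm that epoch 1757698910 is 2025-09-12T17:41:50 UTC, as logged by rtc_cmos.
    from datetime import datetime, timezone

    print(datetime.fromtimestamp(1757698910, tz=timezone.utc).isoformat())
    # 2025-09-12T17:41:50+00:00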
Sep 12 17:41:50.949464 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 12 17:41:50.949483 systemd[1]: Detected virtualization amazon. Sep 12 17:41:50.949500 systemd[1]: Detected architecture x86-64. Sep 12 17:41:50.949516 systemd[1]: Running in initrd. Sep 12 17:41:50.949536 systemd[1]: No hostname configured, using default hostname. Sep 12 17:41:50.949554 systemd[1]: Hostname set to . Sep 12 17:41:50.949570 systemd[1]: Initializing machine ID from VM UUID. Sep 12 17:41:50.949588 systemd[1]: Queued start job for default target initrd.target. Sep 12 17:41:50.949605 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 12 17:41:50.949623 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 17:41:50.949642 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Sep 12 17:41:50.949659 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 12 17:41:50.949679 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Sep 12 17:41:50.949698 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Sep 12 17:41:50.949718 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Sep 12 17:41:50.949735 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Sep 12 17:41:50.949752 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 17:41:50.949770 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 12 17:41:50.949787 systemd[1]: Reached target paths.target - Path Units. Sep 12 17:41:50.949806 systemd[1]: Reached target slices.target - Slice Units. Sep 12 17:41:50.949824 systemd[1]: Reached target swap.target - Swaps. Sep 12 17:41:50.949842 systemd[1]: Reached target timers.target - Timer Units. Sep 12 17:41:50.949859 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Sep 12 17:41:50.949876 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 12 17:41:50.949894 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Sep 12 17:41:50.949911 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Sep 12 17:41:50.949928 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 12 17:41:50.949946 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 12 17:41:50.949978 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 17:41:50.949996 systemd[1]: Reached target sockets.target - Socket Units. Sep 12 17:41:50.950014 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Sep 12 17:41:50.950031 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 12 17:41:50.950048 systemd[1]: Finished network-cleanup.service - Network Cleanup. 
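The device unit names above (e.g. dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device for /dev/disk/by-label/EFI-SYSTEM) are systemd's path escaping: slashes become dashes and other special characters become \xNN. A rough approximation of that escaping (systemd-escape --path is the real tool, which has additional rules not shown here):

    # Approximate systemd path escaping:
    # /dev/disk/by-label/EFI-SYSTEM -> dev-disk-by\x2dlabel-EFI\x2dSYSTEM
    def systemd_escape_path(path: str) -> str:
        parts = [p for p in path.split("/") if p]
        escaped = [
            "".join(c if c.isalnum() or c in "_." else "\\x%02x" % ord(c) for c in p)
            for p in parts
        ]
        return "-".join(escaped)

    print(systemd_escape_path("/dev/disk/by-label/EFI-SYSTEM") + ".device")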
Sep 12 17:41:50.950067 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Sep 12 17:41:50.950084 systemd[1]: Starting systemd-fsck-usr.service... Sep 12 17:41:50.950102 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 12 17:41:50.950123 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 12 17:41:50.950140 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 17:41:50.950157 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Sep 12 17:41:50.950211 systemd-journald[207]: Collecting audit messages is disabled. Sep 12 17:41:50.950254 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 17:41:50.950273 systemd[1]: Finished systemd-fsck-usr.service. Sep 12 17:41:50.950292 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 12 17:41:50.950311 systemd-journald[207]: Journal started Sep 12 17:41:50.950350 systemd-journald[207]: Runtime Journal (/run/log/journal/ec28ad47c1acf53483681acdf0affed4) is 4.8M, max 38.4M, 33.6M free. Sep 12 17:41:50.934302 systemd-modules-load[208]: Inserted module 'overlay' Sep 12 17:41:50.960718 systemd[1]: Started systemd-journald.service - Journal Service. Sep 12 17:41:50.965741 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 17:41:50.978129 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 12 17:41:50.991822 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Sep 12 17:41:50.991869 kernel: Bridge firewalling registered Sep 12 17:41:50.984356 systemd-modules-load[208]: Inserted module 'br_netfilter' Sep 12 17:41:50.985249 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 12 17:41:50.992640 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 12 17:41:50.995812 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 12 17:41:51.002114 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 12 17:41:51.005177 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Sep 12 17:41:51.006820 systemd-tmpfiles[224]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Sep 12 17:41:51.014531 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 12 17:41:51.019983 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 17:41:51.035380 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 12 17:41:51.037476 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 12 17:41:51.042183 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Sep 12 17:41:51.045946 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 12 17:41:51.049531 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. 
Sep 12 17:41:51.072447 dracut-cmdline[245]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=ec2 modprobe.blacklist=xen_fbfront net.ifnames=0 nvme_core.io_timeout=4294967295 verity.usrhash=271a44cc8ea1639cfb6fdf777202a5f025fda0b3ce9b293cc4e0e7047aecb858 Sep 12 17:41:51.106405 systemd-resolved[246]: Positive Trust Anchors: Sep 12 17:41:51.106423 systemd-resolved[246]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 12 17:41:51.106488 systemd-resolved[246]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 12 17:41:51.114700 systemd-resolved[246]: Defaulting to hostname 'linux'. Sep 12 17:41:51.117795 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 12 17:41:51.118513 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 12 17:41:51.171992 kernel: SCSI subsystem initialized Sep 12 17:41:51.182020 kernel: Loading iSCSI transport class v2.0-870. Sep 12 17:41:51.192999 kernel: iscsi: registered transport (tcp) Sep 12 17:41:51.215214 kernel: iscsi: registered transport (qla4xxx) Sep 12 17:41:51.215296 kernel: QLogic iSCSI HBA Driver Sep 12 17:41:51.234740 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 12 17:41:51.250754 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 12 17:41:51.254409 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 12 17:41:51.298371 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Sep 12 17:41:51.300896 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Sep 12 17:41:51.356994 kernel: raid6: avx512x4 gen() 17882 MB/s Sep 12 17:41:51.374983 kernel: raid6: avx512x2 gen() 17145 MB/s Sep 12 17:41:51.392983 kernel: raid6: avx512x1 gen() 17818 MB/s Sep 12 17:41:51.410983 kernel: raid6: avx2x4 gen() 17736 MB/s Sep 12 17:41:51.428984 kernel: raid6: avx2x2 gen() 17768 MB/s Sep 12 17:41:51.447209 kernel: raid6: avx2x1 gen() 13661 MB/s Sep 12 17:41:51.447276 kernel: raid6: using algorithm avx512x4 gen() 17882 MB/s Sep 12 17:41:51.466255 kernel: raid6: .... xor() 7838 MB/s, rmw enabled Sep 12 17:41:51.466321 kernel: raid6: using avx512x2 recovery algorithm Sep 12 17:41:51.488000 kernel: xor: automatically using best checksumming function avx Sep 12 17:41:51.656995 kernel: Btrfs loaded, zoned=no, fsverity=no Sep 12 17:41:51.663867 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Sep 12 17:41:51.666371 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 12 17:41:51.692925 systemd-udevd[456]: Using default interface naming scheme 'v255'. 
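The raid6 lines above show the kernel benchmarking each available gen() implementation and then picking the fastest (avx512x4 at 17882 MB/s). The choice is just an argmax over the measured throughputs:

    # Reproduce "raid6: using algorithm avx512x4 gen() 17882 MB/s" from the benchmark lines above.
    results = {
        "avx512x4": 17882, "avx512x2": 17145, "avx512x1": 17818,
        "avx2x4": 17736, "avx2x2": 17768, "avx2x1": 13661,
    }
    best = max(results, key=results.get)
    print(best, results[best])   # avx512x4 17882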
Sep 12 17:41:51.699549 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 17:41:51.702508 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Sep 12 17:41:51.726596 dracut-pre-trigger[462]: rd.md=0: removing MD RAID activation Sep 12 17:41:51.731804 kernel: input: ImPS/2 Generic Wheel Mouse as /devices/platform/i8042/serio1/input/input3 Sep 12 17:41:51.757271 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Sep 12 17:41:51.760110 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 12 17:41:51.824606 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 12 17:41:51.827119 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Sep 12 17:41:51.916117 kernel: ena 0000:00:05.0: ENA device version: 0.10 Sep 12 17:41:51.916401 kernel: ena 0000:00:05.0: ENA controller version: 0.0.1 implementation version 1 Sep 12 17:41:51.924130 kernel: cryptd: max_cpu_qlen set to 1000 Sep 12 17:41:51.924194 kernel: ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy. Sep 12 17:41:51.934145 kernel: AES CTR mode by8 optimization enabled Sep 12 17:41:51.971105 kernel: ena 0000:00:05.0: Elastic Network Adapter (ENA) found at mem 80400000, mac addr 06:d9:f7:5c:05:e5 Sep 12 17:41:51.977979 kernel: nvme nvme0: pci function 0000:00:04.0 Sep 12 17:41:51.978244 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11 Sep 12 17:41:51.979140 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 17:41:51.980256 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 17:41:51.981890 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 17:41:51.986236 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 17:41:51.988194 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 12 17:41:51.991831 kernel: nvme nvme0: 2/0/0 default/read/poll queues Sep 12 17:41:52.000636 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Sep 12 17:41:52.000698 kernel: GPT:9289727 != 16777215 Sep 12 17:41:52.000718 kernel: GPT:Alternate GPT header not at the end of the disk. Sep 12 17:41:52.002129 kernel: GPT:9289727 != 16777215 Sep 12 17:41:52.002182 kernel: GPT: Use GNU Parted to correct GPT errors. Sep 12 17:41:52.002927 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 17:41:52.005097 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 Sep 12 17:41:52.003065 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 17:41:52.007750 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 17:41:52.008337 (udev-worker)[521]: Network interface NamePolicy= disabled on kernel command line. Sep 12 17:41:52.035909 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 17:41:52.046983 kernel: nvme nvme0: using unchecked data buffer Sep 12 17:41:52.182052 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Amazon Elastic Block Store EFI-SYSTEM. Sep 12 17:41:52.193207 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 12 17:41:52.204557 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Amazon Elastic Block Store ROOT. Sep 12 17:41:52.222094 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Amazon Elastic Block Store USR-A. 
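The GPT warnings above ("GPT:9289727 != 16777215") say the backup GPT header sits where the original disk image ended rather than at the end of the EBS volume; the disk-uuid step later in the log rewrites the headers. Assuming 512-byte sectors, the two LBAs translate to:

    # Interpret "GPT:9289727 != 16777215" (backup-header LBA vs. last LBA), 512-byte sectors assumed.
    sector = 512
    print((9289727 + 1) * sector / 2**30)    # ~4.43 GiB: where the image's GPT thinks the disk ends
    print((16777215 + 1) * sector / 2**30)   # 8.0 GiB: actual volume size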
Sep 12 17:41:52.222639 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Amazon Elastic Block Store USR-A. Sep 12 17:41:52.234236 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Amazon Elastic Block Store OEM. Sep 12 17:41:52.234893 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 12 17:41:52.236155 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 17:41:52.237394 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 12 17:41:52.239118 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 12 17:41:52.242036 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 12 17:41:52.260507 disk-uuid[698]: Primary Header is updated. Sep 12 17:41:52.260507 disk-uuid[698]: Secondary Entries is updated. Sep 12 17:41:52.260507 disk-uuid[698]: Secondary Header is updated. Sep 12 17:41:52.267143 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 12 17:41:52.270013 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 Sep 12 17:41:53.283112 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 Sep 12 17:41:53.283170 disk-uuid[703]: The operation has completed successfully. Sep 12 17:41:53.401694 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 12 17:41:53.401798 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 12 17:41:53.438853 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 12 17:41:53.460688 sh[966]: Success Sep 12 17:41:53.489438 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Sep 12 17:41:53.489517 kernel: device-mapper: uevent: version 1.0.3 Sep 12 17:41:53.489540 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 12 17:41:53.501976 kernel: device-mapper: verity: sha256 using shash "sha256-avx2" Sep 12 17:41:53.602865 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 12 17:41:53.605560 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 12 17:41:53.615012 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Sep 12 17:41:53.632983 kernel: BTRFS: device fsid 74707491-1b86-4926-8bdb-c533ce2a0c32 devid 1 transid 38 /dev/mapper/usr (254:0) scanned by mount (989) Sep 12 17:41:53.635984 kernel: BTRFS info (device dm-0): first mount of filesystem 74707491-1b86-4926-8bdb-c533ce2a0c32 Sep 12 17:41:53.636044 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Sep 12 17:41:53.738769 kernel: BTRFS info (device dm-0): enabling ssd optimizations Sep 12 17:41:53.738851 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 12 17:41:53.738865 kernel: BTRFS info (device dm-0): enabling free space tree Sep 12 17:41:53.766178 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 12 17:41:53.767405 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Sep 12 17:41:53.768207 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 12 17:41:53.769414 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 12 17:41:53.773126 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... 
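verity-setup above opens the USR-A partition as /dev/mapper/usr, verified against the Merkle-tree root hash passed on the kernel command line (verity.usrhash=271a44cc...). As a much-simplified illustration of the idea only (real dm-verity adds a salt, a superblock and a fixed on-disk hash-tree layout, so this does not reproduce that hash):

    # Toy Merkle root over 4 KiB blocks -- illustrates the dm-verity concept only.
    import hashlib

    def merkle_root(data: bytes, block: int = 4096) -> str:
        level = [hashlib.sha256(data[i:i + block]).digest()
                 for i in range(0, len(data), block)] or [hashlib.sha256(b"").digest()]
        while len(level) > 1:
            level = [hashlib.sha256(b"".join(level[i:i + 2])).digest()
                     for i in range(0, len(level), 2)]
        return level[0].hex()

    print(merkle_root(b"\0" * 16384))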
Sep 12 17:41:53.811044 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/nvme0n1p6 (259:5) scanned by mount (1023) Sep 12 17:41:53.817953 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 5410dae6-8d31-4ea4-a4b4-868064445761 Sep 12 17:41:53.818042 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-intel) checksum algorithm Sep 12 17:41:53.839143 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations Sep 12 17:41:53.839221 kernel: BTRFS info (device nvme0n1p6): enabling free space tree Sep 12 17:41:53.846984 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem 5410dae6-8d31-4ea4-a4b4-868064445761 Sep 12 17:41:53.849775 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 12 17:41:53.854187 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 12 17:41:53.880560 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 12 17:41:53.883139 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 12 17:41:53.926584 systemd-networkd[1158]: lo: Link UP Sep 12 17:41:53.926597 systemd-networkd[1158]: lo: Gained carrier Sep 12 17:41:53.928430 systemd-networkd[1158]: Enumeration completed Sep 12 17:41:53.928873 systemd-networkd[1158]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 17:41:53.928879 systemd-networkd[1158]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 12 17:41:53.929232 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 12 17:41:53.931347 systemd[1]: Reached target network.target - Network. Sep 12 17:41:53.932054 systemd-networkd[1158]: eth0: Link UP Sep 12 17:41:53.932060 systemd-networkd[1158]: eth0: Gained carrier Sep 12 17:41:53.932075 systemd-networkd[1158]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 17:41:53.944076 systemd-networkd[1158]: eth0: DHCPv4 address 172.31.17.147/20, gateway 172.31.16.1 acquired from 172.31.16.1 Sep 12 17:41:54.344574 ignition[1127]: Ignition 2.21.0 Sep 12 17:41:54.344590 ignition[1127]: Stage: fetch-offline Sep 12 17:41:54.345049 ignition[1127]: no configs at "/usr/lib/ignition/base.d" Sep 12 17:41:54.345062 ignition[1127]: no config dir at "/usr/lib/ignition/base.platform.d/aws" Sep 12 17:41:54.348157 ignition[1127]: Ignition finished successfully Sep 12 17:41:54.350163 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Sep 12 17:41:54.351671 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
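The DHCPv4 lease above (172.31.17.147/20 with gateway 172.31.16.1) places the instance in the 172.31.16.0/20 subnet, which is easy to verify:

    # Subnet implied by "DHCPv4 address 172.31.17.147/20, gateway 172.31.16.1".
    import ipaddress

    iface = ipaddress.ip_interface("172.31.17.147/20")
    print(iface.network)                                         # 172.31.16.0/20
    print(ipaddress.ip_address("172.31.16.1") in iface.network)  # True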
Sep 12 17:41:54.375554 ignition[1169]: Ignition 2.21.0 Sep 12 17:41:54.375571 ignition[1169]: Stage: fetch Sep 12 17:41:54.375980 ignition[1169]: no configs at "/usr/lib/ignition/base.d" Sep 12 17:41:54.375995 ignition[1169]: no config dir at "/usr/lib/ignition/base.platform.d/aws" Sep 12 17:41:54.376113 ignition[1169]: PUT http://169.254.169.254/latest/api/token: attempt #1 Sep 12 17:41:54.386611 ignition[1169]: PUT result: OK Sep 12 17:41:54.389313 ignition[1169]: parsed url from cmdline: "" Sep 12 17:41:54.389322 ignition[1169]: no config URL provided Sep 12 17:41:54.389331 ignition[1169]: reading system config file "/usr/lib/ignition/user.ign" Sep 12 17:41:54.389343 ignition[1169]: no config at "/usr/lib/ignition/user.ign" Sep 12 17:41:54.389373 ignition[1169]: PUT http://169.254.169.254/latest/api/token: attempt #1 Sep 12 17:41:54.390942 ignition[1169]: PUT result: OK Sep 12 17:41:54.391011 ignition[1169]: GET http://169.254.169.254/2019-10-01/user-data: attempt #1 Sep 12 17:41:54.391607 ignition[1169]: GET result: OK Sep 12 17:41:54.391719 ignition[1169]: parsing config with SHA512: c24ea242d459cb143e76efbb7c482cecb6ff55ef37003ebd467d09732db1acfc19e2a890287b37587d25f56395dbef7cd0c754392144cec490255676c8c0e7b9 Sep 12 17:41:54.396104 unknown[1169]: fetched base config from "system" Sep 12 17:41:54.396115 unknown[1169]: fetched base config from "system" Sep 12 17:41:54.396691 ignition[1169]: fetch: fetch complete Sep 12 17:41:54.396122 unknown[1169]: fetched user config from "aws" Sep 12 17:41:54.396699 ignition[1169]: fetch: fetch passed Sep 12 17:41:54.396842 ignition[1169]: Ignition finished successfully Sep 12 17:41:54.400098 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Sep 12 17:41:54.401609 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Sep 12 17:41:54.430688 ignition[1176]: Ignition 2.21.0 Sep 12 17:41:54.430711 ignition[1176]: Stage: kargs Sep 12 17:41:54.431507 ignition[1176]: no configs at "/usr/lib/ignition/base.d" Sep 12 17:41:54.431527 ignition[1176]: no config dir at "/usr/lib/ignition/base.platform.d/aws" Sep 12 17:41:54.431741 ignition[1176]: PUT http://169.254.169.254/latest/api/token: attempt #1 Sep 12 17:41:54.433810 ignition[1176]: PUT result: OK Sep 12 17:41:54.437267 ignition[1176]: kargs: kargs passed Sep 12 17:41:54.437343 ignition[1176]: Ignition finished successfully Sep 12 17:41:54.439619 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 12 17:41:54.441267 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Sep 12 17:41:54.467487 ignition[1182]: Ignition 2.21.0 Sep 12 17:41:54.467503 ignition[1182]: Stage: disks Sep 12 17:41:54.467896 ignition[1182]: no configs at "/usr/lib/ignition/base.d" Sep 12 17:41:54.467908 ignition[1182]: no config dir at "/usr/lib/ignition/base.platform.d/aws" Sep 12 17:41:54.468055 ignition[1182]: PUT http://169.254.169.254/latest/api/token: attempt #1 Sep 12 17:41:54.469196 ignition[1182]: PUT result: OK Sep 12 17:41:54.472418 ignition[1182]: disks: disks passed Sep 12 17:41:54.472484 ignition[1182]: Ignition finished successfully Sep 12 17:41:54.475009 systemd[1]: Finished ignition-disks.service - Ignition (disks). Sep 12 17:41:54.476073 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Sep 12 17:41:54.476478 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 12 17:41:54.477217 systemd[1]: Reached target local-fs.target - Local File Systems. 
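The fetch stage above talks to the EC2 instance metadata service the IMDSv2 way: PUT a session token, then GET user-data (here via the 2019-10-01 path) with that token, and hash the returned config. A sketch of that exchange (the token header names are standard IMDSv2 and do not appear in the log itself):

    # Roughly the IMDSv2 exchange visible in the Ignition fetch stage above.
    import urllib.request

    token_req = urllib.request.Request(
        "http://169.254.169.254/latest/api/token",
        method="PUT",
        headers={"X-aws-ec2-metadata-token-ttl-seconds": "21600"},
    )
    token = urllib.request.urlopen(token_req, timeout=5).read().decode()

    userdata_req = urllib.request.Request(
        "http://169.254.169.254/2019-10-01/user-data",
        headers={"X-aws-ec2-metadata-token": token},
    )
    print(urllib.request.urlopen(userdata_req, timeout=5).read().decode())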
Sep 12 17:41:54.477800 systemd[1]: Reached target sysinit.target - System Initialization. Sep 12 17:41:54.478402 systemd[1]: Reached target basic.target - Basic System. Sep 12 17:41:54.480154 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Sep 12 17:41:54.533725 systemd-fsck[1190]: ROOT: clean, 15/553520 files, 52789/553472 blocks Sep 12 17:41:54.537427 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 12 17:41:54.539675 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 12 17:41:54.703994 kernel: EXT4-fs (nvme0n1p9): mounted filesystem 26739aba-b0be-4ce3-bfbd-ca4dbcbe2426 r/w with ordered data mode. Quota mode: none. Sep 12 17:41:54.704688 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 12 17:41:54.706394 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 12 17:41:54.708192 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 12 17:41:54.710876 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 12 17:41:54.714091 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Sep 12 17:41:54.714165 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 12 17:41:54.714199 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 12 17:41:54.723667 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Sep 12 17:41:54.726138 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Sep 12 17:41:54.740982 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/nvme0n1p6 (259:5) scanned by mount (1209) Sep 12 17:41:54.745943 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 5410dae6-8d31-4ea4-a4b4-868064445761 Sep 12 17:41:54.746043 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-intel) checksum algorithm Sep 12 17:41:54.754380 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations Sep 12 17:41:54.754459 kernel: BTRFS info (device nvme0n1p6): enabling free space tree Sep 12 17:41:54.756203 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 12 17:41:54.962105 systemd-networkd[1158]: eth0: Gained IPv6LL Sep 12 17:41:55.082133 initrd-setup-root[1233]: cut: /sysroot/etc/passwd: No such file or directory Sep 12 17:41:55.099408 initrd-setup-root[1240]: cut: /sysroot/etc/group: No such file or directory Sep 12 17:41:55.106777 initrd-setup-root[1247]: cut: /sysroot/etc/shadow: No such file or directory Sep 12 17:41:55.125843 initrd-setup-root[1254]: cut: /sysroot/etc/gshadow: No such file or directory Sep 12 17:41:55.448160 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 12 17:41:55.450464 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Sep 12 17:41:55.453059 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 12 17:41:55.468637 systemd[1]: sysroot-oem.mount: Deactivated successfully. Sep 12 17:41:55.470995 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem 5410dae6-8d31-4ea4-a4b4-868064445761 Sep 12 17:41:55.501270 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
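Editor's note: here systemd-fsck checks the ext4 root located by label and reports it clean, after which it is mounted at /sysroot alongside the btrfs OEM partition. A small sketch of that resolve-check-mount sequence, using the /dev/disk/by-label path the units reference (illustrative only, not the actual unit logic):

import os
import subprocess

# Resolve the LABEL=ROOT symlink that udev maintains, e.g. /dev/nvme0n1p9.
root_dev = os.path.realpath("/dev/disk/by-label/ROOT")

# Preen-mode check, roughly what systemd-fsck-root.service reported as "clean".
subprocess.run(["fsck.ext4", "-p", root_dev], check=True)

# Mount the checked root at /sysroot, as sysroot.mount does.
os.makedirs("/sysroot", exist_ok=True)
subprocess.run(["mount", root_dev, "/sysroot"], check=True)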
Sep 12 17:41:55.506287 ignition[1321]: INFO : Ignition 2.21.0 Sep 12 17:41:55.506287 ignition[1321]: INFO : Stage: mount Sep 12 17:41:55.507850 ignition[1321]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 17:41:55.507850 ignition[1321]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws" Sep 12 17:41:55.507850 ignition[1321]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1 Sep 12 17:41:55.509652 ignition[1321]: INFO : PUT result: OK Sep 12 17:41:55.512337 ignition[1321]: INFO : mount: mount passed Sep 12 17:41:55.514282 ignition[1321]: INFO : Ignition finished successfully Sep 12 17:41:55.514946 systemd[1]: Finished ignition-mount.service - Ignition (mount). Sep 12 17:41:55.517060 systemd[1]: Starting ignition-files.service - Ignition (files)... Sep 12 17:41:55.707563 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 12 17:41:55.738015 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/nvme0n1p6 (259:5) scanned by mount (1334) Sep 12 17:41:55.742997 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 5410dae6-8d31-4ea4-a4b4-868064445761 Sep 12 17:41:55.743091 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-intel) checksum algorithm Sep 12 17:41:55.751665 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations Sep 12 17:41:55.751737 kernel: BTRFS info (device nvme0n1p6): enabling free space tree Sep 12 17:41:55.753854 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 12 17:41:55.786364 ignition[1350]: INFO : Ignition 2.21.0 Sep 12 17:41:55.786364 ignition[1350]: INFO : Stage: files Sep 12 17:41:55.788238 ignition[1350]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 17:41:55.788238 ignition[1350]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws" Sep 12 17:41:55.788238 ignition[1350]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1 Sep 12 17:41:55.788238 ignition[1350]: INFO : PUT result: OK Sep 12 17:41:55.790878 ignition[1350]: DEBUG : files: compiled without relabeling support, skipping Sep 12 17:41:55.791652 ignition[1350]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Sep 12 17:41:55.791652 ignition[1350]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Sep 12 17:41:55.795435 ignition[1350]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Sep 12 17:41:55.796442 ignition[1350]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Sep 12 17:41:55.797590 ignition[1350]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Sep 12 17:41:55.796999 unknown[1350]: wrote ssh authorized keys file for user: core Sep 12 17:41:55.812670 ignition[1350]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Sep 12 17:41:55.812670 ignition[1350]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 Sep 12 17:41:55.897114 ignition[1350]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Sep 12 17:41:56.413006 ignition[1350]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Sep 12 17:41:56.413866 ignition[1350]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Sep 12 17:41:56.413866 ignition[1350]: INFO : files: 
createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Sep 12 17:41:56.413866 ignition[1350]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Sep 12 17:41:56.413866 ignition[1350]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Sep 12 17:41:56.413866 ignition[1350]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 12 17:41:56.413866 ignition[1350]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 12 17:41:56.413866 ignition[1350]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 12 17:41:56.413866 ignition[1350]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 12 17:41:56.419783 ignition[1350]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Sep 12 17:41:56.419783 ignition[1350]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Sep 12 17:41:56.419783 ignition[1350]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Sep 12 17:41:56.422686 ignition[1350]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Sep 12 17:41:56.422686 ignition[1350]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Sep 12 17:41:56.422686 ignition[1350]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-x86-64.raw: attempt #1 Sep 12 17:41:56.769454 ignition[1350]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Sep 12 17:41:57.169984 ignition[1350]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Sep 12 17:41:57.169984 ignition[1350]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Sep 12 17:41:57.172356 ignition[1350]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 12 17:41:57.176612 ignition[1350]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 12 17:41:57.176612 ignition[1350]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Sep 12 17:41:57.176612 ignition[1350]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Sep 12 17:41:57.179026 ignition[1350]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Sep 12 17:41:57.179026 ignition[1350]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Sep 12 17:41:57.179026 ignition[1350]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file 
"/sysroot/etc/.ignition-result.json" Sep 12 17:41:57.179026 ignition[1350]: INFO : files: files passed Sep 12 17:41:57.179026 ignition[1350]: INFO : Ignition finished successfully Sep 12 17:41:57.179375 systemd[1]: Finished ignition-files.service - Ignition (files). Sep 12 17:41:57.183086 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Sep 12 17:41:57.184869 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Sep 12 17:41:57.195964 systemd[1]: ignition-quench.service: Deactivated successfully. Sep 12 17:41:57.196101 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Sep 12 17:41:57.212019 initrd-setup-root-after-ignition[1381]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 12 17:41:57.212019 initrd-setup-root-after-ignition[1381]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 12 17:41:57.214122 initrd-setup-root-after-ignition[1385]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 12 17:41:57.216549 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 12 17:41:57.217277 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Sep 12 17:41:57.219142 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Sep 12 17:41:57.264542 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Sep 12 17:41:57.264695 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Sep 12 17:41:57.266129 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Sep 12 17:41:57.267295 systemd[1]: Reached target initrd.target - Initrd Default Target. Sep 12 17:41:57.268084 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Sep 12 17:41:57.269521 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Sep 12 17:41:57.295372 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 12 17:41:57.297815 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Sep 12 17:41:57.319453 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Sep 12 17:41:57.320225 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 17:41:57.321375 systemd[1]: Stopped target timers.target - Timer Units. Sep 12 17:41:57.322250 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Sep 12 17:41:57.322432 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 12 17:41:57.323624 systemd[1]: Stopped target initrd.target - Initrd Default Target. Sep 12 17:41:57.324511 systemd[1]: Stopped target basic.target - Basic System. Sep 12 17:41:57.325406 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Sep 12 17:41:57.326150 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Sep 12 17:41:57.326937 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Sep 12 17:41:57.327713 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Sep 12 17:41:57.328492 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Sep 12 17:41:57.329396 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. 
Sep 12 17:41:57.330240 systemd[1]: Stopped target sysinit.target - System Initialization. Sep 12 17:41:57.331335 systemd[1]: Stopped target local-fs.target - Local File Systems. Sep 12 17:41:57.332129 systemd[1]: Stopped target swap.target - Swaps. Sep 12 17:41:57.333025 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Sep 12 17:41:57.333256 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Sep 12 17:41:57.334239 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Sep 12 17:41:57.335064 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 17:41:57.335680 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Sep 12 17:41:57.336407 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 12 17:41:57.337094 systemd[1]: dracut-initqueue.service: Deactivated successfully. Sep 12 17:41:57.337309 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Sep 12 17:41:57.338702 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Sep 12 17:41:57.338973 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 12 17:41:57.339678 systemd[1]: ignition-files.service: Deactivated successfully. Sep 12 17:41:57.339878 systemd[1]: Stopped ignition-files.service - Ignition (files). Sep 12 17:41:57.342058 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Sep 12 17:41:57.343059 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Sep 12 17:41:57.343251 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 17:41:57.346905 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Sep 12 17:41:57.348158 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Sep 12 17:41:57.349107 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Sep 12 17:41:57.351293 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Sep 12 17:41:57.352143 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Sep 12 17:41:57.361074 systemd[1]: initrd-cleanup.service: Deactivated successfully. Sep 12 17:41:57.361216 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Sep 12 17:41:57.381013 ignition[1405]: INFO : Ignition 2.21.0 Sep 12 17:41:57.381013 ignition[1405]: INFO : Stage: umount Sep 12 17:41:57.381013 ignition[1405]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 17:41:57.381013 ignition[1405]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws" Sep 12 17:41:57.381013 ignition[1405]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1 Sep 12 17:41:57.386974 ignition[1405]: INFO : PUT result: OK Sep 12 17:41:57.386974 ignition[1405]: INFO : umount: umount passed Sep 12 17:41:57.386974 ignition[1405]: INFO : Ignition finished successfully Sep 12 17:41:57.388578 systemd[1]: sysroot-boot.mount: Deactivated successfully. Sep 12 17:41:57.389513 systemd[1]: ignition-mount.service: Deactivated successfully. Sep 12 17:41:57.389660 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Sep 12 17:41:57.390756 systemd[1]: ignition-disks.service: Deactivated successfully. Sep 12 17:41:57.390820 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Sep 12 17:41:57.391452 systemd[1]: ignition-kargs.service: Deactivated successfully. 
Sep 12 17:41:57.391511 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Sep 12 17:41:57.392157 systemd[1]: ignition-fetch.service: Deactivated successfully. Sep 12 17:41:57.392212 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Sep 12 17:41:57.392935 systemd[1]: Stopped target network.target - Network. Sep 12 17:41:57.393571 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Sep 12 17:41:57.393637 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Sep 12 17:41:57.394340 systemd[1]: Stopped target paths.target - Path Units. Sep 12 17:41:57.394917 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Sep 12 17:41:57.395007 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 17:41:57.395567 systemd[1]: Stopped target slices.target - Slice Units. Sep 12 17:41:57.396207 systemd[1]: Stopped target sockets.target - Socket Units. Sep 12 17:41:57.397026 systemd[1]: iscsid.socket: Deactivated successfully. Sep 12 17:41:57.397079 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Sep 12 17:41:57.397654 systemd[1]: iscsiuio.socket: Deactivated successfully. Sep 12 17:41:57.397698 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 12 17:41:57.398290 systemd[1]: ignition-setup.service: Deactivated successfully. Sep 12 17:41:57.398365 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Sep 12 17:41:57.399012 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Sep 12 17:41:57.399075 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Sep 12 17:41:57.399784 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Sep 12 17:41:57.400685 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Sep 12 17:41:57.406507 systemd[1]: systemd-resolved.service: Deactivated successfully. Sep 12 17:41:57.406648 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Sep 12 17:41:57.410263 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Sep 12 17:41:57.411061 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Sep 12 17:41:57.411171 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 17:41:57.415248 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Sep 12 17:41:57.415609 systemd[1]: systemd-networkd.service: Deactivated successfully. Sep 12 17:41:57.415759 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Sep 12 17:41:57.418559 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Sep 12 17:41:57.419261 systemd[1]: Stopped target network-pre.target - Preparation for Network. Sep 12 17:41:57.420074 systemd[1]: systemd-networkd.socket: Deactivated successfully. Sep 12 17:41:57.420130 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Sep 12 17:41:57.422055 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Sep 12 17:41:57.423178 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Sep 12 17:41:57.423830 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 12 17:41:57.425527 systemd[1]: systemd-sysctl.service: Deactivated successfully. Sep 12 17:41:57.426145 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. 
Sep 12 17:41:57.427491 systemd[1]: systemd-modules-load.service: Deactivated successfully. Sep 12 17:41:57.428086 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Sep 12 17:41:57.429115 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 12 17:41:57.434547 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Sep 12 17:41:57.450095 systemd[1]: systemd-udevd.service: Deactivated successfully. Sep 12 17:41:57.450259 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 17:41:57.450973 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Sep 12 17:41:57.451017 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Sep 12 17:41:57.452341 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Sep 12 17:41:57.452380 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 17:41:57.453620 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Sep 12 17:41:57.453675 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Sep 12 17:41:57.454788 systemd[1]: dracut-cmdline.service: Deactivated successfully. Sep 12 17:41:57.454843 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Sep 12 17:41:57.455929 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 12 17:41:57.456114 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 12 17:41:57.458025 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Sep 12 17:41:57.459034 systemd[1]: systemd-network-generator.service: Deactivated successfully. Sep 12 17:41:57.459103 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Sep 12 17:41:57.461893 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Sep 12 17:41:57.461947 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 17:41:57.462391 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 17:41:57.462438 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 17:41:57.465218 systemd[1]: network-cleanup.service: Deactivated successfully. Sep 12 17:41:57.467154 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Sep 12 17:41:57.475677 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Sep 12 17:41:57.475806 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Sep 12 17:41:57.528934 systemd[1]: sysroot-boot.service: Deactivated successfully. Sep 12 17:41:57.529101 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Sep 12 17:41:57.530534 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Sep 12 17:41:57.531149 systemd[1]: initrd-setup-root.service: Deactivated successfully. Sep 12 17:41:57.531246 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Sep 12 17:41:57.533164 systemd[1]: Starting initrd-switch-root.service - Switch Root... Sep 12 17:41:57.554006 systemd[1]: Switching root. Sep 12 17:41:57.598578 systemd-journald[207]: Journal stopped Sep 12 17:41:59.448395 systemd-journald[207]: Received SIGTERM from PID 1 (systemd). 
Sep 12 17:41:59.448483 kernel: SELinux: policy capability network_peer_controls=1 Sep 12 17:41:59.448506 kernel: SELinux: policy capability open_perms=1 Sep 12 17:41:59.448525 kernel: SELinux: policy capability extended_socket_class=1 Sep 12 17:41:59.448543 kernel: SELinux: policy capability always_check_network=0 Sep 12 17:41:59.448566 kernel: SELinux: policy capability cgroup_seclabel=1 Sep 12 17:41:59.448585 kernel: SELinux: policy capability nnp_nosuid_transition=1 Sep 12 17:41:59.448608 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Sep 12 17:41:59.448626 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Sep 12 17:41:59.448645 kernel: SELinux: policy capability userspace_initial_context=0 Sep 12 17:41:59.448663 kernel: audit: type=1403 audit(1757698918.051:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Sep 12 17:41:59.448685 systemd[1]: Successfully loaded SELinux policy in 90.227ms. Sep 12 17:41:59.448727 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 6.156ms. Sep 12 17:41:59.448749 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 12 17:41:59.448769 systemd[1]: Detected virtualization amazon. Sep 12 17:41:59.448790 systemd[1]: Detected architecture x86-64. Sep 12 17:41:59.448810 systemd[1]: Detected first boot. Sep 12 17:41:59.448829 systemd[1]: Initializing machine ID from VM UUID. Sep 12 17:41:59.448848 zram_generator::config[1449]: No configuration found. Sep 12 17:41:59.448867 kernel: Guest personality initialized and is inactive Sep 12 17:41:59.448885 kernel: VMCI host device registered (name=vmci, major=10, minor=125) Sep 12 17:41:59.448903 kernel: Initialized host personality Sep 12 17:41:59.448921 kernel: NET: Registered PF_VSOCK protocol family Sep 12 17:41:59.448939 systemd[1]: Populated /etc with preset unit settings. Sep 12 17:41:59.453191 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Sep 12 17:41:59.453236 systemd[1]: initrd-switch-root.service: Deactivated successfully. Sep 12 17:41:59.453256 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Sep 12 17:41:59.453277 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Sep 12 17:41:59.453296 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Sep 12 17:41:59.453315 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Sep 12 17:41:59.453333 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Sep 12 17:41:59.453351 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Sep 12 17:41:59.453377 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Sep 12 17:41:59.453397 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Sep 12 17:41:59.453416 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Sep 12 17:41:59.453434 systemd[1]: Created slice user.slice - User and Session Slice. Sep 12 17:41:59.453452 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. 
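Editor's note: after switch-root, systemd loads the SELinux policy, detects a first boot, and initializes the machine ID from the VM UUID. On EC2 that UUID is exposed through DMI; a rough sketch of deriving a machine-id-shaped string from it follows (the exact rules systemd applies are more involved, so treat this as an approximation only):

# Read the DMI product UUID the hypervisor exposes and normalize it to the
# 32-hex-digit form used in /etc/machine-id. Approximation, not systemd's logic.
with open("/sys/class/dmi/id/product_uuid") as f:
    uuid = f.read().strip()

machine_id = uuid.replace("-", "").lower()
assert len(machine_id) == 32
print(machine_id)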
Sep 12 17:41:59.453471 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 17:41:59.453491 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Sep 12 17:41:59.453510 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Sep 12 17:41:59.453531 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Sep 12 17:41:59.453553 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 12 17:41:59.453579 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Sep 12 17:41:59.453603 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 17:41:59.453622 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 12 17:41:59.453642 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Sep 12 17:41:59.453661 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Sep 12 17:41:59.453681 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Sep 12 17:41:59.453700 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Sep 12 17:41:59.453721 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 17:41:59.453740 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 12 17:41:59.453760 systemd[1]: Reached target slices.target - Slice Units. Sep 12 17:41:59.453779 systemd[1]: Reached target swap.target - Swaps. Sep 12 17:41:59.453797 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Sep 12 17:41:59.453816 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Sep 12 17:41:59.454715 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Sep 12 17:41:59.454745 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 12 17:41:59.454765 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 12 17:41:59.454793 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 17:41:59.454815 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Sep 12 17:41:59.454837 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Sep 12 17:41:59.454859 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Sep 12 17:41:59.454882 systemd[1]: Mounting media.mount - External Media Directory... Sep 12 17:41:59.454904 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 17:41:59.454925 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Sep 12 17:41:59.454946 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Sep 12 17:41:59.454996 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Sep 12 17:41:59.455024 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Sep 12 17:41:59.455046 systemd[1]: Reached target machines.target - Containers. Sep 12 17:41:59.455066 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... 
Sep 12 17:41:59.455088 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 17:41:59.455110 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 12 17:41:59.455131 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Sep 12 17:41:59.455152 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 12 17:41:59.455174 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 12 17:41:59.455197 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 12 17:41:59.455219 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Sep 12 17:41:59.455240 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 12 17:41:59.455262 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Sep 12 17:41:59.455285 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Sep 12 17:41:59.455306 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Sep 12 17:41:59.455327 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Sep 12 17:41:59.455348 systemd[1]: Stopped systemd-fsck-usr.service. Sep 12 17:41:59.455370 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 17:41:59.455395 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 12 17:41:59.455417 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 12 17:41:59.455438 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 12 17:41:59.455460 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Sep 12 17:41:59.455482 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Sep 12 17:41:59.455503 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 12 17:41:59.455527 systemd[1]: verity-setup.service: Deactivated successfully. Sep 12 17:41:59.455549 systemd[1]: Stopped verity-setup.service. Sep 12 17:41:59.455570 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 17:41:59.455594 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Sep 12 17:41:59.455618 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Sep 12 17:41:59.455638 systemd[1]: Mounted media.mount - External Media Directory. Sep 12 17:41:59.455659 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Sep 12 17:41:59.455681 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Sep 12 17:41:59.455702 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Sep 12 17:41:59.455723 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 17:41:59.455743 systemd[1]: modprobe@configfs.service: Deactivated successfully. Sep 12 17:41:59.455764 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Sep 12 17:41:59.455788 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. 
Sep 12 17:41:59.455809 kernel: loop: module loaded Sep 12 17:41:59.455830 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 12 17:41:59.455851 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 12 17:41:59.455871 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 12 17:41:59.455892 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 12 17:41:59.455913 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 12 17:41:59.455935 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Sep 12 17:41:59.458004 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 12 17:41:59.458057 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 12 17:41:59.458077 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 12 17:41:59.458096 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Sep 12 17:41:59.458117 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Sep 12 17:41:59.458138 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 12 17:41:59.458157 kernel: fuse: init (API version 7.41) Sep 12 17:41:59.458179 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Sep 12 17:41:59.458200 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Sep 12 17:41:59.458225 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 17:41:59.458293 systemd-journald[1525]: Collecting audit messages is disabled. Sep 12 17:41:59.458336 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Sep 12 17:41:59.458357 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 12 17:41:59.458383 systemd-journald[1525]: Journal started Sep 12 17:41:59.458428 systemd-journald[1525]: Runtime Journal (/run/log/journal/ec28ad47c1acf53483681acdf0affed4) is 4.8M, max 38.4M, 33.6M free. Sep 12 17:41:59.039863 systemd[1]: Queued start job for default target multi-user.target. Sep 12 17:41:59.064235 systemd[1]: Unnecessary job was removed for dev-nvme0n1p6.device - /dev/nvme0n1p6. Sep 12 17:41:59.064924 systemd[1]: systemd-journald.service: Deactivated successfully. Sep 12 17:41:59.504847 kernel: ACPI: bus type drm_connector registered Sep 12 17:41:59.504973 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Sep 12 17:41:59.508074 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 12 17:41:59.515289 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 12 17:41:59.523141 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Sep 12 17:41:59.533749 systemd[1]: Started systemd-journald.service - Journal Service. Sep 12 17:41:59.536731 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 12 17:41:59.538047 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 12 17:41:59.539407 systemd[1]: modprobe@fuse.service: Deactivated successfully. 
Sep 12 17:41:59.540055 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Sep 12 17:41:59.543117 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Sep 12 17:41:59.544184 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Sep 12 17:41:59.578627 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Sep 12 17:41:59.587560 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Sep 12 17:41:59.588931 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Sep 12 17:41:59.595271 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Sep 12 17:41:59.614289 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Sep 12 17:41:59.631301 systemd[1]: Starting systemd-sysusers.service - Create System Users... Sep 12 17:41:59.635014 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 12 17:41:59.638214 kernel: loop0: detected capacity change from 0 to 128016 Sep 12 17:41:59.649046 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 12 17:41:59.656713 systemd-journald[1525]: Time spent on flushing to /var/log/journal/ec28ad47c1acf53483681acdf0affed4 is 66.925ms for 1020 entries. Sep 12 17:41:59.656713 systemd-journald[1525]: System Journal (/var/log/journal/ec28ad47c1acf53483681acdf0affed4) is 8M, max 195.6M, 187.6M free. Sep 12 17:41:59.731918 systemd-journald[1525]: Received client request to flush runtime journal. Sep 12 17:41:59.734498 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Sep 12 17:41:59.749718 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Sep 12 17:41:59.766014 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Sep 12 17:41:59.767166 systemd[1]: Finished systemd-sysusers.service - Create System Users. Sep 12 17:41:59.771055 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 12 17:41:59.801984 kernel: loop1: detected capacity change from 0 to 111000 Sep 12 17:41:59.817935 systemd-tmpfiles[1598]: ACLs are not supported, ignoring. Sep 12 17:41:59.817988 systemd-tmpfiles[1598]: ACLs are not supported, ignoring. Sep 12 17:41:59.823915 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 17:41:59.922190 kernel: loop2: detected capacity change from 0 to 221472 Sep 12 17:42:00.053318 kernel: loop3: detected capacity change from 0 to 72360 Sep 12 17:42:00.066899 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Sep 12 17:42:00.072644 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Sep 12 17:42:00.094920 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Sep 12 17:42:00.180008 kernel: loop4: detected capacity change from 0 to 128016 Sep 12 17:42:00.219044 kernel: loop5: detected capacity change from 0 to 111000 Sep 12 17:42:00.253082 kernel: loop6: detected capacity change from 0 to 221472 Sep 12 17:42:00.295043 kernel: loop7: detected capacity change from 0 to 72360 Sep 12 17:42:00.318039 (sd-merge)[1606]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-ami'. Sep 12 17:42:00.318553 (sd-merge)[1606]: Merged extensions into '/usr'. 
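Editor's note: the loop-device capacity messages and the (sd-merge) lines show systemd-sysext activating the containerd-flatcar, docker-flatcar, kubernetes and oem-ami extension images and overlaying them onto /usr. A small sketch of enumerating the images systemd-sysext would consider, using its usual search directories (listing only; the actual merge is an overlay mount performed by systemd-sysext itself):

import os

# Directories systemd-sysext scans for extension images or trees.
SEARCH_DIRS = ["/etc/extensions", "/run/extensions", "/var/lib/extensions"]

for d in SEARCH_DIRS:
    if not os.path.isdir(d):
        continue
    for entry in sorted(os.listdir(d)):
        path = os.path.join(d, entry)
        kind = "image" if entry.endswith(".raw") else "tree"
        print(f"{d}: {entry} ({kind}, -> {os.path.realpath(path)})")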
Sep 12 17:42:00.323437 systemd[1]: Reload requested from client PID 1560 ('systemd-sysext') (unit systemd-sysext.service)... Sep 12 17:42:00.323572 systemd[1]: Reloading... Sep 12 17:42:00.420984 zram_generator::config[1632]: No configuration found. Sep 12 17:42:00.972750 systemd[1]: Reloading finished in 648 ms. Sep 12 17:42:00.995453 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Sep 12 17:42:00.996684 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Sep 12 17:42:01.017678 systemd[1]: Starting ensure-sysext.service... Sep 12 17:42:01.021066 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 12 17:42:01.037367 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 12 17:42:01.085176 systemd[1]: Reload requested from client PID 1684 ('systemctl') (unit ensure-sysext.service)... Sep 12 17:42:01.085192 systemd[1]: Reloading... Sep 12 17:42:01.168878 systemd-tmpfiles[1685]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Sep 12 17:42:01.175041 systemd-tmpfiles[1685]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Sep 12 17:42:01.241580 systemd-tmpfiles[1685]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Sep 12 17:42:01.247319 systemd-tmpfiles[1685]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Sep 12 17:42:01.282777 systemd-tmpfiles[1685]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Sep 12 17:42:01.300506 systemd-tmpfiles[1685]: ACLs are not supported, ignoring. Sep 12 17:42:01.312239 systemd-tmpfiles[1685]: ACLs are not supported, ignoring. Sep 12 17:42:01.443133 systemd-tmpfiles[1685]: Detected autofs mount point /boot during canonicalization of boot. Sep 12 17:42:01.443152 systemd-tmpfiles[1685]: Skipping /boot Sep 12 17:42:01.700422 systemd-tmpfiles[1685]: Detected autofs mount point /boot during canonicalization of boot. Sep 12 17:42:01.708435 systemd-tmpfiles[1685]: Skipping /boot Sep 12 17:42:01.807319 ldconfig[1551]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Sep 12 17:42:01.866112 systemd-udevd[1686]: Using default interface naming scheme 'v255'. Sep 12 17:42:01.997002 zram_generator::config[1712]: No configuration found. Sep 12 17:42:02.608461 (udev-worker)[1756]: Network interface NamePolicy= disabled on kernel command line. Sep 12 17:42:02.803056 kernel: mousedev: PS/2 mouse device common for all mice Sep 12 17:42:02.942000 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4 Sep 12 17:42:03.018997 kernel: piix4_smbus 0000:00:01.3: SMBus base address uninitialized - upgrade BIOS or use force_addr=0xaddr Sep 12 17:42:03.028495 kernel: ACPI: button: Power Button [PWRF] Sep 12 17:42:03.028598 kernel: input: Sleep Button as /devices/LNXSYSTM:00/LNXSLPBN:00/input/input5 Sep 12 17:42:03.070983 kernel: ACPI: button: Sleep Button [SLPF] Sep 12 17:42:03.094286 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Sep 12 17:42:03.095184 systemd[1]: Reloading finished in 2006 ms. Sep 12 17:42:03.123562 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 17:42:03.127292 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. 
Sep 12 17:42:03.143222 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 17:42:03.228633 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 12 17:42:03.242159 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Sep 12 17:42:03.253286 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Sep 12 17:42:03.257383 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 12 17:42:03.272929 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 12 17:42:03.285478 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Sep 12 17:42:03.292992 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 17:42:03.293326 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 17:42:03.297170 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 12 17:42:03.303401 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 12 17:42:03.314950 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 12 17:42:03.316563 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 17:42:03.317243 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 17:42:03.317407 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 17:42:03.323822 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 17:42:03.324154 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 17:42:03.324413 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 17:42:03.324629 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 17:42:03.324799 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 17:42:03.334052 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Sep 12 17:42:03.342429 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 17:42:03.342828 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 17:42:03.355335 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 12 17:42:03.357300 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. 
Sep 12 17:42:03.357496 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 17:42:03.357745 systemd[1]: Reached target time-set.target - System Time Set. Sep 12 17:42:03.358517 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 17:42:03.398612 systemd[1]: Finished ensure-sysext.service. Sep 12 17:42:03.430420 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Sep 12 17:42:03.457732 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Sep 12 17:42:03.482691 systemd[1]: Starting systemd-update-done.service - Update is Completed... Sep 12 17:42:03.525744 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 12 17:42:03.527289 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 12 17:42:03.533458 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 12 17:42:03.535062 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 12 17:42:03.542987 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 12 17:42:03.543747 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 12 17:42:03.546688 systemd[1]: Finished systemd-update-done.service - Update is Completed. Sep 12 17:42:03.554362 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 12 17:42:03.554579 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 12 17:42:03.561469 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 12 17:42:03.562762 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 12 17:42:03.643150 augenrules[1922]: No rules Sep 12 17:42:03.656449 systemd[1]: audit-rules.service: Deactivated successfully. Sep 12 17:42:03.657027 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 12 17:42:03.662615 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Sep 12 17:42:03.716702 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 12 17:42:03.771114 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 17:42:03.795821 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 17:42:03.796241 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 17:42:03.800444 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 17:42:03.837525 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Amazon Elastic Block Store OEM. Sep 12 17:42:03.842775 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 12 17:42:03.917609 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 12 17:42:03.931182 systemd[1]: Started systemd-userdbd.service - User Database Manager. 
Sep 12 17:42:04.000032 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 17:42:04.081414 systemd-networkd[1863]: lo: Link UP Sep 12 17:42:04.081434 systemd-networkd[1863]: lo: Gained carrier Sep 12 17:42:04.082290 systemd-resolved[1864]: Positive Trust Anchors: Sep 12 17:42:04.082624 systemd-resolved[1864]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 12 17:42:04.082764 systemd-resolved[1864]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 12 17:42:04.083371 systemd-networkd[1863]: Enumeration completed Sep 12 17:42:04.083519 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 12 17:42:04.084582 systemd-networkd[1863]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 17:42:04.084596 systemd-networkd[1863]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 12 17:42:04.087293 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Sep 12 17:42:04.089214 systemd-networkd[1863]: eth0: Link UP Sep 12 17:42:04.089384 systemd-networkd[1863]: eth0: Gained carrier Sep 12 17:42:04.089417 systemd-networkd[1863]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 17:42:04.090622 systemd-resolved[1864]: Defaulting to hostname 'linux'. Sep 12 17:42:04.091331 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 12 17:42:04.099061 systemd-networkd[1863]: eth0: DHCPv4 address 172.31.17.147/20, gateway 172.31.16.1 acquired from 172.31.16.1 Sep 12 17:42:04.099709 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 12 17:42:04.100500 systemd[1]: Reached target network.target - Network. Sep 12 17:42:04.101352 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 12 17:42:04.104123 systemd[1]: Reached target sysinit.target - System Initialization. Sep 12 17:42:04.105185 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Sep 12 17:42:04.105993 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 12 17:42:04.107070 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Sep 12 17:42:04.107659 systemd[1]: Started logrotate.timer - Daily rotation of log files. Sep 12 17:42:04.108248 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 12 17:42:04.109474 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Sep 12 17:42:04.109939 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Sep 12 17:42:04.110012 systemd[1]: Reached target paths.target - Path Units. 
Sep 12 17:42:04.110702 systemd[1]: Reached target timers.target - Timer Units. Sep 12 17:42:04.112587 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 12 17:42:04.115603 systemd[1]: Starting docker.socket - Docker Socket for the API... Sep 12 17:42:04.121849 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Sep 12 17:42:04.123410 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Sep 12 17:42:04.124081 systemd[1]: Reached target ssh-access.target - SSH Access Available. Sep 12 17:42:04.136634 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Sep 12 17:42:04.137985 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Sep 12 17:42:04.150587 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Sep 12 17:42:04.153153 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 12 17:42:04.155599 systemd[1]: Reached target sockets.target - Socket Units. Sep 12 17:42:04.156332 systemd[1]: Reached target basic.target - Basic System. Sep 12 17:42:04.157450 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 12 17:42:04.157486 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 12 17:42:04.162423 systemd[1]: Starting containerd.service - containerd container runtime... Sep 12 17:42:04.168155 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Sep 12 17:42:04.177456 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Sep 12 17:42:04.195591 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Sep 12 17:42:04.197749 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 12 17:42:04.206494 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Sep 12 17:42:04.207161 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 12 17:42:04.210387 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Sep 12 17:42:04.225254 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 12 17:42:04.235205 systemd[1]: Started ntpd.service - Network Time Service. Sep 12 17:42:04.280271 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 12 17:42:04.294946 systemd[1]: Starting setup-oem.service - Setup OEM... Sep 12 17:42:04.303266 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Sep 12 17:42:04.305127 oslogin_cache_refresh[1972]: Refreshing passwd entry cache Sep 12 17:42:04.307746 google_oslogin_nss_cache[1972]: oslogin_cache_refresh[1972]: Refreshing passwd entry cache Sep 12 17:42:04.309241 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Sep 12 17:42:04.315688 extend-filesystems[1971]: Found /dev/nvme0n1p6 Sep 12 17:42:04.320278 systemd[1]: Starting systemd-logind.service - User Login Management... 
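Editor's note: prepare-helm.service, written and preset-enabled by Ignition earlier, is described here as "Unpack helm to /opt/bin". A hedged sketch of that unpack step using the tarball path Ignition wrote (the real unit very likely just runs tar; this Python version is only illustrative):

import os
import tarfile

archive = "/opt/helm-v3.13.2-linux-amd64.tar.gz"
dest_dir = "/opt/bin"

os.makedirs(dest_dir, exist_ok=True)

# The upstream Helm tarball contains linux-amd64/helm; extract just that
# binary into /opt/bin and make sure it is executable.
with tarfile.open(archive) as tar:
    member = tar.getmember("linux-amd64/helm")
    member.name = "helm"          # drop the leading directory component
    tar.extract(member, dest_dir)

os.chmod(os.path.join(dest_dir, "helm"), 0o755)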
Sep 12 17:42:04.327612 oslogin_cache_refresh[1972]: Failure getting users, quitting Sep 12 17:42:04.332591 jq[1970]: false Sep 12 17:42:04.332824 google_oslogin_nss_cache[1972]: oslogin_cache_refresh[1972]: Failure getting users, quitting Sep 12 17:42:04.332824 google_oslogin_nss_cache[1972]: oslogin_cache_refresh[1972]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 12 17:42:04.332824 google_oslogin_nss_cache[1972]: oslogin_cache_refresh[1972]: Refreshing group entry cache Sep 12 17:42:04.327380 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Sep 12 17:42:04.327637 oslogin_cache_refresh[1972]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 12 17:42:04.329313 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 12 17:42:04.327704 oslogin_cache_refresh[1972]: Refreshing group entry cache Sep 12 17:42:04.336182 systemd[1]: Starting update-engine.service - Update Engine... Sep 12 17:42:04.338939 oslogin_cache_refresh[1972]: Failure getting groups, quitting Sep 12 17:42:04.344230 google_oslogin_nss_cache[1972]: oslogin_cache_refresh[1972]: Failure getting groups, quitting Sep 12 17:42:04.344230 google_oslogin_nss_cache[1972]: oslogin_cache_refresh[1972]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 12 17:42:04.342205 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 12 17:42:04.338954 oslogin_cache_refresh[1972]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 12 17:42:04.359627 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Sep 12 17:42:04.361669 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 12 17:42:04.361932 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 12 17:42:04.362372 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Sep 12 17:42:04.367468 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Sep 12 17:42:04.376232 extend-filesystems[1971]: Found /dev/nvme0n1p9 Sep 12 17:42:04.376232 extend-filesystems[1971]: Checking size of /dev/nvme0n1p9 Sep 12 17:42:04.378603 systemd[1]: motdgen.service: Deactivated successfully. Sep 12 17:42:04.379060 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 12 17:42:04.402090 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 12 17:42:04.404190 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Sep 12 17:42:04.424793 extend-filesystems[1971]: Resized partition /dev/nvme0n1p9 Sep 12 17:42:04.434485 extend-filesystems[2010]: resize2fs 1.47.2 (1-Jan-2025) Sep 12 17:42:04.440775 jq[1990]: true Sep 12 17:42:04.445405 kernel: EXT4-fs (nvme0n1p9): resizing filesystem from 553472 to 1489915 blocks Sep 12 17:42:04.452092 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. 
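
[Note] The EXT4-fs message above shows extend-filesystems growing the root filesystem on nvme0n1p9 from 553472 to 1489915 blocks; the later resize2fs output confirms these are 4 KiB blocks. A tiny sketch, arithmetic only, converting the logged block counts into sizes:

BLOCK = 4096  # ext4 block size; the resize2fs output later calls them "(4k) blocks"

old_blocks, new_blocks = 553_472, 1_489_915   # from the EXT4-fs resize message above
to_gib = lambda blocks: blocks * BLOCK / 2**30

print(f"before: {to_gib(old_blocks):.2f} GiB")                 # ~2.11 GiB
print(f"after:  {to_gib(new_blocks):.2f} GiB")                 # ~5.68 GiB
print(f"grown by {to_gib(new_blocks - old_blocks):.2f} GiB")   # ~3.57 GiB
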
Sep 12 17:42:04.471651 update_engine[1989]: I20250912 17:42:04.471540 1989 main.cc:92] Flatcar Update Engine starting Sep 12 17:42:04.472498 (ntainerd)[2001]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 12 17:42:04.496249 tar[1996]: linux-amd64/helm Sep 12 17:42:04.540220 dbus-daemon[1968]: [system] SELinux support is enabled Sep 12 17:42:04.540481 systemd[1]: Started dbus.service - D-Bus System Message Bus. Sep 12 17:42:04.548708 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 12 17:42:04.548747 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Sep 12 17:42:04.550010 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 12 17:42:04.550039 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 12 17:42:04.554554 ntpd[1974]: ntpd 4.2.8p17@1.4004-o Fri Sep 12 14:59:08 UTC 2025 (1): Starting Sep 12 17:42:04.557347 systemd[1]: Finished setup-oem.service - Setup OEM. Sep 12 17:42:04.558424 ntpd[1974]: 12 Sep 17:42:04 ntpd[1974]: ntpd 4.2.8p17@1.4004-o Fri Sep 12 14:59:08 UTC 2025 (1): Starting Sep 12 17:42:04.558424 ntpd[1974]: 12 Sep 17:42:04 ntpd[1974]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp Sep 12 17:42:04.558424 ntpd[1974]: 12 Sep 17:42:04 ntpd[1974]: ---------------------------------------------------- Sep 12 17:42:04.558424 ntpd[1974]: 12 Sep 17:42:04 ntpd[1974]: ntp-4 is maintained by Network Time Foundation, Sep 12 17:42:04.558424 ntpd[1974]: 12 Sep 17:42:04 ntpd[1974]: Inc. (NTF), a non-profit 501(c)(3) public-benefit Sep 12 17:42:04.558424 ntpd[1974]: 12 Sep 17:42:04 ntpd[1974]: corporation. Support and training for ntp-4 are Sep 12 17:42:04.558424 ntpd[1974]: 12 Sep 17:42:04 ntpd[1974]: available at https://www.nwtime.org/support Sep 12 17:42:04.558424 ntpd[1974]: 12 Sep 17:42:04 ntpd[1974]: ---------------------------------------------------- Sep 12 17:42:04.554589 ntpd[1974]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp Sep 12 17:42:04.554599 ntpd[1974]: ---------------------------------------------------- Sep 12 17:42:04.554609 ntpd[1974]: ntp-4 is maintained by Network Time Foundation, Sep 12 17:42:04.554618 ntpd[1974]: Inc. (NTF), a non-profit 501(c)(3) public-benefit Sep 12 17:42:04.554629 ntpd[1974]: corporation. 
Support and training for ntp-4 are Sep 12 17:42:04.554638 ntpd[1974]: available at https://www.nwtime.org/support Sep 12 17:42:04.554647 ntpd[1974]: ---------------------------------------------------- Sep 12 17:42:04.565016 coreos-metadata[1967]: Sep 12 17:42:04.563 INFO Putting http://169.254.169.254/latest/api/token: Attempt #1 Sep 12 17:42:04.579764 kernel: EXT4-fs (nvme0n1p9): resized filesystem to 1489915 Sep 12 17:42:04.579952 update_engine[1989]: I20250912 17:42:04.577044 1989 update_check_scheduler.cc:74] Next update check in 3m10s Sep 12 17:42:04.580054 jq[2017]: true Sep 12 17:42:04.567356 ntpd[1974]: proto: precision = 0.062 usec (-24) Sep 12 17:42:04.587830 coreos-metadata[1967]: Sep 12 17:42:04.571 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/instance-id: Attempt #1 Sep 12 17:42:04.587830 coreos-metadata[1967]: Sep 12 17:42:04.573 INFO Fetch successful Sep 12 17:42:04.587830 coreos-metadata[1967]: Sep 12 17:42:04.573 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/instance-type: Attempt #1 Sep 12 17:42:04.587830 coreos-metadata[1967]: Sep 12 17:42:04.576 INFO Fetch successful Sep 12 17:42:04.587830 coreos-metadata[1967]: Sep 12 17:42:04.576 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/local-ipv4: Attempt #1 Sep 12 17:42:04.587830 coreos-metadata[1967]: Sep 12 17:42:04.577 INFO Fetch successful Sep 12 17:42:04.587830 coreos-metadata[1967]: Sep 12 17:42:04.577 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-ipv4: Attempt #1 Sep 12 17:42:04.587830 coreos-metadata[1967]: Sep 12 17:42:04.578 INFO Fetch successful Sep 12 17:42:04.587830 coreos-metadata[1967]: Sep 12 17:42:04.578 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/ipv6: Attempt #1 Sep 12 17:42:04.587830 coreos-metadata[1967]: Sep 12 17:42:04.581 INFO Fetch failed with 404: resource not found Sep 12 17:42:04.587830 coreos-metadata[1967]: Sep 12 17:42:04.587 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/placement/availability-zone: Attempt #1 Sep 12 17:42:04.594750 ntpd[1974]: 12 Sep 17:42:04 ntpd[1974]: proto: precision = 0.062 usec (-24) Sep 12 17:42:04.594750 ntpd[1974]: 12 Sep 17:42:04 ntpd[1974]: basedate set to 2025-08-31 Sep 12 17:42:04.594750 ntpd[1974]: 12 Sep 17:42:04 ntpd[1974]: gps base set to 2025-08-31 (week 2382) Sep 12 17:42:04.594750 ntpd[1974]: 12 Sep 17:42:04 ntpd[1974]: Listen and drop on 0 v6wildcard [::]:123 Sep 12 17:42:04.594750 ntpd[1974]: 12 Sep 17:42:04 ntpd[1974]: Listen and drop on 1 v4wildcard 0.0.0.0:123 Sep 12 17:42:04.594750 ntpd[1974]: 12 Sep 17:42:04 ntpd[1974]: Listen normally on 2 lo 127.0.0.1:123 Sep 12 17:42:04.594750 ntpd[1974]: 12 Sep 17:42:04 ntpd[1974]: Listen normally on 3 eth0 172.31.17.147:123 Sep 12 17:42:04.594750 ntpd[1974]: 12 Sep 17:42:04 ntpd[1974]: Listen normally on 4 lo [::1]:123 Sep 12 17:42:04.594750 ntpd[1974]: 12 Sep 17:42:04 ntpd[1974]: bind(21) AF_INET6 fe80::4d9:f7ff:fe5c:5e5%2#123 flags 0x11 failed: Cannot assign requested address Sep 12 17:42:04.594750 ntpd[1974]: 12 Sep 17:42:04 ntpd[1974]: unable to create socket on eth0 (5) for fe80::4d9:f7ff:fe5c:5e5%2#123 Sep 12 17:42:04.594750 ntpd[1974]: 12 Sep 17:42:04 ntpd[1974]: failed to init interface for address fe80::4d9:f7ff:fe5c:5e5%2 Sep 12 17:42:04.594750 ntpd[1974]: 12 Sep 17:42:04 ntpd[1974]: Listening on routing socket on fd #21 for interface updates Sep 12 17:42:04.594750 ntpd[1974]: 12 Sep 17:42:04 ntpd[1974]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Sep 12 17:42:04.594750 ntpd[1974]: 12 Sep 17:42:04 ntpd[1974]: 
kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Sep 12 17:42:04.581435 systemd[1]: Starting systemd-hostnamed.service - Hostname Service... Sep 12 17:42:04.604258 extend-filesystems[2010]: Filesystem at /dev/nvme0n1p9 is mounted on /; on-line resizing required Sep 12 17:42:04.604258 extend-filesystems[2010]: old_desc_blocks = 1, new_desc_blocks = 1 Sep 12 17:42:04.604258 extend-filesystems[2010]: The filesystem on /dev/nvme0n1p9 is now 1489915 (4k) blocks long. Sep 12 17:42:04.569743 dbus-daemon[1968]: [system] Activating via systemd: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.1' (uid=244 pid=1863 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0") Sep 12 17:42:04.637307 coreos-metadata[1967]: Sep 12 17:42:04.595 INFO Fetch successful Sep 12 17:42:04.637307 coreos-metadata[1967]: Sep 12 17:42:04.595 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/placement/availability-zone-id: Attempt #1 Sep 12 17:42:04.637307 coreos-metadata[1967]: Sep 12 17:42:04.602 INFO Fetch successful Sep 12 17:42:04.637307 coreos-metadata[1967]: Sep 12 17:42:04.602 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/hostname: Attempt #1 Sep 12 17:42:04.637307 coreos-metadata[1967]: Sep 12 17:42:04.610 INFO Fetch successful Sep 12 17:42:04.637307 coreos-metadata[1967]: Sep 12 17:42:04.610 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-hostname: Attempt #1 Sep 12 17:42:04.637307 coreos-metadata[1967]: Sep 12 17:42:04.614 INFO Fetch successful Sep 12 17:42:04.637307 coreos-metadata[1967]: Sep 12 17:42:04.614 INFO Fetching http://169.254.169.254/2021-01-03/dynamic/instance-identity/document: Attempt #1 Sep 12 17:42:04.637307 coreos-metadata[1967]: Sep 12 17:42:04.614 INFO Fetch successful Sep 12 17:42:04.583212 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 12 17:42:04.637758 extend-filesystems[1971]: Resized filesystem in /dev/nvme0n1p9 Sep 12 17:42:04.572268 ntpd[1974]: basedate set to 2025-08-31 Sep 12 17:42:04.584109 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 12 17:42:04.572290 ntpd[1974]: gps base set to 2025-08-31 (week 2382) Sep 12 17:42:04.588447 systemd[1]: Started update-engine.service - Update Engine. Sep 12 17:42:04.579658 ntpd[1974]: Listen and drop on 0 v6wildcard [::]:123 Sep 12 17:42:04.579712 ntpd[1974]: Listen and drop on 1 v4wildcard 0.0.0.0:123 Sep 12 17:42:04.579913 ntpd[1974]: Listen normally on 2 lo 127.0.0.1:123 Sep 12 17:42:04.579950 ntpd[1974]: Listen normally on 3 eth0 172.31.17.147:123 Sep 12 17:42:04.582069 ntpd[1974]: Listen normally on 4 lo [::1]:123 Sep 12 17:42:04.582136 ntpd[1974]: bind(21) AF_INET6 fe80::4d9:f7ff:fe5c:5e5%2#123 flags 0x11 failed: Cannot assign requested address Sep 12 17:42:04.582158 ntpd[1974]: unable to create socket on eth0 (5) for fe80::4d9:f7ff:fe5c:5e5%2#123 Sep 12 17:42:04.582172 ntpd[1974]: failed to init interface for address fe80::4d9:f7ff:fe5c:5e5%2 Sep 12 17:42:04.582207 ntpd[1974]: Listening on routing socket on fd #21 for interface updates Sep 12 17:42:04.583702 ntpd[1974]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Sep 12 17:42:04.583738 ntpd[1974]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Sep 12 17:42:04.676986 systemd[1]: Started locksmithd.service - Cluster reboot manager. 
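
[Note] The coreos-metadata fetches above follow the IMDSv2 pattern: a PUT to http://169.254.169.254/latest/api/token to obtain a session token, then GETs against the 2021-01-03 meta-data paths with that token attached. A minimal standalone sketch of the same flow using only urllib is shown below; it is not the agent's actual code, only works when run on an EC2 instance, and the timeout and token TTL values are arbitrary choices.

import urllib.request

IMDS = "http://169.254.169.254"

# Step 1: obtain a session token (the "Putting .../latest/api/token" line above).
tok_req = urllib.request.Request(
    f"{IMDS}/latest/api/token",
    method="PUT",
    headers={"X-aws-ec2-metadata-token-ttl-seconds": "21600"},
)
token = urllib.request.urlopen(tok_req, timeout=2).read().decode()

# Step 2: fetch the same kinds of paths coreos-metadata reads, presenting the token.
def get(path: str) -> str:
    req = urllib.request.Request(
        f"{IMDS}/2021-01-03/meta-data/{path}",
        headers={"X-aws-ec2-metadata-token": token},
    )
    return urllib.request.urlopen(req, timeout=2).read().decode()

print(get("instance-id"))
print(get("local-ipv4"))
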
Sep 12 17:42:04.688633 systemd-logind[1987]: Watching system buttons on /dev/input/event2 (Power Button) Sep 12 17:42:04.688668 systemd-logind[1987]: Watching system buttons on /dev/input/event3 (Sleep Button) Sep 12 17:42:04.688704 systemd-logind[1987]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Sep 12 17:42:04.692159 systemd-logind[1987]: New seat seat0. Sep 12 17:42:04.693110 systemd[1]: Started systemd-logind.service - User Login Management. Sep 12 17:42:04.813377 bash[2088]: Updated "/home/core/.ssh/authorized_keys" Sep 12 17:42:04.821450 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 12 17:42:04.832252 systemd[1]: Starting sshkeys.service... Sep 12 17:42:04.845446 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Sep 12 17:42:04.846655 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 12 17:42:04.908072 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Sep 12 17:42:04.914165 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Sep 12 17:42:05.235595 systemd[1]: Started systemd-hostnamed.service - Hostname Service. Sep 12 17:42:05.261508 dbus-daemon[1968]: [system] Successfully activated service 'org.freedesktop.hostname1' Sep 12 17:42:05.265890 dbus-daemon[1968]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.5' (uid=0 pid=2034 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0") Sep 12 17:42:05.279079 systemd[1]: Starting polkit.service - Authorization Manager... Sep 12 17:42:05.310616 coreos-metadata[2115]: Sep 12 17:42:05.310 INFO Putting http://169.254.169.254/latest/api/token: Attempt #1 Sep 12 17:42:05.310616 coreos-metadata[2115]: Sep 12 17:42:05.310 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-keys: Attempt #1 Sep 12 17:42:05.317077 coreos-metadata[2115]: Sep 12 17:42:05.315 INFO Fetch successful Sep 12 17:42:05.317077 coreos-metadata[2115]: Sep 12 17:42:05.315 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-keys/0/openssh-key: Attempt #1 Sep 12 17:42:05.320987 coreos-metadata[2115]: Sep 12 17:42:05.319 INFO Fetch successful Sep 12 17:42:05.326201 unknown[2115]: wrote ssh authorized keys file for user: core Sep 12 17:42:05.397394 update-ssh-keys[2167]: Updated "/home/core/.ssh/authorized_keys" Sep 12 17:42:05.400465 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Sep 12 17:42:05.403246 systemd[1]: Finished sshkeys.service. 
Sep 12 17:42:05.453716 locksmithd[2040]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 12 17:42:05.482984 containerd[2001]: time="2025-09-12T17:42:05Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Sep 12 17:42:05.487052 containerd[2001]: time="2025-09-12T17:42:05.487001930Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Sep 12 17:42:05.538122 containerd[2001]: time="2025-09-12T17:42:05.537979321Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="26.137µs" Sep 12 17:42:05.538122 containerd[2001]: time="2025-09-12T17:42:05.538026926Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Sep 12 17:42:05.538122 containerd[2001]: time="2025-09-12T17:42:05.538054340Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Sep 12 17:42:05.538290 containerd[2001]: time="2025-09-12T17:42:05.538240554Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Sep 12 17:42:05.538290 containerd[2001]: time="2025-09-12T17:42:05.538263746Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Sep 12 17:42:05.538358 containerd[2001]: time="2025-09-12T17:42:05.538297096Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 12 17:42:05.538409 containerd[2001]: time="2025-09-12T17:42:05.538366606Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 12 17:42:05.538409 containerd[2001]: time="2025-09-12T17:42:05.538381673Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 12 17:42:05.539562 containerd[2001]: time="2025-09-12T17:42:05.538691906Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 12 17:42:05.539562 containerd[2001]: time="2025-09-12T17:42:05.538716443Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 12 17:42:05.539562 containerd[2001]: time="2025-09-12T17:42:05.538733433Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 12 17:42:05.539562 containerd[2001]: time="2025-09-12T17:42:05.538745434Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Sep 12 17:42:05.539562 containerd[2001]: time="2025-09-12T17:42:05.538846438Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Sep 12 17:42:05.539562 containerd[2001]: time="2025-09-12T17:42:05.539108808Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 12 17:42:05.539562 containerd[2001]: time="2025-09-12T17:42:05.539148829Z" level=info msg="skip loading plugin" error="lstat 
/var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 12 17:42:05.539562 containerd[2001]: time="2025-09-12T17:42:05.539164090Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Sep 12 17:42:05.539562 containerd[2001]: time="2025-09-12T17:42:05.539198033Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Sep 12 17:42:05.539562 containerd[2001]: time="2025-09-12T17:42:05.539510744Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Sep 12 17:42:05.540328 containerd[2001]: time="2025-09-12T17:42:05.539587659Z" level=info msg="metadata content store policy set" policy=shared Sep 12 17:42:05.545605 containerd[2001]: time="2025-09-12T17:42:05.545306462Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Sep 12 17:42:05.545605 containerd[2001]: time="2025-09-12T17:42:05.545396594Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Sep 12 17:42:05.545605 containerd[2001]: time="2025-09-12T17:42:05.545420127Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Sep 12 17:42:05.545605 containerd[2001]: time="2025-09-12T17:42:05.545476556Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Sep 12 17:42:05.545605 containerd[2001]: time="2025-09-12T17:42:05.545505236Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Sep 12 17:42:05.545605 containerd[2001]: time="2025-09-12T17:42:05.545522023Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Sep 12 17:42:05.545605 containerd[2001]: time="2025-09-12T17:42:05.545541647Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Sep 12 17:42:05.545605 containerd[2001]: time="2025-09-12T17:42:05.545558639Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Sep 12 17:42:05.545605 containerd[2001]: time="2025-09-12T17:42:05.545574163Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Sep 12 17:42:05.545605 containerd[2001]: time="2025-09-12T17:42:05.545591524Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Sep 12 17:42:05.545605 containerd[2001]: time="2025-09-12T17:42:05.545605240Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Sep 12 17:42:05.545605 containerd[2001]: time="2025-09-12T17:42:05.545620789Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Sep 12 17:42:05.547537 containerd[2001]: time="2025-09-12T17:42:05.545783980Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Sep 12 17:42:05.553379 containerd[2001]: time="2025-09-12T17:42:05.552081703Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Sep 12 17:42:05.553379 containerd[2001]: time="2025-09-12T17:42:05.552183387Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Sep 12 
17:42:05.553379 containerd[2001]: time="2025-09-12T17:42:05.552203732Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 12 17:42:05.553379 containerd[2001]: time="2025-09-12T17:42:05.552228768Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 12 17:42:05.553379 containerd[2001]: time="2025-09-12T17:42:05.552250392Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Sep 12 17:42:05.553379 containerd[2001]: time="2025-09-12T17:42:05.552282594Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 12 17:42:05.553379 containerd[2001]: time="2025-09-12T17:42:05.552299090Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 12 17:42:05.553379 containerd[2001]: time="2025-09-12T17:42:05.552323554Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 12 17:42:05.553379 containerd[2001]: time="2025-09-12T17:42:05.552347220Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 12 17:42:05.553379 containerd[2001]: time="2025-09-12T17:42:05.552378331Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 12 17:42:05.553379 containerd[2001]: time="2025-09-12T17:42:05.552483317Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 12 17:42:05.553379 containerd[2001]: time="2025-09-12T17:42:05.552509772Z" level=info msg="Start snapshots syncer" Sep 12 17:42:05.553379 containerd[2001]: time="2025-09-12T17:42:05.552548710Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 12 17:42:05.555724 ntpd[1974]: bind(24) AF_INET6 fe80::4d9:f7ff:fe5c:5e5%2#123 flags 0x11 failed: Cannot assign requested address Sep 12 17:42:05.557514 containerd[2001]: time="2025-09-12T17:42:05.556146369Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 12 17:42:05.557514 containerd[2001]: time="2025-09-12T17:42:05.556238680Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 12 17:42:05.557733 ntpd[1974]: 12 Sep 17:42:05 ntpd[1974]: bind(24) AF_INET6 fe80::4d9:f7ff:fe5c:5e5%2#123 flags 0x11 failed: Cannot assign requested address Sep 12 17:42:05.557733 ntpd[1974]: 12 Sep 17:42:05 ntpd[1974]: unable to create socket on eth0 (6) for fe80::4d9:f7ff:fe5c:5e5%2#123 Sep 12 17:42:05.557733 ntpd[1974]: 12 Sep 17:42:05 ntpd[1974]: failed to init interface for address fe80::4d9:f7ff:fe5c:5e5%2 Sep 12 17:42:05.555763 ntpd[1974]: unable to create socket on eth0 (6) for fe80::4d9:f7ff:fe5c:5e5%2#123 Sep 12 17:42:05.557883 containerd[2001]: time="2025-09-12T17:42:05.556381781Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Sep 12 17:42:05.557883 containerd[2001]: time="2025-09-12T17:42:05.556566362Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Sep 12 17:42:05.557883 containerd[2001]: time="2025-09-12T17:42:05.556607126Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 12 17:42:05.557883 containerd[2001]: time="2025-09-12T17:42:05.556634238Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 12 17:42:05.557883 containerd[2001]: time="2025-09-12T17:42:05.556655743Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 12 17:42:05.557883 containerd[2001]: time="2025-09-12T17:42:05.556690659Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 12 17:42:05.557883 containerd[2001]: 
time="2025-09-12T17:42:05.556722122Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 12 17:42:05.557883 containerd[2001]: time="2025-09-12T17:42:05.556740635Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 12 17:42:05.557883 containerd[2001]: time="2025-09-12T17:42:05.556781136Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 12 17:42:05.557883 containerd[2001]: time="2025-09-12T17:42:05.556802570Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 12 17:42:05.557883 containerd[2001]: time="2025-09-12T17:42:05.556826432Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 12 17:42:05.557883 containerd[2001]: time="2025-09-12T17:42:05.556890698Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 12 17:42:05.557883 containerd[2001]: time="2025-09-12T17:42:05.556919180Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 12 17:42:05.557883 containerd[2001]: time="2025-09-12T17:42:05.556939839Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 12 17:42:05.555778 ntpd[1974]: failed to init interface for address fe80::4d9:f7ff:fe5c:5e5%2 Sep 12 17:42:05.555907 polkitd[2163]: Started polkitd version 126 Sep 12 17:42:05.560073 containerd[2001]: time="2025-09-12T17:42:05.559079662Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 12 17:42:05.560073 containerd[2001]: time="2025-09-12T17:42:05.559396765Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 12 17:42:05.560073 containerd[2001]: time="2025-09-12T17:42:05.559435061Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 12 17:42:05.560073 containerd[2001]: time="2025-09-12T17:42:05.559454688Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 12 17:42:05.560073 containerd[2001]: time="2025-09-12T17:42:05.559485569Z" level=info msg="runtime interface created" Sep 12 17:42:05.560073 containerd[2001]: time="2025-09-12T17:42:05.559495365Z" level=info msg="created NRI interface" Sep 12 17:42:05.560073 containerd[2001]: time="2025-09-12T17:42:05.559514956Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 12 17:42:05.560073 containerd[2001]: time="2025-09-12T17:42:05.559539158Z" level=info msg="Connect containerd service" Sep 12 17:42:05.560073 containerd[2001]: time="2025-09-12T17:42:05.559601569Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 12 17:42:05.563623 containerd[2001]: time="2025-09-12T17:42:05.563508130Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 12 17:42:05.568950 polkitd[2163]: Loading rules from directory /etc/polkit-1/rules.d Sep 12 17:42:05.571280 polkitd[2163]: Loading rules from 
directory /run/polkit-1/rules.d Sep 12 17:42:05.571980 polkitd[2163]: Error opening rules directory: Error opening directory “/run/polkit-1/rules.d”: No such file or directory (g-file-error-quark, 4) Sep 12 17:42:05.572528 polkitd[2163]: Loading rules from directory /usr/local/share/polkit-1/rules.d Sep 12 17:42:05.572562 polkitd[2163]: Error opening rules directory: Error opening directory “/usr/local/share/polkit-1/rules.d”: No such file or directory (g-file-error-quark, 4) Sep 12 17:42:05.572613 polkitd[2163]: Loading rules from directory /usr/share/polkit-1/rules.d Sep 12 17:42:05.575276 polkitd[2163]: Finished loading, compiling and executing 2 rules Sep 12 17:42:05.576108 systemd[1]: Started polkit.service - Authorization Manager. Sep 12 17:42:05.579181 dbus-daemon[1968]: [system] Successfully activated service 'org.freedesktop.PolicyKit1' Sep 12 17:42:05.581020 polkitd[2163]: Acquired the name org.freedesktop.PolicyKit1 on the system bus Sep 12 17:42:05.619850 systemd-hostnamed[2034]: Hostname set to (transient) Sep 12 17:42:05.620375 systemd-resolved[1864]: System hostname changed to 'ip-172-31-17-147'. Sep 12 17:42:05.757988 tar[1996]: linux-amd64/LICENSE Sep 12 17:42:05.757988 tar[1996]: linux-amd64/README.md Sep 12 17:42:05.777923 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 12 17:42:05.815401 sshd_keygen[2026]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 12 17:42:05.842305 systemd-networkd[1863]: eth0: Gained IPv6LL Sep 12 17:42:05.846126 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 12 17:42:05.850735 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 12 17:42:05.854450 systemd[1]: Reached target network-online.target - Network is Online. Sep 12 17:42:05.858155 systemd[1]: Started amazon-ssm-agent.service - amazon-ssm-agent. Sep 12 17:42:05.864648 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 12 17:42:05.868786 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:42:05.873899 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 12 17:42:05.878300 systemd[1]: Started sshd@0-172.31.17.147:22-139.178.68.195:46370.service - OpenSSH per-connection server daemon (139.178.68.195:46370). Sep 12 17:42:05.885581 containerd[2001]: time="2025-09-12T17:42:05.885087846Z" level=info msg="Start subscribing containerd event" Sep 12 17:42:05.885581 containerd[2001]: time="2025-09-12T17:42:05.885167225Z" level=info msg="Start recovering state" Sep 12 17:42:05.885581 containerd[2001]: time="2025-09-12T17:42:05.885305831Z" level=info msg="Start event monitor" Sep 12 17:42:05.885581 containerd[2001]: time="2025-09-12T17:42:05.885323633Z" level=info msg="Start cni network conf syncer for default" Sep 12 17:42:05.885581 containerd[2001]: time="2025-09-12T17:42:05.885337108Z" level=info msg="Start streaming server" Sep 12 17:42:05.885581 containerd[2001]: time="2025-09-12T17:42:05.885349923Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 12 17:42:05.885581 containerd[2001]: time="2025-09-12T17:42:05.885360451Z" level=info msg="runtime interface starting up..." Sep 12 17:42:05.885581 containerd[2001]: time="2025-09-12T17:42:05.885369853Z" level=info msg="starting plugins..." 
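
[Note] The earlier containerd error "no network config found in /etc/cni/net.d: cni plugin not initialized" only means that directory is still empty at this point; on a real node the chosen CNI add-on installs its own configuration there later. Purely as an illustration of the kind of file containerd is looking for, the sketch below writes a minimal bridge conflist. The file name, network name, subnet, and plugin choice are all hypothetical examples, not what this system ends up using.

import json, pathlib

# Hypothetical example only: real clusters let their CNI add-on install this file.
conflist = {
    "cniVersion": "1.0.0",
    "name": "example-bridge",
    "plugins": [
        {
            "type": "bridge",
            "bridge": "cni0",
            "isGateway": True,
            "ipMasq": True,
            "ipam": {
                "type": "host-local",
                "ranges": [[{"subnet": "10.88.0.0/16"}]],
                "routes": [{"dst": "0.0.0.0/0"}],
            },
        }
    ],
}

path = pathlib.Path("/etc/cni/net.d/10-example.conflist")
path.parent.mkdir(parents=True, exist_ok=True)
path.write_text(json.dumps(conflist, indent=2))
print(f"wrote {path}")
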
Sep 12 17:42:05.885581 containerd[2001]: time="2025-09-12T17:42:05.885388375Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 12 17:42:05.886343 containerd[2001]: time="2025-09-12T17:42:05.886309693Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 12 17:42:05.886442 containerd[2001]: time="2025-09-12T17:42:05.886379860Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 12 17:42:05.886661 systemd[1]: Started containerd.service - containerd container runtime. Sep 12 17:42:05.891835 containerd[2001]: time="2025-09-12T17:42:05.891797836Z" level=info msg="containerd successfully booted in 0.411394s" Sep 12 17:42:05.910408 systemd[1]: issuegen.service: Deactivated successfully. Sep 12 17:42:05.910724 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 12 17:42:05.920412 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 12 17:42:05.957824 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 12 17:42:05.975934 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 12 17:42:05.980311 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 12 17:42:05.989751 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Sep 12 17:42:05.990736 systemd[1]: Reached target getty.target - Login Prompts. Sep 12 17:42:06.037832 amazon-ssm-agent[2206]: Initializing new seelog logger Sep 12 17:42:06.038189 amazon-ssm-agent[2206]: New Seelog Logger Creation Complete Sep 12 17:42:06.038189 amazon-ssm-agent[2206]: 2025/09/12 17:42:06 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Sep 12 17:42:06.038189 amazon-ssm-agent[2206]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Sep 12 17:42:06.038321 amazon-ssm-agent[2206]: 2025/09/12 17:42:06 processing appconfig overrides Sep 12 17:42:06.039059 amazon-ssm-agent[2206]: 2025/09/12 17:42:06 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Sep 12 17:42:06.039059 amazon-ssm-agent[2206]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Sep 12 17:42:06.039148 amazon-ssm-agent[2206]: 2025/09/12 17:42:06 processing appconfig overrides Sep 12 17:42:06.039423 amazon-ssm-agent[2206]: 2025/09/12 17:42:06 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Sep 12 17:42:06.039423 amazon-ssm-agent[2206]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Sep 12 17:42:06.039423 amazon-ssm-agent[2206]: 2025/09/12 17:42:06 processing appconfig overrides Sep 12 17:42:06.039756 amazon-ssm-agent[2206]: 2025-09-12 17:42:06.0385 INFO Proxy environment variables: Sep 12 17:42:06.042517 amazon-ssm-agent[2206]: 2025/09/12 17:42:06 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Sep 12 17:42:06.042517 amazon-ssm-agent[2206]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Sep 12 17:42:06.042652 amazon-ssm-agent[2206]: 2025/09/12 17:42:06 processing appconfig overrides Sep 12 17:42:06.140105 amazon-ssm-agent[2206]: 2025-09-12 17:42:06.0390 INFO http_proxy: Sep 12 17:42:06.179059 sshd[2210]: Accepted publickey for core from 139.178.68.195 port 46370 ssh2: RSA SHA256:W4YS5mF+BRASipQdoc83aYiclsJC9j/B8FPkaJHVZC4 Sep 12 17:42:06.182864 sshd-session[2210]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:42:06.199114 systemd[1]: Created slice user-500.slice - User Slice of UID 500. 
Sep 12 17:42:06.202779 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 12 17:42:06.226310 systemd-logind[1987]: New session 1 of user core. Sep 12 17:42:06.241122 amazon-ssm-agent[2206]: 2025-09-12 17:42:06.0390 INFO no_proxy: Sep 12 17:42:06.245159 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 12 17:42:06.254029 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 12 17:42:06.271995 (systemd)[2237]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 12 17:42:06.277309 systemd-logind[1987]: New session c1 of user core. Sep 12 17:42:06.339468 amazon-ssm-agent[2206]: 2025-09-12 17:42:06.0390 INFO https_proxy: Sep 12 17:42:06.437723 amazon-ssm-agent[2206]: 2025-09-12 17:42:06.0391 INFO Checking if agent identity type OnPrem can be assumed Sep 12 17:42:06.534342 systemd[2237]: Queued start job for default target default.target. Sep 12 17:42:06.537249 amazon-ssm-agent[2206]: 2025-09-12 17:42:06.0392 INFO Checking if agent identity type EC2 can be assumed Sep 12 17:42:06.540722 systemd[2237]: Created slice app.slice - User Application Slice. Sep 12 17:42:06.540771 systemd[2237]: Reached target paths.target - Paths. Sep 12 17:42:06.541266 systemd[2237]: Reached target timers.target - Timers. Sep 12 17:42:06.544080 systemd[2237]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 12 17:42:06.565549 systemd[2237]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 12 17:42:06.565712 systemd[2237]: Reached target sockets.target - Sockets. Sep 12 17:42:06.565912 systemd[2237]: Reached target basic.target - Basic System. Sep 12 17:42:06.566051 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 12 17:42:06.566729 systemd[2237]: Reached target default.target - Main User Target. Sep 12 17:42:06.566781 systemd[2237]: Startup finished in 278ms. Sep 12 17:42:06.573187 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 12 17:42:06.635673 amazon-ssm-agent[2206]: 2025-09-12 17:42:06.0848 INFO Agent will take identity from EC2 Sep 12 17:42:06.663422 amazon-ssm-agent[2206]: 2025/09/12 17:42:06 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Sep 12 17:42:06.663422 amazon-ssm-agent[2206]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Sep 12 17:42:06.663568 amazon-ssm-agent[2206]: 2025/09/12 17:42:06 processing appconfig overrides Sep 12 17:42:06.695029 amazon-ssm-agent[2206]: 2025-09-12 17:42:06.0862 INFO [amazon-ssm-agent] amazon-ssm-agent - v3.3.0.0 Sep 12 17:42:06.695029 amazon-ssm-agent[2206]: 2025-09-12 17:42:06.0862 INFO [amazon-ssm-agent] OS: linux, Arch: amd64 Sep 12 17:42:06.695343 amazon-ssm-agent[2206]: 2025-09-12 17:42:06.0862 INFO [amazon-ssm-agent] Starting Core Agent Sep 12 17:42:06.695343 amazon-ssm-agent[2206]: 2025-09-12 17:42:06.0863 INFO [amazon-ssm-agent] Registrar detected. 
Attempting registration Sep 12 17:42:06.695343 amazon-ssm-agent[2206]: 2025-09-12 17:42:06.0863 INFO [Registrar] Starting registrar module Sep 12 17:42:06.695343 amazon-ssm-agent[2206]: 2025-09-12 17:42:06.0877 INFO [EC2Identity] Checking disk for registration info Sep 12 17:42:06.695343 amazon-ssm-agent[2206]: 2025-09-12 17:42:06.0878 INFO [EC2Identity] No registration info found for ec2 instance, attempting registration Sep 12 17:42:06.695343 amazon-ssm-agent[2206]: 2025-09-12 17:42:06.0878 INFO [EC2Identity] Generating registration keypair Sep 12 17:42:06.695343 amazon-ssm-agent[2206]: 2025-09-12 17:42:06.6118 INFO [EC2Identity] Checking write access before registering Sep 12 17:42:06.695343 amazon-ssm-agent[2206]: 2025-09-12 17:42:06.6122 INFO [EC2Identity] Registering EC2 instance with Systems Manager Sep 12 17:42:06.695343 amazon-ssm-agent[2206]: 2025-09-12 17:42:06.6632 INFO [EC2Identity] EC2 registration was successful. Sep 12 17:42:06.695343 amazon-ssm-agent[2206]: 2025-09-12 17:42:06.6632 INFO [amazon-ssm-agent] Registration attempted. Resuming core agent startup. Sep 12 17:42:06.695343 amazon-ssm-agent[2206]: 2025-09-12 17:42:06.6633 INFO [CredentialRefresher] credentialRefresher has started Sep 12 17:42:06.695343 amazon-ssm-agent[2206]: 2025-09-12 17:42:06.6633 INFO [CredentialRefresher] Starting credentials refresher loop Sep 12 17:42:06.695761 amazon-ssm-agent[2206]: 2025-09-12 17:42:06.6945 INFO EC2RoleProvider Successfully connected with instance profile role credentials Sep 12 17:42:06.695761 amazon-ssm-agent[2206]: 2025-09-12 17:42:06.6948 INFO [CredentialRefresher] Credentials ready Sep 12 17:42:06.725102 systemd[1]: Started sshd@1-172.31.17.147:22-139.178.68.195:46378.service - OpenSSH per-connection server daemon (139.178.68.195:46378). Sep 12 17:42:06.735316 amazon-ssm-agent[2206]: 2025-09-12 17:42:06.6954 INFO [CredentialRefresher] Next credential rotation will be in 29.99998488416667 minutes Sep 12 17:42:06.900837 sshd[2249]: Accepted publickey for core from 139.178.68.195 port 46378 ssh2: RSA SHA256:W4YS5mF+BRASipQdoc83aYiclsJC9j/B8FPkaJHVZC4 Sep 12 17:42:06.902654 sshd-session[2249]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:42:06.911318 systemd-logind[1987]: New session 2 of user core. Sep 12 17:42:06.917317 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 12 17:42:07.036931 sshd[2252]: Connection closed by 139.178.68.195 port 46378 Sep 12 17:42:07.038219 sshd-session[2249]: pam_unix(sshd:session): session closed for user core Sep 12 17:42:07.042150 systemd[1]: sshd@1-172.31.17.147:22-139.178.68.195:46378.service: Deactivated successfully. Sep 12 17:42:07.043890 systemd[1]: session-2.scope: Deactivated successfully. Sep 12 17:42:07.045023 systemd-logind[1987]: Session 2 logged out. Waiting for processes to exit. Sep 12 17:42:07.046860 systemd-logind[1987]: Removed session 2. Sep 12 17:42:07.067151 systemd[1]: Started sshd@2-172.31.17.147:22-139.178.68.195:46382.service - OpenSSH per-connection server daemon (139.178.68.195:46382). Sep 12 17:42:07.238355 sshd[2258]: Accepted publickey for core from 139.178.68.195 port 46382 ssh2: RSA SHA256:W4YS5mF+BRASipQdoc83aYiclsJC9j/B8FPkaJHVZC4 Sep 12 17:42:07.239705 sshd-session[2258]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:42:07.244449 systemd-logind[1987]: New session 3 of user core. Sep 12 17:42:07.251178 systemd[1]: Started session-3.scope - Session 3 of User core. 
Sep 12 17:42:07.369922 sshd[2261]: Connection closed by 139.178.68.195 port 46382 Sep 12 17:42:07.370072 sshd-session[2258]: pam_unix(sshd:session): session closed for user core Sep 12 17:42:07.373524 systemd[1]: sshd@2-172.31.17.147:22-139.178.68.195:46382.service: Deactivated successfully. Sep 12 17:42:07.375635 systemd[1]: session-3.scope: Deactivated successfully. Sep 12 17:42:07.378062 systemd-logind[1987]: Session 3 logged out. Waiting for processes to exit. Sep 12 17:42:07.379275 systemd-logind[1987]: Removed session 3. Sep 12 17:42:07.711409 amazon-ssm-agent[2206]: 2025-09-12 17:42:07.7112 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] [WorkerProvider] Worker ssm-agent-worker is not running, starting worker process Sep 12 17:42:07.812190 amazon-ssm-agent[2206]: 2025-09-12 17:42:07.7277 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] [WorkerProvider] Worker ssm-agent-worker (pid:2268) started Sep 12 17:42:07.912328 amazon-ssm-agent[2206]: 2025-09-12 17:42:07.7278 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] Monitor long running worker health every 60 seconds Sep 12 17:42:08.555089 ntpd[1974]: Listen normally on 7 eth0 [fe80::4d9:f7ff:fe5c:5e5%2]:123 Sep 12 17:42:08.555507 ntpd[1974]: 12 Sep 17:42:08 ntpd[1974]: Listen normally on 7 eth0 [fe80::4d9:f7ff:fe5c:5e5%2]:123 Sep 12 17:42:08.773194 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:42:08.774172 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 12 17:42:08.775659 systemd[1]: Startup finished in 2.726s (kernel) + 7.376s (initrd) + 10.811s (userspace) = 20.914s. Sep 12 17:42:08.784821 (kubelet)[2285]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 17:42:10.208301 kubelet[2285]: E0912 17:42:10.208237 2285 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 17:42:10.211201 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 17:42:10.211402 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 17:42:10.212086 systemd[1]: kubelet.service: Consumed 1.087s CPU time, 265M memory peak. Sep 12 17:42:12.130208 systemd-resolved[1864]: Clock change detected. Flushing caches. Sep 12 17:42:17.979680 systemd[1]: Started sshd@3-172.31.17.147:22-139.178.68.195:56248.service - OpenSSH per-connection server daemon (139.178.68.195:56248). Sep 12 17:42:18.154501 sshd[2298]: Accepted publickey for core from 139.178.68.195 port 56248 ssh2: RSA SHA256:W4YS5mF+BRASipQdoc83aYiclsJC9j/B8FPkaJHVZC4 Sep 12 17:42:18.156213 sshd-session[2298]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:42:18.162318 systemd-logind[1987]: New session 4 of user core. Sep 12 17:42:18.167996 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 12 17:42:18.286045 sshd[2301]: Connection closed by 139.178.68.195 port 56248 Sep 12 17:42:18.286739 sshd-session[2298]: pam_unix(sshd:session): session closed for user core Sep 12 17:42:18.290916 systemd[1]: sshd@3-172.31.17.147:22-139.178.68.195:56248.service: Deactivated successfully. Sep 12 17:42:18.293015 systemd[1]: session-4.scope: Deactivated successfully. 
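
[Note] The kubelet failure above is expected at this stage: it exits because /var/lib/kubelet/config.yaml does not exist yet, and on kubeadm-managed nodes that file is only written when the node is initialized or joined, after which the scheduled restarts succeed. A small sketch that reproduces just the existence check; the path is taken verbatim from the error, everything else is illustrative.

import pathlib
import sys

# Path taken verbatim from the kubelet error above; on kubeadm-managed nodes it is
# created by `kubeadm init` / `kubeadm join`.
CONFIG = pathlib.Path("/var/lib/kubelet/config.yaml")

if not CONFIG.is_file():
    # Same situation the log shows: the unit keeps failing and systemd restarts it
    # until the file eventually appears.
    sys.exit(f"kubelet config missing: {CONFIG} (node not yet joined?)")

print(f"{CONFIG} present, {CONFIG.stat().st_size} bytes")
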
Sep 12 17:42:18.294158 systemd-logind[1987]: Session 4 logged out. Waiting for processes to exit. Sep 12 17:42:18.295319 systemd-logind[1987]: Removed session 4. Sep 12 17:42:18.326159 systemd[1]: Started sshd@4-172.31.17.147:22-139.178.68.195:56252.service - OpenSSH per-connection server daemon (139.178.68.195:56252). Sep 12 17:42:18.501488 sshd[2307]: Accepted publickey for core from 139.178.68.195 port 56252 ssh2: RSA SHA256:W4YS5mF+BRASipQdoc83aYiclsJC9j/B8FPkaJHVZC4 Sep 12 17:42:18.502820 sshd-session[2307]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:42:18.508839 systemd-logind[1987]: New session 5 of user core. Sep 12 17:42:18.516045 systemd[1]: Started session-5.scope - Session 5 of User core. Sep 12 17:42:18.629433 sshd[2310]: Connection closed by 139.178.68.195 port 56252 Sep 12 17:42:18.630460 sshd-session[2307]: pam_unix(sshd:session): session closed for user core Sep 12 17:42:18.635064 systemd[1]: sshd@4-172.31.17.147:22-139.178.68.195:56252.service: Deactivated successfully. Sep 12 17:42:18.637151 systemd[1]: session-5.scope: Deactivated successfully. Sep 12 17:42:18.638348 systemd-logind[1987]: Session 5 logged out. Waiting for processes to exit. Sep 12 17:42:18.640067 systemd-logind[1987]: Removed session 5. Sep 12 17:42:18.661633 systemd[1]: Started sshd@5-172.31.17.147:22-139.178.68.195:56268.service - OpenSSH per-connection server daemon (139.178.68.195:56268). Sep 12 17:42:18.840559 sshd[2316]: Accepted publickey for core from 139.178.68.195 port 56268 ssh2: RSA SHA256:W4YS5mF+BRASipQdoc83aYiclsJC9j/B8FPkaJHVZC4 Sep 12 17:42:18.841981 sshd-session[2316]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:42:18.849585 systemd-logind[1987]: New session 6 of user core. Sep 12 17:42:18.856021 systemd[1]: Started session-6.scope - Session 6 of User core. Sep 12 17:42:18.972635 sshd[2319]: Connection closed by 139.178.68.195 port 56268 Sep 12 17:42:18.973506 sshd-session[2316]: pam_unix(sshd:session): session closed for user core Sep 12 17:42:18.977476 systemd[1]: sshd@5-172.31.17.147:22-139.178.68.195:56268.service: Deactivated successfully. Sep 12 17:42:18.979376 systemd[1]: session-6.scope: Deactivated successfully. Sep 12 17:42:18.980542 systemd-logind[1987]: Session 6 logged out. Waiting for processes to exit. Sep 12 17:42:18.982133 systemd-logind[1987]: Removed session 6. Sep 12 17:42:19.007302 systemd[1]: Started sshd@6-172.31.17.147:22-139.178.68.195:56280.service - OpenSSH per-connection server daemon (139.178.68.195:56280). Sep 12 17:42:19.183852 sshd[2325]: Accepted publickey for core from 139.178.68.195 port 56280 ssh2: RSA SHA256:W4YS5mF+BRASipQdoc83aYiclsJC9j/B8FPkaJHVZC4 Sep 12 17:42:19.185233 sshd-session[2325]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:42:19.190846 systemd-logind[1987]: New session 7 of user core. Sep 12 17:42:19.198020 systemd[1]: Started session-7.scope - Session 7 of User core. 
Sep 12 17:42:19.341053 sudo[2329]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 12 17:42:19.341423 sudo[2329]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 17:42:19.357198 sudo[2329]: pam_unix(sudo:session): session closed for user root Sep 12 17:42:19.379975 sshd[2328]: Connection closed by 139.178.68.195 port 56280 Sep 12 17:42:19.380665 sshd-session[2325]: pam_unix(sshd:session): session closed for user core Sep 12 17:42:19.385128 systemd[1]: sshd@6-172.31.17.147:22-139.178.68.195:56280.service: Deactivated successfully. Sep 12 17:42:19.387080 systemd[1]: session-7.scope: Deactivated successfully. Sep 12 17:42:19.388259 systemd-logind[1987]: Session 7 logged out. Waiting for processes to exit. Sep 12 17:42:19.389924 systemd-logind[1987]: Removed session 7. Sep 12 17:42:19.415716 systemd[1]: Started sshd@7-172.31.17.147:22-139.178.68.195:56288.service - OpenSSH per-connection server daemon (139.178.68.195:56288). Sep 12 17:42:19.593590 sshd[2335]: Accepted publickey for core from 139.178.68.195 port 56288 ssh2: RSA SHA256:W4YS5mF+BRASipQdoc83aYiclsJC9j/B8FPkaJHVZC4 Sep 12 17:42:19.594873 sshd-session[2335]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:42:19.600479 systemd-logind[1987]: New session 8 of user core. Sep 12 17:42:19.608046 systemd[1]: Started session-8.scope - Session 8 of User core. Sep 12 17:42:19.707382 sudo[2340]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 12 17:42:19.707693 sudo[2340]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 17:42:19.712731 sudo[2340]: pam_unix(sudo:session): session closed for user root Sep 12 17:42:19.718543 sudo[2339]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Sep 12 17:42:19.719025 sudo[2339]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 17:42:19.730203 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 12 17:42:19.772580 augenrules[2362]: No rules Sep 12 17:42:19.773794 systemd[1]: audit-rules.service: Deactivated successfully. Sep 12 17:42:19.774018 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 12 17:42:19.775630 sudo[2339]: pam_unix(sudo:session): session closed for user root Sep 12 17:42:19.798460 sshd[2338]: Connection closed by 139.178.68.195 port 56288 Sep 12 17:42:19.798990 sshd-session[2335]: pam_unix(sshd:session): session closed for user core Sep 12 17:42:19.803053 systemd[1]: sshd@7-172.31.17.147:22-139.178.68.195:56288.service: Deactivated successfully. Sep 12 17:42:19.804917 systemd[1]: session-8.scope: Deactivated successfully. Sep 12 17:42:19.806180 systemd-logind[1987]: Session 8 logged out. Waiting for processes to exit. Sep 12 17:42:19.807728 systemd-logind[1987]: Removed session 8. Sep 12 17:42:19.831528 systemd[1]: Started sshd@8-172.31.17.147:22-139.178.68.195:56304.service - OpenSSH per-connection server daemon (139.178.68.195:56304). Sep 12 17:42:19.999185 sshd[2371]: Accepted publickey for core from 139.178.68.195 port 56304 ssh2: RSA SHA256:W4YS5mF+BRASipQdoc83aYiclsJC9j/B8FPkaJHVZC4 Sep 12 17:42:20.000217 sshd-session[2371]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:42:20.008968 systemd-logind[1987]: New session 9 of user core. Sep 12 17:42:20.019009 systemd[1]: Started session-9.scope - Session 9 of User core. 
Sep 12 17:42:20.112932 sudo[2375]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 12 17:42:20.113299 sudo[2375]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 17:42:20.717092 systemd[1]: Starting docker.service - Docker Application Container Engine... Sep 12 17:42:20.745251 (dockerd)[2394]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 12 17:42:21.036674 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 12 17:42:21.040884 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:42:21.292107 dockerd[2394]: time="2025-09-12T17:42:21.291737469Z" level=info msg="Starting up" Sep 12 17:42:21.294610 dockerd[2394]: time="2025-09-12T17:42:21.294370546Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Sep 12 17:42:21.309750 dockerd[2394]: time="2025-09-12T17:42:21.309704430Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Sep 12 17:42:21.388507 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport3642557551-merged.mount: Deactivated successfully. Sep 12 17:42:21.391079 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:42:21.405244 (kubelet)[2420]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 17:42:21.422419 systemd[1]: var-lib-docker-metacopy\x2dcheck2364336695-merged.mount: Deactivated successfully. Sep 12 17:42:21.460932 kubelet[2420]: E0912 17:42:21.460838 2420 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 17:42:21.463063 dockerd[2394]: time="2025-09-12T17:42:21.462822521Z" level=info msg="Loading containers: start." Sep 12 17:42:21.465980 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 17:42:21.466169 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 17:42:21.466596 systemd[1]: kubelet.service: Consumed 187ms CPU time, 110.2M memory peak. Sep 12 17:42:21.480821 kernel: Initializing XFRM netlink socket Sep 12 17:42:21.773409 (udev-worker)[2429]: Network interface NamePolicy= disabled on kernel command line. Sep 12 17:42:21.827124 systemd-networkd[1863]: docker0: Link UP Sep 12 17:42:21.832836 dockerd[2394]: time="2025-09-12T17:42:21.832784389Z" level=info msg="Loading containers: done." 
Sep 12 17:42:21.851594 dockerd[2394]: time="2025-09-12T17:42:21.851487590Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 12 17:42:21.851594 dockerd[2394]: time="2025-09-12T17:42:21.851598574Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Sep 12 17:42:21.851823 dockerd[2394]: time="2025-09-12T17:42:21.851683593Z" level=info msg="Initializing buildkit" Sep 12 17:42:21.878075 dockerd[2394]: time="2025-09-12T17:42:21.878036216Z" level=info msg="Completed buildkit initialization" Sep 12 17:42:21.884011 dockerd[2394]: time="2025-09-12T17:42:21.883943041Z" level=info msg="Daemon has completed initialization" Sep 12 17:42:21.884555 dockerd[2394]: time="2025-09-12T17:42:21.884153632Z" level=info msg="API listen on /run/docker.sock" Sep 12 17:42:21.885865 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 12 17:42:23.224123 containerd[2001]: time="2025-09-12T17:42:23.224074315Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.13\"" Sep 12 17:42:23.738936 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4142500936.mount: Deactivated successfully. Sep 12 17:42:24.966492 containerd[2001]: time="2025-09-12T17:42:24.966423780Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:42:24.967657 containerd[2001]: time="2025-09-12T17:42:24.967347733Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.13: active requests=0, bytes read=28117124" Sep 12 17:42:24.968588 containerd[2001]: time="2025-09-12T17:42:24.968546642Z" level=info msg="ImageCreate event name:\"sha256:368da3301bb03f4bef9f7dc2084f5fc5954b0ac1bf1e49ca502e3a7604011e54\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:42:24.971620 containerd[2001]: time="2025-09-12T17:42:24.971581523Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:9abeb8a2d3e53e356d1f2e5d5dc2081cf28f23242651b0552c9e38f4a7ae960e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:42:24.972791 containerd[2001]: time="2025-09-12T17:42:24.972743467Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.13\" with image id \"sha256:368da3301bb03f4bef9f7dc2084f5fc5954b0ac1bf1e49ca502e3a7604011e54\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.13\", repo digest \"registry.k8s.io/kube-apiserver@sha256:9abeb8a2d3e53e356d1f2e5d5dc2081cf28f23242651b0552c9e38f4a7ae960e\", size \"28113723\" in 1.748615147s" Sep 12 17:42:24.972932 containerd[2001]: time="2025-09-12T17:42:24.972910437Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.13\" returns image reference \"sha256:368da3301bb03f4bef9f7dc2084f5fc5954b0ac1bf1e49ca502e3a7604011e54\"" Sep 12 17:42:24.973794 containerd[2001]: time="2025-09-12T17:42:24.973756720Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.13\"" Sep 12 17:42:26.377514 containerd[2001]: time="2025-09-12T17:42:26.377461126Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:42:26.379134 containerd[2001]: time="2025-09-12T17:42:26.378503884Z" level=info msg="stop pulling image 
registry.k8s.io/kube-controller-manager:v1.31.13: active requests=0, bytes read=24716632" Sep 12 17:42:26.380414 containerd[2001]: time="2025-09-12T17:42:26.380381472Z" level=info msg="ImageCreate event name:\"sha256:cbd19105c6bcbedf394f51c8bb963def5195c300fc7d04bc39d48d14d23c0ff0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:42:26.383830 containerd[2001]: time="2025-09-12T17:42:26.383789018Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:facc91288697a288a691520949fe4eec40059ef065c89da8e10481d14e131b09\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:42:26.384981 containerd[2001]: time="2025-09-12T17:42:26.384574305Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.13\" with image id \"sha256:cbd19105c6bcbedf394f51c8bb963def5195c300fc7d04bc39d48d14d23c0ff0\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.13\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:facc91288697a288a691520949fe4eec40059ef065c89da8e10481d14e131b09\", size \"26351311\" in 1.410772674s" Sep 12 17:42:26.384981 containerd[2001]: time="2025-09-12T17:42:26.384607197Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.13\" returns image reference \"sha256:cbd19105c6bcbedf394f51c8bb963def5195c300fc7d04bc39d48d14d23c0ff0\"" Sep 12 17:42:26.385296 containerd[2001]: time="2025-09-12T17:42:26.385269301Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.13\"" Sep 12 17:42:27.595879 containerd[2001]: time="2025-09-12T17:42:27.595817844Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:42:27.597055 containerd[2001]: time="2025-09-12T17:42:27.596820499Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.13: active requests=0, bytes read=18787698" Sep 12 17:42:27.598152 containerd[2001]: time="2025-09-12T17:42:27.598109910Z" level=info msg="ImageCreate event name:\"sha256:d019d989e2b1f0b08ea7eebd4dd7673bdd6ba2218a3c5a6bd53f6848d5fc1af6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:42:27.601286 containerd[2001]: time="2025-09-12T17:42:27.601247206Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:c5ce150dcce2419fdef9f9875fef43014355ccebf937846ed3a2971953f9b241\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:42:27.602330 containerd[2001]: time="2025-09-12T17:42:27.602293836Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.13\" with image id \"sha256:d019d989e2b1f0b08ea7eebd4dd7673bdd6ba2218a3c5a6bd53f6848d5fc1af6\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.13\", repo digest \"registry.k8s.io/kube-scheduler@sha256:c5ce150dcce2419fdef9f9875fef43014355ccebf937846ed3a2971953f9b241\", size \"20422395\" in 1.216927733s" Sep 12 17:42:27.602599 containerd[2001]: time="2025-09-12T17:42:27.602459716Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.13\" returns image reference \"sha256:d019d989e2b1f0b08ea7eebd4dd7673bdd6ba2218a3c5a6bd53f6848d5fc1af6\"" Sep 12 17:42:27.603152 containerd[2001]: time="2025-09-12T17:42:27.603116516Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.13\"" Sep 12 17:42:28.717522 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1001145842.mount: Deactivated successfully. 
Sep 12 17:42:29.239853 containerd[2001]: time="2025-09-12T17:42:29.239795252Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:42:29.240872 containerd[2001]: time="2025-09-12T17:42:29.240734811Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.13: active requests=0, bytes read=30410252" Sep 12 17:42:29.241788 containerd[2001]: time="2025-09-12T17:42:29.241737388Z" level=info msg="ImageCreate event name:\"sha256:21d97a49eeb0b08ecaba421a84a79ca44cf2bc57773c085bbfda537488790ad7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:42:29.244391 containerd[2001]: time="2025-09-12T17:42:29.243744791Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:a39637326e88d128d38da6ff2b2ceb4e856475887bfcb5f7a55734d4f63d9fae\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:42:29.244391 containerd[2001]: time="2025-09-12T17:42:29.244280324Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.13\" with image id \"sha256:21d97a49eeb0b08ecaba421a84a79ca44cf2bc57773c085bbfda537488790ad7\", repo tag \"registry.k8s.io/kube-proxy:v1.31.13\", repo digest \"registry.k8s.io/kube-proxy@sha256:a39637326e88d128d38da6ff2b2ceb4e856475887bfcb5f7a55734d4f63d9fae\", size \"30409271\" in 1.641134537s" Sep 12 17:42:29.244391 containerd[2001]: time="2025-09-12T17:42:29.244307051Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.13\" returns image reference \"sha256:21d97a49eeb0b08ecaba421a84a79ca44cf2bc57773c085bbfda537488790ad7\"" Sep 12 17:42:29.244975 containerd[2001]: time="2025-09-12T17:42:29.244950279Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Sep 12 17:42:29.749850 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1116060513.mount: Deactivated successfully. 
Sep 12 17:42:30.759051 containerd[2001]: time="2025-09-12T17:42:30.758992921Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:42:30.760069 containerd[2001]: time="2025-09-12T17:42:30.760025397Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565241" Sep 12 17:42:30.761376 containerd[2001]: time="2025-09-12T17:42:30.761319036Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:42:30.763986 containerd[2001]: time="2025-09-12T17:42:30.763928995Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:42:30.765791 containerd[2001]: time="2025-09-12T17:42:30.765113837Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.520134359s" Sep 12 17:42:30.765791 containerd[2001]: time="2025-09-12T17:42:30.765157164Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Sep 12 17:42:30.766063 containerd[2001]: time="2025-09-12T17:42:30.766029116Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Sep 12 17:42:31.204437 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4153319419.mount: Deactivated successfully. 
Sep 12 17:42:31.211967 containerd[2001]: time="2025-09-12T17:42:31.211909625Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 17:42:31.213313 containerd[2001]: time="2025-09-12T17:42:31.212832491Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" Sep 12 17:42:31.215681 containerd[2001]: time="2025-09-12T17:42:31.215637546Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 17:42:31.220461 containerd[2001]: time="2025-09-12T17:42:31.220404822Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 17:42:31.221197 containerd[2001]: time="2025-09-12T17:42:31.221159799Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 455.089394ms" Sep 12 17:42:31.221346 containerd[2001]: time="2025-09-12T17:42:31.221325064Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Sep 12 17:42:31.222086 containerd[2001]: time="2025-09-12T17:42:31.222056709Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\"" Sep 12 17:42:31.622065 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Sep 12 17:42:31.624042 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:42:31.715046 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2185486613.mount: Deactivated successfully. Sep 12 17:42:31.959978 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:42:31.975438 (kubelet)[2770]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 17:42:32.107420 kubelet[2770]: E0912 17:42:32.107361 2770 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 17:42:32.111170 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 17:42:32.111381 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 17:42:32.113571 systemd[1]: kubelet.service: Consumed 212ms CPU time, 107.5M memory peak. 
Sep 12 17:42:33.811740 containerd[2001]: time="2025-09-12T17:42:33.811312673Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:42:33.813219 containerd[2001]: time="2025-09-12T17:42:33.813168219Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56910709" Sep 12 17:42:33.813836 containerd[2001]: time="2025-09-12T17:42:33.813747657Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:42:33.823121 containerd[2001]: time="2025-09-12T17:42:33.822544489Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:42:33.823121 containerd[2001]: time="2025-09-12T17:42:33.822699440Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 2.600603727s" Sep 12 17:42:33.823121 containerd[2001]: time="2025-09-12T17:42:33.822735717Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\"" Sep 12 17:42:36.229040 systemd[1]: systemd-hostnamed.service: Deactivated successfully. Sep 12 17:42:36.628557 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:42:36.628906 systemd[1]: kubelet.service: Consumed 212ms CPU time, 107.5M memory peak. Sep 12 17:42:36.631593 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:42:36.667872 systemd[1]: Reload requested from client PID 2849 ('systemctl') (unit session-9.scope)... Sep 12 17:42:36.667894 systemd[1]: Reloading... Sep 12 17:42:36.816823 zram_generator::config[2895]: No configuration found. Sep 12 17:42:37.101364 systemd[1]: Reloading finished in 432 ms. Sep 12 17:42:37.152762 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Sep 12 17:42:37.152906 systemd[1]: kubelet.service: Failed with result 'signal'. Sep 12 17:42:37.153364 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:42:37.153425 systemd[1]: kubelet.service: Consumed 144ms CPU time, 98.3M memory peak. Sep 12 17:42:37.155930 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:42:37.406071 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:42:37.417265 (kubelet)[2956]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 12 17:42:37.485226 kubelet[2956]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 17:42:37.485226 kubelet[2956]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. 
Sep 12 17:42:37.485226 kubelet[2956]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 17:42:37.490053 kubelet[2956]: I0912 17:42:37.489979 2956 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 12 17:42:37.871632 kubelet[2956]: I0912 17:42:37.871594 2956 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Sep 12 17:42:37.871632 kubelet[2956]: I0912 17:42:37.871631 2956 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 12 17:42:37.872128 kubelet[2956]: I0912 17:42:37.872098 2956 server.go:934] "Client rotation is on, will bootstrap in background" Sep 12 17:42:37.923566 kubelet[2956]: I0912 17:42:37.923513 2956 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 12 17:42:37.931060 kubelet[2956]: E0912 17:42:37.930782 2956 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://172.31.17.147:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 172.31.17.147:6443: connect: connection refused" logger="UnhandledError" Sep 12 17:42:37.953284 kubelet[2956]: I0912 17:42:37.953249 2956 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 12 17:42:37.960733 kubelet[2956]: I0912 17:42:37.960694 2956 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 12 17:42:37.963316 kubelet[2956]: I0912 17:42:37.963252 2956 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Sep 12 17:42:37.963791 kubelet[2956]: I0912 17:42:37.963704 2956 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 12 17:42:37.963958 kubelet[2956]: I0912 17:42:37.963750 2956 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-172-31-17-147","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 12 17:42:37.964099 kubelet[2956]: I0912 17:42:37.963966 2956 topology_manager.go:138] "Creating topology manager with none policy" Sep 12 17:42:37.964099 kubelet[2956]: I0912 17:42:37.963976 2956 container_manager_linux.go:300] "Creating device plugin manager" Sep 12 17:42:37.964099 kubelet[2956]: I0912 17:42:37.964082 2956 state_mem.go:36] "Initialized new in-memory state store" Sep 12 17:42:37.968342 kubelet[2956]: I0912 17:42:37.968282 2956 kubelet.go:408] "Attempting to sync node with API server" Sep 12 17:42:37.968342 kubelet[2956]: I0912 17:42:37.968337 2956 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 12 17:42:37.970450 kubelet[2956]: I0912 17:42:37.970391 2956 kubelet.go:314] "Adding apiserver pod source" Sep 12 17:42:37.970450 kubelet[2956]: I0912 17:42:37.970449 2956 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 12 17:42:37.976516 kubelet[2956]: W0912 17:42:37.975868 2956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://172.31.17.147:6443/api/v1/nodes?fieldSelector=metadata.name%3Dip-172-31-17-147&limit=500&resourceVersion=0": dial tcp 172.31.17.147:6443: connect: connection refused Sep 12 17:42:37.976516 kubelet[2956]: E0912 17:42:37.975953 2956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://172.31.17.147:6443/api/v1/nodes?fieldSelector=metadata.name%3Dip-172-31-17-147&limit=500&resourceVersion=0\": dial tcp 172.31.17.147:6443: connect: connection refused" logger="UnhandledError" Sep 12 17:42:37.976516 kubelet[2956]: W0912 17:42:37.976411 2956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://172.31.17.147:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 172.31.17.147:6443: connect: connection refused Sep 12 17:42:37.976516 kubelet[2956]: E0912 17:42:37.976461 2956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://172.31.17.147:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 172.31.17.147:6443: connect: connection refused" logger="UnhandledError" Sep 12 17:42:37.978095 kubelet[2956]: I0912 17:42:37.978070 2956 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 12 17:42:37.982756 kubelet[2956]: I0912 17:42:37.982729 2956 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 12 17:42:37.982985 kubelet[2956]: W0912 17:42:37.982917 2956 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Sep 12 17:42:37.984181 kubelet[2956]: I0912 17:42:37.983698 2956 server.go:1274] "Started kubelet" Sep 12 17:42:37.986504 kubelet[2956]: I0912 17:42:37.985943 2956 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 12 17:42:37.986504 kubelet[2956]: I0912 17:42:37.986269 2956 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 12 17:42:37.986504 kubelet[2956]: I0912 17:42:37.986328 2956 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Sep 12 17:42:37.988507 kubelet[2956]: I0912 17:42:37.987207 2956 server.go:449] "Adding debug handlers to kubelet server" Sep 12 17:42:37.991617 kubelet[2956]: I0912 17:42:37.991597 2956 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 12 17:42:38.003928 kubelet[2956]: I0912 17:42:38.003892 2956 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 12 17:42:38.007249 kubelet[2956]: I0912 17:42:38.007224 2956 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 12 17:42:38.007899 kubelet[2956]: E0912 17:42:38.007875 2956 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ip-172-31-17-147\" not found" Sep 12 17:42:38.010696 kubelet[2956]: E0912 17:42:38.010644 2956 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.17.147:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-17-147?timeout=10s\": dial tcp 172.31.17.147:6443: connect: connection refused" interval="200ms" Sep 12 17:42:38.012733 kubelet[2956]: I0912 17:42:38.012705 2956 factory.go:221] Registration of the systemd container factory successfully Sep 12 17:42:38.012995 kubelet[2956]: I0912 17:42:38.012846 2956 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 
12 17:42:38.014133 kubelet[2956]: E0912 17:42:38.010757 2956 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://172.31.17.147:6443/api/v1/namespaces/default/events\": dial tcp 172.31.17.147:6443: connect: connection refused" event="&Event{ObjectMeta:{ip-172-31-17-147.186499e3bcb0e3c6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-172-31-17-147,UID:ip-172-31-17-147,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-172-31-17-147,},FirstTimestamp:2025-09-12 17:42:37.983671238 +0000 UTC m=+0.562189233,LastTimestamp:2025-09-12 17:42:37.983671238 +0000 UTC m=+0.562189233,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-172-31-17-147,}" Sep 12 17:42:38.015231 kubelet[2956]: E0912 17:42:38.014961 2956 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 12 17:42:38.015231 kubelet[2956]: I0912 17:42:38.015047 2956 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Sep 12 17:42:38.015231 kubelet[2956]: I0912 17:42:38.015091 2956 reconciler.go:26] "Reconciler: start to sync state" Sep 12 17:42:38.015728 kubelet[2956]: I0912 17:42:38.015710 2956 factory.go:221] Registration of the containerd container factory successfully Sep 12 17:42:38.030391 kubelet[2956]: I0912 17:42:38.030340 2956 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 12 17:42:38.032800 kubelet[2956]: I0912 17:42:38.032365 2956 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 12 17:42:38.032800 kubelet[2956]: I0912 17:42:38.032399 2956 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 12 17:42:38.032800 kubelet[2956]: I0912 17:42:38.032425 2956 kubelet.go:2321] "Starting kubelet main sync loop" Sep 12 17:42:38.032800 kubelet[2956]: E0912 17:42:38.032475 2956 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 12 17:42:38.041917 kubelet[2956]: W0912 17:42:38.041858 2956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://172.31.17.147:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.31.17.147:6443: connect: connection refused Sep 12 17:42:38.042060 kubelet[2956]: E0912 17:42:38.041980 2956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://172.31.17.147:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 172.31.17.147:6443: connect: connection refused" logger="UnhandledError" Sep 12 17:42:38.042610 kubelet[2956]: W0912 17:42:38.042477 2956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://172.31.17.147:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.31.17.147:6443: connect: connection refused Sep 12 17:42:38.042703 kubelet[2956]: E0912 17:42:38.042663 2956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://172.31.17.147:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 172.31.17.147:6443: connect: connection refused" logger="UnhandledError" Sep 12 17:42:38.051812 kubelet[2956]: I0912 17:42:38.051552 2956 cpu_manager.go:214] "Starting CPU manager" policy="none" Sep 12 17:42:38.051812 kubelet[2956]: I0912 17:42:38.051573 2956 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Sep 12 17:42:38.051812 kubelet[2956]: I0912 17:42:38.051591 2956 state_mem.go:36] "Initialized new in-memory state store" Sep 12 17:42:38.054277 kubelet[2956]: I0912 17:42:38.054252 2956 policy_none.go:49] "None policy: Start" Sep 12 17:42:38.056376 kubelet[2956]: I0912 17:42:38.056310 2956 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 12 17:42:38.056376 kubelet[2956]: I0912 17:42:38.056341 2956 state_mem.go:35] "Initializing new in-memory state store" Sep 12 17:42:38.064372 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 12 17:42:38.076836 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 12 17:42:38.093296 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. 
Sep 12 17:42:38.097413 kubelet[2956]: I0912 17:42:38.097255 2956 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 12 17:42:38.098053 kubelet[2956]: I0912 17:42:38.098032 2956 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 12 17:42:38.098615 kubelet[2956]: I0912 17:42:38.098162 2956 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 12 17:42:38.098615 kubelet[2956]: I0912 17:42:38.098488 2956 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 12 17:42:38.100362 kubelet[2956]: E0912 17:42:38.100344 2956 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-172-31-17-147\" not found" Sep 12 17:42:38.146721 systemd[1]: Created slice kubepods-burstable-podbab27f22f87a4f495618426fb2a137cd.slice - libcontainer container kubepods-burstable-podbab27f22f87a4f495618426fb2a137cd.slice. Sep 12 17:42:38.173273 systemd[1]: Created slice kubepods-burstable-pod08557790a341846ce994f7fe908b90a8.slice - libcontainer container kubepods-burstable-pod08557790a341846ce994f7fe908b90a8.slice. Sep 12 17:42:38.180120 systemd[1]: Created slice kubepods-burstable-pod598d0b485749a4cb069b9b7d958da4b1.slice - libcontainer container kubepods-burstable-pod598d0b485749a4cb069b9b7d958da4b1.slice. Sep 12 17:42:38.200634 kubelet[2956]: I0912 17:42:38.200573 2956 kubelet_node_status.go:72] "Attempting to register node" node="ip-172-31-17-147" Sep 12 17:42:38.202345 kubelet[2956]: E0912 17:42:38.201015 2956 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://172.31.17.147:6443/api/v1/nodes\": dial tcp 172.31.17.147:6443: connect: connection refused" node="ip-172-31-17-147" Sep 12 17:42:38.211940 kubelet[2956]: E0912 17:42:38.211889 2956 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.17.147:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-17-147?timeout=10s\": dial tcp 172.31.17.147:6443: connect: connection refused" interval="400ms" Sep 12 17:42:38.315898 kubelet[2956]: I0912 17:42:38.315799 2956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/bab27f22f87a4f495618426fb2a137cd-k8s-certs\") pod \"kube-apiserver-ip-172-31-17-147\" (UID: \"bab27f22f87a4f495618426fb2a137cd\") " pod="kube-system/kube-apiserver-ip-172-31-17-147" Sep 12 17:42:38.315898 kubelet[2956]: I0912 17:42:38.315853 2956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/bab27f22f87a4f495618426fb2a137cd-usr-share-ca-certificates\") pod \"kube-apiserver-ip-172-31-17-147\" (UID: \"bab27f22f87a4f495618426fb2a137cd\") " pod="kube-system/kube-apiserver-ip-172-31-17-147" Sep 12 17:42:38.315898 kubelet[2956]: I0912 17:42:38.315884 2956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/08557790a341846ce994f7fe908b90a8-ca-certs\") pod \"kube-controller-manager-ip-172-31-17-147\" (UID: \"08557790a341846ce994f7fe908b90a8\") " pod="kube-system/kube-controller-manager-ip-172-31-17-147" Sep 12 17:42:38.315898 kubelet[2956]: I0912 17:42:38.315907 2956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/08557790a341846ce994f7fe908b90a8-flexvolume-dir\") pod \"kube-controller-manager-ip-172-31-17-147\" (UID: \"08557790a341846ce994f7fe908b90a8\") " pod="kube-system/kube-controller-manager-ip-172-31-17-147" Sep 12 17:42:38.316310 kubelet[2956]: I0912 17:42:38.315932 2956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/598d0b485749a4cb069b9b7d958da4b1-kubeconfig\") pod \"kube-scheduler-ip-172-31-17-147\" (UID: \"598d0b485749a4cb069b9b7d958da4b1\") " pod="kube-system/kube-scheduler-ip-172-31-17-147" Sep 12 17:42:38.316310 kubelet[2956]: I0912 17:42:38.315953 2956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/bab27f22f87a4f495618426fb2a137cd-ca-certs\") pod \"kube-apiserver-ip-172-31-17-147\" (UID: \"bab27f22f87a4f495618426fb2a137cd\") " pod="kube-system/kube-apiserver-ip-172-31-17-147" Sep 12 17:42:38.316310 kubelet[2956]: I0912 17:42:38.315977 2956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/08557790a341846ce994f7fe908b90a8-k8s-certs\") pod \"kube-controller-manager-ip-172-31-17-147\" (UID: \"08557790a341846ce994f7fe908b90a8\") " pod="kube-system/kube-controller-manager-ip-172-31-17-147" Sep 12 17:42:38.316310 kubelet[2956]: I0912 17:42:38.316000 2956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/08557790a341846ce994f7fe908b90a8-kubeconfig\") pod \"kube-controller-manager-ip-172-31-17-147\" (UID: \"08557790a341846ce994f7fe908b90a8\") " pod="kube-system/kube-controller-manager-ip-172-31-17-147" Sep 12 17:42:38.316310 kubelet[2956]: I0912 17:42:38.316033 2956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/08557790a341846ce994f7fe908b90a8-usr-share-ca-certificates\") pod \"kube-controller-manager-ip-172-31-17-147\" (UID: \"08557790a341846ce994f7fe908b90a8\") " pod="kube-system/kube-controller-manager-ip-172-31-17-147" Sep 12 17:42:38.404225 kubelet[2956]: I0912 17:42:38.404108 2956 kubelet_node_status.go:72] "Attempting to register node" node="ip-172-31-17-147" Sep 12 17:42:38.405130 kubelet[2956]: E0912 17:42:38.405090 2956 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://172.31.17.147:6443/api/v1/nodes\": dial tcp 172.31.17.147:6443: connect: connection refused" node="ip-172-31-17-147" Sep 12 17:42:38.470994 containerd[2001]: time="2025-09-12T17:42:38.470944367Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ip-172-31-17-147,Uid:bab27f22f87a4f495618426fb2a137cd,Namespace:kube-system,Attempt:0,}" Sep 12 17:42:38.485682 containerd[2001]: time="2025-09-12T17:42:38.485432055Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ip-172-31-17-147,Uid:08557790a341846ce994f7fe908b90a8,Namespace:kube-system,Attempt:0,}" Sep 12 17:42:38.486476 containerd[2001]: time="2025-09-12T17:42:38.485991252Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ip-172-31-17-147,Uid:598d0b485749a4cb069b9b7d958da4b1,Namespace:kube-system,Attempt:0,}" Sep 12 17:42:38.614148 kubelet[2956]: E0912 17:42:38.614085 2956 
controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.17.147:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-17-147?timeout=10s\": dial tcp 172.31.17.147:6443: connect: connection refused" interval="800ms" Sep 12 17:42:38.631894 containerd[2001]: time="2025-09-12T17:42:38.631832071Z" level=info msg="connecting to shim 72dd6ac37af23fbc5421f63509fdb2c33c694f1e844c8e34ae0ba1c2160686eb" address="unix:///run/containerd/s/6e71ce2430e72e08d33387acd87398a5474f1ded28bee32c13a0aa978309a1c4" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:42:38.640309 containerd[2001]: time="2025-09-12T17:42:38.640259619Z" level=info msg="connecting to shim 697e3beefe69fc4dcef85a84dabe4d7995cad6e57526487d08af46483c49aa00" address="unix:///run/containerd/s/7903c5689f09c6db66e02a4eb6f92ebe45c5bc108a27aff935ec965a04479111" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:42:38.641312 containerd[2001]: time="2025-09-12T17:42:38.640562886Z" level=info msg="connecting to shim 20c6f06654d59097830b6886534e7a44b8b96b028a84ffc9dce9eb8b1e7262af" address="unix:///run/containerd/s/e20c758a77d07c00286b0d0ca62b238673b4670b1def6ca165f0e570155b1b81" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:42:38.775065 systemd[1]: Started cri-containerd-20c6f06654d59097830b6886534e7a44b8b96b028a84ffc9dce9eb8b1e7262af.scope - libcontainer container 20c6f06654d59097830b6886534e7a44b8b96b028a84ffc9dce9eb8b1e7262af. Sep 12 17:42:38.776728 systemd[1]: Started cri-containerd-697e3beefe69fc4dcef85a84dabe4d7995cad6e57526487d08af46483c49aa00.scope - libcontainer container 697e3beefe69fc4dcef85a84dabe4d7995cad6e57526487d08af46483c49aa00. Sep 12 17:42:38.778682 systemd[1]: Started cri-containerd-72dd6ac37af23fbc5421f63509fdb2c33c694f1e844c8e34ae0ba1c2160686eb.scope - libcontainer container 72dd6ac37af23fbc5421f63509fdb2c33c694f1e844c8e34ae0ba1c2160686eb. 
Sep 12 17:42:38.810586 kubelet[2956]: I0912 17:42:38.810533 2956 kubelet_node_status.go:72] "Attempting to register node" node="ip-172-31-17-147" Sep 12 17:42:38.811260 kubelet[2956]: E0912 17:42:38.811208 2956 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://172.31.17.147:6443/api/v1/nodes\": dial tcp 172.31.17.147:6443: connect: connection refused" node="ip-172-31-17-147" Sep 12 17:42:38.905603 containerd[2001]: time="2025-09-12T17:42:38.905454139Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ip-172-31-17-147,Uid:598d0b485749a4cb069b9b7d958da4b1,Namespace:kube-system,Attempt:0,} returns sandbox id \"72dd6ac37af23fbc5421f63509fdb2c33c694f1e844c8e34ae0ba1c2160686eb\"" Sep 12 17:42:38.910123 containerd[2001]: time="2025-09-12T17:42:38.909971380Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ip-172-31-17-147,Uid:08557790a341846ce994f7fe908b90a8,Namespace:kube-system,Attempt:0,} returns sandbox id \"20c6f06654d59097830b6886534e7a44b8b96b028a84ffc9dce9eb8b1e7262af\"" Sep 12 17:42:38.925492 containerd[2001]: time="2025-09-12T17:42:38.924949514Z" level=info msg="CreateContainer within sandbox \"72dd6ac37af23fbc5421f63509fdb2c33c694f1e844c8e34ae0ba1c2160686eb\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 12 17:42:38.926919 containerd[2001]: time="2025-09-12T17:42:38.926379810Z" level=info msg="CreateContainer within sandbox \"20c6f06654d59097830b6886534e7a44b8b96b028a84ffc9dce9eb8b1e7262af\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 12 17:42:38.944805 containerd[2001]: time="2025-09-12T17:42:38.944741554Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ip-172-31-17-147,Uid:bab27f22f87a4f495618426fb2a137cd,Namespace:kube-system,Attempt:0,} returns sandbox id \"697e3beefe69fc4dcef85a84dabe4d7995cad6e57526487d08af46483c49aa00\"" Sep 12 17:42:38.948338 containerd[2001]: time="2025-09-12T17:42:38.948295595Z" level=info msg="CreateContainer within sandbox \"697e3beefe69fc4dcef85a84dabe4d7995cad6e57526487d08af46483c49aa00\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 12 17:42:38.981842 containerd[2001]: time="2025-09-12T17:42:38.981748771Z" level=info msg="Container b72e18b6b8c8117101a586f4e2d6b2970fcd2894cefb0493daf3c7c024a5bdf1: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:42:38.982083 containerd[2001]: time="2025-09-12T17:42:38.982057391Z" level=info msg="Container d8de7a0626b53f69bbdfe43f6c53f62e36dc4ad68b37606521c5b2e3e5fe7217: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:42:38.982808 containerd[2001]: time="2025-09-12T17:42:38.982682320Z" level=info msg="Container 42ac698dde0bd3b2bbd61ed1e07b9e02b26ee8b79371c074b601788bda73fe0b: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:42:39.002722 containerd[2001]: time="2025-09-12T17:42:39.002658911Z" level=info msg="CreateContainer within sandbox \"20c6f06654d59097830b6886534e7a44b8b96b028a84ffc9dce9eb8b1e7262af\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"b72e18b6b8c8117101a586f4e2d6b2970fcd2894cefb0493daf3c7c024a5bdf1\"" Sep 12 17:42:39.004862 containerd[2001]: time="2025-09-12T17:42:39.004824862Z" level=info msg="StartContainer for \"b72e18b6b8c8117101a586f4e2d6b2970fcd2894cefb0493daf3c7c024a5bdf1\"" Sep 12 17:42:39.007327 containerd[2001]: time="2025-09-12T17:42:39.007236050Z" level=info msg="connecting to shim 
b72e18b6b8c8117101a586f4e2d6b2970fcd2894cefb0493daf3c7c024a5bdf1" address="unix:///run/containerd/s/e20c758a77d07c00286b0d0ca62b238673b4670b1def6ca165f0e570155b1b81" protocol=ttrpc version=3 Sep 12 17:42:39.042744 containerd[2001]: time="2025-09-12T17:42:39.042544393Z" level=info msg="CreateContainer within sandbox \"72dd6ac37af23fbc5421f63509fdb2c33c694f1e844c8e34ae0ba1c2160686eb\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"42ac698dde0bd3b2bbd61ed1e07b9e02b26ee8b79371c074b601788bda73fe0b\"" Sep 12 17:42:39.043524 containerd[2001]: time="2025-09-12T17:42:39.043484169Z" level=info msg="CreateContainer within sandbox \"697e3beefe69fc4dcef85a84dabe4d7995cad6e57526487d08af46483c49aa00\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"d8de7a0626b53f69bbdfe43f6c53f62e36dc4ad68b37606521c5b2e3e5fe7217\"" Sep 12 17:42:39.043819 systemd[1]: Started cri-containerd-b72e18b6b8c8117101a586f4e2d6b2970fcd2894cefb0493daf3c7c024a5bdf1.scope - libcontainer container b72e18b6b8c8117101a586f4e2d6b2970fcd2894cefb0493daf3c7c024a5bdf1. Sep 12 17:42:39.044877 containerd[2001]: time="2025-09-12T17:42:39.044613002Z" level=info msg="StartContainer for \"42ac698dde0bd3b2bbd61ed1e07b9e02b26ee8b79371c074b601788bda73fe0b\"" Sep 12 17:42:39.046066 containerd[2001]: time="2025-09-12T17:42:39.046034696Z" level=info msg="connecting to shim 42ac698dde0bd3b2bbd61ed1e07b9e02b26ee8b79371c074b601788bda73fe0b" address="unix:///run/containerd/s/6e71ce2430e72e08d33387acd87398a5474f1ded28bee32c13a0aa978309a1c4" protocol=ttrpc version=3 Sep 12 17:42:39.046238 containerd[2001]: time="2025-09-12T17:42:39.046215793Z" level=info msg="StartContainer for \"d8de7a0626b53f69bbdfe43f6c53f62e36dc4ad68b37606521c5b2e3e5fe7217\"" Sep 12 17:42:39.048521 containerd[2001]: time="2025-09-12T17:42:39.048492543Z" level=info msg="connecting to shim d8de7a0626b53f69bbdfe43f6c53f62e36dc4ad68b37606521c5b2e3e5fe7217" address="unix:///run/containerd/s/7903c5689f09c6db66e02a4eb6f92ebe45c5bc108a27aff935ec965a04479111" protocol=ttrpc version=3 Sep 12 17:42:39.090060 systemd[1]: Started cri-containerd-d8de7a0626b53f69bbdfe43f6c53f62e36dc4ad68b37606521c5b2e3e5fe7217.scope - libcontainer container d8de7a0626b53f69bbdfe43f6c53f62e36dc4ad68b37606521c5b2e3e5fe7217. Sep 12 17:42:39.102219 systemd[1]: Started cri-containerd-42ac698dde0bd3b2bbd61ed1e07b9e02b26ee8b79371c074b601788bda73fe0b.scope - libcontainer container 42ac698dde0bd3b2bbd61ed1e07b9e02b26ee8b79371c074b601788bda73fe0b. 
Sep 12 17:42:39.169617 kubelet[2956]: W0912 17:42:39.168735 2956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://172.31.17.147:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.31.17.147:6443: connect: connection refused Sep 12 17:42:39.169933 kubelet[2956]: E0912 17:42:39.169793 2956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://172.31.17.147:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 172.31.17.147:6443: connect: connection refused" logger="UnhandledError" Sep 12 17:42:39.174148 kubelet[2956]: W0912 17:42:39.174075 2956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://172.31.17.147:6443/api/v1/nodes?fieldSelector=metadata.name%3Dip-172-31-17-147&limit=500&resourceVersion=0": dial tcp 172.31.17.147:6443: connect: connection refused Sep 12 17:42:39.174287 kubelet[2956]: E0912 17:42:39.174158 2956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://172.31.17.147:6443/api/v1/nodes?fieldSelector=metadata.name%3Dip-172-31-17-147&limit=500&resourceVersion=0\": dial tcp 172.31.17.147:6443: connect: connection refused" logger="UnhandledError" Sep 12 17:42:39.185077 containerd[2001]: time="2025-09-12T17:42:39.185037911Z" level=info msg="StartContainer for \"d8de7a0626b53f69bbdfe43f6c53f62e36dc4ad68b37606521c5b2e3e5fe7217\" returns successfully" Sep 12 17:42:39.198253 containerd[2001]: time="2025-09-12T17:42:39.198202054Z" level=info msg="StartContainer for \"b72e18b6b8c8117101a586f4e2d6b2970fcd2894cefb0493daf3c7c024a5bdf1\" returns successfully" Sep 12 17:42:39.240598 containerd[2001]: time="2025-09-12T17:42:39.240457026Z" level=info msg="StartContainer for \"42ac698dde0bd3b2bbd61ed1e07b9e02b26ee8b79371c074b601788bda73fe0b\" returns successfully" Sep 12 17:42:39.277002 kubelet[2956]: W0912 17:42:39.276927 2956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://172.31.17.147:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 172.31.17.147:6443: connect: connection refused Sep 12 17:42:39.277183 kubelet[2956]: E0912 17:42:39.277018 2956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://172.31.17.147:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 172.31.17.147:6443: connect: connection refused" logger="UnhandledError" Sep 12 17:42:39.402184 kubelet[2956]: W0912 17:42:39.402117 2956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://172.31.17.147:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.31.17.147:6443: connect: connection refused Sep 12 17:42:39.402349 kubelet[2956]: E0912 17:42:39.402200 2956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://172.31.17.147:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 172.31.17.147:6443: connect: connection refused" logger="UnhandledError" Sep 12 17:42:39.414891 kubelet[2956]: E0912 17:42:39.414841 
2956 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.17.147:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-17-147?timeout=10s\": dial tcp 172.31.17.147:6443: connect: connection refused" interval="1.6s" Sep 12 17:42:39.613585 kubelet[2956]: I0912 17:42:39.613557 2956 kubelet_node_status.go:72] "Attempting to register node" node="ip-172-31-17-147" Sep 12 17:42:39.614893 kubelet[2956]: E0912 17:42:39.613928 2956 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://172.31.17.147:6443/api/v1/nodes\": dial tcp 172.31.17.147:6443: connect: connection refused" node="ip-172-31-17-147" Sep 12 17:42:41.218985 kubelet[2956]: I0912 17:42:41.218630 2956 kubelet_node_status.go:72] "Attempting to register node" node="ip-172-31-17-147" Sep 12 17:42:41.856027 kubelet[2956]: E0912 17:42:41.855977 2956 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-172-31-17-147\" not found" node="ip-172-31-17-147" Sep 12 17:42:41.901536 kubelet[2956]: I0912 17:42:41.901271 2956 kubelet_node_status.go:75] "Successfully registered node" node="ip-172-31-17-147" Sep 12 17:42:41.977747 kubelet[2956]: I0912 17:42:41.977706 2956 apiserver.go:52] "Watching apiserver" Sep 12 17:42:42.015216 kubelet[2956]: I0912 17:42:42.015178 2956 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Sep 12 17:42:42.098123 kubelet[2956]: E0912 17:42:42.098060 2956 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ip-172-31-17-147\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ip-172-31-17-147" Sep 12 17:42:42.678130 kubelet[2956]: E0912 17:42:42.677921 2956 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-scheduler-ip-172-31-17-147\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ip-172-31-17-147" Sep 12 17:42:44.375065 systemd[1]: Reload requested from client PID 3220 ('systemctl') (unit session-9.scope)... Sep 12 17:42:44.375085 systemd[1]: Reloading... Sep 12 17:42:44.516802 zram_generator::config[3264]: No configuration found. Sep 12 17:42:44.804992 systemd[1]: Reloading finished in 429 ms. Sep 12 17:42:44.839096 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:42:44.856147 systemd[1]: kubelet.service: Deactivated successfully. Sep 12 17:42:44.856405 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:42:44.856483 systemd[1]: kubelet.service: Consumed 963ms CPU time, 125.2M memory peak. Sep 12 17:42:44.858598 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:42:45.211288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:42:45.223505 (kubelet)[3324]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 12 17:42:45.290458 kubelet[3324]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 17:42:45.290458 kubelet[3324]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. 
Sep 12 17:42:45.290458 kubelet[3324]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 17:42:45.290458 kubelet[3324]: I0912 17:42:45.289449 3324 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 12 17:42:45.300614 kubelet[3324]: I0912 17:42:45.300576 3324 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Sep 12 17:42:45.300873 kubelet[3324]: I0912 17:42:45.300859 3324 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 12 17:42:45.301348 kubelet[3324]: I0912 17:42:45.301328 3324 server.go:934] "Client rotation is on, will bootstrap in background" Sep 12 17:42:45.303929 kubelet[3324]: I0912 17:42:45.303905 3324 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Sep 12 17:42:45.306953 kubelet[3324]: I0912 17:42:45.306923 3324 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 12 17:42:45.316577 kubelet[3324]: I0912 17:42:45.316545 3324 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 12 17:42:45.320788 kubelet[3324]: I0912 17:42:45.320636 3324 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Sep 12 17:42:45.321592 kubelet[3324]: I0912 17:42:45.321409 3324 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Sep 12 17:42:45.322257 kubelet[3324]: I0912 17:42:45.322064 3324 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 12 17:42:45.323783 kubelet[3324]: I0912 17:42:45.322235 3324 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ip-172-31-17-147","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 12 17:42:45.323783 kubelet[3324]: I0912 17:42:45.323761 3324 topology_manager.go:138] "Creating topology manager with none policy" Sep 12 17:42:45.323995 kubelet[3324]: I0912 17:42:45.323883 3324 container_manager_linux.go:300] "Creating device plugin manager" Sep 12 17:42:45.323995 kubelet[3324]: I0912 17:42:45.323930 3324 state_mem.go:36] "Initialized new in-memory state store" Sep 12 17:42:45.324390 kubelet[3324]: I0912 17:42:45.324372 3324 kubelet.go:408] "Attempting to sync node with API server" Sep 12 17:42:45.324460 kubelet[3324]: I0912 17:42:45.324396 3324 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 12 17:42:45.324609 kubelet[3324]: I0912 17:42:45.324540 3324 kubelet.go:314] "Adding apiserver pod source" Sep 12 17:42:45.324609 kubelet[3324]: I0912 17:42:45.324558 3324 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 12 17:42:45.331647 kubelet[3324]: I0912 17:42:45.331613 3324 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 12 17:42:45.334816 kubelet[3324]: I0912 17:42:45.333970 3324 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 12 17:42:45.334816 kubelet[3324]: I0912 17:42:45.334497 3324 server.go:1274] "Started kubelet" Sep 12 17:42:45.338269 kubelet[3324]: I0912 17:42:45.338243 3324 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 12 17:42:45.342786 kubelet[3324]: I0912 17:42:45.342720 3324 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Sep 12 17:42:45.344281 kubelet[3324]: I0912 17:42:45.344255 3324 server.go:449] "Adding debug handlers to kubelet server" Sep 12 17:42:45.345677 kubelet[3324]: I0912 17:42:45.345639 3324 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 12 17:42:45.346076 kubelet[3324]: I0912 17:42:45.346060 3324 server.go:236] "Starting to serve the podresources API" 
endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 12 17:42:45.346463 kubelet[3324]: I0912 17:42:45.346446 3324 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 12 17:42:45.348792 kubelet[3324]: I0912 17:42:45.348753 3324 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 12 17:42:45.349197 kubelet[3324]: E0912 17:42:45.349179 3324 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ip-172-31-17-147\" not found" Sep 12 17:42:45.359067 kubelet[3324]: I0912 17:42:45.359039 3324 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Sep 12 17:42:45.359355 kubelet[3324]: I0912 17:42:45.359340 3324 reconciler.go:26] "Reconciler: start to sync state" Sep 12 17:42:45.363838 kubelet[3324]: I0912 17:42:45.363761 3324 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 12 17:42:45.364748 kubelet[3324]: I0912 17:42:45.364721 3324 factory.go:221] Registration of the systemd container factory successfully Sep 12 17:42:45.364882 kubelet[3324]: I0912 17:42:45.364864 3324 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 12 17:42:45.365984 kubelet[3324]: I0912 17:42:45.365570 3324 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 12 17:42:45.365984 kubelet[3324]: I0912 17:42:45.365601 3324 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 12 17:42:45.365984 kubelet[3324]: I0912 17:42:45.365624 3324 kubelet.go:2321] "Starting kubelet main sync loop" Sep 12 17:42:45.365984 kubelet[3324]: E0912 17:42:45.365670 3324 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 12 17:42:45.383460 kubelet[3324]: E0912 17:42:45.383424 3324 kubelet.go:1478] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 12 17:42:45.383689 kubelet[3324]: I0912 17:42:45.383622 3324 factory.go:221] Registration of the containerd container factory successfully Sep 12 17:42:45.452574 kubelet[3324]: I0912 17:42:45.452546 3324 cpu_manager.go:214] "Starting CPU manager" policy="none" Sep 12 17:42:45.452811 kubelet[3324]: I0912 17:42:45.452723 3324 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Sep 12 17:42:45.452811 kubelet[3324]: I0912 17:42:45.452745 3324 state_mem.go:36] "Initialized new in-memory state store" Sep 12 17:42:45.452971 kubelet[3324]: I0912 17:42:45.452945 3324 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 12 17:42:45.453021 kubelet[3324]: I0912 17:42:45.452969 3324 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 12 17:42:45.453021 kubelet[3324]: I0912 17:42:45.453001 3324 policy_none.go:49] "None policy: Start" Sep 12 17:42:45.453864 kubelet[3324]: I0912 17:42:45.453836 3324 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 12 17:42:45.453864 kubelet[3324]: I0912 17:42:45.453865 3324 state_mem.go:35] "Initializing new in-memory state store" Sep 12 17:42:45.454080 kubelet[3324]: I0912 17:42:45.454061 3324 state_mem.go:75] "Updated machine memory state" Sep 12 17:42:45.461924 kubelet[3324]: I0912 17:42:45.461817 3324 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 12 17:42:45.462049 kubelet[3324]: I0912 17:42:45.462033 3324 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 12 17:42:45.462097 kubelet[3324]: I0912 17:42:45.462046 3324 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 12 17:42:45.464091 kubelet[3324]: I0912 17:42:45.463813 3324 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 12 17:42:45.489588 kubelet[3324]: E0912 17:42:45.489547 3324 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-ip-172-31-17-147\" already exists" pod="kube-system/kube-controller-manager-ip-172-31-17-147" Sep 12 17:42:45.580422 kubelet[3324]: I0912 17:42:45.580393 3324 kubelet_node_status.go:72] "Attempting to register node" node="ip-172-31-17-147" Sep 12 17:42:45.591824 kubelet[3324]: I0912 17:42:45.591678 3324 kubelet_node_status.go:111] "Node was previously registered" node="ip-172-31-17-147" Sep 12 17:42:45.591824 kubelet[3324]: I0912 17:42:45.591808 3324 kubelet_node_status.go:75] "Successfully registered node" node="ip-172-31-17-147" Sep 12 17:42:45.659839 kubelet[3324]: I0912 17:42:45.659792 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/598d0b485749a4cb069b9b7d958da4b1-kubeconfig\") pod \"kube-scheduler-ip-172-31-17-147\" (UID: \"598d0b485749a4cb069b9b7d958da4b1\") " pod="kube-system/kube-scheduler-ip-172-31-17-147" Sep 12 17:42:45.660000 kubelet[3324]: I0912 17:42:45.659858 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/bab27f22f87a4f495618426fb2a137cd-k8s-certs\") pod \"kube-apiserver-ip-172-31-17-147\" (UID: \"bab27f22f87a4f495618426fb2a137cd\") " pod="kube-system/kube-apiserver-ip-172-31-17-147" Sep 12 17:42:45.660000 kubelet[3324]: I0912 17:42:45.659888 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/bab27f22f87a4f495618426fb2a137cd-usr-share-ca-certificates\") pod \"kube-apiserver-ip-172-31-17-147\" (UID: \"bab27f22f87a4f495618426fb2a137cd\") " pod="kube-system/kube-apiserver-ip-172-31-17-147" Sep 12 17:42:45.660000 kubelet[3324]: I0912 17:42:45.659917 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/08557790a341846ce994f7fe908b90a8-k8s-certs\") pod \"kube-controller-manager-ip-172-31-17-147\" (UID: \"08557790a341846ce994f7fe908b90a8\") " pod="kube-system/kube-controller-manager-ip-172-31-17-147" Sep 12 17:42:45.660000 kubelet[3324]: I0912 17:42:45.659940 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/08557790a341846ce994f7fe908b90a8-usr-share-ca-certificates\") pod \"kube-controller-manager-ip-172-31-17-147\" (UID: \"08557790a341846ce994f7fe908b90a8\") " pod="kube-system/kube-controller-manager-ip-172-31-17-147" Sep 12 17:42:45.660000 kubelet[3324]: I0912 17:42:45.659962 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/bab27f22f87a4f495618426fb2a137cd-ca-certs\") pod \"kube-apiserver-ip-172-31-17-147\" (UID: \"bab27f22f87a4f495618426fb2a137cd\") " pod="kube-system/kube-apiserver-ip-172-31-17-147" Sep 12 17:42:45.660205 kubelet[3324]: I0912 17:42:45.659982 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/08557790a341846ce994f7fe908b90a8-ca-certs\") pod \"kube-controller-manager-ip-172-31-17-147\" (UID: \"08557790a341846ce994f7fe908b90a8\") " pod="kube-system/kube-controller-manager-ip-172-31-17-147" Sep 12 17:42:45.660205 kubelet[3324]: I0912 17:42:45.660031 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/08557790a341846ce994f7fe908b90a8-flexvolume-dir\") pod \"kube-controller-manager-ip-172-31-17-147\" (UID: \"08557790a341846ce994f7fe908b90a8\") " pod="kube-system/kube-controller-manager-ip-172-31-17-147" Sep 12 17:42:45.660205 kubelet[3324]: I0912 17:42:45.660061 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/08557790a341846ce994f7fe908b90a8-kubeconfig\") pod \"kube-controller-manager-ip-172-31-17-147\" (UID: \"08557790a341846ce994f7fe908b90a8\") " pod="kube-system/kube-controller-manager-ip-172-31-17-147" Sep 12 17:42:46.332218 kubelet[3324]: I0912 17:42:46.331970 3324 apiserver.go:52] "Watching apiserver" Sep 12 17:42:46.359502 kubelet[3324]: I0912 17:42:46.359455 3324 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Sep 12 17:42:46.491056 kubelet[3324]: I0912 17:42:46.490967 3324 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ip-172-31-17-147" podStartSLOduration=1.490948878 podStartE2EDuration="1.490948878s" podCreationTimestamp="2025-09-12 17:42:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:42:46.490029946 +0000 UTC m=+1.256927042" 
watchObservedRunningTime="2025-09-12 17:42:46.490948878 +0000 UTC m=+1.257845968" Sep 12 17:42:46.491260 kubelet[3324]: I0912 17:42:46.491090 3324 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ip-172-31-17-147" podStartSLOduration=1.491084967 podStartE2EDuration="1.491084967s" podCreationTimestamp="2025-09-12 17:42:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:42:46.476038682 +0000 UTC m=+1.242935780" watchObservedRunningTime="2025-09-12 17:42:46.491084967 +0000 UTC m=+1.257982055" Sep 12 17:42:46.518468 kubelet[3324]: I0912 17:42:46.518277 3324 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ip-172-31-17-147" podStartSLOduration=2.518254415 podStartE2EDuration="2.518254415s" podCreationTimestamp="2025-09-12 17:42:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:42:46.503123158 +0000 UTC m=+1.270020256" watchObservedRunningTime="2025-09-12 17:42:46.518254415 +0000 UTC m=+1.285151515" Sep 12 17:42:50.440324 update_engine[1989]: I20250912 17:42:50.440244 1989 update_attempter.cc:509] Updating boot flags... Sep 12 17:42:50.801500 kubelet[3324]: I0912 17:42:50.801392 3324 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 12 17:42:50.804033 kubelet[3324]: I0912 17:42:50.802484 3324 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 12 17:42:50.804090 containerd[2001]: time="2025-09-12T17:42:50.802227382Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 12 17:42:51.704981 systemd[1]: Created slice kubepods-besteffort-pode60c4a6d_ec03_428e_8e99_c3b353002a00.slice - libcontainer container kubepods-besteffort-pode60c4a6d_ec03_428e_8e99_c3b353002a00.slice. 
Sep 12 17:42:51.804024 kubelet[3324]: I0912 17:42:51.803888 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/e60c4a6d-ec03-428e-8e99-c3b353002a00-kube-proxy\") pod \"kube-proxy-bfmqn\" (UID: \"e60c4a6d-ec03-428e-8e99-c3b353002a00\") " pod="kube-system/kube-proxy-bfmqn" Sep 12 17:42:51.804024 kubelet[3324]: I0912 17:42:51.803926 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n94rg\" (UniqueName: \"kubernetes.io/projected/e60c4a6d-ec03-428e-8e99-c3b353002a00-kube-api-access-n94rg\") pod \"kube-proxy-bfmqn\" (UID: \"e60c4a6d-ec03-428e-8e99-c3b353002a00\") " pod="kube-system/kube-proxy-bfmqn" Sep 12 17:42:51.804024 kubelet[3324]: I0912 17:42:51.803949 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/e60c4a6d-ec03-428e-8e99-c3b353002a00-xtables-lock\") pod \"kube-proxy-bfmqn\" (UID: \"e60c4a6d-ec03-428e-8e99-c3b353002a00\") " pod="kube-system/kube-proxy-bfmqn" Sep 12 17:42:51.804024 kubelet[3324]: I0912 17:42:51.803963 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e60c4a6d-ec03-428e-8e99-c3b353002a00-lib-modules\") pod \"kube-proxy-bfmqn\" (UID: \"e60c4a6d-ec03-428e-8e99-c3b353002a00\") " pod="kube-system/kube-proxy-bfmqn" Sep 12 17:42:51.990660 systemd[1]: Created slice kubepods-besteffort-podcc3b6c57_7afa_4c95_ac34_d2ae4f480113.slice - libcontainer container kubepods-besteffort-podcc3b6c57_7afa_4c95_ac34_d2ae4f480113.slice. Sep 12 17:42:52.015071 containerd[2001]: time="2025-09-12T17:42:52.015027469Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-bfmqn,Uid:e60c4a6d-ec03-428e-8e99-c3b353002a00,Namespace:kube-system,Attempt:0,}" Sep 12 17:42:52.041155 containerd[2001]: time="2025-09-12T17:42:52.041101332Z" level=info msg="connecting to shim 52560a762cb04d6b84fee376f58a29890efa594a9208f3a5c6ecb30f0e580a07" address="unix:///run/containerd/s/1bbd6877bff8f18b00c7ba8947bbb2a8274a5ffaee64bfc2c1ffdda9204c2eea" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:42:52.085053 systemd[1]: Started cri-containerd-52560a762cb04d6b84fee376f58a29890efa594a9208f3a5c6ecb30f0e580a07.scope - libcontainer container 52560a762cb04d6b84fee376f58a29890efa594a9208f3a5c6ecb30f0e580a07. 
Sep 12 17:42:52.106987 kubelet[3324]: I0912 17:42:52.106024 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/cc3b6c57-7afa-4c95-ac34-d2ae4f480113-var-lib-calico\") pod \"tigera-operator-58fc44c59b-zsjzc\" (UID: \"cc3b6c57-7afa-4c95-ac34-d2ae4f480113\") " pod="tigera-operator/tigera-operator-58fc44c59b-zsjzc" Sep 12 17:42:52.106987 kubelet[3324]: I0912 17:42:52.106152 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppj5z\" (UniqueName: \"kubernetes.io/projected/cc3b6c57-7afa-4c95-ac34-d2ae4f480113-kube-api-access-ppj5z\") pod \"tigera-operator-58fc44c59b-zsjzc\" (UID: \"cc3b6c57-7afa-4c95-ac34-d2ae4f480113\") " pod="tigera-operator/tigera-operator-58fc44c59b-zsjzc" Sep 12 17:42:52.119210 containerd[2001]: time="2025-09-12T17:42:52.119177910Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-bfmqn,Uid:e60c4a6d-ec03-428e-8e99-c3b353002a00,Namespace:kube-system,Attempt:0,} returns sandbox id \"52560a762cb04d6b84fee376f58a29890efa594a9208f3a5c6ecb30f0e580a07\"" Sep 12 17:42:52.124792 containerd[2001]: time="2025-09-12T17:42:52.124183365Z" level=info msg="CreateContainer within sandbox \"52560a762cb04d6b84fee376f58a29890efa594a9208f3a5c6ecb30f0e580a07\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 12 17:42:52.167802 containerd[2001]: time="2025-09-12T17:42:52.167750976Z" level=info msg="Container 981e2bfac86e21df8ada3263444afa78ea7b8cdf349f61a83bb2aa37ee22a3d2: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:42:52.168428 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount440458406.mount: Deactivated successfully. Sep 12 17:42:52.181434 containerd[2001]: time="2025-09-12T17:42:52.181383744Z" level=info msg="CreateContainer within sandbox \"52560a762cb04d6b84fee376f58a29890efa594a9208f3a5c6ecb30f0e580a07\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"981e2bfac86e21df8ada3263444afa78ea7b8cdf349f61a83bb2aa37ee22a3d2\"" Sep 12 17:42:52.182357 containerd[2001]: time="2025-09-12T17:42:52.182281479Z" level=info msg="StartContainer for \"981e2bfac86e21df8ada3263444afa78ea7b8cdf349f61a83bb2aa37ee22a3d2\"" Sep 12 17:42:52.184111 containerd[2001]: time="2025-09-12T17:42:52.184059322Z" level=info msg="connecting to shim 981e2bfac86e21df8ada3263444afa78ea7b8cdf349f61a83bb2aa37ee22a3d2" address="unix:///run/containerd/s/1bbd6877bff8f18b00c7ba8947bbb2a8274a5ffaee64bfc2c1ffdda9204c2eea" protocol=ttrpc version=3 Sep 12 17:42:52.207808 systemd[1]: Started cri-containerd-981e2bfac86e21df8ada3263444afa78ea7b8cdf349f61a83bb2aa37ee22a3d2.scope - libcontainer container 981e2bfac86e21df8ada3263444afa78ea7b8cdf349f61a83bb2aa37ee22a3d2. 
Sep 12 17:42:52.263085 containerd[2001]: time="2025-09-12T17:42:52.262593427Z" level=info msg="StartContainer for \"981e2bfac86e21df8ada3263444afa78ea7b8cdf349f61a83bb2aa37ee22a3d2\" returns successfully" Sep 12 17:42:52.294899 containerd[2001]: time="2025-09-12T17:42:52.294847540Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-zsjzc,Uid:cc3b6c57-7afa-4c95-ac34-d2ae4f480113,Namespace:tigera-operator,Attempt:0,}" Sep 12 17:42:52.326818 containerd[2001]: time="2025-09-12T17:42:52.326423205Z" level=info msg="connecting to shim 66e2803818ddb30349dc741cd091baa5f9db91c954353cd60f18e611fe2d45f4" address="unix:///run/containerd/s/29ef1b7b9b9389a33f3208054def658d895ad217fe66c3676755cfe17f52ef61" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:42:52.360021 systemd[1]: Started cri-containerd-66e2803818ddb30349dc741cd091baa5f9db91c954353cd60f18e611fe2d45f4.scope - libcontainer container 66e2803818ddb30349dc741cd091baa5f9db91c954353cd60f18e611fe2d45f4. Sep 12 17:42:52.421280 containerd[2001]: time="2025-09-12T17:42:52.421208861Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-zsjzc,Uid:cc3b6c57-7afa-4c95-ac34-d2ae4f480113,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"66e2803818ddb30349dc741cd091baa5f9db91c954353cd60f18e611fe2d45f4\"" Sep 12 17:42:52.425266 containerd[2001]: time="2025-09-12T17:42:52.425140959Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 12 17:42:52.469228 kubelet[3324]: I0912 17:42:52.469163 3324 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-bfmqn" podStartSLOduration=1.469128689 podStartE2EDuration="1.469128689s" podCreationTimestamp="2025-09-12 17:42:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:42:52.468967593 +0000 UTC m=+7.235864698" watchObservedRunningTime="2025-09-12 17:42:52.469128689 +0000 UTC m=+7.236025787" Sep 12 17:42:54.010057 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount88440761.mount: Deactivated successfully. 
Sep 12 17:42:56.510428 containerd[2001]: time="2025-09-12T17:42:56.510379142Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:42:56.518799 containerd[2001]: time="2025-09-12T17:42:56.518177953Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609" Sep 12 17:42:56.518799 containerd[2001]: time="2025-09-12T17:42:56.518291707Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:42:56.522833 containerd[2001]: time="2025-09-12T17:42:56.522790413Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:42:56.523848 containerd[2001]: time="2025-09-12T17:42:56.523816248Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 4.09859656s" Sep 12 17:42:56.524028 containerd[2001]: time="2025-09-12T17:42:56.524012723Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\"" Sep 12 17:42:56.526443 containerd[2001]: time="2025-09-12T17:42:56.526401945Z" level=info msg="CreateContainer within sandbox \"66e2803818ddb30349dc741cd091baa5f9db91c954353cd60f18e611fe2d45f4\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 12 17:42:56.543705 containerd[2001]: time="2025-09-12T17:42:56.542903819Z" level=info msg="Container efaa43fd20c3fd6ea5a375ae10f68be7e3c96b7af3a8fcb75c0f099536ca4574: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:42:56.556380 containerd[2001]: time="2025-09-12T17:42:56.556273088Z" level=info msg="CreateContainer within sandbox \"66e2803818ddb30349dc741cd091baa5f9db91c954353cd60f18e611fe2d45f4\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"efaa43fd20c3fd6ea5a375ae10f68be7e3c96b7af3a8fcb75c0f099536ca4574\"" Sep 12 17:42:56.557033 containerd[2001]: time="2025-09-12T17:42:56.557001222Z" level=info msg="StartContainer for \"efaa43fd20c3fd6ea5a375ae10f68be7e3c96b7af3a8fcb75c0f099536ca4574\"" Sep 12 17:42:56.558244 containerd[2001]: time="2025-09-12T17:42:56.558215548Z" level=info msg="connecting to shim efaa43fd20c3fd6ea5a375ae10f68be7e3c96b7af3a8fcb75c0f099536ca4574" address="unix:///run/containerd/s/29ef1b7b9b9389a33f3208054def658d895ad217fe66c3676755cfe17f52ef61" protocol=ttrpc version=3 Sep 12 17:42:56.586072 systemd[1]: Started cri-containerd-efaa43fd20c3fd6ea5a375ae10f68be7e3c96b7af3a8fcb75c0f099536ca4574.scope - libcontainer container efaa43fd20c3fd6ea5a375ae10f68be7e3c96b7af3a8fcb75c0f099536ca4574. 
Sep 12 17:42:56.625311 containerd[2001]: time="2025-09-12T17:42:56.625252274Z" level=info msg="StartContainer for \"efaa43fd20c3fd6ea5a375ae10f68be7e3c96b7af3a8fcb75c0f099536ca4574\" returns successfully" Sep 12 17:42:57.483151 kubelet[3324]: I0912 17:42:57.482686 3324 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-58fc44c59b-zsjzc" podStartSLOduration=2.381153377 podStartE2EDuration="6.482669217s" podCreationTimestamp="2025-09-12 17:42:51 +0000 UTC" firstStartedPulling="2025-09-12 17:42:52.423182216 +0000 UTC m=+7.190079293" lastFinishedPulling="2025-09-12 17:42:56.524698055 +0000 UTC m=+11.291595133" observedRunningTime="2025-09-12 17:42:57.48258034 +0000 UTC m=+12.249477436" watchObservedRunningTime="2025-09-12 17:42:57.482669217 +0000 UTC m=+12.249566316" Sep 12 17:43:00.279486 systemd[1]: cri-containerd-efaa43fd20c3fd6ea5a375ae10f68be7e3c96b7af3a8fcb75c0f099536ca4574.scope: Deactivated successfully. Sep 12 17:43:00.373960 containerd[2001]: time="2025-09-12T17:43:00.373476288Z" level=info msg="received exit event container_id:\"efaa43fd20c3fd6ea5a375ae10f68be7e3c96b7af3a8fcb75c0f099536ca4574\" id:\"efaa43fd20c3fd6ea5a375ae10f68be7e3c96b7af3a8fcb75c0f099536ca4574\" pid:3827 exit_status:1 exited_at:{seconds:1757698980 nanos:285960623}" Sep 12 17:43:00.520540 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-efaa43fd20c3fd6ea5a375ae10f68be7e3c96b7af3a8fcb75c0f099536ca4574-rootfs.mount: Deactivated successfully. Sep 12 17:43:00.554035 containerd[2001]: time="2025-09-12T17:43:00.553911736Z" level=info msg="TaskExit event in podsandbox handler container_id:\"efaa43fd20c3fd6ea5a375ae10f68be7e3c96b7af3a8fcb75c0f099536ca4574\" id:\"efaa43fd20c3fd6ea5a375ae10f68be7e3c96b7af3a8fcb75c0f099536ca4574\" pid:3827 exit_status:1 exited_at:{seconds:1757698980 nanos:285960623}" Sep 12 17:43:01.513557 kubelet[3324]: I0912 17:43:01.513507 3324 scope.go:117] "RemoveContainer" containerID="efaa43fd20c3fd6ea5a375ae10f68be7e3c96b7af3a8fcb75c0f099536ca4574" Sep 12 17:43:01.517465 containerd[2001]: time="2025-09-12T17:43:01.517401270Z" level=info msg="CreateContainer within sandbox \"66e2803818ddb30349dc741cd091baa5f9db91c954353cd60f18e611fe2d45f4\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Sep 12 17:43:01.553836 containerd[2001]: time="2025-09-12T17:43:01.553536467Z" level=info msg="Container 9401a12dd5fedc46790e5f76a59e2c08696ed3778779a5c48140f28cb4b299d9: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:43:01.573407 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4210686447.mount: Deactivated successfully. 
Sep 12 17:43:01.701015 containerd[2001]: time="2025-09-12T17:43:01.700303518Z" level=info msg="CreateContainer within sandbox \"66e2803818ddb30349dc741cd091baa5f9db91c954353cd60f18e611fe2d45f4\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"9401a12dd5fedc46790e5f76a59e2c08696ed3778779a5c48140f28cb4b299d9\"" Sep 12 17:43:01.702812 containerd[2001]: time="2025-09-12T17:43:01.702323690Z" level=info msg="StartContainer for \"9401a12dd5fedc46790e5f76a59e2c08696ed3778779a5c48140f28cb4b299d9\"" Sep 12 17:43:01.704127 containerd[2001]: time="2025-09-12T17:43:01.703730592Z" level=info msg="connecting to shim 9401a12dd5fedc46790e5f76a59e2c08696ed3778779a5c48140f28cb4b299d9" address="unix:///run/containerd/s/29ef1b7b9b9389a33f3208054def658d895ad217fe66c3676755cfe17f52ef61" protocol=ttrpc version=3 Sep 12 17:43:01.806343 systemd[1]: Started cri-containerd-9401a12dd5fedc46790e5f76a59e2c08696ed3778779a5c48140f28cb4b299d9.scope - libcontainer container 9401a12dd5fedc46790e5f76a59e2c08696ed3778779a5c48140f28cb4b299d9. Sep 12 17:43:01.975988 containerd[2001]: time="2025-09-12T17:43:01.975914301Z" level=info msg="StartContainer for \"9401a12dd5fedc46790e5f76a59e2c08696ed3778779a5c48140f28cb4b299d9\" returns successfully" Sep 12 17:43:04.519340 sudo[2375]: pam_unix(sudo:session): session closed for user root Sep 12 17:43:04.541870 sshd[2374]: Connection closed by 139.178.68.195 port 56304 Sep 12 17:43:04.544637 sshd-session[2371]: pam_unix(sshd:session): session closed for user core Sep 12 17:43:04.550434 systemd-logind[1987]: Session 9 logged out. Waiting for processes to exit. Sep 12 17:43:04.553588 systemd[1]: sshd@8-172.31.17.147:22-139.178.68.195:56304.service: Deactivated successfully. Sep 12 17:43:04.558162 systemd[1]: session-9.scope: Deactivated successfully. Sep 12 17:43:04.558819 systemd[1]: session-9.scope: Consumed 5.118s CPU time, 151.9M memory peak. Sep 12 17:43:04.563635 systemd-logind[1987]: Removed session 9. Sep 12 17:43:11.650591 kubelet[3324]: W0912 17:43:11.649923 3324 reflector.go:561] object-"calico-system"/"tigera-ca-bundle": failed to list *v1.ConfigMap: configmaps "tigera-ca-bundle" is forbidden: User "system:node:ip-172-31-17-147" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'ip-172-31-17-147' and this object Sep 12 17:43:11.650882 systemd[1]: Created slice kubepods-besteffort-pod0af10f0e_5314_4ad6_b4c4_ff8a4081055a.slice - libcontainer container kubepods-besteffort-pod0af10f0e_5314_4ad6_b4c4_ff8a4081055a.slice. 
Sep 12 17:43:11.656678 kubelet[3324]: E0912 17:43:11.655647 3324 reflector.go:158] "Unhandled Error" err="object-\"calico-system\"/\"tigera-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"tigera-ca-bundle\" is forbidden: User \"system:node:ip-172-31-17-147\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ip-172-31-17-147' and this object" logger="UnhandledError" Sep 12 17:43:11.659184 kubelet[3324]: W0912 17:43:11.659085 3324 reflector.go:561] object-"calico-system"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ip-172-31-17-147" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'ip-172-31-17-147' and this object Sep 12 17:43:11.659184 kubelet[3324]: E0912 17:43:11.659128 3324 reflector.go:158] "Unhandled Error" err="object-\"calico-system\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ip-172-31-17-147\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ip-172-31-17-147' and this object" logger="UnhandledError" Sep 12 17:43:11.696895 kubelet[3324]: I0912 17:43:11.696855 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/0af10f0e-5314-4ad6-b4c4-ff8a4081055a-typha-certs\") pod \"calico-typha-6f7bd579f7-rgw5s\" (UID: \"0af10f0e-5314-4ad6-b4c4-ff8a4081055a\") " pod="calico-system/calico-typha-6f7bd579f7-rgw5s" Sep 12 17:43:11.697221 kubelet[3324]: I0912 17:43:11.697104 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv57r\" (UniqueName: \"kubernetes.io/projected/0af10f0e-5314-4ad6-b4c4-ff8a4081055a-kube-api-access-fv57r\") pod \"calico-typha-6f7bd579f7-rgw5s\" (UID: \"0af10f0e-5314-4ad6-b4c4-ff8a4081055a\") " pod="calico-system/calico-typha-6f7bd579f7-rgw5s" Sep 12 17:43:11.697221 kubelet[3324]: I0912 17:43:11.697130 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0af10f0e-5314-4ad6-b4c4-ff8a4081055a-tigera-ca-bundle\") pod \"calico-typha-6f7bd579f7-rgw5s\" (UID: \"0af10f0e-5314-4ad6-b4c4-ff8a4081055a\") " pod="calico-system/calico-typha-6f7bd579f7-rgw5s" Sep 12 17:43:11.922270 systemd[1]: Created slice kubepods-besteffort-poda2801e77_528b_4472_a554_2edbb35db800.slice - libcontainer container kubepods-besteffort-poda2801e77_528b_4472_a554_2edbb35db800.slice. 
Sep 12 17:43:11.998955 kubelet[3324]: I0912 17:43:11.998905 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a2801e77-528b-4472-a554-2edbb35db800-lib-modules\") pod \"calico-node-w6hh9\" (UID: \"a2801e77-528b-4472-a554-2edbb35db800\") " pod="calico-system/calico-node-w6hh9" Sep 12 17:43:11.999820 kubelet[3324]: I0912 17:43:11.999377 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/a2801e77-528b-4472-a554-2edbb35db800-cni-log-dir\") pod \"calico-node-w6hh9\" (UID: \"a2801e77-528b-4472-a554-2edbb35db800\") " pod="calico-system/calico-node-w6hh9" Sep 12 17:43:11.999820 kubelet[3324]: I0912 17:43:11.999407 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/a2801e77-528b-4472-a554-2edbb35db800-cni-net-dir\") pod \"calico-node-w6hh9\" (UID: \"a2801e77-528b-4472-a554-2edbb35db800\") " pod="calico-system/calico-node-w6hh9" Sep 12 17:43:11.999820 kubelet[3324]: I0912 17:43:11.999422 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/a2801e77-528b-4472-a554-2edbb35db800-flexvol-driver-host\") pod \"calico-node-w6hh9\" (UID: \"a2801e77-528b-4472-a554-2edbb35db800\") " pod="calico-system/calico-node-w6hh9" Sep 12 17:43:11.999820 kubelet[3324]: I0912 17:43:11.999445 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/a2801e77-528b-4472-a554-2edbb35db800-var-run-calico\") pod \"calico-node-w6hh9\" (UID: \"a2801e77-528b-4472-a554-2edbb35db800\") " pod="calico-system/calico-node-w6hh9" Sep 12 17:43:11.999820 kubelet[3324]: I0912 17:43:11.999463 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/a2801e77-528b-4472-a554-2edbb35db800-xtables-lock\") pod \"calico-node-w6hh9\" (UID: \"a2801e77-528b-4472-a554-2edbb35db800\") " pod="calico-system/calico-node-w6hh9" Sep 12 17:43:12.000013 kubelet[3324]: I0912 17:43:11.999501 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/a2801e77-528b-4472-a554-2edbb35db800-cni-bin-dir\") pod \"calico-node-w6hh9\" (UID: \"a2801e77-528b-4472-a554-2edbb35db800\") " pod="calico-system/calico-node-w6hh9" Sep 12 17:43:12.000013 kubelet[3324]: I0912 17:43:11.999516 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/a2801e77-528b-4472-a554-2edbb35db800-policysync\") pod \"calico-node-w6hh9\" (UID: \"a2801e77-528b-4472-a554-2edbb35db800\") " pod="calico-system/calico-node-w6hh9" Sep 12 17:43:12.000013 kubelet[3324]: I0912 17:43:11.999535 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/a2801e77-528b-4472-a554-2edbb35db800-var-lib-calico\") pod \"calico-node-w6hh9\" (UID: \"a2801e77-528b-4472-a554-2edbb35db800\") " pod="calico-system/calico-node-w6hh9" Sep 12 17:43:12.000013 kubelet[3324]: I0912 17:43:11.999552 3324 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67jvp\" (UniqueName: \"kubernetes.io/projected/a2801e77-528b-4472-a554-2edbb35db800-kube-api-access-67jvp\") pod \"calico-node-w6hh9\" (UID: \"a2801e77-528b-4472-a554-2edbb35db800\") " pod="calico-system/calico-node-w6hh9" Sep 12 17:43:12.000013 kubelet[3324]: I0912 17:43:11.999566 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/a2801e77-528b-4472-a554-2edbb35db800-node-certs\") pod \"calico-node-w6hh9\" (UID: \"a2801e77-528b-4472-a554-2edbb35db800\") " pod="calico-system/calico-node-w6hh9" Sep 12 17:43:12.000169 kubelet[3324]: I0912 17:43:11.999581 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2801e77-528b-4472-a554-2edbb35db800-tigera-ca-bundle\") pod \"calico-node-w6hh9\" (UID: \"a2801e77-528b-4472-a554-2edbb35db800\") " pod="calico-system/calico-node-w6hh9" Sep 12 17:43:12.148926 kubelet[3324]: E0912 17:43:12.148168 3324 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-267zp" podUID="d4e868f5-c85d-42d9-9fb2-924f44d65af8" Sep 12 17:43:12.196038 kubelet[3324]: E0912 17:43:12.194891 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.196038 kubelet[3324]: W0912 17:43:12.194923 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.196038 kubelet[3324]: E0912 17:43:12.194952 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.197303 kubelet[3324]: E0912 17:43:12.197077 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.197303 kubelet[3324]: W0912 17:43:12.197104 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.197303 kubelet[3324]: E0912 17:43:12.197127 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.197809 kubelet[3324]: E0912 17:43:12.197749 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.198011 kubelet[3324]: W0912 17:43:12.197899 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.198011 kubelet[3324]: E0912 17:43:12.197947 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:12.198534 kubelet[3324]: E0912 17:43:12.198355 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.198534 kubelet[3324]: W0912 17:43:12.198370 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.198534 kubelet[3324]: E0912 17:43:12.198386 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.199139 kubelet[3324]: E0912 17:43:12.199005 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.199139 kubelet[3324]: W0912 17:43:12.199021 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.199139 kubelet[3324]: E0912 17:43:12.199037 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.199671 kubelet[3324]: E0912 17:43:12.199578 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.199671 kubelet[3324]: W0912 17:43:12.199601 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.199671 kubelet[3324]: E0912 17:43:12.199617 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.200121 kubelet[3324]: E0912 17:43:12.200089 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.200288 kubelet[3324]: W0912 17:43:12.200200 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.200288 kubelet[3324]: E0912 17:43:12.200218 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.200591 kubelet[3324]: E0912 17:43:12.200579 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.200744 kubelet[3324]: W0912 17:43:12.200673 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.200744 kubelet[3324]: E0912 17:43:12.200691 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:12.201206 kubelet[3324]: E0912 17:43:12.201142 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.201206 kubelet[3324]: W0912 17:43:12.201155 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.201206 kubelet[3324]: E0912 17:43:12.201171 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.201701 kubelet[3324]: E0912 17:43:12.201616 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.201701 kubelet[3324]: W0912 17:43:12.201629 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.201701 kubelet[3324]: E0912 17:43:12.201651 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.202216 kubelet[3324]: E0912 17:43:12.202122 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.202216 kubelet[3324]: W0912 17:43:12.202145 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.202216 kubelet[3324]: E0912 17:43:12.202159 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.202569 kubelet[3324]: E0912 17:43:12.202540 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.202697 kubelet[3324]: W0912 17:43:12.202645 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.202697 kubelet[3324]: E0912 17:43:12.202664 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.203403 kubelet[3324]: E0912 17:43:12.203301 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.203403 kubelet[3324]: W0912 17:43:12.203328 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.203403 kubelet[3324]: E0912 17:43:12.203342 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:12.203912 kubelet[3324]: E0912 17:43:12.203813 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.203912 kubelet[3324]: W0912 17:43:12.203825 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.203912 kubelet[3324]: E0912 17:43:12.203835 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.204099 kubelet[3324]: E0912 17:43:12.204088 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.204258 kubelet[3324]: W0912 17:43:12.204139 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.204258 kubelet[3324]: E0912 17:43:12.204150 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.204531 kubelet[3324]: E0912 17:43:12.204491 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.204531 kubelet[3324]: W0912 17:43:12.204504 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.204832 kubelet[3324]: E0912 17:43:12.204627 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.205223 kubelet[3324]: E0912 17:43:12.205192 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.205349 kubelet[3324]: W0912 17:43:12.205302 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.205547 kubelet[3324]: E0912 17:43:12.205465 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:12.205547 kubelet[3324]: I0912 17:43:12.205497 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/d4e868f5-c85d-42d9-9fb2-924f44d65af8-varrun\") pod \"csi-node-driver-267zp\" (UID: \"d4e868f5-c85d-42d9-9fb2-924f44d65af8\") " pod="calico-system/csi-node-driver-267zp" Sep 12 17:43:12.206012 kubelet[3324]: E0912 17:43:12.205980 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.206012 kubelet[3324]: W0912 17:43:12.205995 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.206289 kubelet[3324]: E0912 17:43:12.206163 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.206504 kubelet[3324]: E0912 17:43:12.206476 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.206504 kubelet[3324]: W0912 17:43:12.206489 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.206687 kubelet[3324]: E0912 17:43:12.206618 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.207034 kubelet[3324]: E0912 17:43:12.207004 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.207034 kubelet[3324]: W0912 17:43:12.207018 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.207280 kubelet[3324]: E0912 17:43:12.207182 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.207650 kubelet[3324]: E0912 17:43:12.207631 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.207916 kubelet[3324]: W0912 17:43:12.207722 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.208090 kubelet[3324]: E0912 17:43:12.207999 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:12.208453 kubelet[3324]: E0912 17:43:12.208431 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.208726 kubelet[3324]: W0912 17:43:12.208486 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.208927 kubelet[3324]: E0912 17:43:12.208837 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.209161 kubelet[3324]: E0912 17:43:12.209144 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.209233 kubelet[3324]: W0912 17:43:12.209223 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.209365 kubelet[3324]: E0912 17:43:12.209304 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.209848 kubelet[3324]: E0912 17:43:12.209817 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.210035 kubelet[3324]: W0912 17:43:12.209944 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.210035 kubelet[3324]: E0912 17:43:12.209973 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.210528 kubelet[3324]: E0912 17:43:12.210478 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.210528 kubelet[3324]: W0912 17:43:12.210495 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.210528 kubelet[3324]: E0912 17:43:12.210510 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.211129 kubelet[3324]: E0912 17:43:12.211096 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.211273 kubelet[3324]: W0912 17:43:12.211235 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.211273 kubelet[3324]: E0912 17:43:12.211255 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:12.211452 kubelet[3324]: I0912 17:43:12.211381 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d4e868f5-c85d-42d9-9fb2-924f44d65af8-registration-dir\") pod \"csi-node-driver-267zp\" (UID: \"d4e868f5-c85d-42d9-9fb2-924f44d65af8\") " pod="calico-system/csi-node-driver-267zp" Sep 12 17:43:12.211749 kubelet[3324]: E0912 17:43:12.211722 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.211749 kubelet[3324]: W0912 17:43:12.211735 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.211944 kubelet[3324]: E0912 17:43:12.211805 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.212211 kubelet[3324]: E0912 17:43:12.212181 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.212211 kubelet[3324]: W0912 17:43:12.212195 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.212403 kubelet[3324]: E0912 17:43:12.212330 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.212663 kubelet[3324]: E0912 17:43:12.212619 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.212663 kubelet[3324]: W0912 17:43:12.212632 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.212663 kubelet[3324]: E0912 17:43:12.212645 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.213017 kubelet[3324]: I0912 17:43:12.212887 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d4e868f5-c85d-42d9-9fb2-924f44d65af8-kubelet-dir\") pod \"csi-node-driver-267zp\" (UID: \"d4e868f5-c85d-42d9-9fb2-924f44d65af8\") " pod="calico-system/csi-node-driver-267zp" Sep 12 17:43:12.213316 kubelet[3324]: E0912 17:43:12.213285 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.213316 kubelet[3324]: W0912 17:43:12.213300 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.213542 kubelet[3324]: E0912 17:43:12.213421 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:12.213882 kubelet[3324]: E0912 17:43:12.213840 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.213882 kubelet[3324]: W0912 17:43:12.213865 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.214140 kubelet[3324]: E0912 17:43:12.213953 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.214522 kubelet[3324]: E0912 17:43:12.214445 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.214522 kubelet[3324]: W0912 17:43:12.214459 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.214522 kubelet[3324]: E0912 17:43:12.214472 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.215050 kubelet[3324]: E0912 17:43:12.214973 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.215050 kubelet[3324]: W0912 17:43:12.214987 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.215050 kubelet[3324]: E0912 17:43:12.215000 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.215562 kubelet[3324]: E0912 17:43:12.215498 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.215562 kubelet[3324]: W0912 17:43:12.215512 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.215562 kubelet[3324]: E0912 17:43:12.215536 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:12.215880 kubelet[3324]: I0912 17:43:12.215662 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d4e868f5-c85d-42d9-9fb2-924f44d65af8-socket-dir\") pod \"csi-node-driver-267zp\" (UID: \"d4e868f5-c85d-42d9-9fb2-924f44d65af8\") " pod="calico-system/csi-node-driver-267zp" Sep 12 17:43:12.216206 kubelet[3324]: E0912 17:43:12.216147 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.216206 kubelet[3324]: W0912 17:43:12.216164 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.216206 kubelet[3324]: E0912 17:43:12.216177 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.216665 kubelet[3324]: E0912 17:43:12.216642 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.216949 kubelet[3324]: W0912 17:43:12.216838 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.216949 kubelet[3324]: E0912 17:43:12.216856 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.316536 kubelet[3324]: E0912 17:43:12.316446 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.316536 kubelet[3324]: W0912 17:43:12.316479 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.316536 kubelet[3324]: E0912 17:43:12.316505 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.317215 kubelet[3324]: E0912 17:43:12.317175 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.317215 kubelet[3324]: W0912 17:43:12.317194 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.317497 kubelet[3324]: E0912 17:43:12.317372 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:12.317787 kubelet[3324]: E0912 17:43:12.317679 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.317787 kubelet[3324]: W0912 17:43:12.317692 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.317787 kubelet[3324]: E0912 17:43:12.317706 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.318148 kubelet[3324]: E0912 17:43:12.318133 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.318298 kubelet[3324]: W0912 17:43:12.318216 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.318298 kubelet[3324]: E0912 17:43:12.318249 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.318614 kubelet[3324]: E0912 17:43:12.318585 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.318614 kubelet[3324]: W0912 17:43:12.318599 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.318816 kubelet[3324]: E0912 17:43:12.318730 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.319138 kubelet[3324]: E0912 17:43:12.319022 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.319138 kubelet[3324]: W0912 17:43:12.319035 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.319138 kubelet[3324]: E0912 17:43:12.319049 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.319555 kubelet[3324]: E0912 17:43:12.319526 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.319716 kubelet[3324]: W0912 17:43:12.319633 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.319716 kubelet[3324]: E0912 17:43:12.319666 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:12.320057 kubelet[3324]: E0912 17:43:12.320029 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.320057 kubelet[3324]: W0912 17:43:12.320042 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.320346 kubelet[3324]: E0912 17:43:12.320187 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.320568 kubelet[3324]: E0912 17:43:12.320557 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.320715 kubelet[3324]: W0912 17:43:12.320631 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.320715 kubelet[3324]: E0912 17:43:12.320652 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.321060 kubelet[3324]: E0912 17:43:12.321031 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.321060 kubelet[3324]: W0912 17:43:12.321045 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.321295 kubelet[3324]: E0912 17:43:12.321183 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.321489 kubelet[3324]: E0912 17:43:12.321461 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.321489 kubelet[3324]: W0912 17:43:12.321474 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.321728 kubelet[3324]: E0912 17:43:12.321616 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.321964 kubelet[3324]: E0912 17:43:12.321936 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.321964 kubelet[3324]: W0912 17:43:12.321948 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.322302 kubelet[3324]: E0912 17:43:12.322250 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:12.322444 kubelet[3324]: E0912 17:43:12.322418 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.322444 kubelet[3324]: W0912 17:43:12.322429 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.322621 kubelet[3324]: E0912 17:43:12.322606 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.322745 kubelet[3324]: I0912 17:43:12.322704 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dszb\" (UniqueName: \"kubernetes.io/projected/d4e868f5-c85d-42d9-9fb2-924f44d65af8-kube-api-access-6dszb\") pod \"csi-node-driver-267zp\" (UID: \"d4e868f5-c85d-42d9-9fb2-924f44d65af8\") " pod="calico-system/csi-node-driver-267zp" Sep 12 17:43:12.323023 kubelet[3324]: E0912 17:43:12.322994 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.323023 kubelet[3324]: W0912 17:43:12.323007 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.323272 kubelet[3324]: E0912 17:43:12.323254 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.323593 kubelet[3324]: E0912 17:43:12.323558 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.323593 kubelet[3324]: W0912 17:43:12.323577 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.323815 kubelet[3324]: E0912 17:43:12.323714 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.324113 kubelet[3324]: E0912 17:43:12.324081 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.324113 kubelet[3324]: W0912 17:43:12.324096 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.324313 kubelet[3324]: E0912 17:43:12.324228 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:12.324576 kubelet[3324]: E0912 17:43:12.324544 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.324576 kubelet[3324]: W0912 17:43:12.324558 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.324836 kubelet[3324]: E0912 17:43:12.324686 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.325003 kubelet[3324]: E0912 17:43:12.324984 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.325003 kubelet[3324]: W0912 17:43:12.324999 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.325118 kubelet[3324]: E0912 17:43:12.325107 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.325348 kubelet[3324]: E0912 17:43:12.325331 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.325348 kubelet[3324]: W0912 17:43:12.325345 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.325456 kubelet[3324]: E0912 17:43:12.325428 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.325599 kubelet[3324]: E0912 17:43:12.325583 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.325599 kubelet[3324]: W0912 17:43:12.325597 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.325695 kubelet[3324]: E0912 17:43:12.325614 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.325846 kubelet[3324]: E0912 17:43:12.325827 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.325846 kubelet[3324]: W0912 17:43:12.325841 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.326020 kubelet[3324]: E0912 17:43:12.325972 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:12.326080 kubelet[3324]: E0912 17:43:12.326040 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.326080 kubelet[3324]: W0912 17:43:12.326050 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.326161 kubelet[3324]: E0912 17:43:12.326131 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.326313 kubelet[3324]: E0912 17:43:12.326294 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.326313 kubelet[3324]: W0912 17:43:12.326311 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.326397 kubelet[3324]: E0912 17:43:12.326330 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.326631 kubelet[3324]: E0912 17:43:12.326612 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.326631 kubelet[3324]: W0912 17:43:12.326626 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.326734 kubelet[3324]: E0912 17:43:12.326640 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.327891 kubelet[3324]: E0912 17:43:12.327873 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.327891 kubelet[3324]: W0912 17:43:12.327891 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.328199 kubelet[3324]: E0912 17:43:12.327913 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.328199 kubelet[3324]: E0912 17:43:12.328180 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.328199 kubelet[3324]: W0912 17:43:12.328191 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.328603 kubelet[3324]: E0912 17:43:12.328247 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:12.328603 kubelet[3324]: E0912 17:43:12.328520 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.328603 kubelet[3324]: W0912 17:43:12.328531 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.328603 kubelet[3324]: E0912 17:43:12.328544 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.426799 kubelet[3324]: E0912 17:43:12.426739 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.426799 kubelet[3324]: W0912 17:43:12.426793 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.426961 kubelet[3324]: E0912 17:43:12.426812 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.427542 kubelet[3324]: E0912 17:43:12.427103 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.427542 kubelet[3324]: W0912 17:43:12.427114 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.427542 kubelet[3324]: E0912 17:43:12.427123 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.427717 kubelet[3324]: E0912 17:43:12.427678 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.427755 kubelet[3324]: W0912 17:43:12.427718 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.427755 kubelet[3324]: E0912 17:43:12.427729 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.427985 kubelet[3324]: E0912 17:43:12.427970 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.427985 kubelet[3324]: W0912 17:43:12.427982 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.428052 kubelet[3324]: E0912 17:43:12.427991 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:12.428377 kubelet[3324]: E0912 17:43:12.428213 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.428377 kubelet[3324]: W0912 17:43:12.428225 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.428377 kubelet[3324]: E0912 17:43:12.428233 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.428524 kubelet[3324]: E0912 17:43:12.428509 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.428524 kubelet[3324]: W0912 17:43:12.428521 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.428579 kubelet[3324]: E0912 17:43:12.428530 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.428742 kubelet[3324]: E0912 17:43:12.428716 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.428742 kubelet[3324]: W0912 17:43:12.428734 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.428742 kubelet[3324]: E0912 17:43:12.428742 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.428978 kubelet[3324]: E0912 17:43:12.428964 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.428978 kubelet[3324]: W0912 17:43:12.428976 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.429035 kubelet[3324]: E0912 17:43:12.428985 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.429228 kubelet[3324]: E0912 17:43:12.429215 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.429277 kubelet[3324]: W0912 17:43:12.429229 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.429277 kubelet[3324]: E0912 17:43:12.429238 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:12.531519 kubelet[3324]: E0912 17:43:12.529457 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.531519 kubelet[3324]: W0912 17:43:12.529547 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.531519 kubelet[3324]: E0912 17:43:12.531362 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.532984 kubelet[3324]: E0912 17:43:12.532891 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.534122 kubelet[3324]: W0912 17:43:12.533931 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.534122 kubelet[3324]: E0912 17:43:12.533967 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.535924 kubelet[3324]: E0912 17:43:12.535793 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.535924 kubelet[3324]: W0912 17:43:12.535814 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.535924 kubelet[3324]: E0912 17:43:12.535836 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.537196 kubelet[3324]: E0912 17:43:12.537006 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.537196 kubelet[3324]: W0912 17:43:12.537025 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.537196 kubelet[3324]: E0912 17:43:12.537048 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.538459 kubelet[3324]: E0912 17:43:12.538425 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.538459 kubelet[3324]: W0912 17:43:12.538443 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.538863 kubelet[3324]: E0912 17:43:12.538464 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:12.639907 kubelet[3324]: E0912 17:43:12.639875 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.639907 kubelet[3324]: W0912 17:43:12.639902 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.640308 kubelet[3324]: E0912 17:43:12.639925 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.640936 kubelet[3324]: E0912 17:43:12.640844 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.640936 kubelet[3324]: W0912 17:43:12.640864 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.640936 kubelet[3324]: E0912 17:43:12.640884 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.641135 kubelet[3324]: E0912 17:43:12.641118 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.641135 kubelet[3324]: W0912 17:43:12.641130 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.641199 kubelet[3324]: E0912 17:43:12.641140 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.641454 kubelet[3324]: E0912 17:43:12.641319 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.641454 kubelet[3324]: W0912 17:43:12.641334 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.641454 kubelet[3324]: E0912 17:43:12.641348 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.641721 kubelet[3324]: E0912 17:43:12.641673 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.641721 kubelet[3324]: W0912 17:43:12.641682 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.641721 kubelet[3324]: E0912 17:43:12.641691 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:12.743998 kubelet[3324]: E0912 17:43:12.743959 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.743998 kubelet[3324]: W0912 17:43:12.743988 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.745242 kubelet[3324]: E0912 17:43:12.744012 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.745242 kubelet[3324]: E0912 17:43:12.744295 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.745242 kubelet[3324]: W0912 17:43:12.744308 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.745242 kubelet[3324]: E0912 17:43:12.744323 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.745242 kubelet[3324]: E0912 17:43:12.744535 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.745242 kubelet[3324]: W0912 17:43:12.744545 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.745242 kubelet[3324]: E0912 17:43:12.744558 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.745242 kubelet[3324]: E0912 17:43:12.744889 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.745242 kubelet[3324]: W0912 17:43:12.744904 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.745242 kubelet[3324]: E0912 17:43:12.744919 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.748601 kubelet[3324]: E0912 17:43:12.745173 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.748601 kubelet[3324]: W0912 17:43:12.745184 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.748601 kubelet[3324]: E0912 17:43:12.745196 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:12.799142 kubelet[3324]: E0912 17:43:12.799022 3324 configmap.go:193] Couldn't get configMap calico-system/tigera-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Sep 12 17:43:12.799142 kubelet[3324]: E0912 17:43:12.799126 3324 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0af10f0e-5314-4ad6-b4c4-ff8a4081055a-tigera-ca-bundle podName:0af10f0e-5314-4ad6-b4c4-ff8a4081055a nodeName:}" failed. No retries permitted until 2025-09-12 17:43:13.299106972 +0000 UTC m=+28.066004049 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tigera-ca-bundle" (UniqueName: "kubernetes.io/configmap/0af10f0e-5314-4ad6-b4c4-ff8a4081055a-tigera-ca-bundle") pod "calico-typha-6f7bd579f7-rgw5s" (UID: "0af10f0e-5314-4ad6-b4c4-ff8a4081055a") : failed to sync configmap cache: timed out waiting for the condition Sep 12 17:43:12.824038 kubelet[3324]: E0912 17:43:12.823982 3324 projected.go:288] Couldn't get configMap calico-system/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Sep 12 17:43:12.824038 kubelet[3324]: E0912 17:43:12.824034 3324 projected.go:194] Error preparing data for projected volume kube-api-access-fv57r for pod calico-system/calico-typha-6f7bd579f7-rgw5s: failed to sync configmap cache: timed out waiting for the condition Sep 12 17:43:12.824217 kubelet[3324]: E0912 17:43:12.824097 3324 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0af10f0e-5314-4ad6-b4c4-ff8a4081055a-kube-api-access-fv57r podName:0af10f0e-5314-4ad6-b4c4-ff8a4081055a nodeName:}" failed. No retries permitted until 2025-09-12 17:43:13.32408043 +0000 UTC m=+28.090977508 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-fv57r" (UniqueName: "kubernetes.io/projected/0af10f0e-5314-4ad6-b4c4-ff8a4081055a-kube-api-access-fv57r") pod "calico-typha-6f7bd579f7-rgw5s" (UID: "0af10f0e-5314-4ad6-b4c4-ff8a4081055a") : failed to sync configmap cache: timed out waiting for the condition Sep 12 17:43:12.846407 kubelet[3324]: E0912 17:43:12.846372 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.846407 kubelet[3324]: W0912 17:43:12.846399 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.846635 kubelet[3324]: E0912 17:43:12.846425 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.846684 kubelet[3324]: E0912 17:43:12.846665 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.846684 kubelet[3324]: W0912 17:43:12.846676 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.846763 kubelet[3324]: E0912 17:43:12.846691 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:12.846937 kubelet[3324]: E0912 17:43:12.846913 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.846937 kubelet[3324]: W0912 17:43:12.846930 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.847056 kubelet[3324]: E0912 17:43:12.846945 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.847160 kubelet[3324]: E0912 17:43:12.847144 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.847377 kubelet[3324]: W0912 17:43:12.847158 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.847377 kubelet[3324]: E0912 17:43:12.847171 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.847594 kubelet[3324]: E0912 17:43:12.847574 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.847594 kubelet[3324]: W0912 17:43:12.847590 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.847682 kubelet[3324]: E0912 17:43:12.847606 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.914474 kubelet[3324]: E0912 17:43:12.912961 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.914474 kubelet[3324]: W0912 17:43:12.912985 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.914474 kubelet[3324]: E0912 17:43:12.913009 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.920046 kubelet[3324]: E0912 17:43:12.919504 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.920046 kubelet[3324]: W0912 17:43:12.919630 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.920046 kubelet[3324]: E0912 17:43:12.919665 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:12.950590 kubelet[3324]: E0912 17:43:12.950539 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.950590 kubelet[3324]: W0912 17:43:12.950578 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.950856 kubelet[3324]: E0912 17:43:12.950613 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.950966 kubelet[3324]: E0912 17:43:12.950915 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.950966 kubelet[3324]: W0912 17:43:12.950930 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.950966 kubelet[3324]: E0912 17:43:12.950948 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:12.951390 kubelet[3324]: E0912 17:43:12.951182 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:12.951440 kubelet[3324]: W0912 17:43:12.951390 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:12.951440 kubelet[3324]: E0912 17:43:12.951408 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:13.052037 kubelet[3324]: E0912 17:43:13.051931 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:13.052037 kubelet[3324]: W0912 17:43:13.051956 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:13.052037 kubelet[3324]: E0912 17:43:13.051979 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:13.052255 kubelet[3324]: E0912 17:43:13.052216 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:13.052255 kubelet[3324]: W0912 17:43:13.052227 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:13.052255 kubelet[3324]: E0912 17:43:13.052240 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:13.052520 kubelet[3324]: E0912 17:43:13.052426 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:13.052520 kubelet[3324]: W0912 17:43:13.052438 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:13.052520 kubelet[3324]: E0912 17:43:13.052449 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:13.107539 kubelet[3324]: E0912 17:43:13.107126 3324 configmap.go:193] Couldn't get configMap calico-system/tigera-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Sep 12 17:43:13.107539 kubelet[3324]: E0912 17:43:13.107319 3324 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a2801e77-528b-4472-a554-2edbb35db800-tigera-ca-bundle podName:a2801e77-528b-4472-a554-2edbb35db800 nodeName:}" failed. No retries permitted until 2025-09-12 17:43:13.607293344 +0000 UTC m=+28.374190442 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tigera-ca-bundle" (UniqueName: "kubernetes.io/configmap/a2801e77-528b-4472-a554-2edbb35db800-tigera-ca-bundle") pod "calico-node-w6hh9" (UID: "a2801e77-528b-4472-a554-2edbb35db800") : failed to sync configmap cache: timed out waiting for the condition Sep 12 17:43:13.153162 kubelet[3324]: E0912 17:43:13.153125 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:13.153162 kubelet[3324]: W0912 17:43:13.153151 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:13.153397 kubelet[3324]: E0912 17:43:13.153176 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:13.153477 kubelet[3324]: E0912 17:43:13.153451 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:13.153477 kubelet[3324]: W0912 17:43:13.153469 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:13.153570 kubelet[3324]: E0912 17:43:13.153487 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:13.153754 kubelet[3324]: E0912 17:43:13.153731 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:13.153754 kubelet[3324]: W0912 17:43:13.153749 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:13.153903 kubelet[3324]: E0912 17:43:13.153764 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:13.254840 kubelet[3324]: E0912 17:43:13.254803 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:13.254840 kubelet[3324]: W0912 17:43:13.254830 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:13.255067 kubelet[3324]: E0912 17:43:13.254852 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:13.255331 kubelet[3324]: E0912 17:43:13.255149 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:13.255331 kubelet[3324]: W0912 17:43:13.255163 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:13.255331 kubelet[3324]: E0912 17:43:13.255176 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:13.255489 kubelet[3324]: E0912 17:43:13.255469 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:13.255489 kubelet[3324]: W0912 17:43:13.255482 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:13.255546 kubelet[3324]: E0912 17:43:13.255493 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:13.356223 kubelet[3324]: E0912 17:43:13.356122 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:13.356223 kubelet[3324]: W0912 17:43:13.356146 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:13.356223 kubelet[3324]: E0912 17:43:13.356167 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:13.356927 kubelet[3324]: E0912 17:43:13.356438 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:13.356927 kubelet[3324]: W0912 17:43:13.356451 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:13.356927 kubelet[3324]: E0912 17:43:13.356488 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:13.357031 kubelet[3324]: E0912 17:43:13.356993 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:13.357031 kubelet[3324]: W0912 17:43:13.357004 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:13.357149 kubelet[3324]: E0912 17:43:13.357120 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:13.357298 kubelet[3324]: E0912 17:43:13.357267 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:13.357298 kubelet[3324]: W0912 17:43:13.357278 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:13.357298 kubelet[3324]: E0912 17:43:13.357290 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:13.357453 kubelet[3324]: E0912 17:43:13.357438 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:13.357453 kubelet[3324]: W0912 17:43:13.357448 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:13.357530 kubelet[3324]: E0912 17:43:13.357455 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:13.357695 kubelet[3324]: E0912 17:43:13.357670 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:13.357695 kubelet[3324]: W0912 17:43:13.357688 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:13.357900 kubelet[3324]: E0912 17:43:13.357708 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:13.357980 kubelet[3324]: E0912 17:43:13.357911 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:13.357980 kubelet[3324]: W0912 17:43:13.357918 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:13.357980 kubelet[3324]: E0912 17:43:13.357934 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:13.358112 kubelet[3324]: E0912 17:43:13.358070 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:13.358112 kubelet[3324]: W0912 17:43:13.358077 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:13.358112 kubelet[3324]: E0912 17:43:13.358090 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:13.358400 kubelet[3324]: E0912 17:43:13.358376 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:13.358400 kubelet[3324]: W0912 17:43:13.358394 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:13.358495 kubelet[3324]: E0912 17:43:13.358414 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:13.360788 kubelet[3324]: E0912 17:43:13.358964 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:13.360788 kubelet[3324]: W0912 17:43:13.358977 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:13.360788 kubelet[3324]: E0912 17:43:13.358987 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:13.360788 kubelet[3324]: E0912 17:43:13.359161 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:13.360788 kubelet[3324]: W0912 17:43:13.359171 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:13.360788 kubelet[3324]: E0912 17:43:13.359180 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:13.360788 kubelet[3324]: E0912 17:43:13.360095 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:13.360788 kubelet[3324]: W0912 17:43:13.360105 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:13.360788 kubelet[3324]: E0912 17:43:13.360115 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:13.364293 kubelet[3324]: E0912 17:43:13.364274 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:13.364410 kubelet[3324]: W0912 17:43:13.364398 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:13.364464 kubelet[3324]: E0912 17:43:13.364455 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:13.457878 kubelet[3324]: E0912 17:43:13.457846 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:13.457987 kubelet[3324]: W0912 17:43:13.457897 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:13.457987 kubelet[3324]: E0912 17:43:13.457930 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:13.464090 containerd[2001]: time="2025-09-12T17:43:13.464020626Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6f7bd579f7-rgw5s,Uid:0af10f0e-5314-4ad6-b4c4-ff8a4081055a,Namespace:calico-system,Attempt:0,}" Sep 12 17:43:13.557750 containerd[2001]: time="2025-09-12T17:43:13.557224103Z" level=info msg="connecting to shim 85a6ea88b8980b5a7972d9247a60ec3a00ea127a0c397c2268eadef53c7660f9" address="unix:///run/containerd/s/f638b397f1588a2035c98d36dd50bbc38d88e0227ece7a48c498fe6b6b1b8d08" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:43:13.559293 kubelet[3324]: E0912 17:43:13.558858 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:13.559293 kubelet[3324]: W0912 17:43:13.558875 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:13.559293 kubelet[3324]: E0912 17:43:13.558893 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:13.593299 systemd[1]: Started cri-containerd-85a6ea88b8980b5a7972d9247a60ec3a00ea127a0c397c2268eadef53c7660f9.scope - libcontainer container 85a6ea88b8980b5a7972d9247a60ec3a00ea127a0c397c2268eadef53c7660f9. 
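The block of kubelet errors above is the FlexVolume plugin probe failing in a loop: the kubelet scans /opt/libexec/kubernetes/kubelet-plugins/volume/exec/, finds the nodeagent~uds directory, and runs its uds executable with the init argument, expecting a JSON status object on stdout. The binary has not been installed yet, so the call produces no output and the decode fails with "unexpected end of JSON input". As a rough illustration of the contract the kubelet is checking for, here is a minimal sketch that follows the usual FlexVolume calling convention (this is illustrative only, not the driver Calico actually ships):

```python
#!/usr/bin/env python3
# Minimal FlexVolume-style driver sketch: the kubelet invokes the executable
# with a subcommand ("init", "mount", "unmount", ...) and parses a JSON object
# from stdout. Empty stdout is exactly what produces the repeated
# "unexpected end of JSON input" errors in the log above.
import json
import sys

def main() -> int:
    cmd = sys.argv[1] if len(sys.argv) > 1 else ""
    if cmd == "init":
        # Report success and declare that this driver does not implement attach.
        print(json.dumps({"status": "Success", "capabilities": {"attach": False}}))
        return 0
    # Unimplemented calls are reported as "Not supported" per the convention.
    print(json.dumps({"status": "Not supported",
                      "message": f"command {cmd!r} not implemented"}))
    return 1

if __name__ == "__main__":
    sys.exit(main())
```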
Sep 12 17:43:13.653573 containerd[2001]: time="2025-09-12T17:43:13.653539490Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6f7bd579f7-rgw5s,Uid:0af10f0e-5314-4ad6-b4c4-ff8a4081055a,Namespace:calico-system,Attempt:0,} returns sandbox id \"85a6ea88b8980b5a7972d9247a60ec3a00ea127a0c397c2268eadef53c7660f9\"" Sep 12 17:43:13.656609 containerd[2001]: time="2025-09-12T17:43:13.656565378Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 12 17:43:13.659833 kubelet[3324]: E0912 17:43:13.659809 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:13.660138 kubelet[3324]: W0912 17:43:13.659999 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:13.660138 kubelet[3324]: E0912 17:43:13.660032 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:13.660802 kubelet[3324]: E0912 17:43:13.660745 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:13.660802 kubelet[3324]: W0912 17:43:13.660762 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:13.661044 kubelet[3324]: E0912 17:43:13.660892 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:13.661365 kubelet[3324]: E0912 17:43:13.661333 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:13.661537 kubelet[3324]: W0912 17:43:13.661460 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:13.661537 kubelet[3324]: E0912 17:43:13.661483 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:13.661962 kubelet[3324]: E0912 17:43:13.661924 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:13.662206 kubelet[3324]: W0912 17:43:13.661941 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:13.662206 kubelet[3324]: E0912 17:43:13.662080 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:13.662487 kubelet[3324]: E0912 17:43:13.662453 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:13.662647 kubelet[3324]: W0912 17:43:13.662466 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:13.662647 kubelet[3324]: E0912 17:43:13.662581 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:13.664261 kubelet[3324]: E0912 17:43:13.664249 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:13.664404 kubelet[3324]: W0912 17:43:13.664353 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:13.664404 kubelet[3324]: E0912 17:43:13.664369 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:13.728800 containerd[2001]: time="2025-09-12T17:43:13.728738078Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-w6hh9,Uid:a2801e77-528b-4472-a554-2edbb35db800,Namespace:calico-system,Attempt:0,}" Sep 12 17:43:13.770291 containerd[2001]: time="2025-09-12T17:43:13.769488372Z" level=info msg="connecting to shim bf2c246a06bc4718c925cd93ada37aab7c8e27cb7925c0ddd5cbbe13f6385ea2" address="unix:///run/containerd/s/a6c997eb5e54705b3a9389637a25be9b809fc87a23aebb4a31529af6c20116f2" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:43:13.818038 systemd[1]: Started cri-containerd-bf2c246a06bc4718c925cd93ada37aab7c8e27cb7925c0ddd5cbbe13f6385ea2.scope - libcontainer container bf2c246a06bc4718c925cd93ada37aab7c8e27cb7925c0ddd5cbbe13f6385ea2. Sep 12 17:43:13.860836 containerd[2001]: time="2025-09-12T17:43:13.860410410Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-w6hh9,Uid:a2801e77-528b-4472-a554-2edbb35db800,Namespace:calico-system,Attempt:0,} returns sandbox id \"bf2c246a06bc4718c925cd93ada37aab7c8e27cb7925c0ddd5cbbe13f6385ea2\"" Sep 12 17:43:14.366555 kubelet[3324]: E0912 17:43:14.366484 3324 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-267zp" podUID="d4e868f5-c85d-42d9-9fb2-924f44d65af8" Sep 12 17:43:15.143540 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3956207926.mount: Deactivated successfully. 
Sep 12 17:43:16.023959 containerd[2001]: time="2025-09-12T17:43:16.023905821Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:16.024973 containerd[2001]: time="2025-09-12T17:43:16.024747299Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389" Sep 12 17:43:16.026627 containerd[2001]: time="2025-09-12T17:43:16.026589946Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:16.031985 containerd[2001]: time="2025-09-12T17:43:16.031941723Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:16.034064 containerd[2001]: time="2025-09-12T17:43:16.034024528Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 2.377420665s" Sep 12 17:43:16.034238 containerd[2001]: time="2025-09-12T17:43:16.034219785Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\"" Sep 12 17:43:16.042371 containerd[2001]: time="2025-09-12T17:43:16.042328883Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 12 17:43:16.069826 containerd[2001]: time="2025-09-12T17:43:16.069394125Z" level=info msg="CreateContainer within sandbox \"85a6ea88b8980b5a7972d9247a60ec3a00ea127a0c397c2268eadef53c7660f9\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 12 17:43:16.086944 containerd[2001]: time="2025-09-12T17:43:16.085903710Z" level=info msg="Container 2ad821f761febe552bad7d04091b66c40be8e64f483a84d477239552d9a21cb5: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:43:16.107833 containerd[2001]: time="2025-09-12T17:43:16.106029371Z" level=info msg="CreateContainer within sandbox \"85a6ea88b8980b5a7972d9247a60ec3a00ea127a0c397c2268eadef53c7660f9\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"2ad821f761febe552bad7d04091b66c40be8e64f483a84d477239552d9a21cb5\"" Sep 12 17:43:16.108814 containerd[2001]: time="2025-09-12T17:43:16.108765551Z" level=info msg="StartContainer for \"2ad821f761febe552bad7d04091b66c40be8e64f483a84d477239552d9a21cb5\"" Sep 12 17:43:16.110812 containerd[2001]: time="2025-09-12T17:43:16.110748281Z" level=info msg="connecting to shim 2ad821f761febe552bad7d04091b66c40be8e64f483a84d477239552d9a21cb5" address="unix:///run/containerd/s/f638b397f1588a2035c98d36dd50bbc38d88e0227ece7a48c498fe6b6b1b8d08" protocol=ttrpc version=3 Sep 12 17:43:16.172061 systemd[1]: Started cri-containerd-2ad821f761febe552bad7d04091b66c40be8e64f483a84d477239552d9a21cb5.scope - libcontainer container 2ad821f761febe552bad7d04091b66c40be8e64f483a84d477239552d9a21cb5. 
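The typha pull above reports both the bytes read and the wall-clock pull duration, so the effective pull rate can be estimated directly from the log. A quick sketch, with the two values copied from the containerd messages above:

```python
# Rough effective pull throughput for the typha image, using the figures
# reported by containerd above ("bytes read=35237389", "in 2.377420665s").
bytes_read = 35_237_389
duration_s = 2.377420665

rate_mb_s = bytes_read / duration_s / 1e6
print(f"~{rate_mb_s:.1f} MB/s effective pull rate")  # ~14.8 MB/s
```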
Sep 12 17:43:16.315528 containerd[2001]: time="2025-09-12T17:43:16.315402087Z" level=info msg="StartContainer for \"2ad821f761febe552bad7d04091b66c40be8e64f483a84d477239552d9a21cb5\" returns successfully" Sep 12 17:43:16.372437 kubelet[3324]: E0912 17:43:16.372380 3324 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-267zp" podUID="d4e868f5-c85d-42d9-9fb2-924f44d65af8" Sep 12 17:43:16.657900 kubelet[3324]: E0912 17:43:16.657862 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:16.657900 kubelet[3324]: W0912 17:43:16.657899 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:16.658280 kubelet[3324]: E0912 17:43:16.657926 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:16.665154 kubelet[3324]: E0912 17:43:16.665120 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:16.665154 kubelet[3324]: W0912 17:43:16.665152 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:16.665377 kubelet[3324]: E0912 17:43:16.665191 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:16.665700 kubelet[3324]: E0912 17:43:16.665679 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:16.665797 kubelet[3324]: W0912 17:43:16.665701 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:16.665797 kubelet[3324]: E0912 17:43:16.665719 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:16.666734 kubelet[3324]: E0912 17:43:16.666711 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:16.666734 kubelet[3324]: W0912 17:43:16.666733 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:16.666928 kubelet[3324]: E0912 17:43:16.666750 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:16.668365 kubelet[3324]: E0912 17:43:16.668339 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:16.668365 kubelet[3324]: W0912 17:43:16.668365 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:16.668551 kubelet[3324]: E0912 17:43:16.668388 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:16.668708 kubelet[3324]: E0912 17:43:16.668632 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:16.668708 kubelet[3324]: W0912 17:43:16.668646 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:16.668708 kubelet[3324]: E0912 17:43:16.668661 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:16.668990 kubelet[3324]: E0912 17:43:16.668891 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:16.668990 kubelet[3324]: W0912 17:43:16.668903 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:16.668990 kubelet[3324]: E0912 17:43:16.668917 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:16.670008 kubelet[3324]: E0912 17:43:16.669958 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:16.670008 kubelet[3324]: W0912 17:43:16.669979 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:16.670008 kubelet[3324]: E0912 17:43:16.669996 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:16.670384 kubelet[3324]: E0912 17:43:16.670227 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:16.670384 kubelet[3324]: W0912 17:43:16.670241 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:16.670384 kubelet[3324]: E0912 17:43:16.670256 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:16.670558 kubelet[3324]: E0912 17:43:16.670451 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:16.670558 kubelet[3324]: W0912 17:43:16.670462 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:16.670558 kubelet[3324]: E0912 17:43:16.670476 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:16.670701 kubelet[3324]: E0912 17:43:16.670657 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:16.670701 kubelet[3324]: W0912 17:43:16.670666 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:16.670701 kubelet[3324]: E0912 17:43:16.670679 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:16.671806 kubelet[3324]: E0912 17:43:16.671010 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:16.671806 kubelet[3324]: W0912 17:43:16.671025 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:16.671806 kubelet[3324]: E0912 17:43:16.671039 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:16.671806 kubelet[3324]: E0912 17:43:16.671478 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:16.671806 kubelet[3324]: W0912 17:43:16.671491 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:16.671806 kubelet[3324]: E0912 17:43:16.671505 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:16.672111 kubelet[3324]: E0912 17:43:16.672055 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:16.672111 kubelet[3324]: W0912 17:43:16.672068 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:16.672195 kubelet[3324]: E0912 17:43:16.672084 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:16.672750 kubelet[3324]: E0912 17:43:16.672727 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:16.672750 kubelet[3324]: W0912 17:43:16.672749 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:16.672889 kubelet[3324]: E0912 17:43:16.672765 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:16.700343 kubelet[3324]: E0912 17:43:16.700297 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:16.700343 kubelet[3324]: W0912 17:43:16.700339 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:16.702022 kubelet[3324]: E0912 17:43:16.700366 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:16.702022 kubelet[3324]: E0912 17:43:16.701986 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:16.702022 kubelet[3324]: W0912 17:43:16.702006 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:16.702211 kubelet[3324]: E0912 17:43:16.702051 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:16.703129 kubelet[3324]: E0912 17:43:16.703101 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:16.703129 kubelet[3324]: W0912 17:43:16.703126 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:16.703300 kubelet[3324]: E0912 17:43:16.703150 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:16.703509 kubelet[3324]: E0912 17:43:16.703490 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:16.703576 kubelet[3324]: W0912 17:43:16.703509 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:16.703576 kubelet[3324]: E0912 17:43:16.703537 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:16.704894 kubelet[3324]: E0912 17:43:16.704871 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:16.704894 kubelet[3324]: W0912 17:43:16.704894 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:16.705017 kubelet[3324]: E0912 17:43:16.704924 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:16.705247 kubelet[3324]: E0912 17:43:16.705213 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:16.705247 kubelet[3324]: W0912 17:43:16.705233 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:16.705449 kubelet[3324]: E0912 17:43:16.705430 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:16.705785 kubelet[3324]: E0912 17:43:16.705740 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:16.705785 kubelet[3324]: W0912 17:43:16.705757 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:16.705951 kubelet[3324]: E0912 17:43:16.705933 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:16.706129 kubelet[3324]: E0912 17:43:16.706112 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:16.706197 kubelet[3324]: W0912 17:43:16.706130 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:16.706955 kubelet[3324]: E0912 17:43:16.706932 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:16.707168 kubelet[3324]: E0912 17:43:16.707151 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:16.707168 kubelet[3324]: W0912 17:43:16.707176 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:16.707286 kubelet[3324]: E0912 17:43:16.707261 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:16.707896 kubelet[3324]: E0912 17:43:16.707875 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:16.707896 kubelet[3324]: W0912 17:43:16.707895 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:16.708695 kubelet[3324]: E0912 17:43:16.708063 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:16.708695 kubelet[3324]: E0912 17:43:16.708137 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:16.708695 kubelet[3324]: W0912 17:43:16.708147 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:16.708695 kubelet[3324]: E0912 17:43:16.708175 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:16.708695 kubelet[3324]: E0912 17:43:16.708592 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:16.708695 kubelet[3324]: W0912 17:43:16.708604 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:16.708997 kubelet[3324]: E0912 17:43:16.708757 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:16.709935 kubelet[3324]: E0912 17:43:16.709915 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:16.709935 kubelet[3324]: W0912 17:43:16.709934 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:16.710089 kubelet[3324]: E0912 17:43:16.709964 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:16.710205 kubelet[3324]: E0912 17:43:16.710189 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:16.710273 kubelet[3324]: W0912 17:43:16.710206 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:16.710494 kubelet[3324]: E0912 17:43:16.710473 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:16.710656 kubelet[3324]: E0912 17:43:16.710570 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:16.710656 kubelet[3324]: W0912 17:43:16.710583 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:16.710656 kubelet[3324]: E0912 17:43:16.710609 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:16.711355 kubelet[3324]: E0912 17:43:16.711334 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:16.711355 kubelet[3324]: W0912 17:43:16.711351 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:16.711710 kubelet[3324]: E0912 17:43:16.711369 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:16.712949 kubelet[3324]: E0912 17:43:16.712928 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:16.713023 kubelet[3324]: W0912 17:43:16.712949 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:16.713069 kubelet[3324]: E0912 17:43:16.713046 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:16.713281 kubelet[3324]: E0912 17:43:16.713262 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:16.713344 kubelet[3324]: W0912 17:43:16.713282 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:16.713344 kubelet[3324]: E0912 17:43:16.713297 3324 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:17.280905 containerd[2001]: time="2025-09-12T17:43:17.280858216Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:17.281859 containerd[2001]: time="2025-09-12T17:43:17.281696729Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660" Sep 12 17:43:17.282740 containerd[2001]: time="2025-09-12T17:43:17.282707854Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:17.286720 containerd[2001]: time="2025-09-12T17:43:17.286132291Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:17.286720 containerd[2001]: time="2025-09-12T17:43:17.286605989Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 1.244230419s" Sep 12 17:43:17.286720 containerd[2001]: time="2025-09-12T17:43:17.286633932Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\"" Sep 12 17:43:17.288994 containerd[2001]: time="2025-09-12T17:43:17.288960092Z" level=info msg="CreateContainer within sandbox \"bf2c246a06bc4718c925cd93ada37aab7c8e27cb7925c0ddd5cbbe13f6385ea2\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 12 17:43:17.302721 containerd[2001]: time="2025-09-12T17:43:17.298712734Z" level=info msg="Container ee6fb6ed3ffc03e66b58220b39010819b52de7098487b0730fd095b0ca54f172: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:43:17.321625 containerd[2001]: time="2025-09-12T17:43:17.321571663Z" level=info msg="CreateContainer within sandbox \"bf2c246a06bc4718c925cd93ada37aab7c8e27cb7925c0ddd5cbbe13f6385ea2\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"ee6fb6ed3ffc03e66b58220b39010819b52de7098487b0730fd095b0ca54f172\"" Sep 12 17:43:17.322357 containerd[2001]: time="2025-09-12T17:43:17.322142085Z" level=info msg="StartContainer for \"ee6fb6ed3ffc03e66b58220b39010819b52de7098487b0730fd095b0ca54f172\"" Sep 12 17:43:17.324501 containerd[2001]: time="2025-09-12T17:43:17.323868301Z" level=info msg="connecting to shim ee6fb6ed3ffc03e66b58220b39010819b52de7098487b0730fd095b0ca54f172" address="unix:///run/containerd/s/a6c997eb5e54705b3a9389637a25be9b809fc87a23aebb4a31529af6c20116f2" protocol=ttrpc version=3 Sep 12 17:43:17.366041 systemd[1]: Started cri-containerd-ee6fb6ed3ffc03e66b58220b39010819b52de7098487b0730fd095b0ca54f172.scope - libcontainer container ee6fb6ed3ffc03e66b58220b39010819b52de7098487b0730fd095b0ca54f172. 
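The flexvol-driver container started here, built from the pod2daemon-flexvol image, is what normally installs the uds FlexVolume binary the kubelet has been probing for, so the repeated "unexpected end of JSON input" errors should stop once it has run. A small diagnostic sketch, assuming the same plugin path the kubelet logs above, to confirm the driver actually landed on the host:

```python
# Quick check (a diagnostic sketch, not part of Calico) that the FlexVolume
# driver the kubelet is probing for has been installed and is executable.
import os

driver = "/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds"

if os.path.isfile(driver) and os.access(driver, os.X_OK):
    print(f"{driver}: present and executable")
else:
    print(f"{driver}: missing or not executable "
          "(kubelet will keep logging 'unexpected end of JSON input')")
```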
Sep 12 17:43:17.421341 containerd[2001]: time="2025-09-12T17:43:17.421300817Z" level=info msg="StartContainer for \"ee6fb6ed3ffc03e66b58220b39010819b52de7098487b0730fd095b0ca54f172\" returns successfully" Sep 12 17:43:17.427459 systemd[1]: cri-containerd-ee6fb6ed3ffc03e66b58220b39010819b52de7098487b0730fd095b0ca54f172.scope: Deactivated successfully. Sep 12 17:43:17.440501 containerd[2001]: time="2025-09-12T17:43:17.440448328Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ee6fb6ed3ffc03e66b58220b39010819b52de7098487b0730fd095b0ca54f172\" id:\"ee6fb6ed3ffc03e66b58220b39010819b52de7098487b0730fd095b0ca54f172\" pid:4289 exited_at:{seconds:1757698997 nanos:430145321}" Sep 12 17:43:17.442105 containerd[2001]: time="2025-09-12T17:43:17.442061540Z" level=info msg="received exit event container_id:\"ee6fb6ed3ffc03e66b58220b39010819b52de7098487b0730fd095b0ca54f172\" id:\"ee6fb6ed3ffc03e66b58220b39010819b52de7098487b0730fd095b0ca54f172\" pid:4289 exited_at:{seconds:1757698997 nanos:430145321}" Sep 12 17:43:17.470391 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ee6fb6ed3ffc03e66b58220b39010819b52de7098487b0730fd095b0ca54f172-rootfs.mount: Deactivated successfully. Sep 12 17:43:17.595360 kubelet[3324]: I0912 17:43:17.595035 3324 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 17:43:17.616800 kubelet[3324]: I0912 17:43:17.616703 3324 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-6f7bd579f7-rgw5s" podStartSLOduration=4.230061033 podStartE2EDuration="6.616679899s" podCreationTimestamp="2025-09-12 17:43:11 +0000 UTC" firstStartedPulling="2025-09-12 17:43:13.654874492 +0000 UTC m=+28.421771569" lastFinishedPulling="2025-09-12 17:43:16.041493342 +0000 UTC m=+30.808390435" observedRunningTime="2025-09-12 17:43:16.639510578 +0000 UTC m=+31.406407685" watchObservedRunningTime="2025-09-12 17:43:17.616679899 +0000 UTC m=+32.383576998" Sep 12 17:43:18.366568 kubelet[3324]: E0912 17:43:18.366116 3324 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-267zp" podUID="d4e868f5-c85d-42d9-9fb2-924f44d65af8" Sep 12 17:43:18.599892 containerd[2001]: time="2025-09-12T17:43:18.599859420Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 12 17:43:20.366224 kubelet[3324]: E0912 17:43:20.366175 3324 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-267zp" podUID="d4e868f5-c85d-42d9-9fb2-924f44d65af8" Sep 12 17:43:22.216801 containerd[2001]: time="2025-09-12T17:43:22.216083670Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:22.217585 containerd[2001]: time="2025-09-12T17:43:22.217539743Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613" Sep 12 17:43:22.218280 containerd[2001]: time="2025-09-12T17:43:22.218233777Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 
17:43:22.223797 containerd[2001]: time="2025-09-12T17:43:22.223744389Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:22.224245 containerd[2001]: time="2025-09-12T17:43:22.224184847Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 3.624292281s" Sep 12 17:43:22.224245 containerd[2001]: time="2025-09-12T17:43:22.224213040Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\"" Sep 12 17:43:22.228613 containerd[2001]: time="2025-09-12T17:43:22.228513070Z" level=info msg="CreateContainer within sandbox \"bf2c246a06bc4718c925cd93ada37aab7c8e27cb7925c0ddd5cbbe13f6385ea2\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 12 17:43:22.242267 containerd[2001]: time="2025-09-12T17:43:22.238728448Z" level=info msg="Container 815d18a23404c22cb7f1c82eca64407954e72512edc7ee7b3af9093638220eb7: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:43:22.255492 containerd[2001]: time="2025-09-12T17:43:22.255141932Z" level=info msg="CreateContainer within sandbox \"bf2c246a06bc4718c925cd93ada37aab7c8e27cb7925c0ddd5cbbe13f6385ea2\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"815d18a23404c22cb7f1c82eca64407954e72512edc7ee7b3af9093638220eb7\"" Sep 12 17:43:22.266814 containerd[2001]: time="2025-09-12T17:43:22.266219728Z" level=info msg="StartContainer for \"815d18a23404c22cb7f1c82eca64407954e72512edc7ee7b3af9093638220eb7\"" Sep 12 17:43:22.268147 containerd[2001]: time="2025-09-12T17:43:22.268099853Z" level=info msg="connecting to shim 815d18a23404c22cb7f1c82eca64407954e72512edc7ee7b3af9093638220eb7" address="unix:///run/containerd/s/a6c997eb5e54705b3a9389637a25be9b809fc87a23aebb4a31529af6c20116f2" protocol=ttrpc version=3 Sep 12 17:43:22.299009 systemd[1]: Started cri-containerd-815d18a23404c22cb7f1c82eca64407954e72512edc7ee7b3af9093638220eb7.scope - libcontainer container 815d18a23404c22cb7f1c82eca64407954e72512edc7ee7b3af9093638220eb7. Sep 12 17:43:22.366989 kubelet[3324]: E0912 17:43:22.366940 3324 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-267zp" podUID="d4e868f5-c85d-42d9-9fb2-924f44d65af8" Sep 12 17:43:22.396342 containerd[2001]: time="2025-09-12T17:43:22.396303879Z" level=info msg="StartContainer for \"815d18a23404c22cb7f1c82eca64407954e72512edc7ee7b3af9093638220eb7\" returns successfully" Sep 12 17:43:23.455005 systemd[1]: cri-containerd-815d18a23404c22cb7f1c82eca64407954e72512edc7ee7b3af9093638220eb7.scope: Deactivated successfully. Sep 12 17:43:23.455521 systemd[1]: cri-containerd-815d18a23404c22cb7f1c82eca64407954e72512edc7ee7b3af9093638220eb7.scope: Consumed 594ms CPU time, 163.1M memory peak, 12.9M read from disk, 171.3M written to disk. 
Sep 12 17:43:23.488526 containerd[2001]: time="2025-09-12T17:43:23.488478535Z" level=info msg="received exit event container_id:\"815d18a23404c22cb7f1c82eca64407954e72512edc7ee7b3af9093638220eb7\" id:\"815d18a23404c22cb7f1c82eca64407954e72512edc7ee7b3af9093638220eb7\" pid:4349 exited_at:{seconds:1757699003 nanos:488244381}" Sep 12 17:43:23.491583 containerd[2001]: time="2025-09-12T17:43:23.491531849Z" level=info msg="TaskExit event in podsandbox handler container_id:\"815d18a23404c22cb7f1c82eca64407954e72512edc7ee7b3af9093638220eb7\" id:\"815d18a23404c22cb7f1c82eca64407954e72512edc7ee7b3af9093638220eb7\" pid:4349 exited_at:{seconds:1757699003 nanos:488244381}" Sep 12 17:43:23.533990 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-815d18a23404c22cb7f1c82eca64407954e72512edc7ee7b3af9093638220eb7-rootfs.mount: Deactivated successfully. Sep 12 17:43:23.534713 kubelet[3324]: I0912 17:43:23.534607 3324 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Sep 12 17:43:23.703386 systemd[1]: Created slice kubepods-burstable-pode23da1d8_1bca_4b44_9e73_fa38bacfb830.slice - libcontainer container kubepods-burstable-pode23da1d8_1bca_4b44_9e73_fa38bacfb830.slice. Sep 12 17:43:23.721619 systemd[1]: Created slice kubepods-besteffort-pod8d009f85_f699_4fc5_8e74_03c3edfda177.slice - libcontainer container kubepods-besteffort-pod8d009f85_f699_4fc5_8e74_03c3edfda177.slice. Sep 12 17:43:23.730640 systemd[1]: Created slice kubepods-besteffort-pod51120cf8_ca12_4fc1_807c_f699b7fc3a7c.slice - libcontainer container kubepods-besteffort-pod51120cf8_ca12_4fc1_807c_f699b7fc3a7c.slice. Sep 12 17:43:23.745721 systemd[1]: Created slice kubepods-burstable-pod37d43a87_2d60_4e31_bcad_4f3417a16039.slice - libcontainer container kubepods-burstable-pod37d43a87_2d60_4e31_bcad_4f3417a16039.slice. Sep 12 17:43:23.760827 systemd[1]: Created slice kubepods-besteffort-pod9aa4e000_a87f_4a24_b970_b02f33097384.slice - libcontainer container kubepods-besteffort-pod9aa4e000_a87f_4a24_b970_b02f33097384.slice. 
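The slice names created here follow the kubelet's systemd cgroup-driver convention: kubepods-<qos>-pod<uid>.slice, with the dashes in the pod UID rewritten as underscores. A small helper reproducing that naming (a sketch of the convention, not kubelet code):

```python
# Reproduce the systemd slice name the kubelet derives for a pod, as seen in
# the "Created slice kubepods-..." messages above.
def pod_slice_name(qos: str, pod_uid: str) -> str:
    return f"kubepods-{qos}-pod{pod_uid.replace('-', '_')}.slice"

print(pod_slice_name("burstable", "e23da1d8-1bca-4b44-9e73-fa38bacfb830"))
# -> kubepods-burstable-pode23da1d8_1bca_4b44_9e73_fa38bacfb830.slice
```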
Sep 12 17:43:23.765240 kubelet[3324]: I0912 17:43:23.764514 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/675ef9e0-ef8b-4d35-995f-611dbe700deb-calico-apiserver-certs\") pod \"calico-apiserver-7d55cd4957-hw5ks\" (UID: \"675ef9e0-ef8b-4d35-995f-611dbe700deb\") " pod="calico-apiserver/calico-apiserver-7d55cd4957-hw5ks" Sep 12 17:43:23.765240 kubelet[3324]: I0912 17:43:23.764603 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/51120cf8-ca12-4fc1-807c-f699b7fc3a7c-calico-apiserver-certs\") pod \"calico-apiserver-7d55cd4957-jrtsh\" (UID: \"51120cf8-ca12-4fc1-807c-f699b7fc3a7c\") " pod="calico-apiserver/calico-apiserver-7d55cd4957-jrtsh" Sep 12 17:43:23.765240 kubelet[3324]: I0912 17:43:23.764632 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpz86\" (UniqueName: \"kubernetes.io/projected/8d009f85-f699-4fc5-8e74-03c3edfda177-kube-api-access-rpz86\") pod \"goldmane-7988f88666-9hfk5\" (UID: \"8d009f85-f699-4fc5-8e74-03c3edfda177\") " pod="calico-system/goldmane-7988f88666-9hfk5" Sep 12 17:43:23.765240 kubelet[3324]: I0912 17:43:23.764712 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/37d43a87-2d60-4e31-bcad-4f3417a16039-config-volume\") pod \"coredns-7c65d6cfc9-kfp5q\" (UID: \"37d43a87-2d60-4e31-bcad-4f3417a16039\") " pod="kube-system/coredns-7c65d6cfc9-kfp5q" Sep 12 17:43:23.766038 kubelet[3324]: I0912 17:43:23.765996 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdvz7\" (UniqueName: \"kubernetes.io/projected/26bd2ed8-7dc0-44ea-aeba-3dab86eb47f6-kube-api-access-rdvz7\") pod \"whisker-54b795c59c-hdh9l\" (UID: \"26bd2ed8-7dc0-44ea-aeba-3dab86eb47f6\") " pod="calico-system/whisker-54b795c59c-hdh9l" Sep 12 17:43:23.771130 kubelet[3324]: I0912 17:43:23.771062 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d009f85-f699-4fc5-8e74-03c3edfda177-config\") pod \"goldmane-7988f88666-9hfk5\" (UID: \"8d009f85-f699-4fc5-8e74-03c3edfda177\") " pod="calico-system/goldmane-7988f88666-9hfk5" Sep 12 17:43:23.771130 kubelet[3324]: I0912 17:43:23.771124 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9aa4e000-a87f-4a24-b970-b02f33097384-tigera-ca-bundle\") pod \"calico-kube-controllers-6dd46b974c-2hw57\" (UID: \"9aa4e000-a87f-4a24-b970-b02f33097384\") " pod="calico-system/calico-kube-controllers-6dd46b974c-2hw57" Sep 12 17:43:23.771130 kubelet[3324]: I0912 17:43:23.771168 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm2cs\" (UniqueName: \"kubernetes.io/projected/675ef9e0-ef8b-4d35-995f-611dbe700deb-kube-api-access-qm2cs\") pod \"calico-apiserver-7d55cd4957-hw5ks\" (UID: \"675ef9e0-ef8b-4d35-995f-611dbe700deb\") " pod="calico-apiserver/calico-apiserver-7d55cd4957-hw5ks" Sep 12 17:43:23.771130 kubelet[3324]: I0912 17:43:23.771209 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/8d009f85-f699-4fc5-8e74-03c3edfda177-goldmane-ca-bundle\") pod \"goldmane-7988f88666-9hfk5\" (UID: \"8d009f85-f699-4fc5-8e74-03c3edfda177\") " pod="calico-system/goldmane-7988f88666-9hfk5" Sep 12 17:43:23.771510 kubelet[3324]: I0912 17:43:23.771241 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26bd2ed8-7dc0-44ea-aeba-3dab86eb47f6-whisker-ca-bundle\") pod \"whisker-54b795c59c-hdh9l\" (UID: \"26bd2ed8-7dc0-44ea-aeba-3dab86eb47f6\") " pod="calico-system/whisker-54b795c59c-hdh9l" Sep 12 17:43:23.771510 kubelet[3324]: I0912 17:43:23.771277 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4gh8\" (UniqueName: \"kubernetes.io/projected/51120cf8-ca12-4fc1-807c-f699b7fc3a7c-kube-api-access-n4gh8\") pod \"calico-apiserver-7d55cd4957-jrtsh\" (UID: \"51120cf8-ca12-4fc1-807c-f699b7fc3a7c\") " pod="calico-apiserver/calico-apiserver-7d55cd4957-jrtsh" Sep 12 17:43:23.771510 kubelet[3324]: I0912 17:43:23.771314 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/26bd2ed8-7dc0-44ea-aeba-3dab86eb47f6-whisker-backend-key-pair\") pod \"whisker-54b795c59c-hdh9l\" (UID: \"26bd2ed8-7dc0-44ea-aeba-3dab86eb47f6\") " pod="calico-system/whisker-54b795c59c-hdh9l" Sep 12 17:43:23.771510 kubelet[3324]: I0912 17:43:23.771354 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-847fv\" (UniqueName: \"kubernetes.io/projected/9aa4e000-a87f-4a24-b970-b02f33097384-kube-api-access-847fv\") pod \"calico-kube-controllers-6dd46b974c-2hw57\" (UID: \"9aa4e000-a87f-4a24-b970-b02f33097384\") " pod="calico-system/calico-kube-controllers-6dd46b974c-2hw57" Sep 12 17:43:23.771510 kubelet[3324]: I0912 17:43:23.771388 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/8d009f85-f699-4fc5-8e74-03c3edfda177-goldmane-key-pair\") pod \"goldmane-7988f88666-9hfk5\" (UID: \"8d009f85-f699-4fc5-8e74-03c3edfda177\") " pod="calico-system/goldmane-7988f88666-9hfk5" Sep 12 17:43:23.771717 kubelet[3324]: I0912 17:43:23.771428 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e23da1d8-1bca-4b44-9e73-fa38bacfb830-config-volume\") pod \"coredns-7c65d6cfc9-khsdd\" (UID: \"e23da1d8-1bca-4b44-9e73-fa38bacfb830\") " pod="kube-system/coredns-7c65d6cfc9-khsdd" Sep 12 17:43:23.771717 kubelet[3324]: I0912 17:43:23.771459 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmq8b\" (UniqueName: \"kubernetes.io/projected/e23da1d8-1bca-4b44-9e73-fa38bacfb830-kube-api-access-hmq8b\") pod \"coredns-7c65d6cfc9-khsdd\" (UID: \"e23da1d8-1bca-4b44-9e73-fa38bacfb830\") " pod="kube-system/coredns-7c65d6cfc9-khsdd" Sep 12 17:43:23.771717 kubelet[3324]: I0912 17:43:23.771493 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbn8z\" (UniqueName: \"kubernetes.io/projected/37d43a87-2d60-4e31-bcad-4f3417a16039-kube-api-access-tbn8z\") pod \"coredns-7c65d6cfc9-kfp5q\" (UID: \"37d43a87-2d60-4e31-bcad-4f3417a16039\") " pod="kube-system/coredns-7c65d6cfc9-kfp5q" 
Sep 12 17:43:23.776605 systemd[1]: Created slice kubepods-besteffort-pod675ef9e0_ef8b_4d35_995f_611dbe700deb.slice - libcontainer container kubepods-besteffort-pod675ef9e0_ef8b_4d35_995f_611dbe700deb.slice. Sep 12 17:43:23.791856 systemd[1]: Created slice kubepods-besteffort-pod26bd2ed8_7dc0_44ea_aeba_3dab86eb47f6.slice - libcontainer container kubepods-besteffort-pod26bd2ed8_7dc0_44ea_aeba_3dab86eb47f6.slice. Sep 12 17:43:24.024089 containerd[2001]: time="2025-09-12T17:43:24.023950786Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-khsdd,Uid:e23da1d8-1bca-4b44-9e73-fa38bacfb830,Namespace:kube-system,Attempt:0,}" Sep 12 17:43:24.030103 containerd[2001]: time="2025-09-12T17:43:24.029800456Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-9hfk5,Uid:8d009f85-f699-4fc5-8e74-03c3edfda177,Namespace:calico-system,Attempt:0,}" Sep 12 17:43:24.044392 containerd[2001]: time="2025-09-12T17:43:24.044345579Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d55cd4957-jrtsh,Uid:51120cf8-ca12-4fc1-807c-f699b7fc3a7c,Namespace:calico-apiserver,Attempt:0,}" Sep 12 17:43:24.062126 containerd[2001]: time="2025-09-12T17:43:24.062073411Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-kfp5q,Uid:37d43a87-2d60-4e31-bcad-4f3417a16039,Namespace:kube-system,Attempt:0,}" Sep 12 17:43:24.082619 containerd[2001]: time="2025-09-12T17:43:24.082577294Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6dd46b974c-2hw57,Uid:9aa4e000-a87f-4a24-b970-b02f33097384,Namespace:calico-system,Attempt:0,}" Sep 12 17:43:24.087204 containerd[2001]: time="2025-09-12T17:43:24.087015111Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d55cd4957-hw5ks,Uid:675ef9e0-ef8b-4d35-995f-611dbe700deb,Namespace:calico-apiserver,Attempt:0,}" Sep 12 17:43:24.108033 containerd[2001]: time="2025-09-12T17:43:24.107988356Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-54b795c59c-hdh9l,Uid:26bd2ed8-7dc0-44ea-aeba-3dab86eb47f6,Namespace:calico-system,Attempt:0,}" Sep 12 17:43:24.378452 systemd[1]: Created slice kubepods-besteffort-podd4e868f5_c85d_42d9_9fb2_924f44d65af8.slice - libcontainer container kubepods-besteffort-podd4e868f5_c85d_42d9_9fb2_924f44d65af8.slice. Sep 12 17:43:24.387877 containerd[2001]: time="2025-09-12T17:43:24.387761792Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-267zp,Uid:d4e868f5-c85d-42d9-9fb2-924f44d65af8,Namespace:calico-system,Attempt:0,}" Sep 12 17:43:24.551793 containerd[2001]: time="2025-09-12T17:43:24.550962183Z" level=error msg="Failed to destroy network for sandbox \"73a07f6a8c7c2f918b7fa7dafd69a47a217c5f3e2ab8b6eafe03a6c0352f7503\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:24.563167 systemd[1]: run-netns-cni\x2d5727d0cc\x2d7c2b\x2df767\x2db74b\x2daba610951749.mount: Deactivated successfully. 
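The "stat /var/lib/calico/nodename" failures that begin here are the Calico CNI plugin's readiness gate: until the calico/node container is running and has written the node's name into /var/lib/calico/nodename (bind-mounted from the host), every sandbox network setup or teardown fails with this message, which is also why the csi-node-driver pod above keeps reporting "cni plugin not initialized". A sketch of the same check:

```python
# Sketch of the readiness condition the Calico CNI plugin enforces in the
# errors above: pod networking cannot be set up until calico/node has written
# this node's name to /var/lib/calico/nodename on the host.
from pathlib import Path

nodename_file = Path("/var/lib/calico/nodename")

try:
    nodename = nodename_file.read_text().strip()
    print(f"calico/node is up; this node is registered as {nodename!r}")
except FileNotFoundError:
    print("stat /var/lib/calico/nodename: no such file or directory: "
          "check that the calico/node container is running and has mounted /var/lib/calico/")
```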
Sep 12 17:43:24.571799 containerd[2001]: time="2025-09-12T17:43:24.571697528Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-kfp5q,Uid:37d43a87-2d60-4e31-bcad-4f3417a16039,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"73a07f6a8c7c2f918b7fa7dafd69a47a217c5f3e2ab8b6eafe03a6c0352f7503\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:24.573215 containerd[2001]: time="2025-09-12T17:43:24.572993061Z" level=error msg="Failed to destroy network for sandbox \"0c9e9cc51b0fb662dde1102cb0c38014cf52e870cbe658b3922ddacd4c938a6d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:24.579079 containerd[2001]: time="2025-09-12T17:43:24.579024070Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-9hfk5,Uid:8d009f85-f699-4fc5-8e74-03c3edfda177,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0c9e9cc51b0fb662dde1102cb0c38014cf52e870cbe658b3922ddacd4c938a6d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:24.581886 systemd[1]: run-netns-cni\x2ddb9077b6\x2db938\x2d2e00\x2da63b\x2d5bcd0ec59883.mount: Deactivated successfully. Sep 12 17:43:24.602075 kubelet[3324]: E0912 17:43:24.602022 3324 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0c9e9cc51b0fb662dde1102cb0c38014cf52e870cbe658b3922ddacd4c938a6d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:24.603052 kubelet[3324]: E0912 17:43:24.602110 3324 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0c9e9cc51b0fb662dde1102cb0c38014cf52e870cbe658b3922ddacd4c938a6d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-9hfk5" Sep 12 17:43:24.603052 kubelet[3324]: E0912 17:43:24.602135 3324 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0c9e9cc51b0fb662dde1102cb0c38014cf52e870cbe658b3922ddacd4c938a6d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-9hfk5" Sep 12 17:43:24.603052 kubelet[3324]: E0912 17:43:24.602198 3324 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7988f88666-9hfk5_calico-system(8d009f85-f699-4fc5-8e74-03c3edfda177)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7988f88666-9hfk5_calico-system(8d009f85-f699-4fc5-8e74-03c3edfda177)\\\": rpc error: code = Unknown desc = failed to setup 
network for sandbox \\\"0c9e9cc51b0fb662dde1102cb0c38014cf52e870cbe658b3922ddacd4c938a6d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7988f88666-9hfk5" podUID="8d009f85-f699-4fc5-8e74-03c3edfda177" Sep 12 17:43:24.605007 containerd[2001]: time="2025-09-12T17:43:24.602850358Z" level=error msg="Failed to destroy network for sandbox \"bc33a57ca80425c7064459ca87a1ab753a0b3bda8c1220591b23f99fb926be29\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:24.605081 kubelet[3324]: E0912 17:43:24.602022 3324 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"73a07f6a8c7c2f918b7fa7dafd69a47a217c5f3e2ab8b6eafe03a6c0352f7503\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:24.605081 kubelet[3324]: E0912 17:43:24.602588 3324 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"73a07f6a8c7c2f918b7fa7dafd69a47a217c5f3e2ab8b6eafe03a6c0352f7503\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-kfp5q" Sep 12 17:43:24.605081 kubelet[3324]: E0912 17:43:24.602640 3324 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"73a07f6a8c7c2f918b7fa7dafd69a47a217c5f3e2ab8b6eafe03a6c0352f7503\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-kfp5q" Sep 12 17:43:24.605221 kubelet[3324]: E0912 17:43:24.602706 3324 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-kfp5q_kube-system(37d43a87-2d60-4e31-bcad-4f3417a16039)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-kfp5q_kube-system(37d43a87-2d60-4e31-bcad-4f3417a16039)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"73a07f6a8c7c2f918b7fa7dafd69a47a217c5f3e2ab8b6eafe03a6c0352f7503\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-kfp5q" podUID="37d43a87-2d60-4e31-bcad-4f3417a16039" Sep 12 17:43:24.608838 containerd[2001]: time="2025-09-12T17:43:24.607941940Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-khsdd,Uid:e23da1d8-1bca-4b44-9e73-fa38bacfb830,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"bc33a57ca80425c7064459ca87a1ab753a0b3bda8c1220591b23f99fb926be29\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Sep 12 17:43:24.610249 kubelet[3324]: E0912 17:43:24.608189 3324 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bc33a57ca80425c7064459ca87a1ab753a0b3bda8c1220591b23f99fb926be29\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:24.610249 kubelet[3324]: E0912 17:43:24.608249 3324 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bc33a57ca80425c7064459ca87a1ab753a0b3bda8c1220591b23f99fb926be29\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-khsdd" Sep 12 17:43:24.610249 kubelet[3324]: E0912 17:43:24.608281 3324 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bc33a57ca80425c7064459ca87a1ab753a0b3bda8c1220591b23f99fb926be29\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-khsdd" Sep 12 17:43:24.608202 systemd[1]: run-netns-cni\x2d6417c74d\x2dd3c8\x2db3ec\x2d2a7d\x2d684e20039126.mount: Deactivated successfully. Sep 12 17:43:24.611817 kubelet[3324]: E0912 17:43:24.608328 3324 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-khsdd_kube-system(e23da1d8-1bca-4b44-9e73-fa38bacfb830)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-khsdd_kube-system(e23da1d8-1bca-4b44-9e73-fa38bacfb830)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bc33a57ca80425c7064459ca87a1ab753a0b3bda8c1220591b23f99fb926be29\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-khsdd" podUID="e23da1d8-1bca-4b44-9e73-fa38bacfb830" Sep 12 17:43:24.612399 containerd[2001]: time="2025-09-12T17:43:24.612358280Z" level=error msg="Failed to destroy network for sandbox \"1c8549d70148d792888b47d29c5ca446482ba2d067371a764548bd885f69a1d1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:24.619385 containerd[2001]: time="2025-09-12T17:43:24.619335387Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6dd46b974c-2hw57,Uid:9aa4e000-a87f-4a24-b970-b02f33097384,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1c8549d70148d792888b47d29c5ca446482ba2d067371a764548bd885f69a1d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:24.620588 kubelet[3324]: E0912 17:43:24.619954 3324 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"1c8549d70148d792888b47d29c5ca446482ba2d067371a764548bd885f69a1d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:24.621560 kubelet[3324]: E0912 17:43:24.621171 3324 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1c8549d70148d792888b47d29c5ca446482ba2d067371a764548bd885f69a1d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6dd46b974c-2hw57" Sep 12 17:43:24.621560 kubelet[3324]: E0912 17:43:24.621213 3324 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1c8549d70148d792888b47d29c5ca446482ba2d067371a764548bd885f69a1d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6dd46b974c-2hw57" Sep 12 17:43:24.621560 kubelet[3324]: E0912 17:43:24.621271 3324 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6dd46b974c-2hw57_calico-system(9aa4e000-a87f-4a24-b970-b02f33097384)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6dd46b974c-2hw57_calico-system(9aa4e000-a87f-4a24-b970-b02f33097384)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1c8549d70148d792888b47d29c5ca446482ba2d067371a764548bd885f69a1d1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6dd46b974c-2hw57" podUID="9aa4e000-a87f-4a24-b970-b02f33097384" Sep 12 17:43:24.621231 systemd[1]: run-netns-cni\x2def2ff2ce\x2dcc8b\x2d311f\x2dc565\x2d906a0397fa9b.mount: Deactivated successfully. 
Sep 12 17:43:24.623942 containerd[2001]: time="2025-09-12T17:43:24.623905524Z" level=error msg="Failed to destroy network for sandbox \"eb0f4804923f9a66c5f73282c663ac56729a854526da1f4c203ccd1837473bf1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:24.628436 containerd[2001]: time="2025-09-12T17:43:24.628029817Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d55cd4957-hw5ks,Uid:675ef9e0-ef8b-4d35-995f-611dbe700deb,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"eb0f4804923f9a66c5f73282c663ac56729a854526da1f4c203ccd1837473bf1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:24.629633 kubelet[3324]: E0912 17:43:24.629190 3324 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"eb0f4804923f9a66c5f73282c663ac56729a854526da1f4c203ccd1837473bf1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:24.629633 kubelet[3324]: E0912 17:43:24.629297 3324 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"eb0f4804923f9a66c5f73282c663ac56729a854526da1f4c203ccd1837473bf1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7d55cd4957-hw5ks" Sep 12 17:43:24.629633 kubelet[3324]: E0912 17:43:24.629326 3324 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"eb0f4804923f9a66c5f73282c663ac56729a854526da1f4c203ccd1837473bf1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7d55cd4957-hw5ks" Sep 12 17:43:24.630024 kubelet[3324]: E0912 17:43:24.629373 3324 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7d55cd4957-hw5ks_calico-apiserver(675ef9e0-ef8b-4d35-995f-611dbe700deb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7d55cd4957-hw5ks_calico-apiserver(675ef9e0-ef8b-4d35-995f-611dbe700deb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"eb0f4804923f9a66c5f73282c663ac56729a854526da1f4c203ccd1837473bf1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7d55cd4957-hw5ks" podUID="675ef9e0-ef8b-4d35-995f-611dbe700deb" Sep 12 17:43:24.635576 containerd[2001]: time="2025-09-12T17:43:24.635451283Z" level=error msg="Failed to destroy network for sandbox \"7c3daea485ba810de829d44987cad21d7e4b3fc6a875805e1c0dfeb04b54264b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: 
no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:24.637355 containerd[2001]: time="2025-09-12T17:43:24.637150027Z" level=error msg="Failed to destroy network for sandbox \"ee1f64fd0d3d1335a7994127077fcf4d014b980603d7016021f49c8acf7079de\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:24.641517 containerd[2001]: time="2025-09-12T17:43:24.641309006Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-54b795c59c-hdh9l,Uid:26bd2ed8-7dc0-44ea-aeba-3dab86eb47f6,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7c3daea485ba810de829d44987cad21d7e4b3fc6a875805e1c0dfeb04b54264b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:24.642203 kubelet[3324]: E0912 17:43:24.642161 3324 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7c3daea485ba810de829d44987cad21d7e4b3fc6a875805e1c0dfeb04b54264b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:24.642424 kubelet[3324]: E0912 17:43:24.642218 3324 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7c3daea485ba810de829d44987cad21d7e4b3fc6a875805e1c0dfeb04b54264b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-54b795c59c-hdh9l" Sep 12 17:43:24.642424 kubelet[3324]: E0912 17:43:24.642245 3324 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7c3daea485ba810de829d44987cad21d7e4b3fc6a875805e1c0dfeb04b54264b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-54b795c59c-hdh9l" Sep 12 17:43:24.642424 kubelet[3324]: E0912 17:43:24.642300 3324 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-54b795c59c-hdh9l_calico-system(26bd2ed8-7dc0-44ea-aeba-3dab86eb47f6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-54b795c59c-hdh9l_calico-system(26bd2ed8-7dc0-44ea-aeba-3dab86eb47f6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7c3daea485ba810de829d44987cad21d7e4b3fc6a875805e1c0dfeb04b54264b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-54b795c59c-hdh9l" podUID="26bd2ed8-7dc0-44ea-aeba-3dab86eb47f6" Sep 12 17:43:24.643947 containerd[2001]: time="2025-09-12T17:43:24.643882795Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-7d55cd4957-jrtsh,Uid:51120cf8-ca12-4fc1-807c-f699b7fc3a7c,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ee1f64fd0d3d1335a7994127077fcf4d014b980603d7016021f49c8acf7079de\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:24.646358 kubelet[3324]: E0912 17:43:24.645297 3324 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ee1f64fd0d3d1335a7994127077fcf4d014b980603d7016021f49c8acf7079de\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:24.646358 kubelet[3324]: E0912 17:43:24.645366 3324 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ee1f64fd0d3d1335a7994127077fcf4d014b980603d7016021f49c8acf7079de\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7d55cd4957-jrtsh" Sep 12 17:43:24.646358 kubelet[3324]: E0912 17:43:24.645399 3324 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ee1f64fd0d3d1335a7994127077fcf4d014b980603d7016021f49c8acf7079de\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7d55cd4957-jrtsh" Sep 12 17:43:24.646559 containerd[2001]: time="2025-09-12T17:43:24.645901929Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 12 17:43:24.646622 kubelet[3324]: E0912 17:43:24.645443 3324 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7d55cd4957-jrtsh_calico-apiserver(51120cf8-ca12-4fc1-807c-f699b7fc3a7c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7d55cd4957-jrtsh_calico-apiserver(51120cf8-ca12-4fc1-807c-f699b7fc3a7c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ee1f64fd0d3d1335a7994127077fcf4d014b980603d7016021f49c8acf7079de\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7d55cd4957-jrtsh" podUID="51120cf8-ca12-4fc1-807c-f699b7fc3a7c" Sep 12 17:43:24.691362 containerd[2001]: time="2025-09-12T17:43:24.691307755Z" level=error msg="Failed to destroy network for sandbox \"67d0029f6db8396fc3ab1852a48d456aa673150e9d057a9861a13682fd49ebd6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:24.693862 containerd[2001]: time="2025-09-12T17:43:24.693741064Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-267zp,Uid:d4e868f5-c85d-42d9-9fb2-924f44d65af8,Namespace:calico-system,Attempt:0,} failed, 
error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"67d0029f6db8396fc3ab1852a48d456aa673150e9d057a9861a13682fd49ebd6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:24.694116 kubelet[3324]: E0912 17:43:24.694077 3324 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"67d0029f6db8396fc3ab1852a48d456aa673150e9d057a9861a13682fd49ebd6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:24.694240 kubelet[3324]: E0912 17:43:24.694145 3324 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"67d0029f6db8396fc3ab1852a48d456aa673150e9d057a9861a13682fd49ebd6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-267zp" Sep 12 17:43:24.694240 kubelet[3324]: E0912 17:43:24.694173 3324 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"67d0029f6db8396fc3ab1852a48d456aa673150e9d057a9861a13682fd49ebd6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-267zp" Sep 12 17:43:24.694559 kubelet[3324]: E0912 17:43:24.694266 3324 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-267zp_calico-system(d4e868f5-c85d-42d9-9fb2-924f44d65af8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-267zp_calico-system(d4e868f5-c85d-42d9-9fb2-924f44d65af8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"67d0029f6db8396fc3ab1852a48d456aa673150e9d057a9861a13682fd49ebd6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-267zp" podUID="d4e868f5-c85d-42d9-9fb2-924f44d65af8" Sep 12 17:43:25.532245 systemd[1]: run-netns-cni\x2db31ec21f\x2db526\x2dab09\x2d89e3\x2db525fc150fa9.mount: Deactivated successfully. Sep 12 17:43:25.532362 systemd[1]: run-netns-cni\x2d239ec89e\x2d1acc\x2d9dd5\x2dd7ab\x2da907c580e62a.mount: Deactivated successfully. Sep 12 17:43:25.532421 systemd[1]: run-netns-cni\x2d7d1aab3f\x2d4367\x2df9e5\x2dc961\x2d0c70858d8027.mount: Deactivated successfully. Sep 12 17:43:25.532472 systemd[1]: run-netns-cni\x2d5693b2d3\x2dd2b7\x2d261f\x2d8fe9\x2d6380a2cff37e.mount: Deactivated successfully. Sep 12 17:43:30.495666 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount643816358.mount: Deactivated successfully. 
Sep 12 17:43:30.648546 containerd[2001]: time="2025-09-12T17:43:30.645763264Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 12 17:43:30.648546 containerd[2001]: time="2025-09-12T17:43:30.638691454Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:30.664792 containerd[2001]: time="2025-09-12T17:43:30.664730852Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:30.681419 containerd[2001]: time="2025-09-12T17:43:30.681303834Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 6.031776965s" Sep 12 17:43:30.681606 containerd[2001]: time="2025-09-12T17:43:30.681587753Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 12 17:43:30.695048 containerd[2001]: time="2025-09-12T17:43:30.694995394Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:30.707033 containerd[2001]: time="2025-09-12T17:43:30.706969080Z" level=info msg="CreateContainer within sandbox \"bf2c246a06bc4718c925cd93ada37aab7c8e27cb7925c0ddd5cbbe13f6385ea2\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 12 17:43:30.800867 containerd[2001]: time="2025-09-12T17:43:30.797691418Z" level=info msg="Container 66cf151bffc1dd97b3e77ad82cb94f5f9cf1a0688d109a9c513558d80a13ff0f: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:43:30.801992 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3163119614.mount: Deactivated successfully. Sep 12 17:43:30.901415 containerd[2001]: time="2025-09-12T17:43:30.901364729Z" level=info msg="CreateContainer within sandbox \"bf2c246a06bc4718c925cd93ada37aab7c8e27cb7925c0ddd5cbbe13f6385ea2\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"66cf151bffc1dd97b3e77ad82cb94f5f9cf1a0688d109a9c513558d80a13ff0f\"" Sep 12 17:43:30.901921 containerd[2001]: time="2025-09-12T17:43:30.901888600Z" level=info msg="StartContainer for \"66cf151bffc1dd97b3e77ad82cb94f5f9cf1a0688d109a9c513558d80a13ff0f\"" Sep 12 17:43:30.912672 containerd[2001]: time="2025-09-12T17:43:30.912607920Z" level=info msg="connecting to shim 66cf151bffc1dd97b3e77ad82cb94f5f9cf1a0688d109a9c513558d80a13ff0f" address="unix:///run/containerd/s/a6c997eb5e54705b3a9389637a25be9b809fc87a23aebb4a31529af6c20116f2" protocol=ttrpc version=3 Sep 12 17:43:31.133191 systemd[1]: Started cri-containerd-66cf151bffc1dd97b3e77ad82cb94f5f9cf1a0688d109a9c513558d80a13ff0f.scope - libcontainer container 66cf151bffc1dd97b3e77ad82cb94f5f9cf1a0688d109a9c513558d80a13ff0f. Sep 12 17:43:31.240239 containerd[2001]: time="2025-09-12T17:43:31.240188127Z" level=info msg="StartContainer for \"66cf151bffc1dd97b3e77ad82cb94f5f9cf1a0688d109a9c513558d80a13ff0f\" returns successfully" Sep 12 17:43:31.459037 kernel: wireguard: WireGuard 1.0.0 loaded. 
See www.wireguard.com for information. Sep 12 17:43:31.460372 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Sep 12 17:43:31.786249 kubelet[3324]: I0912 17:43:31.785083 3324 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-w6hh9" podStartSLOduration=3.963270027 podStartE2EDuration="20.785058512s" podCreationTimestamp="2025-09-12 17:43:11 +0000 UTC" firstStartedPulling="2025-09-12 17:43:13.862125784 +0000 UTC m=+28.629022862" lastFinishedPulling="2025-09-12 17:43:30.68391427 +0000 UTC m=+45.450811347" observedRunningTime="2025-09-12 17:43:31.73014861 +0000 UTC m=+46.497045708" watchObservedRunningTime="2025-09-12 17:43:31.785058512 +0000 UTC m=+46.551955626" Sep 12 17:43:31.839804 kubelet[3324]: I0912 17:43:31.839401 3324 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26bd2ed8-7dc0-44ea-aeba-3dab86eb47f6-whisker-ca-bundle\") pod \"26bd2ed8-7dc0-44ea-aeba-3dab86eb47f6\" (UID: \"26bd2ed8-7dc0-44ea-aeba-3dab86eb47f6\") " Sep 12 17:43:31.839804 kubelet[3324]: I0912 17:43:31.839448 3324 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/26bd2ed8-7dc0-44ea-aeba-3dab86eb47f6-whisker-backend-key-pair\") pod \"26bd2ed8-7dc0-44ea-aeba-3dab86eb47f6\" (UID: \"26bd2ed8-7dc0-44ea-aeba-3dab86eb47f6\") " Sep 12 17:43:31.855849 kubelet[3324]: I0912 17:43:31.855287 3324 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdvz7\" (UniqueName: \"kubernetes.io/projected/26bd2ed8-7dc0-44ea-aeba-3dab86eb47f6-kube-api-access-rdvz7\") pod \"26bd2ed8-7dc0-44ea-aeba-3dab86eb47f6\" (UID: \"26bd2ed8-7dc0-44ea-aeba-3dab86eb47f6\") " Sep 12 17:43:31.864909 systemd[1]: var-lib-kubelet-pods-26bd2ed8\x2d7dc0\x2d44ea\x2daeba\x2d3dab86eb47f6-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2drdvz7.mount: Deactivated successfully. Sep 12 17:43:31.868055 kubelet[3324]: I0912 17:43:31.865074 3324 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26bd2ed8-7dc0-44ea-aeba-3dab86eb47f6-kube-api-access-rdvz7" (OuterVolumeSpecName: "kube-api-access-rdvz7") pod "26bd2ed8-7dc0-44ea-aeba-3dab86eb47f6" (UID: "26bd2ed8-7dc0-44ea-aeba-3dab86eb47f6"). InnerVolumeSpecName "kube-api-access-rdvz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 12 17:43:31.868055 kubelet[3324]: I0912 17:43:31.865419 3324 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26bd2ed8-7dc0-44ea-aeba-3dab86eb47f6-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "26bd2ed8-7dc0-44ea-aeba-3dab86eb47f6" (UID: "26bd2ed8-7dc0-44ea-aeba-3dab86eb47f6"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 12 17:43:31.889790 systemd[1]: var-lib-kubelet-pods-26bd2ed8\x2d7dc0\x2d44ea\x2daeba\x2d3dab86eb47f6-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 12 17:43:31.894200 kubelet[3324]: I0912 17:43:31.893555 3324 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26bd2ed8-7dc0-44ea-aeba-3dab86eb47f6-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "26bd2ed8-7dc0-44ea-aeba-3dab86eb47f6" (UID: "26bd2ed8-7dc0-44ea-aeba-3dab86eb47f6"). 
InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 12 17:43:31.956127 kubelet[3324]: I0912 17:43:31.956081 3324 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdvz7\" (UniqueName: \"kubernetes.io/projected/26bd2ed8-7dc0-44ea-aeba-3dab86eb47f6-kube-api-access-rdvz7\") on node \"ip-172-31-17-147\" DevicePath \"\"" Sep 12 17:43:31.956127 kubelet[3324]: I0912 17:43:31.956115 3324 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26bd2ed8-7dc0-44ea-aeba-3dab86eb47f6-whisker-ca-bundle\") on node \"ip-172-31-17-147\" DevicePath \"\"" Sep 12 17:43:31.956127 kubelet[3324]: I0912 17:43:31.956129 3324 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/26bd2ed8-7dc0-44ea-aeba-3dab86eb47f6-whisker-backend-key-pair\") on node \"ip-172-31-17-147\" DevicePath \"\"" Sep 12 17:43:32.081741 containerd[2001]: time="2025-09-12T17:43:32.081577075Z" level=info msg="TaskExit event in podsandbox handler container_id:\"66cf151bffc1dd97b3e77ad82cb94f5f9cf1a0688d109a9c513558d80a13ff0f\" id:\"f79c1e64761c3b9848e8fc4975833d3a4bee6fa3b6abda9570f9bbbd60509e12\" pid:4676 exit_status:1 exited_at:{seconds:1757699012 nanos:81124824}" Sep 12 17:43:32.706193 systemd[1]: Removed slice kubepods-besteffort-pod26bd2ed8_7dc0_44ea_aeba_3dab86eb47f6.slice - libcontainer container kubepods-besteffort-pod26bd2ed8_7dc0_44ea_aeba_3dab86eb47f6.slice. Sep 12 17:43:32.858478 systemd[1]: Created slice kubepods-besteffort-pod6728af9a_b69b_4e4d_8a8e_d87c68b8725e.slice - libcontainer container kubepods-besteffort-pod6728af9a_b69b_4e4d_8a8e_d87c68b8725e.slice. Sep 12 17:43:32.859450 containerd[2001]: time="2025-09-12T17:43:32.859010642Z" level=info msg="TaskExit event in podsandbox handler container_id:\"66cf151bffc1dd97b3e77ad82cb94f5f9cf1a0688d109a9c513558d80a13ff0f\" id:\"c4c6a82177504fdfa0094d84132145d02bcfcd55f069f10f1dca54ccbcc6e00d\" pid:4712 exit_status:1 exited_at:{seconds:1757699012 nanos:856985164}" Sep 12 17:43:32.965683 kubelet[3324]: I0912 17:43:32.965318 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/6728af9a-b69b-4e4d-8a8e-d87c68b8725e-whisker-backend-key-pair\") pod \"whisker-6bc5b8687b-9v5r5\" (UID: \"6728af9a-b69b-4e4d-8a8e-d87c68b8725e\") " pod="calico-system/whisker-6bc5b8687b-9v5r5" Sep 12 17:43:32.965683 kubelet[3324]: I0912 17:43:32.965378 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6728af9a-b69b-4e4d-8a8e-d87c68b8725e-whisker-ca-bundle\") pod \"whisker-6bc5b8687b-9v5r5\" (UID: \"6728af9a-b69b-4e4d-8a8e-d87c68b8725e\") " pod="calico-system/whisker-6bc5b8687b-9v5r5" Sep 12 17:43:32.965683 kubelet[3324]: I0912 17:43:32.965399 3324 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzlg7\" (UniqueName: \"kubernetes.io/projected/6728af9a-b69b-4e4d-8a8e-d87c68b8725e-kube-api-access-fzlg7\") pod \"whisker-6bc5b8687b-9v5r5\" (UID: \"6728af9a-b69b-4e4d-8a8e-d87c68b8725e\") " pod="calico-system/whisker-6bc5b8687b-9v5r5" Sep 12 17:43:33.171872 containerd[2001]: time="2025-09-12T17:43:33.171753242Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:whisker-6bc5b8687b-9v5r5,Uid:6728af9a-b69b-4e4d-8a8e-d87c68b8725e,Namespace:calico-system,Attempt:0,}" Sep 12 17:43:33.388216 kubelet[3324]: I0912 17:43:33.388094 3324 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26bd2ed8-7dc0-44ea-aeba-3dab86eb47f6" path="/var/lib/kubelet/pods/26bd2ed8-7dc0-44ea-aeba-3dab86eb47f6/volumes" Sep 12 17:43:33.976977 systemd-networkd[1863]: cali8f4ca90f246: Link UP Sep 12 17:43:33.977987 systemd-networkd[1863]: cali8f4ca90f246: Gained carrier Sep 12 17:43:34.019327 containerd[2001]: 2025-09-12 17:43:33.228 [INFO][4755] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 17:43:34.019327 containerd[2001]: 2025-09-12 17:43:33.296 [INFO][4755] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--17--147-k8s-whisker--6bc5b8687b--9v5r5-eth0 whisker-6bc5b8687b- calico-system 6728af9a-b69b-4e4d-8a8e-d87c68b8725e 896 0 2025-09-12 17:43:32 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:6bc5b8687b projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ip-172-31-17-147 whisker-6bc5b8687b-9v5r5 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali8f4ca90f246 [] [] }} ContainerID="c879fcfdf1ac6ad2e3f5e32238d23678fc226c5bab6ecc3265be5257a47a08b7" Namespace="calico-system" Pod="whisker-6bc5b8687b-9v5r5" WorkloadEndpoint="ip--172--31--17--147-k8s-whisker--6bc5b8687b--9v5r5-" Sep 12 17:43:34.019327 containerd[2001]: 2025-09-12 17:43:33.296 [INFO][4755] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c879fcfdf1ac6ad2e3f5e32238d23678fc226c5bab6ecc3265be5257a47a08b7" Namespace="calico-system" Pod="whisker-6bc5b8687b-9v5r5" WorkloadEndpoint="ip--172--31--17--147-k8s-whisker--6bc5b8687b--9v5r5-eth0" Sep 12 17:43:34.019327 containerd[2001]: 2025-09-12 17:43:33.867 [INFO][4823] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c879fcfdf1ac6ad2e3f5e32238d23678fc226c5bab6ecc3265be5257a47a08b7" HandleID="k8s-pod-network.c879fcfdf1ac6ad2e3f5e32238d23678fc226c5bab6ecc3265be5257a47a08b7" Workload="ip--172--31--17--147-k8s-whisker--6bc5b8687b--9v5r5-eth0" Sep 12 17:43:34.019592 containerd[2001]: 2025-09-12 17:43:33.869 [INFO][4823] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c879fcfdf1ac6ad2e3f5e32238d23678fc226c5bab6ecc3265be5257a47a08b7" HandleID="k8s-pod-network.c879fcfdf1ac6ad2e3f5e32238d23678fc226c5bab6ecc3265be5257a47a08b7" Workload="ip--172--31--17--147-k8s-whisker--6bc5b8687b--9v5r5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f750), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-17-147", "pod":"whisker-6bc5b8687b-9v5r5", "timestamp":"2025-09-12 17:43:33.867294712 +0000 UTC"}, Hostname:"ip-172-31-17-147", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:43:34.019592 containerd[2001]: 2025-09-12 17:43:33.869 [INFO][4823] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:43:34.019592 containerd[2001]: 2025-09-12 17:43:33.871 [INFO][4823] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:43:34.019592 containerd[2001]: 2025-09-12 17:43:33.872 [INFO][4823] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-17-147' Sep 12 17:43:34.019592 containerd[2001]: 2025-09-12 17:43:33.895 [INFO][4823] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c879fcfdf1ac6ad2e3f5e32238d23678fc226c5bab6ecc3265be5257a47a08b7" host="ip-172-31-17-147" Sep 12 17:43:34.019592 containerd[2001]: 2025-09-12 17:43:33.926 [INFO][4823] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-17-147" Sep 12 17:43:34.019592 containerd[2001]: 2025-09-12 17:43:33.932 [INFO][4823] ipam/ipam.go 511: Trying affinity for 192.168.89.128/26 host="ip-172-31-17-147" Sep 12 17:43:34.019592 containerd[2001]: 2025-09-12 17:43:33.934 [INFO][4823] ipam/ipam.go 158: Attempting to load block cidr=192.168.89.128/26 host="ip-172-31-17-147" Sep 12 17:43:34.019592 containerd[2001]: 2025-09-12 17:43:33.937 [INFO][4823] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.89.128/26 host="ip-172-31-17-147" Sep 12 17:43:34.020288 containerd[2001]: 2025-09-12 17:43:33.937 [INFO][4823] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.89.128/26 handle="k8s-pod-network.c879fcfdf1ac6ad2e3f5e32238d23678fc226c5bab6ecc3265be5257a47a08b7" host="ip-172-31-17-147" Sep 12 17:43:34.020288 containerd[2001]: 2025-09-12 17:43:33.940 [INFO][4823] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c879fcfdf1ac6ad2e3f5e32238d23678fc226c5bab6ecc3265be5257a47a08b7 Sep 12 17:43:34.020288 containerd[2001]: 2025-09-12 17:43:33.949 [INFO][4823] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.89.128/26 handle="k8s-pod-network.c879fcfdf1ac6ad2e3f5e32238d23678fc226c5bab6ecc3265be5257a47a08b7" host="ip-172-31-17-147" Sep 12 17:43:34.020288 containerd[2001]: 2025-09-12 17:43:33.958 [INFO][4823] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.89.129/26] block=192.168.89.128/26 handle="k8s-pod-network.c879fcfdf1ac6ad2e3f5e32238d23678fc226c5bab6ecc3265be5257a47a08b7" host="ip-172-31-17-147" Sep 12 17:43:34.020288 containerd[2001]: 2025-09-12 17:43:33.958 [INFO][4823] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.89.129/26] handle="k8s-pod-network.c879fcfdf1ac6ad2e3f5e32238d23678fc226c5bab6ecc3265be5257a47a08b7" host="ip-172-31-17-147" Sep 12 17:43:34.020288 containerd[2001]: 2025-09-12 17:43:33.958 [INFO][4823] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 17:43:34.020288 containerd[2001]: 2025-09-12 17:43:33.958 [INFO][4823] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.89.129/26] IPv6=[] ContainerID="c879fcfdf1ac6ad2e3f5e32238d23678fc226c5bab6ecc3265be5257a47a08b7" HandleID="k8s-pod-network.c879fcfdf1ac6ad2e3f5e32238d23678fc226c5bab6ecc3265be5257a47a08b7" Workload="ip--172--31--17--147-k8s-whisker--6bc5b8687b--9v5r5-eth0" Sep 12 17:43:34.020812 containerd[2001]: 2025-09-12 17:43:33.961 [INFO][4755] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c879fcfdf1ac6ad2e3f5e32238d23678fc226c5bab6ecc3265be5257a47a08b7" Namespace="calico-system" Pod="whisker-6bc5b8687b-9v5r5" WorkloadEndpoint="ip--172--31--17--147-k8s-whisker--6bc5b8687b--9v5r5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--17--147-k8s-whisker--6bc5b8687b--9v5r5-eth0", GenerateName:"whisker-6bc5b8687b-", Namespace:"calico-system", SelfLink:"", UID:"6728af9a-b69b-4e4d-8a8e-d87c68b8725e", ResourceVersion:"896", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 43, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6bc5b8687b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-17-147", ContainerID:"", Pod:"whisker-6bc5b8687b-9v5r5", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.89.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali8f4ca90f246", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:43:34.020812 containerd[2001]: 2025-09-12 17:43:33.961 [INFO][4755] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.89.129/32] ContainerID="c879fcfdf1ac6ad2e3f5e32238d23678fc226c5bab6ecc3265be5257a47a08b7" Namespace="calico-system" Pod="whisker-6bc5b8687b-9v5r5" WorkloadEndpoint="ip--172--31--17--147-k8s-whisker--6bc5b8687b--9v5r5-eth0" Sep 12 17:43:34.020970 containerd[2001]: 2025-09-12 17:43:33.962 [INFO][4755] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8f4ca90f246 ContainerID="c879fcfdf1ac6ad2e3f5e32238d23678fc226c5bab6ecc3265be5257a47a08b7" Namespace="calico-system" Pod="whisker-6bc5b8687b-9v5r5" WorkloadEndpoint="ip--172--31--17--147-k8s-whisker--6bc5b8687b--9v5r5-eth0" Sep 12 17:43:34.020970 containerd[2001]: 2025-09-12 17:43:33.978 [INFO][4755] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c879fcfdf1ac6ad2e3f5e32238d23678fc226c5bab6ecc3265be5257a47a08b7" Namespace="calico-system" Pod="whisker-6bc5b8687b-9v5r5" WorkloadEndpoint="ip--172--31--17--147-k8s-whisker--6bc5b8687b--9v5r5-eth0" Sep 12 17:43:34.021059 containerd[2001]: 2025-09-12 17:43:33.980 [INFO][4755] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c879fcfdf1ac6ad2e3f5e32238d23678fc226c5bab6ecc3265be5257a47a08b7" Namespace="calico-system" Pod="whisker-6bc5b8687b-9v5r5" 
WorkloadEndpoint="ip--172--31--17--147-k8s-whisker--6bc5b8687b--9v5r5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--17--147-k8s-whisker--6bc5b8687b--9v5r5-eth0", GenerateName:"whisker-6bc5b8687b-", Namespace:"calico-system", SelfLink:"", UID:"6728af9a-b69b-4e4d-8a8e-d87c68b8725e", ResourceVersion:"896", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 43, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6bc5b8687b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-17-147", ContainerID:"c879fcfdf1ac6ad2e3f5e32238d23678fc226c5bab6ecc3265be5257a47a08b7", Pod:"whisker-6bc5b8687b-9v5r5", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.89.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali8f4ca90f246", MAC:"46:8b:61:9e:d8:53", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:43:34.021152 containerd[2001]: 2025-09-12 17:43:33.994 [INFO][4755] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c879fcfdf1ac6ad2e3f5e32238d23678fc226c5bab6ecc3265be5257a47a08b7" Namespace="calico-system" Pod="whisker-6bc5b8687b-9v5r5" WorkloadEndpoint="ip--172--31--17--147-k8s-whisker--6bc5b8687b--9v5r5-eth0" Sep 12 17:43:34.045362 (udev-worker)[4649]: Network interface NamePolicy= disabled on kernel command line. Sep 12 17:43:34.170219 containerd[2001]: time="2025-09-12T17:43:34.170152540Z" level=info msg="connecting to shim c879fcfdf1ac6ad2e3f5e32238d23678fc226c5bab6ecc3265be5257a47a08b7" address="unix:///run/containerd/s/df7e7e5934b77e06f24dedd9d5473aa00e3563e65529f5238047498a24a55a77" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:43:34.205051 systemd[1]: Started cri-containerd-c879fcfdf1ac6ad2e3f5e32238d23678fc226c5bab6ecc3265be5257a47a08b7.scope - libcontainer container c879fcfdf1ac6ad2e3f5e32238d23678fc226c5bab6ecc3265be5257a47a08b7. 
Sep 12 17:43:34.287942 containerd[2001]: time="2025-09-12T17:43:34.287845746Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6bc5b8687b-9v5r5,Uid:6728af9a-b69b-4e4d-8a8e-d87c68b8725e,Namespace:calico-system,Attempt:0,} returns sandbox id \"c879fcfdf1ac6ad2e3f5e32238d23678fc226c5bab6ecc3265be5257a47a08b7\"" Sep 12 17:43:34.290217 containerd[2001]: time="2025-09-12T17:43:34.290176093Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 12 17:43:35.249218 systemd-networkd[1863]: cali8f4ca90f246: Gained IPv6LL Sep 12 17:43:35.602552 containerd[2001]: time="2025-09-12T17:43:35.602436890Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:35.603672 containerd[2001]: time="2025-09-12T17:43:35.603508734Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 12 17:43:35.605071 containerd[2001]: time="2025-09-12T17:43:35.604792488Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:35.606638 containerd[2001]: time="2025-09-12T17:43:35.606580965Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:35.607585 containerd[2001]: time="2025-09-12T17:43:35.607392887Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 1.317030979s" Sep 12 17:43:35.607585 containerd[2001]: time="2025-09-12T17:43:35.607423662Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 12 17:43:35.613107 containerd[2001]: time="2025-09-12T17:43:35.612169715Z" level=info msg="CreateContainer within sandbox \"c879fcfdf1ac6ad2e3f5e32238d23678fc226c5bab6ecc3265be5257a47a08b7\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 12 17:43:35.620888 containerd[2001]: time="2025-09-12T17:43:35.620098891Z" level=info msg="Container f4579b847045e40fb0e5f718019329df650e441c976480ab770852ec9d940288: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:43:35.627858 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2713764066.mount: Deactivated successfully. 
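Two timing figures in this stretch of the log can be cross-checked using nothing but timestamps that appear in it: the whisker pull reported as "in 1.317030979s" sits a fraction of a millisecond under the gap between its PullImage and Pulled lines (containerd times the pull internally, the log lines land slightly later), and the calico-node pod_startup_latency_tracker entry at 17:43:31 is self-consistent, with podStartE2EDuration equal to watchObservedRunningTime minus podCreationTimestamp and podStartSLOduration within a nanosecond of that E2E figure minus the image-pull window. A stdlib Go sketch of the arithmetic, using only values copied from the log:

package main

import (
	"fmt"
	"time"
)

func mustParse(layout, v string) time.Time {
	t, err := time.Parse(layout, v)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	// 1. whisker pull: gap between the PullImage and Pulled log lines.
	pullStart := mustParse(time.RFC3339Nano, "2025-09-12T17:43:34.290176093Z")
	pullDone := mustParse(time.RFC3339Nano, "2025-09-12T17:43:35.607392887Z")
	fmt.Println("whisker pull, from log timestamps:", pullDone.Sub(pullStart)) // ≈1.317216794s vs logged 1.317030979s

	// 2. calico-node startup latency, values from the 17:43:31 tracker entry.
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	created := mustParse(layout, "2025-09-12 17:43:11 +0000 UTC")
	firstPull := mustParse(layout, "2025-09-12 17:43:13.862125784 +0000 UTC")
	lastPull := mustParse(layout, "2025-09-12 17:43:30.68391427 +0000 UTC")
	observed := mustParse(layout, "2025-09-12 17:43:31.785058512 +0000 UTC")

	e2e := observed.Sub(created)
	slo := e2e - lastPull.Sub(firstPull)
	fmt.Println("podStartE2EDuration:", e2e) // 20.785058512s, matching the log
	fmt.Println("podStartSLOduration:", slo) // 3.963270026s vs logged 3.963270027s
}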
Sep 12 17:43:35.639909 containerd[2001]: time="2025-09-12T17:43:35.639864920Z" level=info msg="CreateContainer within sandbox \"c879fcfdf1ac6ad2e3f5e32238d23678fc226c5bab6ecc3265be5257a47a08b7\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"f4579b847045e40fb0e5f718019329df650e441c976480ab770852ec9d940288\"" Sep 12 17:43:35.640672 containerd[2001]: time="2025-09-12T17:43:35.640638228Z" level=info msg="StartContainer for \"f4579b847045e40fb0e5f718019329df650e441c976480ab770852ec9d940288\"" Sep 12 17:43:35.641704 containerd[2001]: time="2025-09-12T17:43:35.641668908Z" level=info msg="connecting to shim f4579b847045e40fb0e5f718019329df650e441c976480ab770852ec9d940288" address="unix:///run/containerd/s/df7e7e5934b77e06f24dedd9d5473aa00e3563e65529f5238047498a24a55a77" protocol=ttrpc version=3 Sep 12 17:43:35.674089 systemd[1]: Started cri-containerd-f4579b847045e40fb0e5f718019329df650e441c976480ab770852ec9d940288.scope - libcontainer container f4579b847045e40fb0e5f718019329df650e441c976480ab770852ec9d940288. Sep 12 17:43:35.748789 containerd[2001]: time="2025-09-12T17:43:35.748706223Z" level=info msg="StartContainer for \"f4579b847045e40fb0e5f718019329df650e441c976480ab770852ec9d940288\" returns successfully" Sep 12 17:43:35.752570 containerd[2001]: time="2025-09-12T17:43:35.752417537Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 12 17:43:37.373437 containerd[2001]: time="2025-09-12T17:43:37.373216568Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-267zp,Uid:d4e868f5-c85d-42d9-9fb2-924f44d65af8,Namespace:calico-system,Attempt:0,}" Sep 12 17:43:37.373437 containerd[2001]: time="2025-09-12T17:43:37.373297147Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-9hfk5,Uid:8d009f85-f699-4fc5-8e74-03c3edfda177,Namespace:calico-system,Attempt:0,}" Sep 12 17:43:37.577903 systemd-networkd[1863]: calid0d4a7e7c93: Link UP Sep 12 17:43:37.579482 systemd-networkd[1863]: calid0d4a7e7c93: Gained carrier Sep 12 17:43:37.583088 (udev-worker)[5037]: Network interface NamePolicy= disabled on kernel command line. 
Sep 12 17:43:37.630734 containerd[2001]: 2025-09-12 17:43:37.436 [INFO][5004] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 17:43:37.630734 containerd[2001]: 2025-09-12 17:43:37.454 [INFO][5004] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--17--147-k8s-goldmane--7988f88666--9hfk5-eth0 goldmane-7988f88666- calico-system 8d009f85-f699-4fc5-8e74-03c3edfda177 823 0 2025-09-12 17:43:11 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7988f88666 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ip-172-31-17-147 goldmane-7988f88666-9hfk5 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calid0d4a7e7c93 [] [] }} ContainerID="a06ac280d67e0a5d81c4d247c482298db680cc5649cae68c5e7d93dab0ff4d47" Namespace="calico-system" Pod="goldmane-7988f88666-9hfk5" WorkloadEndpoint="ip--172--31--17--147-k8s-goldmane--7988f88666--9hfk5-" Sep 12 17:43:37.630734 containerd[2001]: 2025-09-12 17:43:37.454 [INFO][5004] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a06ac280d67e0a5d81c4d247c482298db680cc5649cae68c5e7d93dab0ff4d47" Namespace="calico-system" Pod="goldmane-7988f88666-9hfk5" WorkloadEndpoint="ip--172--31--17--147-k8s-goldmane--7988f88666--9hfk5-eth0" Sep 12 17:43:37.630734 containerd[2001]: 2025-09-12 17:43:37.500 [INFO][5023] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a06ac280d67e0a5d81c4d247c482298db680cc5649cae68c5e7d93dab0ff4d47" HandleID="k8s-pod-network.a06ac280d67e0a5d81c4d247c482298db680cc5649cae68c5e7d93dab0ff4d47" Workload="ip--172--31--17--147-k8s-goldmane--7988f88666--9hfk5-eth0" Sep 12 17:43:37.631533 containerd[2001]: 2025-09-12 17:43:37.500 [INFO][5023] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a06ac280d67e0a5d81c4d247c482298db680cc5649cae68c5e7d93dab0ff4d47" HandleID="k8s-pod-network.a06ac280d67e0a5d81c4d247c482298db680cc5649cae68c5e7d93dab0ff4d47" Workload="ip--172--31--17--147-k8s-goldmane--7988f88666--9hfk5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f8d0), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-17-147", "pod":"goldmane-7988f88666-9hfk5", "timestamp":"2025-09-12 17:43:37.500115717 +0000 UTC"}, Hostname:"ip-172-31-17-147", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:43:37.631533 containerd[2001]: 2025-09-12 17:43:37.500 [INFO][5023] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:43:37.631533 containerd[2001]: 2025-09-12 17:43:37.500 [INFO][5023] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:43:37.631533 containerd[2001]: 2025-09-12 17:43:37.500 [INFO][5023] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-17-147' Sep 12 17:43:37.631533 containerd[2001]: 2025-09-12 17:43:37.512 [INFO][5023] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a06ac280d67e0a5d81c4d247c482298db680cc5649cae68c5e7d93dab0ff4d47" host="ip-172-31-17-147" Sep 12 17:43:37.631533 containerd[2001]: 2025-09-12 17:43:37.517 [INFO][5023] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-17-147" Sep 12 17:43:37.631533 containerd[2001]: 2025-09-12 17:43:37.524 [INFO][5023] ipam/ipam.go 511: Trying affinity for 192.168.89.128/26 host="ip-172-31-17-147" Sep 12 17:43:37.631533 containerd[2001]: 2025-09-12 17:43:37.526 [INFO][5023] ipam/ipam.go 158: Attempting to load block cidr=192.168.89.128/26 host="ip-172-31-17-147" Sep 12 17:43:37.631533 containerd[2001]: 2025-09-12 17:43:37.529 [INFO][5023] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.89.128/26 host="ip-172-31-17-147" Sep 12 17:43:37.632178 containerd[2001]: 2025-09-12 17:43:37.529 [INFO][5023] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.89.128/26 handle="k8s-pod-network.a06ac280d67e0a5d81c4d247c482298db680cc5649cae68c5e7d93dab0ff4d47" host="ip-172-31-17-147" Sep 12 17:43:37.632178 containerd[2001]: 2025-09-12 17:43:37.531 [INFO][5023] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a06ac280d67e0a5d81c4d247c482298db680cc5649cae68c5e7d93dab0ff4d47 Sep 12 17:43:37.632178 containerd[2001]: 2025-09-12 17:43:37.540 [INFO][5023] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.89.128/26 handle="k8s-pod-network.a06ac280d67e0a5d81c4d247c482298db680cc5649cae68c5e7d93dab0ff4d47" host="ip-172-31-17-147" Sep 12 17:43:37.632178 containerd[2001]: 2025-09-12 17:43:37.551 [INFO][5023] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.89.130/26] block=192.168.89.128/26 handle="k8s-pod-network.a06ac280d67e0a5d81c4d247c482298db680cc5649cae68c5e7d93dab0ff4d47" host="ip-172-31-17-147" Sep 12 17:43:37.632178 containerd[2001]: 2025-09-12 17:43:37.551 [INFO][5023] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.89.130/26] handle="k8s-pod-network.a06ac280d67e0a5d81c4d247c482298db680cc5649cae68c5e7d93dab0ff4d47" host="ip-172-31-17-147" Sep 12 17:43:37.632178 containerd[2001]: 2025-09-12 17:43:37.551 [INFO][5023] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
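The [INFO][5023] ipam entries trace a single address assignment end to end: acquire the host-wide IPAM lock, look up the host's block affinities, confirm the affinity for 192.168.89.128/26, load that block, claim one address (192.168.89.130/26), write the block back under a new handle, and release the lock. The sketch below mirrors those steps with invented types purely as a reading aid; it is not Calico's ipam package.

```go
// Hypothetical, self-contained sketch of the IPAM sequence logged above
// (lock -> affinity/block lookup -> claim one IP -> write block -> unlock).
// None of these types come from Calico; they only mirror the logged steps.
package main

import (
	"fmt"
	"sync"
)

type block struct {
	cidr string          // e.g. "192.168.89.128/26"
	used map[string]bool // addresses already claimed out of this block
}

var (
	hostLock sync.Mutex // stands in for the "host-wide IPAM lock" in the log
	// Pretend .129 is already taken, so the next claim comes out as .130,
	// matching the goldmane assignment above.
	affine = block{cidr: "192.168.89.128/26", used: map[string]bool{"192.168.89.129": true}}
)

func autoAssignOne(host string) (string, error) {
	hostLock.Lock()         // "About to acquire host-wide IPAM lock."
	defer hostLock.Unlock() // "Released host-wide IPAM lock."

	// "Trying affinity for 192.168.89.128/26" / "block has been loaded":
	// here the affine block is just a package variable.
	for last := 129; last <= 190; last++ {
		ip := fmt.Sprintf("192.168.89.%d", last)
		if !affine.used[ip] {
			affine.used[ip] = true // "Writing block in order to claim IPs"
			return ip, nil         // "Successfully claimed IPs: [...]"
		}
	}
	return "", fmt.Errorf("block %s exhausted on host %s", affine.cidr, host)
}

func main() {
	ip, err := autoAssignOne("ip-172-31-17-147")
	if err != nil {
		panic(err)
	}
	fmt.Println("claimed", ip) // claimed 192.168.89.130
}
```

The same sequence repeats below for the csi-node-driver, calico-apiserver, calico-kube-controllers and coredns pods, each claiming the next free address from the same block.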
Sep 12 17:43:37.632178 containerd[2001]: 2025-09-12 17:43:37.551 [INFO][5023] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.89.130/26] IPv6=[] ContainerID="a06ac280d67e0a5d81c4d247c482298db680cc5649cae68c5e7d93dab0ff4d47" HandleID="k8s-pod-network.a06ac280d67e0a5d81c4d247c482298db680cc5649cae68c5e7d93dab0ff4d47" Workload="ip--172--31--17--147-k8s-goldmane--7988f88666--9hfk5-eth0" Sep 12 17:43:37.633729 containerd[2001]: 2025-09-12 17:43:37.570 [INFO][5004] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a06ac280d67e0a5d81c4d247c482298db680cc5649cae68c5e7d93dab0ff4d47" Namespace="calico-system" Pod="goldmane-7988f88666-9hfk5" WorkloadEndpoint="ip--172--31--17--147-k8s-goldmane--7988f88666--9hfk5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--17--147-k8s-goldmane--7988f88666--9hfk5-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"8d009f85-f699-4fc5-8e74-03c3edfda177", ResourceVersion:"823", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 43, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-17-147", ContainerID:"", Pod:"goldmane-7988f88666-9hfk5", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.89.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calid0d4a7e7c93", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:43:37.633729 containerd[2001]: 2025-09-12 17:43:37.570 [INFO][5004] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.89.130/32] ContainerID="a06ac280d67e0a5d81c4d247c482298db680cc5649cae68c5e7d93dab0ff4d47" Namespace="calico-system" Pod="goldmane-7988f88666-9hfk5" WorkloadEndpoint="ip--172--31--17--147-k8s-goldmane--7988f88666--9hfk5-eth0" Sep 12 17:43:37.634142 containerd[2001]: 2025-09-12 17:43:37.570 [INFO][5004] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid0d4a7e7c93 ContainerID="a06ac280d67e0a5d81c4d247c482298db680cc5649cae68c5e7d93dab0ff4d47" Namespace="calico-system" Pod="goldmane-7988f88666-9hfk5" WorkloadEndpoint="ip--172--31--17--147-k8s-goldmane--7988f88666--9hfk5-eth0" Sep 12 17:43:37.634142 containerd[2001]: 2025-09-12 17:43:37.585 [INFO][5004] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a06ac280d67e0a5d81c4d247c482298db680cc5649cae68c5e7d93dab0ff4d47" Namespace="calico-system" Pod="goldmane-7988f88666-9hfk5" WorkloadEndpoint="ip--172--31--17--147-k8s-goldmane--7988f88666--9hfk5-eth0" Sep 12 17:43:37.634468 containerd[2001]: 2025-09-12 17:43:37.586 [INFO][5004] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a06ac280d67e0a5d81c4d247c482298db680cc5649cae68c5e7d93dab0ff4d47" Namespace="calico-system" Pod="goldmane-7988f88666-9hfk5" 
WorkloadEndpoint="ip--172--31--17--147-k8s-goldmane--7988f88666--9hfk5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--17--147-k8s-goldmane--7988f88666--9hfk5-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"8d009f85-f699-4fc5-8e74-03c3edfda177", ResourceVersion:"823", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 43, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-17-147", ContainerID:"a06ac280d67e0a5d81c4d247c482298db680cc5649cae68c5e7d93dab0ff4d47", Pod:"goldmane-7988f88666-9hfk5", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.89.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calid0d4a7e7c93", MAC:"d6:68:83:1f:7d:ec", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:43:37.634746 containerd[2001]: 2025-09-12 17:43:37.616 [INFO][5004] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a06ac280d67e0a5d81c4d247c482298db680cc5649cae68c5e7d93dab0ff4d47" Namespace="calico-system" Pod="goldmane-7988f88666-9hfk5" WorkloadEndpoint="ip--172--31--17--147-k8s-goldmane--7988f88666--9hfk5-eth0" Sep 12 17:43:37.693141 containerd[2001]: time="2025-09-12T17:43:37.693093487Z" level=info msg="connecting to shim a06ac280d67e0a5d81c4d247c482298db680cc5649cae68c5e7d93dab0ff4d47" address="unix:///run/containerd/s/dbe10de209a27fd5ed8fbb8a04e29353e325ad6b59badf4295d2f934ebd86096" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:43:37.718594 (udev-worker)[5039]: Network interface NamePolicy= disabled on kernel command line. 
Sep 12 17:43:37.722471 systemd-networkd[1863]: cali123a990e1b6: Link UP Sep 12 17:43:37.728130 systemd-networkd[1863]: cali123a990e1b6: Gained carrier Sep 12 17:43:37.759621 containerd[2001]: 2025-09-12 17:43:37.419 [INFO][4993] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 17:43:37.759621 containerd[2001]: 2025-09-12 17:43:37.446 [INFO][4993] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--17--147-k8s-csi--node--driver--267zp-eth0 csi-node-driver- calico-system d4e868f5-c85d-42d9-9fb2-924f44d65af8 707 0 2025-09-12 17:43:12 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:856c6b598f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ip-172-31-17-147 csi-node-driver-267zp eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali123a990e1b6 [] [] }} ContainerID="692b9424732cf2d21aecfbcb8fe6aaf81f23b957c87f091155e86baf5d7514f8" Namespace="calico-system" Pod="csi-node-driver-267zp" WorkloadEndpoint="ip--172--31--17--147-k8s-csi--node--driver--267zp-" Sep 12 17:43:37.759621 containerd[2001]: 2025-09-12 17:43:37.446 [INFO][4993] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="692b9424732cf2d21aecfbcb8fe6aaf81f23b957c87f091155e86baf5d7514f8" Namespace="calico-system" Pod="csi-node-driver-267zp" WorkloadEndpoint="ip--172--31--17--147-k8s-csi--node--driver--267zp-eth0" Sep 12 17:43:37.759621 containerd[2001]: 2025-09-12 17:43:37.500 [INFO][5018] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="692b9424732cf2d21aecfbcb8fe6aaf81f23b957c87f091155e86baf5d7514f8" HandleID="k8s-pod-network.692b9424732cf2d21aecfbcb8fe6aaf81f23b957c87f091155e86baf5d7514f8" Workload="ip--172--31--17--147-k8s-csi--node--driver--267zp-eth0" Sep 12 17:43:37.760157 containerd[2001]: 2025-09-12 17:43:37.500 [INFO][5018] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="692b9424732cf2d21aecfbcb8fe6aaf81f23b957c87f091155e86baf5d7514f8" HandleID="k8s-pod-network.692b9424732cf2d21aecfbcb8fe6aaf81f23b957c87f091155e86baf5d7514f8" Workload="ip--172--31--17--147-k8s-csi--node--driver--267zp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5600), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-17-147", "pod":"csi-node-driver-267zp", "timestamp":"2025-09-12 17:43:37.499250096 +0000 UTC"}, Hostname:"ip-172-31-17-147", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:43:37.760157 containerd[2001]: 2025-09-12 17:43:37.500 [INFO][5018] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:43:37.760157 containerd[2001]: 2025-09-12 17:43:37.551 [INFO][5018] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:43:37.760157 containerd[2001]: 2025-09-12 17:43:37.552 [INFO][5018] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-17-147' Sep 12 17:43:37.760157 containerd[2001]: 2025-09-12 17:43:37.622 [INFO][5018] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.692b9424732cf2d21aecfbcb8fe6aaf81f23b957c87f091155e86baf5d7514f8" host="ip-172-31-17-147" Sep 12 17:43:37.760157 containerd[2001]: 2025-09-12 17:43:37.640 [INFO][5018] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-17-147" Sep 12 17:43:37.760157 containerd[2001]: 2025-09-12 17:43:37.654 [INFO][5018] ipam/ipam.go 511: Trying affinity for 192.168.89.128/26 host="ip-172-31-17-147" Sep 12 17:43:37.760157 containerd[2001]: 2025-09-12 17:43:37.664 [INFO][5018] ipam/ipam.go 158: Attempting to load block cidr=192.168.89.128/26 host="ip-172-31-17-147" Sep 12 17:43:37.760157 containerd[2001]: 2025-09-12 17:43:37.676 [INFO][5018] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.89.128/26 host="ip-172-31-17-147" Sep 12 17:43:37.760659 containerd[2001]: 2025-09-12 17:43:37.676 [INFO][5018] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.89.128/26 handle="k8s-pod-network.692b9424732cf2d21aecfbcb8fe6aaf81f23b957c87f091155e86baf5d7514f8" host="ip-172-31-17-147" Sep 12 17:43:37.760659 containerd[2001]: 2025-09-12 17:43:37.680 [INFO][5018] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.692b9424732cf2d21aecfbcb8fe6aaf81f23b957c87f091155e86baf5d7514f8 Sep 12 17:43:37.760659 containerd[2001]: 2025-09-12 17:43:37.690 [INFO][5018] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.89.128/26 handle="k8s-pod-network.692b9424732cf2d21aecfbcb8fe6aaf81f23b957c87f091155e86baf5d7514f8" host="ip-172-31-17-147" Sep 12 17:43:37.760659 containerd[2001]: 2025-09-12 17:43:37.705 [INFO][5018] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.89.131/26] block=192.168.89.128/26 handle="k8s-pod-network.692b9424732cf2d21aecfbcb8fe6aaf81f23b957c87f091155e86baf5d7514f8" host="ip-172-31-17-147" Sep 12 17:43:37.760659 containerd[2001]: 2025-09-12 17:43:37.705 [INFO][5018] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.89.131/26] handle="k8s-pod-network.692b9424732cf2d21aecfbcb8fe6aaf81f23b957c87f091155e86baf5d7514f8" host="ip-172-31-17-147" Sep 12 17:43:37.760659 containerd[2001]: 2025-09-12 17:43:37.705 [INFO][5018] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 17:43:37.760659 containerd[2001]: 2025-09-12 17:43:37.705 [INFO][5018] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.89.131/26] IPv6=[] ContainerID="692b9424732cf2d21aecfbcb8fe6aaf81f23b957c87f091155e86baf5d7514f8" HandleID="k8s-pod-network.692b9424732cf2d21aecfbcb8fe6aaf81f23b957c87f091155e86baf5d7514f8" Workload="ip--172--31--17--147-k8s-csi--node--driver--267zp-eth0" Sep 12 17:43:37.762006 containerd[2001]: 2025-09-12 17:43:37.713 [INFO][4993] cni-plugin/k8s.go 418: Populated endpoint ContainerID="692b9424732cf2d21aecfbcb8fe6aaf81f23b957c87f091155e86baf5d7514f8" Namespace="calico-system" Pod="csi-node-driver-267zp" WorkloadEndpoint="ip--172--31--17--147-k8s-csi--node--driver--267zp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--17--147-k8s-csi--node--driver--267zp-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"d4e868f5-c85d-42d9-9fb2-924f44d65af8", ResourceVersion:"707", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 43, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-17-147", ContainerID:"", Pod:"csi-node-driver-267zp", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.89.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali123a990e1b6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:43:37.762409 containerd[2001]: 2025-09-12 17:43:37.714 [INFO][4993] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.89.131/32] ContainerID="692b9424732cf2d21aecfbcb8fe6aaf81f23b957c87f091155e86baf5d7514f8" Namespace="calico-system" Pod="csi-node-driver-267zp" WorkloadEndpoint="ip--172--31--17--147-k8s-csi--node--driver--267zp-eth0" Sep 12 17:43:37.762409 containerd[2001]: 2025-09-12 17:43:37.714 [INFO][4993] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali123a990e1b6 ContainerID="692b9424732cf2d21aecfbcb8fe6aaf81f23b957c87f091155e86baf5d7514f8" Namespace="calico-system" Pod="csi-node-driver-267zp" WorkloadEndpoint="ip--172--31--17--147-k8s-csi--node--driver--267zp-eth0" Sep 12 17:43:37.762409 containerd[2001]: 2025-09-12 17:43:37.723 [INFO][4993] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="692b9424732cf2d21aecfbcb8fe6aaf81f23b957c87f091155e86baf5d7514f8" Namespace="calico-system" Pod="csi-node-driver-267zp" WorkloadEndpoint="ip--172--31--17--147-k8s-csi--node--driver--267zp-eth0" Sep 12 17:43:37.762550 containerd[2001]: 2025-09-12 17:43:37.725 [INFO][4993] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="692b9424732cf2d21aecfbcb8fe6aaf81f23b957c87f091155e86baf5d7514f8" 
Namespace="calico-system" Pod="csi-node-driver-267zp" WorkloadEndpoint="ip--172--31--17--147-k8s-csi--node--driver--267zp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--17--147-k8s-csi--node--driver--267zp-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"d4e868f5-c85d-42d9-9fb2-924f44d65af8", ResourceVersion:"707", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 43, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-17-147", ContainerID:"692b9424732cf2d21aecfbcb8fe6aaf81f23b957c87f091155e86baf5d7514f8", Pod:"csi-node-driver-267zp", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.89.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali123a990e1b6", MAC:"6a:f6:36:ae:5e:15", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:43:37.762656 containerd[2001]: 2025-09-12 17:43:37.751 [INFO][4993] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="692b9424732cf2d21aecfbcb8fe6aaf81f23b957c87f091155e86baf5d7514f8" Namespace="calico-system" Pod="csi-node-driver-267zp" WorkloadEndpoint="ip--172--31--17--147-k8s-csi--node--driver--267zp-eth0" Sep 12 17:43:37.791185 systemd[1]: Started cri-containerd-a06ac280d67e0a5d81c4d247c482298db680cc5649cae68c5e7d93dab0ff4d47.scope - libcontainer container a06ac280d67e0a5d81c4d247c482298db680cc5649cae68c5e7d93dab0ff4d47. Sep 12 17:43:37.828817 containerd[2001]: time="2025-09-12T17:43:37.828745319Z" level=info msg="connecting to shim 692b9424732cf2d21aecfbcb8fe6aaf81f23b957c87f091155e86baf5d7514f8" address="unix:///run/containerd/s/32a496da959e83257bbf3512fa9e82745c6984d6212d4ba9b42596d420b888da" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:43:37.869188 systemd[1]: Started cri-containerd-692b9424732cf2d21aecfbcb8fe6aaf81f23b957c87f091155e86baf5d7514f8.scope - libcontainer container 692b9424732cf2d21aecfbcb8fe6aaf81f23b957c87f091155e86baf5d7514f8. 
Sep 12 17:43:37.953484 containerd[2001]: time="2025-09-12T17:43:37.953417276Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-9hfk5,Uid:8d009f85-f699-4fc5-8e74-03c3edfda177,Namespace:calico-system,Attempt:0,} returns sandbox id \"a06ac280d67e0a5d81c4d247c482298db680cc5649cae68c5e7d93dab0ff4d47\"" Sep 12 17:43:37.968611 containerd[2001]: time="2025-09-12T17:43:37.968444886Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-267zp,Uid:d4e868f5-c85d-42d9-9fb2-924f44d65af8,Namespace:calico-system,Attempt:0,} returns sandbox id \"692b9424732cf2d21aecfbcb8fe6aaf81f23b957c87f091155e86baf5d7514f8\"" Sep 12 17:43:38.316518 kubelet[3324]: I0912 17:43:38.316402 3324 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 17:43:38.368381 containerd[2001]: time="2025-09-12T17:43:38.368324589Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d55cd4957-jrtsh,Uid:51120cf8-ca12-4fc1-807c-f699b7fc3a7c,Namespace:calico-apiserver,Attempt:0,}" Sep 12 17:43:38.799299 systemd-networkd[1863]: calif1d87bdfda1: Link UP Sep 12 17:43:38.799597 systemd-networkd[1863]: calif1d87bdfda1: Gained carrier Sep 12 17:43:38.836869 containerd[2001]: 2025-09-12 17:43:38.506 [INFO][5158] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 17:43:38.836869 containerd[2001]: 2025-09-12 17:43:38.536 [INFO][5158] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--17--147-k8s-calico--apiserver--7d55cd4957--jrtsh-eth0 calico-apiserver-7d55cd4957- calico-apiserver 51120cf8-ca12-4fc1-807c-f699b7fc3a7c 828 0 2025-09-12 17:43:07 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7d55cd4957 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ip-172-31-17-147 calico-apiserver-7d55cd4957-jrtsh eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calif1d87bdfda1 [] [] }} ContainerID="6e037df9d6e5939375bc5ff7260e41dba045ec436089247796695e16058f4581" Namespace="calico-apiserver" Pod="calico-apiserver-7d55cd4957-jrtsh" WorkloadEndpoint="ip--172--31--17--147-k8s-calico--apiserver--7d55cd4957--jrtsh-" Sep 12 17:43:38.836869 containerd[2001]: 2025-09-12 17:43:38.538 [INFO][5158] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6e037df9d6e5939375bc5ff7260e41dba045ec436089247796695e16058f4581" Namespace="calico-apiserver" Pod="calico-apiserver-7d55cd4957-jrtsh" WorkloadEndpoint="ip--172--31--17--147-k8s-calico--apiserver--7d55cd4957--jrtsh-eth0" Sep 12 17:43:38.836869 containerd[2001]: 2025-09-12 17:43:38.671 [INFO][5172] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6e037df9d6e5939375bc5ff7260e41dba045ec436089247796695e16058f4581" HandleID="k8s-pod-network.6e037df9d6e5939375bc5ff7260e41dba045ec436089247796695e16058f4581" Workload="ip--172--31--17--147-k8s-calico--apiserver--7d55cd4957--jrtsh-eth0" Sep 12 17:43:38.838100 containerd[2001]: 2025-09-12 17:43:38.673 [INFO][5172] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="6e037df9d6e5939375bc5ff7260e41dba045ec436089247796695e16058f4581" HandleID="k8s-pod-network.6e037df9d6e5939375bc5ff7260e41dba045ec436089247796695e16058f4581" Workload="ip--172--31--17--147-k8s-calico--apiserver--7d55cd4957--jrtsh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, 
HandleID:(*string)(0xc00004e8e0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ip-172-31-17-147", "pod":"calico-apiserver-7d55cd4957-jrtsh", "timestamp":"2025-09-12 17:43:38.671731187 +0000 UTC"}, Hostname:"ip-172-31-17-147", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:43:38.838100 containerd[2001]: 2025-09-12 17:43:38.674 [INFO][5172] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:43:38.838100 containerd[2001]: 2025-09-12 17:43:38.674 [INFO][5172] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:43:38.838100 containerd[2001]: 2025-09-12 17:43:38.674 [INFO][5172] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-17-147' Sep 12 17:43:38.838100 containerd[2001]: 2025-09-12 17:43:38.696 [INFO][5172] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6e037df9d6e5939375bc5ff7260e41dba045ec436089247796695e16058f4581" host="ip-172-31-17-147" Sep 12 17:43:38.838100 containerd[2001]: 2025-09-12 17:43:38.707 [INFO][5172] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-17-147" Sep 12 17:43:38.838100 containerd[2001]: 2025-09-12 17:43:38.721 [INFO][5172] ipam/ipam.go 511: Trying affinity for 192.168.89.128/26 host="ip-172-31-17-147" Sep 12 17:43:38.838100 containerd[2001]: 2025-09-12 17:43:38.727 [INFO][5172] ipam/ipam.go 158: Attempting to load block cidr=192.168.89.128/26 host="ip-172-31-17-147" Sep 12 17:43:38.838100 containerd[2001]: 2025-09-12 17:43:38.740 [INFO][5172] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.89.128/26 host="ip-172-31-17-147" Sep 12 17:43:38.839002 containerd[2001]: 2025-09-12 17:43:38.740 [INFO][5172] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.89.128/26 handle="k8s-pod-network.6e037df9d6e5939375bc5ff7260e41dba045ec436089247796695e16058f4581" host="ip-172-31-17-147" Sep 12 17:43:38.839002 containerd[2001]: 2025-09-12 17:43:38.744 [INFO][5172] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.6e037df9d6e5939375bc5ff7260e41dba045ec436089247796695e16058f4581 Sep 12 17:43:38.839002 containerd[2001]: 2025-09-12 17:43:38.761 [INFO][5172] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.89.128/26 handle="k8s-pod-network.6e037df9d6e5939375bc5ff7260e41dba045ec436089247796695e16058f4581" host="ip-172-31-17-147" Sep 12 17:43:38.839002 containerd[2001]: 2025-09-12 17:43:38.780 [INFO][5172] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.89.132/26] block=192.168.89.128/26 handle="k8s-pod-network.6e037df9d6e5939375bc5ff7260e41dba045ec436089247796695e16058f4581" host="ip-172-31-17-147" Sep 12 17:43:38.839002 containerd[2001]: 2025-09-12 17:43:38.780 [INFO][5172] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.89.132/26] handle="k8s-pod-network.6e037df9d6e5939375bc5ff7260e41dba045ec436089247796695e16058f4581" host="ip-172-31-17-147" Sep 12 17:43:38.839002 containerd[2001]: 2025-09-12 17:43:38.780 [INFO][5172] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
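Every assignment in this section comes out of the same affine block, 192.168.89.128/26, and the claimed addresses are sequential within it (.130, .131, .132 so far, with .133 and .134 following below). A /26 holds 2^(32-26) = 64 addresses, so the block spans 192.168.89.128 through 192.168.89.191; the short check below confirms that with the Go standard library.

```go
// Quick check of the block used throughout these entries: 192.168.89.128/26
// spans 64 addresses, 192.168.89.128 through 192.168.89.191.
package main

import (
	"fmt"
	"net/netip"
)

func main() {
	p := netip.MustParsePrefix("192.168.89.128/26")
	size := 1 << (32 - p.Bits()) // 2^(32-26) = 64 addresses in the block

	// Walk from the first address to the last one still inside the prefix.
	last := p.Addr()
	for a := p.Addr(); p.Contains(a); a = a.Next() {
		last = a
	}
	fmt.Printf("%s: %d addresses, %s - %s\n", p, size, p.Addr(), last)
	// 192.168.89.128/26: 64 addresses, 192.168.89.128 - 192.168.89.191

	for _, s := range []string{"192.168.89.130", "192.168.89.134", "192.168.89.192"} {
		fmt.Println(s, "in block:", p.Contains(netip.MustParseAddr(s)))
	}
}
```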
Sep 12 17:43:38.839002 containerd[2001]: 2025-09-12 17:43:38.780 [INFO][5172] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.89.132/26] IPv6=[] ContainerID="6e037df9d6e5939375bc5ff7260e41dba045ec436089247796695e16058f4581" HandleID="k8s-pod-network.6e037df9d6e5939375bc5ff7260e41dba045ec436089247796695e16058f4581" Workload="ip--172--31--17--147-k8s-calico--apiserver--7d55cd4957--jrtsh-eth0" Sep 12 17:43:38.839282 containerd[2001]: 2025-09-12 17:43:38.792 [INFO][5158] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6e037df9d6e5939375bc5ff7260e41dba045ec436089247796695e16058f4581" Namespace="calico-apiserver" Pod="calico-apiserver-7d55cd4957-jrtsh" WorkloadEndpoint="ip--172--31--17--147-k8s-calico--apiserver--7d55cd4957--jrtsh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--17--147-k8s-calico--apiserver--7d55cd4957--jrtsh-eth0", GenerateName:"calico-apiserver-7d55cd4957-", Namespace:"calico-apiserver", SelfLink:"", UID:"51120cf8-ca12-4fc1-807c-f699b7fc3a7c", ResourceVersion:"828", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 43, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7d55cd4957", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-17-147", ContainerID:"", Pod:"calico-apiserver-7d55cd4957-jrtsh", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.89.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif1d87bdfda1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:43:38.839394 containerd[2001]: 2025-09-12 17:43:38.793 [INFO][5158] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.89.132/32] ContainerID="6e037df9d6e5939375bc5ff7260e41dba045ec436089247796695e16058f4581" Namespace="calico-apiserver" Pod="calico-apiserver-7d55cd4957-jrtsh" WorkloadEndpoint="ip--172--31--17--147-k8s-calico--apiserver--7d55cd4957--jrtsh-eth0" Sep 12 17:43:38.839394 containerd[2001]: 2025-09-12 17:43:38.793 [INFO][5158] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif1d87bdfda1 ContainerID="6e037df9d6e5939375bc5ff7260e41dba045ec436089247796695e16058f4581" Namespace="calico-apiserver" Pod="calico-apiserver-7d55cd4957-jrtsh" WorkloadEndpoint="ip--172--31--17--147-k8s-calico--apiserver--7d55cd4957--jrtsh-eth0" Sep 12 17:43:38.839394 containerd[2001]: 2025-09-12 17:43:38.799 [INFO][5158] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6e037df9d6e5939375bc5ff7260e41dba045ec436089247796695e16058f4581" Namespace="calico-apiserver" Pod="calico-apiserver-7d55cd4957-jrtsh" WorkloadEndpoint="ip--172--31--17--147-k8s-calico--apiserver--7d55cd4957--jrtsh-eth0" Sep 12 17:43:38.839531 containerd[2001]: 2025-09-12 17:43:38.799 [INFO][5158] cni-plugin/k8s.go 446: Added Mac, 
interface name, and active container ID to endpoint ContainerID="6e037df9d6e5939375bc5ff7260e41dba045ec436089247796695e16058f4581" Namespace="calico-apiserver" Pod="calico-apiserver-7d55cd4957-jrtsh" WorkloadEndpoint="ip--172--31--17--147-k8s-calico--apiserver--7d55cd4957--jrtsh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--17--147-k8s-calico--apiserver--7d55cd4957--jrtsh-eth0", GenerateName:"calico-apiserver-7d55cd4957-", Namespace:"calico-apiserver", SelfLink:"", UID:"51120cf8-ca12-4fc1-807c-f699b7fc3a7c", ResourceVersion:"828", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 43, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7d55cd4957", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-17-147", ContainerID:"6e037df9d6e5939375bc5ff7260e41dba045ec436089247796695e16058f4581", Pod:"calico-apiserver-7d55cd4957-jrtsh", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.89.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif1d87bdfda1", MAC:"a6:59:65:97:ed:bc", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:43:38.839631 containerd[2001]: 2025-09-12 17:43:38.823 [INFO][5158] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6e037df9d6e5939375bc5ff7260e41dba045ec436089247796695e16058f4581" Namespace="calico-apiserver" Pod="calico-apiserver-7d55cd4957-jrtsh" WorkloadEndpoint="ip--172--31--17--147-k8s-calico--apiserver--7d55cd4957--jrtsh-eth0" Sep 12 17:43:38.945051 containerd[2001]: time="2025-09-12T17:43:38.945002610Z" level=info msg="connecting to shim 6e037df9d6e5939375bc5ff7260e41dba045ec436089247796695e16058f4581" address="unix:///run/containerd/s/d29be2ac35a4f9f5abc54d1234e99ba9b88f7bee5062f5260c33f486f79c0ab0" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:43:39.015260 systemd[1]: Started cri-containerd-6e037df9d6e5939375bc5ff7260e41dba045ec436089247796695e16058f4581.scope - libcontainer container 6e037df9d6e5939375bc5ff7260e41dba045ec436089247796695e16058f4581. Sep 12 17:43:39.088993 systemd-networkd[1863]: calid0d4a7e7c93: Gained IPv6LL Sep 12 17:43:39.092081 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1654419894.mount: Deactivated successfully. 
Sep 12 17:43:39.112916 containerd[2001]: time="2025-09-12T17:43:39.112871260Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:39.115837 containerd[2001]: time="2025-09-12T17:43:39.115796116Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Sep 12 17:43:39.117946 containerd[2001]: time="2025-09-12T17:43:39.117907238Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:39.121406 containerd[2001]: time="2025-09-12T17:43:39.121281365Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:39.122621 containerd[2001]: time="2025-09-12T17:43:39.122067171Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 3.369600557s" Sep 12 17:43:39.122621 containerd[2001]: time="2025-09-12T17:43:39.122426899Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Sep 12 17:43:39.126184 containerd[2001]: time="2025-09-12T17:43:39.125997064Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 12 17:43:39.132612 containerd[2001]: time="2025-09-12T17:43:39.132386364Z" level=info msg="CreateContainer within sandbox \"c879fcfdf1ac6ad2e3f5e32238d23678fc226c5bab6ecc3265be5257a47a08b7\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 12 17:43:39.186683 containerd[2001]: time="2025-09-12T17:43:39.185113978Z" level=info msg="Container 401771a9caa68d4f36cb226e4a97f3bf4a9d8dccd32fd63bcd080bb3cc085be4: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:43:39.216602 containerd[2001]: time="2025-09-12T17:43:39.216545623Z" level=info msg="CreateContainer within sandbox \"c879fcfdf1ac6ad2e3f5e32238d23678fc226c5bab6ecc3265be5257a47a08b7\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"401771a9caa68d4f36cb226e4a97f3bf4a9d8dccd32fd63bcd080bb3cc085be4\"" Sep 12 17:43:39.218129 containerd[2001]: time="2025-09-12T17:43:39.218092613Z" level=info msg="StartContainer for \"401771a9caa68d4f36cb226e4a97f3bf4a9d8dccd32fd63bcd080bb3cc085be4\"" Sep 12 17:43:39.221177 containerd[2001]: time="2025-09-12T17:43:39.221137843Z" level=info msg="connecting to shim 401771a9caa68d4f36cb226e4a97f3bf4a9d8dccd32fd63bcd080bb3cc085be4" address="unix:///run/containerd/s/df7e7e5934b77e06f24dedd9d5473aa00e3563e65529f5238047498a24a55a77" protocol=ttrpc version=3 Sep 12 17:43:39.256274 systemd[1]: Started cri-containerd-401771a9caa68d4f36cb226e4a97f3bf4a9d8dccd32fd63bcd080bb3cc085be4.scope - libcontainer container 401771a9caa68d4f36cb226e4a97f3bf4a9d8dccd32fd63bcd080bb3cc085be4. 
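The pull record above reports both the payload size for the whisker-backend image (33,085,375 bytes for the resolved digest) and the elapsed pull time (3.369600557 s), which works out to roughly 9.8 MB/s. The snippet below simply reproduces that arithmetic from the logged values.

```go
// Effective pull rate implied by the logged whisker-backend pull:
// 33,085,375 bytes in 3.369600557s is roughly 9.8 MB/s (9.4 MiB/s).
package main

import "fmt"

func main() {
	const bytes = 33085375.0    // repo digest size from the "Pulled image" entry
	const seconds = 3.369600557 // elapsed time from the same entry

	fmt.Printf("%.1f MB/s (%.1f MiB/s)\n", bytes/seconds/1e6, bytes/seconds/float64(1<<20))
}
```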
Sep 12 17:43:39.371017 containerd[2001]: time="2025-09-12T17:43:39.370747178Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6dd46b974c-2hw57,Uid:9aa4e000-a87f-4a24-b970-b02f33097384,Namespace:calico-system,Attempt:0,}" Sep 12 17:43:39.484179 systemd[1]: Started sshd@9-172.31.17.147:22-139.178.68.195:36438.service - OpenSSH per-connection server daemon (139.178.68.195:36438). Sep 12 17:43:39.536928 systemd-networkd[1863]: cali123a990e1b6: Gained IPv6LL Sep 12 17:43:39.567204 containerd[2001]: time="2025-09-12T17:43:39.567147972Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d55cd4957-jrtsh,Uid:51120cf8-ca12-4fc1-807c-f699b7fc3a7c,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"6e037df9d6e5939375bc5ff7260e41dba045ec436089247796695e16058f4581\"" Sep 12 17:43:39.747060 containerd[2001]: time="2025-09-12T17:43:39.746998645Z" level=info msg="StartContainer for \"401771a9caa68d4f36cb226e4a97f3bf4a9d8dccd32fd63bcd080bb3cc085be4\" returns successfully" Sep 12 17:43:39.794931 sshd[5281]: Accepted publickey for core from 139.178.68.195 port 36438 ssh2: RSA SHA256:W4YS5mF+BRASipQdoc83aYiclsJC9j/B8FPkaJHVZC4 Sep 12 17:43:39.827954 sshd-session[5281]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:43:39.842533 systemd-logind[1987]: New session 10 of user core. Sep 12 17:43:39.850299 systemd[1]: Started session-10.scope - Session 10 of User core. Sep 12 17:43:39.857895 systemd-networkd[1863]: calif1d87bdfda1: Gained IPv6LL Sep 12 17:43:39.910030 systemd-networkd[1863]: cali61ce5cabd16: Link UP Sep 12 17:43:39.911209 systemd-networkd[1863]: cali61ce5cabd16: Gained carrier Sep 12 17:43:39.940577 containerd[2001]: 2025-09-12 17:43:39.437 [INFO][5267] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 17:43:39.940577 containerd[2001]: 2025-09-12 17:43:39.454 [INFO][5267] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--17--147-k8s-calico--kube--controllers--6dd46b974c--2hw57-eth0 calico-kube-controllers-6dd46b974c- calico-system 9aa4e000-a87f-4a24-b970-b02f33097384 825 0 2025-09-12 17:43:12 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:6dd46b974c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ip-172-31-17-147 calico-kube-controllers-6dd46b974c-2hw57 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali61ce5cabd16 [] [] }} ContainerID="8a0e8db4f7fd39beddc043d5fc91da6cfbb2a22951d0ec021fd33f56aa0b3253" Namespace="calico-system" Pod="calico-kube-controllers-6dd46b974c-2hw57" WorkloadEndpoint="ip--172--31--17--147-k8s-calico--kube--controllers--6dd46b974c--2hw57-" Sep 12 17:43:39.940577 containerd[2001]: 2025-09-12 17:43:39.454 [INFO][5267] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8a0e8db4f7fd39beddc043d5fc91da6cfbb2a22951d0ec021fd33f56aa0b3253" Namespace="calico-system" Pod="calico-kube-controllers-6dd46b974c-2hw57" WorkloadEndpoint="ip--172--31--17--147-k8s-calico--kube--controllers--6dd46b974c--2hw57-eth0" Sep 12 17:43:39.940577 containerd[2001]: 2025-09-12 17:43:39.636 [INFO][5284] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8a0e8db4f7fd39beddc043d5fc91da6cfbb2a22951d0ec021fd33f56aa0b3253" 
HandleID="k8s-pod-network.8a0e8db4f7fd39beddc043d5fc91da6cfbb2a22951d0ec021fd33f56aa0b3253" Workload="ip--172--31--17--147-k8s-calico--kube--controllers--6dd46b974c--2hw57-eth0" Sep 12 17:43:39.941677 containerd[2001]: 2025-09-12 17:43:39.638 [INFO][5284] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8a0e8db4f7fd39beddc043d5fc91da6cfbb2a22951d0ec021fd33f56aa0b3253" HandleID="k8s-pod-network.8a0e8db4f7fd39beddc043d5fc91da6cfbb2a22951d0ec021fd33f56aa0b3253" Workload="ip--172--31--17--147-k8s-calico--kube--controllers--6dd46b974c--2hw57-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002778b0), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-17-147", "pod":"calico-kube-controllers-6dd46b974c-2hw57", "timestamp":"2025-09-12 17:43:39.634414662 +0000 UTC"}, Hostname:"ip-172-31-17-147", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:43:39.941677 containerd[2001]: 2025-09-12 17:43:39.638 [INFO][5284] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:43:39.941677 containerd[2001]: 2025-09-12 17:43:39.641 [INFO][5284] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:43:39.941677 containerd[2001]: 2025-09-12 17:43:39.641 [INFO][5284] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-17-147' Sep 12 17:43:39.941677 containerd[2001]: 2025-09-12 17:43:39.691 [INFO][5284] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8a0e8db4f7fd39beddc043d5fc91da6cfbb2a22951d0ec021fd33f56aa0b3253" host="ip-172-31-17-147" Sep 12 17:43:39.941677 containerd[2001]: 2025-09-12 17:43:39.717 [INFO][5284] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-17-147" Sep 12 17:43:39.941677 containerd[2001]: 2025-09-12 17:43:39.773 [INFO][5284] ipam/ipam.go 511: Trying affinity for 192.168.89.128/26 host="ip-172-31-17-147" Sep 12 17:43:39.941677 containerd[2001]: 2025-09-12 17:43:39.784 [INFO][5284] ipam/ipam.go 158: Attempting to load block cidr=192.168.89.128/26 host="ip-172-31-17-147" Sep 12 17:43:39.941677 containerd[2001]: 2025-09-12 17:43:39.832 [INFO][5284] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.89.128/26 host="ip-172-31-17-147" Sep 12 17:43:39.942298 containerd[2001]: 2025-09-12 17:43:39.834 [INFO][5284] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.89.128/26 handle="k8s-pod-network.8a0e8db4f7fd39beddc043d5fc91da6cfbb2a22951d0ec021fd33f56aa0b3253" host="ip-172-31-17-147" Sep 12 17:43:39.942298 containerd[2001]: 2025-09-12 17:43:39.848 [INFO][5284] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.8a0e8db4f7fd39beddc043d5fc91da6cfbb2a22951d0ec021fd33f56aa0b3253 Sep 12 17:43:39.942298 containerd[2001]: 2025-09-12 17:43:39.866 [INFO][5284] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.89.128/26 handle="k8s-pod-network.8a0e8db4f7fd39beddc043d5fc91da6cfbb2a22951d0ec021fd33f56aa0b3253" host="ip-172-31-17-147" Sep 12 17:43:39.942298 containerd[2001]: 2025-09-12 17:43:39.900 [INFO][5284] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.89.133/26] block=192.168.89.128/26 handle="k8s-pod-network.8a0e8db4f7fd39beddc043d5fc91da6cfbb2a22951d0ec021fd33f56aa0b3253" host="ip-172-31-17-147" Sep 12 17:43:39.942298 containerd[2001]: 2025-09-12 17:43:39.900 [INFO][5284] ipam/ipam.go 878: Auto-assigned 1 
out of 1 IPv4s: [192.168.89.133/26] handle="k8s-pod-network.8a0e8db4f7fd39beddc043d5fc91da6cfbb2a22951d0ec021fd33f56aa0b3253" host="ip-172-31-17-147" Sep 12 17:43:39.942298 containerd[2001]: 2025-09-12 17:43:39.900 [INFO][5284] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:43:39.942298 containerd[2001]: 2025-09-12 17:43:39.900 [INFO][5284] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.89.133/26] IPv6=[] ContainerID="8a0e8db4f7fd39beddc043d5fc91da6cfbb2a22951d0ec021fd33f56aa0b3253" HandleID="k8s-pod-network.8a0e8db4f7fd39beddc043d5fc91da6cfbb2a22951d0ec021fd33f56aa0b3253" Workload="ip--172--31--17--147-k8s-calico--kube--controllers--6dd46b974c--2hw57-eth0" Sep 12 17:43:39.943145 containerd[2001]: 2025-09-12 17:43:39.904 [INFO][5267] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8a0e8db4f7fd39beddc043d5fc91da6cfbb2a22951d0ec021fd33f56aa0b3253" Namespace="calico-system" Pod="calico-kube-controllers-6dd46b974c-2hw57" WorkloadEndpoint="ip--172--31--17--147-k8s-calico--kube--controllers--6dd46b974c--2hw57-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--17--147-k8s-calico--kube--controllers--6dd46b974c--2hw57-eth0", GenerateName:"calico-kube-controllers-6dd46b974c-", Namespace:"calico-system", SelfLink:"", UID:"9aa4e000-a87f-4a24-b970-b02f33097384", ResourceVersion:"825", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 43, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6dd46b974c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-17-147", ContainerID:"", Pod:"calico-kube-controllers-6dd46b974c-2hw57", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.89.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali61ce5cabd16", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:43:39.943463 containerd[2001]: 2025-09-12 17:43:39.905 [INFO][5267] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.89.133/32] ContainerID="8a0e8db4f7fd39beddc043d5fc91da6cfbb2a22951d0ec021fd33f56aa0b3253" Namespace="calico-system" Pod="calico-kube-controllers-6dd46b974c-2hw57" WorkloadEndpoint="ip--172--31--17--147-k8s-calico--kube--controllers--6dd46b974c--2hw57-eth0" Sep 12 17:43:39.943463 containerd[2001]: 2025-09-12 17:43:39.905 [INFO][5267] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali61ce5cabd16 ContainerID="8a0e8db4f7fd39beddc043d5fc91da6cfbb2a22951d0ec021fd33f56aa0b3253" Namespace="calico-system" Pod="calico-kube-controllers-6dd46b974c-2hw57" WorkloadEndpoint="ip--172--31--17--147-k8s-calico--kube--controllers--6dd46b974c--2hw57-eth0" Sep 12 17:43:39.943463 containerd[2001]: 2025-09-12 17:43:39.909 [INFO][5267] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="8a0e8db4f7fd39beddc043d5fc91da6cfbb2a22951d0ec021fd33f56aa0b3253" Namespace="calico-system" Pod="calico-kube-controllers-6dd46b974c-2hw57" WorkloadEndpoint="ip--172--31--17--147-k8s-calico--kube--controllers--6dd46b974c--2hw57-eth0" Sep 12 17:43:39.944081 containerd[2001]: 2025-09-12 17:43:39.909 [INFO][5267] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8a0e8db4f7fd39beddc043d5fc91da6cfbb2a22951d0ec021fd33f56aa0b3253" Namespace="calico-system" Pod="calico-kube-controllers-6dd46b974c-2hw57" WorkloadEndpoint="ip--172--31--17--147-k8s-calico--kube--controllers--6dd46b974c--2hw57-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--17--147-k8s-calico--kube--controllers--6dd46b974c--2hw57-eth0", GenerateName:"calico-kube-controllers-6dd46b974c-", Namespace:"calico-system", SelfLink:"", UID:"9aa4e000-a87f-4a24-b970-b02f33097384", ResourceVersion:"825", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 43, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6dd46b974c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-17-147", ContainerID:"8a0e8db4f7fd39beddc043d5fc91da6cfbb2a22951d0ec021fd33f56aa0b3253", Pod:"calico-kube-controllers-6dd46b974c-2hw57", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.89.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali61ce5cabd16", MAC:"7a:71:e0:1f:d5:98", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:43:39.944203 containerd[2001]: 2025-09-12 17:43:39.934 [INFO][5267] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8a0e8db4f7fd39beddc043d5fc91da6cfbb2a22951d0ec021fd33f56aa0b3253" Namespace="calico-system" Pod="calico-kube-controllers-6dd46b974c-2hw57" WorkloadEndpoint="ip--172--31--17--147-k8s-calico--kube--controllers--6dd46b974c--2hw57-eth0" Sep 12 17:43:39.951604 kubelet[3324]: I0912 17:43:39.951537 3324 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-6bc5b8687b-9v5r5" podStartSLOduration=3.104889135 podStartE2EDuration="7.939888316s" podCreationTimestamp="2025-09-12 17:43:32 +0000 UTC" firstStartedPulling="2025-09-12 17:43:34.289746885 +0000 UTC m=+49.056643972" lastFinishedPulling="2025-09-12 17:43:39.124746058 +0000 UTC m=+53.891643153" observedRunningTime="2025-09-12 17:43:39.851237873 +0000 UTC m=+54.618134985" watchObservedRunningTime="2025-09-12 17:43:39.939888316 +0000 UTC m=+54.706785411" Sep 12 17:43:40.008648 containerd[2001]: time="2025-09-12T17:43:40.008511951Z" level=info msg="connecting to shim 8a0e8db4f7fd39beddc043d5fc91da6cfbb2a22951d0ec021fd33f56aa0b3253" address="unix:///run/containerd/s/01ab5d1bd4eb078c8baa0e7e7f6eb2a7752568ea8df3c85539b650b0008648d2" 
namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:43:40.081419 systemd[1]: Started cri-containerd-8a0e8db4f7fd39beddc043d5fc91da6cfbb2a22951d0ec021fd33f56aa0b3253.scope - libcontainer container 8a0e8db4f7fd39beddc043d5fc91da6cfbb2a22951d0ec021fd33f56aa0b3253. Sep 12 17:43:40.295376 containerd[2001]: time="2025-09-12T17:43:40.295092457Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6dd46b974c-2hw57,Uid:9aa4e000-a87f-4a24-b970-b02f33097384,Namespace:calico-system,Attempt:0,} returns sandbox id \"8a0e8db4f7fd39beddc043d5fc91da6cfbb2a22951d0ec021fd33f56aa0b3253\"" Sep 12 17:43:40.368009 containerd[2001]: time="2025-09-12T17:43:40.367887744Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-khsdd,Uid:e23da1d8-1bca-4b44-9e73-fa38bacfb830,Namespace:kube-system,Attempt:0,}" Sep 12 17:43:40.369034 containerd[2001]: time="2025-09-12T17:43:40.368893433Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-kfp5q,Uid:37d43a87-2d60-4e31-bcad-4f3417a16039,Namespace:kube-system,Attempt:0,}" Sep 12 17:43:40.369997 containerd[2001]: time="2025-09-12T17:43:40.369225358Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d55cd4957-hw5ks,Uid:675ef9e0-ef8b-4d35-995f-611dbe700deb,Namespace:calico-apiserver,Attempt:0,}" Sep 12 17:43:41.074170 systemd-networkd[1863]: cali61ce5cabd16: Gained IPv6LL Sep 12 17:43:41.078947 systemd-networkd[1863]: calic61f63cbdac: Link UP Sep 12 17:43:41.079246 systemd-networkd[1863]: calic61f63cbdac: Gained carrier Sep 12 17:43:41.153794 containerd[2001]: 2025-09-12 17:43:40.574 [INFO][5399] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--17--147-k8s-coredns--7c65d6cfc9--kfp5q-eth0 coredns-7c65d6cfc9- kube-system 37d43a87-2d60-4e31-bcad-4f3417a16039 824 0 2025-09-12 17:42:51 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ip-172-31-17-147 coredns-7c65d6cfc9-kfp5q eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calic61f63cbdac [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="42c88128475d07f28848c4613c540bf930edd612c70daf69e5d4fc981d3fee9d" Namespace="kube-system" Pod="coredns-7c65d6cfc9-kfp5q" WorkloadEndpoint="ip--172--31--17--147-k8s-coredns--7c65d6cfc9--kfp5q-" Sep 12 17:43:41.153794 containerd[2001]: 2025-09-12 17:43:40.578 [INFO][5399] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="42c88128475d07f28848c4613c540bf930edd612c70daf69e5d4fc981d3fee9d" Namespace="kube-system" Pod="coredns-7c65d6cfc9-kfp5q" WorkloadEndpoint="ip--172--31--17--147-k8s-coredns--7c65d6cfc9--kfp5q-eth0" Sep 12 17:43:41.153794 containerd[2001]: 2025-09-12 17:43:40.900 [INFO][5435] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="42c88128475d07f28848c4613c540bf930edd612c70daf69e5d4fc981d3fee9d" HandleID="k8s-pod-network.42c88128475d07f28848c4613c540bf930edd612c70daf69e5d4fc981d3fee9d" Workload="ip--172--31--17--147-k8s-coredns--7c65d6cfc9--kfp5q-eth0" Sep 12 17:43:41.154882 containerd[2001]: 2025-09-12 17:43:40.902 [INFO][5435] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="42c88128475d07f28848c4613c540bf930edd612c70daf69e5d4fc981d3fee9d" HandleID="k8s-pod-network.42c88128475d07f28848c4613c540bf930edd612c70daf69e5d4fc981d3fee9d" 
Workload="ip--172--31--17--147-k8s-coredns--7c65d6cfc9--kfp5q-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000392120), Attrs:map[string]string{"namespace":"kube-system", "node":"ip-172-31-17-147", "pod":"coredns-7c65d6cfc9-kfp5q", "timestamp":"2025-09-12 17:43:40.900196827 +0000 UTC"}, Hostname:"ip-172-31-17-147", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:43:41.154882 containerd[2001]: 2025-09-12 17:43:40.902 [INFO][5435] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:43:41.154882 containerd[2001]: 2025-09-12 17:43:40.903 [INFO][5435] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:43:41.154882 containerd[2001]: 2025-09-12 17:43:40.903 [INFO][5435] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-17-147' Sep 12 17:43:41.154882 containerd[2001]: 2025-09-12 17:43:40.940 [INFO][5435] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.42c88128475d07f28848c4613c540bf930edd612c70daf69e5d4fc981d3fee9d" host="ip-172-31-17-147" Sep 12 17:43:41.154882 containerd[2001]: 2025-09-12 17:43:40.951 [INFO][5435] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-17-147" Sep 12 17:43:41.154882 containerd[2001]: 2025-09-12 17:43:40.985 [INFO][5435] ipam/ipam.go 511: Trying affinity for 192.168.89.128/26 host="ip-172-31-17-147" Sep 12 17:43:41.154882 containerd[2001]: 2025-09-12 17:43:40.990 [INFO][5435] ipam/ipam.go 158: Attempting to load block cidr=192.168.89.128/26 host="ip-172-31-17-147" Sep 12 17:43:41.154882 containerd[2001]: 2025-09-12 17:43:40.997 [INFO][5435] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.89.128/26 host="ip-172-31-17-147" Sep 12 17:43:41.155327 containerd[2001]: 2025-09-12 17:43:40.999 [INFO][5435] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.89.128/26 handle="k8s-pod-network.42c88128475d07f28848c4613c540bf930edd612c70daf69e5d4fc981d3fee9d" host="ip-172-31-17-147" Sep 12 17:43:41.155327 containerd[2001]: 2025-09-12 17:43:41.007 [INFO][5435] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.42c88128475d07f28848c4613c540bf930edd612c70daf69e5d4fc981d3fee9d Sep 12 17:43:41.155327 containerd[2001]: 2025-09-12 17:43:41.024 [INFO][5435] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.89.128/26 handle="k8s-pod-network.42c88128475d07f28848c4613c540bf930edd612c70daf69e5d4fc981d3fee9d" host="ip-172-31-17-147" Sep 12 17:43:41.155327 containerd[2001]: 2025-09-12 17:43:41.056 [INFO][5435] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.89.134/26] block=192.168.89.128/26 handle="k8s-pod-network.42c88128475d07f28848c4613c540bf930edd612c70daf69e5d4fc981d3fee9d" host="ip-172-31-17-147" Sep 12 17:43:41.155327 containerd[2001]: 2025-09-12 17:43:41.056 [INFO][5435] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.89.134/26] handle="k8s-pod-network.42c88128475d07f28848c4613c540bf930edd612c70daf69e5d4fc981d3fee9d" host="ip-172-31-17-147" Sep 12 17:43:41.155327 containerd[2001]: 2025-09-12 17:43:41.056 [INFO][5435] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 17:43:41.155327 containerd[2001]: 2025-09-12 17:43:41.056 [INFO][5435] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.89.134/26] IPv6=[] ContainerID="42c88128475d07f28848c4613c540bf930edd612c70daf69e5d4fc981d3fee9d" HandleID="k8s-pod-network.42c88128475d07f28848c4613c540bf930edd612c70daf69e5d4fc981d3fee9d" Workload="ip--172--31--17--147-k8s-coredns--7c65d6cfc9--kfp5q-eth0" Sep 12 17:43:41.155626 containerd[2001]: 2025-09-12 17:43:41.066 [INFO][5399] cni-plugin/k8s.go 418: Populated endpoint ContainerID="42c88128475d07f28848c4613c540bf930edd612c70daf69e5d4fc981d3fee9d" Namespace="kube-system" Pod="coredns-7c65d6cfc9-kfp5q" WorkloadEndpoint="ip--172--31--17--147-k8s-coredns--7c65d6cfc9--kfp5q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--17--147-k8s-coredns--7c65d6cfc9--kfp5q-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"37d43a87-2d60-4e31-bcad-4f3417a16039", ResourceVersion:"824", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 42, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-17-147", ContainerID:"", Pod:"coredns-7c65d6cfc9-kfp5q", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.89.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calic61f63cbdac", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:43:41.155626 containerd[2001]: 2025-09-12 17:43:41.067 [INFO][5399] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.89.134/32] ContainerID="42c88128475d07f28848c4613c540bf930edd612c70daf69e5d4fc981d3fee9d" Namespace="kube-system" Pod="coredns-7c65d6cfc9-kfp5q" WorkloadEndpoint="ip--172--31--17--147-k8s-coredns--7c65d6cfc9--kfp5q-eth0" Sep 12 17:43:41.155626 containerd[2001]: 2025-09-12 17:43:41.067 [INFO][5399] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic61f63cbdac ContainerID="42c88128475d07f28848c4613c540bf930edd612c70daf69e5d4fc981d3fee9d" Namespace="kube-system" Pod="coredns-7c65d6cfc9-kfp5q" WorkloadEndpoint="ip--172--31--17--147-k8s-coredns--7c65d6cfc9--kfp5q-eth0" Sep 12 17:43:41.155626 containerd[2001]: 2025-09-12 17:43:41.079 [INFO][5399] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="42c88128475d07f28848c4613c540bf930edd612c70daf69e5d4fc981d3fee9d" Namespace="kube-system" Pod="coredns-7c65d6cfc9-kfp5q" 
WorkloadEndpoint="ip--172--31--17--147-k8s-coredns--7c65d6cfc9--kfp5q-eth0" Sep 12 17:43:41.155626 containerd[2001]: 2025-09-12 17:43:41.087 [INFO][5399] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="42c88128475d07f28848c4613c540bf930edd612c70daf69e5d4fc981d3fee9d" Namespace="kube-system" Pod="coredns-7c65d6cfc9-kfp5q" WorkloadEndpoint="ip--172--31--17--147-k8s-coredns--7c65d6cfc9--kfp5q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--17--147-k8s-coredns--7c65d6cfc9--kfp5q-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"37d43a87-2d60-4e31-bcad-4f3417a16039", ResourceVersion:"824", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 42, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-17-147", ContainerID:"42c88128475d07f28848c4613c540bf930edd612c70daf69e5d4fc981d3fee9d", Pod:"coredns-7c65d6cfc9-kfp5q", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.89.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calic61f63cbdac", MAC:"7e:61:38:29:95:db", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:43:41.155626 containerd[2001]: 2025-09-12 17:43:41.130 [INFO][5399] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="42c88128475d07f28848c4613c540bf930edd612c70daf69e5d4fc981d3fee9d" Namespace="kube-system" Pod="coredns-7c65d6cfc9-kfp5q" WorkloadEndpoint="ip--172--31--17--147-k8s-coredns--7c65d6cfc9--kfp5q-eth0" Sep 12 17:43:41.171797 sshd[5323]: Connection closed by 139.178.68.195 port 36438 Sep 12 17:43:41.171243 sshd-session[5281]: pam_unix(sshd:session): session closed for user core Sep 12 17:43:41.187540 systemd[1]: sshd@9-172.31.17.147:22-139.178.68.195:36438.service: Deactivated successfully. Sep 12 17:43:41.194026 systemd[1]: session-10.scope: Deactivated successfully. Sep 12 17:43:41.206217 systemd-logind[1987]: Session 10 logged out. Waiting for processes to exit. Sep 12 17:43:41.210300 systemd-logind[1987]: Removed session 10. 
Sep 12 17:43:41.364744 systemd-networkd[1863]: calib2ca56d3bb7: Link UP Sep 12 17:43:41.384311 systemd-networkd[1863]: calib2ca56d3bb7: Gained carrier Sep 12 17:43:41.410261 containerd[2001]: time="2025-09-12T17:43:41.410201859Z" level=info msg="connecting to shim 42c88128475d07f28848c4613c540bf930edd612c70daf69e5d4fc981d3fee9d" address="unix:///run/containerd/s/7dda8439fe6ab681bd1a085c4eaf8c362d7aa9f5f7ecb69654beab044154e282" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:43:41.459988 containerd[2001]: 2025-09-12 17:43:40.619 [INFO][5393] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--17--147-k8s-coredns--7c65d6cfc9--khsdd-eth0 coredns-7c65d6cfc9- kube-system e23da1d8-1bca-4b44-9e73-fa38bacfb830 822 0 2025-09-12 17:42:51 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ip-172-31-17-147 coredns-7c65d6cfc9-khsdd eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calib2ca56d3bb7 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="a86dd12f666a17e0b23b6acfc09055c822180fb25ccbe010092512686b7d1362" Namespace="kube-system" Pod="coredns-7c65d6cfc9-khsdd" WorkloadEndpoint="ip--172--31--17--147-k8s-coredns--7c65d6cfc9--khsdd-" Sep 12 17:43:41.459988 containerd[2001]: 2025-09-12 17:43:40.619 [INFO][5393] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a86dd12f666a17e0b23b6acfc09055c822180fb25ccbe010092512686b7d1362" Namespace="kube-system" Pod="coredns-7c65d6cfc9-khsdd" WorkloadEndpoint="ip--172--31--17--147-k8s-coredns--7c65d6cfc9--khsdd-eth0" Sep 12 17:43:41.459988 containerd[2001]: 2025-09-12 17:43:40.913 [INFO][5442] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a86dd12f666a17e0b23b6acfc09055c822180fb25ccbe010092512686b7d1362" HandleID="k8s-pod-network.a86dd12f666a17e0b23b6acfc09055c822180fb25ccbe010092512686b7d1362" Workload="ip--172--31--17--147-k8s-coredns--7c65d6cfc9--khsdd-eth0" Sep 12 17:43:41.459988 containerd[2001]: 2025-09-12 17:43:40.915 [INFO][5442] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a86dd12f666a17e0b23b6acfc09055c822180fb25ccbe010092512686b7d1362" HandleID="k8s-pod-network.a86dd12f666a17e0b23b6acfc09055c822180fb25ccbe010092512686b7d1362" Workload="ip--172--31--17--147-k8s-coredns--7c65d6cfc9--khsdd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0001227a0), Attrs:map[string]string{"namespace":"kube-system", "node":"ip-172-31-17-147", "pod":"coredns-7c65d6cfc9-khsdd", "timestamp":"2025-09-12 17:43:40.912281682 +0000 UTC"}, Hostname:"ip-172-31-17-147", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:43:41.459988 containerd[2001]: 2025-09-12 17:43:40.915 [INFO][5442] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:43:41.459988 containerd[2001]: 2025-09-12 17:43:41.057 [INFO][5442] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:43:41.459988 containerd[2001]: 2025-09-12 17:43:41.057 [INFO][5442] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-17-147' Sep 12 17:43:41.459988 containerd[2001]: 2025-09-12 17:43:41.137 [INFO][5442] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a86dd12f666a17e0b23b6acfc09055c822180fb25ccbe010092512686b7d1362" host="ip-172-31-17-147" Sep 12 17:43:41.459988 containerd[2001]: 2025-09-12 17:43:41.161 [INFO][5442] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-17-147" Sep 12 17:43:41.459988 containerd[2001]: 2025-09-12 17:43:41.225 [INFO][5442] ipam/ipam.go 511: Trying affinity for 192.168.89.128/26 host="ip-172-31-17-147" Sep 12 17:43:41.459988 containerd[2001]: 2025-09-12 17:43:41.235 [INFO][5442] ipam/ipam.go 158: Attempting to load block cidr=192.168.89.128/26 host="ip-172-31-17-147" Sep 12 17:43:41.459988 containerd[2001]: 2025-09-12 17:43:41.260 [INFO][5442] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.89.128/26 host="ip-172-31-17-147" Sep 12 17:43:41.459988 containerd[2001]: 2025-09-12 17:43:41.260 [INFO][5442] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.89.128/26 handle="k8s-pod-network.a86dd12f666a17e0b23b6acfc09055c822180fb25ccbe010092512686b7d1362" host="ip-172-31-17-147" Sep 12 17:43:41.459988 containerd[2001]: 2025-09-12 17:43:41.271 [INFO][5442] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a86dd12f666a17e0b23b6acfc09055c822180fb25ccbe010092512686b7d1362 Sep 12 17:43:41.459988 containerd[2001]: 2025-09-12 17:43:41.286 [INFO][5442] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.89.128/26 handle="k8s-pod-network.a86dd12f666a17e0b23b6acfc09055c822180fb25ccbe010092512686b7d1362" host="ip-172-31-17-147" Sep 12 17:43:41.459988 containerd[2001]: 2025-09-12 17:43:41.311 [INFO][5442] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.89.135/26] block=192.168.89.128/26 handle="k8s-pod-network.a86dd12f666a17e0b23b6acfc09055c822180fb25ccbe010092512686b7d1362" host="ip-172-31-17-147" Sep 12 17:43:41.459988 containerd[2001]: 2025-09-12 17:43:41.311 [INFO][5442] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.89.135/26] handle="k8s-pod-network.a86dd12f666a17e0b23b6acfc09055c822180fb25ccbe010092512686b7d1362" host="ip-172-31-17-147" Sep 12 17:43:41.459988 containerd[2001]: 2025-09-12 17:43:41.311 [INFO][5442] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 17:43:41.459988 containerd[2001]: 2025-09-12 17:43:41.311 [INFO][5442] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.89.135/26] IPv6=[] ContainerID="a86dd12f666a17e0b23b6acfc09055c822180fb25ccbe010092512686b7d1362" HandleID="k8s-pod-network.a86dd12f666a17e0b23b6acfc09055c822180fb25ccbe010092512686b7d1362" Workload="ip--172--31--17--147-k8s-coredns--7c65d6cfc9--khsdd-eth0" Sep 12 17:43:41.465001 containerd[2001]: 2025-09-12 17:43:41.320 [INFO][5393] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a86dd12f666a17e0b23b6acfc09055c822180fb25ccbe010092512686b7d1362" Namespace="kube-system" Pod="coredns-7c65d6cfc9-khsdd" WorkloadEndpoint="ip--172--31--17--147-k8s-coredns--7c65d6cfc9--khsdd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--17--147-k8s-coredns--7c65d6cfc9--khsdd-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"e23da1d8-1bca-4b44-9e73-fa38bacfb830", ResourceVersion:"822", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 42, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-17-147", ContainerID:"", Pod:"coredns-7c65d6cfc9-khsdd", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.89.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib2ca56d3bb7", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:43:41.465001 containerd[2001]: 2025-09-12 17:43:41.321 [INFO][5393] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.89.135/32] ContainerID="a86dd12f666a17e0b23b6acfc09055c822180fb25ccbe010092512686b7d1362" Namespace="kube-system" Pod="coredns-7c65d6cfc9-khsdd" WorkloadEndpoint="ip--172--31--17--147-k8s-coredns--7c65d6cfc9--khsdd-eth0" Sep 12 17:43:41.465001 containerd[2001]: 2025-09-12 17:43:41.321 [INFO][5393] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib2ca56d3bb7 ContainerID="a86dd12f666a17e0b23b6acfc09055c822180fb25ccbe010092512686b7d1362" Namespace="kube-system" Pod="coredns-7c65d6cfc9-khsdd" WorkloadEndpoint="ip--172--31--17--147-k8s-coredns--7c65d6cfc9--khsdd-eth0" Sep 12 17:43:41.465001 containerd[2001]: 2025-09-12 17:43:41.388 [INFO][5393] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a86dd12f666a17e0b23b6acfc09055c822180fb25ccbe010092512686b7d1362" Namespace="kube-system" Pod="coredns-7c65d6cfc9-khsdd" 
WorkloadEndpoint="ip--172--31--17--147-k8s-coredns--7c65d6cfc9--khsdd-eth0" Sep 12 17:43:41.465001 containerd[2001]: 2025-09-12 17:43:41.392 [INFO][5393] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a86dd12f666a17e0b23b6acfc09055c822180fb25ccbe010092512686b7d1362" Namespace="kube-system" Pod="coredns-7c65d6cfc9-khsdd" WorkloadEndpoint="ip--172--31--17--147-k8s-coredns--7c65d6cfc9--khsdd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--17--147-k8s-coredns--7c65d6cfc9--khsdd-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"e23da1d8-1bca-4b44-9e73-fa38bacfb830", ResourceVersion:"822", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 42, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-17-147", ContainerID:"a86dd12f666a17e0b23b6acfc09055c822180fb25ccbe010092512686b7d1362", Pod:"coredns-7c65d6cfc9-khsdd", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.89.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib2ca56d3bb7", MAC:"42:79:1a:41:0c:58", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:43:41.465001 containerd[2001]: 2025-09-12 17:43:41.435 [INFO][5393] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a86dd12f666a17e0b23b6acfc09055c822180fb25ccbe010092512686b7d1362" Namespace="kube-system" Pod="coredns-7c65d6cfc9-khsdd" WorkloadEndpoint="ip--172--31--17--147-k8s-coredns--7c65d6cfc9--khsdd-eth0" Sep 12 17:43:41.517054 systemd[1]: Started cri-containerd-42c88128475d07f28848c4613c540bf930edd612c70daf69e5d4fc981d3fee9d.scope - libcontainer container 42c88128475d07f28848c4613c540bf930edd612c70daf69e5d4fc981d3fee9d. 
Sep 12 17:43:41.561149 systemd-networkd[1863]: cali966eedafe7e: Link UP Sep 12 17:43:41.562832 systemd-networkd[1863]: cali966eedafe7e: Gained carrier Sep 12 17:43:41.647982 containerd[2001]: 2025-09-12 17:43:40.684 [INFO][5394] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--17--147-k8s-calico--apiserver--7d55cd4957--hw5ks-eth0 calico-apiserver-7d55cd4957- calico-apiserver 675ef9e0-ef8b-4d35-995f-611dbe700deb 826 0 2025-09-12 17:43:07 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7d55cd4957 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ip-172-31-17-147 calico-apiserver-7d55cd4957-hw5ks eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali966eedafe7e [] [] }} ContainerID="14f59ea5b75821eeb6c6d4aa87c044ead2b934ded80b3129c700b0a9595243de" Namespace="calico-apiserver" Pod="calico-apiserver-7d55cd4957-hw5ks" WorkloadEndpoint="ip--172--31--17--147-k8s-calico--apiserver--7d55cd4957--hw5ks-" Sep 12 17:43:41.647982 containerd[2001]: 2025-09-12 17:43:40.685 [INFO][5394] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="14f59ea5b75821eeb6c6d4aa87c044ead2b934ded80b3129c700b0a9595243de" Namespace="calico-apiserver" Pod="calico-apiserver-7d55cd4957-hw5ks" WorkloadEndpoint="ip--172--31--17--147-k8s-calico--apiserver--7d55cd4957--hw5ks-eth0" Sep 12 17:43:41.647982 containerd[2001]: 2025-09-12 17:43:40.968 [INFO][5448] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="14f59ea5b75821eeb6c6d4aa87c044ead2b934ded80b3129c700b0a9595243de" HandleID="k8s-pod-network.14f59ea5b75821eeb6c6d4aa87c044ead2b934ded80b3129c700b0a9595243de" Workload="ip--172--31--17--147-k8s-calico--apiserver--7d55cd4957--hw5ks-eth0" Sep 12 17:43:41.647982 containerd[2001]: 2025-09-12 17:43:40.969 [INFO][5448] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="14f59ea5b75821eeb6c6d4aa87c044ead2b934ded80b3129c700b0a9595243de" HandleID="k8s-pod-network.14f59ea5b75821eeb6c6d4aa87c044ead2b934ded80b3129c700b0a9595243de" Workload="ip--172--31--17--147-k8s-calico--apiserver--7d55cd4957--hw5ks-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00031c080), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ip-172-31-17-147", "pod":"calico-apiserver-7d55cd4957-hw5ks", "timestamp":"2025-09-12 17:43:40.9685569 +0000 UTC"}, Hostname:"ip-172-31-17-147", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:43:41.647982 containerd[2001]: 2025-09-12 17:43:40.969 [INFO][5448] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:43:41.647982 containerd[2001]: 2025-09-12 17:43:41.311 [INFO][5448] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:43:41.647982 containerd[2001]: 2025-09-12 17:43:41.311 [INFO][5448] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-17-147' Sep 12 17:43:41.647982 containerd[2001]: 2025-09-12 17:43:41.344 [INFO][5448] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.14f59ea5b75821eeb6c6d4aa87c044ead2b934ded80b3129c700b0a9595243de" host="ip-172-31-17-147" Sep 12 17:43:41.647982 containerd[2001]: 2025-09-12 17:43:41.411 [INFO][5448] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-17-147" Sep 12 17:43:41.647982 containerd[2001]: 2025-09-12 17:43:41.441 [INFO][5448] ipam/ipam.go 511: Trying affinity for 192.168.89.128/26 host="ip-172-31-17-147" Sep 12 17:43:41.647982 containerd[2001]: 2025-09-12 17:43:41.454 [INFO][5448] ipam/ipam.go 158: Attempting to load block cidr=192.168.89.128/26 host="ip-172-31-17-147" Sep 12 17:43:41.647982 containerd[2001]: 2025-09-12 17:43:41.470 [INFO][5448] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.89.128/26 host="ip-172-31-17-147" Sep 12 17:43:41.647982 containerd[2001]: 2025-09-12 17:43:41.471 [INFO][5448] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.89.128/26 handle="k8s-pod-network.14f59ea5b75821eeb6c6d4aa87c044ead2b934ded80b3129c700b0a9595243de" host="ip-172-31-17-147" Sep 12 17:43:41.647982 containerd[2001]: 2025-09-12 17:43:41.475 [INFO][5448] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.14f59ea5b75821eeb6c6d4aa87c044ead2b934ded80b3129c700b0a9595243de Sep 12 17:43:41.647982 containerd[2001]: 2025-09-12 17:43:41.489 [INFO][5448] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.89.128/26 handle="k8s-pod-network.14f59ea5b75821eeb6c6d4aa87c044ead2b934ded80b3129c700b0a9595243de" host="ip-172-31-17-147" Sep 12 17:43:41.647982 containerd[2001]: 2025-09-12 17:43:41.508 [INFO][5448] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.89.136/26] block=192.168.89.128/26 handle="k8s-pod-network.14f59ea5b75821eeb6c6d4aa87c044ead2b934ded80b3129c700b0a9595243de" host="ip-172-31-17-147" Sep 12 17:43:41.647982 containerd[2001]: 2025-09-12 17:43:41.509 [INFO][5448] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.89.136/26] handle="k8s-pod-network.14f59ea5b75821eeb6c6d4aa87c044ead2b934ded80b3129c700b0a9595243de" host="ip-172-31-17-147" Sep 12 17:43:41.647982 containerd[2001]: 2025-09-12 17:43:41.509 [INFO][5448] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 17:43:41.647982 containerd[2001]: 2025-09-12 17:43:41.509 [INFO][5448] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.89.136/26] IPv6=[] ContainerID="14f59ea5b75821eeb6c6d4aa87c044ead2b934ded80b3129c700b0a9595243de" HandleID="k8s-pod-network.14f59ea5b75821eeb6c6d4aa87c044ead2b934ded80b3129c700b0a9595243de" Workload="ip--172--31--17--147-k8s-calico--apiserver--7d55cd4957--hw5ks-eth0" Sep 12 17:43:41.652157 containerd[2001]: 2025-09-12 17:43:41.528 [INFO][5394] cni-plugin/k8s.go 418: Populated endpoint ContainerID="14f59ea5b75821eeb6c6d4aa87c044ead2b934ded80b3129c700b0a9595243de" Namespace="calico-apiserver" Pod="calico-apiserver-7d55cd4957-hw5ks" WorkloadEndpoint="ip--172--31--17--147-k8s-calico--apiserver--7d55cd4957--hw5ks-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--17--147-k8s-calico--apiserver--7d55cd4957--hw5ks-eth0", GenerateName:"calico-apiserver-7d55cd4957-", Namespace:"calico-apiserver", SelfLink:"", UID:"675ef9e0-ef8b-4d35-995f-611dbe700deb", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 43, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7d55cd4957", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-17-147", ContainerID:"", Pod:"calico-apiserver-7d55cd4957-hw5ks", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.89.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali966eedafe7e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:43:41.652157 containerd[2001]: 2025-09-12 17:43:41.531 [INFO][5394] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.89.136/32] ContainerID="14f59ea5b75821eeb6c6d4aa87c044ead2b934ded80b3129c700b0a9595243de" Namespace="calico-apiserver" Pod="calico-apiserver-7d55cd4957-hw5ks" WorkloadEndpoint="ip--172--31--17--147-k8s-calico--apiserver--7d55cd4957--hw5ks-eth0" Sep 12 17:43:41.652157 containerd[2001]: 2025-09-12 17:43:41.532 [INFO][5394] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali966eedafe7e ContainerID="14f59ea5b75821eeb6c6d4aa87c044ead2b934ded80b3129c700b0a9595243de" Namespace="calico-apiserver" Pod="calico-apiserver-7d55cd4957-hw5ks" WorkloadEndpoint="ip--172--31--17--147-k8s-calico--apiserver--7d55cd4957--hw5ks-eth0" Sep 12 17:43:41.652157 containerd[2001]: 2025-09-12 17:43:41.573 [INFO][5394] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="14f59ea5b75821eeb6c6d4aa87c044ead2b934ded80b3129c700b0a9595243de" Namespace="calico-apiserver" Pod="calico-apiserver-7d55cd4957-hw5ks" WorkloadEndpoint="ip--172--31--17--147-k8s-calico--apiserver--7d55cd4957--hw5ks-eth0" Sep 12 17:43:41.652157 containerd[2001]: 2025-09-12 17:43:41.576 [INFO][5394] cni-plugin/k8s.go 446: Added Mac, 
interface name, and active container ID to endpoint ContainerID="14f59ea5b75821eeb6c6d4aa87c044ead2b934ded80b3129c700b0a9595243de" Namespace="calico-apiserver" Pod="calico-apiserver-7d55cd4957-hw5ks" WorkloadEndpoint="ip--172--31--17--147-k8s-calico--apiserver--7d55cd4957--hw5ks-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--17--147-k8s-calico--apiserver--7d55cd4957--hw5ks-eth0", GenerateName:"calico-apiserver-7d55cd4957-", Namespace:"calico-apiserver", SelfLink:"", UID:"675ef9e0-ef8b-4d35-995f-611dbe700deb", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 43, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7d55cd4957", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-17-147", ContainerID:"14f59ea5b75821eeb6c6d4aa87c044ead2b934ded80b3129c700b0a9595243de", Pod:"calico-apiserver-7d55cd4957-hw5ks", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.89.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali966eedafe7e", MAC:"2e:46:77:2c:aa:42", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:43:41.652157 containerd[2001]: 2025-09-12 17:43:41.605 [INFO][5394] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="14f59ea5b75821eeb6c6d4aa87c044ead2b934ded80b3129c700b0a9595243de" Namespace="calico-apiserver" Pod="calico-apiserver-7d55cd4957-hw5ks" WorkloadEndpoint="ip--172--31--17--147-k8s-calico--apiserver--7d55cd4957--hw5ks-eth0" Sep 12 17:43:41.691673 containerd[2001]: time="2025-09-12T17:43:41.691493937Z" level=info msg="connecting to shim a86dd12f666a17e0b23b6acfc09055c822180fb25ccbe010092512686b7d1362" address="unix:///run/containerd/s/3c66b3700e00847ab90d62a1a4d12c0c543763488c246ab978da6759dee2b2f4" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:43:41.800999 systemd[1]: Started cri-containerd-a86dd12f666a17e0b23b6acfc09055c822180fb25ccbe010092512686b7d1362.scope - libcontainer container a86dd12f666a17e0b23b6acfc09055c822180fb25ccbe010092512686b7d1362. 
Sep 12 17:43:41.841332 containerd[2001]: time="2025-09-12T17:43:41.841253120Z" level=info msg="connecting to shim 14f59ea5b75821eeb6c6d4aa87c044ead2b934ded80b3129c700b0a9595243de" address="unix:///run/containerd/s/d1b95a4bbf84cc0ec9ca0a2f622490d2a1f8567d4e70341ad0e5818de4280f98" namespace=k8s.io protocol=ttrpc version=3
Sep 12 17:43:41.858645 containerd[2001]: time="2025-09-12T17:43:41.858589987Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-kfp5q,Uid:37d43a87-2d60-4e31-bcad-4f3417a16039,Namespace:kube-system,Attempt:0,} returns sandbox id \"42c88128475d07f28848c4613c540bf930edd612c70daf69e5d4fc981d3fee9d\""
Sep 12 17:43:41.892211 systemd[1]: Started cri-containerd-14f59ea5b75821eeb6c6d4aa87c044ead2b934ded80b3129c700b0a9595243de.scope - libcontainer container 14f59ea5b75821eeb6c6d4aa87c044ead2b934ded80b3129c700b0a9595243de.
Sep 12 17:43:41.897930 containerd[2001]: time="2025-09-12T17:43:41.897887045Z" level=info msg="CreateContainer within sandbox \"42c88128475d07f28848c4613c540bf930edd612c70daf69e5d4fc981d3fee9d\" for container &ContainerMetadata{Name:coredns,Attempt:0,}"
Sep 12 17:43:42.030313 containerd[2001]: time="2025-09-12T17:43:42.029568059Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-khsdd,Uid:e23da1d8-1bca-4b44-9e73-fa38bacfb830,Namespace:kube-system,Attempt:0,} returns sandbox id \"a86dd12f666a17e0b23b6acfc09055c822180fb25ccbe010092512686b7d1362\""
Sep 12 17:43:42.035935 containerd[2001]: time="2025-09-12T17:43:42.035366480Z" level=info msg="Container cf35cfad4a258990faf1a2574d004d02cf1020c0b11bc5b749a40e4a1b597dd3: CDI devices from CRI Config.CDIDevices: []"
Sep 12 17:43:42.053539 containerd[2001]: time="2025-09-12T17:43:42.049766334Z" level=info msg="CreateContainer within sandbox \"a86dd12f666a17e0b23b6acfc09055c822180fb25ccbe010092512686b7d1362\" for container &ContainerMetadata{Name:coredns,Attempt:0,}"
Sep 12 17:43:42.056256 containerd[2001]: time="2025-09-12T17:43:42.056215631Z" level=info msg="CreateContainer within sandbox \"42c88128475d07f28848c4613c540bf930edd612c70daf69e5d4fc981d3fee9d\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"cf35cfad4a258990faf1a2574d004d02cf1020c0b11bc5b749a40e4a1b597dd3\""
Sep 12 17:43:42.061195 containerd[2001]: time="2025-09-12T17:43:42.061148033Z" level=info msg="StartContainer for \"cf35cfad4a258990faf1a2574d004d02cf1020c0b11bc5b749a40e4a1b597dd3\""
Sep 12 17:43:42.064297 containerd[2001]: time="2025-09-12T17:43:42.064133028Z" level=info msg="connecting to shim cf35cfad4a258990faf1a2574d004d02cf1020c0b11bc5b749a40e4a1b597dd3" address="unix:///run/containerd/s/7dda8439fe6ab681bd1a085c4eaf8c362d7aa9f5f7ecb69654beab044154e282" protocol=ttrpc version=3
Sep 12 17:43:42.084249 containerd[2001]: time="2025-09-12T17:43:42.084113065Z" level=info msg="Container ed4a84196ebe13fd8a80a3a69aa06809a0b580c73c23e60cd16df473826932c4: CDI devices from CRI Config.CDIDevices: []"
Sep 12 17:43:42.096153 containerd[2001]: time="2025-09-12T17:43:42.096097548Z" level=info msg="CreateContainer within sandbox \"a86dd12f666a17e0b23b6acfc09055c822180fb25ccbe010092512686b7d1362\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"ed4a84196ebe13fd8a80a3a69aa06809a0b580c73c23e60cd16df473826932c4\""
Sep 12 17:43:42.100573 containerd[2001]: time="2025-09-12T17:43:42.100526156Z" level=info msg="StartContainer for \"ed4a84196ebe13fd8a80a3a69aa06809a0b580c73c23e60cd16df473826932c4\""
Sep 12 17:43:42.109144 containerd[2001]: time="2025-09-12T17:43:42.109041876Z" level=info msg="connecting to shim ed4a84196ebe13fd8a80a3a69aa06809a0b580c73c23e60cd16df473826932c4" address="unix:///run/containerd/s/3c66b3700e00847ab90d62a1a4d12c0c543763488c246ab978da6759dee2b2f4" protocol=ttrpc version=3
Sep 12 17:43:42.126043 systemd[1]: Started cri-containerd-cf35cfad4a258990faf1a2574d004d02cf1020c0b11bc5b749a40e4a1b597dd3.scope - libcontainer container cf35cfad4a258990faf1a2574d004d02cf1020c0b11bc5b749a40e4a1b597dd3.
Sep 12 17:43:42.159686 containerd[2001]: time="2025-09-12T17:43:42.159640460Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d55cd4957-hw5ks,Uid:675ef9e0-ef8b-4d35-995f-611dbe700deb,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"14f59ea5b75821eeb6c6d4aa87c044ead2b934ded80b3129c700b0a9595243de\""
Sep 12 17:43:42.201280 systemd[1]: Started cri-containerd-ed4a84196ebe13fd8a80a3a69aa06809a0b580c73c23e60cd16df473826932c4.scope - libcontainer container ed4a84196ebe13fd8a80a3a69aa06809a0b580c73c23e60cd16df473826932c4.
Sep 12 17:43:42.274244 containerd[2001]: time="2025-09-12T17:43:42.274179217Z" level=info msg="StartContainer for \"cf35cfad4a258990faf1a2574d004d02cf1020c0b11bc5b749a40e4a1b597dd3\" returns successfully"
Sep 12 17:43:42.285031 containerd[2001]: time="2025-09-12T17:43:42.284908010Z" level=info msg="StartContainer for \"ed4a84196ebe13fd8a80a3a69aa06809a0b580c73c23e60cd16df473826932c4\" returns successfully"
Sep 12 17:43:42.488045 systemd-networkd[1863]: vxlan.calico: Link UP
Sep 12 17:43:42.488055 systemd-networkd[1863]: vxlan.calico: Gained carrier
Sep 12 17:43:42.864926 systemd-networkd[1863]: cali966eedafe7e: Gained IPv6LL
Sep 12 17:43:43.089658 kubelet[3324]: I0912 17:43:43.089338 3324 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-kfp5q" podStartSLOduration=52.089310088 podStartE2EDuration="52.089310088s" podCreationTimestamp="2025-09-12 17:42:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:43:43.072124681 +0000 UTC m=+57.839021779" watchObservedRunningTime="2025-09-12 17:43:43.089310088 +0000 UTC m=+57.856207183"
Sep 12 17:43:43.120994 systemd-networkd[1863]: calic61f63cbdac: Gained IPv6LL
Sep 12 17:43:43.313432 systemd-networkd[1863]: calib2ca56d3bb7: Gained IPv6LL
Sep 12 17:43:43.650378 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2919591082.mount: Deactivated successfully.
Sep 12 17:43:43.824999 systemd-networkd[1863]: vxlan.calico: Gained IPv6LL
Sep 12 17:43:44.048449 kubelet[3324]: I0912 17:43:44.046703 3324 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-khsdd" podStartSLOduration=53.046680071 podStartE2EDuration="53.046680071s" podCreationTimestamp="2025-09-12 17:42:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:43:43.11102146 +0000 UTC m=+57.877918559" watchObservedRunningTime="2025-09-12 17:43:44.046680071 +0000 UTC m=+58.813577169"
Sep 12 17:43:44.818476 containerd[2001]: time="2025-09-12T17:43:44.818429180Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526"
Sep 12 17:43:44.934020 containerd[2001]: time="2025-09-12T17:43:44.933979388Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:43:44.967851 containerd[2001]: time="2025-09-12T17:43:44.967788518Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:43:44.970519 containerd[2001]: time="2025-09-12T17:43:44.970462317Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:43:44.977948 containerd[2001]: time="2025-09-12T17:43:44.977893733Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 5.847366092s"
Sep 12 17:43:44.977948 containerd[2001]: time="2025-09-12T17:43:44.977949174Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\""
Sep 12 17:43:44.990922 containerd[2001]: time="2025-09-12T17:43:44.990857814Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\""
Sep 12 17:43:45.048825 containerd[2001]: time="2025-09-12T17:43:45.048133943Z" level=info msg="CreateContainer within sandbox \"a06ac280d67e0a5d81c4d247c482298db680cc5649cae68c5e7d93dab0ff4d47\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}"
Sep 12 17:43:45.072900 containerd[2001]: time="2025-09-12T17:43:45.071649713Z" level=info msg="Container 030ac0d655eb7e91566bf9c418a3bedda07311bb799232948ccb88a26f9050de: CDI devices from CRI Config.CDIDevices: []"
Sep 12 17:43:45.082730 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1852622049.mount: Deactivated successfully.
Sep 12 17:43:45.108754 containerd[2001]: time="2025-09-12T17:43:45.108701231Z" level=info msg="CreateContainer within sandbox \"a06ac280d67e0a5d81c4d247c482298db680cc5649cae68c5e7d93dab0ff4d47\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"030ac0d655eb7e91566bf9c418a3bedda07311bb799232948ccb88a26f9050de\""
Sep 12 17:43:45.111744 containerd[2001]: time="2025-09-12T17:43:45.111694707Z" level=info msg="StartContainer for \"030ac0d655eb7e91566bf9c418a3bedda07311bb799232948ccb88a26f9050de\""
Sep 12 17:43:45.116033 containerd[2001]: time="2025-09-12T17:43:45.115986591Z" level=info msg="connecting to shim 030ac0d655eb7e91566bf9c418a3bedda07311bb799232948ccb88a26f9050de" address="unix:///run/containerd/s/dbe10de209a27fd5ed8fbb8a04e29353e325ad6b59badf4295d2f934ebd86096" protocol=ttrpc version=3
Sep 12 17:43:45.190060 systemd[1]: Started cri-containerd-030ac0d655eb7e91566bf9c418a3bedda07311bb799232948ccb88a26f9050de.scope - libcontainer container 030ac0d655eb7e91566bf9c418a3bedda07311bb799232948ccb88a26f9050de.
Sep 12 17:43:45.276392 containerd[2001]: time="2025-09-12T17:43:45.276343837Z" level=info msg="StartContainer for \"030ac0d655eb7e91566bf9c418a3bedda07311bb799232948ccb88a26f9050de\" returns successfully"
Sep 12 17:43:46.129951 ntpd[1974]: Listen normally on 8 vxlan.calico 192.168.89.128:123
Sep 12 17:43:46.131974 ntpd[1974]: 12 Sep 17:43:46 ntpd[1974]: Listen normally on 8 vxlan.calico 192.168.89.128:123
Sep 12 17:43:46.131974 ntpd[1974]: 12 Sep 17:43:46 ntpd[1974]: Listen normally on 9 cali8f4ca90f246 [fe80::ecee:eeff:feee:eeee%4]:123
Sep 12 17:43:46.131974 ntpd[1974]: 12 Sep 17:43:46 ntpd[1974]: Listen normally on 10 calid0d4a7e7c93 [fe80::ecee:eeff:feee:eeee%5]:123
Sep 12 17:43:46.131974 ntpd[1974]: 12 Sep 17:43:46 ntpd[1974]: Listen normally on 11 cali123a990e1b6 [fe80::ecee:eeff:feee:eeee%6]:123
Sep 12 17:43:46.131974 ntpd[1974]: 12 Sep 17:43:46 ntpd[1974]: Listen normally on 12 calif1d87bdfda1 [fe80::ecee:eeff:feee:eeee%7]:123
Sep 12 17:43:46.131974 ntpd[1974]: 12 Sep 17:43:46 ntpd[1974]: Listen normally on 13 cali61ce5cabd16 [fe80::ecee:eeff:feee:eeee%8]:123
Sep 12 17:43:46.131974 ntpd[1974]: 12 Sep 17:43:46 ntpd[1974]: Listen normally on 14 calic61f63cbdac [fe80::ecee:eeff:feee:eeee%9]:123
Sep 12 17:43:46.131974 ntpd[1974]: 12 Sep 17:43:46 ntpd[1974]: Listen normally on 15 calib2ca56d3bb7 [fe80::ecee:eeff:feee:eeee%10]:123
Sep 12 17:43:46.131974 ntpd[1974]: 12 Sep 17:43:46 ntpd[1974]: Listen normally on 16 cali966eedafe7e [fe80::ecee:eeff:feee:eeee%11]:123
Sep 12 17:43:46.131974 ntpd[1974]: 12 Sep 17:43:46 ntpd[1974]: Listen normally on 17 vxlan.calico [fe80::6450:33ff:fe46:bf5f%12]:123
Sep 12 17:43:46.130022 ntpd[1974]: Listen normally on 9 cali8f4ca90f246 [fe80::ecee:eeff:feee:eeee%4]:123
Sep 12 17:43:46.130262 ntpd[1974]: Listen normally on 10 calid0d4a7e7c93 [fe80::ecee:eeff:feee:eeee%5]:123
Sep 12 17:43:46.130300 ntpd[1974]: Listen normally on 11 cali123a990e1b6 [fe80::ecee:eeff:feee:eeee%6]:123
Sep 12 17:43:46.130334 ntpd[1974]: Listen normally on 12 calif1d87bdfda1 [fe80::ecee:eeff:feee:eeee%7]:123
Sep 12 17:43:46.130360 ntpd[1974]: Listen normally on 13 cali61ce5cabd16 [fe80::ecee:eeff:feee:eeee%8]:123
Sep 12 17:43:46.130385 ntpd[1974]: Listen normally on 14 calic61f63cbdac [fe80::ecee:eeff:feee:eeee%9]:123
Sep 12 17:43:46.130410 ntpd[1974]: Listen normally on 15 calib2ca56d3bb7 [fe80::ecee:eeff:feee:eeee%10]:123
Sep 12 17:43:46.130437 ntpd[1974]: Listen normally on 16 cali966eedafe7e [fe80::ecee:eeff:feee:eeee%11]:123
Sep 12 17:43:46.130464 ntpd[1974]: Listen normally on 17 vxlan.calico [fe80::6450:33ff:fe46:bf5f%12]:123
Sep 12 17:43:46.139716 kubelet[3324]: I0912 17:43:46.139653 3324 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-7988f88666-9hfk5" podStartSLOduration=28.117141888 podStartE2EDuration="35.139175656s" podCreationTimestamp="2025-09-12 17:43:11 +0000 UTC" firstStartedPulling="2025-09-12 17:43:37.957275401 +0000 UTC m=+52.724172478" lastFinishedPulling="2025-09-12 17:43:44.979309167 +0000 UTC m=+59.746206246" observedRunningTime="2025-09-12 17:43:46.138714371 +0000 UTC m=+60.905611461" watchObservedRunningTime="2025-09-12 17:43:46.139175656 +0000 UTC m=+60.906072753"
Sep 12 17:43:46.209640 systemd[1]: Started sshd@10-172.31.17.147:22-139.178.68.195:49488.service - OpenSSH per-connection server daemon (139.178.68.195:49488).
Sep 12 17:43:46.466862 sshd[5831]: Accepted publickey for core from 139.178.68.195 port 49488 ssh2: RSA SHA256:W4YS5mF+BRASipQdoc83aYiclsJC9j/B8FPkaJHVZC4
Sep 12 17:43:46.469065 sshd-session[5831]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:43:46.475945 systemd-logind[1987]: New session 11 of user core.
Sep 12 17:43:46.483217 systemd[1]: Started session-11.scope - Session 11 of User core.
Sep 12 17:43:47.266505 containerd[2001]: time="2025-09-12T17:43:47.266467684Z" level=info msg="TaskExit event in podsandbox handler container_id:\"030ac0d655eb7e91566bf9c418a3bedda07311bb799232948ccb88a26f9050de\" id:\"2879b920f9d111ab454b39285ba342fab4da8575442eebb8be5e69a9978743e1\" pid:5854 exited_at:{seconds:1757699027 nanos:265859593}"
Sep 12 17:43:47.437030 sshd[5834]: Connection closed by 139.178.68.195 port 49488
Sep 12 17:43:47.437846 sshd-session[5831]: pam_unix(sshd:session): session closed for user core
Sep 12 17:43:47.442317 systemd-logind[1987]: Session 11 logged out. Waiting for processes to exit.
Sep 12 17:43:47.442472 systemd[1]: sshd@10-172.31.17.147:22-139.178.68.195:49488.service: Deactivated successfully.
Sep 12 17:43:47.444738 systemd[1]: session-11.scope: Deactivated successfully.
Sep 12 17:43:47.446821 systemd-logind[1987]: Removed session 11.
Sep 12 17:43:47.918387 containerd[2001]: time="2025-09-12T17:43:47.918245055Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:43:47.920060 containerd[2001]: time="2025-09-12T17:43:47.919933812Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527"
Sep 12 17:43:47.923973 containerd[2001]: time="2025-09-12T17:43:47.923925574Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:43:47.926107 containerd[2001]: time="2025-09-12T17:43:47.926049400Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:43:47.926763 containerd[2001]: time="2025-09-12T17:43:47.926728227Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 2.935812061s"
Sep 12 17:43:47.926945 containerd[2001]: time="2025-09-12T17:43:47.926924733Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\""
Sep 12 17:43:47.928328 containerd[2001]: time="2025-09-12T17:43:47.928279661Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\""
Sep 12 17:43:47.931008 containerd[2001]: time="2025-09-12T17:43:47.930960035Z" level=info msg="CreateContainer within sandbox \"692b9424732cf2d21aecfbcb8fe6aaf81f23b957c87f091155e86baf5d7514f8\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}"
Sep 12 17:43:47.970808 containerd[2001]: time="2025-09-12T17:43:47.969379060Z" level=info msg="Container 0b85623e4e94acf5c54e931087911fd1f34645b390c27885f92d024a0648b1ab: CDI devices from CRI Config.CDIDevices: []"
Sep 12 17:43:48.023403 containerd[2001]: time="2025-09-12T17:43:48.023089065Z" level=info msg="CreateContainer within sandbox \"692b9424732cf2d21aecfbcb8fe6aaf81f23b957c87f091155e86baf5d7514f8\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"0b85623e4e94acf5c54e931087911fd1f34645b390c27885f92d024a0648b1ab\""
Sep 12 17:43:48.024155 containerd[2001]: time="2025-09-12T17:43:48.024111459Z" level=info msg="StartContainer for \"0b85623e4e94acf5c54e931087911fd1f34645b390c27885f92d024a0648b1ab\""
Sep 12 17:43:48.025810 containerd[2001]: time="2025-09-12T17:43:48.025740435Z" level=info msg="connecting to shim 0b85623e4e94acf5c54e931087911fd1f34645b390c27885f92d024a0648b1ab" address="unix:///run/containerd/s/32a496da959e83257bbf3512fa9e82745c6984d6212d4ba9b42596d420b888da" protocol=ttrpc version=3
Sep 12 17:43:48.063290 systemd[1]: Started cri-containerd-0b85623e4e94acf5c54e931087911fd1f34645b390c27885f92d024a0648b1ab.scope - libcontainer container 0b85623e4e94acf5c54e931087911fd1f34645b390c27885f92d024a0648b1ab.
Sep 12 17:43:48.129734 containerd[2001]: time="2025-09-12T17:43:48.129662614Z" level=info msg="StartContainer for \"0b85623e4e94acf5c54e931087911fd1f34645b390c27885f92d024a0648b1ab\" returns successfully"
Sep 12 17:43:50.425369 containerd[2001]: time="2025-09-12T17:43:50.425302941Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:43:50.426579 containerd[2001]: time="2025-09-12T17:43:50.426222108Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864"
Sep 12 17:43:50.427920 containerd[2001]: time="2025-09-12T17:43:50.427858774Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:43:50.431004 containerd[2001]: time="2025-09-12T17:43:50.430346969Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:43:50.431004 containerd[2001]: time="2025-09-12T17:43:50.430876732Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 2.502542408s"
Sep 12 17:43:50.431004 containerd[2001]: time="2025-09-12T17:43:50.430903789Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\""
Sep 12 17:43:50.432806 containerd[2001]: time="2025-09-12T17:43:50.432758293Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\""
Sep 12 17:43:50.437459 containerd[2001]: time="2025-09-12T17:43:50.437425491Z" level=info msg="CreateContainer within sandbox \"6e037df9d6e5939375bc5ff7260e41dba045ec436089247796695e16058f4581\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Sep 12 17:43:50.452038 containerd[2001]: time="2025-09-12T17:43:50.451998413Z" level=info msg="Container acef1765bc831912dfb528c17f1efbabb15ade7c76e5d5a45c8ed2cf3dfbdafe: CDI devices from CRI Config.CDIDevices: []"
Sep 12 17:43:50.463448 containerd[2001]: time="2025-09-12T17:43:50.463346485Z" level=info msg="CreateContainer within sandbox \"6e037df9d6e5939375bc5ff7260e41dba045ec436089247796695e16058f4581\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"acef1765bc831912dfb528c17f1efbabb15ade7c76e5d5a45c8ed2cf3dfbdafe\""
Sep 12 17:43:50.464381 containerd[2001]: time="2025-09-12T17:43:50.464354310Z" level=info msg="StartContainer for \"acef1765bc831912dfb528c17f1efbabb15ade7c76e5d5a45c8ed2cf3dfbdafe\""
Sep 12 17:43:50.465934 containerd[2001]: time="2025-09-12T17:43:50.465806825Z" level=info msg="connecting to shim acef1765bc831912dfb528c17f1efbabb15ade7c76e5d5a45c8ed2cf3dfbdafe" address="unix:///run/containerd/s/d29be2ac35a4f9f5abc54d1234e99ba9b88f7bee5062f5260c33f486f79c0ab0" protocol=ttrpc version=3
Sep 12 17:43:50.514065 systemd[1]: Started cri-containerd-acef1765bc831912dfb528c17f1efbabb15ade7c76e5d5a45c8ed2cf3dfbdafe.scope - libcontainer container acef1765bc831912dfb528c17f1efbabb15ade7c76e5d5a45c8ed2cf3dfbdafe.
Sep 12 17:43:50.575700 containerd[2001]: time="2025-09-12T17:43:50.575650986Z" level=info msg="StartContainer for \"acef1765bc831912dfb528c17f1efbabb15ade7c76e5d5a45c8ed2cf3dfbdafe\" returns successfully"
Sep 12 17:43:51.164262 kubelet[3324]: I0912 17:43:51.163407 3324 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7d55cd4957-jrtsh" podStartSLOduration=33.303784878 podStartE2EDuration="44.16338617s" podCreationTimestamp="2025-09-12 17:43:07 +0000 UTC" firstStartedPulling="2025-09-12 17:43:39.572993798 +0000 UTC m=+54.339890889" lastFinishedPulling="2025-09-12 17:43:50.432595087 +0000 UTC m=+65.199492181" observedRunningTime="2025-09-12 17:43:51.162943347 +0000 UTC m=+65.929840448" watchObservedRunningTime="2025-09-12 17:43:51.16338617 +0000 UTC m=+65.930283273"
Sep 12 17:43:52.474946 systemd[1]: Started sshd@11-172.31.17.147:22-139.178.68.195:51682.service - OpenSSH per-connection server daemon (139.178.68.195:51682).
Sep 12 17:43:52.745548 sshd[5962]: Accepted publickey for core from 139.178.68.195 port 51682 ssh2: RSA SHA256:W4YS5mF+BRASipQdoc83aYiclsJC9j/B8FPkaJHVZC4
Sep 12 17:43:52.748799 sshd-session[5962]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:43:52.756390 systemd-logind[1987]: New session 12 of user core.
Sep 12 17:43:52.762183 systemd[1]: Started session-12.scope - Session 12 of User core.
Sep 12 17:43:54.527085 sshd[5965]: Connection closed by 139.178.68.195 port 51682
Sep 12 17:43:54.527530 sshd-session[5962]: pam_unix(sshd:session): session closed for user core
Sep 12 17:43:54.550564 systemd[1]: sshd@11-172.31.17.147:22-139.178.68.195:51682.service: Deactivated successfully.
Sep 12 17:43:54.561165 systemd[1]: session-12.scope: Deactivated successfully.
Sep 12 17:43:54.575551 systemd-logind[1987]: Session 12 logged out. Waiting for processes to exit.
Sep 12 17:43:54.596874 systemd[1]: Started sshd@12-172.31.17.147:22-139.178.68.195:51686.service - OpenSSH per-connection server daemon (139.178.68.195:51686).
Sep 12 17:43:54.604103 systemd-logind[1987]: Removed session 12.
Sep 12 17:43:54.873321 sshd[6015]: Accepted publickey for core from 139.178.68.195 port 51686 ssh2: RSA SHA256:W4YS5mF+BRASipQdoc83aYiclsJC9j/B8FPkaJHVZC4
Sep 12 17:43:54.881684 sshd-session[6015]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:43:54.896949 systemd-logind[1987]: New session 13 of user core.
Sep 12 17:43:54.899231 systemd[1]: Started session-13.scope - Session 13 of User core.
Sep 12 17:43:55.585909 containerd[2001]: time="2025-09-12T17:43:55.585823104Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:43:55.592240 containerd[2001]: time="2025-09-12T17:43:55.591701435Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746"
Sep 12 17:43:55.610160 containerd[2001]: time="2025-09-12T17:43:55.609699257Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:43:55.615498 containerd[2001]: time="2025-09-12T17:43:55.615235880Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:43:55.618247 containerd[2001]: time="2025-09-12T17:43:55.618195873Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 5.184310437s"
Sep 12 17:43:55.618386 containerd[2001]: time="2025-09-12T17:43:55.618270183Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\""
Sep 12 17:43:55.690756 containerd[2001]: time="2025-09-12T17:43:55.690519926Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\""
Sep 12 17:43:55.755438 sshd[6018]: Connection closed by 139.178.68.195 port 51686
Sep 12 17:43:55.762134 sshd-session[6015]: pam_unix(sshd:session): session closed for user core
Sep 12 17:43:55.774353 systemd[1]: sshd@12-172.31.17.147:22-139.178.68.195:51686.service: Deactivated successfully.
Sep 12 17:43:55.788259 systemd[1]: session-13.scope: Deactivated successfully.
Sep 12 17:43:55.796245 systemd-logind[1987]: Session 13 logged out. Waiting for processes to exit.
Sep 12 17:43:55.839117 systemd[1]: Started sshd@13-172.31.17.147:22-139.178.68.195:51694.service - OpenSSH per-connection server daemon (139.178.68.195:51694).
Sep 12 17:43:55.860883 systemd-logind[1987]: Removed session 13.
Sep 12 17:43:55.879468 containerd[2001]: time="2025-09-12T17:43:55.879403212Z" level=info msg="CreateContainer within sandbox \"8a0e8db4f7fd39beddc043d5fc91da6cfbb2a22951d0ec021fd33f56aa0b3253\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 12 17:43:55.931120 containerd[2001]: time="2025-09-12T17:43:55.931036371Z" level=info msg="Container 95ae48cac8ff21647f96065a860f7c5c86512daf0257cffebc84f37a95add937: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:43:55.960739 containerd[2001]: time="2025-09-12T17:43:55.960694751Z" level=info msg="CreateContainer within sandbox \"8a0e8db4f7fd39beddc043d5fc91da6cfbb2a22951d0ec021fd33f56aa0b3253\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"95ae48cac8ff21647f96065a860f7c5c86512daf0257cffebc84f37a95add937\"" Sep 12 17:43:55.962661 containerd[2001]: time="2025-09-12T17:43:55.962627217Z" level=info msg="StartContainer for \"95ae48cac8ff21647f96065a860f7c5c86512daf0257cffebc84f37a95add937\"" Sep 12 17:43:55.966416 containerd[2001]: time="2025-09-12T17:43:55.966379402Z" level=info msg="connecting to shim 95ae48cac8ff21647f96065a860f7c5c86512daf0257cffebc84f37a95add937" address="unix:///run/containerd/s/01ab5d1bd4eb078c8baa0e7e7f6eb2a7752568ea8df3c85539b650b0008648d2" protocol=ttrpc version=3 Sep 12 17:43:56.003564 systemd[1]: Started cri-containerd-95ae48cac8ff21647f96065a860f7c5c86512daf0257cffebc84f37a95add937.scope - libcontainer container 95ae48cac8ff21647f96065a860f7c5c86512daf0257cffebc84f37a95add937. Sep 12 17:43:56.127467 sshd[6033]: Accepted publickey for core from 139.178.68.195 port 51694 ssh2: RSA SHA256:W4YS5mF+BRASipQdoc83aYiclsJC9j/B8FPkaJHVZC4 Sep 12 17:43:56.139466 sshd-session[6033]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:43:56.156850 systemd-logind[1987]: New session 14 of user core. Sep 12 17:43:56.162034 systemd[1]: Started session-14.scope - Session 14 of User core. 
Sep 12 17:43:56.195677 containerd[2001]: time="2025-09-12T17:43:56.195634082Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:56.195978 containerd[2001]: time="2025-09-12T17:43:56.195951959Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 12 17:43:56.199418 containerd[2001]: time="2025-09-12T17:43:56.199289174Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 508.725227ms" Sep 12 17:43:56.207448 containerd[2001]: time="2025-09-12T17:43:56.199591262Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 12 17:43:56.208320 containerd[2001]: time="2025-09-12T17:43:56.208207065Z" level=info msg="StartContainer for \"95ae48cac8ff21647f96065a860f7c5c86512daf0257cffebc84f37a95add937\" returns successfully" Sep 12 17:43:56.211206 containerd[2001]: time="2025-09-12T17:43:56.211168567Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 12 17:43:56.218094 containerd[2001]: time="2025-09-12T17:43:56.218053808Z" level=info msg="CreateContainer within sandbox \"14f59ea5b75821eeb6c6d4aa87c044ead2b934ded80b3129c700b0a9595243de\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 17:43:56.234052 containerd[2001]: time="2025-09-12T17:43:56.234011697Z" level=info msg="Container 12282cd03bb06bd638ce147935ed8f90fb42f52a43310e0b810188eb86cff768: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:43:56.262383 containerd[2001]: time="2025-09-12T17:43:56.262330708Z" level=info msg="CreateContainer within sandbox \"14f59ea5b75821eeb6c6d4aa87c044ead2b934ded80b3129c700b0a9595243de\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"12282cd03bb06bd638ce147935ed8f90fb42f52a43310e0b810188eb86cff768\"" Sep 12 17:43:56.264435 containerd[2001]: time="2025-09-12T17:43:56.264399578Z" level=info msg="StartContainer for \"12282cd03bb06bd638ce147935ed8f90fb42f52a43310e0b810188eb86cff768\"" Sep 12 17:43:56.267548 containerd[2001]: time="2025-09-12T17:43:56.267038432Z" level=info msg="connecting to shim 12282cd03bb06bd638ce147935ed8f90fb42f52a43310e0b810188eb86cff768" address="unix:///run/containerd/s/d1b95a4bbf84cc0ec9ca0a2f622490d2a1f8567d4e70341ad0e5818de4280f98" protocol=ttrpc version=3 Sep 12 17:43:56.347726 systemd[1]: Started cri-containerd-12282cd03bb06bd638ce147935ed8f90fb42f52a43310e0b810188eb86cff768.scope - libcontainer container 12282cd03bb06bd638ce147935ed8f90fb42f52a43310e0b810188eb86cff768. Sep 12 17:43:56.928946 sshd[6071]: Connection closed by 139.178.68.195 port 51694 Sep 12 17:43:56.931053 sshd-session[6033]: pam_unix(sshd:session): session closed for user core Sep 12 17:43:56.948452 systemd[1]: sshd@13-172.31.17.147:22-139.178.68.195:51694.service: Deactivated successfully. Sep 12 17:43:56.956991 systemd[1]: session-14.scope: Deactivated successfully. Sep 12 17:43:56.964355 systemd-logind[1987]: Session 14 logged out. Waiting for processes to exit. Sep 12 17:43:56.968432 systemd-logind[1987]: Removed session 14. 
Sep 12 17:43:57.009949 containerd[2001]: time="2025-09-12T17:43:57.009880204Z" level=info msg="StartContainer for \"12282cd03bb06bd638ce147935ed8f90fb42f52a43310e0b810188eb86cff768\" returns successfully" Sep 12 17:43:57.338886 containerd[2001]: time="2025-09-12T17:43:57.337141239Z" level=info msg="TaskExit event in podsandbox handler container_id:\"95ae48cac8ff21647f96065a860f7c5c86512daf0257cffebc84f37a95add937\" id:\"6ff372432dc6737e72412769447d384d3aa52ad6cfbd45af67f1eb2f45081ed1\" pid:6119 exited_at:{seconds:1757699037 nanos:299833670}" Sep 12 17:43:57.447720 kubelet[3324]: I0912 17:43:57.442316 3324 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-6dd46b974c-2hw57" podStartSLOduration=30.075898477 podStartE2EDuration="45.435480497s" podCreationTimestamp="2025-09-12 17:43:12 +0000 UTC" firstStartedPulling="2025-09-12 17:43:40.299318957 +0000 UTC m=+55.066216048" lastFinishedPulling="2025-09-12 17:43:55.658900991 +0000 UTC m=+70.425798068" observedRunningTime="2025-09-12 17:43:56.563494913 +0000 UTC m=+71.330392011" watchObservedRunningTime="2025-09-12 17:43:57.435480497 +0000 UTC m=+72.202377592" Sep 12 17:43:57.903371 containerd[2001]: time="2025-09-12T17:43:57.903310514Z" level=info msg="TaskExit event in podsandbox handler container_id:\"030ac0d655eb7e91566bf9c418a3bedda07311bb799232948ccb88a26f9050de\" id:\"94e7f6f5b3063506317231b31e6042720564d69220ac136da9ffc502bba91fb4\" pid:6003 exited_at:{seconds:1757699037 nanos:902901289}" Sep 12 17:43:59.144792 containerd[2001]: time="2025-09-12T17:43:59.144341245Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:59.147788 containerd[2001]: time="2025-09-12T17:43:59.147591479Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542" Sep 12 17:43:59.150248 containerd[2001]: time="2025-09-12T17:43:59.150209951Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:59.157792 containerd[2001]: time="2025-09-12T17:43:59.156182385Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:59.158985 containerd[2001]: time="2025-09-12T17:43:59.158937360Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 2.94760035s" Sep 12 17:43:59.159409 containerd[2001]: time="2025-09-12T17:43:59.159385734Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\"" Sep 12 17:43:59.288616 containerd[2001]: time="2025-09-12T17:43:59.288551960Z" level=info msg="CreateContainer within sandbox \"692b9424732cf2d21aecfbcb8fe6aaf81f23b957c87f091155e86baf5d7514f8\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 12 
17:43:59.298558 kubelet[3324]: I0912 17:43:59.298495 3324 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7d55cd4957-hw5ks" podStartSLOduration=38.250639309 podStartE2EDuration="52.296354903s" podCreationTimestamp="2025-09-12 17:43:07 +0000 UTC" firstStartedPulling="2025-09-12 17:43:42.164693562 +0000 UTC m=+56.931590646" lastFinishedPulling="2025-09-12 17:43:56.210409162 +0000 UTC m=+70.977306240" observedRunningTime="2025-09-12 17:43:57.628050115 +0000 UTC m=+72.394947213" watchObservedRunningTime="2025-09-12 17:43:59.296354903 +0000 UTC m=+74.063252002" Sep 12 17:43:59.310037 containerd[2001]: time="2025-09-12T17:43:59.309986249Z" level=info msg="Container 32ffd224e92fba4905e19851b860581c2b9dce5870a5f5aea2ebd8a2d27281b4: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:43:59.340374 containerd[2001]: time="2025-09-12T17:43:59.340330073Z" level=info msg="CreateContainer within sandbox \"692b9424732cf2d21aecfbcb8fe6aaf81f23b957c87f091155e86baf5d7514f8\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"32ffd224e92fba4905e19851b860581c2b9dce5870a5f5aea2ebd8a2d27281b4\"" Sep 12 17:43:59.342107 containerd[2001]: time="2025-09-12T17:43:59.342071830Z" level=info msg="StartContainer for \"32ffd224e92fba4905e19851b860581c2b9dce5870a5f5aea2ebd8a2d27281b4\"" Sep 12 17:43:59.343416 containerd[2001]: time="2025-09-12T17:43:59.343385234Z" level=info msg="connecting to shim 32ffd224e92fba4905e19851b860581c2b9dce5870a5f5aea2ebd8a2d27281b4" address="unix:///run/containerd/s/32a496da959e83257bbf3512fa9e82745c6984d6212d4ba9b42596d420b888da" protocol=ttrpc version=3 Sep 12 17:43:59.388116 systemd[1]: Started cri-containerd-32ffd224e92fba4905e19851b860581c2b9dce5870a5f5aea2ebd8a2d27281b4.scope - libcontainer container 32ffd224e92fba4905e19851b860581c2b9dce5870a5f5aea2ebd8a2d27281b4. Sep 12 17:43:59.444960 containerd[2001]: time="2025-09-12T17:43:59.444850568Z" level=info msg="StartContainer for \"32ffd224e92fba4905e19851b860581c2b9dce5870a5f5aea2ebd8a2d27281b4\" returns successfully" Sep 12 17:44:00.867795 kubelet[3324]: I0912 17:44:00.856814 3324 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 12 17:44:00.871087 kubelet[3324]: I0912 17:44:00.870940 3324 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 12 17:44:01.553142 containerd[2001]: time="2025-09-12T17:44:01.553093728Z" level=info msg="TaskExit event in podsandbox handler container_id:\"66cf151bffc1dd97b3e77ad82cb94f5f9cf1a0688d109a9c513558d80a13ff0f\" id:\"53b6e3682447fcb5105d9ba2cc60f6859326747b2a3e744d45136160bb74d0c6\" pid:6203 exit_status:1 exited_at:{seconds:1757699041 nanos:551021501}" Sep 12 17:44:02.037652 systemd[1]: Started sshd@14-172.31.17.147:22-139.178.68.195:36288.service - OpenSSH per-connection server daemon (139.178.68.195:36288). Sep 12 17:44:02.683398 sshd[6218]: Accepted publickey for core from 139.178.68.195 port 36288 ssh2: RSA SHA256:W4YS5mF+BRASipQdoc83aYiclsJC9j/B8FPkaJHVZC4 Sep 12 17:44:02.686155 sshd-session[6218]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:44:02.726462 systemd-logind[1987]: New session 15 of user core. Sep 12 17:44:02.734493 systemd[1]: Started session-15.scope - Session 15 of User core. 
Sep 12 17:44:04.906862 sshd[6221]: Connection closed by 139.178.68.195 port 36288 Sep 12 17:44:04.911690 sshd-session[6218]: pam_unix(sshd:session): session closed for user core Sep 12 17:44:04.927381 systemd-logind[1987]: Session 15 logged out. Waiting for processes to exit. Sep 12 17:44:04.927484 systemd[1]: sshd@14-172.31.17.147:22-139.178.68.195:36288.service: Deactivated successfully. Sep 12 17:44:04.932078 systemd[1]: session-15.scope: Deactivated successfully. Sep 12 17:44:04.936121 systemd-logind[1987]: Removed session 15. Sep 12 17:44:09.941944 systemd[1]: Started sshd@15-172.31.17.147:22-139.178.68.195:51436.service - OpenSSH per-connection server daemon (139.178.68.195:51436). Sep 12 17:44:10.165015 sshd[6243]: Accepted publickey for core from 139.178.68.195 port 51436 ssh2: RSA SHA256:W4YS5mF+BRASipQdoc83aYiclsJC9j/B8FPkaJHVZC4 Sep 12 17:44:10.166624 sshd-session[6243]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:44:10.172869 systemd-logind[1987]: New session 16 of user core. Sep 12 17:44:10.182034 systemd[1]: Started session-16.scope - Session 16 of User core. Sep 12 17:44:10.944278 sshd[6246]: Connection closed by 139.178.68.195 port 51436 Sep 12 17:44:10.946503 sshd-session[6243]: pam_unix(sshd:session): session closed for user core Sep 12 17:44:10.953359 systemd[1]: sshd@15-172.31.17.147:22-139.178.68.195:51436.service: Deactivated successfully. Sep 12 17:44:10.957715 systemd[1]: session-16.scope: Deactivated successfully. Sep 12 17:44:10.959578 systemd-logind[1987]: Session 16 logged out. Waiting for processes to exit. Sep 12 17:44:10.965308 systemd-logind[1987]: Removed session 16. Sep 12 17:44:15.989881 systemd[1]: Started sshd@16-172.31.17.147:22-139.178.68.195:51452.service - OpenSSH per-connection server daemon (139.178.68.195:51452). Sep 12 17:44:16.280035 sshd[6259]: Accepted publickey for core from 139.178.68.195 port 51452 ssh2: RSA SHA256:W4YS5mF+BRASipQdoc83aYiclsJC9j/B8FPkaJHVZC4 Sep 12 17:44:16.282655 sshd-session[6259]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:44:16.290897 systemd-logind[1987]: New session 17 of user core. Sep 12 17:44:16.297048 systemd[1]: Started session-17.scope - Session 17 of User core. Sep 12 17:44:17.158648 sshd[6262]: Connection closed by 139.178.68.195 port 51452 Sep 12 17:44:17.159186 sshd-session[6259]: pam_unix(sshd:session): session closed for user core Sep 12 17:44:17.164895 systemd-logind[1987]: Session 17 logged out. Waiting for processes to exit. Sep 12 17:44:17.165660 systemd[1]: sshd@16-172.31.17.147:22-139.178.68.195:51452.service: Deactivated successfully. Sep 12 17:44:17.167963 systemd[1]: session-17.scope: Deactivated successfully. Sep 12 17:44:17.171943 systemd-logind[1987]: Removed session 17. Sep 12 17:44:17.194092 systemd[1]: Started sshd@17-172.31.17.147:22-139.178.68.195:51460.service - OpenSSH per-connection server daemon (139.178.68.195:51460). Sep 12 17:44:17.381241 sshd[6274]: Accepted publickey for core from 139.178.68.195 port 51460 ssh2: RSA SHA256:W4YS5mF+BRASipQdoc83aYiclsJC9j/B8FPkaJHVZC4 Sep 12 17:44:17.382712 sshd-session[6274]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:44:17.388620 systemd-logind[1987]: New session 18 of user core. Sep 12 17:44:17.393438 systemd[1]: Started session-18.scope - Session 18 of User core. 
Sep 12 17:44:18.405499 containerd[2001]: time="2025-09-12T17:44:18.405441783Z" level=info msg="TaskExit event in podsandbox handler container_id:\"030ac0d655eb7e91566bf9c418a3bedda07311bb799232948ccb88a26f9050de\" id:\"0a1aea801f87692196b15670b9867cba84e9997b75d939dd4442e5bd401984a2\" pid:6293 exited_at:{seconds:1757699058 nanos:343097420}" Sep 12 17:44:20.915283 sshd[6277]: Connection closed by 139.178.68.195 port 51460 Sep 12 17:44:20.917190 sshd-session[6274]: pam_unix(sshd:session): session closed for user core Sep 12 17:44:20.921725 systemd[1]: sshd@17-172.31.17.147:22-139.178.68.195:51460.service: Deactivated successfully. Sep 12 17:44:20.924609 systemd[1]: session-18.scope: Deactivated successfully. Sep 12 17:44:20.941660 systemd-logind[1987]: Session 18 logged out. Waiting for processes to exit. Sep 12 17:44:20.953717 systemd[1]: Started sshd@18-172.31.17.147:22-139.178.68.195:43788.service - OpenSSH per-connection server daemon (139.178.68.195:43788). Sep 12 17:44:20.954821 systemd-logind[1987]: Removed session 18. Sep 12 17:44:21.202234 sshd[6311]: Accepted publickey for core from 139.178.68.195 port 43788 ssh2: RSA SHA256:W4YS5mF+BRASipQdoc83aYiclsJC9j/B8FPkaJHVZC4 Sep 12 17:44:21.203873 sshd-session[6311]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:44:21.210799 systemd-logind[1987]: New session 19 of user core. Sep 12 17:44:21.215135 systemd[1]: Started session-19.scope - Session 19 of User core. Sep 12 17:44:24.405918 sshd[6314]: Connection closed by 139.178.68.195 port 43788 Sep 12 17:44:24.426010 sshd-session[6311]: pam_unix(sshd:session): session closed for user core Sep 12 17:44:24.479447 systemd[1]: sshd@18-172.31.17.147:22-139.178.68.195:43788.service: Deactivated successfully. Sep 12 17:44:24.490276 systemd[1]: session-19.scope: Deactivated successfully. Sep 12 17:44:24.490569 systemd[1]: session-19.scope: Consumed 830ms CPU time, 73.7M memory peak. Sep 12 17:44:24.498154 systemd-logind[1987]: Session 19 logged out. Waiting for processes to exit. Sep 12 17:44:24.509969 systemd[1]: Started sshd@19-172.31.17.147:22-139.178.68.195:43802.service - OpenSSH per-connection server daemon (139.178.68.195:43802). Sep 12 17:44:24.518343 systemd-logind[1987]: Removed session 19. Sep 12 17:44:24.911165 kubelet[3324]: E0912 17:44:24.902065 3324 kubelet.go:2512] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.239s" Sep 12 17:44:24.948518 sshd[6341]: Accepted publickey for core from 139.178.68.195 port 43802 ssh2: RSA SHA256:W4YS5mF+BRASipQdoc83aYiclsJC9j/B8FPkaJHVZC4 Sep 12 17:44:24.955653 sshd-session[6341]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:44:24.971462 systemd-logind[1987]: New session 20 of user core. Sep 12 17:44:24.976008 systemd[1]: Started session-20.scope - Session 20 of User core. 
Sep 12 17:44:25.097745 containerd[2001]: time="2025-09-12T17:44:25.095662907Z" level=info msg="TaskExit event in podsandbox handler container_id:\"95ae48cac8ff21647f96065a860f7c5c86512daf0257cffebc84f37a95add937\" id:\"0e303e708ac785a0037916896fc4d4826380cf1b31c6883c1c8e670dca4841aa\" pid:6370 exited_at:{seconds:1757699065 nanos:87160829}" Sep 12 17:44:25.691711 containerd[2001]: time="2025-09-12T17:44:25.691418579Z" level=info msg="TaskExit event in podsandbox handler container_id:\"030ac0d655eb7e91566bf9c418a3bedda07311bb799232948ccb88a26f9050de\" id:\"27b8dafc1c960ed2463273b5a63e721481ff5af452e36833bbb0a548c0940b2f\" pid:6372 exited_at:{seconds:1757699065 nanos:690388346}" Sep 12 17:44:27.327646 sshd[6383]: Connection closed by 139.178.68.195 port 43802 Sep 12 17:44:27.328062 sshd-session[6341]: pam_unix(sshd:session): session closed for user core Sep 12 17:44:27.337088 systemd[1]: sshd@19-172.31.17.147:22-139.178.68.195:43802.service: Deactivated successfully. Sep 12 17:44:27.337366 systemd-logind[1987]: Session 20 logged out. Waiting for processes to exit. Sep 12 17:44:27.341234 systemd[1]: session-20.scope: Deactivated successfully. Sep 12 17:44:27.341522 systemd[1]: session-20.scope: Consumed 898ms CPU time, 66.8M memory peak. Sep 12 17:44:27.343634 systemd-logind[1987]: Removed session 20. Sep 12 17:44:27.360332 systemd[1]: Started sshd@20-172.31.17.147:22-139.178.68.195:43804.service - OpenSSH per-connection server daemon (139.178.68.195:43804). Sep 12 17:44:27.618046 sshd[6411]: Accepted publickey for core from 139.178.68.195 port 43804 ssh2: RSA SHA256:W4YS5mF+BRASipQdoc83aYiclsJC9j/B8FPkaJHVZC4 Sep 12 17:44:27.619003 sshd-session[6411]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:44:27.628423 systemd-logind[1987]: New session 21 of user core. Sep 12 17:44:27.636193 systemd[1]: Started session-21.scope - Session 21 of User core. Sep 12 17:44:28.078701 sshd[6414]: Connection closed by 139.178.68.195 port 43804 Sep 12 17:44:28.079489 sshd-session[6411]: pam_unix(sshd:session): session closed for user core Sep 12 17:44:28.084618 systemd[1]: sshd@20-172.31.17.147:22-139.178.68.195:43804.service: Deactivated successfully. Sep 12 17:44:28.087660 systemd[1]: session-21.scope: Deactivated successfully. Sep 12 17:44:28.089733 systemd-logind[1987]: Session 21 logged out. Waiting for processes to exit. Sep 12 17:44:28.091698 systemd-logind[1987]: Removed session 21. 
Sep 12 17:44:30.530583 containerd[2001]: time="2025-09-12T17:44:30.530520793Z" level=info msg="TaskExit event in podsandbox handler container_id:\"66cf151bffc1dd97b3e77ad82cb94f5f9cf1a0688d109a9c513558d80a13ff0f\" id:\"c55692e39479062398565678fe908d0752c2d417692c89241e45b6f9416f9d8e\" pid:6437 exited_at:{seconds:1757699070 nanos:468511301}" Sep 12 17:44:30.902275 kubelet[3324]: I0912 17:44:30.896712 3324 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-267zp" podStartSLOduration=57.583921741 podStartE2EDuration="1m18.857517055s" podCreationTimestamp="2025-09-12 17:43:12 +0000 UTC" firstStartedPulling="2025-09-12 17:43:37.970817462 +0000 UTC m=+52.737714547" lastFinishedPulling="2025-09-12 17:43:59.244412784 +0000 UTC m=+74.011309861" observedRunningTime="2025-09-12 17:43:59.715329095 +0000 UTC m=+74.482226193" watchObservedRunningTime="2025-09-12 17:44:30.857517055 +0000 UTC m=+105.624414155" Sep 12 17:44:33.120217 systemd[1]: Started sshd@21-172.31.17.147:22-139.178.68.195:37120.service - OpenSSH per-connection server daemon (139.178.68.195:37120). Sep 12 17:44:33.425937 sshd[6452]: Accepted publickey for core from 139.178.68.195 port 37120 ssh2: RSA SHA256:W4YS5mF+BRASipQdoc83aYiclsJC9j/B8FPkaJHVZC4 Sep 12 17:44:33.428669 sshd-session[6452]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:44:33.435419 systemd-logind[1987]: New session 22 of user core. Sep 12 17:44:33.438000 systemd[1]: Started session-22.scope - Session 22 of User core. Sep 12 17:44:33.910712 systemd[1]: Started sshd@22-172.31.17.147:22-135.119.96.68:55744.service - OpenSSH per-connection server daemon (135.119.96.68:55744). Sep 12 17:44:34.071505 sshd[6455]: Connection closed by 139.178.68.195 port 37120 Sep 12 17:44:34.072806 sshd-session[6452]: pam_unix(sshd:session): session closed for user core Sep 12 17:44:34.079797 systemd-logind[1987]: Session 22 logged out. Waiting for processes to exit. Sep 12 17:44:34.080478 systemd[1]: sshd@21-172.31.17.147:22-139.178.68.195:37120.service: Deactivated successfully. Sep 12 17:44:34.084911 systemd[1]: session-22.scope: Deactivated successfully. Sep 12 17:44:34.088271 systemd-logind[1987]: Removed session 22. Sep 12 17:44:38.784127 containerd[2001]: time="2025-09-12T17:44:38.783224323Z" level=info msg="TaskExit event in podsandbox handler container_id:\"95ae48cac8ff21647f96065a860f7c5c86512daf0257cffebc84f37a95add937\" id:\"3582be9bf219f44a742ceeb5edbd511c23dabe6bc6bae9eb559ada841860dd1d\" pid:6489 exited_at:{seconds:1757699078 nanos:782907690}" Sep 12 17:44:39.108366 systemd[1]: Started sshd@23-172.31.17.147:22-139.178.68.195:37136.service - OpenSSH per-connection server daemon (139.178.68.195:37136). Sep 12 17:44:39.405239 sshd[6500]: Accepted publickey for core from 139.178.68.195 port 37136 ssh2: RSA SHA256:W4YS5mF+BRASipQdoc83aYiclsJC9j/B8FPkaJHVZC4 Sep 12 17:44:39.408428 sshd-session[6500]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:44:39.417313 systemd-logind[1987]: New session 23 of user core. Sep 12 17:44:39.427373 systemd[1]: Started session-23.scope - Session 23 of User core. Sep 12 17:44:40.313799 sshd[6503]: Connection closed by 139.178.68.195 port 37136 Sep 12 17:44:40.314447 sshd-session[6500]: pam_unix(sshd:session): session closed for user core Sep 12 17:44:40.320569 systemd-logind[1987]: Session 23 logged out. Waiting for processes to exit. 
Sep 12 17:44:40.320943 systemd[1]: sshd@23-172.31.17.147:22-139.178.68.195:37136.service: Deactivated successfully. Sep 12 17:44:40.324208 systemd[1]: session-23.scope: Deactivated successfully. Sep 12 17:44:40.329263 systemd-logind[1987]: Removed session 23. Sep 12 17:44:44.002661 sshd[6464]: Connection closed by 135.119.96.68 port 55744 Sep 12 17:44:44.003294 systemd[1]: sshd@22-172.31.17.147:22-135.119.96.68:55744.service: Deactivated successfully. Sep 12 17:44:44.058216 systemd[1]: Started sshd@24-172.31.17.147:22-135.119.96.68:36324.service - OpenSSH per-connection server daemon (135.119.96.68:36324). Sep 12 17:44:44.106013 sshd[6517]: banner exchange: Connection from 135.119.96.68 port 36324: invalid format Sep 12 17:44:44.108333 systemd[1]: sshd@24-172.31.17.147:22-135.119.96.68:36324.service: Deactivated successfully. Sep 12 17:44:45.355093 systemd[1]: Started sshd@25-172.31.17.147:22-139.178.68.195:33200.service - OpenSSH per-connection server daemon (139.178.68.195:33200). Sep 12 17:44:45.579311 sshd[6522]: Accepted publickey for core from 139.178.68.195 port 33200 ssh2: RSA SHA256:W4YS5mF+BRASipQdoc83aYiclsJC9j/B8FPkaJHVZC4 Sep 12 17:44:45.589112 sshd-session[6522]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:44:45.601708 systemd-logind[1987]: New session 24 of user core. Sep 12 17:44:45.610050 systemd[1]: Started session-24.scope - Session 24 of User core. Sep 12 17:44:46.373809 sshd[6527]: Connection closed by 139.178.68.195 port 33200 Sep 12 17:44:46.383473 sshd-session[6522]: pam_unix(sshd:session): session closed for user core Sep 12 17:44:46.399558 systemd-logind[1987]: Session 24 logged out. Waiting for processes to exit. Sep 12 17:44:46.402452 systemd[1]: sshd@25-172.31.17.147:22-139.178.68.195:33200.service: Deactivated successfully. Sep 12 17:44:46.409476 systemd[1]: session-24.scope: Deactivated successfully. Sep 12 17:44:46.418476 systemd-logind[1987]: Removed session 24. Sep 12 17:44:51.409407 systemd[1]: Started sshd@26-172.31.17.147:22-139.178.68.195:47448.service - OpenSSH per-connection server daemon (139.178.68.195:47448). Sep 12 17:44:51.701865 sshd[6539]: Accepted publickey for core from 139.178.68.195 port 47448 ssh2: RSA SHA256:W4YS5mF+BRASipQdoc83aYiclsJC9j/B8FPkaJHVZC4 Sep 12 17:44:51.704519 sshd-session[6539]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:44:51.716183 systemd-logind[1987]: New session 25 of user core. Sep 12 17:44:51.724276 systemd[1]: Started session-25.scope - Session 25 of User core. Sep 12 17:44:52.747659 sshd[6542]: Connection closed by 139.178.68.195 port 47448 Sep 12 17:44:52.749006 sshd-session[6539]: pam_unix(sshd:session): session closed for user core Sep 12 17:44:52.756263 systemd[1]: sshd@26-172.31.17.147:22-139.178.68.195:47448.service: Deactivated successfully. Sep 12 17:44:52.759157 systemd[1]: session-25.scope: Deactivated successfully. Sep 12 17:44:52.760693 systemd-logind[1987]: Session 25 logged out. Waiting for processes to exit. Sep 12 17:44:52.763059 systemd-logind[1987]: Removed session 25. 
Sep 12 17:44:54.226802 containerd[2001]: time="2025-09-12T17:44:54.226588335Z" level=info msg="TaskExit event in podsandbox handler container_id:\"95ae48cac8ff21647f96065a860f7c5c86512daf0257cffebc84f37a95add937\" id:\"c8100d13d7b0a998a17a8fee999b5c0df3a24d75d3cd8b492e0b3a76afdbb635\" pid:6588 exited_at:{seconds:1757699094 nanos:185992928}" Sep 12 17:44:55.004516 containerd[2001]: time="2025-09-12T17:44:55.004465553Z" level=info msg="TaskExit event in podsandbox handler container_id:\"030ac0d655eb7e91566bf9c418a3bedda07311bb799232948ccb88a26f9050de\" id:\"4731df1a925f215ad4e60055b87b03abccfd27587a8b03785c9b6b0e4d8344d3\" pid:6569 exited_at:{seconds:1757699095 nanos:4047622}" Sep 12 17:44:57.782745 systemd[1]: Started sshd@27-172.31.17.147:22-139.178.68.195:47458.service - OpenSSH per-connection server daemon (139.178.68.195:47458). Sep 12 17:44:58.054960 sshd[6603]: Accepted publickey for core from 139.178.68.195 port 47458 ssh2: RSA SHA256:W4YS5mF+BRASipQdoc83aYiclsJC9j/B8FPkaJHVZC4 Sep 12 17:44:58.057126 sshd-session[6603]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:44:58.066026 systemd-logind[1987]: New session 26 of user core. Sep 12 17:44:58.072406 systemd[1]: Started session-26.scope - Session 26 of User core. Sep 12 17:44:58.871403 sshd[6606]: Connection closed by 139.178.68.195 port 47458 Sep 12 17:44:58.872133 sshd-session[6603]: pam_unix(sshd:session): session closed for user core Sep 12 17:44:58.878416 systemd-logind[1987]: Session 26 logged out. Waiting for processes to exit. Sep 12 17:44:58.881486 systemd[1]: sshd@27-172.31.17.147:22-139.178.68.195:47458.service: Deactivated successfully. Sep 12 17:44:58.885835 systemd[1]: session-26.scope: Deactivated successfully. Sep 12 17:44:58.890688 systemd-logind[1987]: Removed session 26. Sep 12 17:45:00.716178 containerd[2001]: time="2025-09-12T17:45:00.701076285Z" level=info msg="TaskExit event in podsandbox handler container_id:\"66cf151bffc1dd97b3e77ad82cb94f5f9cf1a0688d109a9c513558d80a13ff0f\" id:\"4cb435d4b833f7301d01a99f1a5401b9a53289525ad7d8b8dd4971c1b2538b52\" pid:6629 exited_at:{seconds:1757699100 nanos:654304493}" Sep 12 17:45:14.979193 systemd[1]: cri-containerd-b72e18b6b8c8117101a586f4e2d6b2970fcd2894cefb0493daf3c7c024a5bdf1.scope: Deactivated successfully. Sep 12 17:45:14.979531 systemd[1]: cri-containerd-b72e18b6b8c8117101a586f4e2d6b2970fcd2894cefb0493daf3c7c024a5bdf1.scope: Consumed 4.151s CPU time, 87.5M memory peak, 107.1M read from disk. Sep 12 17:45:15.034751 systemd[1]: cri-containerd-9401a12dd5fedc46790e5f76a59e2c08696ed3778779a5c48140f28cb4b299d9.scope: Deactivated successfully. Sep 12 17:45:15.035188 systemd[1]: cri-containerd-9401a12dd5fedc46790e5f76a59e2c08696ed3778779a5c48140f28cb4b299d9.scope: Consumed 12.286s CPU time, 117.5M memory peak, 96.7M read from disk. 
Sep 12 17:45:15.179789 containerd[2001]: time="2025-09-12T17:45:15.179717499Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9401a12dd5fedc46790e5f76a59e2c08696ed3778779a5c48140f28cb4b299d9\" id:\"9401a12dd5fedc46790e5f76a59e2c08696ed3778779a5c48140f28cb4b299d9\" pid:3900 exit_status:1 exited_at:{seconds:1757699115 nanos:99542237}" Sep 12 17:45:15.179789 containerd[2001]: time="2025-09-12T17:45:15.179761872Z" level=info msg="received exit event container_id:\"b72e18b6b8c8117101a586f4e2d6b2970fcd2894cefb0493daf3c7c024a5bdf1\" id:\"b72e18b6b8c8117101a586f4e2d6b2970fcd2894cefb0493daf3c7c024a5bdf1\" pid:3138 exit_status:1 exited_at:{seconds:1757699115 nanos:125053422}" Sep 12 17:45:15.180483 containerd[2001]: time="2025-09-12T17:45:15.179923673Z" level=info msg="received exit event container_id:\"9401a12dd5fedc46790e5f76a59e2c08696ed3778779a5c48140f28cb4b299d9\" id:\"9401a12dd5fedc46790e5f76a59e2c08696ed3778779a5c48140f28cb4b299d9\" pid:3900 exit_status:1 exited_at:{seconds:1757699115 nanos:99542237}" Sep 12 17:45:15.181123 containerd[2001]: time="2025-09-12T17:45:15.181086475Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b72e18b6b8c8117101a586f4e2d6b2970fcd2894cefb0493daf3c7c024a5bdf1\" id:\"b72e18b6b8c8117101a586f4e2d6b2970fcd2894cefb0493daf3c7c024a5bdf1\" pid:3138 exit_status:1 exited_at:{seconds:1757699115 nanos:125053422}" Sep 12 17:45:15.333979 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b72e18b6b8c8117101a586f4e2d6b2970fcd2894cefb0493daf3c7c024a5bdf1-rootfs.mount: Deactivated successfully. Sep 12 17:45:15.343217 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9401a12dd5fedc46790e5f76a59e2c08696ed3778779a5c48140f28cb4b299d9-rootfs.mount: Deactivated successfully. Sep 12 17:45:15.545148 update_engine[1989]: I20250912 17:45:15.545058 1989 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Sep 12 17:45:15.545148 update_engine[1989]: I20250912 17:45:15.545132 1989 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Sep 12 17:45:15.550040 update_engine[1989]: I20250912 17:45:15.549990 1989 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Sep 12 17:45:15.555112 update_engine[1989]: I20250912 17:45:15.554912 1989 omaha_request_params.cc:62] Current group set to beta Sep 12 17:45:15.556890 update_engine[1989]: I20250912 17:45:15.555421 1989 update_attempter.cc:499] Already updated boot flags. Skipping. Sep 12 17:45:15.556890 update_engine[1989]: I20250912 17:45:15.555444 1989 update_attempter.cc:643] Scheduling an action processor start. 
Sep 12 17:45:15.556890 update_engine[1989]: I20250912 17:45:15.555467 1989 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Sep 12 17:45:15.556890 update_engine[1989]: I20250912 17:45:15.555527 1989 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Sep 12 17:45:15.556890 update_engine[1989]: I20250912 17:45:15.555609 1989 omaha_request_action.cc:271] Posting an Omaha request to disabled Sep 12 17:45:15.556890 update_engine[1989]: I20250912 17:45:15.555619 1989 omaha_request_action.cc:272] Request: Sep 12 17:45:15.556890 update_engine[1989]: Sep 12 17:45:15.556890 update_engine[1989]: Sep 12 17:45:15.556890 update_engine[1989]: Sep 12 17:45:15.556890 update_engine[1989]: Sep 12 17:45:15.556890 update_engine[1989]: Sep 12 17:45:15.556890 update_engine[1989]: Sep 12 17:45:15.556890 update_engine[1989]: Sep 12 17:45:15.556890 update_engine[1989]: Sep 12 17:45:15.556890 update_engine[1989]: I20250912 17:45:15.555626 1989 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 12 17:45:15.583818 locksmithd[2040]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Sep 12 17:45:15.589720 update_engine[1989]: I20250912 17:45:15.589576 1989 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 12 17:45:15.592273 update_engine[1989]: I20250912 17:45:15.592160 1989 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Sep 12 17:45:15.616070 update_engine[1989]: E20250912 17:45:15.615824 1989 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 12 17:45:15.616399 update_engine[1989]: I20250912 17:45:15.616024 1989 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Sep 12 17:45:15.663097 kubelet[3324]: I0912 17:45:15.663034 3324 scope.go:117] "RemoveContainer" containerID="efaa43fd20c3fd6ea5a375ae10f68be7e3c96b7af3a8fcb75c0f099536ca4574" Sep 12 17:45:15.669488 kubelet[3324]: I0912 17:45:15.667586 3324 scope.go:117] "RemoveContainer" containerID="b72e18b6b8c8117101a586f4e2d6b2970fcd2894cefb0493daf3c7c024a5bdf1" Sep 12 17:45:15.697586 kubelet[3324]: I0912 17:45:15.697535 3324 scope.go:117] "RemoveContainer" containerID="9401a12dd5fedc46790e5f76a59e2c08696ed3778779a5c48140f28cb4b299d9" Sep 12 17:45:15.749901 kubelet[3324]: E0912 17:45:15.749844 3324 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-58fc44c59b-zsjzc_tigera-operator(cc3b6c57-7afa-4c95-ac34-d2ae4f480113)\"" pod="tigera-operator/tigera-operator-58fc44c59b-zsjzc" podUID="cc3b6c57-7afa-4c95-ac34-d2ae4f480113" Sep 12 17:45:15.819188 containerd[2001]: time="2025-09-12T17:45:15.819132439Z" level=info msg="CreateContainer within sandbox \"20c6f06654d59097830b6886534e7a44b8b96b028a84ffc9dce9eb8b1e7262af\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Sep 12 17:45:16.012699 containerd[2001]: time="2025-09-12T17:45:16.012614448Z" level=info msg="RemoveContainer for \"efaa43fd20c3fd6ea5a375ae10f68be7e3c96b7af3a8fcb75c0f099536ca4574\"" Sep 12 17:45:16.068360 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3338599654.mount: Deactivated successfully. 
Sep 12 17:45:16.071899 containerd[2001]: time="2025-09-12T17:45:16.068863636Z" level=info msg="RemoveContainer for \"efaa43fd20c3fd6ea5a375ae10f68be7e3c96b7af3a8fcb75c0f099536ca4574\" returns successfully" Sep 12 17:45:16.081959 containerd[2001]: time="2025-09-12T17:45:16.081813534Z" level=info msg="Container 77dc97a4bee14ab6e74fedd106d7f5eff84a2eb3c09eee2eab2143b4bd816621: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:45:16.102489 containerd[2001]: time="2025-09-12T17:45:16.102441250Z" level=info msg="CreateContainer within sandbox \"20c6f06654d59097830b6886534e7a44b8b96b028a84ffc9dce9eb8b1e7262af\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"77dc97a4bee14ab6e74fedd106d7f5eff84a2eb3c09eee2eab2143b4bd816621\"" Sep 12 17:45:16.109464 containerd[2001]: time="2025-09-12T17:45:16.109339005Z" level=info msg="StartContainer for \"77dc97a4bee14ab6e74fedd106d7f5eff84a2eb3c09eee2eab2143b4bd816621\"" Sep 12 17:45:16.112175 containerd[2001]: time="2025-09-12T17:45:16.112131604Z" level=info msg="connecting to shim 77dc97a4bee14ab6e74fedd106d7f5eff84a2eb3c09eee2eab2143b4bd816621" address="unix:///run/containerd/s/e20c758a77d07c00286b0d0ca62b238673b4670b1def6ca165f0e570155b1b81" protocol=ttrpc version=3 Sep 12 17:45:16.199846 systemd[1]: Started cri-containerd-77dc97a4bee14ab6e74fedd106d7f5eff84a2eb3c09eee2eab2143b4bd816621.scope - libcontainer container 77dc97a4bee14ab6e74fedd106d7f5eff84a2eb3c09eee2eab2143b4bd816621. Sep 12 17:45:16.324750 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2451245470.mount: Deactivated successfully. Sep 12 17:45:16.336870 containerd[2001]: time="2025-09-12T17:45:16.336793891Z" level=info msg="StartContainer for \"77dc97a4bee14ab6e74fedd106d7f5eff84a2eb3c09eee2eab2143b4bd816621\" returns successfully" Sep 12 17:45:18.151460 kubelet[3324]: E0912 17:45:18.151077 3324 controller.go:195] "Failed to update lease" err="the server was unable to return a response in the time allotted, but may still be processing the request (put leases.coordination.k8s.io ip-172-31-17-147)" Sep 12 17:45:18.171673 containerd[2001]: time="2025-09-12T17:45:18.171619362Z" level=info msg="TaskExit event in podsandbox handler container_id:\"030ac0d655eb7e91566bf9c418a3bedda07311bb799232948ccb88a26f9050de\" id:\"fc15d13cb7d9771353209dd268825c6c9cd480d32ff0bbaf32cae24154b8fe8c\" pid:6727 exited_at:{seconds:1757699118 nanos:171065374}" Sep 12 17:45:20.025146 systemd[1]: cri-containerd-42ac698dde0bd3b2bbd61ed1e07b9e02b26ee8b79371c074b601788bda73fe0b.scope: Deactivated successfully. Sep 12 17:45:20.025529 systemd[1]: cri-containerd-42ac698dde0bd3b2bbd61ed1e07b9e02b26ee8b79371c074b601788bda73fe0b.scope: Consumed 1.988s CPU time, 36.9M memory peak, 75.7M read from disk. 
Sep 12 17:45:20.029712 containerd[2001]: time="2025-09-12T17:45:20.029668169Z" level=info msg="TaskExit event in podsandbox handler container_id:\"42ac698dde0bd3b2bbd61ed1e07b9e02b26ee8b79371c074b601788bda73fe0b\" id:\"42ac698dde0bd3b2bbd61ed1e07b9e02b26ee8b79371c074b601788bda73fe0b\" pid:3175 exit_status:1 exited_at:{seconds:1757699120 nanos:28823408}" Sep 12 17:45:20.030285 containerd[2001]: time="2025-09-12T17:45:20.029730532Z" level=info msg="received exit event container_id:\"42ac698dde0bd3b2bbd61ed1e07b9e02b26ee8b79371c074b601788bda73fe0b\" id:\"42ac698dde0bd3b2bbd61ed1e07b9e02b26ee8b79371c074b601788bda73fe0b\" pid:3175 exit_status:1 exited_at:{seconds:1757699120 nanos:28823408}" Sep 12 17:45:20.071441 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-42ac698dde0bd3b2bbd61ed1e07b9e02b26ee8b79371c074b601788bda73fe0b-rootfs.mount: Deactivated successfully. Sep 12 17:45:20.644471 kubelet[3324]: I0912 17:45:20.644430 3324 scope.go:117] "RemoveContainer" containerID="42ac698dde0bd3b2bbd61ed1e07b9e02b26ee8b79371c074b601788bda73fe0b" Sep 12 17:45:20.646861 containerd[2001]: time="2025-09-12T17:45:20.646752574Z" level=info msg="CreateContainer within sandbox \"72dd6ac37af23fbc5421f63509fdb2c33c694f1e844c8e34ae0ba1c2160686eb\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}" Sep 12 17:45:20.709804 containerd[2001]: time="2025-09-12T17:45:20.709679336Z" level=info msg="Container 63e13993cd76484bdb664aedf8317fd4d115fba036956717fcc33cf82a1e1ecf: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:45:20.731384 containerd[2001]: time="2025-09-12T17:45:20.731255058Z" level=info msg="CreateContainer within sandbox \"72dd6ac37af23fbc5421f63509fdb2c33c694f1e844c8e34ae0ba1c2160686eb\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"63e13993cd76484bdb664aedf8317fd4d115fba036956717fcc33cf82a1e1ecf\"" Sep 12 17:45:20.732199 containerd[2001]: time="2025-09-12T17:45:20.732110888Z" level=info msg="StartContainer for \"63e13993cd76484bdb664aedf8317fd4d115fba036956717fcc33cf82a1e1ecf\"" Sep 12 17:45:20.733385 containerd[2001]: time="2025-09-12T17:45:20.733338245Z" level=info msg="connecting to shim 63e13993cd76484bdb664aedf8317fd4d115fba036956717fcc33cf82a1e1ecf" address="unix:///run/containerd/s/6e71ce2430e72e08d33387acd87398a5474f1ded28bee32c13a0aa978309a1c4" protocol=ttrpc version=3 Sep 12 17:45:20.764059 systemd[1]: Started cri-containerd-63e13993cd76484bdb664aedf8317fd4d115fba036956717fcc33cf82a1e1ecf.scope - libcontainer container 63e13993cd76484bdb664aedf8317fd4d115fba036956717fcc33cf82a1e1ecf. 
Sep 12 17:45:20.822977 containerd[2001]: time="2025-09-12T17:45:20.822846793Z" level=info msg="StartContainer for \"63e13993cd76484bdb664aedf8317fd4d115fba036956717fcc33cf82a1e1ecf\" returns successfully" Sep 12 17:45:24.136498 containerd[2001]: time="2025-09-12T17:45:24.136447900Z" level=info msg="TaskExit event in podsandbox handler container_id:\"95ae48cac8ff21647f96065a860f7c5c86512daf0257cffebc84f37a95add937\" id:\"f9965b9b5ccee7cc1868ff2697674c6e6300d9e22ddb1e8eabb17805f8ba4698\" pid:6828 exit_status:1 exited_at:{seconds:1757699124 nanos:135902334}" Sep 12 17:45:24.159256 containerd[2001]: time="2025-09-12T17:45:24.159199618Z" level=info msg="TaskExit event in podsandbox handler container_id:\"030ac0d655eb7e91566bf9c418a3bedda07311bb799232948ccb88a26f9050de\" id:\"5bc71443c01dfe4a5e1b2dc224981550b0319625c0a9af1d5d9948378d8c0a83\" pid:6806 exited_at:{seconds:1757699124 nanos:158853035}" Sep 12 17:45:25.440244 update_engine[1989]: I20250912 17:45:25.440126 1989 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 12 17:45:25.440244 update_engine[1989]: I20250912 17:45:25.440249 1989 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 12 17:45:25.440805 update_engine[1989]: I20250912 17:45:25.440745 1989 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Sep 12 17:45:25.470456 update_engine[1989]: E20250912 17:45:25.470377 1989 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 12 17:45:25.470831 update_engine[1989]: I20250912 17:45:25.470495 1989 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Sep 12 17:45:28.189706 kubelet[3324]: E0912 17:45:28.189639 3324 controller.go:195] "Failed to update lease" err="Put \"https://172.31.17.147:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-17-147?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 12 17:45:29.367371 kubelet[3324]: I0912 17:45:29.367036 3324 scope.go:117] "RemoveContainer" containerID="9401a12dd5fedc46790e5f76a59e2c08696ed3778779a5c48140f28cb4b299d9" Sep 12 17:45:29.376464 containerd[2001]: time="2025-09-12T17:45:29.376429186Z" level=info msg="CreateContainer within sandbox \"66e2803818ddb30349dc741cd091baa5f9db91c954353cd60f18e611fe2d45f4\" for container &ContainerMetadata{Name:tigera-operator,Attempt:2,}" Sep 12 17:45:29.393284 containerd[2001]: time="2025-09-12T17:45:29.392757073Z" level=info msg="Container 916eed261329c15f9e80a837bf103f3fe24575c24c663c29dc387693115a57bb: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:45:29.400478 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount936953087.mount: Deactivated successfully. 
Sep 12 17:45:29.406095 containerd[2001]: time="2025-09-12T17:45:29.406035459Z" level=info msg="CreateContainer within sandbox \"66e2803818ddb30349dc741cd091baa5f9db91c954353cd60f18e611fe2d45f4\" for &ContainerMetadata{Name:tigera-operator,Attempt:2,} returns container id \"916eed261329c15f9e80a837bf103f3fe24575c24c663c29dc387693115a57bb\"" Sep 12 17:45:29.406852 containerd[2001]: time="2025-09-12T17:45:29.406750802Z" level=info msg="StartContainer for \"916eed261329c15f9e80a837bf103f3fe24575c24c663c29dc387693115a57bb\"" Sep 12 17:45:29.407827 containerd[2001]: time="2025-09-12T17:45:29.407788366Z" level=info msg="connecting to shim 916eed261329c15f9e80a837bf103f3fe24575c24c663c29dc387693115a57bb" address="unix:///run/containerd/s/29ef1b7b9b9389a33f3208054def658d895ad217fe66c3676755cfe17f52ef61" protocol=ttrpc version=3 Sep 12 17:45:29.432033 systemd[1]: Started cri-containerd-916eed261329c15f9e80a837bf103f3fe24575c24c663c29dc387693115a57bb.scope - libcontainer container 916eed261329c15f9e80a837bf103f3fe24575c24c663c29dc387693115a57bb. Sep 12 17:45:29.469837 containerd[2001]: time="2025-09-12T17:45:29.469799115Z" level=info msg="StartContainer for \"916eed261329c15f9e80a837bf103f3fe24575c24c663c29dc387693115a57bb\" returns successfully" Sep 12 17:45:30.249118 containerd[2001]: time="2025-09-12T17:45:30.247981313Z" level=info msg="TaskExit event in podsandbox handler container_id:\"66cf151bffc1dd97b3e77ad82cb94f5f9cf1a0688d109a9c513558d80a13ff0f\" id:\"3237499f8893f364962b8747203233397d6e91f477a745e844b948eb0e17ecab\" pid:6880 exited_at:{seconds:1757699130 nanos:246981806}"