Aug 19 08:15:18.901910 kernel: Linux version 6.12.41-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Mon Aug 18 22:19:37 -00 2025
Aug 19 08:15:18.901947 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=ec2 modprobe.blacklist=xen_fbfront net.ifnames=0 nvme_core.io_timeout=4294967295 verity.usrhash=cc23dd01793203541561c15ffc568736bb5dae0d652141296dd11bf777bdf42f
Aug 19 08:15:18.901962 kernel: BIOS-provided physical RAM map:
Aug 19 08:15:18.901972 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable
Aug 19 08:15:18.901981 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000786cdfff] usable
Aug 19 08:15:18.901990 kernel: BIOS-e820: [mem 0x00000000786ce000-0x000000007894dfff] reserved
Aug 19 08:15:18.902002 kernel: BIOS-e820: [mem 0x000000007894e000-0x000000007895dfff] ACPI data
Aug 19 08:15:18.902011 kernel: BIOS-e820: [mem 0x000000007895e000-0x00000000789ddfff] ACPI NVS
Aug 19 08:15:18.902024 kernel: BIOS-e820: [mem 0x00000000789de000-0x000000007c97bfff] usable
Aug 19 08:15:18.902033 kernel: BIOS-e820: [mem 0x000000007c97c000-0x000000007c9fffff] reserved
Aug 19 08:15:18.902043 kernel: NX (Execute Disable) protection: active
Aug 19 08:15:18.902053 kernel: APIC: Static calls initialized
Aug 19 08:15:18.902062 kernel: e820: update [mem 0x768c0018-0x768c8e57] usable ==> usable
Aug 19 08:15:18.902073 kernel: extended physical RAM map:
Aug 19 08:15:18.902088 kernel: reserve setup_data: [mem 0x0000000000000000-0x000000000009ffff] usable
Aug 19 08:15:18.902098 kernel: reserve setup_data: [mem 0x0000000000100000-0x00000000768c0017] usable
Aug 19 08:15:18.902109 kernel: reserve setup_data: [mem 0x00000000768c0018-0x00000000768c8e57] usable
Aug 19 08:15:18.902120 kernel: reserve setup_data: [mem 0x00000000768c8e58-0x00000000786cdfff] usable
Aug 19 08:15:18.902139 kernel: reserve setup_data: [mem 0x00000000786ce000-0x000000007894dfff] reserved
Aug 19 08:15:18.902151 kernel: reserve setup_data: [mem 0x000000007894e000-0x000000007895dfff] ACPI data
Aug 19 08:15:18.902162 kernel: reserve setup_data: [mem 0x000000007895e000-0x00000000789ddfff] ACPI NVS
Aug 19 08:15:18.902175 kernel: reserve setup_data: [mem 0x00000000789de000-0x000000007c97bfff] usable
Aug 19 08:15:18.902185 kernel: reserve setup_data: [mem 0x000000007c97c000-0x000000007c9fffff] reserved
Aug 19 08:15:18.902197 kernel: efi: EFI v2.7 by EDK II
Aug 19 08:15:18.902213 kernel: efi: SMBIOS=0x7886a000 ACPI=0x7895d000 ACPI 2.0=0x7895d014 MEMATTR=0x77003518
Aug 19 08:15:18.902225 kernel: secureboot: Secure boot disabled
Aug 19 08:15:18.902237 kernel: SMBIOS 2.7 present.
Aug 19 08:15:18.902250 kernel: DMI: Amazon EC2 t3.small/, BIOS 1.0 10/16/2017
Aug 19 08:15:18.902262 kernel: DMI: Memory slots populated: 1/1
Aug 19 08:15:18.902286 kernel: Hypervisor detected: KVM
Aug 19 08:15:18.902299 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Aug 19 08:15:18.902311 kernel: kvm-clock: using sched offset of 4801701403 cycles
Aug 19 08:15:18.902324 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Aug 19 08:15:18.902337 kernel: tsc: Detected 2499.998 MHz processor
Aug 19 08:15:18.902349 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Aug 19 08:15:18.902364 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Aug 19 08:15:18.902377 kernel: last_pfn = 0x7c97c max_arch_pfn = 0x400000000
Aug 19 08:15:18.902390 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs
Aug 19 08:15:18.902402 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Aug 19 08:15:18.902414 kernel: Using GB pages for direct mapping
Aug 19 08:15:18.902432 kernel: ACPI: Early table checksum verification disabled
Aug 19 08:15:18.902448 kernel: ACPI: RSDP 0x000000007895D014 000024 (v02 AMAZON)
Aug 19 08:15:18.902461 kernel: ACPI: XSDT 0x000000007895C0E8 00006C (v01 AMAZON AMZNFACP 00000001 01000013)
Aug 19 08:15:18.902474 kernel: ACPI: FACP 0x0000000078955000 000114 (v01 AMAZON AMZNFACP 00000001 AMZN 00000001)
Aug 19 08:15:18.902488 kernel: ACPI: DSDT 0x0000000078956000 00115A (v01 AMAZON AMZNDSDT 00000001 AMZN 00000001)
Aug 19 08:15:18.902501 kernel: ACPI: FACS 0x00000000789D0000 000040
Aug 19 08:15:18.902514 kernel: ACPI: WAET 0x000000007895B000 000028 (v01 AMAZON AMZNWAET 00000001 AMZN 00000001)
Aug 19 08:15:18.902527 kernel: ACPI: SLIT 0x000000007895A000 00006C (v01 AMAZON AMZNSLIT 00000001 AMZN 00000001)
Aug 19 08:15:18.902540 kernel: ACPI: APIC 0x0000000078959000 000076 (v01 AMAZON AMZNAPIC 00000001 AMZN 00000001)
Aug 19 08:15:18.902556 kernel: ACPI: SRAT 0x0000000078958000 0000A0 (v01 AMAZON AMZNSRAT 00000001 AMZN 00000001)
Aug 19 08:15:18.902570 kernel: ACPI: HPET 0x0000000078954000 000038 (v01 AMAZON AMZNHPET 00000001 AMZN 00000001)
Aug 19 08:15:18.902582 kernel: ACPI: SSDT 0x0000000078953000 000759 (v01 AMAZON AMZNSSDT 00000001 AMZN 00000001)
Aug 19 08:15:18.902595 kernel: ACPI: SSDT 0x0000000078952000 00007F (v01 AMAZON AMZNSSDT 00000001 AMZN 00000001)
Aug 19 08:15:18.902608 kernel: ACPI: BGRT 0x0000000078951000 000038 (v01 AMAZON AMAZON 00000002 01000013)
Aug 19 08:15:18.902621 kernel: ACPI: Reserving FACP table memory at [mem 0x78955000-0x78955113]
Aug 19 08:15:18.902635 kernel: ACPI: Reserving DSDT table memory at [mem 0x78956000-0x78957159]
Aug 19 08:15:18.902647 kernel: ACPI: Reserving FACS table memory at [mem 0x789d0000-0x789d003f]
Aug 19 08:15:18.902663 kernel: ACPI: Reserving WAET table memory at [mem 0x7895b000-0x7895b027]
Aug 19 08:15:18.902677 kernel: ACPI: Reserving SLIT table memory at [mem 0x7895a000-0x7895a06b]
Aug 19 08:15:18.902690 kernel: ACPI: Reserving APIC table memory at [mem 0x78959000-0x78959075]
Aug 19 08:15:18.902703 kernel: ACPI: Reserving SRAT table memory at [mem 0x78958000-0x7895809f]
Aug 19 08:15:18.902717 kernel: ACPI: Reserving HPET table memory at [mem 0x78954000-0x78954037]
Aug 19 08:15:18.902730 kernel: ACPI: Reserving SSDT table memory at [mem 0x78953000-0x78953758]
Aug 19 08:15:18.902742 kernel: ACPI: Reserving SSDT table memory at [mem 0x78952000-0x7895207e]
Aug 19 08:15:18.902755 kernel: ACPI: Reserving BGRT table memory at [mem 0x78951000-0x78951037]
Aug 19 08:15:18.902768 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x7fffffff]
Aug 19 08:15:18.902780 kernel: NUMA: Initialized distance table, cnt=1
Aug 19 08:15:18.902796 kernel: NODE_DATA(0) allocated [mem 0x7a8eddc0-0x7a8f4fff]
Aug 19 08:15:18.902809 kernel: Zone ranges:
Aug 19 08:15:18.902822 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Aug 19 08:15:18.902835 kernel: DMA32 [mem 0x0000000001000000-0x000000007c97bfff]
Aug 19 08:15:18.902848 kernel: Normal empty
Aug 19 08:15:18.902860 kernel: Device empty
Aug 19 08:15:18.902873 kernel: Movable zone start for each node
Aug 19 08:15:18.902885 kernel: Early memory node ranges
Aug 19 08:15:18.902898 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff]
Aug 19 08:15:18.902913 kernel: node 0: [mem 0x0000000000100000-0x00000000786cdfff]
Aug 19 08:15:18.902926 kernel: node 0: [mem 0x00000000789de000-0x000000007c97bfff]
Aug 19 08:15:18.902939 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007c97bfff]
Aug 19 08:15:18.902953 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Aug 19 08:15:18.902966 kernel: On node 0, zone DMA: 96 pages in unavailable ranges
Aug 19 08:15:18.902979 kernel: On node 0, zone DMA32: 784 pages in unavailable ranges
Aug 19 08:15:18.902992 kernel: On node 0, zone DMA32: 13956 pages in unavailable ranges
Aug 19 08:15:18.903005 kernel: ACPI: PM-Timer IO Port: 0xb008
Aug 19 08:15:18.903019 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Aug 19 08:15:18.903034 kernel: IOAPIC[0]: apic_id 0, version 32, address 0xfec00000, GSI 0-23
Aug 19 08:15:18.903047 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Aug 19 08:15:18.903060 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Aug 19 08:15:18.903073 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Aug 19 08:15:18.903086 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Aug 19 08:15:18.903099 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Aug 19 08:15:18.903112 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Aug 19 08:15:18.903125 kernel: TSC deadline timer available
Aug 19 08:15:18.903137 kernel: CPU topo: Max. logical packages: 1
Aug 19 08:15:18.903150 kernel: CPU topo: Max. logical dies: 1
Aug 19 08:15:18.903167 kernel: CPU topo: Max. dies per package: 1
Aug 19 08:15:18.903179 kernel: CPU topo: Max. threads per core: 2
Aug 19 08:15:18.903192 kernel: CPU topo: Num. cores per package: 1
Aug 19 08:15:18.903205 kernel: CPU topo: Num. threads per package: 2
Aug 19 08:15:18.903218 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs
Aug 19 08:15:18.903231 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Aug 19 08:15:18.903244 kernel: [mem 0x7ca00000-0xffffffff] available for PCI devices
Aug 19 08:15:18.903257 kernel: Booting paravirtualized kernel on KVM
Aug 19 08:15:18.903270 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Aug 19 08:15:18.905310 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Aug 19 08:15:18.905335 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576
Aug 19 08:15:18.905350 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152
Aug 19 08:15:18.905364 kernel: pcpu-alloc: [0] 0 1
Aug 19 08:15:18.905378 kernel: kvm-guest: PV spinlocks enabled
Aug 19 08:15:18.905392 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Aug 19 08:15:18.905410 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=ec2 modprobe.blacklist=xen_fbfront net.ifnames=0 nvme_core.io_timeout=4294967295 verity.usrhash=cc23dd01793203541561c15ffc568736bb5dae0d652141296dd11bf777bdf42f
Aug 19 08:15:18.905425 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Aug 19 08:15:18.905444 kernel: random: crng init done
Aug 19 08:15:18.905458 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Aug 19 08:15:18.905470 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Aug 19 08:15:18.905483 kernel: Fallback order for Node 0: 0
Aug 19 08:15:18.905494 kernel: Built 1 zonelists, mobility grouping on. Total pages: 509451
Aug 19 08:15:18.905506 kernel: Policy zone: DMA32
Aug 19 08:15:18.905533 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Aug 19 08:15:18.905547 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Aug 19 08:15:18.905560 kernel: Kernel/User page tables isolation: enabled
Aug 19 08:15:18.905572 kernel: ftrace: allocating 40101 entries in 157 pages
Aug 19 08:15:18.905585 kernel: ftrace: allocated 157 pages with 5 groups
Aug 19 08:15:18.905601 kernel: Dynamic Preempt: voluntary
Aug 19 08:15:18.905614 kernel: rcu: Preemptible hierarchical RCU implementation.
Aug 19 08:15:18.905634 kernel: rcu: RCU event tracing is enabled.
Aug 19 08:15:18.905649 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Aug 19 08:15:18.905661 kernel: Trampoline variant of Tasks RCU enabled.
Aug 19 08:15:18.905676 kernel: Rude variant of Tasks RCU enabled.
Aug 19 08:15:18.905694 kernel: Tracing variant of Tasks RCU enabled.
Aug 19 08:15:18.905709 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Aug 19 08:15:18.905725 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Aug 19 08:15:18.905741 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Aug 19 08:15:18.905753 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Aug 19 08:15:18.905768 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Aug 19 08:15:18.905784 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Aug 19 08:15:18.905799 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Aug 19 08:15:18.905817 kernel: Console: colour dummy device 80x25
Aug 19 08:15:18.905831 kernel: printk: legacy console [tty0] enabled
Aug 19 08:15:18.905844 kernel: printk: legacy console [ttyS0] enabled
Aug 19 08:15:18.905859 kernel: ACPI: Core revision 20240827
Aug 19 08:15:18.905874 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 30580167144 ns
Aug 19 08:15:18.905889 kernel: APIC: Switch to symmetric I/O mode setup
Aug 19 08:15:18.905903 kernel: x2apic enabled
Aug 19 08:15:18.905919 kernel: APIC: Switched APIC routing to: physical x2apic
Aug 19 08:15:18.905936 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x240937b9988, max_idle_ns: 440795218083 ns
Aug 19 08:15:18.905955 kernel: Calibrating delay loop (skipped) preset value.. 4999.99 BogoMIPS (lpj=2499998)
Aug 19 08:15:18.905968 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8
Aug 19 08:15:18.905982 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4
Aug 19 08:15:18.905999 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Aug 19 08:15:18.906012 kernel: Spectre V2 : Mitigation: Retpolines
Aug 19 08:15:18.906026 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Aug 19 08:15:18.906041 kernel: RETBleed: WARNING: Spectre v2 mitigation leaves CPU vulnerable to RETBleed attacks, data leaks possible!
Aug 19 08:15:18.906055 kernel: RETBleed: Vulnerable
Aug 19 08:15:18.906069 kernel: Speculative Store Bypass: Vulnerable
Aug 19 08:15:18.906083 kernel: MDS: Vulnerable: Clear CPU buffers attempted, no microcode
Aug 19 08:15:18.906099 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode
Aug 19 08:15:18.906118 kernel: GDS: Unknown: Dependent on hypervisor status
Aug 19 08:15:18.906145 kernel: ITS: Mitigation: Aligned branch/return thunks
Aug 19 08:15:18.906162 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Aug 19 08:15:18.906177 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Aug 19 08:15:18.906194 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Aug 19 08:15:18.906211 kernel: x86/fpu: Supporting XSAVE feature 0x008: 'MPX bounds registers'
Aug 19 08:15:18.906226 kernel: x86/fpu: Supporting XSAVE feature 0x010: 'MPX CSR'
Aug 19 08:15:18.906243 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask'
Aug 19 08:15:18.906259 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256'
Aug 19 08:15:18.906298 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256'
Aug 19 08:15:18.906315 kernel: x86/fpu: Supporting XSAVE feature 0x200: 'Protection Keys User registers'
Aug 19 08:15:18.906328 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Aug 19 08:15:18.906341 kernel: x86/fpu: xstate_offset[3]: 832, xstate_sizes[3]: 64
Aug 19 08:15:18.906353 kernel: x86/fpu: xstate_offset[4]: 896, xstate_sizes[4]: 64
Aug 19 08:15:18.906368 kernel: x86/fpu: xstate_offset[5]: 960, xstate_sizes[5]: 64
Aug 19 08:15:18.906382 kernel: x86/fpu: xstate_offset[6]: 1024, xstate_sizes[6]: 512
Aug 19 08:15:18.906397 kernel: x86/fpu: xstate_offset[7]: 1536, xstate_sizes[7]: 1024
Aug 19 08:15:18.906412 kernel: x86/fpu: xstate_offset[9]: 2560, xstate_sizes[9]: 8
Aug 19 08:15:18.906427 kernel: x86/fpu: Enabled xstate features 0x2ff, context size is 2568 bytes, using 'compacted' format.
Aug 19 08:15:18.906442 kernel: Freeing SMP alternatives memory: 32K
Aug 19 08:15:18.906457 kernel: pid_max: default: 32768 minimum: 301
Aug 19 08:15:18.906472 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Aug 19 08:15:18.906490 kernel: landlock: Up and running.
Aug 19 08:15:18.906505 kernel: SELinux: Initializing.
Aug 19 08:15:18.906521 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Aug 19 08:15:18.906536 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Aug 19 08:15:18.906551 kernel: smpboot: CPU0: Intel(R) Xeon(R) Platinum 8259CL CPU @ 2.50GHz (family: 0x6, model: 0x55, stepping: 0x7)
Aug 19 08:15:18.906566 kernel: Performance Events: unsupported p6 CPU model 85 no PMU driver, software events only.
Aug 19 08:15:18.906581 kernel: signal: max sigframe size: 3632
Aug 19 08:15:18.906597 kernel: rcu: Hierarchical SRCU implementation.
Aug 19 08:15:18.906612 kernel: rcu: Max phase no-delay instances is 400.
Aug 19 08:15:18.906627 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Aug 19 08:15:18.906645 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Aug 19 08:15:18.906660 kernel: smp: Bringing up secondary CPUs ...
Aug 19 08:15:18.906676 kernel: smpboot: x86: Booting SMP configuration:
Aug 19 08:15:18.906691 kernel: .... node #0, CPUs: #1
Aug 19 08:15:18.906707 kernel: Transient Scheduler Attacks: MDS CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/mds.html for more details.
Aug 19 08:15:18.906724 kernel: Transient Scheduler Attacks: MMIO Stale Data CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/processor_mmio_stale_data.html for more details.
Aug 19 08:15:18.906739 kernel: smp: Brought up 1 node, 2 CPUs
Aug 19 08:15:18.906754 kernel: smpboot: Total of 2 processors activated (9999.99 BogoMIPS)
Aug 19 08:15:18.906772 kernel: Memory: 1908052K/2037804K available (14336K kernel code, 2430K rwdata, 9960K rodata, 54040K init, 2928K bss, 125188K reserved, 0K cma-reserved)
Aug 19 08:15:18.906788 kernel: devtmpfs: initialized
Aug 19 08:15:18.906803 kernel: x86/mm: Memory block size: 128MB
Aug 19 08:15:18.906818 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7895e000-0x789ddfff] (524288 bytes)
Aug 19 08:15:18.906834 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Aug 19 08:15:18.906849 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Aug 19 08:15:18.906864 kernel: pinctrl core: initialized pinctrl subsystem
Aug 19 08:15:18.906879 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Aug 19 08:15:18.906894 kernel: audit: initializing netlink subsys (disabled)
Aug 19 08:15:18.906912 kernel: audit: type=2000 audit(1755591316.493:1): state=initialized audit_enabled=0 res=1
Aug 19 08:15:18.906924 kernel: thermal_sys: Registered thermal governor 'step_wise'
Aug 19 08:15:18.906937 kernel: thermal_sys: Registered thermal governor 'user_space'
Aug 19 08:15:18.906950 kernel: cpuidle: using governor menu
Aug 19 08:15:18.906964 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Aug 19 08:15:18.906978 kernel: dca service started, version 1.12.1
Aug 19 08:15:18.906993 kernel: PCI: Using configuration type 1 for base access
Aug 19 08:15:18.907007 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Aug 19 08:15:18.907021 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Aug 19 08:15:18.907038 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Aug 19 08:15:18.907052 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Aug 19 08:15:18.907066 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Aug 19 08:15:18.907080 kernel: ACPI: Added _OSI(Module Device)
Aug 19 08:15:18.907094 kernel: ACPI: Added _OSI(Processor Device)
Aug 19 08:15:18.907109 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Aug 19 08:15:18.907123 kernel: ACPI: 3 ACPI AML tables successfully acquired and loaded
Aug 19 08:15:18.907137 kernel: ACPI: Interpreter enabled
Aug 19 08:15:18.907151 kernel: ACPI: PM: (supports S0 S5)
Aug 19 08:15:18.907169 kernel: ACPI: Using IOAPIC for interrupt routing
Aug 19 08:15:18.907184 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Aug 19 08:15:18.907197 kernel: PCI: Using E820 reservations for host bridge windows
Aug 19 08:15:18.907212 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Aug 19 08:15:18.907226 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Aug 19 08:15:18.909496 kernel: acpi PNP0A03:00: _OSC: OS supports [ASPM ClockPM Segments MSI HPX-Type3]
Aug 19 08:15:18.909655 kernel: acpi PNP0A03:00: _OSC: not requesting OS control; OS requires [ExtendedConfig ASPM ClockPM MSI]
Aug 19 08:15:18.909793 kernel: acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended configuration space under this bridge
Aug 19 08:15:18.909819 kernel: acpiphp: Slot [3] registered
Aug 19 08:15:18.909835 kernel: acpiphp: Slot [4] registered
Aug 19 08:15:18.909850 kernel: acpiphp: Slot [5] registered
Aug 19 08:15:18.909865 kernel: acpiphp: Slot [6] registered
Aug 19 08:15:18.909881 kernel: acpiphp: Slot [7] registered
Aug 19 08:15:18.909896 kernel: acpiphp: Slot [8] registered
Aug 19 08:15:18.909911 kernel: acpiphp: Slot [9] registered
Aug 19 08:15:18.909925 kernel: acpiphp: Slot [10] registered
Aug 19 08:15:18.909940 kernel: acpiphp: Slot [11] registered
Aug 19 08:15:18.909959 kernel: acpiphp: Slot [12] registered
Aug 19 08:15:18.909975 kernel: acpiphp: Slot [13] registered
Aug 19 08:15:18.909992 kernel: acpiphp: Slot [14] registered
Aug 19 08:15:18.910009 kernel: acpiphp: Slot [15] registered
Aug 19 08:15:18.910026 kernel: acpiphp: Slot [16] registered
Aug 19 08:15:18.910043 kernel: acpiphp: Slot [17] registered
Aug 19 08:15:18.910059 kernel: acpiphp: Slot [18] registered
Aug 19 08:15:18.910076 kernel: acpiphp: Slot [19] registered
Aug 19 08:15:18.910092 kernel: acpiphp: Slot [20] registered
Aug 19 08:15:18.910113 kernel: acpiphp: Slot [21] registered
Aug 19 08:15:18.910140 kernel: acpiphp: Slot [22] registered
Aug 19 08:15:18.910158 kernel: acpiphp: Slot [23] registered
Aug 19 08:15:18.910174 kernel: acpiphp: Slot [24] registered
Aug 19 08:15:18.910190 kernel: acpiphp: Slot [25] registered
Aug 19 08:15:18.910206 kernel: acpiphp: Slot [26] registered
Aug 19 08:15:18.910222 kernel: acpiphp: Slot [27] registered
Aug 19 08:15:18.910239 kernel: acpiphp: Slot [28] registered
Aug 19 08:15:18.910255 kernel: acpiphp: Slot [29] registered
Aug 19 08:15:18.910272 kernel: acpiphp: Slot [30] registered
Aug 19 08:15:18.910302 kernel: acpiphp: Slot [31] registered
Aug 19 08:15:18.910315 kernel: PCI host bridge to bus 0000:00
Aug 19 08:15:18.910479 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Aug 19 08:15:18.910602 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Aug 19 08:15:18.910719 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Aug 19 08:15:18.910831 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xfebfffff window]
Aug 19 08:15:18.910944 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x2000ffffffff window]
Aug 19 08:15:18.911076 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Aug 19 08:15:18.911225 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Aug 19 08:15:18.913437 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Aug 19 08:15:18.913598 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x000000 conventional PCI endpoint
Aug 19 08:15:18.913737 kernel: pci 0000:00:01.3: quirk: [io 0xb000-0xb03f] claimed by PIIX4 ACPI
Aug 19 08:15:18.913872 kernel: pci 0000:00:01.3: PIIX4 devres E PIO at fff0-ffff
Aug 19 08:15:18.914010 kernel: pci 0000:00:01.3: PIIX4 devres F MMIO at ffc00000-ffffffff
Aug 19 08:15:18.914155 kernel: pci 0000:00:01.3: PIIX4 devres G PIO at fff0-ffff
Aug 19 08:15:18.914309 kernel: pci 0000:00:01.3: PIIX4 devres H MMIO at ffc00000-ffffffff
Aug 19 08:15:18.914444 kernel: pci 0000:00:01.3: PIIX4 devres I PIO at fff0-ffff
Aug 19 08:15:18.914573 kernel: pci 0000:00:01.3: PIIX4 devres J PIO at fff0-ffff
Aug 19 08:15:18.914717 kernel: pci 0000:00:03.0: [1d0f:1111] type 00 class 0x030000 conventional PCI endpoint
Aug 19 08:15:18.914847 kernel: pci 0000:00:03.0: BAR 0 [mem 0x80000000-0x803fffff pref]
Aug 19 08:15:18.914978 kernel: pci 0000:00:03.0: ROM [mem 0xffff0000-0xffffffff pref]
Aug 19 08:15:18.915101 kernel: pci 0000:00:03.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Aug 19 08:15:18.915232 kernel: pci 0000:00:04.0: [1d0f:8061] type 00 class 0x010802 PCIe Endpoint
Aug 19 08:15:18.917424 kernel: pci 0000:00:04.0: BAR 0 [mem 0x80404000-0x80407fff]
Aug 19 08:15:18.917573 kernel: pci 0000:00:05.0: [1d0f:ec20] type 00 class 0x020000 PCIe Endpoint
Aug 19 08:15:18.917716 kernel: pci 0000:00:05.0: BAR 0 [mem 0x80400000-0x80403fff]
Aug 19 08:15:18.917743 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Aug 19 08:15:18.917758 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Aug 19 08:15:18.917773 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Aug 19 08:15:18.917789 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Aug 19 08:15:18.917804 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Aug 19 08:15:18.917819 kernel: iommu: Default domain type: Translated
Aug 19 08:15:18.917834 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Aug 19 08:15:18.917849 kernel: efivars: Registered efivars operations
Aug 19 08:15:18.917863 kernel: PCI: Using ACPI for IRQ routing
Aug 19 08:15:18.917881 kernel: PCI: pci_cache_line_size set to 64 bytes
Aug 19 08:15:18.917895 kernel: e820: reserve RAM buffer [mem 0x768c0018-0x77ffffff]
Aug 19 08:15:18.917909 kernel: e820: reserve RAM buffer [mem 0x786ce000-0x7bffffff]
Aug 19 08:15:18.917924 kernel: e820: reserve RAM buffer [mem 0x7c97c000-0x7fffffff]
Aug 19 08:15:18.918068 kernel: pci 0000:00:03.0: vgaarb: setting as boot VGA device
Aug 19 08:15:18.918218 kernel: pci 0000:00:03.0: vgaarb: bridge control possible
Aug 19 08:15:18.918377 kernel: pci 0000:00:03.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Aug 19 08:15:18.918398 kernel: vgaarb: loaded
Aug 19 08:15:18.918412 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0
Aug 19 08:15:18.918431 kernel: hpet0: 8 comparators, 32-bit 62.500000 MHz counter
Aug 19 08:15:18.918445 kernel: clocksource: Switched to clocksource kvm-clock
Aug 19 08:15:18.918458 kernel: VFS: Disk quotas dquot_6.6.0
Aug 19 08:15:18.918473 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Aug 19 08:15:18.918487 kernel: pnp: PnP ACPI init
Aug 19 08:15:18.918501 kernel: pnp: PnP ACPI: found 5 devices
Aug 19 08:15:18.918515 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Aug 19 08:15:18.918528 kernel: NET: Registered PF_INET protocol family
Aug 19 08:15:18.918542 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Aug 19 08:15:18.918560 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Aug 19 08:15:18.918573 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Aug 19 08:15:18.918587 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Aug 19 08:15:18.918601 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Aug 19 08:15:18.918615 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Aug 19 08:15:18.918631 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Aug 19 08:15:18.918643 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Aug 19 08:15:18.918658 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Aug 19 08:15:18.918675 kernel: NET: Registered PF_XDP protocol family
Aug 19 08:15:18.918820 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Aug 19 08:15:18.918953 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Aug 19 08:15:18.919081 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Aug 19 08:15:18.919202 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xfebfffff window]
Aug 19 08:15:18.924607 kernel: pci_bus 0000:00: resource 8 [mem 0x100000000-0x2000ffffffff window]
Aug 19 08:15:18.924790 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Aug 19 08:15:18.924815 kernel: PCI: CLS 0 bytes, default 64
Aug 19 08:15:18.924838 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer
Aug 19 08:15:18.924854 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x240937b9988, max_idle_ns: 440795218083 ns
Aug 19 08:15:18.924867 kernel: clocksource: Switched to clocksource tsc
Aug 19 08:15:18.924879 kernel: Initialise system trusted keyrings
Aug 19 08:15:18.924892 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0
Aug 19 08:15:18.924905 kernel: Key type asymmetric registered
Aug 19 08:15:18.924917 kernel: Asymmetric key parser 'x509' registered
Aug 19 08:15:18.924931 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
Aug 19 08:15:18.924946 kernel: io scheduler mq-deadline registered
Aug 19 08:15:18.924964 kernel: io scheduler kyber registered
Aug 19 08:15:18.924978 kernel: io scheduler bfq registered
Aug 19 08:15:18.924993 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Aug 19 08:15:18.925007 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Aug 19 08:15:18.925022 kernel: 00:04: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Aug 19 08:15:18.925037 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Aug 19 08:15:18.925052 kernel: i8042: Warning: Keylock active
Aug 19 08:15:18.925068 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Aug 19 08:15:18.925082 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Aug 19 08:15:18.925248 kernel: rtc_cmos 00:00: RTC can wake from S4
Aug 19 08:15:18.925413 kernel: rtc_cmos 00:00: registered as rtc0
Aug 19 08:15:18.925546 kernel: rtc_cmos 00:00: setting system clock to 2025-08-19T08:15:18 UTC (1755591318)
Aug 19 08:15:18.925674 kernel: rtc_cmos 00:00: alarms up to one day, 114 bytes nvram
Aug 19 08:15:18.925695 kernel: intel_pstate: CPU model not supported
Aug 19 08:15:18.925740 kernel: efifb: probing for efifb
Aug 19 08:15:18.925761 kernel: efifb: framebuffer at 0x80000000, using 1876k, total 1875k
Aug 19 08:15:18.925778 kernel: efifb: mode is 800x600x32, linelength=3200, pages=1
Aug 19 08:15:18.925798 kernel: efifb: scrolling: redraw
Aug 19 08:15:18.925816 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0
Aug 19 08:15:18.925831 kernel: Console: switching to colour frame buffer device 100x37
Aug 19 08:15:18.925846 kernel: fb0: EFI VGA frame buffer device
Aug 19 08:15:18.925861 kernel: pstore: Using crash dump compression: deflate
Aug 19 08:15:18.925878 kernel: pstore: Registered efi_pstore as persistent store backend
Aug 19 08:15:18.925893 kernel: NET: Registered PF_INET6 protocol family
Aug 19 08:15:18.925908 kernel: Segment Routing with IPv6
Aug 19 08:15:18.925928 kernel: In-situ OAM (IOAM) with IPv6
Aug 19 08:15:18.925946 kernel: NET: Registered PF_PACKET protocol family
Aug 19 08:15:18.925970 kernel: Key type dns_resolver registered
Aug 19 08:15:18.925985 kernel: IPI shorthand broadcast: enabled
Aug 19 08:15:18.926001 kernel: sched_clock: Marking stable (2635002625, 143336883)->(2864934297, -86594789)
Aug 19 08:15:18.926017 kernel: registered taskstats version 1
Aug 19 08:15:18.926033 kernel: Loading compiled-in X.509 certificates
Aug 19 08:15:18.926049 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.41-flatcar: 93a065b103c00d4b81cc5822e4e7f9674e63afaf'
Aug 19 08:15:18.926064 kernel: Demotion targets for Node 0: null
Aug 19 08:15:18.926080 kernel: Key type .fscrypt registered
Aug 19 08:15:18.926099 kernel: Key type fscrypt-provisioning registered
Aug 19 08:15:18.926115 kernel: ima: No TPM chip found, activating TPM-bypass!
Aug 19 08:15:18.926141 kernel: ima: Allocated hash algorithm: sha1
Aug 19 08:15:18.926157 kernel: ima: No architecture policies found
Aug 19 08:15:18.926173 kernel: clk: Disabling unused clocks
Aug 19 08:15:18.926189 kernel: Warning: unable to open an initial console.
Aug 19 08:15:18.926205 kernel: Freeing unused kernel image (initmem) memory: 54040K
Aug 19 08:15:18.926221 kernel: Write protecting the kernel read-only data: 24576k
Aug 19 08:15:18.926236 kernel: Freeing unused kernel image (rodata/data gap) memory: 280K
Aug 19 08:15:18.926256 kernel: Run /init as init process
Aug 19 08:15:18.926286 kernel: with arguments:
Aug 19 08:15:18.926302 kernel: /init
Aug 19 08:15:18.926317 kernel: with environment:
Aug 19 08:15:18.926330 kernel: HOME=/
Aug 19 08:15:18.926347 kernel: TERM=linux
Aug 19 08:15:18.926366 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Aug 19 08:15:18.926383 systemd[1]: Successfully made /usr/ read-only.
Aug 19 08:15:18.926405 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Aug 19 08:15:18.926422 systemd[1]: Detected virtualization amazon.
Aug 19 08:15:18.926438 systemd[1]: Detected architecture x86-64.
Aug 19 08:15:18.926453 systemd[1]: Running in initrd.
Aug 19 08:15:18.926470 systemd[1]: No hostname configured, using default hostname.
Aug 19 08:15:18.926489 systemd[1]: Hostname set to .
Aug 19 08:15:18.926505 systemd[1]: Initializing machine ID from VM UUID.
Aug 19 08:15:18.926522 systemd[1]: Queued start job for default target initrd.target.
Aug 19 08:15:18.926538 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Aug 19 08:15:18.926555 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Aug 19 08:15:18.926573 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Aug 19 08:15:18.926589 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Aug 19 08:15:18.926606 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Aug 19 08:15:18.926627 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Aug 19 08:15:18.926645 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Aug 19 08:15:18.926662 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Aug 19 08:15:18.926678 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Aug 19 08:15:18.926695 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Aug 19 08:15:18.926712 systemd[1]: Reached target paths.target - Path Units.
Aug 19 08:15:18.926731 systemd[1]: Reached target slices.target - Slice Units.
Aug 19 08:15:18.926747 systemd[1]: Reached target swap.target - Swaps.
Aug 19 08:15:18.926764 systemd[1]: Reached target timers.target - Timer Units.
Aug 19 08:15:18.926779 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Aug 19 08:15:18.926796 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Aug 19 08:15:18.926814 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Aug 19 08:15:18.926829 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Aug 19 08:15:18.926848 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Aug 19 08:15:18.926865 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Aug 19 08:15:18.926882 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Aug 19 08:15:18.926897 systemd[1]: Reached target sockets.target - Socket Units.
Aug 19 08:15:18.926913 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Aug 19 08:15:18.926928 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Aug 19 08:15:18.926943 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Aug 19 08:15:18.926959 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Aug 19 08:15:18.926975 systemd[1]: Starting systemd-fsck-usr.service...
Aug 19 08:15:18.926990 systemd[1]: Starting systemd-journald.service - Journal Service...
Aug 19 08:15:18.927008 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Aug 19 08:15:18.927024 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Aug 19 08:15:18.927072 systemd-journald[208]: Collecting audit messages is disabled.
Aug 19 08:15:18.927109 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Aug 19 08:15:18.927130 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Aug 19 08:15:18.927147 systemd[1]: Finished systemd-fsck-usr.service.
Aug 19 08:15:18.927164 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Aug 19 08:15:18.927183 systemd-journald[208]: Journal started
Aug 19 08:15:18.927220 systemd-journald[208]: Runtime Journal (/run/log/journal/ec25160a8d04a513e38d0c6ac0a7bd2f) is 4.8M, max 38.4M, 33.6M free.
Aug 19 08:15:18.933379 systemd[1]: Started systemd-journald.service - Journal Service.
Aug 19 08:15:18.933436 systemd-modules-load[210]: Inserted module 'overlay'
Aug 19 08:15:18.943430 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Aug 19 08:15:18.950619 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Aug 19 08:15:18.958437 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Aug 19 08:15:18.969893 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Aug 19 08:15:18.973228 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Aug 19 08:15:18.983476 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Aug 19 08:15:18.991319 kernel: Bridge firewalling registered
Aug 19 08:15:18.991368 systemd-modules-load[210]: Inserted module 'br_netfilter'
Aug 19 08:15:18.992737 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Aug 19 08:15:18.998011 systemd-tmpfiles[221]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Aug 19 08:15:19.004182 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Aug 19 08:15:19.002676 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Aug 19 08:15:19.005035 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Aug 19 08:15:19.007980 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Aug 19 08:15:19.017837 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Aug 19 08:15:19.021475 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Aug 19 08:15:19.025663 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Aug 19 08:15:19.035479 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Aug 19 08:15:19.052102 dracut-cmdline[244]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=ec2 modprobe.blacklist=xen_fbfront net.ifnames=0 nvme_core.io_timeout=4294967295 verity.usrhash=cc23dd01793203541561c15ffc568736bb5dae0d652141296dd11bf777bdf42f
Aug 19 08:15:19.094784 systemd-resolved[247]: Positive Trust Anchors:
Aug 19 08:15:19.094804 systemd-resolved[247]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Aug 19 08:15:19.094864 systemd-resolved[247]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Aug 19 08:15:19.103008 systemd-resolved[247]: Defaulting to hostname 'linux'.
Aug 19 08:15:19.106012 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Aug 19 08:15:19.106733 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Aug 19 08:15:19.150314 kernel: SCSI subsystem initialized
Aug 19 08:15:19.159306 kernel: Loading iSCSI transport class v2.0-870.
Aug 19 08:15:19.171307 kernel: iscsi: registered transport (tcp)
Aug 19 08:15:19.192596 kernel: iscsi: registered transport (qla4xxx)
Aug 19 08:15:19.192671 kernel: QLogic iSCSI HBA Driver
Aug 19 08:15:19.211427 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Aug 19 08:15:19.230895 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Aug 19 08:15:19.231707 systemd[1]: Reached target network-pre.target - Preparation for Network.
Aug 19 08:15:19.277122 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Aug 19 08:15:19.279264 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Aug 19 08:15:19.331314 kernel: raid6: avx512x4 gen() 17790 MB/s
Aug 19 08:15:19.349314 kernel: raid6: avx512x2 gen() 18105 MB/s
Aug 19 08:15:19.367310 kernel: raid6: avx512x1 gen() 18061 MB/s
Aug 19 08:15:19.385322 kernel: raid6: avx2x4 gen() 17866 MB/s
Aug 19 08:15:19.403308 kernel: raid6: avx2x2 gen() 18049 MB/s
Aug 19 08:15:19.421535 kernel: raid6: avx2x1 gen() 13006 MB/s
Aug 19 08:15:19.421615 kernel: raid6: using algorithm avx512x2 gen() 18105 MB/s
Aug 19 08:15:19.440555 kernel: raid6: .... xor() 23917 MB/s, rmw enabled
Aug 19 08:15:19.440640 kernel: raid6: using avx512x2 recovery algorithm
Aug 19 08:15:19.462318 kernel: xor: automatically using best checksumming function avx
Aug 19 08:15:19.630314 kernel: Btrfs loaded, zoned=no, fsverity=no
Aug 19 08:15:19.636792 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Aug 19 08:15:19.638945 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Aug 19 08:15:19.672710 systemd-udevd[456]: Using default interface naming scheme 'v255'.
Aug 19 08:15:19.679032 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Aug 19 08:15:19.682335 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Aug 19 08:15:19.707891 dracut-pre-trigger[462]: rd.md=0: removing MD RAID activation
Aug 19 08:15:19.734620 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Aug 19 08:15:19.736735 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Aug 19 08:15:19.792650 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Aug 19 08:15:19.798873 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Aug 19 08:15:19.873977 kernel: ena 0000:00:05.0: ENA device version: 0.10
Aug 19 08:15:19.874270 kernel: ena 0000:00:05.0: ENA controller version: 0.0.1 implementation version 1
Aug 19 08:15:19.882823 kernel: ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy.
Aug 19 08:15:19.893265 kernel: cryptd: max_cpu_qlen set to 1000
Aug 19 08:15:19.904443 kernel: input: ImPS/2 Generic Wheel Mouse as /devices/platform/i8042/serio1/input/input3
Aug 19 08:15:19.917299 kernel: AES CTR mode by8 optimization enabled
Aug 19 08:15:19.931326 kernel: ena 0000:00:05.0: Elastic Network Adapter (ENA) found at mem 80400000, mac addr 06:ac:ae:44:e3:9b
Aug 19 08:15:19.938885 (udev-worker)[514]: Network interface NamePolicy= disabled on kernel command line.
Aug 19 08:15:19.955953 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Aug 19 08:15:19.956230 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Aug 19 08:15:19.962908 kernel: nvme nvme0: pci function 0000:00:04.0
Aug 19 08:15:19.963168 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Aug 19 08:15:19.963385 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Aug 19 08:15:19.970473 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Aug 19 08:15:19.973490 kernel: nvme nvme0: 2/0/0 default/read/poll queues
Aug 19 08:15:19.974052 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Aug 19 08:15:19.981427 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Aug 19 08:15:19.981481 kernel: GPT:9289727 != 16777215
Aug 19 08:15:19.982373 kernel: GPT:Alternate GPT header not at the end of the disk.
Aug 19 08:15:19.984432 kernel: GPT:9289727 != 16777215
Aug 19 08:15:19.984468 kernel: GPT: Use GNU Parted to correct GPT errors.
Aug 19 08:15:19.985528 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9
Aug 19 08:15:20.007174 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Aug 19 08:15:20.019325 kernel: nvme nvme0: using unchecked data buffer
Aug 19 08:15:20.146664 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Amazon Elastic Block Store EFI-SYSTEM.
Aug 19 08:15:20.159236 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Amazon Elastic Block Store ROOT.
Aug 19 08:15:20.160190 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Aug 19 08:15:20.170485 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Amazon Elastic Block Store USR-A.
Aug 19 08:15:20.171063 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Amazon Elastic Block Store USR-A.
Aug 19 08:15:20.183356 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Amazon Elastic Block Store OEM.
Aug 19 08:15:20.184091 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Aug 19 08:15:20.185236 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Aug 19 08:15:20.186486 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Aug 19 08:15:20.188176 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Aug 19 08:15:20.191073 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Aug 19 08:15:20.216224 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Aug 19 08:15:20.218435 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9
Aug 19 08:15:20.218508 disk-uuid[691]: Primary Header is updated.
Aug 19 08:15:20.218508 disk-uuid[691]: Secondary Entries is updated.
Aug 19 08:15:20.218508 disk-uuid[691]: Secondary Header is updated.
Aug 19 08:15:21.233148 disk-uuid[698]: The operation has completed successfully.
Aug 19 08:15:21.233829 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9
Aug 19 08:15:21.374391 systemd[1]: disk-uuid.service: Deactivated successfully.
Aug 19 08:15:21.374539 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Aug 19 08:15:21.402812 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Aug 19 08:15:21.434614 sh[959]: Success
Aug 19 08:15:21.454326 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Aug 19 08:15:21.454402 kernel: device-mapper: uevent: version 1.0.3
Aug 19 08:15:21.457384 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Aug 19 08:15:21.468336 kernel: device-mapper: verity: sha256 using shash "sha256-avx2"
Aug 19 08:15:21.562015 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Aug 19 08:15:21.565357 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Aug 19 08:15:21.581984 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Aug 19 08:15:21.609303 kernel: BTRFS: device fsid 99050df3-5e04-4f37-acde-dec46aab7896 devid 1 transid 39 /dev/mapper/usr (254:0) scanned by mount (982)
Aug 19 08:15:21.613826 kernel: BTRFS info (device dm-0): first mount of filesystem 99050df3-5e04-4f37-acde-dec46aab7896
Aug 19 08:15:21.613903 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Aug 19 08:15:21.613917 kernel: BTRFS info (device dm-0): using free-space-tree
Aug 19 08:15:21.662989 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Aug 19 08:15:21.664443 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Aug 19 08:15:21.665380 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Aug 19 08:15:21.666424 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Aug 19 08:15:21.667764 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Aug 19 08:15:21.699340 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/nvme0n1p6 (259:5) scanned by mount (1013)
Aug 19 08:15:21.704896 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 43dd0637-5e0b-4b8d-a544-a82ca0652f6f
Aug 19 08:15:21.704953 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-intel) checksum algorithm
Aug 19 08:15:21.704966 kernel: BTRFS info (device nvme0n1p6): using free-space-tree
Aug 19 08:15:21.731305 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem 43dd0637-5e0b-4b8d-a544-a82ca0652f6f
Aug 19 08:15:21.731945 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Aug 19 08:15:21.735975 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Aug 19 08:15:21.768126 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Aug 19 08:15:21.770812 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Aug 19 08:15:21.811622 systemd-networkd[1151]: lo: Link UP
Aug 19 08:15:21.811634 systemd-networkd[1151]: lo: Gained carrier
Aug 19 08:15:21.813341 systemd-networkd[1151]: Enumeration completed
Aug 19 08:15:21.813758 systemd-networkd[1151]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Aug 19 08:15:21.813764 systemd-networkd[1151]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Aug 19 08:15:21.814827 systemd[1]: Started systemd-networkd.service - Network Configuration.
Aug 19 08:15:21.817312 systemd[1]: Reached target network.target - Network.
Aug 19 08:15:21.818878 systemd-networkd[1151]: eth0: Link UP
Aug 19 08:15:21.818883 systemd-networkd[1151]: eth0: Gained carrier
Aug 19 08:15:21.818902 systemd-networkd[1151]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Aug 19 08:15:21.833389 systemd-networkd[1151]: eth0: DHCPv4 address 172.31.23.28/20, gateway 172.31.16.1 acquired from 172.31.16.1
Aug 19 08:15:22.089853 ignition[1112]: Ignition 2.21.0
Aug 19 08:15:22.089871 ignition[1112]: Stage: fetch-offline
Aug 19 08:15:22.090190 ignition[1112]: no configs at "/usr/lib/ignition/base.d"
Aug 19 08:15:22.090206 ignition[1112]: no config dir at "/usr/lib/ignition/base.platform.d/aws"
Aug 19 08:15:22.092515 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Aug 19 08:15:22.090610 ignition[1112]: Ignition finished successfully
Aug 19 08:15:22.095487 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Aug 19 08:15:22.123650 ignition[1161]: Ignition 2.21.0
Aug 19 08:15:22.123666 ignition[1161]: Stage: fetch
Aug 19 08:15:22.124064 ignition[1161]: no configs at "/usr/lib/ignition/base.d"
Aug 19 08:15:22.124077 ignition[1161]: no config dir at "/usr/lib/ignition/base.platform.d/aws"
Aug 19 08:15:22.124202 ignition[1161]: PUT http://169.254.169.254/latest/api/token: attempt #1
Aug 19 08:15:22.145989 ignition[1161]: PUT result: OK
Aug 19 08:15:22.149384 ignition[1161]: parsed url from cmdline: ""
Aug 19 08:15:22.149393 ignition[1161]: no config URL provided
Aug 19 08:15:22.149401 ignition[1161]: reading system config file "/usr/lib/ignition/user.ign"
Aug 19 08:15:22.149413 ignition[1161]: no config at "/usr/lib/ignition/user.ign"
Aug 19 08:15:22.149436 ignition[1161]: PUT http://169.254.169.254/latest/api/token: attempt #1
Aug 19 08:15:22.150303 ignition[1161]: PUT result: OK
Aug 19 08:15:22.150355 ignition[1161]: GET http://169.254.169.254/2019-10-01/user-data: attempt #1
Aug 19 08:15:22.151255 ignition[1161]: GET result: OK
Aug 19 08:15:22.151358 ignition[1161]: parsing config with SHA512: 45e96252c4324981845830fe58f98ec9f776eead69b075ce18dec84a267c6884edea97b027d07b69abc051c695d900f76b77bc4c271e64b6f9d469f1d5c3673d
Aug 19 08:15:22.155621 unknown[1161]: fetched base config from "system"
Aug 19 08:15:22.155633 unknown[1161]: fetched base config from "system"
Aug 19 08:15:22.155981 ignition[1161]: fetch: fetch complete
Aug 19 08:15:22.155638 unknown[1161]: fetched user config from "aws"
Aug 19 08:15:22.155986 ignition[1161]: fetch: fetch passed
Aug 19 08:15:22.156030 ignition[1161]: Ignition finished successfully
Aug 19 08:15:22.158325 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Aug 19 08:15:22.159975 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Aug 19 08:15:22.189040 ignition[1168]: Ignition 2.21.0
Aug 19 08:15:22.189062 ignition[1168]: Stage: kargs
Aug 19 08:15:22.189509 ignition[1168]: no configs at "/usr/lib/ignition/base.d"
Aug 19 08:15:22.189524 ignition[1168]: no config dir at "/usr/lib/ignition/base.platform.d/aws"
Aug 19 08:15:22.189649 ignition[1168]: PUT http://169.254.169.254/latest/api/token: attempt #1
Aug 19 08:15:22.190749 ignition[1168]: PUT result: OK
Aug 19 08:15:22.193258 ignition[1168]: kargs: kargs passed
Aug 19 08:15:22.193359 ignition[1168]: Ignition finished successfully
Aug 19 08:15:22.195866 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Aug 19 08:15:22.197360 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Aug 19 08:15:22.223118 ignition[1174]: Ignition 2.21.0
Aug 19 08:15:22.223133 ignition[1174]: Stage: disks
Aug 19 08:15:22.223547 ignition[1174]: no configs at "/usr/lib/ignition/base.d"
Aug 19 08:15:22.223560 ignition[1174]: no config dir at "/usr/lib/ignition/base.platform.d/aws"
Aug 19 08:15:22.223678 ignition[1174]: PUT http://169.254.169.254/latest/api/token: attempt #1
Aug 19 08:15:22.224610 ignition[1174]: PUT result: OK
Aug 19 08:15:22.227542 ignition[1174]: disks: disks passed
Aug 19 08:15:22.227600 ignition[1174]: Ignition finished successfully
Aug 19 08:15:22.229133 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Aug 19 08:15:22.229693 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Aug 19 08:15:22.230512 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Aug 19 08:15:22.230849 systemd[1]: Reached target local-fs.target - Local File Systems.
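Editor's note on the fetch stage above: the PUT to http://169.254.169.254/latest/api/token followed by the GET of /2019-10-01/user-data is the EC2 IMDSv2 exchange (obtain a session token, then fetch user-data with it). The following minimal Python sketch reproduces that exchange for illustration only; it is not Ignition's implementation (Ignition is a Go program), and the header names and token TTL are standard IMDSv2 values assumed here rather than taken from this log.

import urllib.request

IMDS = "http://169.254.169.254"

def fetch_user_data() -> bytes:
    # Step 1: PUT /latest/api/token to obtain an IMDSv2 session token.
    token_req = urllib.request.Request(
        IMDS + "/latest/api/token",
        method="PUT",
        headers={"X-aws-ec2-metadata-token-ttl-seconds": "21600"},
    )
    with urllib.request.urlopen(token_req, timeout=2) as resp:
        token = resp.read().decode()
    # Step 2: GET the user-data document, presenting the token.
    data_req = urllib.request.Request(
        IMDS + "/2019-10-01/user-data",
        headers={"X-aws-ec2-metadata-token": token},
    )
    with urllib.request.urlopen(data_req, timeout=2) as resp:
        return resp.read()

This only works from inside an EC2 instance, where the link-local metadata address is reachable.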
Aug 19 08:15:22.231402 systemd[1]: Reached target sysinit.target - System Initialization.
Aug 19 08:15:22.231914 systemd[1]: Reached target basic.target - Basic System.
Aug 19 08:15:22.233495 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Aug 19 08:15:22.283743 systemd-fsck[1182]: ROOT: clean, 15/553520 files, 52789/553472 blocks
Aug 19 08:15:22.286850 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Aug 19 08:15:22.288765 systemd[1]: Mounting sysroot.mount - /sysroot...
Aug 19 08:15:22.426309 kernel: EXT4-fs (nvme0n1p9): mounted filesystem 41966107-04fa-426e-9830-6b4efa50e27b r/w with ordered data mode. Quota mode: none.
Aug 19 08:15:22.426675 systemd[1]: Mounted sysroot.mount - /sysroot.
Aug 19 08:15:22.427594 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Aug 19 08:15:22.429228 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Aug 19 08:15:22.432364 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Aug 19 08:15:22.433513 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Aug 19 08:15:22.434906 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Aug 19 08:15:22.435342 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Aug 19 08:15:22.441513 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Aug 19 08:15:22.443355 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Aug 19 08:15:22.456330 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/nvme0n1p6 (259:5) scanned by mount (1201)
Aug 19 08:15:22.459434 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 43dd0637-5e0b-4b8d-a544-a82ca0652f6f
Aug 19 08:15:22.459562 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-intel) checksum algorithm
Aug 19 08:15:22.461909 kernel: BTRFS info (device nvme0n1p6): using free-space-tree
Aug 19 08:15:22.470406 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Aug 19 08:15:22.625323 initrd-setup-root[1225]: cut: /sysroot/etc/passwd: No such file or directory
Aug 19 08:15:22.641085 initrd-setup-root[1232]: cut: /sysroot/etc/group: No such file or directory
Aug 19 08:15:22.647473 initrd-setup-root[1239]: cut: /sysroot/etc/shadow: No such file or directory
Aug 19 08:15:22.652832 initrd-setup-root[1246]: cut: /sysroot/etc/gshadow: No such file or directory
Aug 19 08:15:22.841588 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Aug 19 08:15:22.844036 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Aug 19 08:15:22.847423 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Aug 19 08:15:22.862410 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Aug 19 08:15:22.864788 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem 43dd0637-5e0b-4b8d-a544-a82ca0652f6f
Aug 19 08:15:22.895610 ignition[1313]: INFO : Ignition 2.21.0
Aug 19 08:15:22.895610 ignition[1313]: INFO : Stage: mount
Aug 19 08:15:22.897810 ignition[1313]: INFO : no configs at "/usr/lib/ignition/base.d"
Aug 19 08:15:22.897810 ignition[1313]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws"
Aug 19 08:15:22.897810 ignition[1313]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1
Aug 19 08:15:22.900558 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Aug 19 08:15:22.901376 ignition[1313]: INFO : PUT result: OK
Aug 19 08:15:22.904619 ignition[1313]: INFO : mount: mount passed
Aug 19 08:15:22.905136 ignition[1313]: INFO : Ignition finished successfully
Aug 19 08:15:22.906701 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Aug 19 08:15:22.908137 systemd[1]: Starting ignition-files.service - Ignition (files)...
Aug 19 08:15:22.929374 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Aug 19 08:15:22.963310 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/nvme0n1p6 (259:5) scanned by mount (1326)
Aug 19 08:15:22.967851 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 43dd0637-5e0b-4b8d-a544-a82ca0652f6f
Aug 19 08:15:22.967918 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-intel) checksum algorithm
Aug 19 08:15:22.967933 kernel: BTRFS info (device nvme0n1p6): using free-space-tree
Aug 19 08:15:22.976186 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Aug 19 08:15:23.002252 ignition[1342]: INFO : Ignition 2.21.0
Aug 19 08:15:23.002252 ignition[1342]: INFO : Stage: files
Aug 19 08:15:23.003840 ignition[1342]: INFO : no configs at "/usr/lib/ignition/base.d"
Aug 19 08:15:23.003840 ignition[1342]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws"
Aug 19 08:15:23.003840 ignition[1342]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1
Aug 19 08:15:23.005527 ignition[1342]: INFO : PUT result: OK
Aug 19 08:15:23.008888 ignition[1342]: DEBUG : files: compiled without relabeling support, skipping
Aug 19 08:15:23.010588 ignition[1342]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Aug 19 08:15:23.010588 ignition[1342]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Aug 19 08:15:23.013122 ignition[1342]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Aug 19 08:15:23.013721 ignition[1342]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Aug 19 08:15:23.013721 ignition[1342]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Aug 19 08:15:23.013522 unknown[1342]: wrote ssh authorized keys file for user: core
Aug 19 08:15:23.015710 ignition[1342]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Aug 19 08:15:23.016333 ignition[1342]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1
Aug 19 08:15:23.067412 ignition[1342]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Aug 19 08:15:23.339298 ignition[1342]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Aug 19 08:15:23.339298 ignition[1342]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Aug 19 08:15:23.340817 ignition[1342]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Aug 19 08:15:23.340817 ignition[1342]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Aug 19 08:15:23.340817 ignition[1342]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Aug 19 08:15:23.340817 ignition[1342]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Aug 19 08:15:23.340817 ignition[1342]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Aug 19 08:15:23.340817 ignition[1342]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Aug 19 08:15:23.340817 ignition[1342]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Aug 19 08:15:23.345591 ignition[1342]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Aug 19 08:15:23.345591 ignition[1342]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Aug 19 08:15:23.345591 ignition[1342]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Aug 19 08:15:23.348106 ignition[1342]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Aug 19 08:15:23.348106 ignition[1342]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Aug 19 08:15:23.348106 ignition[1342]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-x86-64.raw: attempt #1
Aug 19 08:15:23.431464 systemd-networkd[1151]: eth0: Gained IPv6LL
Aug 19 08:15:23.761810 ignition[1342]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Aug 19 08:15:24.058922 ignition[1342]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Aug 19 08:15:24.058922 ignition[1342]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Aug 19 08:15:24.061339 ignition[1342]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Aug 19 08:15:24.065362 ignition[1342]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Aug 19 08:15:24.065362 ignition[1342]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Aug 19 08:15:24.065362 ignition[1342]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Aug 19 08:15:24.069340 ignition[1342]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Aug 19 08:15:24.069340 ignition[1342]: INFO : files: createResultFile: createFiles: op(e): [started] writing file
"/sysroot/etc/.ignition-result.json" Aug 19 08:15:24.069340 ignition[1342]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Aug 19 08:15:24.069340 ignition[1342]: INFO : files: files passed Aug 19 08:15:24.069340 ignition[1342]: INFO : Ignition finished successfully Aug 19 08:15:24.067562 systemd[1]: Finished ignition-files.service - Ignition (files). Aug 19 08:15:24.069599 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Aug 19 08:15:24.073571 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Aug 19 08:15:24.091264 systemd[1]: ignition-quench.service: Deactivated successfully. Aug 19 08:15:24.092345 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Aug 19 08:15:24.097906 initrd-setup-root-after-ignition[1372]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Aug 19 08:15:24.097906 initrd-setup-root-after-ignition[1372]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Aug 19 08:15:24.101564 initrd-setup-root-after-ignition[1376]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Aug 19 08:15:24.103213 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Aug 19 08:15:24.103902 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Aug 19 08:15:24.105860 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Aug 19 08:15:24.143549 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Aug 19 08:15:24.143672 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Aug 19 08:15:24.145034 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Aug 19 08:15:24.145680 systemd[1]: Reached target initrd.target - Initrd Default Target. Aug 19 08:15:24.146261 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Aug 19 08:15:24.147857 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Aug 19 08:15:24.169471 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Aug 19 08:15:24.171678 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Aug 19 08:15:24.194373 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Aug 19 08:15:24.195054 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Aug 19 08:15:24.196125 systemd[1]: Stopped target timers.target - Timer Units. Aug 19 08:15:24.197004 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Aug 19 08:15:24.197234 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Aug 19 08:15:24.198494 systemd[1]: Stopped target initrd.target - Initrd Default Target. Aug 19 08:15:24.199398 systemd[1]: Stopped target basic.target - Basic System. Aug 19 08:15:24.200167 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Aug 19 08:15:24.200973 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Aug 19 08:15:24.201696 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Aug 19 08:15:24.202658 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Aug 19 08:15:24.203415 systemd[1]: Stopped target remote-fs.target - Remote File Systems. 
Aug 19 08:15:24.204197 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Aug 19 08:15:24.205020 systemd[1]: Stopped target sysinit.target - System Initialization. Aug 19 08:15:24.206045 systemd[1]: Stopped target local-fs.target - Local File Systems. Aug 19 08:15:24.206994 systemd[1]: Stopped target swap.target - Swaps. Aug 19 08:15:24.207733 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Aug 19 08:15:24.207961 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Aug 19 08:15:24.209014 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Aug 19 08:15:24.209862 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Aug 19 08:15:24.210672 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Aug 19 08:15:24.210805 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Aug 19 08:15:24.211466 systemd[1]: dracut-initqueue.service: Deactivated successfully. Aug 19 08:15:24.211690 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Aug 19 08:15:24.212982 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Aug 19 08:15:24.213245 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Aug 19 08:15:24.213972 systemd[1]: ignition-files.service: Deactivated successfully. Aug 19 08:15:24.214356 systemd[1]: Stopped ignition-files.service - Ignition (files). Aug 19 08:15:24.217374 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Aug 19 08:15:24.217887 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Aug 19 08:15:24.218067 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Aug 19 08:15:24.223548 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Aug 19 08:15:24.224522 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Aug 19 08:15:24.225349 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Aug 19 08:15:24.226980 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Aug 19 08:15:24.227445 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Aug 19 08:15:24.234598 systemd[1]: initrd-cleanup.service: Deactivated successfully. Aug 19 08:15:24.235527 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Aug 19 08:15:24.254631 ignition[1396]: INFO : Ignition 2.21.0 Aug 19 08:15:24.254631 ignition[1396]: INFO : Stage: umount Aug 19 08:15:24.256389 ignition[1396]: INFO : no configs at "/usr/lib/ignition/base.d" Aug 19 08:15:24.256389 ignition[1396]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws" Aug 19 08:15:24.256389 ignition[1396]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1 Aug 19 08:15:24.258031 ignition[1396]: INFO : PUT result: OK Aug 19 08:15:24.259960 systemd[1]: sysroot-boot.mount: Deactivated successfully. Aug 19 08:15:24.262696 ignition[1396]: INFO : umount: umount passed Aug 19 08:15:24.262696 ignition[1396]: INFO : Ignition finished successfully Aug 19 08:15:24.265121 systemd[1]: ignition-mount.service: Deactivated successfully. Aug 19 08:15:24.265271 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Aug 19 08:15:24.266535 systemd[1]: ignition-disks.service: Deactivated successfully. Aug 19 08:15:24.266603 systemd[1]: Stopped ignition-disks.service - Ignition (disks). 
Aug 19 08:15:24.267454 systemd[1]: ignition-kargs.service: Deactivated successfully. Aug 19 08:15:24.267519 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Aug 19 08:15:24.268117 systemd[1]: ignition-fetch.service: Deactivated successfully. Aug 19 08:15:24.268180 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Aug 19 08:15:24.268754 systemd[1]: Stopped target network.target - Network. Aug 19 08:15:24.269378 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Aug 19 08:15:24.269429 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Aug 19 08:15:24.269775 systemd[1]: Stopped target paths.target - Path Units. Aug 19 08:15:24.270052 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Aug 19 08:15:24.276380 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Aug 19 08:15:24.276923 systemd[1]: Stopped target slices.target - Slice Units. Aug 19 08:15:24.278065 systemd[1]: Stopped target sockets.target - Socket Units. Aug 19 08:15:24.278920 systemd[1]: iscsid.socket: Deactivated successfully. Aug 19 08:15:24.278984 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Aug 19 08:15:24.279588 systemd[1]: iscsiuio.socket: Deactivated successfully. Aug 19 08:15:24.279642 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Aug 19 08:15:24.280212 systemd[1]: ignition-setup.service: Deactivated successfully. Aug 19 08:15:24.280313 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Aug 19 08:15:24.282316 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Aug 19 08:15:24.282395 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Aug 19 08:15:24.283272 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Aug 19 08:15:24.284460 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Aug 19 08:15:24.290036 systemd[1]: systemd-resolved.service: Deactivated successfully. Aug 19 08:15:24.290239 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Aug 19 08:15:24.293767 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Aug 19 08:15:24.294044 systemd[1]: systemd-networkd.service: Deactivated successfully. Aug 19 08:15:24.294249 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Aug 19 08:15:24.296224 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Aug 19 08:15:24.297044 systemd[1]: Stopped target network-pre.target - Preparation for Network. Aug 19 08:15:24.297535 systemd[1]: systemd-networkd.socket: Deactivated successfully. Aug 19 08:15:24.297573 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Aug 19 08:15:24.300378 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Aug 19 08:15:24.300713 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Aug 19 08:15:24.300771 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Aug 19 08:15:24.301171 systemd[1]: systemd-sysctl.service: Deactivated successfully. Aug 19 08:15:24.301215 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Aug 19 08:15:24.301643 systemd[1]: systemd-modules-load.service: Deactivated successfully. Aug 19 08:15:24.301682 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. 
Aug 19 08:15:24.302006 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Aug 19 08:15:24.302049 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Aug 19 08:15:24.302632 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Aug 19 08:15:24.304316 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Aug 19 08:15:24.304377 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Aug 19 08:15:24.317725 systemd[1]: systemd-udevd.service: Deactivated successfully. Aug 19 08:15:24.317915 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Aug 19 08:15:24.320965 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Aug 19 08:15:24.321049 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Aug 19 08:15:24.322442 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Aug 19 08:15:24.322494 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Aug 19 08:15:24.323183 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Aug 19 08:15:24.323253 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Aug 19 08:15:24.324407 systemd[1]: dracut-cmdline.service: Deactivated successfully. Aug 19 08:15:24.324474 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Aug 19 08:15:24.325577 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Aug 19 08:15:24.325654 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Aug 19 08:15:24.329814 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Aug 19 08:15:24.332073 systemd[1]: systemd-network-generator.service: Deactivated successfully. Aug 19 08:15:24.332177 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Aug 19 08:15:24.336025 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Aug 19 08:15:24.336078 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Aug 19 08:15:24.336692 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Aug 19 08:15:24.336760 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Aug 19 08:15:24.340074 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. Aug 19 08:15:24.340164 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Aug 19 08:15:24.340219 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Aug 19 08:15:24.340787 systemd[1]: network-cleanup.service: Deactivated successfully. Aug 19 08:15:24.340959 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Aug 19 08:15:24.349421 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Aug 19 08:15:24.349572 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Aug 19 08:15:24.400915 systemd[1]: sysroot-boot.service: Deactivated successfully. Aug 19 08:15:24.401035 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Aug 19 08:15:24.402489 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Aug 19 08:15:24.402990 systemd[1]: initrd-setup-root.service: Deactivated successfully. 
Aug 19 08:15:24.403064 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Aug 19 08:15:24.405675 systemd[1]: Starting initrd-switch-root.service - Switch Root... Aug 19 08:15:24.426780 systemd[1]: Switching root. Aug 19 08:15:24.473483 systemd-journald[208]: Journal stopped Aug 19 08:15:26.082184 systemd-journald[208]: Received SIGTERM from PID 1 (systemd). Aug 19 08:15:26.082305 kernel: SELinux: policy capability network_peer_controls=1 Aug 19 08:15:26.082339 kernel: SELinux: policy capability open_perms=1 Aug 19 08:15:26.082358 kernel: SELinux: policy capability extended_socket_class=1 Aug 19 08:15:26.082376 kernel: SELinux: policy capability always_check_network=0 Aug 19 08:15:26.082399 kernel: SELinux: policy capability cgroup_seclabel=1 Aug 19 08:15:26.082417 kernel: SELinux: policy capability nnp_nosuid_transition=1 Aug 19 08:15:26.082435 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Aug 19 08:15:26.082457 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Aug 19 08:15:26.082474 kernel: SELinux: policy capability userspace_initial_context=0 Aug 19 08:15:26.082498 kernel: audit: type=1403 audit(1755591324.811:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Aug 19 08:15:26.082524 systemd[1]: Successfully loaded SELinux policy in 61.571ms. Aug 19 08:15:26.082554 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 6.297ms. Aug 19 08:15:26.082574 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Aug 19 08:15:26.082594 systemd[1]: Detected virtualization amazon. Aug 19 08:15:26.082614 systemd[1]: Detected architecture x86-64. Aug 19 08:15:26.082634 systemd[1]: Detected first boot. Aug 19 08:15:26.082654 systemd[1]: Initializing machine ID from VM UUID. Aug 19 08:15:26.082673 zram_generator::config[1441]: No configuration found. Aug 19 08:15:26.082695 kernel: Guest personality initialized and is inactive Aug 19 08:15:26.082714 kernel: VMCI host device registered (name=vmci, major=10, minor=125) Aug 19 08:15:26.082731 kernel: Initialized host personality Aug 19 08:15:26.082749 kernel: NET: Registered PF_VSOCK protocol family Aug 19 08:15:26.082769 systemd[1]: Populated /etc with preset unit settings. Aug 19 08:15:26.082790 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Aug 19 08:15:26.082809 systemd[1]: initrd-switch-root.service: Deactivated successfully. Aug 19 08:15:26.082828 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Aug 19 08:15:26.082847 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Aug 19 08:15:26.082870 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Aug 19 08:15:26.082889 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Aug 19 08:15:26.082908 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Aug 19 08:15:26.082928 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Aug 19 08:15:26.082948 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Aug 19 08:15:26.082967 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. 
Aug 19 08:15:26.082988 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Aug 19 08:15:26.083008 systemd[1]: Created slice user.slice - User and Session Slice. Aug 19 08:15:26.083026 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Aug 19 08:15:26.083048 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Aug 19 08:15:26.083067 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Aug 19 08:15:26.083085 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Aug 19 08:15:26.083103 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Aug 19 08:15:26.083125 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Aug 19 08:15:26.083148 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Aug 19 08:15:26.083171 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Aug 19 08:15:26.083198 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Aug 19 08:15:26.083221 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Aug 19 08:15:26.083244 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Aug 19 08:15:26.083267 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Aug 19 08:15:26.083315 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Aug 19 08:15:26.083337 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Aug 19 08:15:26.083360 systemd[1]: Reached target remote-fs.target - Remote File Systems. Aug 19 08:15:26.083383 systemd[1]: Reached target slices.target - Slice Units. Aug 19 08:15:26.083404 systemd[1]: Reached target swap.target - Swaps. Aug 19 08:15:26.083422 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Aug 19 08:15:26.083448 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Aug 19 08:15:26.083469 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Aug 19 08:15:26.083490 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Aug 19 08:15:26.083510 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Aug 19 08:15:26.083530 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Aug 19 08:15:26.083551 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Aug 19 08:15:26.083571 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Aug 19 08:15:26.083591 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Aug 19 08:15:26.083611 systemd[1]: Mounting media.mount - External Media Directory... Aug 19 08:15:26.083636 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Aug 19 08:15:26.083658 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Aug 19 08:15:26.083681 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Aug 19 08:15:26.083704 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... 
Aug 19 08:15:26.083728 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Aug 19 08:15:26.083750 systemd[1]: Reached target machines.target - Containers. Aug 19 08:15:26.083772 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Aug 19 08:15:26.083792 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Aug 19 08:15:26.083815 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Aug 19 08:15:26.083835 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Aug 19 08:15:26.083855 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Aug 19 08:15:26.083875 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Aug 19 08:15:26.083895 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Aug 19 08:15:26.083915 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Aug 19 08:15:26.083935 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Aug 19 08:15:26.083955 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Aug 19 08:15:26.083978 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Aug 19 08:15:26.083998 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Aug 19 08:15:26.084018 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Aug 19 08:15:26.084037 systemd[1]: Stopped systemd-fsck-usr.service. Aug 19 08:15:26.084059 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Aug 19 08:15:26.084080 systemd[1]: Starting systemd-journald.service - Journal Service... Aug 19 08:15:26.084100 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Aug 19 08:15:26.084120 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Aug 19 08:15:26.084141 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Aug 19 08:15:26.084164 kernel: loop: module loaded Aug 19 08:15:26.084185 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Aug 19 08:15:26.084205 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Aug 19 08:15:26.084229 systemd[1]: verity-setup.service: Deactivated successfully. Aug 19 08:15:26.084248 kernel: fuse: init (API version 7.41) Aug 19 08:15:26.084271 systemd[1]: Stopped verity-setup.service. Aug 19 08:15:26.086008 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Aug 19 08:15:26.086047 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Aug 19 08:15:26.086072 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Aug 19 08:15:26.086107 systemd[1]: Mounted media.mount - External Media Directory. Aug 19 08:15:26.086135 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. 
Aug 19 08:15:26.086159 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Aug 19 08:15:26.086184 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Aug 19 08:15:26.086208 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Aug 19 08:15:26.086232 systemd[1]: modprobe@configfs.service: Deactivated successfully. Aug 19 08:15:26.086255 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Aug 19 08:15:26.086294 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Aug 19 08:15:26.086313 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Aug 19 08:15:26.086331 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Aug 19 08:15:26.086355 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Aug 19 08:15:26.086374 systemd[1]: modprobe@fuse.service: Deactivated successfully. Aug 19 08:15:26.086395 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Aug 19 08:15:26.086417 systemd[1]: modprobe@loop.service: Deactivated successfully. Aug 19 08:15:26.086438 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Aug 19 08:15:26.086460 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Aug 19 08:15:26.086483 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Aug 19 08:15:26.086504 kernel: ACPI: bus type drm_connector registered Aug 19 08:15:26.086526 systemd[1]: modprobe@drm.service: Deactivated successfully. Aug 19 08:15:26.086550 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Aug 19 08:15:26.086571 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Aug 19 08:15:26.086637 systemd-journald[1527]: Collecting audit messages is disabled. Aug 19 08:15:26.086680 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Aug 19 08:15:26.086703 systemd-journald[1527]: Journal started Aug 19 08:15:26.086743 systemd-journald[1527]: Runtime Journal (/run/log/journal/ec25160a8d04a513e38d0c6ac0a7bd2f) is 4.8M, max 38.4M, 33.6M free. Aug 19 08:15:25.659241 systemd[1]: Queued start job for default target multi-user.target. Aug 19 08:15:25.672860 systemd[1]: Unnecessary job was removed for dev-nvme0n1p6.device - /dev/nvme0n1p6. Aug 19 08:15:25.673338 systemd[1]: systemd-journald.service: Deactivated successfully. Aug 19 08:15:26.093340 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Aug 19 08:15:26.104305 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Aug 19 08:15:26.108326 systemd[1]: Reached target local-fs.target - Local File Systems. Aug 19 08:15:26.116085 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Aug 19 08:15:26.121309 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Aug 19 08:15:26.126304 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Aug 19 08:15:26.131306 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Aug 19 08:15:26.137308 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). 
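At this point systemd-journald has opened the runtime journal under /run/log/journal/<machine-id>, as the size line above shows. A quick way to pull this boot's messages for units that appear earlier in this log is journalctl with the standard -b (current boot) and -u (unit) filters; the unit names below are taken from the log itself.

```python
# Sketch: dump this boot's journal entries for a few units seen earlier
# in this log, using standard journalctl filters (-b current boot,
# -u unit, --no-pager for non-interactive output).
import subprocess

UNITS = [
    "ignition-fetch.service",
    "ignition-files.service",
    "initrd-switch-root.service",
]

def dump_unit_logs() -> None:
    for unit in UNITS:
        print(f"--- {unit} ---")
        subprocess.run(["journalctl", "-b", "-u", unit, "--no-pager"], check=False)

if __name__ == "__main__":
    dump_unit_logs()
```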
Aug 19 08:15:26.142464 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Aug 19 08:15:26.148303 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Aug 19 08:15:26.154311 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Aug 19 08:15:26.162305 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Aug 19 08:15:26.177311 systemd[1]: Started systemd-journald.service - Journal Service. Aug 19 08:15:26.174542 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Aug 19 08:15:26.176346 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Aug 19 08:15:26.177236 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Aug 19 08:15:26.179484 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Aug 19 08:15:26.201591 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Aug 19 08:15:26.203970 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Aug 19 08:15:26.222662 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Aug 19 08:15:26.227315 kernel: loop0: detected capacity change from 0 to 221472 Aug 19 08:15:26.227689 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Aug 19 08:15:26.228945 systemd[1]: Reached target network-pre.target - Preparation for Network. Aug 19 08:15:26.231172 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Aug 19 08:15:26.236720 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Aug 19 08:15:26.240394 systemd[1]: Starting systemd-sysusers.service - Create System Users... Aug 19 08:15:26.277345 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Aug 19 08:15:26.278892 systemd-journald[1527]: Time spent on flushing to /var/log/journal/ec25160a8d04a513e38d0c6ac0a7bd2f is 38.009ms for 1017 entries. Aug 19 08:15:26.278892 systemd-journald[1527]: System Journal (/var/log/journal/ec25160a8d04a513e38d0c6ac0a7bd2f) is 8M, max 195.6M, 187.6M free. Aug 19 08:15:26.335023 systemd-journald[1527]: Received client request to flush runtime journal. Aug 19 08:15:26.338516 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Aug 19 08:15:26.354304 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Aug 19 08:15:26.357002 systemd[1]: Finished systemd-sysusers.service - Create System Users. Aug 19 08:15:26.361533 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Aug 19 08:15:26.388364 kernel: loop1: detected capacity change from 0 to 72360 Aug 19 08:15:26.408172 systemd-tmpfiles[1592]: ACLs are not supported, ignoring. Aug 19 08:15:26.408630 systemd-tmpfiles[1592]: ACLs are not supported, ignoring. Aug 19 08:15:26.414258 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Aug 19 08:15:26.477314 kernel: loop2: detected capacity change from 0 to 128016 Aug 19 08:15:26.576364 kernel: loop3: detected capacity change from 0 to 111000 Aug 19 08:15:26.675307 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. 
Aug 19 08:15:26.701305 kernel: loop4: detected capacity change from 0 to 221472 Aug 19 08:15:26.743330 kernel: loop5: detected capacity change from 0 to 72360 Aug 19 08:15:26.768370 kernel: loop6: detected capacity change from 0 to 128016 Aug 19 08:15:26.781352 kernel: loop7: detected capacity change from 0 to 111000 Aug 19 08:15:26.794682 (sd-merge)[1598]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-ami'. Aug 19 08:15:26.795793 (sd-merge)[1598]: Merged extensions into '/usr'. Aug 19 08:15:26.802752 systemd[1]: Reload requested from client PID 1556 ('systemd-sysext') (unit systemd-sysext.service)... Aug 19 08:15:26.802922 systemd[1]: Reloading... Aug 19 08:15:26.934535 zram_generator::config[1620]: No configuration found. Aug 19 08:15:27.279154 systemd[1]: Reloading finished in 475 ms. Aug 19 08:15:27.296315 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Aug 19 08:15:27.297147 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Aug 19 08:15:27.310674 systemd[1]: Starting ensure-sysext.service... Aug 19 08:15:27.312395 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Aug 19 08:15:27.316471 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Aug 19 08:15:27.334121 systemd-tmpfiles[1677]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Aug 19 08:15:27.334155 systemd-tmpfiles[1677]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Aug 19 08:15:27.334421 systemd-tmpfiles[1677]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Aug 19 08:15:27.334676 systemd-tmpfiles[1677]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Aug 19 08:15:27.335502 systemd-tmpfiles[1677]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Aug 19 08:15:27.335754 systemd-tmpfiles[1677]: ACLs are not supported, ignoring. Aug 19 08:15:27.335812 systemd-tmpfiles[1677]: ACLs are not supported, ignoring. Aug 19 08:15:27.342687 systemd-tmpfiles[1677]: Detected autofs mount point /boot during canonicalization of boot. Aug 19 08:15:27.342700 systemd-tmpfiles[1677]: Skipping /boot Aug 19 08:15:27.348394 systemd[1]: Reload requested from client PID 1676 ('systemctl') (unit ensure-sysext.service)... Aug 19 08:15:27.348409 systemd[1]: Reloading... Aug 19 08:15:27.360444 systemd-udevd[1678]: Using default interface naming scheme 'v255'. Aug 19 08:15:27.364503 systemd-tmpfiles[1677]: Detected autofs mount point /boot during canonicalization of boot. Aug 19 08:15:27.364516 systemd-tmpfiles[1677]: Skipping /boot Aug 19 08:15:27.453409 zram_generator::config[1703]: No configuration found. Aug 19 08:15:27.553163 ldconfig[1552]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Aug 19 08:15:27.722155 (udev-worker)[1723]: Network interface NamePolicy= disabled on kernel command line. 
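The loop0–loop7 capacity changes and the sd-merge lines above record systemd-sysext attaching the extension images ('containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-ami') and merging them into /usr; the kubernetes.raw symlink under /etc/extensions was written by Ignition earlier in this log. The listing-only sketch below shows where such images are picked up from, using the standard sysext search directories; the actual overlay is done by systemd-sysext, and `systemd-sysext status` reports the merged state on a live system.

```python
# List the *.raw sysext images (or symlinks to them, like the
# kubernetes.raw link Ignition wrote) in the standard systemd-sysext
# search directories. Listing only; systemd-sysext performs the overlay
# of the images onto /usr and /opt.
from pathlib import Path

SEARCH_DIRS = ["/etc/extensions", "/run/extensions", "/var/lib/extensions"]

def list_extension_images() -> None:
    for d in SEARCH_DIRS:
        path = Path(d)
        if not path.is_dir():
            continue
        for image in sorted(path.glob("*.raw")):
            target = image.resolve() if image.is_symlink() else image
            print(f"{image} -> {target}")

if __name__ == "__main__":
    list_extension_images()
```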
Aug 19 08:15:27.830314 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4 Aug 19 08:15:27.838315 kernel: mousedev: PS/2 mouse device common for all mice Aug 19 08:15:27.852309 kernel: ACPI: button: Power Button [PWRF] Aug 19 08:15:27.865194 kernel: input: Sleep Button as /devices/LNXSYSTM:00/LNXSLPBN:00/input/input5 Aug 19 08:15:27.874322 kernel: ACPI: button: Sleep Button [SLPF] Aug 19 08:15:27.948309 kernel: piix4_smbus 0000:00:01.3: SMBus base address uninitialized - upgrade BIOS or use force_addr=0xaddr Aug 19 08:15:27.951842 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Aug 19 08:15:27.952122 systemd[1]: Reloading finished in 603 ms. Aug 19 08:15:27.963111 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Aug 19 08:15:27.966122 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Aug 19 08:15:27.968341 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Aug 19 08:15:27.998480 systemd[1]: Starting audit-rules.service - Load Audit Rules... Aug 19 08:15:28.002511 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Aug 19 08:15:28.005363 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Aug 19 08:15:28.011522 systemd[1]: Starting systemd-networkd.service - Network Configuration... Aug 19 08:15:28.016524 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Aug 19 08:15:28.020601 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Aug 19 08:15:28.029836 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Aug 19 08:15:28.030132 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Aug 19 08:15:28.033752 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Aug 19 08:15:28.039725 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Aug 19 08:15:28.054668 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Aug 19 08:15:28.056014 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Aug 19 08:15:28.056197 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Aug 19 08:15:28.056366 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Aug 19 08:15:28.067204 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Aug 19 08:15:28.072247 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Aug 19 08:15:28.072671 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Aug 19 08:15:28.072905 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. 
Aug 19 08:15:28.073042 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Aug 19 08:15:28.073183 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Aug 19 08:15:28.083144 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Aug 19 08:15:28.084193 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Aug 19 08:15:28.089378 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Aug 19 08:15:28.091010 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Aug 19 08:15:28.091184 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Aug 19 08:15:28.091470 systemd[1]: Reached target time-set.target - System Time Set. Aug 19 08:15:28.092517 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Aug 19 08:15:28.107469 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Aug 19 08:15:28.109921 systemd[1]: Finished ensure-sysext.service. Aug 19 08:15:28.110843 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Aug 19 08:15:28.111078 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Aug 19 08:15:28.146445 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Aug 19 08:15:28.147177 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Aug 19 08:15:28.148233 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Aug 19 08:15:28.148982 systemd[1]: modprobe@loop.service: Deactivated successfully. Aug 19 08:15:28.149219 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Aug 19 08:15:28.151128 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Aug 19 08:15:28.153895 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Aug 19 08:15:28.159559 systemd[1]: Starting systemd-update-done.service - Update is Completed... Aug 19 08:15:28.166809 systemd[1]: modprobe@drm.service: Deactivated successfully. Aug 19 08:15:28.167512 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Aug 19 08:15:28.208518 systemd[1]: Finished systemd-update-done.service - Update is Completed. Aug 19 08:15:28.245378 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Aug 19 08:15:28.246640 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Aug 19 08:15:28.250362 augenrules[1894]: No rules Aug 19 08:15:28.253977 systemd[1]: audit-rules.service: Deactivated successfully. 
Aug 19 08:15:28.254743 systemd[1]: Finished audit-rules.service - Load Audit Rules. Aug 19 08:15:28.343074 systemd[1]: Started systemd-userdbd.service - User Database Manager. Aug 19 08:15:28.398945 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Amazon Elastic Block Store OEM. Aug 19 08:15:28.412092 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Aug 19 08:15:28.456694 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Aug 19 08:15:28.471838 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Aug 19 08:15:28.570646 systemd-networkd[1812]: lo: Link UP Aug 19 08:15:28.571112 systemd-networkd[1812]: lo: Gained carrier Aug 19 08:15:28.573335 systemd-networkd[1812]: Enumeration completed Aug 19 08:15:28.573585 systemd[1]: Started systemd-networkd.service - Network Configuration. Aug 19 08:15:28.577518 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Aug 19 08:15:28.580580 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Aug 19 08:15:28.582494 systemd-networkd[1812]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Aug 19 08:15:28.582505 systemd-networkd[1812]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Aug 19 08:15:28.587347 systemd-resolved[1813]: Positive Trust Anchors: Aug 19 08:15:28.587364 systemd-resolved[1813]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Aug 19 08:15:28.587424 systemd-resolved[1813]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Aug 19 08:15:28.589989 systemd-networkd[1812]: eth0: Link UP Aug 19 08:15:28.590581 systemd-networkd[1812]: eth0: Gained carrier Aug 19 08:15:28.591376 systemd-networkd[1812]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Aug 19 08:15:28.600954 systemd-resolved[1813]: Defaulting to hostname 'linux'. Aug 19 08:15:28.605150 systemd-networkd[1812]: eth0: DHCPv4 address 172.31.23.28/20, gateway 172.31.16.1 acquired from 172.31.16.1 Aug 19 08:15:28.605811 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Aug 19 08:15:28.606558 systemd[1]: Reached target network.target - Network. Aug 19 08:15:28.607826 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Aug 19 08:15:28.619043 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Aug 19 08:15:28.619903 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Aug 19 08:15:28.621056 systemd[1]: Reached target sysinit.target - System Initialization. Aug 19 08:15:28.621816 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. 
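The DHCPv4 line above reports the lease 172.31.23.28/20 with gateway 172.31.16.1. A quick check with the standard ipaddress module confirms the gateway is the first usable host of that /20, which is the usual layout for an AWS VPC subnet:

```python
# Quick check of the DHCPv4 lease logged above: the /20 containing
# 172.31.23.28 starts at 172.31.16.0, so the advertised gateway
# 172.31.16.1 is the first usable host in that block.
import ipaddress

iface = ipaddress.ip_interface("172.31.23.28/20")
gateway = ipaddress.ip_address("172.31.16.1")

print(iface.network)                 # 172.31.16.0/20
print(gateway in iface.network)      # True
print(next(iface.network.hosts()))   # 172.31.16.1
```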
Aug 19 08:15:28.622319 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Aug 19 08:15:28.622720 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Aug 19 08:15:28.623272 systemd[1]: Started logrotate.timer - Daily rotation of log files. Aug 19 08:15:28.624045 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Aug 19 08:15:28.624453 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Aug 19 08:15:28.624836 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Aug 19 08:15:28.624890 systemd[1]: Reached target paths.target - Path Units. Aug 19 08:15:28.625262 systemd[1]: Reached target timers.target - Timer Units. Aug 19 08:15:28.627798 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Aug 19 08:15:28.629672 systemd[1]: Starting docker.socket - Docker Socket for the API... Aug 19 08:15:28.632499 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Aug 19 08:15:28.633244 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Aug 19 08:15:28.633764 systemd[1]: Reached target ssh-access.target - SSH Access Available. Aug 19 08:15:28.636241 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Aug 19 08:15:28.637065 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Aug 19 08:15:28.638312 systemd[1]: Listening on docker.socket - Docker Socket for the API. Aug 19 08:15:28.639591 systemd[1]: Reached target sockets.target - Socket Units. Aug 19 08:15:28.640010 systemd[1]: Reached target basic.target - Basic System. Aug 19 08:15:28.640500 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Aug 19 08:15:28.640538 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Aug 19 08:15:28.641643 systemd[1]: Starting containerd.service - containerd container runtime... Aug 19 08:15:28.644028 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Aug 19 08:15:28.648407 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Aug 19 08:15:28.654535 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Aug 19 08:15:28.660346 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Aug 19 08:15:28.668197 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Aug 19 08:15:28.668863 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Aug 19 08:15:28.672100 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Aug 19 08:15:28.674172 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Aug 19 08:15:28.676802 jq[1962]: false Aug 19 08:15:28.681542 systemd[1]: Started ntpd.service - Network Time Service. Aug 19 08:15:28.690545 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Aug 19 08:15:28.696532 systemd[1]: Starting setup-oem.service - Setup OEM... Aug 19 08:15:28.711609 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... 
Aug 19 08:15:28.726460 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Aug 19 08:15:28.740010 systemd[1]: Starting systemd-logind.service - User Login Management... Aug 19 08:15:28.747021 extend-filesystems[1963]: Found /dev/nvme0n1p6 Aug 19 08:15:28.743702 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Aug 19 08:15:28.745020 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Aug 19 08:15:28.749575 systemd[1]: Starting update-engine.service - Update Engine... Aug 19 08:15:28.758490 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Aug 19 08:15:28.765611 extend-filesystems[1963]: Found /dev/nvme0n1p9 Aug 19 08:15:28.776119 extend-filesystems[1963]: Checking size of /dev/nvme0n1p9 Aug 19 08:15:28.778432 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Aug 19 08:15:28.781260 jq[1979]: true Aug 19 08:15:28.780785 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Aug 19 08:15:28.786696 oslogin_cache_refresh[1964]: Refreshing passwd entry cache Aug 19 08:15:28.807575 google_oslogin_nss_cache[1964]: oslogin_cache_refresh[1964]: Refreshing passwd entry cache Aug 19 08:15:28.784108 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Aug 19 08:15:28.791853 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Aug 19 08:15:28.792128 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Aug 19 08:15:28.824877 oslogin_cache_refresh[1964]: Failure getting users, quitting Aug 19 08:15:28.826436 google_oslogin_nss_cache[1964]: oslogin_cache_refresh[1964]: Failure getting users, quitting Aug 19 08:15:28.826436 google_oslogin_nss_cache[1964]: oslogin_cache_refresh[1964]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Aug 19 08:15:28.826436 google_oslogin_nss_cache[1964]: oslogin_cache_refresh[1964]: Refreshing group entry cache Aug 19 08:15:28.824898 oslogin_cache_refresh[1964]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Aug 19 08:15:28.824949 oslogin_cache_refresh[1964]: Refreshing group entry cache Aug 19 08:15:28.837808 google_oslogin_nss_cache[1964]: oslogin_cache_refresh[1964]: Failure getting groups, quitting Aug 19 08:15:28.837808 google_oslogin_nss_cache[1964]: oslogin_cache_refresh[1964]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Aug 19 08:15:28.833954 oslogin_cache_refresh[1964]: Failure getting groups, quitting Aug 19 08:15:28.833970 oslogin_cache_refresh[1964]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Aug 19 08:15:28.838252 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Aug 19 08:15:28.840190 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. 
Aug 19 08:15:28.853024 jq[1989]: true Aug 19 08:15:28.855420 extend-filesystems[1963]: Resized partition /dev/nvme0n1p9 Aug 19 08:15:28.875034 extend-filesystems[2013]: resize2fs 1.47.2 (1-Jan-2025) Aug 19 08:15:28.892761 tar[1986]: linux-amd64/helm Aug 19 08:15:28.901329 kernel: EXT4-fs (nvme0n1p9): resizing filesystem from 553472 to 1489915 blocks Aug 19 08:15:28.920780 update_engine[1978]: I20250819 08:15:28.920667 1978 main.cc:92] Flatcar Update Engine starting Aug 19 08:15:28.933597 ntpd[1966]: ntpd 4.2.8p17@1.4004-o Mon Aug 18 21:32:19 UTC 2025 (1): Starting Aug 19 08:15:28.933631 ntpd[1966]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp Aug 19 08:15:28.934010 ntpd[1966]: 19 Aug 08:15:28 ntpd[1966]: ntpd 4.2.8p17@1.4004-o Mon Aug 18 21:32:19 UTC 2025 (1): Starting Aug 19 08:15:28.934010 ntpd[1966]: 19 Aug 08:15:28 ntpd[1966]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp Aug 19 08:15:28.934010 ntpd[1966]: 19 Aug 08:15:28 ntpd[1966]: ---------------------------------------------------- Aug 19 08:15:28.934010 ntpd[1966]: 19 Aug 08:15:28 ntpd[1966]: ntp-4 is maintained by Network Time Foundation, Aug 19 08:15:28.934010 ntpd[1966]: 19 Aug 08:15:28 ntpd[1966]: Inc. (NTF), a non-profit 501(c)(3) public-benefit Aug 19 08:15:28.934010 ntpd[1966]: 19 Aug 08:15:28 ntpd[1966]: corporation. Support and training for ntp-4 are Aug 19 08:15:28.934010 ntpd[1966]: 19 Aug 08:15:28 ntpd[1966]: available at https://www.nwtime.org/support Aug 19 08:15:28.934010 ntpd[1966]: 19 Aug 08:15:28 ntpd[1966]: ---------------------------------------------------- Aug 19 08:15:28.933641 ntpd[1966]: ---------------------------------------------------- Aug 19 08:15:28.933650 ntpd[1966]: ntp-4 is maintained by Network Time Foundation, Aug 19 08:15:28.933658 ntpd[1966]: Inc. (NTF), a non-profit 501(c)(3) public-benefit Aug 19 08:15:28.933666 ntpd[1966]: corporation. Support and training for ntp-4 are Aug 19 08:15:28.933676 ntpd[1966]: available at https://www.nwtime.org/support Aug 19 08:15:28.933686 ntpd[1966]: ---------------------------------------------------- Aug 19 08:15:28.955428 update_engine[1978]: I20250819 08:15:28.952857 1978 update_check_scheduler.cc:74] Next update check in 4m39s Aug 19 08:15:28.955512 ntpd[1966]: 19 Aug 08:15:28 ntpd[1966]: proto: precision = 0.090 usec (-23) Aug 19 08:15:28.955512 ntpd[1966]: 19 Aug 08:15:28 ntpd[1966]: basedate set to 2025-08-06 Aug 19 08:15:28.955512 ntpd[1966]: 19 Aug 08:15:28 ntpd[1966]: gps base set to 2025-08-10 (week 2379) Aug 19 08:15:28.946564 dbus-daemon[1960]: [system] SELinux support is enabled Aug 19 08:15:28.945842 systemd[1]: motdgen.service: Deactivated successfully. Aug 19 08:15:28.946813 ntpd[1966]: proto: precision = 0.090 usec (-23) Aug 19 08:15:28.946143 (ntainerd)[2014]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Aug 19 08:15:28.950394 ntpd[1966]: basedate set to 2025-08-06 Aug 19 08:15:28.946153 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Aug 19 08:15:28.950413 ntpd[1966]: gps base set to 2025-08-10 (week 2379) Aug 19 08:15:28.947002 systemd[1]: Started dbus.service - D-Bus System Message Bus. 
Aug 19 08:15:28.951492 dbus-daemon[1960]: [system] Activating systemd to hand-off: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.0' (uid=244 pid=1812 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0") Aug 19 08:15:28.958379 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Aug 19 08:15:28.958434 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Aug 19 08:15:28.961437 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Aug 19 08:15:28.961465 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Aug 19 08:15:28.966949 ntpd[1966]: Listen and drop on 0 v6wildcard [::]:123 Aug 19 08:15:28.969478 systemd[1]: Started update-engine.service - Update Engine. Aug 19 08:15:28.973643 ntpd[1966]: 19 Aug 08:15:28 ntpd[1966]: Listen and drop on 0 v6wildcard [::]:123 Aug 19 08:15:28.973643 ntpd[1966]: 19 Aug 08:15:28 ntpd[1966]: Listen and drop on 1 v4wildcard 0.0.0.0:123 Aug 19 08:15:28.973643 ntpd[1966]: 19 Aug 08:15:28 ntpd[1966]: Listen normally on 2 lo 127.0.0.1:123 Aug 19 08:15:28.973643 ntpd[1966]: 19 Aug 08:15:28 ntpd[1966]: Listen normally on 3 eth0 172.31.23.28:123 Aug 19 08:15:28.973643 ntpd[1966]: 19 Aug 08:15:28 ntpd[1966]: Listen normally on 4 lo [::1]:123 Aug 19 08:15:28.973643 ntpd[1966]: 19 Aug 08:15:28 ntpd[1966]: bind(21) AF_INET6 fe80::4ac:aeff:fe44:e39b%2#123 flags 0x11 failed: Cannot assign requested address Aug 19 08:15:28.973643 ntpd[1966]: 19 Aug 08:15:28 ntpd[1966]: unable to create socket on eth0 (5) for fe80::4ac:aeff:fe44:e39b%2#123 Aug 19 08:15:28.973643 ntpd[1966]: 19 Aug 08:15:28 ntpd[1966]: failed to init interface for address fe80::4ac:aeff:fe44:e39b%2 Aug 19 08:15:28.973643 ntpd[1966]: 19 Aug 08:15:28 ntpd[1966]: Listening on routing socket on fd #21 for interface updates Aug 19 08:15:28.967012 ntpd[1966]: Listen and drop on 1 v4wildcard 0.0.0.0:123 Aug 19 08:15:28.967201 ntpd[1966]: Listen normally on 2 lo 127.0.0.1:123 Aug 19 08:15:28.967235 ntpd[1966]: Listen normally on 3 eth0 172.31.23.28:123 Aug 19 08:15:28.967289 ntpd[1966]: Listen normally on 4 lo [::1]:123 Aug 19 08:15:28.967335 ntpd[1966]: bind(21) AF_INET6 fe80::4ac:aeff:fe44:e39b%2#123 flags 0x11 failed: Cannot assign requested address Aug 19 08:15:28.967355 ntpd[1966]: unable to create socket on eth0 (5) for fe80::4ac:aeff:fe44:e39b%2#123 Aug 19 08:15:28.967368 ntpd[1966]: failed to init interface for address fe80::4ac:aeff:fe44:e39b%2 Aug 19 08:15:28.967398 ntpd[1966]: Listening on routing socket on fd #21 for interface updates Aug 19 08:15:28.970632 dbus-daemon[1960]: [system] Successfully activated service 'org.freedesktop.systemd1' Aug 19 08:15:28.980995 systemd[1]: Starting systemd-hostnamed.service - Hostname Service... Aug 19 08:15:28.992588 coreos-metadata[1959]: Aug 19 08:15:28.991 INFO Putting http://169.254.169.254/latest/api/token: Attempt #1 Aug 19 08:15:28.996958 ntpd[1966]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Aug 19 08:15:29.001011 systemd[1]: Started locksmithd.service - Cluster reboot manager. 
Aug 19 08:15:29.007936 ntpd[1966]: 19 Aug 08:15:28 ntpd[1966]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Aug 19 08:15:29.007936 ntpd[1966]: 19 Aug 08:15:28 ntpd[1966]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Aug 19 08:15:29.008014 coreos-metadata[1959]: Aug 19 08:15:28.997 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/instance-id: Attempt #1 Aug 19 08:15:28.996997 ntpd[1966]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Aug 19 08:15:29.002569 systemd[1]: Finished setup-oem.service - Setup OEM. Aug 19 08:15:29.006542 systemd-logind[1976]: Watching system buttons on /dev/input/event2 (Power Button) Aug 19 08:15:29.006590 systemd-logind[1976]: Watching system buttons on /dev/input/event3 (Sleep Button) Aug 19 08:15:29.006616 systemd-logind[1976]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Aug 19 08:15:29.007805 systemd-logind[1976]: New seat seat0. Aug 19 08:15:29.015497 coreos-metadata[1959]: Aug 19 08:15:29.011 INFO Fetch successful Aug 19 08:15:29.015497 coreos-metadata[1959]: Aug 19 08:15:29.012 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/instance-type: Attempt #1 Aug 19 08:15:29.015497 coreos-metadata[1959]: Aug 19 08:15:29.013 INFO Fetch successful Aug 19 08:15:29.015497 coreos-metadata[1959]: Aug 19 08:15:29.013 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/local-ipv4: Attempt #1 Aug 19 08:15:29.012643 systemd[1]: Started systemd-logind.service - User Login Management. Aug 19 08:15:29.017043 coreos-metadata[1959]: Aug 19 08:15:29.015 INFO Fetch successful Aug 19 08:15:29.017043 coreos-metadata[1959]: Aug 19 08:15:29.016 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-ipv4: Attempt #1 Aug 19 08:15:29.025837 coreos-metadata[1959]: Aug 19 08:15:29.020 INFO Fetch successful Aug 19 08:15:29.025837 coreos-metadata[1959]: Aug 19 08:15:29.020 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/ipv6: Attempt #1 Aug 19 08:15:29.027263 coreos-metadata[1959]: Aug 19 08:15:29.026 INFO Fetch failed with 404: resource not found Aug 19 08:15:29.027263 coreos-metadata[1959]: Aug 19 08:15:29.026 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/placement/availability-zone: Attempt #1 Aug 19 08:15:29.030303 kernel: EXT4-fs (nvme0n1p9): resized filesystem to 1489915 Aug 19 08:15:29.037094 coreos-metadata[1959]: Aug 19 08:15:29.031 INFO Fetch successful Aug 19 08:15:29.037094 coreos-metadata[1959]: Aug 19 08:15:29.032 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/placement/availability-zone-id: Attempt #1 Aug 19 08:15:29.037781 coreos-metadata[1959]: Aug 19 08:15:29.037 INFO Fetch successful Aug 19 08:15:29.037781 coreos-metadata[1959]: Aug 19 08:15:29.037 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/hostname: Attempt #1 Aug 19 08:15:29.046575 coreos-metadata[1959]: Aug 19 08:15:29.040 INFO Fetch successful Aug 19 08:15:29.046575 coreos-metadata[1959]: Aug 19 08:15:29.041 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-hostname: Attempt #1 Aug 19 08:15:29.046575 coreos-metadata[1959]: Aug 19 08:15:29.045 INFO Fetch successful Aug 19 08:15:29.046575 coreos-metadata[1959]: Aug 19 08:15:29.045 INFO Fetching http://169.254.169.254/2021-01-03/dynamic/instance-identity/document: Attempt #1 Aug 19 08:15:29.051741 coreos-metadata[1959]: Aug 19 08:15:29.047 INFO Fetch successful Aug 19 08:15:29.054256 extend-filesystems[2013]: Filesystem at /dev/nvme0n1p9 is mounted on /; on-line resizing required Aug 19 08:15:29.054256 
extend-filesystems[2013]: old_desc_blocks = 1, new_desc_blocks = 1 Aug 19 08:15:29.054256 extend-filesystems[2013]: The filesystem on /dev/nvme0n1p9 is now 1489915 (4k) blocks long. Aug 19 08:15:29.075468 extend-filesystems[1963]: Resized filesystem in /dev/nvme0n1p9 Aug 19 08:15:29.055332 systemd[1]: extend-filesystems.service: Deactivated successfully. Aug 19 08:15:29.055661 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Aug 19 08:15:29.165201 bash[2053]: Updated "/home/core/.ssh/authorized_keys" Aug 19 08:15:29.155995 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Aug 19 08:15:29.183979 systemd[1]: Starting sshkeys.service... Aug 19 08:15:29.269972 sshd_keygen[2008]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Aug 19 08:15:29.292572 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Aug 19 08:15:29.296898 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Aug 19 08:15:29.307894 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Aug 19 08:15:29.313501 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Aug 19 08:15:29.420442 locksmithd[2032]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Aug 19 08:15:29.444762 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Aug 19 08:15:29.448177 systemd[1]: Starting issuegen.service - Generate /run/issue... Aug 19 08:15:29.491411 systemd[1]: issuegen.service: Deactivated successfully. Aug 19 08:15:29.491724 systemd[1]: Finished issuegen.service - Generate /run/issue. Aug 19 08:15:29.496481 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Aug 19 08:15:29.565374 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Aug 19 08:15:29.575531 systemd[1]: Started getty@tty1.service - Getty on tty1. Aug 19 08:15:29.583079 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Aug 19 08:15:29.584185 systemd[1]: Reached target getty.target - Login Prompts. Aug 19 08:15:29.611308 systemd[1]: Started systemd-hostnamed.service - Hostname Service. Aug 19 08:15:29.614187 dbus-daemon[1960]: [system] Successfully activated service 'org.freedesktop.hostname1' Aug 19 08:15:29.614958 dbus-daemon[1960]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.7' (uid=0 pid=2028 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0") Aug 19 08:15:29.625687 systemd[1]: Starting polkit.service - Authorization Manager... 
Aug 19 08:15:29.633703 coreos-metadata[2123]: Aug 19 08:15:29.633 INFO Putting http://169.254.169.254/latest/api/token: Attempt #1 Aug 19 08:15:29.638672 coreos-metadata[2123]: Aug 19 08:15:29.637 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-keys: Attempt #1 Aug 19 08:15:29.638813 coreos-metadata[2123]: Aug 19 08:15:29.638 INFO Fetch successful Aug 19 08:15:29.638897 coreos-metadata[2123]: Aug 19 08:15:29.638 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-keys/0/openssh-key: Attempt #1 Aug 19 08:15:29.640688 coreos-metadata[2123]: Aug 19 08:15:29.640 INFO Fetch successful Aug 19 08:15:29.642859 unknown[2123]: wrote ssh authorized keys file for user: core Aug 19 08:15:29.699973 update-ssh-keys[2174]: Updated "/home/core/.ssh/authorized_keys" Aug 19 08:15:29.704249 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Aug 19 08:15:29.709851 systemd[1]: Finished sshkeys.service. Aug 19 08:15:29.802536 containerd[2014]: time="2025-08-19T08:15:29Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Aug 19 08:15:29.803152 containerd[2014]: time="2025-08-19T08:15:29.803110141Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Aug 19 08:15:29.814575 polkitd[2167]: Started polkitd version 126 Aug 19 08:15:29.823879 polkitd[2167]: Loading rules from directory /etc/polkit-1/rules.d Aug 19 08:15:29.824536 polkitd[2167]: Loading rules from directory /run/polkit-1/rules.d Aug 19 08:15:29.824645 polkitd[2167]: Error opening rules directory: Error opening directory “/run/polkit-1/rules.d”: No such file or directory (g-file-error-quark, 4) Aug 19 08:15:29.825170 polkitd[2167]: Loading rules from directory /usr/local/share/polkit-1/rules.d Aug 19 08:15:29.826440 systemd[1]: Started polkit.service - Authorization Manager. 
Aug 19 08:15:29.825199 polkitd[2167]: Error opening rules directory: Error opening directory “/usr/local/share/polkit-1/rules.d”: No such file or directory (g-file-error-quark, 4) Aug 19 08:15:29.825306 polkitd[2167]: Loading rules from directory /usr/share/polkit-1/rules.d Aug 19 08:15:29.826076 polkitd[2167]: Finished loading, compiling and executing 2 rules Aug 19 08:15:29.827000 containerd[2014]: time="2025-08-19T08:15:29.826955575Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="12.498µs" Aug 19 08:15:29.827108 containerd[2014]: time="2025-08-19T08:15:29.827088440Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Aug 19 08:15:29.827182 containerd[2014]: time="2025-08-19T08:15:29.827167318Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Aug 19 08:15:29.827451 containerd[2014]: time="2025-08-19T08:15:29.827426009Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Aug 19 08:15:29.827566 containerd[2014]: time="2025-08-19T08:15:29.827548320Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Aug 19 08:15:29.827656 containerd[2014]: time="2025-08-19T08:15:29.827641197Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Aug 19 08:15:29.827801 containerd[2014]: time="2025-08-19T08:15:29.827779460Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Aug 19 08:15:29.827873 containerd[2014]: time="2025-08-19T08:15:29.827857734Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Aug 19 08:15:29.828306 containerd[2014]: time="2025-08-19T08:15:29.828245959Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Aug 19 08:15:29.828940 dbus-daemon[1960]: [system] Successfully activated service 'org.freedesktop.PolicyKit1' Aug 19 08:15:29.829069 containerd[2014]: time="2025-08-19T08:15:29.829045261Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Aug 19 08:15:29.829564 polkitd[2167]: Acquired the name org.freedesktop.PolicyKit1 on the system bus Aug 19 08:15:29.829691 containerd[2014]: time="2025-08-19T08:15:29.829667600Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Aug 19 08:15:29.829783 containerd[2014]: time="2025-08-19T08:15:29.829767025Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Aug 19 08:15:29.829964 containerd[2014]: time="2025-08-19T08:15:29.829944579Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Aug 19 08:15:29.832019 containerd[2014]: time="2025-08-19T08:15:29.831350475Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Aug 19 08:15:29.832019 containerd[2014]: time="2025-08-19T08:15:29.831422774Z" level=info msg="skip loading plugin" 
error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Aug 19 08:15:29.832019 containerd[2014]: time="2025-08-19T08:15:29.831441048Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Aug 19 08:15:29.832019 containerd[2014]: time="2025-08-19T08:15:29.831482804Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Aug 19 08:15:29.832019 containerd[2014]: time="2025-08-19T08:15:29.831797757Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Aug 19 08:15:29.832019 containerd[2014]: time="2025-08-19T08:15:29.831879362Z" level=info msg="metadata content store policy set" policy=shared Aug 19 08:15:29.836353 containerd[2014]: time="2025-08-19T08:15:29.836252074Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Aug 19 08:15:29.837198 containerd[2014]: time="2025-08-19T08:15:29.837156372Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Aug 19 08:15:29.837273 containerd[2014]: time="2025-08-19T08:15:29.837229914Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Aug 19 08:15:29.837273 containerd[2014]: time="2025-08-19T08:15:29.837251990Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Aug 19 08:15:29.837273 containerd[2014]: time="2025-08-19T08:15:29.837268338Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Aug 19 08:15:29.837393 containerd[2014]: time="2025-08-19T08:15:29.837297846Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Aug 19 08:15:29.837393 containerd[2014]: time="2025-08-19T08:15:29.837318826Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Aug 19 08:15:29.837393 containerd[2014]: time="2025-08-19T08:15:29.837349501Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Aug 19 08:15:29.837393 containerd[2014]: time="2025-08-19T08:15:29.837365923Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Aug 19 08:15:29.837393 containerd[2014]: time="2025-08-19T08:15:29.837381360Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Aug 19 08:15:29.837560 containerd[2014]: time="2025-08-19T08:15:29.837396375Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Aug 19 08:15:29.837560 containerd[2014]: time="2025-08-19T08:15:29.837414872Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Aug 19 08:15:29.837628 containerd[2014]: time="2025-08-19T08:15:29.837558955Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Aug 19 08:15:29.837628 containerd[2014]: time="2025-08-19T08:15:29.837585372Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Aug 19 08:15:29.837628 containerd[2014]: time="2025-08-19T08:15:29.837612460Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 
Aug 19 08:15:29.837736 containerd[2014]: time="2025-08-19T08:15:29.837629320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Aug 19 08:15:29.837736 containerd[2014]: time="2025-08-19T08:15:29.837644667Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Aug 19 08:15:29.837736 containerd[2014]: time="2025-08-19T08:15:29.837661304Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Aug 19 08:15:29.837736 containerd[2014]: time="2025-08-19T08:15:29.837677392Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Aug 19 08:15:29.837736 containerd[2014]: time="2025-08-19T08:15:29.837692012Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Aug 19 08:15:29.837736 containerd[2014]: time="2025-08-19T08:15:29.837708522Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Aug 19 08:15:29.837736 containerd[2014]: time="2025-08-19T08:15:29.837725438Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Aug 19 08:15:29.837959 containerd[2014]: time="2025-08-19T08:15:29.837740280Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Aug 19 08:15:29.837959 containerd[2014]: time="2025-08-19T08:15:29.837820109Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Aug 19 08:15:29.837959 containerd[2014]: time="2025-08-19T08:15:29.837838433Z" level=info msg="Start snapshots syncer" Aug 19 08:15:29.837959 containerd[2014]: time="2025-08-19T08:15:29.837871906Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Aug 19 08:15:29.840971 containerd[2014]: time="2025-08-19T08:15:29.840903529Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Aug 19 08:15:29.841915 containerd[2014]: time="2025-08-19T08:15:29.841367859Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Aug 19 08:15:29.843083 containerd[2014]: time="2025-08-19T08:15:29.842418613Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Aug 19 08:15:29.843299 containerd[2014]: time="2025-08-19T08:15:29.843232947Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Aug 19 08:15:29.843299 containerd[2014]: time="2025-08-19T08:15:29.843294702Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Aug 19 08:15:29.843425 containerd[2014]: time="2025-08-19T08:15:29.843314682Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Aug 19 08:15:29.843425 containerd[2014]: time="2025-08-19T08:15:29.843330244Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Aug 19 08:15:29.843425 containerd[2014]: time="2025-08-19T08:15:29.843346980Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Aug 19 08:15:29.843425 containerd[2014]: time="2025-08-19T08:15:29.843362508Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Aug 19 08:15:29.843425 containerd[2014]: time="2025-08-19T08:15:29.843380027Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Aug 19 08:15:29.843425 containerd[2014]: time="2025-08-19T08:15:29.843421752Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Aug 19 08:15:29.843704 containerd[2014]: 
time="2025-08-19T08:15:29.843438034Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Aug 19 08:15:29.843704 containerd[2014]: time="2025-08-19T08:15:29.843455384Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Aug 19 08:15:29.843704 containerd[2014]: time="2025-08-19T08:15:29.843498873Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Aug 19 08:15:29.843704 containerd[2014]: time="2025-08-19T08:15:29.843521184Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Aug 19 08:15:29.843704 containerd[2014]: time="2025-08-19T08:15:29.843535488Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Aug 19 08:15:29.843704 containerd[2014]: time="2025-08-19T08:15:29.843550103Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Aug 19 08:15:29.843704 containerd[2014]: time="2025-08-19T08:15:29.843561744Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Aug 19 08:15:29.843704 containerd[2014]: time="2025-08-19T08:15:29.843576749Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Aug 19 08:15:29.843704 containerd[2014]: time="2025-08-19T08:15:29.843599718Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Aug 19 08:15:29.843704 containerd[2014]: time="2025-08-19T08:15:29.843621977Z" level=info msg="runtime interface created" Aug 19 08:15:29.843704 containerd[2014]: time="2025-08-19T08:15:29.843629481Z" level=info msg="created NRI interface" Aug 19 08:15:29.843704 containerd[2014]: time="2025-08-19T08:15:29.843643244Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Aug 19 08:15:29.843704 containerd[2014]: time="2025-08-19T08:15:29.843665149Z" level=info msg="Connect containerd service" Aug 19 08:15:29.844590 containerd[2014]: time="2025-08-19T08:15:29.843713323Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Aug 19 08:15:29.847342 containerd[2014]: time="2025-08-19T08:15:29.847302672Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Aug 19 08:15:29.853454 systemd-hostnamed[2028]: Hostname set to (transient) Aug 19 08:15:29.853599 systemd-resolved[1813]: System hostname changed to 'ip-172-31-23-28'. 
Aug 19 08:15:29.934121 ntpd[1966]: bind(24) AF_INET6 fe80::4ac:aeff:fe44:e39b%2#123 flags 0x11 failed: Cannot assign requested address Aug 19 08:15:29.934673 ntpd[1966]: 19 Aug 08:15:29 ntpd[1966]: bind(24) AF_INET6 fe80::4ac:aeff:fe44:e39b%2#123 flags 0x11 failed: Cannot assign requested address Aug 19 08:15:29.934673 ntpd[1966]: 19 Aug 08:15:29 ntpd[1966]: unable to create socket on eth0 (6) for fe80::4ac:aeff:fe44:e39b%2#123 Aug 19 08:15:29.934673 ntpd[1966]: 19 Aug 08:15:29 ntpd[1966]: failed to init interface for address fe80::4ac:aeff:fe44:e39b%2 Aug 19 08:15:29.934167 ntpd[1966]: unable to create socket on eth0 (6) for fe80::4ac:aeff:fe44:e39b%2#123 Aug 19 08:15:29.934183 ntpd[1966]: failed to init interface for address fe80::4ac:aeff:fe44:e39b%2 Aug 19 08:15:29.959520 systemd-networkd[1812]: eth0: Gained IPv6LL Aug 19 08:15:29.966541 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Aug 19 08:15:29.968377 systemd[1]: Reached target network-online.target - Network is Online. Aug 19 08:15:29.974584 systemd[1]: Started amazon-ssm-agent.service - amazon-ssm-agent. Aug 19 08:15:29.979043 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 08:15:29.983957 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Aug 19 08:15:30.015807 tar[1986]: linux-amd64/LICENSE Aug 19 08:15:30.017401 tar[1986]: linux-amd64/README.md Aug 19 08:15:30.056360 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Aug 19 08:15:30.060836 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Aug 19 08:15:30.114509 containerd[2014]: time="2025-08-19T08:15:30.114242227Z" level=info msg="Start subscribing containerd event" Aug 19 08:15:30.114509 containerd[2014]: time="2025-08-19T08:15:30.114320034Z" level=info msg="Start recovering state" Aug 19 08:15:30.116314 containerd[2014]: time="2025-08-19T08:15:30.114838843Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Aug 19 08:15:30.116314 containerd[2014]: time="2025-08-19T08:15:30.115000587Z" level=info msg=serving... address=/run/containerd/containerd.sock Aug 19 08:15:30.116314 containerd[2014]: time="2025-08-19T08:15:30.115908580Z" level=info msg="Start event monitor" Aug 19 08:15:30.116314 containerd[2014]: time="2025-08-19T08:15:30.115936514Z" level=info msg="Start cni network conf syncer for default" Aug 19 08:15:30.116314 containerd[2014]: time="2025-08-19T08:15:30.115946858Z" level=info msg="Start streaming server" Aug 19 08:15:30.116314 containerd[2014]: time="2025-08-19T08:15:30.115969351Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Aug 19 08:15:30.116314 containerd[2014]: time="2025-08-19T08:15:30.115979693Z" level=info msg="runtime interface starting up..." Aug 19 08:15:30.116314 containerd[2014]: time="2025-08-19T08:15:30.115987477Z" level=info msg="starting plugins..." Aug 19 08:15:30.116314 containerd[2014]: time="2025-08-19T08:15:30.116014817Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Aug 19 08:15:30.116314 containerd[2014]: time="2025-08-19T08:15:30.116164626Z" level=info msg="containerd successfully booted in 0.314460s" Aug 19 08:15:30.116430 systemd[1]: Started containerd.service - containerd container runtime. 
Aug 19 08:15:30.135403 amazon-ssm-agent[2196]: Initializing new seelog logger Aug 19 08:15:30.135764 amazon-ssm-agent[2196]: New Seelog Logger Creation Complete Aug 19 08:15:30.135764 amazon-ssm-agent[2196]: 2025/08/19 08:15:30 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Aug 19 08:15:30.135764 amazon-ssm-agent[2196]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Aug 19 08:15:30.136988 amazon-ssm-agent[2196]: 2025/08/19 08:15:30 processing appconfig overrides Aug 19 08:15:30.136988 amazon-ssm-agent[2196]: 2025/08/19 08:15:30 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Aug 19 08:15:30.136988 amazon-ssm-agent[2196]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Aug 19 08:15:30.136988 amazon-ssm-agent[2196]: 2025/08/19 08:15:30 processing appconfig overrides Aug 19 08:15:30.136988 amazon-ssm-agent[2196]: 2025/08/19 08:15:30 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Aug 19 08:15:30.136988 amazon-ssm-agent[2196]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Aug 19 08:15:30.136988 amazon-ssm-agent[2196]: 2025/08/19 08:15:30 processing appconfig overrides Aug 19 08:15:30.137419 amazon-ssm-agent[2196]: 2025-08-19 08:15:30.1363 INFO Proxy environment variables: Aug 19 08:15:30.139566 amazon-ssm-agent[2196]: 2025/08/19 08:15:30 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Aug 19 08:15:30.139566 amazon-ssm-agent[2196]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Aug 19 08:15:30.139566 amazon-ssm-agent[2196]: 2025/08/19 08:15:30 processing appconfig overrides Aug 19 08:15:30.237593 amazon-ssm-agent[2196]: 2025-08-19 08:15:30.1364 INFO https_proxy: Aug 19 08:15:30.335810 amazon-ssm-agent[2196]: 2025-08-19 08:15:30.1364 INFO http_proxy: Aug 19 08:15:30.415290 amazon-ssm-agent[2196]: 2025/08/19 08:15:30 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Aug 19 08:15:30.415290 amazon-ssm-agent[2196]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Aug 19 08:15:30.415442 amazon-ssm-agent[2196]: 2025/08/19 08:15:30 processing appconfig overrides Aug 19 08:15:30.433909 amazon-ssm-agent[2196]: 2025-08-19 08:15:30.1364 INFO no_proxy: Aug 19 08:15:30.442990 amazon-ssm-agent[2196]: 2025-08-19 08:15:30.1365 INFO Checking if agent identity type OnPrem can be assumed Aug 19 08:15:30.442990 amazon-ssm-agent[2196]: 2025-08-19 08:15:30.1367 INFO Checking if agent identity type EC2 can be assumed Aug 19 08:15:30.442990 amazon-ssm-agent[2196]: 2025-08-19 08:15:30.1697 INFO Agent will take identity from EC2 Aug 19 08:15:30.442990 amazon-ssm-agent[2196]: 2025-08-19 08:15:30.1710 INFO [amazon-ssm-agent] amazon-ssm-agent - v3.3.0.0 Aug 19 08:15:30.442990 amazon-ssm-agent[2196]: 2025-08-19 08:15:30.1710 INFO [amazon-ssm-agent] OS: linux, Arch: amd64 Aug 19 08:15:30.442990 amazon-ssm-agent[2196]: 2025-08-19 08:15:30.1710 INFO [amazon-ssm-agent] Starting Core Agent Aug 19 08:15:30.442990 amazon-ssm-agent[2196]: 2025-08-19 08:15:30.1710 INFO [amazon-ssm-agent] Registrar detected. 
Attempting registration Aug 19 08:15:30.442990 amazon-ssm-agent[2196]: 2025-08-19 08:15:30.1710 INFO [Registrar] Starting registrar module Aug 19 08:15:30.442990 amazon-ssm-agent[2196]: 2025-08-19 08:15:30.1722 INFO [EC2Identity] Checking disk for registration info Aug 19 08:15:30.442990 amazon-ssm-agent[2196]: 2025-08-19 08:15:30.1722 INFO [EC2Identity] No registration info found for ec2 instance, attempting registration Aug 19 08:15:30.442990 amazon-ssm-agent[2196]: 2025-08-19 08:15:30.1722 INFO [EC2Identity] Generating registration keypair Aug 19 08:15:30.442990 amazon-ssm-agent[2196]: 2025-08-19 08:15:30.3736 INFO [EC2Identity] Checking write access before registering Aug 19 08:15:30.442990 amazon-ssm-agent[2196]: 2025-08-19 08:15:30.3742 INFO [EC2Identity] Registering EC2 instance with Systems Manager Aug 19 08:15:30.442990 amazon-ssm-agent[2196]: 2025-08-19 08:15:30.4150 INFO [EC2Identity] EC2 registration was successful. Aug 19 08:15:30.442990 amazon-ssm-agent[2196]: 2025-08-19 08:15:30.4151 INFO [amazon-ssm-agent] Registration attempted. Resuming core agent startup. Aug 19 08:15:30.442990 amazon-ssm-agent[2196]: 2025-08-19 08:15:30.4152 INFO [CredentialRefresher] credentialRefresher has started Aug 19 08:15:30.442990 amazon-ssm-agent[2196]: 2025-08-19 08:15:30.4152 INFO [CredentialRefresher] Starting credentials refresher loop Aug 19 08:15:30.442990 amazon-ssm-agent[2196]: 2025-08-19 08:15:30.4425 INFO EC2RoleProvider Successfully connected with instance profile role credentials Aug 19 08:15:30.442990 amazon-ssm-agent[2196]: 2025-08-19 08:15:30.4428 INFO [CredentialRefresher] Credentials ready Aug 19 08:15:30.531539 amazon-ssm-agent[2196]: 2025-08-19 08:15:30.4430 INFO [CredentialRefresher] Next credential rotation will be in 29.999992424216668 minutes Aug 19 08:15:31.457633 amazon-ssm-agent[2196]: 2025-08-19 08:15:31.4575 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] [WorkerProvider] Worker ssm-agent-worker is not running, starting worker process Aug 19 08:15:31.558771 amazon-ssm-agent[2196]: 2025-08-19 08:15:31.4603 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] [WorkerProvider] Worker ssm-agent-worker (pid:2224) started Aug 19 08:15:31.659142 amazon-ssm-agent[2196]: 2025-08-19 08:15:31.4603 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] Monitor long running worker health every 60 seconds Aug 19 08:15:32.069469 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 08:15:32.070853 systemd[1]: Reached target multi-user.target - Multi-User System. Aug 19 08:15:32.074471 systemd[1]: Startup finished in 2.741s (kernel) + 6.128s (initrd) + 7.323s (userspace) = 16.193s. Aug 19 08:15:32.080210 (kubelet)[2241]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 19 08:15:32.203908 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Aug 19 08:15:32.205352 systemd[1]: Started sshd@0-172.31.23.28:22-139.178.89.65:38882.service - OpenSSH per-connection server daemon (139.178.89.65:38882). Aug 19 08:15:32.414309 sshd[2247]: Accepted publickey for core from 139.178.89.65 port 38882 ssh2: RSA SHA256:SANMFU2cnK7/l5OXguT20pR9hK/sN9KMMvxRhI/63O4 Aug 19 08:15:32.416811 sshd-session[2247]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:15:32.425757 systemd[1]: Created slice user-500.slice - User Slice of UID 500. 
Aug 19 08:15:32.427055 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Aug 19 08:15:32.440777 systemd-logind[1976]: New session 1 of user core. Aug 19 08:15:32.453875 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Aug 19 08:15:32.457751 systemd[1]: Starting user@500.service - User Manager for UID 500... Aug 19 08:15:32.472112 (systemd)[2256]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Aug 19 08:15:32.476226 systemd-logind[1976]: New session c1 of user core. Aug 19 08:15:32.652455 systemd[2256]: Queued start job for default target default.target. Aug 19 08:15:32.669411 systemd[2256]: Created slice app.slice - User Application Slice. Aug 19 08:15:32.669446 systemd[2256]: Reached target paths.target - Paths. Aug 19 08:15:32.669490 systemd[2256]: Reached target timers.target - Timers. Aug 19 08:15:32.671015 systemd[2256]: Starting dbus.socket - D-Bus User Message Bus Socket... Aug 19 08:15:32.683537 systemd[2256]: Listening on dbus.socket - D-Bus User Message Bus Socket. Aug 19 08:15:32.683789 systemd[2256]: Reached target sockets.target - Sockets. Aug 19 08:15:32.683928 systemd[2256]: Reached target basic.target - Basic System. Aug 19 08:15:32.683981 systemd[2256]: Reached target default.target - Main User Target. Aug 19 08:15:32.684010 systemd[2256]: Startup finished in 200ms. Aug 19 08:15:32.684126 systemd[1]: Started user@500.service - User Manager for UID 500. Aug 19 08:15:32.690570 systemd[1]: Started session-1.scope - Session 1 of User core. Aug 19 08:15:32.837480 systemd[1]: Started sshd@1-172.31.23.28:22-139.178.89.65:38894.service - OpenSSH per-connection server daemon (139.178.89.65:38894). Aug 19 08:15:32.934179 ntpd[1966]: Listen normally on 7 eth0 [fe80::4ac:aeff:fe44:e39b%2]:123 Aug 19 08:15:32.934591 ntpd[1966]: 19 Aug 08:15:32 ntpd[1966]: Listen normally on 7 eth0 [fe80::4ac:aeff:fe44:e39b%2]:123 Aug 19 08:15:32.998696 sshd[2267]: Accepted publickey for core from 139.178.89.65 port 38894 ssh2: RSA SHA256:SANMFU2cnK7/l5OXguT20pR9hK/sN9KMMvxRhI/63O4 Aug 19 08:15:33.000234 sshd-session[2267]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:15:33.007299 systemd-logind[1976]: New session 2 of user core. Aug 19 08:15:33.012492 systemd[1]: Started session-2.scope - Session 2 of User core. Aug 19 08:15:33.140712 sshd[2271]: Connection closed by 139.178.89.65 port 38894 Aug 19 08:15:33.142472 sshd-session[2267]: pam_unix(sshd:session): session closed for user core Aug 19 08:15:33.147272 systemd-logind[1976]: Session 2 logged out. Waiting for processes to exit. Aug 19 08:15:33.147656 systemd[1]: sshd@1-172.31.23.28:22-139.178.89.65:38894.service: Deactivated successfully. Aug 19 08:15:33.149518 systemd[1]: session-2.scope: Deactivated successfully. Aug 19 08:15:33.152144 systemd-logind[1976]: Removed session 2. Aug 19 08:15:33.167648 kubelet[2241]: E0819 08:15:33.167610 2241 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 19 08:15:33.170916 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 19 08:15:33.171053 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
Aug 19 08:15:33.171553 systemd[1]: kubelet.service: Consumed 1.042s CPU time, 266.6M memory peak. Aug 19 08:15:33.173813 systemd[1]: Started sshd@2-172.31.23.28:22-139.178.89.65:38910.service - OpenSSH per-connection server daemon (139.178.89.65:38910). Aug 19 08:15:33.340653 sshd[2278]: Accepted publickey for core from 139.178.89.65 port 38910 ssh2: RSA SHA256:SANMFU2cnK7/l5OXguT20pR9hK/sN9KMMvxRhI/63O4 Aug 19 08:15:33.341904 sshd-session[2278]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:15:33.347309 systemd-logind[1976]: New session 3 of user core. Aug 19 08:15:33.353494 systemd[1]: Started session-3.scope - Session 3 of User core. Aug 19 08:15:33.471456 sshd[2281]: Connection closed by 139.178.89.65 port 38910 Aug 19 08:15:33.472018 sshd-session[2278]: pam_unix(sshd:session): session closed for user core Aug 19 08:15:33.475878 systemd[1]: sshd@2-172.31.23.28:22-139.178.89.65:38910.service: Deactivated successfully. Aug 19 08:15:33.477672 systemd[1]: session-3.scope: Deactivated successfully. Aug 19 08:15:33.478590 systemd-logind[1976]: Session 3 logged out. Waiting for processes to exit. Aug 19 08:15:33.479966 systemd-logind[1976]: Removed session 3. Aug 19 08:15:33.504127 systemd[1]: Started sshd@3-172.31.23.28:22-139.178.89.65:38924.service - OpenSSH per-connection server daemon (139.178.89.65:38924). Aug 19 08:15:33.677139 sshd[2287]: Accepted publickey for core from 139.178.89.65 port 38924 ssh2: RSA SHA256:SANMFU2cnK7/l5OXguT20pR9hK/sN9KMMvxRhI/63O4 Aug 19 08:15:33.678589 sshd-session[2287]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:15:33.683870 systemd-logind[1976]: New session 4 of user core. Aug 19 08:15:33.686467 systemd[1]: Started session-4.scope - Session 4 of User core. Aug 19 08:15:33.810880 sshd[2290]: Connection closed by 139.178.89.65 port 38924 Aug 19 08:15:33.811415 sshd-session[2287]: pam_unix(sshd:session): session closed for user core Aug 19 08:15:33.814860 systemd[1]: sshd@3-172.31.23.28:22-139.178.89.65:38924.service: Deactivated successfully. Aug 19 08:15:33.816511 systemd[1]: session-4.scope: Deactivated successfully. Aug 19 08:15:33.818506 systemd-logind[1976]: Session 4 logged out. Waiting for processes to exit. Aug 19 08:15:33.819348 systemd-logind[1976]: Removed session 4. Aug 19 08:15:33.844139 systemd[1]: Started sshd@4-172.31.23.28:22-139.178.89.65:38934.service - OpenSSH per-connection server daemon (139.178.89.65:38934). Aug 19 08:15:34.021916 sshd[2296]: Accepted publickey for core from 139.178.89.65 port 38934 ssh2: RSA SHA256:SANMFU2cnK7/l5OXguT20pR9hK/sN9KMMvxRhI/63O4 Aug 19 08:15:34.023429 sshd-session[2296]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:15:34.028793 systemd-logind[1976]: New session 5 of user core. Aug 19 08:15:34.034492 systemd[1]: Started session-5.scope - Session 5 of User core. Aug 19 08:15:34.155865 sudo[2300]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Aug 19 08:15:34.156142 sudo[2300]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 19 08:15:34.168607 sudo[2300]: pam_unix(sudo:session): session closed for user root Aug 19 08:15:34.191249 sshd[2299]: Connection closed by 139.178.89.65 port 38934 Aug 19 08:15:34.191936 sshd-session[2296]: pam_unix(sshd:session): session closed for user core Aug 19 08:15:34.195634 systemd[1]: sshd@4-172.31.23.28:22-139.178.89.65:38934.service: Deactivated successfully. 
Aug 19 08:15:34.197233 systemd[1]: session-5.scope: Deactivated successfully. Aug 19 08:15:34.199061 systemd-logind[1976]: Session 5 logged out. Waiting for processes to exit. Aug 19 08:15:34.200476 systemd-logind[1976]: Removed session 5. Aug 19 08:15:34.224155 systemd[1]: Started sshd@5-172.31.23.28:22-139.178.89.65:38948.service - OpenSSH per-connection server daemon (139.178.89.65:38948). Aug 19 08:15:34.391237 sshd[2306]: Accepted publickey for core from 139.178.89.65 port 38948 ssh2: RSA SHA256:SANMFU2cnK7/l5OXguT20pR9hK/sN9KMMvxRhI/63O4 Aug 19 08:15:34.392654 sshd-session[2306]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:15:34.397619 systemd-logind[1976]: New session 6 of user core. Aug 19 08:15:34.407526 systemd[1]: Started session-6.scope - Session 6 of User core. Aug 19 08:15:34.503139 sudo[2311]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Aug 19 08:15:34.503543 sudo[2311]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 19 08:15:34.512245 sudo[2311]: pam_unix(sudo:session): session closed for user root Aug 19 08:15:34.518147 sudo[2310]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Aug 19 08:15:34.518553 sudo[2310]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 19 08:15:34.529146 systemd[1]: Starting audit-rules.service - Load Audit Rules... Aug 19 08:15:34.575629 augenrules[2333]: No rules Aug 19 08:15:34.576989 systemd[1]: audit-rules.service: Deactivated successfully. Aug 19 08:15:34.577263 systemd[1]: Finished audit-rules.service - Load Audit Rules. Aug 19 08:15:34.578457 sudo[2310]: pam_unix(sudo:session): session closed for user root Aug 19 08:15:34.601226 sshd[2309]: Connection closed by 139.178.89.65 port 38948 Aug 19 08:15:34.601767 sshd-session[2306]: pam_unix(sshd:session): session closed for user core Aug 19 08:15:34.605657 systemd[1]: sshd@5-172.31.23.28:22-139.178.89.65:38948.service: Deactivated successfully. Aug 19 08:15:34.607450 systemd[1]: session-6.scope: Deactivated successfully. Aug 19 08:15:34.608216 systemd-logind[1976]: Session 6 logged out. Waiting for processes to exit. Aug 19 08:15:34.609459 systemd-logind[1976]: Removed session 6. Aug 19 08:15:34.638384 systemd[1]: Started sshd@6-172.31.23.28:22-139.178.89.65:38956.service - OpenSSH per-connection server daemon (139.178.89.65:38956). Aug 19 08:15:34.809649 sshd[2342]: Accepted publickey for core from 139.178.89.65 port 38956 ssh2: RSA SHA256:SANMFU2cnK7/l5OXguT20pR9hK/sN9KMMvxRhI/63O4 Aug 19 08:15:34.811234 sshd-session[2342]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:15:34.817573 systemd-logind[1976]: New session 7 of user core. Aug 19 08:15:34.825590 systemd[1]: Started session-7.scope - Session 7 of User core. Aug 19 08:15:34.920798 sudo[2346]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Aug 19 08:15:34.921071 sudo[2346]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 19 08:15:35.451370 systemd[1]: Starting docker.service - Docker Application Container Engine... 
Aug 19 08:15:35.461721 (dockerd)[2364]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Aug 19 08:15:35.851563 dockerd[2364]: time="2025-08-19T08:15:35.851265402Z" level=info msg="Starting up" Aug 19 08:15:35.854889 dockerd[2364]: time="2025-08-19T08:15:35.854849538Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Aug 19 08:15:35.866655 dockerd[2364]: time="2025-08-19T08:15:35.866612817Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Aug 19 08:15:36.724231 systemd-resolved[1813]: Clock change detected. Flushing caches. Aug 19 08:15:36.780991 dockerd[2364]: time="2025-08-19T08:15:36.780793596Z" level=info msg="Loading containers: start." Aug 19 08:15:36.792071 kernel: Initializing XFRM netlink socket Aug 19 08:15:37.049337 (udev-worker)[2386]: Network interface NamePolicy= disabled on kernel command line. Aug 19 08:15:37.097427 systemd-networkd[1812]: docker0: Link UP Aug 19 08:15:37.101862 dockerd[2364]: time="2025-08-19T08:15:37.101815875Z" level=info msg="Loading containers: done." Aug 19 08:15:37.117091 dockerd[2364]: time="2025-08-19T08:15:37.116686674Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Aug 19 08:15:37.117091 dockerd[2364]: time="2025-08-19T08:15:37.116792503Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Aug 19 08:15:37.117091 dockerd[2364]: time="2025-08-19T08:15:37.116895572Z" level=info msg="Initializing buildkit" Aug 19 08:15:37.141453 dockerd[2364]: time="2025-08-19T08:15:37.141405979Z" level=info msg="Completed buildkit initialization" Aug 19 08:15:37.148488 dockerd[2364]: time="2025-08-19T08:15:37.148416441Z" level=info msg="Daemon has completed initialization" Aug 19 08:15:37.148488 dockerd[2364]: time="2025-08-19T08:15:37.148477885Z" level=info msg="API listen on /run/docker.sock" Aug 19 08:15:37.148883 systemd[1]: Started docker.service - Docker Application Container Engine. Aug 19 08:15:38.292315 containerd[2014]: time="2025-08-19T08:15:38.292277007Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.12\"" Aug 19 08:15:38.832270 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4109131200.mount: Deactivated successfully. 
Aug 19 08:15:40.304066 containerd[2014]: time="2025-08-19T08:15:40.304001023Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:15:40.305203 containerd[2014]: time="2025-08-19T08:15:40.304961530Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.12: active requests=0, bytes read=28079631" Aug 19 08:15:40.306112 containerd[2014]: time="2025-08-19T08:15:40.306075436Z" level=info msg="ImageCreate event name:\"sha256:b1963c5b49c1722b8f408deaf83aafca7f48f47fed0ed14e5c10e93cc55974a7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:15:40.308590 containerd[2014]: time="2025-08-19T08:15:40.308554398Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:e9011c3bee8c06ecabd7816e119dca4e448c92f7a78acd891de3d2db1dc6c234\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:15:40.310049 containerd[2014]: time="2025-08-19T08:15:40.309617016Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.12\" with image id \"sha256:b1963c5b49c1722b8f408deaf83aafca7f48f47fed0ed14e5c10e93cc55974a7\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.12\", repo digest \"registry.k8s.io/kube-apiserver@sha256:e9011c3bee8c06ecabd7816e119dca4e448c92f7a78acd891de3d2db1dc6c234\", size \"28076431\" in 2.017300888s" Aug 19 08:15:40.310049 containerd[2014]: time="2025-08-19T08:15:40.309659481Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.12\" returns image reference \"sha256:b1963c5b49c1722b8f408deaf83aafca7f48f47fed0ed14e5c10e93cc55974a7\"" Aug 19 08:15:40.310341 containerd[2014]: time="2025-08-19T08:15:40.310303923Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.12\"" Aug 19 08:15:41.943430 containerd[2014]: time="2025-08-19T08:15:41.943378182Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:15:41.945548 containerd[2014]: time="2025-08-19T08:15:41.945491381Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.12: active requests=0, bytes read=24714681" Aug 19 08:15:41.948175 containerd[2014]: time="2025-08-19T08:15:41.948130723Z" level=info msg="ImageCreate event name:\"sha256:200c1a99a6f2b9d3b0a6e9b7362663513589341e0e58bc3b953a373efa735dfd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:15:41.951629 containerd[2014]: time="2025-08-19T08:15:41.951571993Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:d2862f94d87320267fddbd55db26556a267aa802e51d6b60f25786b4c428afc8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:15:41.952487 containerd[2014]: time="2025-08-19T08:15:41.952457979Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.12\" with image id \"sha256:200c1a99a6f2b9d3b0a6e9b7362663513589341e0e58bc3b953a373efa735dfd\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.12\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:d2862f94d87320267fddbd55db26556a267aa802e51d6b60f25786b4c428afc8\", size \"26317875\" in 1.642121335s" Aug 19 08:15:41.952553 containerd[2014]: time="2025-08-19T08:15:41.952492380Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.12\" returns image reference \"sha256:200c1a99a6f2b9d3b0a6e9b7362663513589341e0e58bc3b953a373efa735dfd\"" Aug 19 
08:15:41.953135 containerd[2014]: time="2025-08-19T08:15:41.953113119Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.12\"" Aug 19 08:15:43.407956 containerd[2014]: time="2025-08-19T08:15:43.407877647Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:15:43.415542 containerd[2014]: time="2025-08-19T08:15:43.415481357Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.12: active requests=0, bytes read=18782427" Aug 19 08:15:43.423116 containerd[2014]: time="2025-08-19T08:15:43.422975392Z" level=info msg="ImageCreate event name:\"sha256:bcdd9599681a9460a5539177a986dbdaf880ac56eeb117ab94adb8f37889efba\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:15:43.435927 containerd[2014]: time="2025-08-19T08:15:43.435614994Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:152943b7e30244f4415fd0a5860a2dccd91660fe983d30a28a10edb0cc8f6756\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:15:43.436954 containerd[2014]: time="2025-08-19T08:15:43.436640669Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.12\" with image id \"sha256:bcdd9599681a9460a5539177a986dbdaf880ac56eeb117ab94adb8f37889efba\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.12\", repo digest \"registry.k8s.io/kube-scheduler@sha256:152943b7e30244f4415fd0a5860a2dccd91660fe983d30a28a10edb0cc8f6756\", size \"20385639\" in 1.483485894s" Aug 19 08:15:43.436954 containerd[2014]: time="2025-08-19T08:15:43.436695705Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.12\" returns image reference \"sha256:bcdd9599681a9460a5539177a986dbdaf880ac56eeb117ab94adb8f37889efba\"" Aug 19 08:15:43.437656 containerd[2014]: time="2025-08-19T08:15:43.437543643Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.12\"" Aug 19 08:15:44.133179 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Aug 19 08:15:44.137289 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 08:15:44.450687 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 08:15:44.461999 (kubelet)[2649]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 19 08:15:44.532432 kubelet[2649]: E0819 08:15:44.532343 2649 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 19 08:15:44.538237 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 19 08:15:44.538430 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 19 08:15:44.539140 systemd[1]: kubelet.service: Consumed 216ms CPU time, 110.2M memory peak. Aug 19 08:15:44.730275 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1590541220.mount: Deactivated successfully. 
Aug 19 08:15:45.324858 containerd[2014]: time="2025-08-19T08:15:45.324784023Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:15:45.326801 containerd[2014]: time="2025-08-19T08:15:45.326740376Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.12: active requests=0, bytes read=30384255" Aug 19 08:15:45.329279 containerd[2014]: time="2025-08-19T08:15:45.329221535Z" level=info msg="ImageCreate event name:\"sha256:507cc52f5f78c0cff25e904c76c18e6bfc90982e9cc2aa4dcb19033f21c3f679\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:15:45.332357 containerd[2014]: time="2025-08-19T08:15:45.332290088Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:90aa6b5f4065937521ff8438bc705317485d0be3f8b00a07145e697d92cc2cc6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:15:45.333140 containerd[2014]: time="2025-08-19T08:15:45.332763129Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.12\" with image id \"sha256:507cc52f5f78c0cff25e904c76c18e6bfc90982e9cc2aa4dcb19033f21c3f679\", repo tag \"registry.k8s.io/kube-proxy:v1.31.12\", repo digest \"registry.k8s.io/kube-proxy@sha256:90aa6b5f4065937521ff8438bc705317485d0be3f8b00a07145e697d92cc2cc6\", size \"30383274\" in 1.895113896s" Aug 19 08:15:45.333140 containerd[2014]: time="2025-08-19T08:15:45.332806776Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.12\" returns image reference \"sha256:507cc52f5f78c0cff25e904c76c18e6bfc90982e9cc2aa4dcb19033f21c3f679\"" Aug 19 08:15:45.333262 containerd[2014]: time="2025-08-19T08:15:45.333234844Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Aug 19 08:15:45.869021 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1109618525.mount: Deactivated successfully. 
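The mount units being cleaned up here (var-lib-containerd-tmpmounts-containerd\x2dmount1109618525.mount and friends) use systemd's unit-name escaping, where a literal "-" inside a path component is written as \x2d. Below is a small Go sketch of a decoder for just those \xNN escapes; full systemd un-escaping would also turn the remaining "-" separators back into "/", which this deliberately skips.

    package main

    import (
        "fmt"
        "strconv"
        "strings"
    )

    // Decodes the \xNN escapes that systemd uses in unit names such as the
    // containerd tmpmount units above. Converting the remaining "-" back to
    // "/" is part of full systemd un-escaping and is intentionally omitted.
    func unescapeUnit(name string) string {
        var b strings.Builder
        for i := 0; i < len(name); i++ {
            if name[i] == '\\' && i+3 < len(name) && name[i+1] == 'x' {
                if v, err := strconv.ParseUint(name[i+2:i+4], 16, 8); err == nil {
                    b.WriteByte(byte(v))
                    i += 3
                    continue
                }
            }
            b.WriteByte(name[i])
        }
        return b.String()
    }

    func main() {
        fmt.Println(unescapeUnit(`var-lib-containerd-tmpmounts-containerd\x2dmount1109618525.mount`))
        // prints: var-lib-containerd-tmpmounts-containerd-mount1109618525.mount
    }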
Aug 19 08:15:46.840992 containerd[2014]: time="2025-08-19T08:15:46.840936633Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:15:46.842887 containerd[2014]: time="2025-08-19T08:15:46.842848781Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565241" Aug 19 08:15:46.845432 containerd[2014]: time="2025-08-19T08:15:46.845376357Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:15:46.850215 containerd[2014]: time="2025-08-19T08:15:46.850168245Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:15:46.851674 containerd[2014]: time="2025-08-19T08:15:46.851331158Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.518065183s" Aug 19 08:15:46.851674 containerd[2014]: time="2025-08-19T08:15:46.851371085Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Aug 19 08:15:46.852229 containerd[2014]: time="2025-08-19T08:15:46.852204328Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Aug 19 08:15:47.317910 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2931754722.mount: Deactivated successfully. 
Aug 19 08:15:47.330345 containerd[2014]: time="2025-08-19T08:15:47.330275816Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Aug 19 08:15:47.332704 containerd[2014]: time="2025-08-19T08:15:47.332645895Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" Aug 19 08:15:47.335141 containerd[2014]: time="2025-08-19T08:15:47.335066078Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Aug 19 08:15:47.338870 containerd[2014]: time="2025-08-19T08:15:47.338793306Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Aug 19 08:15:47.339617 containerd[2014]: time="2025-08-19T08:15:47.339454877Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 487.21062ms" Aug 19 08:15:47.339617 containerd[2014]: time="2025-08-19T08:15:47.339498835Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Aug 19 08:15:47.340535 containerd[2014]: time="2025-08-19T08:15:47.340228439Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\"" Aug 19 08:15:47.879545 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4003940039.mount: Deactivated successfully. 
Aug 19 08:15:49.937533 containerd[2014]: time="2025-08-19T08:15:49.937463594Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:15:49.939579 containerd[2014]: time="2025-08-19T08:15:49.939526495Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56910709" Aug 19 08:15:49.942189 containerd[2014]: time="2025-08-19T08:15:49.942121294Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:15:49.946121 containerd[2014]: time="2025-08-19T08:15:49.946061284Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:15:49.947959 containerd[2014]: time="2025-08-19T08:15:49.947361795Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 2.607101147s" Aug 19 08:15:49.947959 containerd[2014]: time="2025-08-19T08:15:49.947404168Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\"" Aug 19 08:15:52.843974 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 08:15:52.844820 systemd[1]: kubelet.service: Consumed 216ms CPU time, 110.2M memory peak. Aug 19 08:15:52.847207 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 08:15:52.879387 systemd[1]: Reload requested from client PID 2798 ('systemctl') (unit session-7.scope)... Aug 19 08:15:52.879411 systemd[1]: Reloading... Aug 19 08:15:52.997134 zram_generator::config[2839]: No configuration found. Aug 19 08:15:53.320378 systemd[1]: Reloading finished in 440 ms. Aug 19 08:15:53.388613 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Aug 19 08:15:53.388719 systemd[1]: kubelet.service: Failed with result 'signal'. Aug 19 08:15:53.389128 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 08:15:53.389190 systemd[1]: kubelet.service: Consumed 141ms CPU time, 98M memory peak. Aug 19 08:15:53.390924 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 08:15:53.626617 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 08:15:53.638498 (kubelet)[2906]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Aug 19 08:15:53.683870 kubelet[2906]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Aug 19 08:15:53.683870 kubelet[2906]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. 
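From this point most lines come from the kubelet and carry klog's header format: a severity letter (I/W/E/F), month and day, wall-clock time, the emitting thread id, and the source file and line, e.g. "E0819 08:15:44.532343 2649 run.go:72]". A small Go sketch of a regular expression that pulls those fields apart, which can help when filtering a capture like this one; the pattern is written from the lines visible here, not taken from klog itself.

    package main

    import (
        "fmt"
        "regexp"
    )

    // Matches the klog header used by the kubelet lines in this capture:
    // severity letter, month+day, time, thread id, and source file:line.
    var klogHeader = regexp.MustCompile(
        `^([IWEF])(\d{2})(\d{2}) (\d{2}:\d{2}:\d{2}\.\d{6})\s+(\d+)\s+([\w./-]+):(\d+)\]`)

    func main() {
        line := `E0819 08:15:44.532343    2649 run.go:72] "command failed"`
        m := klogHeader.FindStringSubmatch(line)
        if m == nil {
            fmt.Println("not a klog line")
            return
        }
        fmt.Printf("severity=%s date=%s-%s time=%s tid=%s source=%s:%s\n",
            m[1], m[2], m[3], m[4], m[5], m[6], m[7])
    }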
Aug 19 08:15:53.683870 kubelet[2906]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Aug 19 08:15:53.685962 kubelet[2906]: I0819 08:15:53.685896 2906 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Aug 19 08:15:53.890055 kubelet[2906]: I0819 08:15:53.889997 2906 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Aug 19 08:15:53.890055 kubelet[2906]: I0819 08:15:53.890053 2906 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Aug 19 08:15:53.890318 kubelet[2906]: I0819 08:15:53.890304 2906 server.go:934] "Client rotation is on, will bootstrap in background" Aug 19 08:15:53.923312 kubelet[2906]: I0819 08:15:53.923209 2906 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Aug 19 08:15:53.928297 kubelet[2906]: E0819 08:15:53.928228 2906 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://172.31.23.28:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 172.31.23.28:6443: connect: connection refused" logger="UnhandledError" Aug 19 08:15:53.940078 kubelet[2906]: I0819 08:15:53.940023 2906 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Aug 19 08:15:53.945682 kubelet[2906]: I0819 08:15:53.945647 2906 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Aug 19 08:15:53.947375 kubelet[2906]: I0819 08:15:53.947331 2906 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Aug 19 08:15:53.947564 kubelet[2906]: I0819 08:15:53.947533 2906 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Aug 19 08:15:53.947912 kubelet[2906]: I0819 08:15:53.947561 2906 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-172-31-23-28","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Aug 19 08:15:53.947912 kubelet[2906]: I0819 08:15:53.947906 2906 topology_manager.go:138] "Creating topology manager with none policy" Aug 19 08:15:53.947912 kubelet[2906]: I0819 08:15:53.947918 2906 container_manager_linux.go:300] "Creating device plugin manager" Aug 19 08:15:53.948954 kubelet[2906]: I0819 08:15:53.948920 2906 state_mem.go:36] "Initialized new in-memory state store" Aug 19 08:15:53.952760 kubelet[2906]: I0819 08:15:53.952538 2906 kubelet.go:408] "Attempting to sync node with API server" Aug 19 08:15:53.952760 kubelet[2906]: I0819 08:15:53.952569 2906 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Aug 19 08:15:53.952760 kubelet[2906]: I0819 08:15:53.952602 2906 kubelet.go:314] "Adding apiserver pod source" Aug 19 08:15:53.952760 kubelet[2906]: I0819 08:15:53.952625 2906 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Aug 19 08:15:53.958901 kubelet[2906]: I0819 08:15:53.958876 2906 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Aug 19 08:15:53.962565 kubelet[2906]: I0819 08:15:53.962533 2906 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Aug 19 08:15:53.963368 kubelet[2906]: W0819 08:15:53.963148 2906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://172.31.23.28:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": 
dial tcp 172.31.23.28:6443: connect: connection refused Aug 19 08:15:53.963368 kubelet[2906]: E0819 08:15:53.963211 2906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://172.31.23.28:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 172.31.23.28:6443: connect: connection refused" logger="UnhandledError" Aug 19 08:15:53.963368 kubelet[2906]: W0819 08:15:53.963273 2906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://172.31.23.28:6443/api/v1/nodes?fieldSelector=metadata.name%3Dip-172-31-23-28&limit=500&resourceVersion=0": dial tcp 172.31.23.28:6443: connect: connection refused Aug 19 08:15:53.963368 kubelet[2906]: E0819 08:15:53.963296 2906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://172.31.23.28:6443/api/v1/nodes?fieldSelector=metadata.name%3Dip-172-31-23-28&limit=500&resourceVersion=0\": dial tcp 172.31.23.28:6443: connect: connection refused" logger="UnhandledError" Aug 19 08:15:53.963368 kubelet[2906]: W0819 08:15:53.963301 2906 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Aug 19 08:15:53.964112 kubelet[2906]: I0819 08:15:53.964091 2906 server.go:1274] "Started kubelet" Aug 19 08:15:53.966101 kubelet[2906]: I0819 08:15:53.966060 2906 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Aug 19 08:15:53.976057 kubelet[2906]: I0819 08:15:53.975396 2906 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Aug 19 08:15:53.976057 kubelet[2906]: I0819 08:15:53.975878 2906 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Aug 19 08:15:53.976057 kubelet[2906]: I0819 08:15:53.975434 2906 server.go:449] "Adding debug handlers to kubelet server" Aug 19 08:15:53.978093 kubelet[2906]: I0819 08:15:53.978064 2906 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Aug 19 08:15:53.979826 kubelet[2906]: I0819 08:15:53.979586 2906 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Aug 19 08:15:53.980768 kubelet[2906]: I0819 08:15:53.980749 2906 volume_manager.go:289] "Starting Kubelet Volume Manager" Aug 19 08:15:53.981084 kubelet[2906]: E0819 08:15:53.981053 2906 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ip-172-31-23-28\" not found" Aug 19 08:15:53.981534 kubelet[2906]: I0819 08:15:53.981499 2906 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Aug 19 08:15:53.981614 kubelet[2906]: I0819 08:15:53.981554 2906 reconciler.go:26] "Reconciler: start to sync state" Aug 19 08:15:53.982272 kubelet[2906]: W0819 08:15:53.982122 2906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://172.31.23.28:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.31.23.28:6443: connect: connection refused Aug 19 08:15:53.982272 kubelet[2906]: E0819 08:15:53.982172 2906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://172.31.23.28:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 172.31.23.28:6443: connect: connection refused" logger="UnhandledError" Aug 19 08:15:53.982272 kubelet[2906]: E0819 08:15:53.982224 2906 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.23.28:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-23-28?timeout=10s\": dial tcp 172.31.23.28:6443: connect: connection refused" interval="200ms" Aug 19 08:15:53.985605 kubelet[2906]: E0819 08:15:53.984110 2906 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://172.31.23.28:6443/api/v1/namespaces/default/events\": dial tcp 172.31.23.28:6443: connect: connection refused" event="&Event{ObjectMeta:{ip-172-31-23-28.185d1d08efc687a7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-172-31-23-28,UID:ip-172-31-23-28,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-172-31-23-28,},FirstTimestamp:2025-08-19 08:15:53.964070823 +0000 UTC m=+0.321715287,LastTimestamp:2025-08-19 08:15:53.964070823 +0000 UTC m=+0.321715287,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-172-31-23-28,}" Aug 19 08:15:53.994066 kubelet[2906]: I0819 08:15:53.993048 2906 factory.go:221] Registration of the systemd container factory successfully Aug 19 08:15:53.994066 kubelet[2906]: I0819 08:15:53.993350 2906 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Aug 19 08:15:53.996977 kubelet[2906]: I0819 08:15:53.996954 2906 factory.go:221] Registration of the containerd container factory successfully Aug 19 08:15:54.007860 kubelet[2906]: I0819 08:15:54.006129 2906 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Aug 19 08:15:54.007860 kubelet[2906]: I0819 08:15:54.007294 2906 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Aug 19 08:15:54.007860 kubelet[2906]: I0819 08:15:54.007314 2906 status_manager.go:217] "Starting to sync pod status with apiserver" Aug 19 08:15:54.007860 kubelet[2906]: I0819 08:15:54.007337 2906 kubelet.go:2321] "Starting kubelet main sync loop" Aug 19 08:15:54.007860 kubelet[2906]: E0819 08:15:54.007373 2906 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Aug 19 08:15:54.015491 kubelet[2906]: W0819 08:15:54.015433 2906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://172.31.23.28:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.31.23.28:6443: connect: connection refused Aug 19 08:15:54.015588 kubelet[2906]: E0819 08:15:54.015497 2906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://172.31.23.28:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 172.31.23.28:6443: connect: connection refused" logger="UnhandledError" Aug 19 08:15:54.026710 kubelet[2906]: E0819 08:15:54.026593 2906 kubelet.go:1478] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Aug 19 08:15:54.033243 kubelet[2906]: I0819 08:15:54.033218 2906 cpu_manager.go:214] "Starting CPU manager" policy="none" Aug 19 08:15:54.033243 kubelet[2906]: I0819 08:15:54.033236 2906 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Aug 19 08:15:54.033457 kubelet[2906]: I0819 08:15:54.033265 2906 state_mem.go:36] "Initialized new in-memory state store" Aug 19 08:15:54.038364 kubelet[2906]: I0819 08:15:54.038324 2906 policy_none.go:49] "None policy: Start" Aug 19 08:15:54.039064 kubelet[2906]: I0819 08:15:54.039042 2906 memory_manager.go:170] "Starting memorymanager" policy="None" Aug 19 08:15:54.039064 kubelet[2906]: I0819 08:15:54.039068 2906 state_mem.go:35] "Initializing new in-memory state store" Aug 19 08:15:54.048224 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Aug 19 08:15:54.061711 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Aug 19 08:15:54.065594 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Aug 19 08:15:54.073093 kubelet[2906]: I0819 08:15:54.073062 2906 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Aug 19 08:15:54.073324 kubelet[2906]: I0819 08:15:54.073306 2906 eviction_manager.go:189] "Eviction manager: starting control loop" Aug 19 08:15:54.073379 kubelet[2906]: I0819 08:15:54.073328 2906 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Aug 19 08:15:54.073804 kubelet[2906]: I0819 08:15:54.073785 2906 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Aug 19 08:15:54.076245 kubelet[2906]: E0819 08:15:54.076223 2906 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-172-31-23-28\" not found" Aug 19 08:15:54.121366 systemd[1]: Created slice kubepods-burstable-pod418a60d6507e4316ba82f3c0751baa98.slice - libcontainer container kubepods-burstable-pod418a60d6507e4316ba82f3c0751baa98.slice. Aug 19 08:15:54.136847 systemd[1]: Created slice kubepods-burstable-pod6cc71c0d891372edb9d7fde18c0b6778.slice - libcontainer container kubepods-burstable-pod6cc71c0d891372edb9d7fde18c0b6778.slice. Aug 19 08:15:54.142159 systemd[1]: Created slice kubepods-burstable-pod9b591e08c32c4c116ca6af541a055571.slice - libcontainer container kubepods-burstable-pod9b591e08c32c4c116ca6af541a055571.slice. 
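The nodeConfig dump a few lines up embeds the kubelet's hard eviction thresholds as JSON (memory.available below 100Mi, nodefs.available below 10%, imagefs.available below 15%, and so on). A short Go sketch that decodes a trimmed copy of that array; the struct mirrors only the keys visible in the log and is a log-reading aid, not a kubelet API type.

    package main

    import (
        "encoding/json"
        "fmt"
    )

    // Mirrors only the JSON keys visible in the HardEvictionThresholds part
    // of the nodeConfig dump above.
    type threshold struct {
        Signal   string
        Operator string
        Value    struct {
            Quantity   *string
            Percentage float64
        }
        GracePeriod int64
    }

    func main() {
        raw := `[
          {"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0},
          {"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0},
          {"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0}
        ]`
        var ts []threshold
        if err := json.Unmarshal([]byte(raw), &ts); err != nil {
            panic(err)
        }
        for _, t := range ts {
            if t.Value.Quantity != nil {
                fmt.Printf("%s %s %s\n", t.Signal, t.Operator, *t.Value.Quantity)
            } else {
                fmt.Printf("%s %s %.0f%%\n", t.Signal, t.Operator, t.Value.Percentage*100)
            }
        }
    }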
Aug 19 08:15:54.176250 kubelet[2906]: I0819 08:15:54.176218 2906 kubelet_node_status.go:72] "Attempting to register node" node="ip-172-31-23-28" Aug 19 08:15:54.176574 kubelet[2906]: E0819 08:15:54.176540 2906 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://172.31.23.28:6443/api/v1/nodes\": dial tcp 172.31.23.28:6443: connect: connection refused" node="ip-172-31-23-28" Aug 19 08:15:54.183266 kubelet[2906]: I0819 08:15:54.182946 2906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/418a60d6507e4316ba82f3c0751baa98-usr-share-ca-certificates\") pod \"kube-controller-manager-ip-172-31-23-28\" (UID: \"418a60d6507e4316ba82f3c0751baa98\") " pod="kube-system/kube-controller-manager-ip-172-31-23-28" Aug 19 08:15:54.183266 kubelet[2906]: I0819 08:15:54.182993 2906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6cc71c0d891372edb9d7fde18c0b6778-kubeconfig\") pod \"kube-scheduler-ip-172-31-23-28\" (UID: \"6cc71c0d891372edb9d7fde18c0b6778\") " pod="kube-system/kube-scheduler-ip-172-31-23-28" Aug 19 08:15:54.183266 kubelet[2906]: I0819 08:15:54.183012 2906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/9b591e08c32c4c116ca6af541a055571-usr-share-ca-certificates\") pod \"kube-apiserver-ip-172-31-23-28\" (UID: \"9b591e08c32c4c116ca6af541a055571\") " pod="kube-system/kube-apiserver-ip-172-31-23-28" Aug 19 08:15:54.183266 kubelet[2906]: I0819 08:15:54.183045 2906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/418a60d6507e4316ba82f3c0751baa98-ca-certs\") pod \"kube-controller-manager-ip-172-31-23-28\" (UID: \"418a60d6507e4316ba82f3c0751baa98\") " pod="kube-system/kube-controller-manager-ip-172-31-23-28" Aug 19 08:15:54.183266 kubelet[2906]: I0819 08:15:54.183065 2906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/418a60d6507e4316ba82f3c0751baa98-flexvolume-dir\") pod \"kube-controller-manager-ip-172-31-23-28\" (UID: \"418a60d6507e4316ba82f3c0751baa98\") " pod="kube-system/kube-controller-manager-ip-172-31-23-28" Aug 19 08:15:54.183480 kubelet[2906]: I0819 08:15:54.183082 2906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/418a60d6507e4316ba82f3c0751baa98-k8s-certs\") pod \"kube-controller-manager-ip-172-31-23-28\" (UID: \"418a60d6507e4316ba82f3c0751baa98\") " pod="kube-system/kube-controller-manager-ip-172-31-23-28" Aug 19 08:15:54.183480 kubelet[2906]: I0819 08:15:54.183098 2906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/418a60d6507e4316ba82f3c0751baa98-kubeconfig\") pod \"kube-controller-manager-ip-172-31-23-28\" (UID: \"418a60d6507e4316ba82f3c0751baa98\") " pod="kube-system/kube-controller-manager-ip-172-31-23-28" Aug 19 08:15:54.183480 kubelet[2906]: I0819 08:15:54.183115 2906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/host-path/9b591e08c32c4c116ca6af541a055571-ca-certs\") pod \"kube-apiserver-ip-172-31-23-28\" (UID: \"9b591e08c32c4c116ca6af541a055571\") " pod="kube-system/kube-apiserver-ip-172-31-23-28" Aug 19 08:15:54.183480 kubelet[2906]: I0819 08:15:54.183152 2906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/9b591e08c32c4c116ca6af541a055571-k8s-certs\") pod \"kube-apiserver-ip-172-31-23-28\" (UID: \"9b591e08c32c4c116ca6af541a055571\") " pod="kube-system/kube-apiserver-ip-172-31-23-28" Aug 19 08:15:54.183480 kubelet[2906]: E0819 08:15:54.183235 2906 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.23.28:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-23-28?timeout=10s\": dial tcp 172.31.23.28:6443: connect: connection refused" interval="400ms" Aug 19 08:15:54.378656 kubelet[2906]: I0819 08:15:54.378627 2906 kubelet_node_status.go:72] "Attempting to register node" node="ip-172-31-23-28" Aug 19 08:15:54.379112 kubelet[2906]: E0819 08:15:54.379056 2906 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://172.31.23.28:6443/api/v1/nodes\": dial tcp 172.31.23.28:6443: connect: connection refused" node="ip-172-31-23-28" Aug 19 08:15:54.435952 containerd[2014]: time="2025-08-19T08:15:54.435843875Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ip-172-31-23-28,Uid:418a60d6507e4316ba82f3c0751baa98,Namespace:kube-system,Attempt:0,}" Aug 19 08:15:54.444995 containerd[2014]: time="2025-08-19T08:15:54.444948830Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ip-172-31-23-28,Uid:6cc71c0d891372edb9d7fde18c0b6778,Namespace:kube-system,Attempt:0,}" Aug 19 08:15:54.446485 containerd[2014]: time="2025-08-19T08:15:54.446448469Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ip-172-31-23-28,Uid:9b591e08c32c4c116ca6af541a055571,Namespace:kube-system,Attempt:0,}" Aug 19 08:15:54.572630 containerd[2014]: time="2025-08-19T08:15:54.572531951Z" level=info msg="connecting to shim 17154821447eda8f4fc283829feb6787717b3648aee7c443530589ed8c4d0dc6" address="unix:///run/containerd/s/d66ecf4f5e3f4b6fd42dbb6cadf8e68c76cfc40fce2cb3d23385ab24b25857c4" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:15:54.574131 containerd[2014]: time="2025-08-19T08:15:54.573905419Z" level=info msg="connecting to shim bc44d980f336defe0e9e95977a416f17c25fc404a85acfa924a1b8dd40c656bc" address="unix:///run/containerd/s/84ec6d575c1a36d097e53ae3292fa6e67a5cef0c1c909a16f28469bbfa5fdbbc" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:15:54.580254 containerd[2014]: time="2025-08-19T08:15:54.580212850Z" level=info msg="connecting to shim f0fc598301dc36071eb59ed71363744d96564a865769f4cbe9a9242910e94818" address="unix:///run/containerd/s/bab843e31cfaa8a765cefa333c93a27ee98a1dc51159e0847dff480a0d68f586" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:15:54.584231 kubelet[2906]: E0819 08:15:54.584171 2906 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.23.28:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-23-28?timeout=10s\": dial tcp 172.31.23.28:6443: connect: connection refused" interval="800ms" Aug 19 08:15:54.688275 systemd[1]: Started cri-containerd-17154821447eda8f4fc283829feb6787717b3648aee7c443530589ed8c4d0dc6.scope - libcontainer container 
17154821447eda8f4fc283829feb6787717b3648aee7c443530589ed8c4d0dc6. Aug 19 08:15:54.690814 systemd[1]: Started cri-containerd-bc44d980f336defe0e9e95977a416f17c25fc404a85acfa924a1b8dd40c656bc.scope - libcontainer container bc44d980f336defe0e9e95977a416f17c25fc404a85acfa924a1b8dd40c656bc. Aug 19 08:15:54.694009 systemd[1]: Started cri-containerd-f0fc598301dc36071eb59ed71363744d96564a865769f4cbe9a9242910e94818.scope - libcontainer container f0fc598301dc36071eb59ed71363744d96564a865769f4cbe9a9242910e94818. Aug 19 08:15:54.787070 kubelet[2906]: I0819 08:15:54.786699 2906 kubelet_node_status.go:72] "Attempting to register node" node="ip-172-31-23-28" Aug 19 08:15:54.789520 kubelet[2906]: E0819 08:15:54.789461 2906 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://172.31.23.28:6443/api/v1/nodes\": dial tcp 172.31.23.28:6443: connect: connection refused" node="ip-172-31-23-28" Aug 19 08:15:54.822055 containerd[2014]: time="2025-08-19T08:15:54.821964489Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ip-172-31-23-28,Uid:418a60d6507e4316ba82f3c0751baa98,Namespace:kube-system,Attempt:0,} returns sandbox id \"bc44d980f336defe0e9e95977a416f17c25fc404a85acfa924a1b8dd40c656bc\"" Aug 19 08:15:54.833274 containerd[2014]: time="2025-08-19T08:15:54.833208665Z" level=info msg="CreateContainer within sandbox \"bc44d980f336defe0e9e95977a416f17c25fc404a85acfa924a1b8dd40c656bc\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Aug 19 08:15:54.854047 containerd[2014]: time="2025-08-19T08:15:54.853975643Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ip-172-31-23-28,Uid:9b591e08c32c4c116ca6af541a055571,Namespace:kube-system,Attempt:0,} returns sandbox id \"17154821447eda8f4fc283829feb6787717b3648aee7c443530589ed8c4d0dc6\"" Aug 19 08:15:54.858491 containerd[2014]: time="2025-08-19T08:15:54.858185126Z" level=info msg="CreateContainer within sandbox \"17154821447eda8f4fc283829feb6787717b3648aee7c443530589ed8c4d0dc6\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Aug 19 08:15:54.862844 containerd[2014]: time="2025-08-19T08:15:54.862801371Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ip-172-31-23-28,Uid:6cc71c0d891372edb9d7fde18c0b6778,Namespace:kube-system,Attempt:0,} returns sandbox id \"f0fc598301dc36071eb59ed71363744d96564a865769f4cbe9a9242910e94818\"" Aug 19 08:15:54.862978 containerd[2014]: time="2025-08-19T08:15:54.862954316Z" level=info msg="Container 03324a31ab015fc9a2c5b32c6e9d6535ae3df0b6cab050c56cd48a54466e3666: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:15:54.868836 containerd[2014]: time="2025-08-19T08:15:54.868797251Z" level=info msg="CreateContainer within sandbox \"f0fc598301dc36071eb59ed71363744d96564a865769f4cbe9a9242910e94818\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Aug 19 08:15:54.870917 kubelet[2906]: W0819 08:15:54.870849 2906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://172.31.23.28:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 172.31.23.28:6443: connect: connection refused Aug 19 08:15:54.871080 kubelet[2906]: E0819 08:15:54.870935 2906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get 
\"https://172.31.23.28:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 172.31.23.28:6443: connect: connection refused" logger="UnhandledError" Aug 19 08:15:54.887602 containerd[2014]: time="2025-08-19T08:15:54.887530992Z" level=info msg="Container 75abb18ce425d40c908899d1a0cafea6f3451a02e016eb15756aa039b3b58677: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:15:54.902231 containerd[2014]: time="2025-08-19T08:15:54.902189740Z" level=info msg="CreateContainer within sandbox \"bc44d980f336defe0e9e95977a416f17c25fc404a85acfa924a1b8dd40c656bc\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"03324a31ab015fc9a2c5b32c6e9d6535ae3df0b6cab050c56cd48a54466e3666\"" Aug 19 08:15:54.903057 containerd[2014]: time="2025-08-19T08:15:54.902779224Z" level=info msg="StartContainer for \"03324a31ab015fc9a2c5b32c6e9d6535ae3df0b6cab050c56cd48a54466e3666\"" Aug 19 08:15:54.903888 containerd[2014]: time="2025-08-19T08:15:54.903850957Z" level=info msg="connecting to shim 03324a31ab015fc9a2c5b32c6e9d6535ae3df0b6cab050c56cd48a54466e3666" address="unix:///run/containerd/s/84ec6d575c1a36d097e53ae3292fa6e67a5cef0c1c909a16f28469bbfa5fdbbc" protocol=ttrpc version=3 Aug 19 08:15:54.913527 containerd[2014]: time="2025-08-19T08:15:54.913479369Z" level=info msg="Container 8663b1b649ed995e58ed4b8da12ce3f5e981f98e3478945020d26318d1042901: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:15:54.921426 containerd[2014]: time="2025-08-19T08:15:54.921385862Z" level=info msg="CreateContainer within sandbox \"17154821447eda8f4fc283829feb6787717b3648aee7c443530589ed8c4d0dc6\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"75abb18ce425d40c908899d1a0cafea6f3451a02e016eb15756aa039b3b58677\"" Aug 19 08:15:54.923388 containerd[2014]: time="2025-08-19T08:15:54.923366216Z" level=info msg="StartContainer for \"75abb18ce425d40c908899d1a0cafea6f3451a02e016eb15756aa039b3b58677\"" Aug 19 08:15:54.924606 containerd[2014]: time="2025-08-19T08:15:54.924510336Z" level=info msg="connecting to shim 75abb18ce425d40c908899d1a0cafea6f3451a02e016eb15756aa039b3b58677" address="unix:///run/containerd/s/d66ecf4f5e3f4b6fd42dbb6cadf8e68c76cfc40fce2cb3d23385ab24b25857c4" protocol=ttrpc version=3 Aug 19 08:15:54.925381 systemd[1]: Started cri-containerd-03324a31ab015fc9a2c5b32c6e9d6535ae3df0b6cab050c56cd48a54466e3666.scope - libcontainer container 03324a31ab015fc9a2c5b32c6e9d6535ae3df0b6cab050c56cd48a54466e3666. 
Aug 19 08:15:54.936989 containerd[2014]: time="2025-08-19T08:15:54.936848601Z" level=info msg="CreateContainer within sandbox \"f0fc598301dc36071eb59ed71363744d96564a865769f4cbe9a9242910e94818\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"8663b1b649ed995e58ed4b8da12ce3f5e981f98e3478945020d26318d1042901\"" Aug 19 08:15:54.937810 containerd[2014]: time="2025-08-19T08:15:54.937787459Z" level=info msg="StartContainer for \"8663b1b649ed995e58ed4b8da12ce3f5e981f98e3478945020d26318d1042901\"" Aug 19 08:15:54.938779 containerd[2014]: time="2025-08-19T08:15:54.938752380Z" level=info msg="connecting to shim 8663b1b649ed995e58ed4b8da12ce3f5e981f98e3478945020d26318d1042901" address="unix:///run/containerd/s/bab843e31cfaa8a765cefa333c93a27ee98a1dc51159e0847dff480a0d68f586" protocol=ttrpc version=3 Aug 19 08:15:54.951581 systemd[1]: Started cri-containerd-75abb18ce425d40c908899d1a0cafea6f3451a02e016eb15756aa039b3b58677.scope - libcontainer container 75abb18ce425d40c908899d1a0cafea6f3451a02e016eb15756aa039b3b58677. Aug 19 08:15:54.967151 systemd[1]: Started cri-containerd-8663b1b649ed995e58ed4b8da12ce3f5e981f98e3478945020d26318d1042901.scope - libcontainer container 8663b1b649ed995e58ed4b8da12ce3f5e981f98e3478945020d26318d1042901. Aug 19 08:15:55.015587 containerd[2014]: time="2025-08-19T08:15:55.015550503Z" level=info msg="StartContainer for \"03324a31ab015fc9a2c5b32c6e9d6535ae3df0b6cab050c56cd48a54466e3666\" returns successfully" Aug 19 08:15:55.064070 containerd[2014]: time="2025-08-19T08:15:55.063944053Z" level=info msg="StartContainer for \"75abb18ce425d40c908899d1a0cafea6f3451a02e016eb15756aa039b3b58677\" returns successfully" Aug 19 08:15:55.068945 containerd[2014]: time="2025-08-19T08:15:55.068907608Z" level=info msg="StartContainer for \"8663b1b649ed995e58ed4b8da12ce3f5e981f98e3478945020d26318d1042901\" returns successfully" Aug 19 08:15:55.115968 kubelet[2906]: W0819 08:15:55.115673 2906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://172.31.23.28:6443/api/v1/nodes?fieldSelector=metadata.name%3Dip-172-31-23-28&limit=500&resourceVersion=0": dial tcp 172.31.23.28:6443: connect: connection refused Aug 19 08:15:55.115968 kubelet[2906]: E0819 08:15:55.115940 2906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://172.31.23.28:6443/api/v1/nodes?fieldSelector=metadata.name%3Dip-172-31-23-28&limit=500&resourceVersion=0\": dial tcp 172.31.23.28:6443: connect: connection refused" logger="UnhandledError" Aug 19 08:15:55.385118 kubelet[2906]: E0819 08:15:55.385074 2906 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.23.28:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-23-28?timeout=10s\": dial tcp 172.31.23.28:6443: connect: connection refused" interval="1.6s" Aug 19 08:15:55.555832 kubelet[2906]: W0819 08:15:55.555326 2906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://172.31.23.28:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.31.23.28:6443: connect: connection refused Aug 19 08:15:55.556024 kubelet[2906]: E0819 08:15:55.555990 2906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://172.31.23.28:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 172.31.23.28:6443: connect: connection refused" logger="UnhandledError" Aug 19 08:15:55.570494 kubelet[2906]: W0819 08:15:55.569630 2906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://172.31.23.28:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.31.23.28:6443: connect: connection refused Aug 19 08:15:55.570494 kubelet[2906]: E0819 08:15:55.570458 2906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://172.31.23.28:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 172.31.23.28:6443: connect: connection refused" logger="UnhandledError" Aug 19 08:15:55.592346 kubelet[2906]: I0819 08:15:55.592260 2906 kubelet_node_status.go:72] "Attempting to register node" node="ip-172-31-23-28" Aug 19 08:15:55.593440 kubelet[2906]: E0819 08:15:55.592795 2906 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://172.31.23.28:6443/api/v1/nodes\": dial tcp 172.31.23.28:6443: connect: connection refused" node="ip-172-31-23-28" Aug 19 08:15:57.196780 kubelet[2906]: I0819 08:15:57.196230 2906 kubelet_node_status.go:72] "Attempting to register node" node="ip-172-31-23-28" Aug 19 08:15:57.393218 kubelet[2906]: E0819 08:15:57.393189 2906 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-172-31-23-28\" not found" node="ip-172-31-23-28" Aug 19 08:15:57.452219 kubelet[2906]: I0819 08:15:57.452121 2906 kubelet_node_status.go:75] "Successfully registered node" node="ip-172-31-23-28" Aug 19 08:15:57.452219 kubelet[2906]: E0819 08:15:57.452160 2906 kubelet_node_status.go:535] "Error updating node status, will retry" err="error getting node \"ip-172-31-23-28\": node \"ip-172-31-23-28\" not found" Aug 19 08:15:57.965575 kubelet[2906]: I0819 08:15:57.965526 2906 apiserver.go:52] "Watching apiserver" Aug 19 08:15:57.981793 kubelet[2906]: I0819 08:15:57.981737 2906 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Aug 19 08:15:58.083831 kubelet[2906]: E0819 08:15:58.083790 2906 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ip-172-31-23-28\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ip-172-31-23-28" Aug 19 08:15:59.589693 systemd[1]: Reload requested from client PID 3176 ('systemctl') (unit session-7.scope)... Aug 19 08:15:59.589712 systemd[1]: Reloading... Aug 19 08:15:59.702067 zram_generator::config[3223]: No configuration found. Aug 19 08:15:59.964154 systemd[1]: Reloading finished in 373 ms. Aug 19 08:15:59.994312 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 08:16:00.031891 systemd[1]: kubelet.service: Deactivated successfully. Aug 19 08:16:00.032387 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 08:16:00.032462 systemd[1]: kubelet.service: Consumed 716ms CPU time, 128.1M memory peak. Aug 19 08:16:00.064113 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 08:16:00.427198 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Aug 19 08:16:00.441141 (kubelet)[3281]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Aug 19 08:16:00.519692 kubelet[3281]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Aug 19 08:16:00.520998 kubelet[3281]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Aug 19 08:16:00.520998 kubelet[3281]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Aug 19 08:16:00.520998 kubelet[3281]: I0819 08:16:00.520649 3281 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Aug 19 08:16:00.535375 kubelet[3281]: I0819 08:16:00.535319 3281 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Aug 19 08:16:00.535375 kubelet[3281]: I0819 08:16:00.535351 3281 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Aug 19 08:16:00.535834 kubelet[3281]: I0819 08:16:00.535804 3281 server.go:934] "Client rotation is on, will bootstrap in background" Aug 19 08:16:00.537930 kubelet[3281]: I0819 08:16:00.537902 3281 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Aug 19 08:16:00.541226 kubelet[3281]: I0819 08:16:00.541051 3281 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Aug 19 08:16:00.545686 kubelet[3281]: I0819 08:16:00.545665 3281 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Aug 19 08:16:00.550223 kubelet[3281]: I0819 08:16:00.550068 3281 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Aug 19 08:16:00.550375 kubelet[3281]: I0819 08:16:00.550227 3281 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Aug 19 08:16:00.550450 kubelet[3281]: I0819 08:16:00.550408 3281 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Aug 19 08:16:00.550658 kubelet[3281]: I0819 08:16:00.550444 3281 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-172-31-23-28","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Aug 19 08:16:00.550787 kubelet[3281]: I0819 08:16:00.550661 3281 topology_manager.go:138] "Creating topology manager with none policy" Aug 19 08:16:00.550787 kubelet[3281]: I0819 08:16:00.550677 3281 container_manager_linux.go:300] "Creating device plugin manager" Aug 19 08:16:00.550787 kubelet[3281]: I0819 08:16:00.550711 3281 state_mem.go:36] "Initialized new in-memory state store" Aug 19 08:16:00.551299 kubelet[3281]: I0819 08:16:00.551270 3281 kubelet.go:408] "Attempting to sync node with API server" Aug 19 08:16:00.551299 kubelet[3281]: I0819 08:16:00.551293 3281 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Aug 19 08:16:00.551427 kubelet[3281]: I0819 08:16:00.551332 3281 kubelet.go:314] "Adding apiserver pod source" Aug 19 08:16:00.551427 kubelet[3281]: I0819 08:16:00.551346 3281 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Aug 19 08:16:00.558545 kubelet[3281]: I0819 08:16:00.558510 3281 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Aug 19 08:16:00.559452 kubelet[3281]: I0819 08:16:00.559424 3281 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Aug 19 08:16:00.561728 kubelet[3281]: I0819 08:16:00.561705 3281 server.go:1274] "Started kubelet" Aug 19 08:16:00.567082 kubelet[3281]: I0819 08:16:00.566608 3281 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Aug 
19 08:16:00.567082 kubelet[3281]: I0819 08:16:00.567003 3281 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Aug 19 08:16:00.572226 kubelet[3281]: I0819 08:16:00.572201 3281 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Aug 19 08:16:00.577121 kubelet[3281]: I0819 08:16:00.577067 3281 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Aug 19 08:16:00.578444 kubelet[3281]: I0819 08:16:00.578414 3281 server.go:449] "Adding debug handlers to kubelet server" Aug 19 08:16:00.586990 kubelet[3281]: I0819 08:16:00.583709 3281 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Aug 19 08:16:00.586990 kubelet[3281]: I0819 08:16:00.585907 3281 volume_manager.go:289] "Starting Kubelet Volume Manager" Aug 19 08:16:00.586990 kubelet[3281]: E0819 08:16:00.586169 3281 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ip-172-31-23-28\" not found" Aug 19 08:16:00.601909 kubelet[3281]: I0819 08:16:00.601774 3281 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Aug 19 08:16:00.602532 kubelet[3281]: I0819 08:16:00.602185 3281 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Aug 19 08:16:00.604091 kubelet[3281]: I0819 08:16:00.604068 3281 reconciler.go:26] "Reconciler: start to sync state" Aug 19 08:16:00.607312 kubelet[3281]: I0819 08:16:00.607227 3281 factory.go:221] Registration of the containerd container factory successfully Aug 19 08:16:00.607312 kubelet[3281]: I0819 08:16:00.607248 3281 factory.go:221] Registration of the systemd container factory successfully Aug 19 08:16:00.612452 kubelet[3281]: I0819 08:16:00.610246 3281 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Aug 19 08:16:00.612452 kubelet[3281]: I0819 08:16:00.611710 3281 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Aug 19 08:16:00.612452 kubelet[3281]: I0819 08:16:00.611737 3281 status_manager.go:217] "Starting to sync pod status with apiserver" Aug 19 08:16:00.612452 kubelet[3281]: I0819 08:16:00.611763 3281 kubelet.go:2321] "Starting kubelet main sync loop" Aug 19 08:16:00.612452 kubelet[3281]: E0819 08:16:00.611812 3281 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Aug 19 08:16:00.664598 systemd[1]: systemd-hostnamed.service: Deactivated successfully. 
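This second kubelet start reports "Client rotation is on" and loads its client credentials from /var/lib/kubelet/pki/kubelet-client-current.pem. The Go sketch below prints the subject and expiry of whatever certificate that file holds, which is one way to confirm rotation is producing fresh certificates; the path is taken from the log, the rest is generic PEM/X.509 decoding rather than anything kubelet-specific.

    package main

    import (
        "crypto/x509"
        "encoding/pem"
        "fmt"
        "os"
    )

    // Prints subject and expiry of the kubelet client certificate whose
    // path appears in the log above.
    func main() {
        data, err := os.ReadFile("/var/lib/kubelet/pki/kubelet-client-current.pem")
        if err != nil {
            fmt.Println("read:", err)
            return
        }
        for {
            var block *pem.Block
            block, data = pem.Decode(data)
            if block == nil {
                break
            }
            if block.Type != "CERTIFICATE" {
                continue // the same file also holds the private key
            }
            cert, err := x509.ParseCertificate(block.Bytes)
            if err != nil {
                fmt.Println("parse:", err)
                continue
            }
            fmt.Printf("subject=%s notAfter=%s\n", cert.Subject, cert.NotAfter)
        }
    }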
Aug 19 08:16:00.693218 kubelet[3281]: I0819 08:16:00.693099 3281 cpu_manager.go:214] "Starting CPU manager" policy="none" Aug 19 08:16:00.693218 kubelet[3281]: I0819 08:16:00.693119 3281 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Aug 19 08:16:00.693218 kubelet[3281]: I0819 08:16:00.693142 3281 state_mem.go:36] "Initialized new in-memory state store" Aug 19 08:16:00.693422 kubelet[3281]: I0819 08:16:00.693330 3281 state_mem.go:88] "Updated default CPUSet" cpuSet="" Aug 19 08:16:00.693422 kubelet[3281]: I0819 08:16:00.693343 3281 state_mem.go:96] "Updated CPUSet assignments" assignments={} Aug 19 08:16:00.693422 kubelet[3281]: I0819 08:16:00.693369 3281 policy_none.go:49] "None policy: Start" Aug 19 08:16:00.694220 kubelet[3281]: I0819 08:16:00.694191 3281 memory_manager.go:170] "Starting memorymanager" policy="None" Aug 19 08:16:00.694220 kubelet[3281]: I0819 08:16:00.694220 3281 state_mem.go:35] "Initializing new in-memory state store" Aug 19 08:16:00.694988 kubelet[3281]: I0819 08:16:00.694398 3281 state_mem.go:75] "Updated machine memory state" Aug 19 08:16:00.704923 kubelet[3281]: I0819 08:16:00.704891 3281 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Aug 19 08:16:00.705137 kubelet[3281]: I0819 08:16:00.705120 3281 eviction_manager.go:189] "Eviction manager: starting control loop" Aug 19 08:16:00.705218 kubelet[3281]: I0819 08:16:00.705139 3281 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Aug 19 08:16:00.706474 kubelet[3281]: I0819 08:16:00.705765 3281 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Aug 19 08:16:00.734476 kubelet[3281]: E0819 08:16:00.734440 3281 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-scheduler-ip-172-31-23-28\" already exists" pod="kube-system/kube-scheduler-ip-172-31-23-28" Aug 19 08:16:00.805868 kubelet[3281]: I0819 08:16:00.805829 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/9b591e08c32c4c116ca6af541a055571-ca-certs\") pod \"kube-apiserver-ip-172-31-23-28\" (UID: \"9b591e08c32c4c116ca6af541a055571\") " pod="kube-system/kube-apiserver-ip-172-31-23-28" Aug 19 08:16:00.806125 kubelet[3281]: I0819 08:16:00.805992 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/418a60d6507e4316ba82f3c0751baa98-flexvolume-dir\") pod \"kube-controller-manager-ip-172-31-23-28\" (UID: \"418a60d6507e4316ba82f3c0751baa98\") " pod="kube-system/kube-controller-manager-ip-172-31-23-28" Aug 19 08:16:00.806125 kubelet[3281]: I0819 08:16:00.806044 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/418a60d6507e4316ba82f3c0751baa98-k8s-certs\") pod \"kube-controller-manager-ip-172-31-23-28\" (UID: \"418a60d6507e4316ba82f3c0751baa98\") " pod="kube-system/kube-controller-manager-ip-172-31-23-28" Aug 19 08:16:00.806125 kubelet[3281]: I0819 08:16:00.806064 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6cc71c0d891372edb9d7fde18c0b6778-kubeconfig\") pod \"kube-scheduler-ip-172-31-23-28\" (UID: \"6cc71c0d891372edb9d7fde18c0b6778\") " pod="kube-system/kube-scheduler-ip-172-31-23-28" Aug 19 
08:16:00.806125 kubelet[3281]: I0819 08:16:00.806078 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/9b591e08c32c4c116ca6af541a055571-k8s-certs\") pod \"kube-apiserver-ip-172-31-23-28\" (UID: \"9b591e08c32c4c116ca6af541a055571\") " pod="kube-system/kube-apiserver-ip-172-31-23-28" Aug 19 08:16:00.806125 kubelet[3281]: I0819 08:16:00.806103 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/9b591e08c32c4c116ca6af541a055571-usr-share-ca-certificates\") pod \"kube-apiserver-ip-172-31-23-28\" (UID: \"9b591e08c32c4c116ca6af541a055571\") " pod="kube-system/kube-apiserver-ip-172-31-23-28" Aug 19 08:16:00.806410 kubelet[3281]: I0819 08:16:00.806123 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/418a60d6507e4316ba82f3c0751baa98-ca-certs\") pod \"kube-controller-manager-ip-172-31-23-28\" (UID: \"418a60d6507e4316ba82f3c0751baa98\") " pod="kube-system/kube-controller-manager-ip-172-31-23-28" Aug 19 08:16:00.806410 kubelet[3281]: I0819 08:16:00.806139 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/418a60d6507e4316ba82f3c0751baa98-kubeconfig\") pod \"kube-controller-manager-ip-172-31-23-28\" (UID: \"418a60d6507e4316ba82f3c0751baa98\") " pod="kube-system/kube-controller-manager-ip-172-31-23-28" Aug 19 08:16:00.806410 kubelet[3281]: I0819 08:16:00.806200 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/418a60d6507e4316ba82f3c0751baa98-usr-share-ca-certificates\") pod \"kube-controller-manager-ip-172-31-23-28\" (UID: \"418a60d6507e4316ba82f3c0751baa98\") " pod="kube-system/kube-controller-manager-ip-172-31-23-28" Aug 19 08:16:00.831114 kubelet[3281]: I0819 08:16:00.831077 3281 kubelet_node_status.go:72] "Attempting to register node" node="ip-172-31-23-28" Aug 19 08:16:00.840662 kubelet[3281]: I0819 08:16:00.840627 3281 kubelet_node_status.go:111] "Node was previously registered" node="ip-172-31-23-28" Aug 19 08:16:00.840815 kubelet[3281]: I0819 08:16:00.840717 3281 kubelet_node_status.go:75] "Successfully registered node" node="ip-172-31-23-28" Aug 19 08:16:01.554063 kubelet[3281]: I0819 08:16:01.553802 3281 apiserver.go:52] "Watching apiserver" Aug 19 08:16:01.603637 kubelet[3281]: I0819 08:16:01.603597 3281 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Aug 19 08:16:01.679494 kubelet[3281]: E0819 08:16:01.679110 3281 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ip-172-31-23-28\" already exists" pod="kube-system/kube-apiserver-ip-172-31-23-28" Aug 19 08:16:01.729923 kubelet[3281]: I0819 08:16:01.729819 3281 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ip-172-31-23-28" podStartSLOduration=1.729753321 podStartE2EDuration="1.729753321s" podCreationTimestamp="2025-08-19 08:16:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 08:16:01.726112626 +0000 UTC m=+1.276948969" watchObservedRunningTime="2025-08-19 08:16:01.729753321 +0000 UTC 
m=+1.280589662" Aug 19 08:16:01.770267 kubelet[3281]: I0819 08:16:01.770203 3281 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ip-172-31-23-28" podStartSLOduration=1.7701634560000001 podStartE2EDuration="1.770163456s" podCreationTimestamp="2025-08-19 08:16:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 08:16:01.756604795 +0000 UTC m=+1.307441134" watchObservedRunningTime="2025-08-19 08:16:01.770163456 +0000 UTC m=+1.320999796" Aug 19 08:16:01.789743 kubelet[3281]: I0819 08:16:01.789564 3281 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ip-172-31-23-28" podStartSLOduration=3.789540423 podStartE2EDuration="3.789540423s" podCreationTimestamp="2025-08-19 08:15:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 08:16:01.770621284 +0000 UTC m=+1.321457623" watchObservedRunningTime="2025-08-19 08:16:01.789540423 +0000 UTC m=+1.340376764" Aug 19 08:16:05.524293 kubelet[3281]: I0819 08:16:05.524250 3281 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Aug 19 08:16:05.526279 containerd[2014]: time="2025-08-19T08:16:05.525447314Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Aug 19 08:16:05.527399 kubelet[3281]: I0819 08:16:05.525820 3281 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Aug 19 08:16:05.932561 systemd[1]: Created slice kubepods-besteffort-pod6a07bdef_022b_480e_adae_a51035d4214e.slice - libcontainer container kubepods-besteffort-pod6a07bdef_022b_480e_adae_a51035d4214e.slice. 
Aug 19 08:16:05.950400 kubelet[3281]: I0819 08:16:05.950216 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/6a07bdef-022b-480e-adae-a51035d4214e-kube-proxy\") pod \"kube-proxy-xw4zb\" (UID: \"6a07bdef-022b-480e-adae-a51035d4214e\") " pod="kube-system/kube-proxy-xw4zb" Aug 19 08:16:05.950400 kubelet[3281]: I0819 08:16:05.950257 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6a07bdef-022b-480e-adae-a51035d4214e-lib-modules\") pod \"kube-proxy-xw4zb\" (UID: \"6a07bdef-022b-480e-adae-a51035d4214e\") " pod="kube-system/kube-proxy-xw4zb" Aug 19 08:16:05.950400 kubelet[3281]: I0819 08:16:05.950280 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p4w5\" (UniqueName: \"kubernetes.io/projected/6a07bdef-022b-480e-adae-a51035d4214e-kube-api-access-6p4w5\") pod \"kube-proxy-xw4zb\" (UID: \"6a07bdef-022b-480e-adae-a51035d4214e\") " pod="kube-system/kube-proxy-xw4zb" Aug 19 08:16:05.950400 kubelet[3281]: I0819 08:16:05.950310 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/6a07bdef-022b-480e-adae-a51035d4214e-xtables-lock\") pod \"kube-proxy-xw4zb\" (UID: \"6a07bdef-022b-480e-adae-a51035d4214e\") " pod="kube-system/kube-proxy-xw4zb" Aug 19 08:16:06.060326 kubelet[3281]: E0819 08:16:06.060281 3281 projected.go:288] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Aug 19 08:16:06.060326 kubelet[3281]: E0819 08:16:06.060322 3281 projected.go:194] Error preparing data for projected volume kube-api-access-6p4w5 for pod kube-system/kube-proxy-xw4zb: configmap "kube-root-ca.crt" not found Aug 19 08:16:06.060488 kubelet[3281]: E0819 08:16:06.060385 3281 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6a07bdef-022b-480e-adae-a51035d4214e-kube-api-access-6p4w5 podName:6a07bdef-022b-480e-adae-a51035d4214e nodeName:}" failed. No retries permitted until 2025-08-19 08:16:06.56036412 +0000 UTC m=+6.111200454 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-6p4w5" (UniqueName: "kubernetes.io/projected/6a07bdef-022b-480e-adae-a51035d4214e-kube-api-access-6p4w5") pod "kube-proxy-xw4zb" (UID: "6a07bdef-022b-480e-adae-a51035d4214e") : configmap "kube-root-ca.crt" not found Aug 19 08:16:06.650263 systemd[1]: Created slice kubepods-besteffort-podf2e71f62_ba08_4859_a5da_04d67fbe6b10.slice - libcontainer container kubepods-besteffort-podf2e71f62_ba08_4859_a5da_04d67fbe6b10.slice. 
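The MountVolume.SetUp failure above is an ordering issue rather than a fault: kube-controller-manager has not yet published the kube-root-ca.crt ConfigMap into the namespace, so the projected token volume for kube-proxy cannot be assembled yet and the operation is requeued, initially 500ms out as durationBeforeRetry shows. If the failure persisted, the delay would keep growing; a minimal sketch of that doubling-backoff pattern, with the 500ms start taken from the log and the cap chosen here purely for illustration (not kubelet's exact constants):

package main

import (
	"fmt"
	"time"
)

func main() {
	// Illustrative exponential backoff: start at the 500ms seen in the log,
	// double after each consecutive failure, and cap the delay (cap assumed).
	delay := 500 * time.Millisecond
	const maxDelay = 2 * time.Minute

	for attempt := 1; attempt <= 6; attempt++ {
		fmt.Printf("attempt %d: no retries permitted until now + %v\n", attempt, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}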
Aug 19 08:16:06.656180 kubelet[3281]: I0819 08:16:06.656139 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/f2e71f62-ba08-4859-a5da-04d67fbe6b10-var-lib-calico\") pod \"tigera-operator-5bf8dfcb4-5ljwz\" (UID: \"f2e71f62-ba08-4859-a5da-04d67fbe6b10\") " pod="tigera-operator/tigera-operator-5bf8dfcb4-5ljwz" Aug 19 08:16:06.656180 kubelet[3281]: I0819 08:16:06.656183 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjslt\" (UniqueName: \"kubernetes.io/projected/f2e71f62-ba08-4859-a5da-04d67fbe6b10-kube-api-access-mjslt\") pod \"tigera-operator-5bf8dfcb4-5ljwz\" (UID: \"f2e71f62-ba08-4859-a5da-04d67fbe6b10\") " pod="tigera-operator/tigera-operator-5bf8dfcb4-5ljwz" Aug 19 08:16:06.844179 containerd[2014]: time="2025-08-19T08:16:06.844139159Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-xw4zb,Uid:6a07bdef-022b-480e-adae-a51035d4214e,Namespace:kube-system,Attempt:0,}" Aug 19 08:16:06.876591 containerd[2014]: time="2025-08-19T08:16:06.876547590Z" level=info msg="connecting to shim ccbcffdaf783264c9ba6cfebb10f32613fda5762659e9472a464ec140aa8d51b" address="unix:///run/containerd/s/b19a656e87a08801ccb1ff4fa5406b5d6d178725bdd2cceb905b2810df42f7a2" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:16:06.911270 systemd[1]: Started cri-containerd-ccbcffdaf783264c9ba6cfebb10f32613fda5762659e9472a464ec140aa8d51b.scope - libcontainer container ccbcffdaf783264c9ba6cfebb10f32613fda5762659e9472a464ec140aa8d51b. Aug 19 08:16:06.944182 containerd[2014]: time="2025-08-19T08:16:06.944141730Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-xw4zb,Uid:6a07bdef-022b-480e-adae-a51035d4214e,Namespace:kube-system,Attempt:0,} returns sandbox id \"ccbcffdaf783264c9ba6cfebb10f32613fda5762659e9472a464ec140aa8d51b\"" Aug 19 08:16:06.946867 containerd[2014]: time="2025-08-19T08:16:06.946833644Z" level=info msg="CreateContainer within sandbox \"ccbcffdaf783264c9ba6cfebb10f32613fda5762659e9472a464ec140aa8d51b\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Aug 19 08:16:06.955006 containerd[2014]: time="2025-08-19T08:16:06.954940289Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5bf8dfcb4-5ljwz,Uid:f2e71f62-ba08-4859-a5da-04d67fbe6b10,Namespace:tigera-operator,Attempt:0,}" Aug 19 08:16:06.967359 containerd[2014]: time="2025-08-19T08:16:06.967243964Z" level=info msg="Container fc7c6b4f56aad79eb1c6391123dedd1afab6be6f4709f064743275009752037f: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:16:06.981553 containerd[2014]: time="2025-08-19T08:16:06.981509166Z" level=info msg="CreateContainer within sandbox \"ccbcffdaf783264c9ba6cfebb10f32613fda5762659e9472a464ec140aa8d51b\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"fc7c6b4f56aad79eb1c6391123dedd1afab6be6f4709f064743275009752037f\"" Aug 19 08:16:06.982641 containerd[2014]: time="2025-08-19T08:16:06.982269334Z" level=info msg="StartContainer for \"fc7c6b4f56aad79eb1c6391123dedd1afab6be6f4709f064743275009752037f\"" Aug 19 08:16:06.984217 containerd[2014]: time="2025-08-19T08:16:06.983944328Z" level=info msg="connecting to shim fc7c6b4f56aad79eb1c6391123dedd1afab6be6f4709f064743275009752037f" address="unix:///run/containerd/s/b19a656e87a08801ccb1ff4fa5406b5d6d178725bdd2cceb905b2810df42f7a2" protocol=ttrpc version=3 Aug 19 08:16:07.003276 containerd[2014]: 
time="2025-08-19T08:16:07.002508791Z" level=info msg="connecting to shim 26ead25ce943da983885d347d72dee684cc450e21027101dbe06b1e2d327d892" address="unix:///run/containerd/s/d823c6a8d3c42dd2f1e8118bdea90928666ca81138f939cb6bd0a15cf4e653e6" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:16:07.016267 systemd[1]: Started cri-containerd-fc7c6b4f56aad79eb1c6391123dedd1afab6be6f4709f064743275009752037f.scope - libcontainer container fc7c6b4f56aad79eb1c6391123dedd1afab6be6f4709f064743275009752037f. Aug 19 08:16:07.043317 systemd[1]: Started cri-containerd-26ead25ce943da983885d347d72dee684cc450e21027101dbe06b1e2d327d892.scope - libcontainer container 26ead25ce943da983885d347d72dee684cc450e21027101dbe06b1e2d327d892. Aug 19 08:16:07.100177 containerd[2014]: time="2025-08-19T08:16:07.100122114Z" level=info msg="StartContainer for \"fc7c6b4f56aad79eb1c6391123dedd1afab6be6f4709f064743275009752037f\" returns successfully" Aug 19 08:16:07.122319 containerd[2014]: time="2025-08-19T08:16:07.122272447Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5bf8dfcb4-5ljwz,Uid:f2e71f62-ba08-4859-a5da-04d67fbe6b10,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"26ead25ce943da983885d347d72dee684cc450e21027101dbe06b1e2d327d892\"" Aug 19 08:16:07.124879 containerd[2014]: time="2025-08-19T08:16:07.124828102Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\"" Aug 19 08:16:07.669151 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3729978166.mount: Deactivated successfully. Aug 19 08:16:07.715225 kubelet[3281]: I0819 08:16:07.715006 3281 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-xw4zb" podStartSLOduration=2.7149844180000002 podStartE2EDuration="2.714984418s" podCreationTimestamp="2025-08-19 08:16:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 08:16:07.714804006 +0000 UTC m=+7.265640348" watchObservedRunningTime="2025-08-19 08:16:07.714984418 +0000 UTC m=+7.265820753" Aug 19 08:16:08.718262 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1174738482.mount: Deactivated successfully. 
Aug 19 08:16:12.015490 containerd[2014]: time="2025-08-19T08:16:12.015434795Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:12.019496 containerd[2014]: time="2025-08-19T08:16:12.019425631Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.3: active requests=0, bytes read=25056543" Aug 19 08:16:12.022201 containerd[2014]: time="2025-08-19T08:16:12.022149127Z" level=info msg="ImageCreate event name:\"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:12.028056 containerd[2014]: time="2025-08-19T08:16:12.026977323Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:12.028056 containerd[2014]: time="2025-08-19T08:16:12.027940183Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.3\" with image id \"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\", repo tag \"quay.io/tigera/operator:v1.38.3\", repo digest \"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\", size \"25052538\" in 4.903082141s" Aug 19 08:16:12.028056 containerd[2014]: time="2025-08-19T08:16:12.027969215Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\" returns image reference \"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\"" Aug 19 08:16:12.031105 containerd[2014]: time="2025-08-19T08:16:12.031067789Z" level=info msg="CreateContainer within sandbox \"26ead25ce943da983885d347d72dee684cc450e21027101dbe06b1e2d327d892\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Aug 19 08:16:12.047431 containerd[2014]: time="2025-08-19T08:16:12.047402099Z" level=info msg="Container 814f579cfb62e75e25f9a8879fa127503ec3b092ee88a0efdc912b0c9a4b0923: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:16:12.060960 containerd[2014]: time="2025-08-19T08:16:12.060909903Z" level=info msg="CreateContainer within sandbox \"26ead25ce943da983885d347d72dee684cc450e21027101dbe06b1e2d327d892\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"814f579cfb62e75e25f9a8879fa127503ec3b092ee88a0efdc912b0c9a4b0923\"" Aug 19 08:16:12.061621 containerd[2014]: time="2025-08-19T08:16:12.061598399Z" level=info msg="StartContainer for \"814f579cfb62e75e25f9a8879fa127503ec3b092ee88a0efdc912b0c9a4b0923\"" Aug 19 08:16:12.062691 containerd[2014]: time="2025-08-19T08:16:12.062652449Z" level=info msg="connecting to shim 814f579cfb62e75e25f9a8879fa127503ec3b092ee88a0efdc912b0c9a4b0923" address="unix:///run/containerd/s/d823c6a8d3c42dd2f1e8118bdea90928666ca81138f939cb6bd0a15cf4e653e6" protocol=ttrpc version=3 Aug 19 08:16:12.093272 systemd[1]: Started cri-containerd-814f579cfb62e75e25f9a8879fa127503ec3b092ee88a0efdc912b0c9a4b0923.scope - libcontainer container 814f579cfb62e75e25f9a8879fa127503ec3b092ee88a0efdc912b0c9a4b0923. 
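The pull statistics above give a rough transfer rate: 25056543 bytes read for quay.io/tigera/operator:v1.38.3 in 4.903082141s is about 5 MB/s. A one-off reproduction of the arithmetic (bytes read is the compressed registry transfer, so this says nothing about unpack speed):

package main

import "fmt"

func main() {
	// Figures from the "stop pulling image" and "Pulled image ... in 4.903082141s" entries above.
	const bytesRead = 25056543.0 // compressed bytes fetched from the registry
	const seconds = 4.903082141  // reported pull duration

	fmt.Printf("%.2f MiB/s (%.2f MB/s)\n",
		bytesRead/seconds/(1024*1024), // about 4.87 MiB/s
		bytesRead/seconds/1e6)         // about 5.11 MB/s
}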
Aug 19 08:16:12.129318 containerd[2014]: time="2025-08-19T08:16:12.129281899Z" level=info msg="StartContainer for \"814f579cfb62e75e25f9a8879fa127503ec3b092ee88a0efdc912b0c9a4b0923\" returns successfully" Aug 19 08:16:12.711299 kubelet[3281]: I0819 08:16:12.711147 3281 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-5bf8dfcb4-5ljwz" podStartSLOduration=1.806014569 podStartE2EDuration="6.711129822s" podCreationTimestamp="2025-08-19 08:16:06 +0000 UTC" firstStartedPulling="2025-08-19 08:16:07.124218157 +0000 UTC m=+6.675054476" lastFinishedPulling="2025-08-19 08:16:12.029333408 +0000 UTC m=+11.580169729" observedRunningTime="2025-08-19 08:16:12.710286818 +0000 UTC m=+12.261123157" watchObservedRunningTime="2025-08-19 08:16:12.711129822 +0000 UTC m=+12.261966214" Aug 19 08:16:14.675055 update_engine[1978]: I20250819 08:16:14.673084 1978 update_attempter.cc:509] Updating boot flags... Aug 19 08:16:19.762481 sudo[2346]: pam_unix(sudo:session): session closed for user root Aug 19 08:16:19.788244 sshd[2345]: Connection closed by 139.178.89.65 port 38956 Aug 19 08:16:19.788697 sshd-session[2342]: pam_unix(sshd:session): session closed for user core Aug 19 08:16:19.797531 systemd[1]: sshd@6-172.31.23.28:22-139.178.89.65:38956.service: Deactivated successfully. Aug 19 08:16:19.805114 systemd[1]: session-7.scope: Deactivated successfully. Aug 19 08:16:19.805808 systemd[1]: session-7.scope: Consumed 5.184s CPU time, 150.7M memory peak. Aug 19 08:16:19.812616 systemd-logind[1976]: Session 7 logged out. Waiting for processes to exit. Aug 19 08:16:19.817264 systemd-logind[1976]: Removed session 7. Aug 19 08:16:24.436081 systemd[1]: Created slice kubepods-besteffort-pode69d1b5e_2ab8_474b_b167_924fdd03e92f.slice - libcontainer container kubepods-besteffort-pode69d1b5e_2ab8_474b_b167_924fdd03e92f.slice. Aug 19 08:16:24.494252 kubelet[3281]: I0819 08:16:24.494204 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e69d1b5e-2ab8-474b-b167-924fdd03e92f-tigera-ca-bundle\") pod \"calico-typha-f4459b4c7-xxlqp\" (UID: \"e69d1b5e-2ab8-474b-b167-924fdd03e92f\") " pod="calico-system/calico-typha-f4459b4c7-xxlqp" Aug 19 08:16:24.494634 kubelet[3281]: I0819 08:16:24.494368 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/e69d1b5e-2ab8-474b-b167-924fdd03e92f-typha-certs\") pod \"calico-typha-f4459b4c7-xxlqp\" (UID: \"e69d1b5e-2ab8-474b-b167-924fdd03e92f\") " pod="calico-system/calico-typha-f4459b4c7-xxlqp" Aug 19 08:16:24.494634 kubelet[3281]: I0819 08:16:24.494391 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4p2c\" (UniqueName: \"kubernetes.io/projected/e69d1b5e-2ab8-474b-b167-924fdd03e92f-kube-api-access-p4p2c\") pod \"calico-typha-f4459b4c7-xxlqp\" (UID: \"e69d1b5e-2ab8-474b-b167-924fdd03e92f\") " pod="calico-system/calico-typha-f4459b4c7-xxlqp" Aug 19 08:16:24.714722 systemd[1]: Created slice kubepods-besteffort-podd72d9c81_b232_4641_a27c_96e645f5fbaf.slice - libcontainer container kubepods-besteffort-podd72d9c81_b232_4641_a27c_96e645f5fbaf.slice. 
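In the tigera-operator startup entry above, podStartSLOduration=1.806014569 is smaller than podStartE2EDuration=6.711129822s by almost exactly the image-pull window (lastFinishedPulling minus firstStartedPulling, about 4.905s), which is consistent with the SLO figure excluding pull time. A sketch that reproduces the arithmetic from the timestamps in that entry, with the parse layout assumed and the m=+ suffixes dropped:

package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}

	// Timestamps from the tigera-operator pod_startup_latency_tracker entry.
	created := parse("2025-08-19 08:16:06 +0000 UTC")
	observed := parse("2025-08-19 08:16:12.711129822 +0000 UTC")
	pullStart := parse("2025-08-19 08:16:07.124218157 +0000 UTC")
	pullEnd := parse("2025-08-19 08:16:12.029333408 +0000 UTC")

	e2e := observed.Sub(created)   // 6.711129822s
	pull := pullEnd.Sub(pullStart) // roughly 4.905s
	// e2e-pull comes out near 1.806s, lining up with the reported SLO duration.
	fmt.Println(e2e, pull, e2e-pull)
}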
Aug 19 08:16:24.750596 containerd[2014]: time="2025-08-19T08:16:24.750555656Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-f4459b4c7-xxlqp,Uid:e69d1b5e-2ab8-474b-b167-924fdd03e92f,Namespace:calico-system,Attempt:0,}" Aug 19 08:16:24.796439 kubelet[3281]: I0819 08:16:24.796398 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/d72d9c81-b232-4641-a27c-96e645f5fbaf-flexvol-driver-host\") pod \"calico-node-hkk2w\" (UID: \"d72d9c81-b232-4641-a27c-96e645f5fbaf\") " pod="calico-system/calico-node-hkk2w" Aug 19 08:16:24.796439 kubelet[3281]: I0819 08:16:24.796443 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/d72d9c81-b232-4641-a27c-96e645f5fbaf-cni-log-dir\") pod \"calico-node-hkk2w\" (UID: \"d72d9c81-b232-4641-a27c-96e645f5fbaf\") " pod="calico-system/calico-node-hkk2w" Aug 19 08:16:24.796591 kubelet[3281]: I0819 08:16:24.796462 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/d72d9c81-b232-4641-a27c-96e645f5fbaf-cni-net-dir\") pod \"calico-node-hkk2w\" (UID: \"d72d9c81-b232-4641-a27c-96e645f5fbaf\") " pod="calico-system/calico-node-hkk2w" Aug 19 08:16:24.796591 kubelet[3281]: I0819 08:16:24.796477 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/d72d9c81-b232-4641-a27c-96e645f5fbaf-var-lib-calico\") pod \"calico-node-hkk2w\" (UID: \"d72d9c81-b232-4641-a27c-96e645f5fbaf\") " pod="calico-system/calico-node-hkk2w" Aug 19 08:16:24.796591 kubelet[3281]: I0819 08:16:24.796493 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/d72d9c81-b232-4641-a27c-96e645f5fbaf-cni-bin-dir\") pod \"calico-node-hkk2w\" (UID: \"d72d9c81-b232-4641-a27c-96e645f5fbaf\") " pod="calico-system/calico-node-hkk2w" Aug 19 08:16:24.796591 kubelet[3281]: I0819 08:16:24.796510 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/d72d9c81-b232-4641-a27c-96e645f5fbaf-node-certs\") pod \"calico-node-hkk2w\" (UID: \"d72d9c81-b232-4641-a27c-96e645f5fbaf\") " pod="calico-system/calico-node-hkk2w" Aug 19 08:16:24.796591 kubelet[3281]: I0819 08:16:24.796525 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/d72d9c81-b232-4641-a27c-96e645f5fbaf-policysync\") pod \"calico-node-hkk2w\" (UID: \"d72d9c81-b232-4641-a27c-96e645f5fbaf\") " pod="calico-system/calico-node-hkk2w" Aug 19 08:16:24.796715 kubelet[3281]: I0819 08:16:24.796540 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d72d9c81-b232-4641-a27c-96e645f5fbaf-tigera-ca-bundle\") pod \"calico-node-hkk2w\" (UID: \"d72d9c81-b232-4641-a27c-96e645f5fbaf\") " pod="calico-system/calico-node-hkk2w" Aug 19 08:16:24.796715 kubelet[3281]: I0819 08:16:24.796556 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frczc\" (UniqueName: 
\"kubernetes.io/projected/d72d9c81-b232-4641-a27c-96e645f5fbaf-kube-api-access-frczc\") pod \"calico-node-hkk2w\" (UID: \"d72d9c81-b232-4641-a27c-96e645f5fbaf\") " pod="calico-system/calico-node-hkk2w" Aug 19 08:16:24.796715 kubelet[3281]: I0819 08:16:24.796572 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/d72d9c81-b232-4641-a27c-96e645f5fbaf-var-run-calico\") pod \"calico-node-hkk2w\" (UID: \"d72d9c81-b232-4641-a27c-96e645f5fbaf\") " pod="calico-system/calico-node-hkk2w" Aug 19 08:16:24.796715 kubelet[3281]: I0819 08:16:24.796586 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/d72d9c81-b232-4641-a27c-96e645f5fbaf-xtables-lock\") pod \"calico-node-hkk2w\" (UID: \"d72d9c81-b232-4641-a27c-96e645f5fbaf\") " pod="calico-system/calico-node-hkk2w" Aug 19 08:16:24.796715 kubelet[3281]: I0819 08:16:24.796600 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d72d9c81-b232-4641-a27c-96e645f5fbaf-lib-modules\") pod \"calico-node-hkk2w\" (UID: \"d72d9c81-b232-4641-a27c-96e645f5fbaf\") " pod="calico-system/calico-node-hkk2w" Aug 19 08:16:24.799046 containerd[2014]: time="2025-08-19T08:16:24.798970131Z" level=info msg="connecting to shim 0262f960a62eedca1f1fe3cd12d6be12aa9892a55acc394aac49e9d248893b9a" address="unix:///run/containerd/s/587bf5915f301acce187934b5e11f7c2622bb807ca32c02d880c1cdc7a5275ed" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:16:24.829585 systemd[1]: Started cri-containerd-0262f960a62eedca1f1fe3cd12d6be12aa9892a55acc394aac49e9d248893b9a.scope - libcontainer container 0262f960a62eedca1f1fe3cd12d6be12aa9892a55acc394aac49e9d248893b9a. Aug 19 08:16:24.910550 containerd[2014]: time="2025-08-19T08:16:24.909631983Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-f4459b4c7-xxlqp,Uid:e69d1b5e-2ab8-474b-b167-924fdd03e92f,Namespace:calico-system,Attempt:0,} returns sandbox id \"0262f960a62eedca1f1fe3cd12d6be12aa9892a55acc394aac49e9d248893b9a\"" Aug 19 08:16:24.916044 kubelet[3281]: E0819 08:16:24.915998 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:24.916216 kubelet[3281]: W0819 08:16:24.916194 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:24.916425 kubelet[3281]: E0819 08:16:24.916382 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:16:24.920160 containerd[2014]: time="2025-08-19T08:16:24.918990011Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\"" Aug 19 08:16:24.929455 kubelet[3281]: E0819 08:16:24.929221 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:24.930094 kubelet[3281]: W0819 08:16:24.930055 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:24.930225 kubelet[3281]: E0819 08:16:24.930207 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.017133 kubelet[3281]: E0819 08:16:25.016611 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fhzgx" podUID="fd28c4a4-f3f6-4029-ab76-882058ba96b8" Aug 19 08:16:25.027367 containerd[2014]: time="2025-08-19T08:16:25.027316460Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-hkk2w,Uid:d72d9c81-b232-4641-a27c-96e645f5fbaf,Namespace:calico-system,Attempt:0,}" Aug 19 08:16:25.084066 kubelet[3281]: E0819 08:16:25.083982 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.084066 kubelet[3281]: W0819 08:16:25.084014 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.084607 kubelet[3281]: E0819 08:16:25.084074 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.084607 kubelet[3281]: E0819 08:16:25.084378 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.084607 kubelet[3281]: W0819 08:16:25.084391 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.084607 kubelet[3281]: E0819 08:16:25.084409 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.085252 kubelet[3281]: E0819 08:16:25.084612 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.085252 kubelet[3281]: W0819 08:16:25.084622 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.085252 kubelet[3281]: E0819 08:16:25.084635 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:16:25.085252 kubelet[3281]: E0819 08:16:25.084836 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.085252 kubelet[3281]: W0819 08:16:25.084849 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.085252 kubelet[3281]: E0819 08:16:25.084861 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.085252 kubelet[3281]: E0819 08:16:25.085094 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.085252 kubelet[3281]: W0819 08:16:25.085106 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.085252 kubelet[3281]: E0819 08:16:25.085120 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.086499 kubelet[3281]: E0819 08:16:25.085316 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.086499 kubelet[3281]: W0819 08:16:25.085327 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.086499 kubelet[3281]: E0819 08:16:25.085338 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.086499 kubelet[3281]: E0819 08:16:25.085590 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.086499 kubelet[3281]: W0819 08:16:25.085601 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.086499 kubelet[3281]: E0819 08:16:25.085615 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.086499 kubelet[3281]: E0819 08:16:25.086067 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.086499 kubelet[3281]: W0819 08:16:25.086080 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.086499 kubelet[3281]: E0819 08:16:25.086096 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:16:25.086499 kubelet[3281]: E0819 08:16:25.086305 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.088307 kubelet[3281]: W0819 08:16:25.086316 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.088307 kubelet[3281]: E0819 08:16:25.086329 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.088307 kubelet[3281]: E0819 08:16:25.087063 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.088307 kubelet[3281]: W0819 08:16:25.087076 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.088307 kubelet[3281]: E0819 08:16:25.087091 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.088307 kubelet[3281]: E0819 08:16:25.087343 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.088307 kubelet[3281]: W0819 08:16:25.087357 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.088307 kubelet[3281]: E0819 08:16:25.087370 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.088307 kubelet[3281]: E0819 08:16:25.087913 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.088307 kubelet[3281]: W0819 08:16:25.087925 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.089570 kubelet[3281]: E0819 08:16:25.087940 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.089570 kubelet[3281]: E0819 08:16:25.088750 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.089570 kubelet[3281]: W0819 08:16:25.088763 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.089570 kubelet[3281]: E0819 08:16:25.088777 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:16:25.089570 kubelet[3281]: E0819 08:16:25.089364 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.089570 kubelet[3281]: W0819 08:16:25.089376 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.089570 kubelet[3281]: E0819 08:16:25.089392 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.090377 kubelet[3281]: E0819 08:16:25.090296 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.090377 kubelet[3281]: W0819 08:16:25.090311 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.090377 kubelet[3281]: E0819 08:16:25.090325 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.091724 kubelet[3281]: E0819 08:16:25.090891 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.091724 kubelet[3281]: W0819 08:16:25.090904 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.091724 kubelet[3281]: E0819 08:16:25.090917 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.091724 kubelet[3281]: E0819 08:16:25.091159 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.091724 kubelet[3281]: W0819 08:16:25.091170 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.091724 kubelet[3281]: E0819 08:16:25.091183 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.091724 kubelet[3281]: E0819 08:16:25.091369 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.091724 kubelet[3281]: W0819 08:16:25.091379 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.091724 kubelet[3281]: E0819 08:16:25.091392 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:16:25.091724 kubelet[3281]: E0819 08:16:25.091592 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.092104 kubelet[3281]: W0819 08:16:25.091602 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.092104 kubelet[3281]: E0819 08:16:25.091614 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.092509 kubelet[3281]: E0819 08:16:25.092373 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.092509 kubelet[3281]: W0819 08:16:25.092387 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.092509 kubelet[3281]: E0819 08:16:25.092406 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.099845 kubelet[3281]: E0819 08:16:25.099558 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.100184 kubelet[3281]: W0819 08:16:25.100002 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.100184 kubelet[3281]: E0819 08:16:25.100061 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.100184 kubelet[3281]: I0819 08:16:25.100106 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/fd28c4a4-f3f6-4029-ab76-882058ba96b8-varrun\") pod \"csi-node-driver-fhzgx\" (UID: \"fd28c4a4-f3f6-4029-ab76-882058ba96b8\") " pod="calico-system/csi-node-driver-fhzgx" Aug 19 08:16:25.100978 kubelet[3281]: E0819 08:16:25.100930 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.100978 kubelet[3281]: W0819 08:16:25.100952 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.100978 kubelet[3281]: E0819 08:16:25.100979 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:16:25.101307 kubelet[3281]: I0819 08:16:25.101008 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fd28c4a4-f3f6-4029-ab76-882058ba96b8-kubelet-dir\") pod \"csi-node-driver-fhzgx\" (UID: \"fd28c4a4-f3f6-4029-ab76-882058ba96b8\") " pod="calico-system/csi-node-driver-fhzgx" Aug 19 08:16:25.102451 kubelet[3281]: E0819 08:16:25.101370 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.102451 kubelet[3281]: W0819 08:16:25.101387 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.102451 kubelet[3281]: E0819 08:16:25.101407 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.102451 kubelet[3281]: I0819 08:16:25.101432 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwpgl\" (UniqueName: \"kubernetes.io/projected/fd28c4a4-f3f6-4029-ab76-882058ba96b8-kube-api-access-vwpgl\") pod \"csi-node-driver-fhzgx\" (UID: \"fd28c4a4-f3f6-4029-ab76-882058ba96b8\") " pod="calico-system/csi-node-driver-fhzgx" Aug 19 08:16:25.102451 kubelet[3281]: E0819 08:16:25.102274 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.102451 kubelet[3281]: W0819 08:16:25.102291 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.102451 kubelet[3281]: E0819 08:16:25.102308 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.102451 kubelet[3281]: I0819 08:16:25.102336 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fd28c4a4-f3f6-4029-ab76-882058ba96b8-socket-dir\") pod \"csi-node-driver-fhzgx\" (UID: \"fd28c4a4-f3f6-4029-ab76-882058ba96b8\") " pod="calico-system/csi-node-driver-fhzgx" Aug 19 08:16:25.103052 kubelet[3281]: E0819 08:16:25.103004 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.103948 kubelet[3281]: W0819 08:16:25.103022 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.104198 kubelet[3281]: E0819 08:16:25.104062 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:16:25.104198 kubelet[3281]: I0819 08:16:25.104120 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fd28c4a4-f3f6-4029-ab76-882058ba96b8-registration-dir\") pod \"csi-node-driver-fhzgx\" (UID: \"fd28c4a4-f3f6-4029-ab76-882058ba96b8\") " pod="calico-system/csi-node-driver-fhzgx" Aug 19 08:16:25.104323 kubelet[3281]: E0819 08:16:25.104214 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.104323 kubelet[3281]: W0819 08:16:25.104224 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.104475 kubelet[3281]: E0819 08:16:25.104423 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.104475 kubelet[3281]: W0819 08:16:25.104434 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.104475 kubelet[3281]: E0819 08:16:25.104440 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.104475 kubelet[3281]: E0819 08:16:25.104459 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.104819 kubelet[3281]: E0819 08:16:25.104802 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.104819 kubelet[3281]: W0819 08:16:25.104817 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.105183 kubelet[3281]: E0819 08:16:25.105160 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.105537 kubelet[3281]: E0819 08:16:25.105522 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.106053 kubelet[3281]: W0819 08:16:25.105540 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.106053 kubelet[3281]: E0819 08:16:25.105926 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:16:25.106053 kubelet[3281]: E0819 08:16:25.105994 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.106053 kubelet[3281]: W0819 08:16:25.106005 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.106292 kubelet[3281]: E0819 08:16:25.106085 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.106292 kubelet[3281]: E0819 08:16:25.106273 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.106292 kubelet[3281]: W0819 08:16:25.106283 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.106919 kubelet[3281]: E0819 08:16:25.106296 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.106919 kubelet[3281]: E0819 08:16:25.106521 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.106919 kubelet[3281]: W0819 08:16:25.106533 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.106919 kubelet[3281]: E0819 08:16:25.106546 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.106919 kubelet[3281]: E0819 08:16:25.106765 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.106919 kubelet[3281]: W0819 08:16:25.106776 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.106919 kubelet[3281]: E0819 08:16:25.106788 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.107483 kubelet[3281]: E0819 08:16:25.106993 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.107483 kubelet[3281]: W0819 08:16:25.107004 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.107483 kubelet[3281]: E0819 08:16:25.107019 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:16:25.108181 kubelet[3281]: E0819 08:16:25.107780 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.108181 kubelet[3281]: W0819 08:16:25.107795 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.108181 kubelet[3281]: E0819 08:16:25.107811 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.119850 containerd[2014]: time="2025-08-19T08:16:25.119720135Z" level=info msg="connecting to shim 851940023145abdd1eea3b2b1310cf6732fa90b7fbf9cc28f2cb59b088f46212" address="unix:///run/containerd/s/be324eefb339ce210d8b4490ab67d8fdb4d1b740f0bea24e787ceab8ae2dfe7d" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:16:25.165443 systemd[1]: Started cri-containerd-851940023145abdd1eea3b2b1310cf6732fa90b7fbf9cc28f2cb59b088f46212.scope - libcontainer container 851940023145abdd1eea3b2b1310cf6732fa90b7fbf9cc28f2cb59b088f46212. Aug 19 08:16:25.205332 kubelet[3281]: E0819 08:16:25.205248 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.205332 kubelet[3281]: W0819 08:16:25.205279 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.205534 kubelet[3281]: E0819 08:16:25.205405 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.207485 kubelet[3281]: E0819 08:16:25.207373 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.207485 kubelet[3281]: W0819 08:16:25.207400 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.207485 kubelet[3281]: E0819 08:16:25.207462 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.207816 kubelet[3281]: E0819 08:16:25.207797 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.207816 kubelet[3281]: W0819 08:16:25.207814 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.207940 kubelet[3281]: E0819 08:16:25.207857 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:16:25.209468 kubelet[3281]: E0819 08:16:25.209423 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.209747 kubelet[3281]: W0819 08:16:25.209668 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.209979 kubelet[3281]: E0819 08:16:25.209942 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.210595 kubelet[3281]: E0819 08:16:25.210457 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.210595 kubelet[3281]: W0819 08:16:25.210471 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.210907 kubelet[3281]: E0819 08:16:25.210820 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.211458 kubelet[3281]: E0819 08:16:25.211320 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.211458 kubelet[3281]: W0819 08:16:25.211435 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.211713 kubelet[3281]: E0819 08:16:25.211679 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.212475 kubelet[3281]: E0819 08:16:25.212423 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.212668 kubelet[3281]: W0819 08:16:25.212438 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.212979 kubelet[3281]: E0819 08:16:25.212802 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.213975 kubelet[3281]: E0819 08:16:25.213947 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.214188 kubelet[3281]: W0819 08:16:25.214132 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.214405 kubelet[3281]: E0819 08:16:25.214350 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:16:25.215007 kubelet[3281]: E0819 08:16:25.214992 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.215152 kubelet[3281]: W0819 08:16:25.215109 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.215371 kubelet[3281]: E0819 08:16:25.215341 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.216063 kubelet[3281]: E0819 08:16:25.215691 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.216063 kubelet[3281]: W0819 08:16:25.215907 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.216341 kubelet[3281]: E0819 08:16:25.216193 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.216624 kubelet[3281]: E0819 08:16:25.216573 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.216624 kubelet[3281]: W0819 08:16:25.216586 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.217192 kubelet[3281]: E0819 08:16:25.217174 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.217637 kubelet[3281]: E0819 08:16:25.217471 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.217637 kubelet[3281]: W0819 08:16:25.217613 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.217835 kubelet[3281]: E0819 08:16:25.217821 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.218472 kubelet[3281]: E0819 08:16:25.218431 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.218472 kubelet[3281]: W0819 08:16:25.218446 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.218798 kubelet[3281]: E0819 08:16:25.218694 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:16:25.219324 containerd[2014]: time="2025-08-19T08:16:25.219177721Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-hkk2w,Uid:d72d9c81-b232-4641-a27c-96e645f5fbaf,Namespace:calico-system,Attempt:0,} returns sandbox id \"851940023145abdd1eea3b2b1310cf6732fa90b7fbf9cc28f2cb59b088f46212\"" Aug 19 08:16:25.219407 kubelet[3281]: E0819 08:16:25.219293 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.219407 kubelet[3281]: W0819 08:16:25.219305 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.219776 kubelet[3281]: E0819 08:16:25.219675 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.220415 kubelet[3281]: E0819 08:16:25.220401 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.220550 kubelet[3281]: W0819 08:16:25.220535 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.220844 kubelet[3281]: E0819 08:16:25.220826 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.221164 kubelet[3281]: E0819 08:16:25.221123 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.222007 kubelet[3281]: W0819 08:16:25.221863 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.222317 kubelet[3281]: E0819 08:16:25.222201 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.222317 kubelet[3281]: W0819 08:16:25.222214 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.223403 kubelet[3281]: E0819 08:16:25.223348 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.223403 kubelet[3281]: E0819 08:16:25.223386 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:16:25.223727 kubelet[3281]: E0819 08:16:25.223581 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.223727 kubelet[3281]: W0819 08:16:25.223594 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.224431 kubelet[3281]: E0819 08:16:25.224364 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.224612 kubelet[3281]: W0819 08:16:25.224518 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.224745 kubelet[3281]: E0819 08:16:25.224731 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.224832 kubelet[3281]: E0819 08:16:25.224823 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.224998 kubelet[3281]: E0819 08:16:25.224911 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.224998 kubelet[3281]: W0819 08:16:25.224956 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.225190 kubelet[3281]: E0819 08:16:25.225095 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.225449 kubelet[3281]: E0819 08:16:25.225418 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.225449 kubelet[3281]: W0819 08:16:25.225432 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.225626 kubelet[3281]: E0819 08:16:25.225611 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.225928 kubelet[3281]: E0819 08:16:25.225896 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.225928 kubelet[3281]: W0819 08:16:25.225910 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.226126 kubelet[3281]: E0819 08:16:25.226076 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:16:25.226471 kubelet[3281]: E0819 08:16:25.226440 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.226471 kubelet[3281]: W0819 08:16:25.226454 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.226643 kubelet[3281]: E0819 08:16:25.226595 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.226950 kubelet[3281]: E0819 08:16:25.226920 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.226950 kubelet[3281]: W0819 08:16:25.226933 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.227158 kubelet[3281]: E0819 08:16:25.227110 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.227682 kubelet[3281]: E0819 08:16:25.227626 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.227682 kubelet[3281]: W0819 08:16:25.227640 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.227682 kubelet[3281]: E0819 08:16:25.227654 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:25.252557 kubelet[3281]: E0819 08:16:25.252526 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:25.252557 kubelet[3281]: W0819 08:16:25.252554 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:25.252742 kubelet[3281]: E0819 08:16:25.252579 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:26.614073 kubelet[3281]: E0819 08:16:26.613010 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fhzgx" podUID="fd28c4a4-f3f6-4029-ab76-882058ba96b8" Aug 19 08:16:27.754784 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1691199926.mount: Deactivated successfully. 
Aug 19 08:16:28.615694 kubelet[3281]: E0819 08:16:28.615359 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fhzgx" podUID="fd28c4a4-f3f6-4029-ab76-882058ba96b8" Aug 19 08:16:28.873529 containerd[2014]: time="2025-08-19T08:16:28.873311818Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:28.874410 containerd[2014]: time="2025-08-19T08:16:28.874342266Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.2: active requests=0, bytes read=35233364" Aug 19 08:16:28.876218 containerd[2014]: time="2025-08-19T08:16:28.876119415Z" level=info msg="ImageCreate event name:\"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:28.879305 containerd[2014]: time="2025-08-19T08:16:28.879227033Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:28.880052 containerd[2014]: time="2025-08-19T08:16:28.879975290Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.2\" with image id \"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\", size \"35233218\" in 3.960944714s" Aug 19 08:16:28.880052 containerd[2014]: time="2025-08-19T08:16:28.880010057Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\" returns image reference \"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\"" Aug 19 08:16:28.881324 containerd[2014]: time="2025-08-19T08:16:28.881291648Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\"" Aug 19 08:16:28.904657 containerd[2014]: time="2025-08-19T08:16:28.904602942Z" level=info msg="CreateContainer within sandbox \"0262f960a62eedca1f1fe3cd12d6be12aa9892a55acc394aac49e9d248893b9a\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Aug 19 08:16:28.917056 containerd[2014]: time="2025-08-19T08:16:28.914885830Z" level=info msg="Container 96aba5ec07eb692a874cb05f0f8bfa7463935bc81ecadff9fcca5ca3f091d082: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:16:28.926849 containerd[2014]: time="2025-08-19T08:16:28.926790746Z" level=info msg="CreateContainer within sandbox \"0262f960a62eedca1f1fe3cd12d6be12aa9892a55acc394aac49e9d248893b9a\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"96aba5ec07eb692a874cb05f0f8bfa7463935bc81ecadff9fcca5ca3f091d082\"" Aug 19 08:16:28.927899 containerd[2014]: time="2025-08-19T08:16:28.927871060Z" level=info msg="StartContainer for \"96aba5ec07eb692a874cb05f0f8bfa7463935bc81ecadff9fcca5ca3f091d082\"" Aug 19 08:16:28.930067 containerd[2014]: time="2025-08-19T08:16:28.929262811Z" level=info msg="connecting to shim 96aba5ec07eb692a874cb05f0f8bfa7463935bc81ecadff9fcca5ca3f091d082" address="unix:///run/containerd/s/587bf5915f301acce187934b5e11f7c2622bb807ca32c02d880c1cdc7a5275ed" protocol=ttrpc version=3 Aug 19 08:16:28.957223 systemd[1]: Started 
cri-containerd-96aba5ec07eb692a874cb05f0f8bfa7463935bc81ecadff9fcca5ca3f091d082.scope - libcontainer container 96aba5ec07eb692a874cb05f0f8bfa7463935bc81ecadff9fcca5ca3f091d082. Aug 19 08:16:29.028526 containerd[2014]: time="2025-08-19T08:16:29.028483751Z" level=info msg="StartContainer for \"96aba5ec07eb692a874cb05f0f8bfa7463935bc81ecadff9fcca5ca3f091d082\" returns successfully" Aug 19 08:16:29.827234 kubelet[3281]: E0819 08:16:29.827198 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:29.827234 kubelet[3281]: W0819 08:16:29.827232 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:29.827919 kubelet[3281]: E0819 08:16:29.827257 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:29.827919 kubelet[3281]: E0819 08:16:29.827609 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:29.827919 kubelet[3281]: W0819 08:16:29.827620 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:29.827919 kubelet[3281]: E0819 08:16:29.827634 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:29.827919 kubelet[3281]: E0819 08:16:29.827800 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:29.827919 kubelet[3281]: W0819 08:16:29.827808 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:29.827919 kubelet[3281]: E0819 08:16:29.827816 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:29.828223 kubelet[3281]: E0819 08:16:29.827949 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:29.828223 kubelet[3281]: W0819 08:16:29.827955 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:29.828223 kubelet[3281]: E0819 08:16:29.827962 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:16:29.828223 kubelet[3281]: E0819 08:16:29.828118 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:29.828223 kubelet[3281]: W0819 08:16:29.828125 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:29.828223 kubelet[3281]: E0819 08:16:29.828132 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:29.828479 kubelet[3281]: E0819 08:16:29.828265 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:29.828479 kubelet[3281]: W0819 08:16:29.828278 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:29.828479 kubelet[3281]: E0819 08:16:29.828287 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:29.828479 kubelet[3281]: E0819 08:16:29.828418 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:29.828479 kubelet[3281]: W0819 08:16:29.828424 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:29.828479 kubelet[3281]: E0819 08:16:29.828431 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:29.828678 kubelet[3281]: E0819 08:16:29.828552 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:29.828678 kubelet[3281]: W0819 08:16:29.828558 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:29.828678 kubelet[3281]: E0819 08:16:29.828564 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:29.828802 kubelet[3281]: E0819 08:16:29.828690 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:29.828802 kubelet[3281]: W0819 08:16:29.828696 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:29.828802 kubelet[3281]: E0819 08:16:29.828702 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:16:29.828908 kubelet[3281]: E0819 08:16:29.828871 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:29.828908 kubelet[3281]: W0819 08:16:29.828878 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:29.828908 kubelet[3281]: E0819 08:16:29.828887 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:29.829086 kubelet[3281]: E0819 08:16:29.829065 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:29.829086 kubelet[3281]: W0819 08:16:29.829082 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:29.829179 kubelet[3281]: E0819 08:16:29.829091 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:29.829240 kubelet[3281]: E0819 08:16:29.829225 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:29.829240 kubelet[3281]: W0819 08:16:29.829237 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:29.829324 kubelet[3281]: E0819 08:16:29.829244 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:29.829391 kubelet[3281]: E0819 08:16:29.829376 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:29.829391 kubelet[3281]: W0819 08:16:29.829385 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:29.829391 kubelet[3281]: E0819 08:16:29.829391 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:29.829567 kubelet[3281]: E0819 08:16:29.829511 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:29.829567 kubelet[3281]: W0819 08:16:29.829519 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:29.829567 kubelet[3281]: E0819 08:16:29.829525 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:16:29.829680 kubelet[3281]: E0819 08:16:29.829662 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:29.829680 kubelet[3281]: W0819 08:16:29.829676 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:29.829753 kubelet[3281]: E0819 08:16:29.829683 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:29.855574 kubelet[3281]: E0819 08:16:29.855282 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:29.855574 kubelet[3281]: W0819 08:16:29.855305 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:29.855574 kubelet[3281]: E0819 08:16:29.855325 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:29.855820 kubelet[3281]: E0819 08:16:29.855808 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:29.855902 kubelet[3281]: W0819 08:16:29.855871 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:29.855902 kubelet[3281]: E0819 08:16:29.855897 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:29.856144 kubelet[3281]: E0819 08:16:29.856108 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:29.856144 kubelet[3281]: W0819 08:16:29.856115 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:29.856144 kubelet[3281]: E0819 08:16:29.856130 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:29.856313 kubelet[3281]: E0819 08:16:29.856297 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:29.856313 kubelet[3281]: W0819 08:16:29.856311 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:29.856386 kubelet[3281]: E0819 08:16:29.856324 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:16:29.856526 kubelet[3281]: E0819 08:16:29.856503 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:29.856526 kubelet[3281]: W0819 08:16:29.856521 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:29.856612 kubelet[3281]: E0819 08:16:29.856547 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:29.857010 kubelet[3281]: E0819 08:16:29.856974 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:29.857010 kubelet[3281]: W0819 08:16:29.856988 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:29.857295 kubelet[3281]: E0819 08:16:29.857227 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:29.857506 kubelet[3281]: E0819 08:16:29.857453 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:29.857506 kubelet[3281]: W0819 08:16:29.857465 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:29.857506 kubelet[3281]: E0819 08:16:29.857486 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:29.858159 kubelet[3281]: E0819 08:16:29.858094 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:29.858159 kubelet[3281]: W0819 08:16:29.858107 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:29.858159 kubelet[3281]: E0819 08:16:29.858135 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:29.858458 kubelet[3281]: E0819 08:16:29.858300 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:29.858458 kubelet[3281]: W0819 08:16:29.858306 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:29.858458 kubelet[3281]: E0819 08:16:29.858320 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:16:29.859271 kubelet[3281]: E0819 08:16:29.859194 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:29.859271 kubelet[3281]: W0819 08:16:29.859207 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:29.859271 kubelet[3281]: E0819 08:16:29.859234 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:29.859623 kubelet[3281]: E0819 08:16:29.859366 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:29.859623 kubelet[3281]: W0819 08:16:29.859372 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:29.859623 kubelet[3281]: E0819 08:16:29.859558 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:29.859854 kubelet[3281]: E0819 08:16:29.859684 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:29.859854 kubelet[3281]: W0819 08:16:29.859692 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:29.859854 kubelet[3281]: E0819 08:16:29.859708 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:29.860227 kubelet[3281]: E0819 08:16:29.860201 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:29.860227 kubelet[3281]: W0819 08:16:29.860217 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:29.860288 kubelet[3281]: E0819 08:16:29.860232 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:29.860608 kubelet[3281]: E0819 08:16:29.860594 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:29.860608 kubelet[3281]: W0819 08:16:29.860606 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:29.860796 kubelet[3281]: E0819 08:16:29.860675 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:16:29.861571 kubelet[3281]: E0819 08:16:29.861553 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:29.861571 kubelet[3281]: W0819 08:16:29.861569 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:29.861663 kubelet[3281]: E0819 08:16:29.861584 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:29.862063 kubelet[3281]: E0819 08:16:29.861748 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:29.862063 kubelet[3281]: W0819 08:16:29.861757 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:29.862063 kubelet[3281]: E0819 08:16:29.861765 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:29.863002 kubelet[3281]: E0819 08:16:29.862981 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:29.863002 kubelet[3281]: W0819 08:16:29.862997 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:29.863104 kubelet[3281]: E0819 08:16:29.863015 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:29.863917 kubelet[3281]: E0819 08:16:29.863897 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:29.863917 kubelet[3281]: W0819 08:16:29.863911 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:29.864202 kubelet[3281]: E0819 08:16:29.863923 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:16:30.612970 kubelet[3281]: E0819 08:16:30.612499 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fhzgx" podUID="fd28c4a4-f3f6-4029-ab76-882058ba96b8" Aug 19 08:16:30.721751 containerd[2014]: time="2025-08-19T08:16:30.721703395Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:30.727119 containerd[2014]: time="2025-08-19T08:16:30.725780973Z" level=info msg="ImageCreate event name:\"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:30.727119 containerd[2014]: time="2025-08-19T08:16:30.726262646Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2: active requests=0, bytes read=4446956" Aug 19 08:16:30.730259 containerd[2014]: time="2025-08-19T08:16:30.730194195Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:30.732752 containerd[2014]: time="2025-08-19T08:16:30.732711805Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" with image id \"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\", size \"5939619\" in 1.850245705s" Aug 19 08:16:30.732914 containerd[2014]: time="2025-08-19T08:16:30.732896542Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" returns image reference \"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\"" Aug 19 08:16:30.742585 containerd[2014]: time="2025-08-19T08:16:30.742539823Z" level=info msg="CreateContainer within sandbox \"851940023145abdd1eea3b2b1310cf6732fa90b7fbf9cc28f2cb59b088f46212\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Aug 19 08:16:30.756363 containerd[2014]: time="2025-08-19T08:16:30.756230041Z" level=info msg="Container e965f27265a6e248641503d6c68424c0667c566a6072a773ee9435edcf953ce9: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:16:30.773171 containerd[2014]: time="2025-08-19T08:16:30.773088915Z" level=info msg="CreateContainer within sandbox \"851940023145abdd1eea3b2b1310cf6732fa90b7fbf9cc28f2cb59b088f46212\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"e965f27265a6e248641503d6c68424c0667c566a6072a773ee9435edcf953ce9\"" Aug 19 08:16:30.774394 containerd[2014]: time="2025-08-19T08:16:30.774349811Z" level=info msg="StartContainer for \"e965f27265a6e248641503d6c68424c0667c566a6072a773ee9435edcf953ce9\"" Aug 19 08:16:30.777088 containerd[2014]: time="2025-08-19T08:16:30.776964909Z" level=info msg="connecting to shim e965f27265a6e248641503d6c68424c0667c566a6072a773ee9435edcf953ce9" address="unix:///run/containerd/s/be324eefb339ce210d8b4490ab67d8fdb4d1b740f0bea24e787ceab8ae2dfe7d" protocol=ttrpc version=3 Aug 19 08:16:30.782608 kubelet[3281]: I0819 08:16:30.782581 3281 prober_manager.go:312] "Failed to 
trigger a manual run" probe="Readiness" Aug 19 08:16:30.799477 systemd[1]: Started cri-containerd-e965f27265a6e248641503d6c68424c0667c566a6072a773ee9435edcf953ce9.scope - libcontainer container e965f27265a6e248641503d6c68424c0667c566a6072a773ee9435edcf953ce9. Aug 19 08:16:30.837121 kubelet[3281]: E0819 08:16:30.837073 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:30.837121 kubelet[3281]: W0819 08:16:30.837098 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:30.837547 kubelet[3281]: E0819 08:16:30.837280 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:30.838595 kubelet[3281]: E0819 08:16:30.838553 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:30.838595 kubelet[3281]: W0819 08:16:30.838571 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:30.838595 kubelet[3281]: E0819 08:16:30.838587 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:30.839625 kubelet[3281]: E0819 08:16:30.839603 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:30.840450 kubelet[3281]: W0819 08:16:30.839718 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:30.840450 kubelet[3281]: E0819 08:16:30.839735 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:30.840450 kubelet[3281]: E0819 08:16:30.840117 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:30.840450 kubelet[3281]: W0819 08:16:30.840127 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:30.840957 kubelet[3281]: E0819 08:16:30.840467 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:16:30.840957 kubelet[3281]: E0819 08:16:30.840660 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:30.840957 kubelet[3281]: W0819 08:16:30.840667 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:30.840957 kubelet[3281]: E0819 08:16:30.840677 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:30.841848 kubelet[3281]: E0819 08:16:30.841831 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:30.842209 kubelet[3281]: W0819 08:16:30.841945 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:30.842209 kubelet[3281]: E0819 08:16:30.841961 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:30.842641 kubelet[3281]: E0819 08:16:30.842529 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:30.842641 kubelet[3281]: W0819 08:16:30.842541 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:30.842641 kubelet[3281]: E0819 08:16:30.842551 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:30.842916 kubelet[3281]: E0819 08:16:30.842867 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:30.842916 kubelet[3281]: W0819 08:16:30.842876 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:30.842916 kubelet[3281]: E0819 08:16:30.842886 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:30.843698 kubelet[3281]: E0819 08:16:30.843644 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:30.843698 kubelet[3281]: W0819 08:16:30.843663 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:30.843698 kubelet[3281]: E0819 08:16:30.843675 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:16:30.843904 kubelet[3281]: E0819 08:16:30.843865 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:30.843904 kubelet[3281]: W0819 08:16:30.843873 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:30.843904 kubelet[3281]: E0819 08:16:30.843882 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:30.844139 kubelet[3281]: E0819 08:16:30.844038 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:30.844139 kubelet[3281]: W0819 08:16:30.844044 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:30.844139 kubelet[3281]: E0819 08:16:30.844051 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:30.845783 kubelet[3281]: E0819 08:16:30.844196 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:30.845783 kubelet[3281]: W0819 08:16:30.844202 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:30.845783 kubelet[3281]: E0819 08:16:30.844208 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:30.845783 kubelet[3281]: E0819 08:16:30.844375 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:30.845783 kubelet[3281]: W0819 08:16:30.844387 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:30.845783 kubelet[3281]: E0819 08:16:30.844402 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:30.845783 kubelet[3281]: E0819 08:16:30.844596 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:30.845783 kubelet[3281]: W0819 08:16:30.844606 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:30.845783 kubelet[3281]: E0819 08:16:30.844615 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:16:30.845783 kubelet[3281]: E0819 08:16:30.844837 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:30.846989 kubelet[3281]: W0819 08:16:30.844844 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:30.846989 kubelet[3281]: E0819 08:16:30.844853 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:30.866086 kubelet[3281]: E0819 08:16:30.863566 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:30.866086 kubelet[3281]: W0819 08:16:30.863588 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:30.866086 kubelet[3281]: E0819 08:16:30.863609 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:30.866086 kubelet[3281]: E0819 08:16:30.863842 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:30.866086 kubelet[3281]: W0819 08:16:30.863866 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:30.866086 kubelet[3281]: E0819 08:16:30.863886 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:30.866086 kubelet[3281]: E0819 08:16:30.864115 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:30.866086 kubelet[3281]: W0819 08:16:30.864124 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:30.866086 kubelet[3281]: E0819 08:16:30.864143 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:30.866086 kubelet[3281]: E0819 08:16:30.864372 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:30.866478 kubelet[3281]: W0819 08:16:30.864379 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:30.866478 kubelet[3281]: E0819 08:16:30.864394 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:16:30.866478 kubelet[3281]: E0819 08:16:30.864592 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:30.866478 kubelet[3281]: W0819 08:16:30.864604 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:30.866478 kubelet[3281]: E0819 08:16:30.864624 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:30.866478 kubelet[3281]: E0819 08:16:30.864817 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:30.866478 kubelet[3281]: W0819 08:16:30.864825 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:30.866478 kubelet[3281]: E0819 08:16:30.864842 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:30.866478 kubelet[3281]: E0819 08:16:30.865013 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:30.866478 kubelet[3281]: W0819 08:16:30.865021 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:30.868088 kubelet[3281]: E0819 08:16:30.865050 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:30.868088 kubelet[3281]: E0819 08:16:30.866857 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:30.868088 kubelet[3281]: W0819 08:16:30.866872 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:30.868088 kubelet[3281]: E0819 08:16:30.867169 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:30.868224 kubelet[3281]: E0819 08:16:30.868207 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:30.868224 kubelet[3281]: W0819 08:16:30.868218 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:30.868381 kubelet[3281]: E0819 08:16:30.868312 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:16:30.868415 kubelet[3281]: E0819 08:16:30.868390 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:30.868415 kubelet[3281]: W0819 08:16:30.868395 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:30.868480 kubelet[3281]: E0819 08:16:30.868469 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:30.868599 kubelet[3281]: E0819 08:16:30.868585 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:30.868599 kubelet[3281]: W0819 08:16:30.868596 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:30.868675 kubelet[3281]: E0819 08:16:30.868614 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:30.868807 kubelet[3281]: E0819 08:16:30.868793 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:30.868807 kubelet[3281]: W0819 08:16:30.868802 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:30.868884 kubelet[3281]: E0819 08:16:30.868821 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:30.869563 kubelet[3281]: E0819 08:16:30.869547 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:30.869758 kubelet[3281]: W0819 08:16:30.869611 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:30.869758 kubelet[3281]: E0819 08:16:30.869633 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:30.871106 kubelet[3281]: E0819 08:16:30.871092 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:30.872490 kubelet[3281]: W0819 08:16:30.871181 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:30.872490 kubelet[3281]: E0819 08:16:30.871214 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:16:30.873851 kubelet[3281]: E0819 08:16:30.872843 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:30.873851 kubelet[3281]: W0819 08:16:30.872856 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:30.873851 kubelet[3281]: E0819 08:16:30.872892 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:30.873851 kubelet[3281]: E0819 08:16:30.873118 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:30.873851 kubelet[3281]: W0819 08:16:30.873126 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:30.873851 kubelet[3281]: E0819 08:16:30.873210 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:30.873851 kubelet[3281]: E0819 08:16:30.873443 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:30.873851 kubelet[3281]: W0819 08:16:30.873451 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:30.873851 kubelet[3281]: E0819 08:16:30.873460 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:30.873851 kubelet[3281]: E0819 08:16:30.873686 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:16:30.874167 kubelet[3281]: W0819 08:16:30.873694 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:16:30.874167 kubelet[3281]: E0819 08:16:30.873702 3281 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:16:30.877530 systemd[1]: cri-containerd-e965f27265a6e248641503d6c68424c0667c566a6072a773ee9435edcf953ce9.scope: Deactivated successfully. 
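The block of repeated failures above is kubelet's FlexVolume prober retrying the driver in the nodeagent~uds plugin directory: each probe runs /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds init, the binary is missing ("executable file not found in $PATH"), so the call produces no output, and decoding empty output with Go's encoding/json yields exactly "unexpected end of JSON input". The same three entries then recur with only their timestamps changing, one triplet per probe attempt. A minimal sketch of that call pattern, for illustration only (the status struct below is a stand-in, not kubelet's actual type):

    package main

    import (
        "encoding/json"
        "fmt"
        "os/exec"
    )

    // driverStatus is a minimal stand-in for the JSON a FlexVolume driver is
    // expected to print on "init"; it is not kubelet's actual type.
    type driverStatus struct {
        Status  string `json:"status"`
        Message string `json:"message,omitempty"`
    }

    func callDriver(driver string, args ...string) (*driverStatus, error) {
        out, execErr := exec.Command(driver, args...).CombinedOutput()
        if execErr != nil {
            // Mirrors the W "driver call failed" lines: the binary is simply not there.
            fmt.Printf("driver call failed: executable: %s, args: %v, error: %v, output: %q\n",
                driver, args, execErr, out)
        }
        var st driverStatus
        if err := json.Unmarshal(out, &st); err != nil {
            // With empty output this is exactly "unexpected end of JSON input".
            return nil, fmt.Errorf("failed to unmarshal output for command: %v, output: %q, error: %v",
                args, out, err)
        }
        return &st, nil
    }

    func main() {
        _, err := callDriver("/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds", "init")
        fmt.Println(err)
    }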
Aug 19 08:16:30.882293 containerd[2014]: time="2025-08-19T08:16:30.882254838Z" level=info msg="StartContainer for \"e965f27265a6e248641503d6c68424c0667c566a6072a773ee9435edcf953ce9\" returns successfully" Aug 19 08:16:30.921620 containerd[2014]: time="2025-08-19T08:16:30.921524619Z" level=info msg="received exit event container_id:\"e965f27265a6e248641503d6c68424c0667c566a6072a773ee9435edcf953ce9\" id:\"e965f27265a6e248641503d6c68424c0667c566a6072a773ee9435edcf953ce9\" pid:4220 exited_at:{seconds:1755591390 nanos:883893208}" Aug 19 08:16:30.956660 containerd[2014]: time="2025-08-19T08:16:30.956545402Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e965f27265a6e248641503d6c68424c0667c566a6072a773ee9435edcf953ce9\" id:\"e965f27265a6e248641503d6c68424c0667c566a6072a773ee9435edcf953ce9\" pid:4220 exited_at:{seconds:1755591390 nanos:883893208}" Aug 19 08:16:30.975530 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e965f27265a6e248641503d6c68424c0667c566a6072a773ee9435edcf953ce9-rootfs.mount: Deactivated successfully. Aug 19 08:16:31.787528 containerd[2014]: time="2025-08-19T08:16:31.787352750Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\"" Aug 19 08:16:31.805288 kubelet[3281]: I0819 08:16:31.804552 3281 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-f4459b4c7-xxlqp" podStartSLOduration=3.841508826 podStartE2EDuration="7.804533751s" podCreationTimestamp="2025-08-19 08:16:24 +0000 UTC" firstStartedPulling="2025-08-19 08:16:24.917973745 +0000 UTC m=+24.468810071" lastFinishedPulling="2025-08-19 08:16:28.880998676 +0000 UTC m=+28.431834996" observedRunningTime="2025-08-19 08:16:29.765667338 +0000 UTC m=+29.316503676" watchObservedRunningTime="2025-08-19 08:16:31.804533751 +0000 UTC m=+31.355370083" Aug 19 08:16:32.614172 kubelet[3281]: E0819 08:16:32.614136 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fhzgx" podUID="fd28c4a4-f3f6-4029-ab76-882058ba96b8" Aug 19 08:16:34.615867 kubelet[3281]: E0819 08:16:34.615805 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fhzgx" podUID="fd28c4a4-f3f6-4029-ab76-882058ba96b8" Aug 19 08:16:35.667813 containerd[2014]: time="2025-08-19T08:16:35.667746928Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:35.669916 containerd[2014]: time="2025-08-19T08:16:35.669633979Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.2: active requests=0, bytes read=70436221" Aug 19 08:16:35.672190 containerd[2014]: time="2025-08-19T08:16:35.672082460Z" level=info msg="ImageCreate event name:\"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:35.676285 containerd[2014]: time="2025-08-19T08:16:35.675992022Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 
08:16:35.676955 containerd[2014]: time="2025-08-19T08:16:35.676917736Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.2\" with image id \"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\", size \"71928924\" in 3.889448276s" Aug 19 08:16:35.676955 containerd[2014]: time="2025-08-19T08:16:35.676956451Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\" returns image reference \"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\"" Aug 19 08:16:35.681146 containerd[2014]: time="2025-08-19T08:16:35.680968280Z" level=info msg="CreateContainer within sandbox \"851940023145abdd1eea3b2b1310cf6732fa90b7fbf9cc28f2cb59b088f46212\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Aug 19 08:16:35.705443 containerd[2014]: time="2025-08-19T08:16:35.703252933Z" level=info msg="Container 074f5ef8595bd5532b29acbd2d1d45a12cfeae0a453a36bf08a3a190edaba1b2: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:16:35.708050 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1945446007.mount: Deactivated successfully. Aug 19 08:16:35.725571 containerd[2014]: time="2025-08-19T08:16:35.725526185Z" level=info msg="CreateContainer within sandbox \"851940023145abdd1eea3b2b1310cf6732fa90b7fbf9cc28f2cb59b088f46212\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"074f5ef8595bd5532b29acbd2d1d45a12cfeae0a453a36bf08a3a190edaba1b2\"" Aug 19 08:16:35.726504 containerd[2014]: time="2025-08-19T08:16:35.726244988Z" level=info msg="StartContainer for \"074f5ef8595bd5532b29acbd2d1d45a12cfeae0a453a36bf08a3a190edaba1b2\"" Aug 19 08:16:35.728475 containerd[2014]: time="2025-08-19T08:16:35.728447037Z" level=info msg="connecting to shim 074f5ef8595bd5532b29acbd2d1d45a12cfeae0a453a36bf08a3a190edaba1b2" address="unix:///run/containerd/s/be324eefb339ce210d8b4490ab67d8fdb4d1b740f0bea24e787ceab8ae2dfe7d" protocol=ttrpc version=3 Aug 19 08:16:35.763277 systemd[1]: Started cri-containerd-074f5ef8595bd5532b29acbd2d1d45a12cfeae0a453a36bf08a3a190edaba1b2.scope - libcontainer container 074f5ef8595bd5532b29acbd2d1d45a12cfeae0a453a36bf08a3a190edaba1b2. Aug 19 08:16:35.815471 containerd[2014]: time="2025-08-19T08:16:35.815433023Z" level=info msg="StartContainer for \"074f5ef8595bd5532b29acbd2d1d45a12cfeae0a453a36bf08a3a190edaba1b2\" returns successfully" Aug 19 08:16:36.465318 systemd[1]: cri-containerd-074f5ef8595bd5532b29acbd2d1d45a12cfeae0a453a36bf08a3a190edaba1b2.scope: Deactivated successfully. Aug 19 08:16:36.465681 systemd[1]: cri-containerd-074f5ef8595bd5532b29acbd2d1d45a12cfeae0a453a36bf08a3a190edaba1b2.scope: Consumed 505ms CPU time, 162.5M memory peak, 4.2M read from disk, 171.2M written to disk. 
Aug 19 08:16:36.486166 containerd[2014]: time="2025-08-19T08:16:36.486056921Z" level=info msg="received exit event container_id:\"074f5ef8595bd5532b29acbd2d1d45a12cfeae0a453a36bf08a3a190edaba1b2\" id:\"074f5ef8595bd5532b29acbd2d1d45a12cfeae0a453a36bf08a3a190edaba1b2\" pid:4313 exited_at:{seconds:1755591396 nanos:485787965}" Aug 19 08:16:36.490050 containerd[2014]: time="2025-08-19T08:16:36.489611020Z" level=info msg="TaskExit event in podsandbox handler container_id:\"074f5ef8595bd5532b29acbd2d1d45a12cfeae0a453a36bf08a3a190edaba1b2\" id:\"074f5ef8595bd5532b29acbd2d1d45a12cfeae0a453a36bf08a3a190edaba1b2\" pid:4313 exited_at:{seconds:1755591396 nanos:485787965}" Aug 19 08:16:36.550556 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-074f5ef8595bd5532b29acbd2d1d45a12cfeae0a453a36bf08a3a190edaba1b2-rootfs.mount: Deactivated successfully. Aug 19 08:16:36.557055 kubelet[3281]: I0819 08:16:36.557013 3281 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Aug 19 08:16:36.621003 systemd[1]: Created slice kubepods-besteffort-podfd28c4a4_f3f6_4029_ab76_882058ba96b8.slice - libcontainer container kubepods-besteffort-podfd28c4a4_f3f6_4029_ab76_882058ba96b8.slice. Aug 19 08:16:36.638221 containerd[2014]: time="2025-08-19T08:16:36.638156569Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fhzgx,Uid:fd28c4a4-f3f6-4029-ab76-882058ba96b8,Namespace:calico-system,Attempt:0,}" Aug 19 08:16:36.868910 systemd[1]: Created slice kubepods-burstable-pod407658c8_3f54_45e6_834a_4675e288a13c.slice - libcontainer container kubepods-burstable-pod407658c8_3f54_45e6_834a_4675e288a13c.slice. Aug 19 08:16:36.875292 systemd[1]: Created slice kubepods-besteffort-pod234924fd_06d9_4c69_a752_1aa0b5b48653.slice - libcontainer container kubepods-besteffort-pod234924fd_06d9_4c69_a752_1aa0b5b48653.slice. Aug 19 08:16:36.886542 systemd[1]: Created slice kubepods-besteffort-podd00cdf73_4cda_404a_bf73_b97dabb3c18b.slice - libcontainer container kubepods-besteffort-podd00cdf73_4cda_404a_bf73_b97dabb3c18b.slice. Aug 19 08:16:36.896445 systemd[1]: Created slice kubepods-besteffort-podc131fa7d_d233_47a1_b711_976a55e028ab.slice - libcontainer container kubepods-besteffort-podc131fa7d_d233_47a1_b711_976a55e028ab.slice. Aug 19 08:16:36.904892 systemd[1]: Created slice kubepods-burstable-pod48eab14c_74c0_4682_912d_dd41c3aaf383.slice - libcontainer container kubepods-burstable-pod48eab14c_74c0_4682_912d_dd41c3aaf383.slice. Aug 19 08:16:36.912242 systemd[1]: Created slice kubepods-besteffort-pode0e4745e_1317_4dcd_9549_1d73509f7275.slice - libcontainer container kubepods-besteffort-pode0e4745e_1317_4dcd_9549_1d73509f7275.slice. Aug 19 08:16:36.932396 systemd[1]: Created slice kubepods-besteffort-pod97b6de00_458f_491a_a402_be43782ea6a4.slice - libcontainer container kubepods-besteffort-pod97b6de00_458f_491a_a402_be43782ea6a4.slice. 
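The exited_at payload in the two containerd events above (seconds:1755591396 nanos:485787965) is a plain Unix epoch timestamp; converted, it is 2025-08-19T08:16:36.485787965Z, a few hundred microseconds before the wall-clock time at which the exit event was logged. A quick conversion using only the standard library:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // exited_at from the TaskExit event above: seconds:1755591396 nanos:485787965
        t := time.Unix(1755591396, 485787965).UTC()
        fmt.Println(t.Format(time.RFC3339Nano)) // 2025-08-19T08:16:36.485787965Z
    }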
Aug 19 08:16:36.961450 kubelet[3281]: I0819 08:16:36.961194 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/48eab14c-74c0-4682-912d-dd41c3aaf383-config-volume\") pod \"coredns-7c65d6cfc9-5dnf2\" (UID: \"48eab14c-74c0-4682-912d-dd41c3aaf383\") " pod="kube-system/coredns-7c65d6cfc9-5dnf2" Aug 19 08:16:36.961450 kubelet[3281]: I0819 08:16:36.961263 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm8wn\" (UniqueName: \"kubernetes.io/projected/48eab14c-74c0-4682-912d-dd41c3aaf383-kube-api-access-wm8wn\") pod \"coredns-7c65d6cfc9-5dnf2\" (UID: \"48eab14c-74c0-4682-912d-dd41c3aaf383\") " pod="kube-system/coredns-7c65d6cfc9-5dnf2" Aug 19 08:16:36.961450 kubelet[3281]: I0819 08:16:36.961283 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2msgx\" (UniqueName: \"kubernetes.io/projected/d00cdf73-4cda-404a-bf73-b97dabb3c18b-kube-api-access-2msgx\") pod \"goldmane-58fd7646b9-s59gg\" (UID: \"d00cdf73-4cda-404a-bf73-b97dabb3c18b\") " pod="calico-system/goldmane-58fd7646b9-s59gg" Aug 19 08:16:36.961450 kubelet[3281]: I0819 08:16:36.961300 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr8xj\" (UniqueName: \"kubernetes.io/projected/c131fa7d-d233-47a1-b711-976a55e028ab-kube-api-access-rr8xj\") pod \"whisker-59fcc5fc77-frlsc\" (UID: \"c131fa7d-d233-47a1-b711-976a55e028ab\") " pod="calico-system/whisker-59fcc5fc77-frlsc" Aug 19 08:16:36.963583 kubelet[3281]: I0819 08:16:36.962047 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d00cdf73-4cda-404a-bf73-b97dabb3c18b-config\") pod \"goldmane-58fd7646b9-s59gg\" (UID: \"d00cdf73-4cda-404a-bf73-b97dabb3c18b\") " pod="calico-system/goldmane-58fd7646b9-s59gg" Aug 19 08:16:36.963583 kubelet[3281]: I0819 08:16:36.962083 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/234924fd-06d9-4c69-a752-1aa0b5b48653-calico-apiserver-certs\") pod \"calico-apiserver-f4577c465-x44gs\" (UID: \"234924fd-06d9-4c69-a752-1aa0b5b48653\") " pod="calico-apiserver/calico-apiserver-f4577c465-x44gs" Aug 19 08:16:36.963583 kubelet[3281]: I0819 08:16:36.962100 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb8bh\" (UniqueName: \"kubernetes.io/projected/407658c8-3f54-45e6-834a-4675e288a13c-kube-api-access-tb8bh\") pod \"coredns-7c65d6cfc9-8hp9n\" (UID: \"407658c8-3f54-45e6-834a-4675e288a13c\") " pod="kube-system/coredns-7c65d6cfc9-8hp9n" Aug 19 08:16:36.963583 kubelet[3281]: I0819 08:16:36.962126 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d00cdf73-4cda-404a-bf73-b97dabb3c18b-goldmane-ca-bundle\") pod \"goldmane-58fd7646b9-s59gg\" (UID: \"d00cdf73-4cda-404a-bf73-b97dabb3c18b\") " pod="calico-system/goldmane-58fd7646b9-s59gg" Aug 19 08:16:36.963583 kubelet[3281]: I0819 08:16:36.962147 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/97b6de00-458f-491a-a402-be43782ea6a4-tigera-ca-bundle\") pod \"calico-kube-controllers-85b4b76999-hwqxm\" (UID: \"97b6de00-458f-491a-a402-be43782ea6a4\") " pod="calico-system/calico-kube-controllers-85b4b76999-hwqxm" Aug 19 08:16:36.963781 kubelet[3281]: I0819 08:16:36.962168 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d2h2\" (UniqueName: \"kubernetes.io/projected/97b6de00-458f-491a-a402-be43782ea6a4-kube-api-access-5d2h2\") pod \"calico-kube-controllers-85b4b76999-hwqxm\" (UID: \"97b6de00-458f-491a-a402-be43782ea6a4\") " pod="calico-system/calico-kube-controllers-85b4b76999-hwqxm" Aug 19 08:16:36.963781 kubelet[3281]: I0819 08:16:36.962189 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c131fa7d-d233-47a1-b711-976a55e028ab-whisker-ca-bundle\") pod \"whisker-59fcc5fc77-frlsc\" (UID: \"c131fa7d-d233-47a1-b711-976a55e028ab\") " pod="calico-system/whisker-59fcc5fc77-frlsc" Aug 19 08:16:36.963781 kubelet[3281]: I0819 08:16:36.962207 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9tmw\" (UniqueName: \"kubernetes.io/projected/234924fd-06d9-4c69-a752-1aa0b5b48653-kube-api-access-f9tmw\") pod \"calico-apiserver-f4577c465-x44gs\" (UID: \"234924fd-06d9-4c69-a752-1aa0b5b48653\") " pod="calico-apiserver/calico-apiserver-f4577c465-x44gs" Aug 19 08:16:36.963781 kubelet[3281]: I0819 08:16:36.962224 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/d00cdf73-4cda-404a-bf73-b97dabb3c18b-goldmane-key-pair\") pod \"goldmane-58fd7646b9-s59gg\" (UID: \"d00cdf73-4cda-404a-bf73-b97dabb3c18b\") " pod="calico-system/goldmane-58fd7646b9-s59gg" Aug 19 08:16:36.963781 kubelet[3281]: I0819 08:16:36.962241 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmrd4\" (UniqueName: \"kubernetes.io/projected/e0e4745e-1317-4dcd-9549-1d73509f7275-kube-api-access-gmrd4\") pod \"calico-apiserver-f4577c465-9dqqf\" (UID: \"e0e4745e-1317-4dcd-9549-1d73509f7275\") " pod="calico-apiserver/calico-apiserver-f4577c465-9dqqf" Aug 19 08:16:36.963912 kubelet[3281]: I0819 08:16:36.962263 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/c131fa7d-d233-47a1-b711-976a55e028ab-whisker-backend-key-pair\") pod \"whisker-59fcc5fc77-frlsc\" (UID: \"c131fa7d-d233-47a1-b711-976a55e028ab\") " pod="calico-system/whisker-59fcc5fc77-frlsc" Aug 19 08:16:36.963912 kubelet[3281]: I0819 08:16:36.962280 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/407658c8-3f54-45e6-834a-4675e288a13c-config-volume\") pod \"coredns-7c65d6cfc9-8hp9n\" (UID: \"407658c8-3f54-45e6-834a-4675e288a13c\") " pod="kube-system/coredns-7c65d6cfc9-8hp9n" Aug 19 08:16:36.963912 kubelet[3281]: I0819 08:16:36.962297 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/e0e4745e-1317-4dcd-9549-1d73509f7275-calico-apiserver-certs\") pod \"calico-apiserver-f4577c465-9dqqf\" (UID: 
\"e0e4745e-1317-4dcd-9549-1d73509f7275\") " pod="calico-apiserver/calico-apiserver-f4577c465-9dqqf" Aug 19 08:16:37.175732 containerd[2014]: time="2025-08-19T08:16:37.175691515Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-8hp9n,Uid:407658c8-3f54-45e6-834a-4675e288a13c,Namespace:kube-system,Attempt:0,}" Aug 19 08:16:37.186656 containerd[2014]: time="2025-08-19T08:16:37.186611146Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f4577c465-x44gs,Uid:234924fd-06d9-4c69-a752-1aa0b5b48653,Namespace:calico-apiserver,Attempt:0,}" Aug 19 08:16:37.191796 containerd[2014]: time="2025-08-19T08:16:37.190586585Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-s59gg,Uid:d00cdf73-4cda-404a-bf73-b97dabb3c18b,Namespace:calico-system,Attempt:0,}" Aug 19 08:16:37.203155 containerd[2014]: time="2025-08-19T08:16:37.203121427Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-59fcc5fc77-frlsc,Uid:c131fa7d-d233-47a1-b711-976a55e028ab,Namespace:calico-system,Attempt:0,}" Aug 19 08:16:37.210856 containerd[2014]: time="2025-08-19T08:16:37.210639899Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-5dnf2,Uid:48eab14c-74c0-4682-912d-dd41c3aaf383,Namespace:kube-system,Attempt:0,}" Aug 19 08:16:37.226410 containerd[2014]: time="2025-08-19T08:16:37.226370522Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f4577c465-9dqqf,Uid:e0e4745e-1317-4dcd-9549-1d73509f7275,Namespace:calico-apiserver,Attempt:0,}" Aug 19 08:16:37.257284 containerd[2014]: time="2025-08-19T08:16:37.257217000Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-85b4b76999-hwqxm,Uid:97b6de00-458f-491a-a402-be43782ea6a4,Namespace:calico-system,Attempt:0,}" Aug 19 08:16:37.501489 containerd[2014]: time="2025-08-19T08:16:37.501007213Z" level=error msg="Failed to destroy network for sandbox \"22c9125531b6fe60ff6a0a8a59f53fb8578976130aa1e6fe38e7d1c4a0a32010\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:16:37.509931 containerd[2014]: time="2025-08-19T08:16:37.504749653Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-s59gg,Uid:d00cdf73-4cda-404a-bf73-b97dabb3c18b,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"22c9125531b6fe60ff6a0a8a59f53fb8578976130aa1e6fe38e7d1c4a0a32010\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:16:37.513606 kubelet[3281]: E0819 08:16:37.511964 3281 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"22c9125531b6fe60ff6a0a8a59f53fb8578976130aa1e6fe38e7d1c4a0a32010\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:16:37.513606 kubelet[3281]: E0819 08:16:37.512076 3281 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"22c9125531b6fe60ff6a0a8a59f53fb8578976130aa1e6fe38e7d1c4a0a32010\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-58fd7646b9-s59gg" Aug 19 08:16:37.513606 kubelet[3281]: E0819 08:16:37.512103 3281 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"22c9125531b6fe60ff6a0a8a59f53fb8578976130aa1e6fe38e7d1c4a0a32010\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-58fd7646b9-s59gg" Aug 19 08:16:37.513871 kubelet[3281]: E0819 08:16:37.513200 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-58fd7646b9-s59gg_calico-system(d00cdf73-4cda-404a-bf73-b97dabb3c18b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-58fd7646b9-s59gg_calico-system(d00cdf73-4cda-404a-bf73-b97dabb3c18b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"22c9125531b6fe60ff6a0a8a59f53fb8578976130aa1e6fe38e7d1c4a0a32010\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-58fd7646b9-s59gg" podUID="d00cdf73-4cda-404a-bf73-b97dabb3c18b" Aug 19 08:16:37.578429 containerd[2014]: time="2025-08-19T08:16:37.578364992Z" level=error msg="Failed to destroy network for sandbox \"8ba6999c4a0dc9e6f31c9e7b27e85c4c31e9c9a9446464018f8cc5ad74f4a42a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:16:37.583749 containerd[2014]: time="2025-08-19T08:16:37.583680058Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-8hp9n,Uid:407658c8-3f54-45e6-834a-4675e288a13c,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8ba6999c4a0dc9e6f31c9e7b27e85c4c31e9c9a9446464018f8cc5ad74f4a42a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:16:37.584476 kubelet[3281]: E0819 08:16:37.583974 3281 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8ba6999c4a0dc9e6f31c9e7b27e85c4c31e9c9a9446464018f8cc5ad74f4a42a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:16:37.584476 kubelet[3281]: E0819 08:16:37.584068 3281 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8ba6999c4a0dc9e6f31c9e7b27e85c4c31e9c9a9446464018f8cc5ad74f4a42a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-8hp9n" Aug 19 08:16:37.584476 kubelet[3281]: E0819 08:16:37.584097 3281 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed 
to setup network for sandbox \"8ba6999c4a0dc9e6f31c9e7b27e85c4c31e9c9a9446464018f8cc5ad74f4a42a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-8hp9n" Aug 19 08:16:37.585392 kubelet[3281]: E0819 08:16:37.584160 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-8hp9n_kube-system(407658c8-3f54-45e6-834a-4675e288a13c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-8hp9n_kube-system(407658c8-3f54-45e6-834a-4675e288a13c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8ba6999c4a0dc9e6f31c9e7b27e85c4c31e9c9a9446464018f8cc5ad74f4a42a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-8hp9n" podUID="407658c8-3f54-45e6-834a-4675e288a13c" Aug 19 08:16:37.605049 containerd[2014]: time="2025-08-19T08:16:37.604977103Z" level=error msg="Failed to destroy network for sandbox \"7b28f5fbeee40fa16a24af296f1270bc6492a9b37370f343a521ed04f0a2fd38\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:16:37.608660 containerd[2014]: time="2025-08-19T08:16:37.608605080Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fhzgx,Uid:fd28c4a4-f3f6-4029-ab76-882058ba96b8,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7b28f5fbeee40fa16a24af296f1270bc6492a9b37370f343a521ed04f0a2fd38\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:16:37.609854 kubelet[3281]: E0819 08:16:37.609811 3281 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7b28f5fbeee40fa16a24af296f1270bc6492a9b37370f343a521ed04f0a2fd38\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:16:37.610266 kubelet[3281]: E0819 08:16:37.610145 3281 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7b28f5fbeee40fa16a24af296f1270bc6492a9b37370f343a521ed04f0a2fd38\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fhzgx" Aug 19 08:16:37.610421 kubelet[3281]: E0819 08:16:37.610400 3281 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7b28f5fbeee40fa16a24af296f1270bc6492a9b37370f343a521ed04f0a2fd38\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fhzgx" Aug 19 08:16:37.610581 kubelet[3281]: E0819 
08:16:37.610552 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-fhzgx_calico-system(fd28c4a4-f3f6-4029-ab76-882058ba96b8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-fhzgx_calico-system(fd28c4a4-f3f6-4029-ab76-882058ba96b8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7b28f5fbeee40fa16a24af296f1270bc6492a9b37370f343a521ed04f0a2fd38\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-fhzgx" podUID="fd28c4a4-f3f6-4029-ab76-882058ba96b8" Aug 19 08:16:37.616236 containerd[2014]: time="2025-08-19T08:16:37.616173826Z" level=error msg="Failed to destroy network for sandbox \"3910d7dffa229200d9ebc49656e497a37dcbf8671faa26a87d957495a272d16b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:16:37.621855 containerd[2014]: time="2025-08-19T08:16:37.621797145Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f4577c465-x44gs,Uid:234924fd-06d9-4c69-a752-1aa0b5b48653,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3910d7dffa229200d9ebc49656e497a37dcbf8671faa26a87d957495a272d16b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:16:37.622599 kubelet[3281]: E0819 08:16:37.622057 3281 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3910d7dffa229200d9ebc49656e497a37dcbf8671faa26a87d957495a272d16b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:16:37.622599 kubelet[3281]: E0819 08:16:37.622122 3281 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3910d7dffa229200d9ebc49656e497a37dcbf8671faa26a87d957495a272d16b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-f4577c465-x44gs" Aug 19 08:16:37.622599 kubelet[3281]: E0819 08:16:37.622150 3281 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3910d7dffa229200d9ebc49656e497a37dcbf8671faa26a87d957495a272d16b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-f4577c465-x44gs" Aug 19 08:16:37.622803 kubelet[3281]: E0819 08:16:37.622200 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-f4577c465-x44gs_calico-apiserver(234924fd-06d9-4c69-a752-1aa0b5b48653)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-apiserver-f4577c465-x44gs_calico-apiserver(234924fd-06d9-4c69-a752-1aa0b5b48653)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3910d7dffa229200d9ebc49656e497a37dcbf8671faa26a87d957495a272d16b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-f4577c465-x44gs" podUID="234924fd-06d9-4c69-a752-1aa0b5b48653" Aug 19 08:16:37.624739 containerd[2014]: time="2025-08-19T08:16:37.624692504Z" level=error msg="Failed to destroy network for sandbox \"e10b5fe6266e505c56a43b61418df6272d53bdc7288104cfa876d22ed052ddfa\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:16:37.626219 containerd[2014]: time="2025-08-19T08:16:37.626173020Z" level=error msg="Failed to destroy network for sandbox \"99a8bc35ad12aa6dfbe33a0e239b36c8a2f442cb1b0435b574951a8ae0401a4d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:16:37.627529 containerd[2014]: time="2025-08-19T08:16:37.627432924Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-59fcc5fc77-frlsc,Uid:c131fa7d-d233-47a1-b711-976a55e028ab,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e10b5fe6266e505c56a43b61418df6272d53bdc7288104cfa876d22ed052ddfa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:16:37.627788 kubelet[3281]: E0819 08:16:37.627718 3281 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e10b5fe6266e505c56a43b61418df6272d53bdc7288104cfa876d22ed052ddfa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:16:37.627876 kubelet[3281]: E0819 08:16:37.627842 3281 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e10b5fe6266e505c56a43b61418df6272d53bdc7288104cfa876d22ed052ddfa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-59fcc5fc77-frlsc" Aug 19 08:16:37.627938 kubelet[3281]: E0819 08:16:37.627890 3281 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e10b5fe6266e505c56a43b61418df6272d53bdc7288104cfa876d22ed052ddfa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-59fcc5fc77-frlsc" Aug 19 08:16:37.628188 kubelet[3281]: E0819 08:16:37.628001 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-59fcc5fc77-frlsc_calico-system(c131fa7d-d233-47a1-b711-976a55e028ab)\" 
with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-59fcc5fc77-frlsc_calico-system(c131fa7d-d233-47a1-b711-976a55e028ab)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e10b5fe6266e505c56a43b61418df6272d53bdc7288104cfa876d22ed052ddfa\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-59fcc5fc77-frlsc" podUID="c131fa7d-d233-47a1-b711-976a55e028ab" Aug 19 08:16:37.630453 containerd[2014]: time="2025-08-19T08:16:37.630399451Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-85b4b76999-hwqxm,Uid:97b6de00-458f-491a-a402-be43782ea6a4,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"99a8bc35ad12aa6dfbe33a0e239b36c8a2f442cb1b0435b574951a8ae0401a4d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:16:37.631276 kubelet[3281]: E0819 08:16:37.631084 3281 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"99a8bc35ad12aa6dfbe33a0e239b36c8a2f442cb1b0435b574951a8ae0401a4d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:16:37.631276 kubelet[3281]: E0819 08:16:37.631146 3281 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"99a8bc35ad12aa6dfbe33a0e239b36c8a2f442cb1b0435b574951a8ae0401a4d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-85b4b76999-hwqxm" Aug 19 08:16:37.631276 kubelet[3281]: E0819 08:16:37.631176 3281 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"99a8bc35ad12aa6dfbe33a0e239b36c8a2f442cb1b0435b574951a8ae0401a4d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-85b4b76999-hwqxm" Aug 19 08:16:37.631476 kubelet[3281]: E0819 08:16:37.631229 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-85b4b76999-hwqxm_calico-system(97b6de00-458f-491a-a402-be43782ea6a4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-85b4b76999-hwqxm_calico-system(97b6de00-458f-491a-a402-be43782ea6a4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"99a8bc35ad12aa6dfbe33a0e239b36c8a2f442cb1b0435b574951a8ae0401a4d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-85b4b76999-hwqxm" podUID="97b6de00-458f-491a-a402-be43782ea6a4" Aug 19 08:16:37.645780 containerd[2014]: time="2025-08-19T08:16:37.645635844Z" 
level=error msg="Failed to destroy network for sandbox \"a6127f3c9af0cfd953a0fe953e645fd3754f0f089a94a6bd72181ed56e2b907e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:16:37.648469 containerd[2014]: time="2025-08-19T08:16:37.648392620Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-5dnf2,Uid:48eab14c-74c0-4682-912d-dd41c3aaf383,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a6127f3c9af0cfd953a0fe953e645fd3754f0f089a94a6bd72181ed56e2b907e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:16:37.649164 kubelet[3281]: E0819 08:16:37.649090 3281 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a6127f3c9af0cfd953a0fe953e645fd3754f0f089a94a6bd72181ed56e2b907e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:16:37.649662 kubelet[3281]: E0819 08:16:37.649159 3281 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a6127f3c9af0cfd953a0fe953e645fd3754f0f089a94a6bd72181ed56e2b907e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-5dnf2" Aug 19 08:16:37.649662 kubelet[3281]: E0819 08:16:37.649189 3281 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a6127f3c9af0cfd953a0fe953e645fd3754f0f089a94a6bd72181ed56e2b907e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-5dnf2" Aug 19 08:16:37.650084 kubelet[3281]: E0819 08:16:37.649238 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-5dnf2_kube-system(48eab14c-74c0-4682-912d-dd41c3aaf383)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-5dnf2_kube-system(48eab14c-74c0-4682-912d-dd41c3aaf383)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a6127f3c9af0cfd953a0fe953e645fd3754f0f089a94a6bd72181ed56e2b907e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-5dnf2" podUID="48eab14c-74c0-4682-912d-dd41c3aaf383" Aug 19 08:16:37.659978 containerd[2014]: time="2025-08-19T08:16:37.659930872Z" level=error msg="Failed to destroy network for sandbox \"4cabf396cee65abfa823eecce50b5c34447c016a0d3934097d5aa90e8cabe32e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:16:37.662351 containerd[2014]: 
time="2025-08-19T08:16:37.662267138Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f4577c465-9dqqf,Uid:e0e4745e-1317-4dcd-9549-1d73509f7275,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4cabf396cee65abfa823eecce50b5c34447c016a0d3934097d5aa90e8cabe32e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:16:37.662611 kubelet[3281]: E0819 08:16:37.662552 3281 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4cabf396cee65abfa823eecce50b5c34447c016a0d3934097d5aa90e8cabe32e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:16:37.662611 kubelet[3281]: E0819 08:16:37.662606 3281 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4cabf396cee65abfa823eecce50b5c34447c016a0d3934097d5aa90e8cabe32e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-f4577c465-9dqqf" Aug 19 08:16:37.662690 kubelet[3281]: E0819 08:16:37.662625 3281 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4cabf396cee65abfa823eecce50b5c34447c016a0d3934097d5aa90e8cabe32e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-f4577c465-9dqqf" Aug 19 08:16:37.662690 kubelet[3281]: E0819 08:16:37.662669 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-f4577c465-9dqqf_calico-apiserver(e0e4745e-1317-4dcd-9549-1d73509f7275)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-f4577c465-9dqqf_calico-apiserver(e0e4745e-1317-4dcd-9549-1d73509f7275)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4cabf396cee65abfa823eecce50b5c34447c016a0d3934097d5aa90e8cabe32e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-f4577c465-9dqqf" podUID="e0e4745e-1317-4dcd-9549-1d73509f7275" Aug 19 08:16:37.868966 containerd[2014]: time="2025-08-19T08:16:37.867793635Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\"" Aug 19 08:16:38.086870 systemd[1]: run-netns-cni\x2d2e948d18\x2deda1\x2d3b2f\x2d7bc5\x2deb106bef74f3.mount: Deactivated successfully. Aug 19 08:16:45.604159 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4159032764.mount: Deactivated successfully. 
Aug 19 08:16:45.708125 containerd[2014]: time="2025-08-19T08:16:45.707239395Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:45.730156 containerd[2014]: time="2025-08-19T08:16:45.730101107Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.2: active requests=0, bytes read=158500163" Aug 19 08:16:45.735474 containerd[2014]: time="2025-08-19T08:16:45.735393398Z" level=info msg="ImageCreate event name:\"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:45.739459 containerd[2014]: time="2025-08-19T08:16:45.739387251Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:45.742471 containerd[2014]: time="2025-08-19T08:16:45.742419260Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.2\" with image id \"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\", size \"158500025\" in 7.872116907s" Aug 19 08:16:45.742471 containerd[2014]: time="2025-08-19T08:16:45.742460674Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\" returns image reference \"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\"" Aug 19 08:16:45.771969 containerd[2014]: time="2025-08-19T08:16:45.771919896Z" level=info msg="CreateContainer within sandbox \"851940023145abdd1eea3b2b1310cf6732fa90b7fbf9cc28f2cb59b088f46212\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Aug 19 08:16:45.827072 containerd[2014]: time="2025-08-19T08:16:45.826712252Z" level=info msg="Container bcef3581e4349e4b398004aa3d65afa5781cdbd287f2de77310d27e9a01fca28: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:16:45.831194 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount894840937.mount: Deactivated successfully. Aug 19 08:16:45.939711 containerd[2014]: time="2025-08-19T08:16:45.939650181Z" level=info msg="CreateContainer within sandbox \"851940023145abdd1eea3b2b1310cf6732fa90b7fbf9cc28f2cb59b088f46212\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"bcef3581e4349e4b398004aa3d65afa5781cdbd287f2de77310d27e9a01fca28\"" Aug 19 08:16:45.940373 containerd[2014]: time="2025-08-19T08:16:45.940337644Z" level=info msg="StartContainer for \"bcef3581e4349e4b398004aa3d65afa5781cdbd287f2de77310d27e9a01fca28\"" Aug 19 08:16:45.954191 containerd[2014]: time="2025-08-19T08:16:45.954141810Z" level=info msg="connecting to shim bcef3581e4349e4b398004aa3d65afa5781cdbd287f2de77310d27e9a01fca28" address="unix:///run/containerd/s/be324eefb339ce210d8b4490ab67d8fdb4d1b740f0bea24e787ceab8ae2dfe7d" protocol=ttrpc version=3 Aug 19 08:16:46.118389 systemd[1]: Started cri-containerd-bcef3581e4349e4b398004aa3d65afa5781cdbd287f2de77310d27e9a01fca28.scope - libcontainer container bcef3581e4349e4b398004aa3d65afa5781cdbd287f2de77310d27e9a01fca28. Aug 19 08:16:46.181355 containerd[2014]: time="2025-08-19T08:16:46.181312970Z" level=info msg="StartContainer for \"bcef3581e4349e4b398004aa3d65afa5781cdbd287f2de77310d27e9a01fca28\" returns successfully" Aug 19 08:16:46.435407 kernel: wireguard: WireGuard 1.0.0 loaded. 
See www.wireguard.com for information. Aug 19 08:16:46.436766 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Aug 19 08:16:46.838886 kubelet[3281]: I0819 08:16:46.838778 3281 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c131fa7d-d233-47a1-b711-976a55e028ab-whisker-ca-bundle\") pod \"c131fa7d-d233-47a1-b711-976a55e028ab\" (UID: \"c131fa7d-d233-47a1-b711-976a55e028ab\") " Aug 19 08:16:46.838886 kubelet[3281]: I0819 08:16:46.838833 3281 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rr8xj\" (UniqueName: \"kubernetes.io/projected/c131fa7d-d233-47a1-b711-976a55e028ab-kube-api-access-rr8xj\") pod \"c131fa7d-d233-47a1-b711-976a55e028ab\" (UID: \"c131fa7d-d233-47a1-b711-976a55e028ab\") " Aug 19 08:16:46.838886 kubelet[3281]: I0819 08:16:46.838857 3281 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/c131fa7d-d233-47a1-b711-976a55e028ab-whisker-backend-key-pair\") pod \"c131fa7d-d233-47a1-b711-976a55e028ab\" (UID: \"c131fa7d-d233-47a1-b711-976a55e028ab\") " Aug 19 08:16:46.840510 kubelet[3281]: I0819 08:16:46.840445 3281 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c131fa7d-d233-47a1-b711-976a55e028ab-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "c131fa7d-d233-47a1-b711-976a55e028ab" (UID: "c131fa7d-d233-47a1-b711-976a55e028ab"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Aug 19 08:16:46.848170 kubelet[3281]: I0819 08:16:46.848115 3281 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c131fa7d-d233-47a1-b711-976a55e028ab-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "c131fa7d-d233-47a1-b711-976a55e028ab" (UID: "c131fa7d-d233-47a1-b711-976a55e028ab"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGidValue "" Aug 19 08:16:46.849075 systemd[1]: var-lib-kubelet-pods-c131fa7d\x2dd233\x2d47a1\x2db711\x2d976a55e028ab-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Aug 19 08:16:46.852394 systemd[1]: var-lib-kubelet-pods-c131fa7d\x2dd233\x2d47a1\x2db711\x2d976a55e028ab-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2drr8xj.mount: Deactivated successfully. Aug 19 08:16:46.854075 kubelet[3281]: I0819 08:16:46.853997 3281 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c131fa7d-d233-47a1-b711-976a55e028ab-kube-api-access-rr8xj" (OuterVolumeSpecName: "kube-api-access-rr8xj") pod "c131fa7d-d233-47a1-b711-976a55e028ab" (UID: "c131fa7d-d233-47a1-b711-976a55e028ab"). InnerVolumeSpecName "kube-api-access-rr8xj". PluginName "kubernetes.io/projected", VolumeGidValue "" Aug 19 08:16:46.907599 systemd[1]: Removed slice kubepods-besteffort-podc131fa7d_d233_47a1_b711_976a55e028ab.slice - libcontainer container kubepods-besteffort-podc131fa7d_d233_47a1_b711_976a55e028ab.slice. 
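The long mount-unit names in the teardown above (var-lib-kubelet-pods-c131fa7d\x2d…-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount) are systemd-escaped paths: '/' is rendered as '-', and bytes that would otherwise be ambiguous, such as '-' (0x2d) and '~' (0x7e), appear as \xNN escapes. A small decoder for that convention, written as a sketch rather than a port of systemd's own unescaper:

    package main

    import (
        "fmt"
        "strconv"
        "strings"
    )

    // unescapeUnit reverses the unit-name escaping seen in the log:
    // "-" separates path components and "\xNN" encodes a literal byte.
    func unescapeUnit(name string) string {
        name = strings.TrimSuffix(name, ".mount")
        var b strings.Builder
        for i := 0; i < len(name); i++ {
            switch {
            case name[i] == '-':
                b.WriteByte('/')
            case name[i] == '\\' && i+3 < len(name) && name[i+1] == 'x':
                if v, err := strconv.ParseUint(name[i+2:i+4], 16, 8); err == nil {
                    b.WriteByte(byte(v))
                    i += 3
                    continue
                }
                b.WriteByte(name[i])
            default:
                b.WriteByte(name[i])
            }
        }
        return "/" + b.String()
    }

    func main() {
        // Decodes to /var/lib/kubelet/pods/c131fa7d-d233-47a1-b711-976a55e028ab/
        // volumes/kubernetes.io~secret/whisker-backend-key-pair, the secret volume
        // the UnmountVolume entries above refer to.
        fmt.Println(unescapeUnit(`var-lib-kubelet-pods-c131fa7d\x2dd233\x2d47a1\x2db711\x2d976a55e028ab-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount`))
    }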
Aug 19 08:16:46.939451 kubelet[3281]: I0819 08:16:46.939306 3281 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rr8xj\" (UniqueName: \"kubernetes.io/projected/c131fa7d-d233-47a1-b711-976a55e028ab-kube-api-access-rr8xj\") on node \"ip-172-31-23-28\" DevicePath \"\"" Aug 19 08:16:46.941333 kubelet[3281]: I0819 08:16:46.941307 3281 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/c131fa7d-d233-47a1-b711-976a55e028ab-whisker-backend-key-pair\") on node \"ip-172-31-23-28\" DevicePath \"\"" Aug 19 08:16:46.941562 kubelet[3281]: I0819 08:16:46.941545 3281 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c131fa7d-d233-47a1-b711-976a55e028ab-whisker-ca-bundle\") on node \"ip-172-31-23-28\" DevicePath \"\"" Aug 19 08:16:46.953169 kubelet[3281]: I0819 08:16:46.952815 3281 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-hkk2w" podStartSLOduration=2.4282846129999998 podStartE2EDuration="22.952791738s" podCreationTimestamp="2025-08-19 08:16:24 +0000 UTC" firstStartedPulling="2025-08-19 08:16:25.224058145 +0000 UTC m=+24.774894472" lastFinishedPulling="2025-08-19 08:16:45.748565276 +0000 UTC m=+45.299401597" observedRunningTime="2025-08-19 08:16:46.929758698 +0000 UTC m=+46.480595038" watchObservedRunningTime="2025-08-19 08:16:46.952791738 +0000 UTC m=+46.503628082" Aug 19 08:16:47.028352 systemd[1]: Created slice kubepods-besteffort-podf0391106_c5bd_4e01_921c_237d094904c1.slice - libcontainer container kubepods-besteffort-podf0391106_c5bd_4e01_921c_237d094904c1.slice. Aug 19 08:16:47.042588 kubelet[3281]: I0819 08:16:47.042453 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sptxp\" (UniqueName: \"kubernetes.io/projected/f0391106-c5bd-4e01-921c-237d094904c1-kube-api-access-sptxp\") pod \"whisker-5c878cc6c5-8rfs9\" (UID: \"f0391106-c5bd-4e01-921c-237d094904c1\") " pod="calico-system/whisker-5c878cc6c5-8rfs9" Aug 19 08:16:47.042588 kubelet[3281]: I0819 08:16:47.042492 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/f0391106-c5bd-4e01-921c-237d094904c1-whisker-backend-key-pair\") pod \"whisker-5c878cc6c5-8rfs9\" (UID: \"f0391106-c5bd-4e01-921c-237d094904c1\") " pod="calico-system/whisker-5c878cc6c5-8rfs9" Aug 19 08:16:47.042588 kubelet[3281]: I0819 08:16:47.042517 3281 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0391106-c5bd-4e01-921c-237d094904c1-whisker-ca-bundle\") pod \"whisker-5c878cc6c5-8rfs9\" (UID: \"f0391106-c5bd-4e01-921c-237d094904c1\") " pod="calico-system/whisker-5c878cc6c5-8rfs9" Aug 19 08:16:47.332457 containerd[2014]: time="2025-08-19T08:16:47.332419604Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5c878cc6c5-8rfs9,Uid:f0391106-c5bd-4e01-921c-237d094904c1,Namespace:calico-system,Attempt:0,}" Aug 19 08:16:47.863293 (udev-worker)[4615]: Network interface NamePolicy= disabled on kernel command line. 
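The startup-latency entry above for calico-node-hkk2w is internally consistent: podStartE2EDuration (22.952791738s) is watchObservedRunningTime minus podCreationTimestamp on the wall clock, and podStartSLOduration (≈2.428s) is that figure minus the image-pull window taken from the monotonic m=+ offsets (45.299401597 − 24.774894472). Re-deriving both numbers from the values in the log:

    package main

    import "fmt"

    func main() {
        // Values copied from the "Observed pod startup duration" entry for calico-node-hkk2w.
        const (
            podCreated           = 24.0         // podCreationTimestamp 08:16:24, seconds past 08:16:00
            watchObservedRunning = 46.952791738 // watchObservedRunningTime 08:16:46.952791738
            firstStartedPulling  = 24.774894472 // monotonic m=+ offset
            lastFinishedPulling  = 45.299401597 // monotonic m=+ offset
        )
        e2e := watchObservedRunning - podCreated
        slo := e2e - (lastFinishedPulling - firstStartedPulling)
        // Up to float rounding this prints 22.952791738 and 2.428284613,
        // matching podStartE2EDuration and podStartSLOduration above.
        fmt.Printf("E2E %.9fs, excluding image pull %.9fs\n", e2e, slo)
    }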
Aug 19 08:16:47.866888 systemd-networkd[1812]: calie9543d3a887: Link UP Aug 19 08:16:47.868314 systemd-networkd[1812]: calie9543d3a887: Gained carrier Aug 19 08:16:47.888553 containerd[2014]: 2025-08-19 08:16:47.380 [INFO][4647] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Aug 19 08:16:47.888553 containerd[2014]: 2025-08-19 08:16:47.439 [INFO][4647] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--23--28-k8s-whisker--5c878cc6c5--8rfs9-eth0 whisker-5c878cc6c5- calico-system f0391106-c5bd-4e01-921c-237d094904c1 875 0 2025-08-19 08:16:46 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:5c878cc6c5 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ip-172-31-23-28 whisker-5c878cc6c5-8rfs9 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calie9543d3a887 [] [] }} ContainerID="cd45200c6ae0a6ab769e0936f7b4258e932d7cda5f094eea2a02370b91ecccbb" Namespace="calico-system" Pod="whisker-5c878cc6c5-8rfs9" WorkloadEndpoint="ip--172--31--23--28-k8s-whisker--5c878cc6c5--8rfs9-" Aug 19 08:16:47.888553 containerd[2014]: 2025-08-19 08:16:47.439 [INFO][4647] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="cd45200c6ae0a6ab769e0936f7b4258e932d7cda5f094eea2a02370b91ecccbb" Namespace="calico-system" Pod="whisker-5c878cc6c5-8rfs9" WorkloadEndpoint="ip--172--31--23--28-k8s-whisker--5c878cc6c5--8rfs9-eth0" Aug 19 08:16:47.888553 containerd[2014]: 2025-08-19 08:16:47.780 [INFO][4655] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cd45200c6ae0a6ab769e0936f7b4258e932d7cda5f094eea2a02370b91ecccbb" HandleID="k8s-pod-network.cd45200c6ae0a6ab769e0936f7b4258e932d7cda5f094eea2a02370b91ecccbb" Workload="ip--172--31--23--28-k8s-whisker--5c878cc6c5--8rfs9-eth0" Aug 19 08:16:47.888807 containerd[2014]: 2025-08-19 08:16:47.784 [INFO][4655] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="cd45200c6ae0a6ab769e0936f7b4258e932d7cda5f094eea2a02370b91ecccbb" HandleID="k8s-pod-network.cd45200c6ae0a6ab769e0936f7b4258e932d7cda5f094eea2a02370b91ecccbb" Workload="ip--172--31--23--28-k8s-whisker--5c878cc6c5--8rfs9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000122250), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-23-28", "pod":"whisker-5c878cc6c5-8rfs9", "timestamp":"2025-08-19 08:16:47.779981984 +0000 UTC"}, Hostname:"ip-172-31-23-28", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 08:16:47.888807 containerd[2014]: 2025-08-19 08:16:47.784 [INFO][4655] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 08:16:47.888807 containerd[2014]: 2025-08-19 08:16:47.785 [INFO][4655] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 19 08:16:47.888807 containerd[2014]: 2025-08-19 08:16:47.786 [INFO][4655] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-23-28' Aug 19 08:16:47.888807 containerd[2014]: 2025-08-19 08:16:47.802 [INFO][4655] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.cd45200c6ae0a6ab769e0936f7b4258e932d7cda5f094eea2a02370b91ecccbb" host="ip-172-31-23-28" Aug 19 08:16:47.888807 containerd[2014]: 2025-08-19 08:16:47.815 [INFO][4655] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-23-28" Aug 19 08:16:47.888807 containerd[2014]: 2025-08-19 08:16:47.821 [INFO][4655] ipam/ipam.go 511: Trying affinity for 192.168.80.0/26 host="ip-172-31-23-28" Aug 19 08:16:47.888807 containerd[2014]: 2025-08-19 08:16:47.823 [INFO][4655] ipam/ipam.go 158: Attempting to load block cidr=192.168.80.0/26 host="ip-172-31-23-28" Aug 19 08:16:47.888807 containerd[2014]: 2025-08-19 08:16:47.826 [INFO][4655] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.80.0/26 host="ip-172-31-23-28" Aug 19 08:16:47.888807 containerd[2014]: 2025-08-19 08:16:47.826 [INFO][4655] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.80.0/26 handle="k8s-pod-network.cd45200c6ae0a6ab769e0936f7b4258e932d7cda5f094eea2a02370b91ecccbb" host="ip-172-31-23-28" Aug 19 08:16:47.889085 containerd[2014]: 2025-08-19 08:16:47.828 [INFO][4655] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.cd45200c6ae0a6ab769e0936f7b4258e932d7cda5f094eea2a02370b91ecccbb Aug 19 08:16:47.889085 containerd[2014]: 2025-08-19 08:16:47.835 [INFO][4655] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.80.0/26 handle="k8s-pod-network.cd45200c6ae0a6ab769e0936f7b4258e932d7cda5f094eea2a02370b91ecccbb" host="ip-172-31-23-28" Aug 19 08:16:47.889085 containerd[2014]: 2025-08-19 08:16:47.843 [INFO][4655] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.80.1/26] block=192.168.80.0/26 handle="k8s-pod-network.cd45200c6ae0a6ab769e0936f7b4258e932d7cda5f094eea2a02370b91ecccbb" host="ip-172-31-23-28" Aug 19 08:16:47.889085 containerd[2014]: 2025-08-19 08:16:47.843 [INFO][4655] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.80.1/26] handle="k8s-pod-network.cd45200c6ae0a6ab769e0936f7b4258e932d7cda5f094eea2a02370b91ecccbb" host="ip-172-31-23-28" Aug 19 08:16:47.889085 containerd[2014]: 2025-08-19 08:16:47.844 [INFO][4655] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 19 08:16:47.889085 containerd[2014]: 2025-08-19 08:16:47.844 [INFO][4655] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.80.1/26] IPv6=[] ContainerID="cd45200c6ae0a6ab769e0936f7b4258e932d7cda5f094eea2a02370b91ecccbb" HandleID="k8s-pod-network.cd45200c6ae0a6ab769e0936f7b4258e932d7cda5f094eea2a02370b91ecccbb" Workload="ip--172--31--23--28-k8s-whisker--5c878cc6c5--8rfs9-eth0" Aug 19 08:16:47.889221 containerd[2014]: 2025-08-19 08:16:47.846 [INFO][4647] cni-plugin/k8s.go 418: Populated endpoint ContainerID="cd45200c6ae0a6ab769e0936f7b4258e932d7cda5f094eea2a02370b91ecccbb" Namespace="calico-system" Pod="whisker-5c878cc6c5-8rfs9" WorkloadEndpoint="ip--172--31--23--28-k8s-whisker--5c878cc6c5--8rfs9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--28-k8s-whisker--5c878cc6c5--8rfs9-eth0", GenerateName:"whisker-5c878cc6c5-", Namespace:"calico-system", SelfLink:"", UID:"f0391106-c5bd-4e01-921c-237d094904c1", ResourceVersion:"875", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 16, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5c878cc6c5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-28", ContainerID:"", Pod:"whisker-5c878cc6c5-8rfs9", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.80.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calie9543d3a887", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:16:47.889221 containerd[2014]: 2025-08-19 08:16:47.847 [INFO][4647] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.80.1/32] ContainerID="cd45200c6ae0a6ab769e0936f7b4258e932d7cda5f094eea2a02370b91ecccbb" Namespace="calico-system" Pod="whisker-5c878cc6c5-8rfs9" WorkloadEndpoint="ip--172--31--23--28-k8s-whisker--5c878cc6c5--8rfs9-eth0" Aug 19 08:16:47.889302 containerd[2014]: 2025-08-19 08:16:47.847 [INFO][4647] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie9543d3a887 ContainerID="cd45200c6ae0a6ab769e0936f7b4258e932d7cda5f094eea2a02370b91ecccbb" Namespace="calico-system" Pod="whisker-5c878cc6c5-8rfs9" WorkloadEndpoint="ip--172--31--23--28-k8s-whisker--5c878cc6c5--8rfs9-eth0" Aug 19 08:16:47.889302 containerd[2014]: 2025-08-19 08:16:47.868 [INFO][4647] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="cd45200c6ae0a6ab769e0936f7b4258e932d7cda5f094eea2a02370b91ecccbb" Namespace="calico-system" Pod="whisker-5c878cc6c5-8rfs9" WorkloadEndpoint="ip--172--31--23--28-k8s-whisker--5c878cc6c5--8rfs9-eth0" Aug 19 08:16:47.889828 containerd[2014]: 2025-08-19 08:16:47.868 [INFO][4647] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="cd45200c6ae0a6ab769e0936f7b4258e932d7cda5f094eea2a02370b91ecccbb" Namespace="calico-system" Pod="whisker-5c878cc6c5-8rfs9" 
WorkloadEndpoint="ip--172--31--23--28-k8s-whisker--5c878cc6c5--8rfs9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--28-k8s-whisker--5c878cc6c5--8rfs9-eth0", GenerateName:"whisker-5c878cc6c5-", Namespace:"calico-system", SelfLink:"", UID:"f0391106-c5bd-4e01-921c-237d094904c1", ResourceVersion:"875", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 16, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5c878cc6c5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-28", ContainerID:"cd45200c6ae0a6ab769e0936f7b4258e932d7cda5f094eea2a02370b91ecccbb", Pod:"whisker-5c878cc6c5-8rfs9", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.80.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calie9543d3a887", MAC:"f2:fc:85:23:25:45", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:16:47.889938 containerd[2014]: 2025-08-19 08:16:47.884 [INFO][4647] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="cd45200c6ae0a6ab769e0936f7b4258e932d7cda5f094eea2a02370b91ecccbb" Namespace="calico-system" Pod="whisker-5c878cc6c5-8rfs9" WorkloadEndpoint="ip--172--31--23--28-k8s-whisker--5c878cc6c5--8rfs9-eth0" Aug 19 08:16:48.252318 containerd[2014]: time="2025-08-19T08:16:48.252244548Z" level=info msg="connecting to shim cd45200c6ae0a6ab769e0936f7b4258e932d7cda5f094eea2a02370b91ecccbb" address="unix:///run/containerd/s/c9808de8c22630ed6f8643967cb8a1fcd70a238ab4c1bf98fd3c3f3da043aeb7" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:16:48.346270 systemd[1]: Started cri-containerd-cd45200c6ae0a6ab769e0936f7b4258e932d7cda5f094eea2a02370b91ecccbb.scope - libcontainer container cd45200c6ae0a6ab769e0936f7b4258e932d7cda5f094eea2a02370b91ecccbb. 
Aug 19 08:16:48.523143 containerd[2014]: time="2025-08-19T08:16:48.522427432Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5c878cc6c5-8rfs9,Uid:f0391106-c5bd-4e01-921c-237d094904c1,Namespace:calico-system,Attempt:0,} returns sandbox id \"cd45200c6ae0a6ab769e0936f7b4258e932d7cda5f094eea2a02370b91ecccbb\"" Aug 19 08:16:48.548232 containerd[2014]: time="2025-08-19T08:16:48.548170305Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\"" Aug 19 08:16:48.615786 kubelet[3281]: I0819 08:16:48.615737 3281 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c131fa7d-d233-47a1-b711-976a55e028ab" path="/var/lib/kubelet/pods/c131fa7d-d233-47a1-b711-976a55e028ab/volumes" Aug 19 08:16:49.149383 systemd-networkd[1812]: calie9543d3a887: Gained IPv6LL Aug 19 08:16:49.614930 containerd[2014]: time="2025-08-19T08:16:49.614552199Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f4577c465-9dqqf,Uid:e0e4745e-1317-4dcd-9549-1d73509f7275,Namespace:calico-apiserver,Attempt:0,}" Aug 19 08:16:49.616195 containerd[2014]: time="2025-08-19T08:16:49.615785103Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-s59gg,Uid:d00cdf73-4cda-404a-bf73-b97dabb3c18b,Namespace:calico-system,Attempt:0,}" Aug 19 08:16:49.886233 systemd-networkd[1812]: calica5c8950800: Link UP Aug 19 08:16:49.888310 systemd-networkd[1812]: calica5c8950800: Gained carrier Aug 19 08:16:49.920732 containerd[2014]: 2025-08-19 08:16:49.728 [INFO][4828] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Aug 19 08:16:49.920732 containerd[2014]: 2025-08-19 08:16:49.745 [INFO][4828] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--23--28-k8s-calico--apiserver--f4577c465--9dqqf-eth0 calico-apiserver-f4577c465- calico-apiserver e0e4745e-1317-4dcd-9549-1d73509f7275 807 0 2025-08-19 08:16:21 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:f4577c465 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ip-172-31-23-28 calico-apiserver-f4577c465-9dqqf eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calica5c8950800 [] [] }} ContainerID="c322a43fd960c52849b73d0b57c5d1e683e0a4d2f64f9457a0ba623225550404" Namespace="calico-apiserver" Pod="calico-apiserver-f4577c465-9dqqf" WorkloadEndpoint="ip--172--31--23--28-k8s-calico--apiserver--f4577c465--9dqqf-" Aug 19 08:16:49.920732 containerd[2014]: 2025-08-19 08:16:49.745 [INFO][4828] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c322a43fd960c52849b73d0b57c5d1e683e0a4d2f64f9457a0ba623225550404" Namespace="calico-apiserver" Pod="calico-apiserver-f4577c465-9dqqf" WorkloadEndpoint="ip--172--31--23--28-k8s-calico--apiserver--f4577c465--9dqqf-eth0" Aug 19 08:16:49.920732 containerd[2014]: 2025-08-19 08:16:49.809 [INFO][4857] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c322a43fd960c52849b73d0b57c5d1e683e0a4d2f64f9457a0ba623225550404" HandleID="k8s-pod-network.c322a43fd960c52849b73d0b57c5d1e683e0a4d2f64f9457a0ba623225550404" Workload="ip--172--31--23--28-k8s-calico--apiserver--f4577c465--9dqqf-eth0" Aug 19 08:16:49.921082 containerd[2014]: 2025-08-19 08:16:49.810 [INFO][4857] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="c322a43fd960c52849b73d0b57c5d1e683e0a4d2f64f9457a0ba623225550404" HandleID="k8s-pod-network.c322a43fd960c52849b73d0b57c5d1e683e0a4d2f64f9457a0ba623225550404" Workload="ip--172--31--23--28-k8s-calico--apiserver--f4577c465--9dqqf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002c7970), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ip-172-31-23-28", "pod":"calico-apiserver-f4577c465-9dqqf", "timestamp":"2025-08-19 08:16:49.809476478 +0000 UTC"}, Hostname:"ip-172-31-23-28", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 08:16:49.921082 containerd[2014]: 2025-08-19 08:16:49.810 [INFO][4857] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 08:16:49.921082 containerd[2014]: 2025-08-19 08:16:49.810 [INFO][4857] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 08:16:49.921082 containerd[2014]: 2025-08-19 08:16:49.810 [INFO][4857] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-23-28' Aug 19 08:16:49.921082 containerd[2014]: 2025-08-19 08:16:49.824 [INFO][4857] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c322a43fd960c52849b73d0b57c5d1e683e0a4d2f64f9457a0ba623225550404" host="ip-172-31-23-28" Aug 19 08:16:49.921082 containerd[2014]: 2025-08-19 08:16:49.838 [INFO][4857] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-23-28" Aug 19 08:16:49.921082 containerd[2014]: 2025-08-19 08:16:49.847 [INFO][4857] ipam/ipam.go 511: Trying affinity for 192.168.80.0/26 host="ip-172-31-23-28" Aug 19 08:16:49.921082 containerd[2014]: 2025-08-19 08:16:49.851 [INFO][4857] ipam/ipam.go 158: Attempting to load block cidr=192.168.80.0/26 host="ip-172-31-23-28" Aug 19 08:16:49.921082 containerd[2014]: 2025-08-19 08:16:49.854 [INFO][4857] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.80.0/26 host="ip-172-31-23-28" Aug 19 08:16:49.921510 containerd[2014]: 2025-08-19 08:16:49.854 [INFO][4857] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.80.0/26 handle="k8s-pod-network.c322a43fd960c52849b73d0b57c5d1e683e0a4d2f64f9457a0ba623225550404" host="ip-172-31-23-28" Aug 19 08:16:49.921510 containerd[2014]: 2025-08-19 08:16:49.856 [INFO][4857] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c322a43fd960c52849b73d0b57c5d1e683e0a4d2f64f9457a0ba623225550404 Aug 19 08:16:49.921510 containerd[2014]: 2025-08-19 08:16:49.863 [INFO][4857] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.80.0/26 handle="k8s-pod-network.c322a43fd960c52849b73d0b57c5d1e683e0a4d2f64f9457a0ba623225550404" host="ip-172-31-23-28" Aug 19 08:16:49.921510 containerd[2014]: 2025-08-19 08:16:49.872 [INFO][4857] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.80.2/26] block=192.168.80.0/26 handle="k8s-pod-network.c322a43fd960c52849b73d0b57c5d1e683e0a4d2f64f9457a0ba623225550404" host="ip-172-31-23-28" Aug 19 08:16:49.921510 containerd[2014]: 2025-08-19 08:16:49.872 [INFO][4857] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.80.2/26] handle="k8s-pod-network.c322a43fd960c52849b73d0b57c5d1e683e0a4d2f64f9457a0ba623225550404" host="ip-172-31-23-28" Aug 19 08:16:49.921510 containerd[2014]: 2025-08-19 08:16:49.872 [INFO][4857] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 19 08:16:49.921510 containerd[2014]: 2025-08-19 08:16:49.873 [INFO][4857] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.80.2/26] IPv6=[] ContainerID="c322a43fd960c52849b73d0b57c5d1e683e0a4d2f64f9457a0ba623225550404" HandleID="k8s-pod-network.c322a43fd960c52849b73d0b57c5d1e683e0a4d2f64f9457a0ba623225550404" Workload="ip--172--31--23--28-k8s-calico--apiserver--f4577c465--9dqqf-eth0" Aug 19 08:16:49.921872 containerd[2014]: 2025-08-19 08:16:49.879 [INFO][4828] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c322a43fd960c52849b73d0b57c5d1e683e0a4d2f64f9457a0ba623225550404" Namespace="calico-apiserver" Pod="calico-apiserver-f4577c465-9dqqf" WorkloadEndpoint="ip--172--31--23--28-k8s-calico--apiserver--f4577c465--9dqqf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--28-k8s-calico--apiserver--f4577c465--9dqqf-eth0", GenerateName:"calico-apiserver-f4577c465-", Namespace:"calico-apiserver", SelfLink:"", UID:"e0e4745e-1317-4dcd-9549-1d73509f7275", ResourceVersion:"807", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 16, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"f4577c465", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-28", ContainerID:"", Pod:"calico-apiserver-f4577c465-9dqqf", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.80.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calica5c8950800", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:16:49.921985 containerd[2014]: 2025-08-19 08:16:49.880 [INFO][4828] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.80.2/32] ContainerID="c322a43fd960c52849b73d0b57c5d1e683e0a4d2f64f9457a0ba623225550404" Namespace="calico-apiserver" Pod="calico-apiserver-f4577c465-9dqqf" WorkloadEndpoint="ip--172--31--23--28-k8s-calico--apiserver--f4577c465--9dqqf-eth0" Aug 19 08:16:49.921985 containerd[2014]: 2025-08-19 08:16:49.880 [INFO][4828] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calica5c8950800 ContainerID="c322a43fd960c52849b73d0b57c5d1e683e0a4d2f64f9457a0ba623225550404" Namespace="calico-apiserver" Pod="calico-apiserver-f4577c465-9dqqf" WorkloadEndpoint="ip--172--31--23--28-k8s-calico--apiserver--f4577c465--9dqqf-eth0" Aug 19 08:16:49.921985 containerd[2014]: 2025-08-19 08:16:49.890 [INFO][4828] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c322a43fd960c52849b73d0b57c5d1e683e0a4d2f64f9457a0ba623225550404" Namespace="calico-apiserver" Pod="calico-apiserver-f4577c465-9dqqf" WorkloadEndpoint="ip--172--31--23--28-k8s-calico--apiserver--f4577c465--9dqqf-eth0" Aug 19 08:16:49.922221 containerd[2014]: 2025-08-19 08:16:49.894 [INFO][4828] cni-plugin/k8s.go 446: Added Mac, interface name, and active container 
ID to endpoint ContainerID="c322a43fd960c52849b73d0b57c5d1e683e0a4d2f64f9457a0ba623225550404" Namespace="calico-apiserver" Pod="calico-apiserver-f4577c465-9dqqf" WorkloadEndpoint="ip--172--31--23--28-k8s-calico--apiserver--f4577c465--9dqqf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--28-k8s-calico--apiserver--f4577c465--9dqqf-eth0", GenerateName:"calico-apiserver-f4577c465-", Namespace:"calico-apiserver", SelfLink:"", UID:"e0e4745e-1317-4dcd-9549-1d73509f7275", ResourceVersion:"807", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 16, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"f4577c465", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-28", ContainerID:"c322a43fd960c52849b73d0b57c5d1e683e0a4d2f64f9457a0ba623225550404", Pod:"calico-apiserver-f4577c465-9dqqf", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.80.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calica5c8950800", MAC:"f6:11:cd:8e:cb:08", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:16:49.922330 containerd[2014]: 2025-08-19 08:16:49.911 [INFO][4828] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c322a43fd960c52849b73d0b57c5d1e683e0a4d2f64f9457a0ba623225550404" Namespace="calico-apiserver" Pod="calico-apiserver-f4577c465-9dqqf" WorkloadEndpoint="ip--172--31--23--28-k8s-calico--apiserver--f4577c465--9dqqf-eth0" Aug 19 08:16:49.975500 containerd[2014]: time="2025-08-19T08:16:49.975349672Z" level=info msg="connecting to shim c322a43fd960c52849b73d0b57c5d1e683e0a4d2f64f9457a0ba623225550404" address="unix:///run/containerd/s/f8e87bde284100ba484d1590a6fc2462a9e9a1af733da316dd84e0e59b288335" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:16:50.003171 systemd-networkd[1812]: cali3f6846b2ef4: Link UP Aug 19 08:16:50.004736 systemd-networkd[1812]: cali3f6846b2ef4: Gained carrier Aug 19 08:16:50.031929 systemd[1]: Started cri-containerd-c322a43fd960c52849b73d0b57c5d1e683e0a4d2f64f9457a0ba623225550404.scope - libcontainer container c322a43fd960c52849b73d0b57c5d1e683e0a4d2f64f9457a0ba623225550404. 
Aug 19 08:16:50.038406 containerd[2014]: 2025-08-19 08:16:49.692 [INFO][4838] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Aug 19 08:16:50.038406 containerd[2014]: 2025-08-19 08:16:49.713 [INFO][4838] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--23--28-k8s-goldmane--58fd7646b9--s59gg-eth0 goldmane-58fd7646b9- calico-system d00cdf73-4cda-404a-bf73-b97dabb3c18b 804 0 2025-08-19 08:16:24 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:58fd7646b9 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ip-172-31-23-28 goldmane-58fd7646b9-s59gg eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali3f6846b2ef4 [] [] }} ContainerID="5c51b3fc0faeaec5235752981e6e8fa0d4543bc928632cd9f03fc4240f971f37" Namespace="calico-system" Pod="goldmane-58fd7646b9-s59gg" WorkloadEndpoint="ip--172--31--23--28-k8s-goldmane--58fd7646b9--s59gg-" Aug 19 08:16:50.038406 containerd[2014]: 2025-08-19 08:16:49.713 [INFO][4838] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5c51b3fc0faeaec5235752981e6e8fa0d4543bc928632cd9f03fc4240f971f37" Namespace="calico-system" Pod="goldmane-58fd7646b9-s59gg" WorkloadEndpoint="ip--172--31--23--28-k8s-goldmane--58fd7646b9--s59gg-eth0" Aug 19 08:16:50.038406 containerd[2014]: 2025-08-19 08:16:49.832 [INFO][4849] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5c51b3fc0faeaec5235752981e6e8fa0d4543bc928632cd9f03fc4240f971f37" HandleID="k8s-pod-network.5c51b3fc0faeaec5235752981e6e8fa0d4543bc928632cd9f03fc4240f971f37" Workload="ip--172--31--23--28-k8s-goldmane--58fd7646b9--s59gg-eth0" Aug 19 08:16:50.038682 containerd[2014]: 2025-08-19 08:16:49.834 [INFO][4849] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5c51b3fc0faeaec5235752981e6e8fa0d4543bc928632cd9f03fc4240f971f37" HandleID="k8s-pod-network.5c51b3fc0faeaec5235752981e6e8fa0d4543bc928632cd9f03fc4240f971f37" Workload="ip--172--31--23--28-k8s-goldmane--58fd7646b9--s59gg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003238d0), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-23-28", "pod":"goldmane-58fd7646b9-s59gg", "timestamp":"2025-08-19 08:16:49.831823985 +0000 UTC"}, Hostname:"ip-172-31-23-28", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 08:16:50.038682 containerd[2014]: 2025-08-19 08:16:49.834 [INFO][4849] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 08:16:50.038682 containerd[2014]: 2025-08-19 08:16:49.873 [INFO][4849] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 19 08:16:50.038682 containerd[2014]: 2025-08-19 08:16:49.873 [INFO][4849] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-23-28' Aug 19 08:16:50.038682 containerd[2014]: 2025-08-19 08:16:49.923 [INFO][4849] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5c51b3fc0faeaec5235752981e6e8fa0d4543bc928632cd9f03fc4240f971f37" host="ip-172-31-23-28" Aug 19 08:16:50.038682 containerd[2014]: 2025-08-19 08:16:49.931 [INFO][4849] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-23-28" Aug 19 08:16:50.038682 containerd[2014]: 2025-08-19 08:16:49.948 [INFO][4849] ipam/ipam.go 511: Trying affinity for 192.168.80.0/26 host="ip-172-31-23-28" Aug 19 08:16:50.038682 containerd[2014]: 2025-08-19 08:16:49.950 [INFO][4849] ipam/ipam.go 158: Attempting to load block cidr=192.168.80.0/26 host="ip-172-31-23-28" Aug 19 08:16:50.038682 containerd[2014]: 2025-08-19 08:16:49.957 [INFO][4849] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.80.0/26 host="ip-172-31-23-28" Aug 19 08:16:50.038682 containerd[2014]: 2025-08-19 08:16:49.959 [INFO][4849] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.80.0/26 handle="k8s-pod-network.5c51b3fc0faeaec5235752981e6e8fa0d4543bc928632cd9f03fc4240f971f37" host="ip-172-31-23-28" Aug 19 08:16:50.039497 containerd[2014]: 2025-08-19 08:16:49.962 [INFO][4849] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.5c51b3fc0faeaec5235752981e6e8fa0d4543bc928632cd9f03fc4240f971f37 Aug 19 08:16:50.039497 containerd[2014]: 2025-08-19 08:16:49.972 [INFO][4849] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.80.0/26 handle="k8s-pod-network.5c51b3fc0faeaec5235752981e6e8fa0d4543bc928632cd9f03fc4240f971f37" host="ip-172-31-23-28" Aug 19 08:16:50.039497 containerd[2014]: 2025-08-19 08:16:49.991 [INFO][4849] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.80.3/26] block=192.168.80.0/26 handle="k8s-pod-network.5c51b3fc0faeaec5235752981e6e8fa0d4543bc928632cd9f03fc4240f971f37" host="ip-172-31-23-28" Aug 19 08:16:50.039497 containerd[2014]: 2025-08-19 08:16:49.991 [INFO][4849] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.80.3/26] handle="k8s-pod-network.5c51b3fc0faeaec5235752981e6e8fa0d4543bc928632cd9f03fc4240f971f37" host="ip-172-31-23-28" Aug 19 08:16:50.039497 containerd[2014]: 2025-08-19 08:16:49.991 [INFO][4849] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 19 08:16:50.039497 containerd[2014]: 2025-08-19 08:16:49.991 [INFO][4849] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.80.3/26] IPv6=[] ContainerID="5c51b3fc0faeaec5235752981e6e8fa0d4543bc928632cd9f03fc4240f971f37" HandleID="k8s-pod-network.5c51b3fc0faeaec5235752981e6e8fa0d4543bc928632cd9f03fc4240f971f37" Workload="ip--172--31--23--28-k8s-goldmane--58fd7646b9--s59gg-eth0" Aug 19 08:16:50.039756 containerd[2014]: 2025-08-19 08:16:49.999 [INFO][4838] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5c51b3fc0faeaec5235752981e6e8fa0d4543bc928632cd9f03fc4240f971f37" Namespace="calico-system" Pod="goldmane-58fd7646b9-s59gg" WorkloadEndpoint="ip--172--31--23--28-k8s-goldmane--58fd7646b9--s59gg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--28-k8s-goldmane--58fd7646b9--s59gg-eth0", GenerateName:"goldmane-58fd7646b9-", Namespace:"calico-system", SelfLink:"", UID:"d00cdf73-4cda-404a-bf73-b97dabb3c18b", ResourceVersion:"804", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 16, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"58fd7646b9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-28", ContainerID:"", Pod:"goldmane-58fd7646b9-s59gg", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.80.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali3f6846b2ef4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:16:50.039756 containerd[2014]: 2025-08-19 08:16:49.999 [INFO][4838] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.80.3/32] ContainerID="5c51b3fc0faeaec5235752981e6e8fa0d4543bc928632cd9f03fc4240f971f37" Namespace="calico-system" Pod="goldmane-58fd7646b9-s59gg" WorkloadEndpoint="ip--172--31--23--28-k8s-goldmane--58fd7646b9--s59gg-eth0" Aug 19 08:16:50.040008 containerd[2014]: 2025-08-19 08:16:49.999 [INFO][4838] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3f6846b2ef4 ContainerID="5c51b3fc0faeaec5235752981e6e8fa0d4543bc928632cd9f03fc4240f971f37" Namespace="calico-system" Pod="goldmane-58fd7646b9-s59gg" WorkloadEndpoint="ip--172--31--23--28-k8s-goldmane--58fd7646b9--s59gg-eth0" Aug 19 08:16:50.040008 containerd[2014]: 2025-08-19 08:16:50.007 [INFO][4838] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5c51b3fc0faeaec5235752981e6e8fa0d4543bc928632cd9f03fc4240f971f37" Namespace="calico-system" Pod="goldmane-58fd7646b9-s59gg" WorkloadEndpoint="ip--172--31--23--28-k8s-goldmane--58fd7646b9--s59gg-eth0" Aug 19 08:16:50.040092 containerd[2014]: 2025-08-19 08:16:50.008 [INFO][4838] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5c51b3fc0faeaec5235752981e6e8fa0d4543bc928632cd9f03fc4240f971f37" Namespace="calico-system" Pod="goldmane-58fd7646b9-s59gg" 
WorkloadEndpoint="ip--172--31--23--28-k8s-goldmane--58fd7646b9--s59gg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--28-k8s-goldmane--58fd7646b9--s59gg-eth0", GenerateName:"goldmane-58fd7646b9-", Namespace:"calico-system", SelfLink:"", UID:"d00cdf73-4cda-404a-bf73-b97dabb3c18b", ResourceVersion:"804", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 16, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"58fd7646b9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-28", ContainerID:"5c51b3fc0faeaec5235752981e6e8fa0d4543bc928632cd9f03fc4240f971f37", Pod:"goldmane-58fd7646b9-s59gg", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.80.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali3f6846b2ef4", MAC:"a2:84:1b:75:c4:34", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:16:50.040169 containerd[2014]: 2025-08-19 08:16:50.033 [INFO][4838] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5c51b3fc0faeaec5235752981e6e8fa0d4543bc928632cd9f03fc4240f971f37" Namespace="calico-system" Pod="goldmane-58fd7646b9-s59gg" WorkloadEndpoint="ip--172--31--23--28-k8s-goldmane--58fd7646b9--s59gg-eth0" Aug 19 08:16:50.107738 containerd[2014]: time="2025-08-19T08:16:50.107254722Z" level=info msg="connecting to shim 5c51b3fc0faeaec5235752981e6e8fa0d4543bc928632cd9f03fc4240f971f37" address="unix:///run/containerd/s/52425a07c10203b490135b82583de9f1c4eb84d22d79321bee93066bd29e35e9" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:16:50.110661 kubelet[3281]: I0819 08:16:50.110619 3281 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 19 08:16:50.146501 systemd[1]: Started cri-containerd-5c51b3fc0faeaec5235752981e6e8fa0d4543bc928632cd9f03fc4240f971f37.scope - libcontainer container 5c51b3fc0faeaec5235752981e6e8fa0d4543bc928632cd9f03fc4240f971f37. 
Aug 19 08:16:50.179015 containerd[2014]: time="2025-08-19T08:16:50.177960176Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f4577c465-9dqqf,Uid:e0e4745e-1317-4dcd-9549-1d73509f7275,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"c322a43fd960c52849b73d0b57c5d1e683e0a4d2f64f9457a0ba623225550404\"" Aug 19 08:16:50.280097 containerd[2014]: time="2025-08-19T08:16:50.278949002Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-s59gg,Uid:d00cdf73-4cda-404a-bf73-b97dabb3c18b,Namespace:calico-system,Attempt:0,} returns sandbox id \"5c51b3fc0faeaec5235752981e6e8fa0d4543bc928632cd9f03fc4240f971f37\"" Aug 19 08:16:50.416120 containerd[2014]: time="2025-08-19T08:16:50.415911560Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:50.417720 containerd[2014]: time="2025-08-19T08:16:50.417673289Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.2: active requests=0, bytes read=4661207" Aug 19 08:16:50.420060 containerd[2014]: time="2025-08-19T08:16:50.419963853Z" level=info msg="ImageCreate event name:\"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:50.423787 containerd[2014]: time="2025-08-19T08:16:50.423726252Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:50.424416 containerd[2014]: time="2025-08-19T08:16:50.424160528Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.2\" with image id \"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\", size \"6153902\" in 1.875945094s" Aug 19 08:16:50.424416 containerd[2014]: time="2025-08-19T08:16:50.424194989Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\" returns image reference \"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\"" Aug 19 08:16:50.425843 containerd[2014]: time="2025-08-19T08:16:50.425346825Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Aug 19 08:16:50.428113 containerd[2014]: time="2025-08-19T08:16:50.428080811Z" level=info msg="CreateContainer within sandbox \"cd45200c6ae0a6ab769e0936f7b4258e932d7cda5f094eea2a02370b91ecccbb\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Aug 19 08:16:50.443464 containerd[2014]: time="2025-08-19T08:16:50.443426562Z" level=info msg="Container 46353e77adf2fdccc5d462d7149ab7ce54ba5c9feabe230f4ddd16fbe48ce4f7: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:16:50.457889 containerd[2014]: time="2025-08-19T08:16:50.457828462Z" level=info msg="CreateContainer within sandbox \"cd45200c6ae0a6ab769e0936f7b4258e932d7cda5f094eea2a02370b91ecccbb\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"46353e77adf2fdccc5d462d7149ab7ce54ba5c9feabe230f4ddd16fbe48ce4f7\"" Aug 19 08:16:50.460262 containerd[2014]: time="2025-08-19T08:16:50.460202159Z" level=info msg="StartContainer for \"46353e77adf2fdccc5d462d7149ab7ce54ba5c9feabe230f4ddd16fbe48ce4f7\"" Aug 19 08:16:50.461408 containerd[2014]: time="2025-08-19T08:16:50.461371282Z" level=info 
msg="connecting to shim 46353e77adf2fdccc5d462d7149ab7ce54ba5c9feabe230f4ddd16fbe48ce4f7" address="unix:///run/containerd/s/c9808de8c22630ed6f8643967cb8a1fcd70a238ab4c1bf98fd3c3f3da043aeb7" protocol=ttrpc version=3 Aug 19 08:16:50.487808 systemd[1]: Started cri-containerd-46353e77adf2fdccc5d462d7149ab7ce54ba5c9feabe230f4ddd16fbe48ce4f7.scope - libcontainer container 46353e77adf2fdccc5d462d7149ab7ce54ba5c9feabe230f4ddd16fbe48ce4f7. Aug 19 08:16:50.594077 containerd[2014]: time="2025-08-19T08:16:50.594002439Z" level=info msg="StartContainer for \"46353e77adf2fdccc5d462d7149ab7ce54ba5c9feabe230f4ddd16fbe48ce4f7\" returns successfully" Aug 19 08:16:50.648667 containerd[2014]: time="2025-08-19T08:16:50.648627294Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fhzgx,Uid:fd28c4a4-f3f6-4029-ab76-882058ba96b8,Namespace:calico-system,Attempt:0,}" Aug 19 08:16:50.667112 containerd[2014]: time="2025-08-19T08:16:50.666241499Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-5dnf2,Uid:48eab14c-74c0-4682-912d-dd41c3aaf383,Namespace:kube-system,Attempt:0,}" Aug 19 08:16:51.046378 systemd-networkd[1812]: calic1c3ccfd653: Link UP Aug 19 08:16:51.047888 systemd-networkd[1812]: calic1c3ccfd653: Gained carrier Aug 19 08:16:51.083151 containerd[2014]: 2025-08-19 08:16:50.777 [INFO][5021] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Aug 19 08:16:51.083151 containerd[2014]: 2025-08-19 08:16:50.811 [INFO][5021] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--23--28-k8s-coredns--7c65d6cfc9--5dnf2-eth0 coredns-7c65d6cfc9- kube-system 48eab14c-74c0-4682-912d-dd41c3aaf383 806 0 2025-08-19 08:16:06 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ip-172-31-23-28 coredns-7c65d6cfc9-5dnf2 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calic1c3ccfd653 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="1aa64ce3fbaf5bd7b0d1b5bec0e733a8acf5b5f5ad80ee2e295f2465a4b2ecaf" Namespace="kube-system" Pod="coredns-7c65d6cfc9-5dnf2" WorkloadEndpoint="ip--172--31--23--28-k8s-coredns--7c65d6cfc9--5dnf2-" Aug 19 08:16:51.083151 containerd[2014]: 2025-08-19 08:16:50.811 [INFO][5021] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1aa64ce3fbaf5bd7b0d1b5bec0e733a8acf5b5f5ad80ee2e295f2465a4b2ecaf" Namespace="kube-system" Pod="coredns-7c65d6cfc9-5dnf2" WorkloadEndpoint="ip--172--31--23--28-k8s-coredns--7c65d6cfc9--5dnf2-eth0" Aug 19 08:16:51.083151 containerd[2014]: 2025-08-19 08:16:50.903 [INFO][5045] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1aa64ce3fbaf5bd7b0d1b5bec0e733a8acf5b5f5ad80ee2e295f2465a4b2ecaf" HandleID="k8s-pod-network.1aa64ce3fbaf5bd7b0d1b5bec0e733a8acf5b5f5ad80ee2e295f2465a4b2ecaf" Workload="ip--172--31--23--28-k8s-coredns--7c65d6cfc9--5dnf2-eth0" Aug 19 08:16:51.083502 containerd[2014]: 2025-08-19 08:16:50.905 [INFO][5045] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1aa64ce3fbaf5bd7b0d1b5bec0e733a8acf5b5f5ad80ee2e295f2465a4b2ecaf" HandleID="k8s-pod-network.1aa64ce3fbaf5bd7b0d1b5bec0e733a8acf5b5f5ad80ee2e295f2465a4b2ecaf" Workload="ip--172--31--23--28-k8s-coredns--7c65d6cfc9--5dnf2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000333940), Attrs:map[string]string{"namespace":"kube-system", 
"node":"ip-172-31-23-28", "pod":"coredns-7c65d6cfc9-5dnf2", "timestamp":"2025-08-19 08:16:50.903242863 +0000 UTC"}, Hostname:"ip-172-31-23-28", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 08:16:51.083502 containerd[2014]: 2025-08-19 08:16:50.905 [INFO][5045] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 08:16:51.083502 containerd[2014]: 2025-08-19 08:16:50.905 [INFO][5045] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 08:16:51.083502 containerd[2014]: 2025-08-19 08:16:50.905 [INFO][5045] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-23-28' Aug 19 08:16:51.083502 containerd[2014]: 2025-08-19 08:16:50.945 [INFO][5045] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1aa64ce3fbaf5bd7b0d1b5bec0e733a8acf5b5f5ad80ee2e295f2465a4b2ecaf" host="ip-172-31-23-28" Aug 19 08:16:51.083502 containerd[2014]: 2025-08-19 08:16:50.956 [INFO][5045] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-23-28" Aug 19 08:16:51.083502 containerd[2014]: 2025-08-19 08:16:50.987 [INFO][5045] ipam/ipam.go 511: Trying affinity for 192.168.80.0/26 host="ip-172-31-23-28" Aug 19 08:16:51.083502 containerd[2014]: 2025-08-19 08:16:50.995 [INFO][5045] ipam/ipam.go 158: Attempting to load block cidr=192.168.80.0/26 host="ip-172-31-23-28" Aug 19 08:16:51.083502 containerd[2014]: 2025-08-19 08:16:51.000 [INFO][5045] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.80.0/26 host="ip-172-31-23-28" Aug 19 08:16:51.083502 containerd[2014]: 2025-08-19 08:16:51.001 [INFO][5045] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.80.0/26 handle="k8s-pod-network.1aa64ce3fbaf5bd7b0d1b5bec0e733a8acf5b5f5ad80ee2e295f2465a4b2ecaf" host="ip-172-31-23-28" Aug 19 08:16:51.083904 containerd[2014]: 2025-08-19 08:16:51.007 [INFO][5045] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.1aa64ce3fbaf5bd7b0d1b5bec0e733a8acf5b5f5ad80ee2e295f2465a4b2ecaf Aug 19 08:16:51.083904 containerd[2014]: 2025-08-19 08:16:51.015 [INFO][5045] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.80.0/26 handle="k8s-pod-network.1aa64ce3fbaf5bd7b0d1b5bec0e733a8acf5b5f5ad80ee2e295f2465a4b2ecaf" host="ip-172-31-23-28" Aug 19 08:16:51.083904 containerd[2014]: 2025-08-19 08:16:51.034 [INFO][5045] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.80.4/26] block=192.168.80.0/26 handle="k8s-pod-network.1aa64ce3fbaf5bd7b0d1b5bec0e733a8acf5b5f5ad80ee2e295f2465a4b2ecaf" host="ip-172-31-23-28" Aug 19 08:16:51.083904 containerd[2014]: 2025-08-19 08:16:51.034 [INFO][5045] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.80.4/26] handle="k8s-pod-network.1aa64ce3fbaf5bd7b0d1b5bec0e733a8acf5b5f5ad80ee2e295f2465a4b2ecaf" host="ip-172-31-23-28" Aug 19 08:16:51.083904 containerd[2014]: 2025-08-19 08:16:51.034 [INFO][5045] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 19 08:16:51.083904 containerd[2014]: 2025-08-19 08:16:51.034 [INFO][5045] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.80.4/26] IPv6=[] ContainerID="1aa64ce3fbaf5bd7b0d1b5bec0e733a8acf5b5f5ad80ee2e295f2465a4b2ecaf" HandleID="k8s-pod-network.1aa64ce3fbaf5bd7b0d1b5bec0e733a8acf5b5f5ad80ee2e295f2465a4b2ecaf" Workload="ip--172--31--23--28-k8s-coredns--7c65d6cfc9--5dnf2-eth0" Aug 19 08:16:51.084705 containerd[2014]: 2025-08-19 08:16:51.039 [INFO][5021] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1aa64ce3fbaf5bd7b0d1b5bec0e733a8acf5b5f5ad80ee2e295f2465a4b2ecaf" Namespace="kube-system" Pod="coredns-7c65d6cfc9-5dnf2" WorkloadEndpoint="ip--172--31--23--28-k8s-coredns--7c65d6cfc9--5dnf2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--28-k8s-coredns--7c65d6cfc9--5dnf2-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"48eab14c-74c0-4682-912d-dd41c3aaf383", ResourceVersion:"806", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 16, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-28", ContainerID:"", Pod:"coredns-7c65d6cfc9-5dnf2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.80.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calic1c3ccfd653", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:16:51.084705 containerd[2014]: 2025-08-19 08:16:51.040 [INFO][5021] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.80.4/32] ContainerID="1aa64ce3fbaf5bd7b0d1b5bec0e733a8acf5b5f5ad80ee2e295f2465a4b2ecaf" Namespace="kube-system" Pod="coredns-7c65d6cfc9-5dnf2" WorkloadEndpoint="ip--172--31--23--28-k8s-coredns--7c65d6cfc9--5dnf2-eth0" Aug 19 08:16:51.084705 containerd[2014]: 2025-08-19 08:16:51.040 [INFO][5021] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic1c3ccfd653 ContainerID="1aa64ce3fbaf5bd7b0d1b5bec0e733a8acf5b5f5ad80ee2e295f2465a4b2ecaf" Namespace="kube-system" Pod="coredns-7c65d6cfc9-5dnf2" WorkloadEndpoint="ip--172--31--23--28-k8s-coredns--7c65d6cfc9--5dnf2-eth0" Aug 19 08:16:51.084705 containerd[2014]: 2025-08-19 08:16:51.049 [INFO][5021] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1aa64ce3fbaf5bd7b0d1b5bec0e733a8acf5b5f5ad80ee2e295f2465a4b2ecaf" Namespace="kube-system" Pod="coredns-7c65d6cfc9-5dnf2" 
WorkloadEndpoint="ip--172--31--23--28-k8s-coredns--7c65d6cfc9--5dnf2-eth0" Aug 19 08:16:51.084705 containerd[2014]: 2025-08-19 08:16:51.050 [INFO][5021] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1aa64ce3fbaf5bd7b0d1b5bec0e733a8acf5b5f5ad80ee2e295f2465a4b2ecaf" Namespace="kube-system" Pod="coredns-7c65d6cfc9-5dnf2" WorkloadEndpoint="ip--172--31--23--28-k8s-coredns--7c65d6cfc9--5dnf2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--28-k8s-coredns--7c65d6cfc9--5dnf2-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"48eab14c-74c0-4682-912d-dd41c3aaf383", ResourceVersion:"806", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 16, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-28", ContainerID:"1aa64ce3fbaf5bd7b0d1b5bec0e733a8acf5b5f5ad80ee2e295f2465a4b2ecaf", Pod:"coredns-7c65d6cfc9-5dnf2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.80.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calic1c3ccfd653", MAC:"0a:09:52:04:ef:06", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:16:51.084705 containerd[2014]: 2025-08-19 08:16:51.077 [INFO][5021] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1aa64ce3fbaf5bd7b0d1b5bec0e733a8acf5b5f5ad80ee2e295f2465a4b2ecaf" Namespace="kube-system" Pod="coredns-7c65d6cfc9-5dnf2" WorkloadEndpoint="ip--172--31--23--28-k8s-coredns--7c65d6cfc9--5dnf2-eth0" Aug 19 08:16:51.164068 containerd[2014]: time="2025-08-19T08:16:51.162479414Z" level=info msg="connecting to shim 1aa64ce3fbaf5bd7b0d1b5bec0e733a8acf5b5f5ad80ee2e295f2465a4b2ecaf" address="unix:///run/containerd/s/e157b570b1f30e228f755d3559de9a5001dc1e8270d30635f06cf439b7f6b564" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:16:51.194770 systemd-networkd[1812]: cali0d8085b89a2: Link UP Aug 19 08:16:51.202315 systemd-networkd[1812]: cali0d8085b89a2: Gained carrier Aug 19 08:16:51.249623 systemd[1]: Started cri-containerd-1aa64ce3fbaf5bd7b0d1b5bec0e733a8acf5b5f5ad80ee2e295f2465a4b2ecaf.scope - libcontainer container 1aa64ce3fbaf5bd7b0d1b5bec0e733a8acf5b5f5ad80ee2e295f2465a4b2ecaf. 
Aug 19 08:16:51.253354 containerd[2014]: 2025-08-19 08:16:50.798 [INFO][5019] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Aug 19 08:16:51.253354 containerd[2014]: 2025-08-19 08:16:50.837 [INFO][5019] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--23--28-k8s-csi--node--driver--fhzgx-eth0 csi-node-driver- calico-system fd28c4a4-f3f6-4029-ab76-882058ba96b8 693 0 2025-08-19 08:16:24 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:57bd658777 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ip-172-31-23-28 csi-node-driver-fhzgx eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali0d8085b89a2 [] [] }} ContainerID="5cf9498aed6019b6e2e7525a3847c759df54ac95a855c1bf091db30e537d580c" Namespace="calico-system" Pod="csi-node-driver-fhzgx" WorkloadEndpoint="ip--172--31--23--28-k8s-csi--node--driver--fhzgx-" Aug 19 08:16:51.253354 containerd[2014]: 2025-08-19 08:16:50.837 [INFO][5019] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5cf9498aed6019b6e2e7525a3847c759df54ac95a855c1bf091db30e537d580c" Namespace="calico-system" Pod="csi-node-driver-fhzgx" WorkloadEndpoint="ip--172--31--23--28-k8s-csi--node--driver--fhzgx-eth0" Aug 19 08:16:51.253354 containerd[2014]: 2025-08-19 08:16:50.967 [INFO][5053] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5cf9498aed6019b6e2e7525a3847c759df54ac95a855c1bf091db30e537d580c" HandleID="k8s-pod-network.5cf9498aed6019b6e2e7525a3847c759df54ac95a855c1bf091db30e537d580c" Workload="ip--172--31--23--28-k8s-csi--node--driver--fhzgx-eth0" Aug 19 08:16:51.253354 containerd[2014]: 2025-08-19 08:16:50.967 [INFO][5053] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5cf9498aed6019b6e2e7525a3847c759df54ac95a855c1bf091db30e537d580c" HandleID="k8s-pod-network.5cf9498aed6019b6e2e7525a3847c759df54ac95a855c1bf091db30e537d580c" Workload="ip--172--31--23--28-k8s-csi--node--driver--fhzgx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000306300), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-23-28", "pod":"csi-node-driver-fhzgx", "timestamp":"2025-08-19 08:16:50.964912192 +0000 UTC"}, Hostname:"ip-172-31-23-28", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 08:16:51.253354 containerd[2014]: 2025-08-19 08:16:50.967 [INFO][5053] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 08:16:51.253354 containerd[2014]: 2025-08-19 08:16:51.034 [INFO][5053] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 19 08:16:51.253354 containerd[2014]: 2025-08-19 08:16:51.034 [INFO][5053] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-23-28' Aug 19 08:16:51.253354 containerd[2014]: 2025-08-19 08:16:51.058 [INFO][5053] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5cf9498aed6019b6e2e7525a3847c759df54ac95a855c1bf091db30e537d580c" host="ip-172-31-23-28" Aug 19 08:16:51.253354 containerd[2014]: 2025-08-19 08:16:51.073 [INFO][5053] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-23-28" Aug 19 08:16:51.253354 containerd[2014]: 2025-08-19 08:16:51.092 [INFO][5053] ipam/ipam.go 511: Trying affinity for 192.168.80.0/26 host="ip-172-31-23-28" Aug 19 08:16:51.253354 containerd[2014]: 2025-08-19 08:16:51.095 [INFO][5053] ipam/ipam.go 158: Attempting to load block cidr=192.168.80.0/26 host="ip-172-31-23-28" Aug 19 08:16:51.253354 containerd[2014]: 2025-08-19 08:16:51.103 [INFO][5053] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.80.0/26 host="ip-172-31-23-28" Aug 19 08:16:51.253354 containerd[2014]: 2025-08-19 08:16:51.103 [INFO][5053] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.80.0/26 handle="k8s-pod-network.5cf9498aed6019b6e2e7525a3847c759df54ac95a855c1bf091db30e537d580c" host="ip-172-31-23-28" Aug 19 08:16:51.253354 containerd[2014]: 2025-08-19 08:16:51.108 [INFO][5053] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.5cf9498aed6019b6e2e7525a3847c759df54ac95a855c1bf091db30e537d580c Aug 19 08:16:51.253354 containerd[2014]: 2025-08-19 08:16:51.130 [INFO][5053] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.80.0/26 handle="k8s-pod-network.5cf9498aed6019b6e2e7525a3847c759df54ac95a855c1bf091db30e537d580c" host="ip-172-31-23-28" Aug 19 08:16:51.253354 containerd[2014]: 2025-08-19 08:16:51.166 [INFO][5053] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.80.5/26] block=192.168.80.0/26 handle="k8s-pod-network.5cf9498aed6019b6e2e7525a3847c759df54ac95a855c1bf091db30e537d580c" host="ip-172-31-23-28" Aug 19 08:16:51.253354 containerd[2014]: 2025-08-19 08:16:51.166 [INFO][5053] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.80.5/26] handle="k8s-pod-network.5cf9498aed6019b6e2e7525a3847c759df54ac95a855c1bf091db30e537d580c" host="ip-172-31-23-28" Aug 19 08:16:51.253354 containerd[2014]: 2025-08-19 08:16:51.167 [INFO][5053] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 19 08:16:51.253354 containerd[2014]: 2025-08-19 08:16:51.167 [INFO][5053] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.80.5/26] IPv6=[] ContainerID="5cf9498aed6019b6e2e7525a3847c759df54ac95a855c1bf091db30e537d580c" HandleID="k8s-pod-network.5cf9498aed6019b6e2e7525a3847c759df54ac95a855c1bf091db30e537d580c" Workload="ip--172--31--23--28-k8s-csi--node--driver--fhzgx-eth0" Aug 19 08:16:51.256454 containerd[2014]: 2025-08-19 08:16:51.178 [INFO][5019] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5cf9498aed6019b6e2e7525a3847c759df54ac95a855c1bf091db30e537d580c" Namespace="calico-system" Pod="csi-node-driver-fhzgx" WorkloadEndpoint="ip--172--31--23--28-k8s-csi--node--driver--fhzgx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--28-k8s-csi--node--driver--fhzgx-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"fd28c4a4-f3f6-4029-ab76-882058ba96b8", ResourceVersion:"693", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 16, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"57bd658777", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-28", ContainerID:"", Pod:"csi-node-driver-fhzgx", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.80.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali0d8085b89a2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:16:51.256454 containerd[2014]: 2025-08-19 08:16:51.178 [INFO][5019] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.80.5/32] ContainerID="5cf9498aed6019b6e2e7525a3847c759df54ac95a855c1bf091db30e537d580c" Namespace="calico-system" Pod="csi-node-driver-fhzgx" WorkloadEndpoint="ip--172--31--23--28-k8s-csi--node--driver--fhzgx-eth0" Aug 19 08:16:51.256454 containerd[2014]: 2025-08-19 08:16:51.178 [INFO][5019] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0d8085b89a2 ContainerID="5cf9498aed6019b6e2e7525a3847c759df54ac95a855c1bf091db30e537d580c" Namespace="calico-system" Pod="csi-node-driver-fhzgx" WorkloadEndpoint="ip--172--31--23--28-k8s-csi--node--driver--fhzgx-eth0" Aug 19 08:16:51.256454 containerd[2014]: 2025-08-19 08:16:51.209 [INFO][5019] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5cf9498aed6019b6e2e7525a3847c759df54ac95a855c1bf091db30e537d580c" Namespace="calico-system" Pod="csi-node-driver-fhzgx" WorkloadEndpoint="ip--172--31--23--28-k8s-csi--node--driver--fhzgx-eth0" Aug 19 08:16:51.256454 containerd[2014]: 2025-08-19 08:16:51.210 [INFO][5019] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5cf9498aed6019b6e2e7525a3847c759df54ac95a855c1bf091db30e537d580c" 
Namespace="calico-system" Pod="csi-node-driver-fhzgx" WorkloadEndpoint="ip--172--31--23--28-k8s-csi--node--driver--fhzgx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--28-k8s-csi--node--driver--fhzgx-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"fd28c4a4-f3f6-4029-ab76-882058ba96b8", ResourceVersion:"693", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 16, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"57bd658777", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-28", ContainerID:"5cf9498aed6019b6e2e7525a3847c759df54ac95a855c1bf091db30e537d580c", Pod:"csi-node-driver-fhzgx", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.80.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali0d8085b89a2", MAC:"ee:00:c2:33:8a:98", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:16:51.256454 containerd[2014]: 2025-08-19 08:16:51.238 [INFO][5019] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5cf9498aed6019b6e2e7525a3847c759df54ac95a855c1bf091db30e537d580c" Namespace="calico-system" Pod="csi-node-driver-fhzgx" WorkloadEndpoint="ip--172--31--23--28-k8s-csi--node--driver--fhzgx-eth0" Aug 19 08:16:51.306155 containerd[2014]: time="2025-08-19T08:16:51.305234020Z" level=info msg="connecting to shim 5cf9498aed6019b6e2e7525a3847c759df54ac95a855c1bf091db30e537d580c" address="unix:///run/containerd/s/80e67bd57fb5ea1d749e06d802fd36c9b4bb6e4381b36c8000a2c83b89733303" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:16:51.361164 containerd[2014]: time="2025-08-19T08:16:51.360877609Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-5dnf2,Uid:48eab14c-74c0-4682-912d-dd41c3aaf383,Namespace:kube-system,Attempt:0,} returns sandbox id \"1aa64ce3fbaf5bd7b0d1b5bec0e733a8acf5b5f5ad80ee2e295f2465a4b2ecaf\"" Aug 19 08:16:51.376545 systemd[1]: Started cri-containerd-5cf9498aed6019b6e2e7525a3847c759df54ac95a855c1bf091db30e537d580c.scope - libcontainer container 5cf9498aed6019b6e2e7525a3847c759df54ac95a855c1bf091db30e537d580c. 
Aug 19 08:16:51.378813 containerd[2014]: time="2025-08-19T08:16:51.378722851Z" level=info msg="CreateContainer within sandbox \"1aa64ce3fbaf5bd7b0d1b5bec0e733a8acf5b5f5ad80ee2e295f2465a4b2ecaf\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Aug 19 08:16:51.409754 containerd[2014]: time="2025-08-19T08:16:51.409703049Z" level=info msg="Container 3ef821b773b80506920bad74c3a6d811e487d1a843a2e4538238b3b11650b626: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:16:51.428876 containerd[2014]: time="2025-08-19T08:16:51.428167918Z" level=info msg="CreateContainer within sandbox \"1aa64ce3fbaf5bd7b0d1b5bec0e733a8acf5b5f5ad80ee2e295f2465a4b2ecaf\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"3ef821b773b80506920bad74c3a6d811e487d1a843a2e4538238b3b11650b626\"" Aug 19 08:16:51.430869 containerd[2014]: time="2025-08-19T08:16:51.430832104Z" level=info msg="StartContainer for \"3ef821b773b80506920bad74c3a6d811e487d1a843a2e4538238b3b11650b626\"" Aug 19 08:16:51.433493 containerd[2014]: time="2025-08-19T08:16:51.433338159Z" level=info msg="connecting to shim 3ef821b773b80506920bad74c3a6d811e487d1a843a2e4538238b3b11650b626" address="unix:///run/containerd/s/e157b570b1f30e228f755d3559de9a5001dc1e8270d30635f06cf439b7f6b564" protocol=ttrpc version=3 Aug 19 08:16:51.486060 containerd[2014]: time="2025-08-19T08:16:51.485954136Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fhzgx,Uid:fd28c4a4-f3f6-4029-ab76-882058ba96b8,Namespace:calico-system,Attempt:0,} returns sandbox id \"5cf9498aed6019b6e2e7525a3847c759df54ac95a855c1bf091db30e537d580c\"" Aug 19 08:16:51.488340 systemd[1]: Started cri-containerd-3ef821b773b80506920bad74c3a6d811e487d1a843a2e4538238b3b11650b626.scope - libcontainer container 3ef821b773b80506920bad74c3a6d811e487d1a843a2e4538238b3b11650b626. Aug 19 08:16:51.616708 containerd[2014]: time="2025-08-19T08:16:51.616008397Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-8hp9n,Uid:407658c8-3f54-45e6-834a-4675e288a13c,Namespace:kube-system,Attempt:0,}" Aug 19 08:16:51.616708 containerd[2014]: time="2025-08-19T08:16:51.616103695Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-85b4b76999-hwqxm,Uid:97b6de00-458f-491a-a402-be43782ea6a4,Namespace:calico-system,Attempt:0,}" Aug 19 08:16:51.616708 containerd[2014]: time="2025-08-19T08:16:51.616224912Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f4577c465-x44gs,Uid:234924fd-06d9-4c69-a752-1aa0b5b48653,Namespace:calico-apiserver,Attempt:0,}" Aug 19 08:16:51.629824 containerd[2014]: time="2025-08-19T08:16:51.629189626Z" level=info msg="StartContainer for \"3ef821b773b80506920bad74c3a6d811e487d1a843a2e4538238b3b11650b626\" returns successfully" Aug 19 08:16:51.647373 systemd-networkd[1812]: cali3f6846b2ef4: Gained IPv6LL Aug 19 08:16:51.841215 systemd-networkd[1812]: calica5c8950800: Gained IPv6LL Aug 19 08:16:51.848311 systemd[1]: Started sshd@7-172.31.23.28:22-139.178.89.65:59320.service - OpenSSH per-connection server daemon (139.178.89.65:59320). 
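The surrounding records show the usual sandbox-then-container ordering: RunPodSandbox returns a sandbox ID, CreateContainer is issued within that sandbox and returns a container ID, and StartContainer launches it (after which containerd connects to the shim). The sketch below is only a schematic of that call order; the Runtime interface and fakeRuntime are invented stand-ins, not the real CRI client the kubelet uses over gRPC.

```go
// Schematic of the CRI-style call order visible in the log:
// RunPodSandbox -> CreateContainer -> StartContainer.
// Runtime and fakeRuntime are invented stand-ins for illustration only.
package main

import "fmt"

type Runtime interface {
	RunPodSandbox(podName string) (sandboxID string, err error)
	CreateContainer(sandboxID, containerName, image string) (containerID string, err error)
	StartContainer(containerID string) error
}

type fakeRuntime struct{ n int }

func (f *fakeRuntime) RunPodSandbox(pod string) (string, error) {
	f.n++
	return fmt.Sprintf("sandbox-%d-%s", f.n, pod), nil
}

func (f *fakeRuntime) CreateContainer(sb, name, image string) (string, error) {
	return "ctr-" + name + "-in-" + sb, nil
}

func (f *fakeRuntime) StartContainer(id string) error {
	fmt.Println("started", id)
	return nil
}

// startPod mirrors the sequence the kubelet drives for coredns above.
func startPod(rt Runtime, pod, ctr, image string) error {
	sb, err := rt.RunPodSandbox(pod)
	if err != nil {
		return err
	}
	id, err := rt.CreateContainer(sb, ctr, image)
	if err != nil {
		return err
	}
	return rt.StartContainer(id)
}

func main() {
	_ = startPod(&fakeRuntime{}, "coredns-7c65d6cfc9-5dnf2", "coredns", "coredns-image")
}
```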
Aug 19 08:16:52.050322 kubelet[3281]: I0819 08:16:52.049925 3281 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-5dnf2" podStartSLOduration=46.049895414 podStartE2EDuration="46.049895414s" podCreationTimestamp="2025-08-19 08:16:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 08:16:52.037411527 +0000 UTC m=+51.588247867" watchObservedRunningTime="2025-08-19 08:16:52.049895414 +0000 UTC m=+51.600731752" Aug 19 08:16:52.203520 sshd[5246]: Accepted publickey for core from 139.178.89.65 port 59320 ssh2: RSA SHA256:SANMFU2cnK7/l5OXguT20pR9hK/sN9KMMvxRhI/63O4 Aug 19 08:16:52.209936 sshd-session[5246]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:16:52.221928 systemd-networkd[1812]: calic1c3ccfd653: Gained IPv6LL Aug 19 08:16:52.236894 systemd[1]: Started session-8.scope - Session 8 of User core. Aug 19 08:16:52.237952 systemd-logind[1976]: New session 8 of user core. Aug 19 08:16:52.240586 systemd-networkd[1812]: cali6331509f853: Link UP Aug 19 08:16:52.240809 systemd-networkd[1812]: cali6331509f853: Gained carrier Aug 19 08:16:52.320135 containerd[2014]: 2025-08-19 08:16:51.776 [INFO][5204] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Aug 19 08:16:52.320135 containerd[2014]: 2025-08-19 08:16:51.813 [INFO][5204] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--23--28-k8s-coredns--7c65d6cfc9--8hp9n-eth0 coredns-7c65d6cfc9- kube-system 407658c8-3f54-45e6-834a-4675e288a13c 801 0 2025-08-19 08:16:06 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ip-172-31-23-28 coredns-7c65d6cfc9-8hp9n eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali6331509f853 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="af9d8be75cdbccbb43826f1eedd2f276cc642ac788d8134301684b214dace609" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8hp9n" WorkloadEndpoint="ip--172--31--23--28-k8s-coredns--7c65d6cfc9--8hp9n-" Aug 19 08:16:52.320135 containerd[2014]: 2025-08-19 08:16:51.814 [INFO][5204] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="af9d8be75cdbccbb43826f1eedd2f276cc642ac788d8134301684b214dace609" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8hp9n" WorkloadEndpoint="ip--172--31--23--28-k8s-coredns--7c65d6cfc9--8hp9n-eth0" Aug 19 08:16:52.320135 containerd[2014]: 2025-08-19 08:16:52.057 [INFO][5245] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="af9d8be75cdbccbb43826f1eedd2f276cc642ac788d8134301684b214dace609" HandleID="k8s-pod-network.af9d8be75cdbccbb43826f1eedd2f276cc642ac788d8134301684b214dace609" Workload="ip--172--31--23--28-k8s-coredns--7c65d6cfc9--8hp9n-eth0" Aug 19 08:16:52.320135 containerd[2014]: 2025-08-19 08:16:52.069 [INFO][5245] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="af9d8be75cdbccbb43826f1eedd2f276cc642ac788d8134301684b214dace609" HandleID="k8s-pod-network.af9d8be75cdbccbb43826f1eedd2f276cc642ac788d8134301684b214dace609" Workload="ip--172--31--23--28-k8s-coredns--7c65d6cfc9--8hp9n-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003329b0), Attrs:map[string]string{"namespace":"kube-system", "node":"ip-172-31-23-28", 
"pod":"coredns-7c65d6cfc9-8hp9n", "timestamp":"2025-08-19 08:16:52.057654942 +0000 UTC"}, Hostname:"ip-172-31-23-28", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 08:16:52.320135 containerd[2014]: 2025-08-19 08:16:52.071 [INFO][5245] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 08:16:52.320135 containerd[2014]: 2025-08-19 08:16:52.073 [INFO][5245] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 08:16:52.320135 containerd[2014]: 2025-08-19 08:16:52.073 [INFO][5245] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-23-28' Aug 19 08:16:52.320135 containerd[2014]: 2025-08-19 08:16:52.139 [INFO][5245] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.af9d8be75cdbccbb43826f1eedd2f276cc642ac788d8134301684b214dace609" host="ip-172-31-23-28" Aug 19 08:16:52.320135 containerd[2014]: 2025-08-19 08:16:52.145 [INFO][5245] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-23-28" Aug 19 08:16:52.320135 containerd[2014]: 2025-08-19 08:16:52.151 [INFO][5245] ipam/ipam.go 511: Trying affinity for 192.168.80.0/26 host="ip-172-31-23-28" Aug 19 08:16:52.320135 containerd[2014]: 2025-08-19 08:16:52.154 [INFO][5245] ipam/ipam.go 158: Attempting to load block cidr=192.168.80.0/26 host="ip-172-31-23-28" Aug 19 08:16:52.320135 containerd[2014]: 2025-08-19 08:16:52.160 [INFO][5245] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.80.0/26 host="ip-172-31-23-28" Aug 19 08:16:52.320135 containerd[2014]: 2025-08-19 08:16:52.160 [INFO][5245] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.80.0/26 handle="k8s-pod-network.af9d8be75cdbccbb43826f1eedd2f276cc642ac788d8134301684b214dace609" host="ip-172-31-23-28" Aug 19 08:16:52.320135 containerd[2014]: 2025-08-19 08:16:52.168 [INFO][5245] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.af9d8be75cdbccbb43826f1eedd2f276cc642ac788d8134301684b214dace609 Aug 19 08:16:52.320135 containerd[2014]: 2025-08-19 08:16:52.184 [INFO][5245] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.80.0/26 handle="k8s-pod-network.af9d8be75cdbccbb43826f1eedd2f276cc642ac788d8134301684b214dace609" host="ip-172-31-23-28" Aug 19 08:16:52.320135 containerd[2014]: 2025-08-19 08:16:52.206 [INFO][5245] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.80.6/26] block=192.168.80.0/26 handle="k8s-pod-network.af9d8be75cdbccbb43826f1eedd2f276cc642ac788d8134301684b214dace609" host="ip-172-31-23-28" Aug 19 08:16:52.320135 containerd[2014]: 2025-08-19 08:16:52.206 [INFO][5245] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.80.6/26] handle="k8s-pod-network.af9d8be75cdbccbb43826f1eedd2f276cc642ac788d8134301684b214dace609" host="ip-172-31-23-28" Aug 19 08:16:52.320135 containerd[2014]: 2025-08-19 08:16:52.206 [INFO][5245] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 19 08:16:52.320135 containerd[2014]: 2025-08-19 08:16:52.206 [INFO][5245] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.80.6/26] IPv6=[] ContainerID="af9d8be75cdbccbb43826f1eedd2f276cc642ac788d8134301684b214dace609" HandleID="k8s-pod-network.af9d8be75cdbccbb43826f1eedd2f276cc642ac788d8134301684b214dace609" Workload="ip--172--31--23--28-k8s-coredns--7c65d6cfc9--8hp9n-eth0" Aug 19 08:16:52.323263 containerd[2014]: 2025-08-19 08:16:52.223 [INFO][5204] cni-plugin/k8s.go 418: Populated endpoint ContainerID="af9d8be75cdbccbb43826f1eedd2f276cc642ac788d8134301684b214dace609" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8hp9n" WorkloadEndpoint="ip--172--31--23--28-k8s-coredns--7c65d6cfc9--8hp9n-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--28-k8s-coredns--7c65d6cfc9--8hp9n-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"407658c8-3f54-45e6-834a-4675e288a13c", ResourceVersion:"801", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 16, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-28", ContainerID:"", Pod:"coredns-7c65d6cfc9-8hp9n", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.80.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali6331509f853", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:16:52.323263 containerd[2014]: 2025-08-19 08:16:52.231 [INFO][5204] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.80.6/32] ContainerID="af9d8be75cdbccbb43826f1eedd2f276cc642ac788d8134301684b214dace609" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8hp9n" WorkloadEndpoint="ip--172--31--23--28-k8s-coredns--7c65d6cfc9--8hp9n-eth0" Aug 19 08:16:52.323263 containerd[2014]: 2025-08-19 08:16:52.231 [INFO][5204] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6331509f853 ContainerID="af9d8be75cdbccbb43826f1eedd2f276cc642ac788d8134301684b214dace609" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8hp9n" WorkloadEndpoint="ip--172--31--23--28-k8s-coredns--7c65d6cfc9--8hp9n-eth0" Aug 19 08:16:52.323263 containerd[2014]: 2025-08-19 08:16:52.239 [INFO][5204] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="af9d8be75cdbccbb43826f1eedd2f276cc642ac788d8134301684b214dace609" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8hp9n" 
WorkloadEndpoint="ip--172--31--23--28-k8s-coredns--7c65d6cfc9--8hp9n-eth0" Aug 19 08:16:52.323263 containerd[2014]: 2025-08-19 08:16:52.269 [INFO][5204] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="af9d8be75cdbccbb43826f1eedd2f276cc642ac788d8134301684b214dace609" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8hp9n" WorkloadEndpoint="ip--172--31--23--28-k8s-coredns--7c65d6cfc9--8hp9n-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--28-k8s-coredns--7c65d6cfc9--8hp9n-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"407658c8-3f54-45e6-834a-4675e288a13c", ResourceVersion:"801", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 16, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-28", ContainerID:"af9d8be75cdbccbb43826f1eedd2f276cc642ac788d8134301684b214dace609", Pod:"coredns-7c65d6cfc9-8hp9n", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.80.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali6331509f853", MAC:"6e:f8:74:77:61:27", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:16:52.323263 containerd[2014]: 2025-08-19 08:16:52.311 [INFO][5204] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="af9d8be75cdbccbb43826f1eedd2f276cc642ac788d8134301684b214dace609" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8hp9n" WorkloadEndpoint="ip--172--31--23--28-k8s-coredns--7c65d6cfc9--8hp9n-eth0" Aug 19 08:16:52.476473 systemd-networkd[1812]: calia12e5bce282: Link UP Aug 19 08:16:52.477520 systemd-networkd[1812]: calia12e5bce282: Gained carrier Aug 19 08:16:52.508578 containerd[2014]: time="2025-08-19T08:16:52.508512733Z" level=info msg="connecting to shim af9d8be75cdbccbb43826f1eedd2f276cc642ac788d8134301684b214dace609" address="unix:///run/containerd/s/3178d3a66ccb9573cf166bc00b0de508bd57e13d64b3524ea9ae77d2fac0c965" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:16:52.538487 containerd[2014]: 2025-08-19 08:16:51.793 [INFO][5221] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Aug 19 08:16:52.538487 containerd[2014]: 2025-08-19 08:16:51.861 [INFO][5221] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--23--28-k8s-calico--apiserver--f4577c465--x44gs-eth0 calico-apiserver-f4577c465- 
calico-apiserver 234924fd-06d9-4c69-a752-1aa0b5b48653 802 0 2025-08-19 08:16:21 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:f4577c465 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ip-172-31-23-28 calico-apiserver-f4577c465-x44gs eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calia12e5bce282 [] [] }} ContainerID="1dfd005341da2f973863c5ac454a26c26ea6da433fb4de642c3ca6dabb416d8f" Namespace="calico-apiserver" Pod="calico-apiserver-f4577c465-x44gs" WorkloadEndpoint="ip--172--31--23--28-k8s-calico--apiserver--f4577c465--x44gs-" Aug 19 08:16:52.538487 containerd[2014]: 2025-08-19 08:16:51.862 [INFO][5221] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1dfd005341da2f973863c5ac454a26c26ea6da433fb4de642c3ca6dabb416d8f" Namespace="calico-apiserver" Pod="calico-apiserver-f4577c465-x44gs" WorkloadEndpoint="ip--172--31--23--28-k8s-calico--apiserver--f4577c465--x44gs-eth0" Aug 19 08:16:52.538487 containerd[2014]: 2025-08-19 08:16:52.098 [INFO][5253] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1dfd005341da2f973863c5ac454a26c26ea6da433fb4de642c3ca6dabb416d8f" HandleID="k8s-pod-network.1dfd005341da2f973863c5ac454a26c26ea6da433fb4de642c3ca6dabb416d8f" Workload="ip--172--31--23--28-k8s-calico--apiserver--f4577c465--x44gs-eth0" Aug 19 08:16:52.538487 containerd[2014]: 2025-08-19 08:16:52.099 [INFO][5253] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1dfd005341da2f973863c5ac454a26c26ea6da433fb4de642c3ca6dabb416d8f" HandleID="k8s-pod-network.1dfd005341da2f973863c5ac454a26c26ea6da433fb4de642c3ca6dabb416d8f" Workload="ip--172--31--23--28-k8s-calico--apiserver--f4577c465--x44gs-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000125a90), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ip-172-31-23-28", "pod":"calico-apiserver-f4577c465-x44gs", "timestamp":"2025-08-19 08:16:52.0989047 +0000 UTC"}, Hostname:"ip-172-31-23-28", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 08:16:52.538487 containerd[2014]: 2025-08-19 08:16:52.099 [INFO][5253] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 08:16:52.538487 containerd[2014]: 2025-08-19 08:16:52.209 [INFO][5253] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
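The coredns-7c65d6cfc9-8hp9n WorkloadEndpoint dumped a couple of records back prints its ports in Go hex notation (Port:0x35, Port:0x23c1); those are simply 53 and 9153, i.e. the dns, dns-tcp and metrics ports. A tiny decode:

```go
// Decode the hex port numbers printed in the WorkloadEndpointPort dump above
// (Port:0x35, Port:0x23c1) back into familiar decimal values.
package main

import "fmt"

func main() {
	ports := map[string]uint16{
		"dns (UDP)":     0x35,   // 53
		"dns-tcp (TCP)": 0x35,   // 53
		"metrics (TCP)": 0x23c1, // 9153
	}
	for name, p := range ports {
		fmt.Printf("%-14s -> %d\n", name, p)
	}
}
```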
Aug 19 08:16:52.538487 containerd[2014]: 2025-08-19 08:16:52.209 [INFO][5253] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-23-28' Aug 19 08:16:52.538487 containerd[2014]: 2025-08-19 08:16:52.289 [INFO][5253] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1dfd005341da2f973863c5ac454a26c26ea6da433fb4de642c3ca6dabb416d8f" host="ip-172-31-23-28" Aug 19 08:16:52.538487 containerd[2014]: 2025-08-19 08:16:52.303 [INFO][5253] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-23-28" Aug 19 08:16:52.538487 containerd[2014]: 2025-08-19 08:16:52.325 [INFO][5253] ipam/ipam.go 511: Trying affinity for 192.168.80.0/26 host="ip-172-31-23-28" Aug 19 08:16:52.538487 containerd[2014]: 2025-08-19 08:16:52.336 [INFO][5253] ipam/ipam.go 158: Attempting to load block cidr=192.168.80.0/26 host="ip-172-31-23-28" Aug 19 08:16:52.538487 containerd[2014]: 2025-08-19 08:16:52.343 [INFO][5253] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.80.0/26 host="ip-172-31-23-28" Aug 19 08:16:52.538487 containerd[2014]: 2025-08-19 08:16:52.344 [INFO][5253] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.80.0/26 handle="k8s-pod-network.1dfd005341da2f973863c5ac454a26c26ea6da433fb4de642c3ca6dabb416d8f" host="ip-172-31-23-28" Aug 19 08:16:52.538487 containerd[2014]: 2025-08-19 08:16:52.349 [INFO][5253] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.1dfd005341da2f973863c5ac454a26c26ea6da433fb4de642c3ca6dabb416d8f Aug 19 08:16:52.538487 containerd[2014]: 2025-08-19 08:16:52.411 [INFO][5253] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.80.0/26 handle="k8s-pod-network.1dfd005341da2f973863c5ac454a26c26ea6da433fb4de642c3ca6dabb416d8f" host="ip-172-31-23-28" Aug 19 08:16:52.538487 containerd[2014]: 2025-08-19 08:16:52.447 [INFO][5253] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.80.7/26] block=192.168.80.0/26 handle="k8s-pod-network.1dfd005341da2f973863c5ac454a26c26ea6da433fb4de642c3ca6dabb416d8f" host="ip-172-31-23-28" Aug 19 08:16:52.538487 containerd[2014]: 2025-08-19 08:16:52.449 [INFO][5253] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.80.7/26] handle="k8s-pod-network.1dfd005341da2f973863c5ac454a26c26ea6da433fb4de642c3ca6dabb416d8f" host="ip-172-31-23-28" Aug 19 08:16:52.538487 containerd[2014]: 2025-08-19 08:16:52.449 [INFO][5253] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 19 08:16:52.538487 containerd[2014]: 2025-08-19 08:16:52.449 [INFO][5253] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.80.7/26] IPv6=[] ContainerID="1dfd005341da2f973863c5ac454a26c26ea6da433fb4de642c3ca6dabb416d8f" HandleID="k8s-pod-network.1dfd005341da2f973863c5ac454a26c26ea6da433fb4de642c3ca6dabb416d8f" Workload="ip--172--31--23--28-k8s-calico--apiserver--f4577c465--x44gs-eth0" Aug 19 08:16:52.543759 containerd[2014]: 2025-08-19 08:16:52.463 [INFO][5221] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1dfd005341da2f973863c5ac454a26c26ea6da433fb4de642c3ca6dabb416d8f" Namespace="calico-apiserver" Pod="calico-apiserver-f4577c465-x44gs" WorkloadEndpoint="ip--172--31--23--28-k8s-calico--apiserver--f4577c465--x44gs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--28-k8s-calico--apiserver--f4577c465--x44gs-eth0", GenerateName:"calico-apiserver-f4577c465-", Namespace:"calico-apiserver", SelfLink:"", UID:"234924fd-06d9-4c69-a752-1aa0b5b48653", ResourceVersion:"802", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 16, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"f4577c465", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-28", ContainerID:"", Pod:"calico-apiserver-f4577c465-x44gs", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.80.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia12e5bce282", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:16:52.543759 containerd[2014]: 2025-08-19 08:16:52.464 [INFO][5221] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.80.7/32] ContainerID="1dfd005341da2f973863c5ac454a26c26ea6da433fb4de642c3ca6dabb416d8f" Namespace="calico-apiserver" Pod="calico-apiserver-f4577c465-x44gs" WorkloadEndpoint="ip--172--31--23--28-k8s-calico--apiserver--f4577c465--x44gs-eth0" Aug 19 08:16:52.543759 containerd[2014]: 2025-08-19 08:16:52.464 [INFO][5221] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia12e5bce282 ContainerID="1dfd005341da2f973863c5ac454a26c26ea6da433fb4de642c3ca6dabb416d8f" Namespace="calico-apiserver" Pod="calico-apiserver-f4577c465-x44gs" WorkloadEndpoint="ip--172--31--23--28-k8s-calico--apiserver--f4577c465--x44gs-eth0" Aug 19 08:16:52.543759 containerd[2014]: 2025-08-19 08:16:52.478 [INFO][5221] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1dfd005341da2f973863c5ac454a26c26ea6da433fb4de642c3ca6dabb416d8f" Namespace="calico-apiserver" Pod="calico-apiserver-f4577c465-x44gs" WorkloadEndpoint="ip--172--31--23--28-k8s-calico--apiserver--f4577c465--x44gs-eth0" Aug 19 08:16:52.543759 containerd[2014]: 2025-08-19 08:16:52.482 [INFO][5221] cni-plugin/k8s.go 446: Added Mac, interface name, and active container 
ID to endpoint ContainerID="1dfd005341da2f973863c5ac454a26c26ea6da433fb4de642c3ca6dabb416d8f" Namespace="calico-apiserver" Pod="calico-apiserver-f4577c465-x44gs" WorkloadEndpoint="ip--172--31--23--28-k8s-calico--apiserver--f4577c465--x44gs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--28-k8s-calico--apiserver--f4577c465--x44gs-eth0", GenerateName:"calico-apiserver-f4577c465-", Namespace:"calico-apiserver", SelfLink:"", UID:"234924fd-06d9-4c69-a752-1aa0b5b48653", ResourceVersion:"802", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 16, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"f4577c465", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-28", ContainerID:"1dfd005341da2f973863c5ac454a26c26ea6da433fb4de642c3ca6dabb416d8f", Pod:"calico-apiserver-f4577c465-x44gs", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.80.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia12e5bce282", MAC:"be:23:d4:fc:06:42", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:16:52.543759 containerd[2014]: 2025-08-19 08:16:52.528 [INFO][5221] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1dfd005341da2f973863c5ac454a26c26ea6da433fb4de642c3ca6dabb416d8f" Namespace="calico-apiserver" Pod="calico-apiserver-f4577c465-x44gs" WorkloadEndpoint="ip--172--31--23--28-k8s-calico--apiserver--f4577c465--x44gs-eth0" Aug 19 08:16:52.692276 containerd[2014]: time="2025-08-19T08:16:52.691021666Z" level=info msg="connecting to shim 1dfd005341da2f973863c5ac454a26c26ea6da433fb4de642c3ca6dabb416d8f" address="unix:///run/containerd/s/cbf12d083cdb209ec02990741279f7bbf54b568243f5fef22be8b66baec98aeb" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:16:52.731248 systemd[1]: Started cri-containerd-af9d8be75cdbccbb43826f1eedd2f276cc642ac788d8134301684b214dace609.scope - libcontainer container af9d8be75cdbccbb43826f1eedd2f276cc642ac788d8134301684b214dace609. Aug 19 08:16:52.782204 systemd-networkd[1812]: cali5acf9f15998: Link UP Aug 19 08:16:52.785063 systemd-networkd[1812]: cali5acf9f15998: Gained carrier Aug 19 08:16:52.854633 systemd[1]: Started cri-containerd-1dfd005341da2f973863c5ac454a26c26ea6da433fb4de642c3ca6dabb416d8f.scope - libcontainer container 1dfd005341da2f973863c5ac454a26c26ea6da433fb4de642c3ca6dabb416d8f. 
Aug 19 08:16:52.864510 containerd[2014]: 2025-08-19 08:16:51.786 [INFO][5212] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Aug 19 08:16:52.864510 containerd[2014]: 2025-08-19 08:16:51.828 [INFO][5212] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--23--28-k8s-calico--kube--controllers--85b4b76999--hwqxm-eth0 calico-kube-controllers-85b4b76999- calico-system 97b6de00-458f-491a-a402-be43782ea6a4 808 0 2025-08-19 08:16:25 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:85b4b76999 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ip-172-31-23-28 calico-kube-controllers-85b4b76999-hwqxm eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali5acf9f15998 [] [] }} ContainerID="38422b7260b551aaaf5fac52ef714b397effa4c49dfb1cc24c6179f9d98c65bf" Namespace="calico-system" Pod="calico-kube-controllers-85b4b76999-hwqxm" WorkloadEndpoint="ip--172--31--23--28-k8s-calico--kube--controllers--85b4b76999--hwqxm-" Aug 19 08:16:52.864510 containerd[2014]: 2025-08-19 08:16:51.830 [INFO][5212] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="38422b7260b551aaaf5fac52ef714b397effa4c49dfb1cc24c6179f9d98c65bf" Namespace="calico-system" Pod="calico-kube-controllers-85b4b76999-hwqxm" WorkloadEndpoint="ip--172--31--23--28-k8s-calico--kube--controllers--85b4b76999--hwqxm-eth0" Aug 19 08:16:52.864510 containerd[2014]: 2025-08-19 08:16:52.108 [INFO][5251] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="38422b7260b551aaaf5fac52ef714b397effa4c49dfb1cc24c6179f9d98c65bf" HandleID="k8s-pod-network.38422b7260b551aaaf5fac52ef714b397effa4c49dfb1cc24c6179f9d98c65bf" Workload="ip--172--31--23--28-k8s-calico--kube--controllers--85b4b76999--hwqxm-eth0" Aug 19 08:16:52.864510 containerd[2014]: 2025-08-19 08:16:52.109 [INFO][5251] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="38422b7260b551aaaf5fac52ef714b397effa4c49dfb1cc24c6179f9d98c65bf" HandleID="k8s-pod-network.38422b7260b551aaaf5fac52ef714b397effa4c49dfb1cc24c6179f9d98c65bf" Workload="ip--172--31--23--28-k8s-calico--kube--controllers--85b4b76999--hwqxm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000103e30), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-23-28", "pod":"calico-kube-controllers-85b4b76999-hwqxm", "timestamp":"2025-08-19 08:16:52.106010186 +0000 UTC"}, Hostname:"ip-172-31-23-28", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 08:16:52.864510 containerd[2014]: 2025-08-19 08:16:52.109 [INFO][5251] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 08:16:52.864510 containerd[2014]: 2025-08-19 08:16:52.451 [INFO][5251] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 19 08:16:52.864510 containerd[2014]: 2025-08-19 08:16:52.451 [INFO][5251] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-23-28' Aug 19 08:16:52.864510 containerd[2014]: 2025-08-19 08:16:52.497 [INFO][5251] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.38422b7260b551aaaf5fac52ef714b397effa4c49dfb1cc24c6179f9d98c65bf" host="ip-172-31-23-28" Aug 19 08:16:52.864510 containerd[2014]: 2025-08-19 08:16:52.554 [INFO][5251] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-23-28" Aug 19 08:16:52.864510 containerd[2014]: 2025-08-19 08:16:52.596 [INFO][5251] ipam/ipam.go 511: Trying affinity for 192.168.80.0/26 host="ip-172-31-23-28" Aug 19 08:16:52.864510 containerd[2014]: 2025-08-19 08:16:52.607 [INFO][5251] ipam/ipam.go 158: Attempting to load block cidr=192.168.80.0/26 host="ip-172-31-23-28" Aug 19 08:16:52.864510 containerd[2014]: 2025-08-19 08:16:52.625 [INFO][5251] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.80.0/26 host="ip-172-31-23-28" Aug 19 08:16:52.864510 containerd[2014]: 2025-08-19 08:16:52.626 [INFO][5251] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.80.0/26 handle="k8s-pod-network.38422b7260b551aaaf5fac52ef714b397effa4c49dfb1cc24c6179f9d98c65bf" host="ip-172-31-23-28" Aug 19 08:16:52.864510 containerd[2014]: 2025-08-19 08:16:52.642 [INFO][5251] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.38422b7260b551aaaf5fac52ef714b397effa4c49dfb1cc24c6179f9d98c65bf Aug 19 08:16:52.864510 containerd[2014]: 2025-08-19 08:16:52.695 [INFO][5251] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.80.0/26 handle="k8s-pod-network.38422b7260b551aaaf5fac52ef714b397effa4c49dfb1cc24c6179f9d98c65bf" host="ip-172-31-23-28" Aug 19 08:16:52.864510 containerd[2014]: 2025-08-19 08:16:52.743 [INFO][5251] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.80.8/26] block=192.168.80.0/26 handle="k8s-pod-network.38422b7260b551aaaf5fac52ef714b397effa4c49dfb1cc24c6179f9d98c65bf" host="ip-172-31-23-28" Aug 19 08:16:52.864510 containerd[2014]: 2025-08-19 08:16:52.744 [INFO][5251] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.80.8/26] handle="k8s-pod-network.38422b7260b551aaaf5fac52ef714b397effa4c49dfb1cc24c6179f9d98c65bf" host="ip-172-31-23-28" Aug 19 08:16:52.864510 containerd[2014]: 2025-08-19 08:16:52.745 [INFO][5251] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 19 08:16:52.864510 containerd[2014]: 2025-08-19 08:16:52.745 [INFO][5251] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.80.8/26] IPv6=[] ContainerID="38422b7260b551aaaf5fac52ef714b397effa4c49dfb1cc24c6179f9d98c65bf" HandleID="k8s-pod-network.38422b7260b551aaaf5fac52ef714b397effa4c49dfb1cc24c6179f9d98c65bf" Workload="ip--172--31--23--28-k8s-calico--kube--controllers--85b4b76999--hwqxm-eth0" Aug 19 08:16:52.866992 containerd[2014]: 2025-08-19 08:16:52.776 [INFO][5212] cni-plugin/k8s.go 418: Populated endpoint ContainerID="38422b7260b551aaaf5fac52ef714b397effa4c49dfb1cc24c6179f9d98c65bf" Namespace="calico-system" Pod="calico-kube-controllers-85b4b76999-hwqxm" WorkloadEndpoint="ip--172--31--23--28-k8s-calico--kube--controllers--85b4b76999--hwqxm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--28-k8s-calico--kube--controllers--85b4b76999--hwqxm-eth0", GenerateName:"calico-kube-controllers-85b4b76999-", Namespace:"calico-system", SelfLink:"", UID:"97b6de00-458f-491a-a402-be43782ea6a4", ResourceVersion:"808", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 16, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"85b4b76999", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-28", ContainerID:"", Pod:"calico-kube-controllers-85b4b76999-hwqxm", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.80.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali5acf9f15998", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:16:52.866992 containerd[2014]: 2025-08-19 08:16:52.776 [INFO][5212] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.80.8/32] ContainerID="38422b7260b551aaaf5fac52ef714b397effa4c49dfb1cc24c6179f9d98c65bf" Namespace="calico-system" Pod="calico-kube-controllers-85b4b76999-hwqxm" WorkloadEndpoint="ip--172--31--23--28-k8s-calico--kube--controllers--85b4b76999--hwqxm-eth0" Aug 19 08:16:52.866992 containerd[2014]: 2025-08-19 08:16:52.776 [INFO][5212] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5acf9f15998 ContainerID="38422b7260b551aaaf5fac52ef714b397effa4c49dfb1cc24c6179f9d98c65bf" Namespace="calico-system" Pod="calico-kube-controllers-85b4b76999-hwqxm" WorkloadEndpoint="ip--172--31--23--28-k8s-calico--kube--controllers--85b4b76999--hwqxm-eth0" Aug 19 08:16:52.866992 containerd[2014]: 2025-08-19 08:16:52.784 [INFO][5212] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="38422b7260b551aaaf5fac52ef714b397effa4c49dfb1cc24c6179f9d98c65bf" Namespace="calico-system" Pod="calico-kube-controllers-85b4b76999-hwqxm" WorkloadEndpoint="ip--172--31--23--28-k8s-calico--kube--controllers--85b4b76999--hwqxm-eth0" Aug 19 08:16:52.866992 containerd[2014]: 2025-08-19 
08:16:52.789 [INFO][5212] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="38422b7260b551aaaf5fac52ef714b397effa4c49dfb1cc24c6179f9d98c65bf" Namespace="calico-system" Pod="calico-kube-controllers-85b4b76999-hwqxm" WorkloadEndpoint="ip--172--31--23--28-k8s-calico--kube--controllers--85b4b76999--hwqxm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--23--28-k8s-calico--kube--controllers--85b4b76999--hwqxm-eth0", GenerateName:"calico-kube-controllers-85b4b76999-", Namespace:"calico-system", SelfLink:"", UID:"97b6de00-458f-491a-a402-be43782ea6a4", ResourceVersion:"808", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 16, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"85b4b76999", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-23-28", ContainerID:"38422b7260b551aaaf5fac52ef714b397effa4c49dfb1cc24c6179f9d98c65bf", Pod:"calico-kube-controllers-85b4b76999-hwqxm", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.80.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali5acf9f15998", MAC:"32:18:9d:1a:a4:56", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:16:52.866992 containerd[2014]: 2025-08-19 08:16:52.835 [INFO][5212] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="38422b7260b551aaaf5fac52ef714b397effa4c49dfb1cc24c6179f9d98c65bf" Namespace="calico-system" Pod="calico-kube-controllers-85b4b76999-hwqxm" WorkloadEndpoint="ip--172--31--23--28-k8s-calico--kube--controllers--85b4b76999--hwqxm-eth0" Aug 19 08:16:52.976678 containerd[2014]: time="2025-08-19T08:16:52.974685358Z" level=info msg="connecting to shim 38422b7260b551aaaf5fac52ef714b397effa4c49dfb1cc24c6179f9d98c65bf" address="unix:///run/containerd/s/7faad9f90e1880c177abf22fd13e00a22b3519319785d5020ab99e0466d955ee" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:16:53.053626 systemd-networkd[1812]: cali0d8085b89a2: Gained IPv6LL Aug 19 08:16:53.164249 containerd[2014]: time="2025-08-19T08:16:53.164201181Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-8hp9n,Uid:407658c8-3f54-45e6-834a-4675e288a13c,Namespace:kube-system,Attempt:0,} returns sandbox id \"af9d8be75cdbccbb43826f1eedd2f276cc642ac788d8134301684b214dace609\"" Aug 19 08:16:53.165755 systemd[1]: Started cri-containerd-38422b7260b551aaaf5fac52ef714b397effa4c49dfb1cc24c6179f9d98c65bf.scope - libcontainer container 38422b7260b551aaaf5fac52ef714b397effa4c49dfb1cc24c6179f9d98c65bf. 
Aug 19 08:16:53.185331 containerd[2014]: time="2025-08-19T08:16:53.185016784Z" level=info msg="CreateContainer within sandbox \"af9d8be75cdbccbb43826f1eedd2f276cc642ac788d8134301684b214dace609\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Aug 19 08:16:53.313282 containerd[2014]: time="2025-08-19T08:16:53.313157208Z" level=info msg="Container 20734842efebbabc769a1e40f7526e7bd2122a33668ffc978a1085f27747a7e5: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:16:53.352899 containerd[2014]: time="2025-08-19T08:16:53.352843761Z" level=info msg="CreateContainer within sandbox \"af9d8be75cdbccbb43826f1eedd2f276cc642ac788d8134301684b214dace609\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"20734842efebbabc769a1e40f7526e7bd2122a33668ffc978a1085f27747a7e5\"" Aug 19 08:16:53.359059 containerd[2014]: time="2025-08-19T08:16:53.357152468Z" level=info msg="StartContainer for \"20734842efebbabc769a1e40f7526e7bd2122a33668ffc978a1085f27747a7e5\"" Aug 19 08:16:53.362463 containerd[2014]: time="2025-08-19T08:16:53.362396418Z" level=info msg="connecting to shim 20734842efebbabc769a1e40f7526e7bd2122a33668ffc978a1085f27747a7e5" address="unix:///run/containerd/s/3178d3a66ccb9573cf166bc00b0de508bd57e13d64b3524ea9ae77d2fac0c965" protocol=ttrpc version=3 Aug 19 08:16:53.442590 systemd[1]: Started cri-containerd-20734842efebbabc769a1e40f7526e7bd2122a33668ffc978a1085f27747a7e5.scope - libcontainer container 20734842efebbabc769a1e40f7526e7bd2122a33668ffc978a1085f27747a7e5. Aug 19 08:16:53.501306 systemd-networkd[1812]: cali6331509f853: Gained IPv6LL Aug 19 08:16:53.562072 sshd[5284]: Connection closed by 139.178.89.65 port 59320 Aug 19 08:16:53.562818 sshd-session[5246]: pam_unix(sshd:session): session closed for user core Aug 19 08:16:53.582799 systemd[1]: sshd@7-172.31.23.28:22-139.178.89.65:59320.service: Deactivated successfully. Aug 19 08:16:53.584418 systemd-logind[1976]: Session 8 logged out. Waiting for processes to exit. Aug 19 08:16:53.589769 systemd[1]: session-8.scope: Deactivated successfully. Aug 19 08:16:53.606061 systemd-logind[1976]: Removed session 8. Aug 19 08:16:53.628950 containerd[2014]: time="2025-08-19T08:16:53.628908040Z" level=info msg="StartContainer for \"20734842efebbabc769a1e40f7526e7bd2122a33668ffc978a1085f27747a7e5\" returns successfully" Aug 19 08:16:53.646476 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1935628621.mount: Deactivated successfully. Aug 19 08:16:53.820499 containerd[2014]: time="2025-08-19T08:16:53.820177438Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f4577c465-x44gs,Uid:234924fd-06d9-4c69-a752-1aa0b5b48653,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"1dfd005341da2f973863c5ac454a26c26ea6da433fb4de642c3ca6dabb416d8f\"" Aug 19 08:16:53.886009 systemd-networkd[1812]: calia12e5bce282: Gained IPv6LL Aug 19 08:16:53.970792 containerd[2014]: time="2025-08-19T08:16:53.970720874Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-85b4b76999-hwqxm,Uid:97b6de00-458f-491a-a402-be43782ea6a4,Namespace:calico-system,Attempt:0,} returns sandbox id \"38422b7260b551aaaf5fac52ef714b397effa4c49dfb1cc24c6179f9d98c65bf\"" Aug 19 08:16:54.525249 systemd-networkd[1812]: cali5acf9f15998: Gained IPv6LL Aug 19 08:16:54.666763 systemd-networkd[1812]: vxlan.calico: Link UP Aug 19 08:16:54.666774 systemd-networkd[1812]: vxlan.calico: Gained carrier Aug 19 08:16:54.743903 (udev-worker)[4616]: Network interface NamePolicy= disabled on kernel command line. 
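The transient mount unit named above, var-lib-containerd-tmpmounts-containerd\x2dmount1935628621.mount, uses systemd's unit-name escaping: path separators become "-" and a literal "-" becomes "\x2d". A rough decoder under that assumption (systemd-escape --unescape --path is the canonical tool; other \xNN escapes are ignored here):

```go
// Rough decoder for systemd mount-unit names like the ones above, where "/"
// is encoded as "-" and a literal "-" as `\x2d`. Other \xNN escapes are
// ignored; systemd-escape(1) handles the full rules.
package main

import (
	"fmt"
	"strings"
)

func unitToPath(unit string) string {
	name := strings.TrimSuffix(unit, ".mount")
	// Protect the escaped dashes, translate separators, then restore dashes.
	name = strings.ReplaceAll(name, `\x2d`, "\x00")
	name = strings.ReplaceAll(name, "-", "/")
	name = strings.ReplaceAll(name, "\x00", "-")
	return "/" + name
}

func main() {
	fmt.Println(unitToPath(`var-lib-containerd-tmpmounts-containerd\x2dmount1935628621.mount`))
	// -> /var/lib/containerd/tmpmounts/containerd-mount1935628621
}
```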
Aug 19 08:16:55.801144 containerd[2014]: time="2025-08-19T08:16:55.800612865Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:55.804636 containerd[2014]: time="2025-08-19T08:16:55.803699943Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=47317977" Aug 19 08:16:55.805180 containerd[2014]: time="2025-08-19T08:16:55.805140151Z" level=info msg="ImageCreate event name:\"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:55.812467 containerd[2014]: time="2025-08-19T08:16:55.810941791Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:55.813319 containerd[2014]: time="2025-08-19T08:16:55.813177457Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"48810696\" in 5.387798103s" Aug 19 08:16:55.813319 containerd[2014]: time="2025-08-19T08:16:55.813265028Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\"" Aug 19 08:16:55.826986 containerd[2014]: time="2025-08-19T08:16:55.826302176Z" level=info msg="CreateContainer within sandbox \"c322a43fd960c52849b73d0b57c5d1e683e0a4d2f64f9457a0ba623225550404\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Aug 19 08:16:55.835874 containerd[2014]: time="2025-08-19T08:16:55.835063823Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\"" Aug 19 08:16:55.868061 containerd[2014]: time="2025-08-19T08:16:55.866550531Z" level=info msg="Container 317e43cf5c1cbba650768ba57b7836e07c1ed2b6a4699a5e227deacd4306afbe: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:16:55.894610 containerd[2014]: time="2025-08-19T08:16:55.894561726Z" level=info msg="CreateContainer within sandbox \"c322a43fd960c52849b73d0b57c5d1e683e0a4d2f64f9457a0ba623225550404\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"317e43cf5c1cbba650768ba57b7836e07c1ed2b6a4699a5e227deacd4306afbe\"" Aug 19 08:16:55.896319 containerd[2014]: time="2025-08-19T08:16:55.896283788Z" level=info msg="StartContainer for \"317e43cf5c1cbba650768ba57b7836e07c1ed2b6a4699a5e227deacd4306afbe\"" Aug 19 08:16:55.899159 containerd[2014]: time="2025-08-19T08:16:55.898809147Z" level=info msg="connecting to shim 317e43cf5c1cbba650768ba57b7836e07c1ed2b6a4699a5e227deacd4306afbe" address="unix:///run/containerd/s/f8e87bde284100ba484d1590a6fc2462a9e9a1af733da316dd84e0e59b288335" protocol=ttrpc version=3 Aug 19 08:16:55.934309 systemd[1]: Started cri-containerd-317e43cf5c1cbba650768ba57b7836e07c1ed2b6a4699a5e227deacd4306afbe.scope - libcontainer container 317e43cf5c1cbba650768ba57b7836e07c1ed2b6a4699a5e227deacd4306afbe. 
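The apiserver pull above reports 47317977 bytes read over 5.387798103 s (against a reported image size of 48810696 bytes, so most layers were fetched rather than cached), which works out to roughly 8 to 9 MB/s from the registry. The arithmetic, using the values from the log:

```go
// Back-of-the-envelope pull throughput for the calico/apiserver image,
// using the byte count and duration reported in the log above.
package main

import (
	"fmt"
	"time"
)

func main() {
	bytesRead := 47317977.0 // "active requests=0, bytes read=47317977"
	dur, _ := time.ParseDuration("5.387798103s")
	mbps := bytesRead / dur.Seconds() / 1e6
	fmt.Printf("~%.1f MB/s over %s\n", mbps, dur) // ~8.8 MB/s
}
```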
Aug 19 08:16:56.030461 containerd[2014]: time="2025-08-19T08:16:56.030358810Z" level=info msg="StartContainer for \"317e43cf5c1cbba650768ba57b7836e07c1ed2b6a4699a5e227deacd4306afbe\" returns successfully" Aug 19 08:16:56.276124 containerd[2014]: time="2025-08-19T08:16:56.276069734Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bcef3581e4349e4b398004aa3d65afa5781cdbd287f2de77310d27e9a01fca28\" id:\"f6aca7c289313e3819014463d7794e10a4ca79224829e92a11e7c161d2c3ae65\" pid:5649 exited_at:{seconds:1755591416 nanos:275617130}" Aug 19 08:16:56.383073 kubelet[3281]: I0819 08:16:56.381497 3281 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-8hp9n" podStartSLOduration=50.329725959 podStartE2EDuration="50.329725959s" podCreationTimestamp="2025-08-19 08:16:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 08:16:54.061864833 +0000 UTC m=+53.612701173" watchObservedRunningTime="2025-08-19 08:16:56.329725959 +0000 UTC m=+55.880562299" Aug 19 08:16:56.498404 containerd[2014]: time="2025-08-19T08:16:56.498355923Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bcef3581e4349e4b398004aa3d65afa5781cdbd287f2de77310d27e9a01fca28\" id:\"6e1074fe9bd5e52235117d46bfd16e444f01d3d6ea0f964cc1604760c72ae6b8\" pid:5673 exited_at:{seconds:1755591416 nanos:496671533}" Aug 19 08:16:56.509265 systemd-networkd[1812]: vxlan.calico: Gained IPv6LL Aug 19 08:16:57.087416 kubelet[3281]: I0819 08:16:57.086996 3281 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-f4577c465-9dqqf" podStartSLOduration=30.448224422 podStartE2EDuration="36.08697213s" podCreationTimestamp="2025-08-19 08:16:21 +0000 UTC" firstStartedPulling="2025-08-19 08:16:50.180339092 +0000 UTC m=+49.731175431" lastFinishedPulling="2025-08-19 08:16:55.819086809 +0000 UTC m=+55.369923139" observedRunningTime="2025-08-19 08:16:57.086772276 +0000 UTC m=+56.637608616" watchObservedRunningTime="2025-08-19 08:16:57.08697213 +0000 UTC m=+56.637808473" Aug 19 08:16:58.599797 systemd[1]: Started sshd@8-172.31.23.28:22-139.178.89.65:59322.service - OpenSSH per-connection server daemon (139.178.89.65:59322). 
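The pod_startup_latency_tracker lines are internally consistent: podStartE2EDuration is the observed-running timestamp minus podCreationTimestamp (50.33 s for coredns-7c65d6cfc9-8hp9n, whose pull timestamps are the zero time), and for calico-apiserver-f4577c465-9dqqf subtracting the ~5.64 s spent pulling from the 36.09 s end-to-end duration reproduces the ~30.45 s SLO duration to within a few nanoseconds. The sketch below redoes only that arithmetic; the real kubelet tracker handles more states.

```go
// Recompute the kubelet pod_startup_latency_tracker numbers above from the
// timestamps they contain. This is just the arithmetic, not the real
// tracker, which also handles restarts and unfinished pulls.
package main

import (
	"fmt"
	"time"
)

const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

func ts(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	// calico-apiserver-f4577c465-9dqqf, values copied from the log line above.
	created := ts("2025-08-19 08:16:21 +0000 UTC")
	firstPull := ts("2025-08-19 08:16:50.180339092 +0000 UTC")
	lastPull := ts("2025-08-19 08:16:55.819086809 +0000 UTC")
	observed := ts("2025-08-19 08:16:57.08697213 +0000 UTC")

	e2e := observed.Sub(created)
	slo := e2e - lastPull.Sub(firstPull)        // exclude image-pull time
	fmt.Println("podStartE2EDuration ~", e2e)   // 36.08697213s
	fmt.Println("podStartSLOduration ~", slo)   // ~30.448224s
}
```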
Aug 19 08:16:58.723914 ntpd[1966]: Listen normally on 8 vxlan.calico 192.168.80.0:123 Aug 19 08:16:58.724007 ntpd[1966]: Listen normally on 9 calie9543d3a887 [fe80::ecee:eeff:feee:eeee%4]:123 Aug 19 08:16:58.724703 ntpd[1966]: Listen normally on 10 calica5c8950800 [fe80::ecee:eeff:feee:eeee%5]:123 Aug 19 08:16:58.724783 ntpd[1966]: Listen normally on 11 cali3f6846b2ef4 [fe80::ecee:eeff:feee:eeee%6]:123 Aug 19 08:16:58.724823 ntpd[1966]: Listen normally on 12 calic1c3ccfd653 [fe80::ecee:eeff:feee:eeee%7]:123 Aug 19 08:16:58.724860 ntpd[1966]: Listen normally on 13 cali0d8085b89a2 [fe80::ecee:eeff:feee:eeee%8]:123 Aug 19 08:16:58.724895 ntpd[1966]: Listen normally on 14 cali6331509f853 [fe80::ecee:eeff:feee:eeee%9]:123 Aug 19 08:16:58.724930 ntpd[1966]: Listen normally on 15 calia12e5bce282 [fe80::ecee:eeff:feee:eeee%10]:123 Aug 19 08:16:58.724965 ntpd[1966]: Listen normally on 16 cali5acf9f15998 [fe80::ecee:eeff:feee:eeee%11]:123 Aug 19 08:16:58.725001 ntpd[1966]: Listen normally on 17 vxlan.calico [fe80::64a4:aff:fe3f:c48d%12]:123 Aug 19 08:16:58.929206 sshd[5693]: Accepted publickey for core from 139.178.89.65 port 59322 ssh2: RSA SHA256:SANMFU2cnK7/l5OXguT20pR9hK/sN9KMMvxRhI/63O4 Aug 19 08:16:58.936492 sshd-session[5693]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:16:58.949475 systemd-logind[1976]: New session 9 of user core. Aug 19 08:16:58.956349 systemd[1]: Started session-9.scope - Session 9 of User core. Aug 19 08:16:59.097086 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2141560144.mount: Deactivated successfully. Aug 19 08:17:00.136561 sshd[5696]: Connection closed by 139.178.89.65 port 59322 Aug 19 08:17:00.138536 sshd-session[5693]: pam_unix(sshd:session): session closed for user core Aug 19 08:17:00.151883 systemd[1]: sshd@8-172.31.23.28:22-139.178.89.65:59322.service: Deactivated successfully. Aug 19 08:17:00.157768 systemd[1]: session-9.scope: Deactivated successfully. Aug 19 08:17:00.162812 systemd-logind[1976]: Session 9 logged out. Waiting for processes to exit. Aug 19 08:17:00.165643 systemd-logind[1976]: Removed session 9.
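Every cali* interface in the ntpd lines above carries the same link-local address, fe80::ecee:eeff:feee:eeee, with only the zone index differing. That is the EUI-64 address derived from ee:ee:ee:ee:ee:ee, the fixed MAC Calico gives the host side of each workload veth (the per-endpoint MACs recorded earlier, such as ee:00:c2:33:8a:98, appear to belong to the workload side). A check of that derivation:

```go
// Derive the EUI-64 IPv6 link-local address from a MAC address and check
// that Calico's fixed host-side veth MAC ee:ee:ee:ee:ee:ee yields the
// fe80::ecee:eeff:feee:eeee seen on every cali* interface in the ntpd lines.
package main

import (
	"fmt"
	"net"
	"net/netip"
)

func linkLocalFromMAC(mac net.HardwareAddr) netip.Addr {
	var b [16]byte
	b[0], b[1] = 0xfe, 0x80 // fe80::/64 prefix
	b[8] = mac[0] ^ 0x02    // flip the universal/local bit
	b[9], b[10], b[11] = mac[1], mac[2], 0xff
	b[12], b[13], b[14], b[15] = 0xfe, mac[3], mac[4], mac[5]
	return netip.AddrFrom16(b)
}

func main() {
	mac, _ := net.ParseMAC("ee:ee:ee:ee:ee:ee")
	fmt.Println(linkLocalFromMAC(mac)) // fe80::ecee:eeff:feee:eeee
}
```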
Aug 19 08:17:00.946336 containerd[2014]: time="2025-08-19T08:17:00.946287566Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:00.955480 containerd[2014]: time="2025-08-19T08:17:00.955442508Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.2: active requests=0, bytes read=66352308" Aug 19 08:17:00.979056 containerd[2014]: time="2025-08-19T08:17:00.978700068Z" level=info msg="ImageCreate event name:\"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:00.985867 containerd[2014]: time="2025-08-19T08:17:00.985821107Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:00.986786 containerd[2014]: time="2025-08-19T08:17:00.986750520Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" with image id \"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\", size \"66352154\" in 5.151613019s" Aug 19 08:17:00.987070 containerd[2014]: time="2025-08-19T08:17:00.986925616Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" returns image reference \"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\"" Aug 19 08:17:00.988525 containerd[2014]: time="2025-08-19T08:17:00.988496466Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\"" Aug 19 08:17:00.991507 containerd[2014]: time="2025-08-19T08:17:00.991478029Z" level=info msg="CreateContainer within sandbox \"5c51b3fc0faeaec5235752981e6e8fa0d4543bc928632cd9f03fc4240f971f37\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Aug 19 08:17:01.061986 containerd[2014]: time="2025-08-19T08:17:01.061770472Z" level=info msg="Container b61a60649224c6ced9e1f442acf1488fa604668911f0a73a1d1ac222fa2e5723: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:17:01.088044 containerd[2014]: time="2025-08-19T08:17:01.087730758Z" level=info msg="CreateContainer within sandbox \"5c51b3fc0faeaec5235752981e6e8fa0d4543bc928632cd9f03fc4240f971f37\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"b61a60649224c6ced9e1f442acf1488fa604668911f0a73a1d1ac222fa2e5723\"" Aug 19 08:17:01.090629 containerd[2014]: time="2025-08-19T08:17:01.090572220Z" level=info msg="StartContainer for \"b61a60649224c6ced9e1f442acf1488fa604668911f0a73a1d1ac222fa2e5723\"" Aug 19 08:17:01.096730 containerd[2014]: time="2025-08-19T08:17:01.096383119Z" level=info msg="connecting to shim b61a60649224c6ced9e1f442acf1488fa604668911f0a73a1d1ac222fa2e5723" address="unix:///run/containerd/s/52425a07c10203b490135b82583de9f1c4eb84d22d79321bee93066bd29e35e9" protocol=ttrpc version=3 Aug 19 08:17:01.155506 systemd[1]: Started cri-containerd-b61a60649224c6ced9e1f442acf1488fa604668911f0a73a1d1ac222fa2e5723.scope - libcontainer container b61a60649224c6ced9e1f442acf1488fa604668911f0a73a1d1ac222fa2e5723. 
Aug 19 08:17:01.358135 containerd[2014]: time="2025-08-19T08:17:01.354695696Z" level=info msg="StartContainer for \"b61a60649224c6ced9e1f442acf1488fa604668911f0a73a1d1ac222fa2e5723\" returns successfully" Aug 19 08:17:02.588454 containerd[2014]: time="2025-08-19T08:17:02.588392488Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b61a60649224c6ced9e1f442acf1488fa604668911f0a73a1d1ac222fa2e5723\" id:\"916b537902213ba82dc329202c0a92d046ebc26ad03d7bd79924788f8e225ddb\" pid:5775 exit_status:1 exited_at:{seconds:1755591422 nanos:552980599}" Aug 19 08:17:03.362186 containerd[2014]: time="2025-08-19T08:17:03.362020349Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b61a60649224c6ced9e1f442acf1488fa604668911f0a73a1d1ac222fa2e5723\" id:\"c77a24176c1cf4e1748b51e4f2f3018e39f99f9f93d9d0b1d595da07698ce5f3\" pid:5801 exit_status:1 exited_at:{seconds:1755591423 nanos:361697446}" Aug 19 08:17:04.217790 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3422455625.mount: Deactivated successfully. Aug 19 08:17:04.261059 containerd[2014]: time="2025-08-19T08:17:04.260694683Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:04.263824 containerd[2014]: time="2025-08-19T08:17:04.263784934Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.2: active requests=0, bytes read=33083477" Aug 19 08:17:04.265976 containerd[2014]: time="2025-08-19T08:17:04.265862908Z" level=info msg="ImageCreate event name:\"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:04.362656 containerd[2014]: time="2025-08-19T08:17:04.362118474Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:04.372709 containerd[2014]: time="2025-08-19T08:17:04.372555484Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" with image id \"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\", size \"33083307\" in 3.384017234s" Aug 19 08:17:04.372709 containerd[2014]: time="2025-08-19T08:17:04.372615260Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" returns image reference \"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\"" Aug 19 08:17:04.388311 containerd[2014]: time="2025-08-19T08:17:04.388104953Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\"" Aug 19 08:17:04.388469 containerd[2014]: time="2025-08-19T08:17:04.388407112Z" level=info msg="CreateContainer within sandbox \"cd45200c6ae0a6ab769e0936f7b4258e932d7cda5f094eea2a02370b91ecccbb\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Aug 19 08:17:04.391246 containerd[2014]: time="2025-08-19T08:17:04.391204196Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b61a60649224c6ced9e1f442acf1488fa604668911f0a73a1d1ac222fa2e5723\" id:\"77d26a2ea2a20c4162c20f7050841ca61a40879a22db3acfef2ace9e69ecf3fe\" pid:5828 exit_status:1 exited_at:{seconds:1755591424 nanos:386546297}" Aug 19 08:17:04.420588 
containerd[2014]: time="2025-08-19T08:17:04.418270711Z" level=info msg="Container b74cde188c52cfee0f33587e9313b8e486eb0988536a9c2f39e97adb8ed0b18f: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:17:04.439930 containerd[2014]: time="2025-08-19T08:17:04.439877454Z" level=info msg="CreateContainer within sandbox \"cd45200c6ae0a6ab769e0936f7b4258e932d7cda5f094eea2a02370b91ecccbb\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"b74cde188c52cfee0f33587e9313b8e486eb0988536a9c2f39e97adb8ed0b18f\"" Aug 19 08:17:04.442335 containerd[2014]: time="2025-08-19T08:17:04.442295760Z" level=info msg="StartContainer for \"b74cde188c52cfee0f33587e9313b8e486eb0988536a9c2f39e97adb8ed0b18f\"" Aug 19 08:17:04.446215 containerd[2014]: time="2025-08-19T08:17:04.446063036Z" level=info msg="connecting to shim b74cde188c52cfee0f33587e9313b8e486eb0988536a9c2f39e97adb8ed0b18f" address="unix:///run/containerd/s/c9808de8c22630ed6f8643967cb8a1fcd70a238ab4c1bf98fd3c3f3da043aeb7" protocol=ttrpc version=3 Aug 19 08:17:04.483555 systemd[1]: Started cri-containerd-b74cde188c52cfee0f33587e9313b8e486eb0988536a9c2f39e97adb8ed0b18f.scope - libcontainer container b74cde188c52cfee0f33587e9313b8e486eb0988536a9c2f39e97adb8ed0b18f. Aug 19 08:17:04.557988 containerd[2014]: time="2025-08-19T08:17:04.557954285Z" level=info msg="StartContainer for \"b74cde188c52cfee0f33587e9313b8e486eb0988536a9c2f39e97adb8ed0b18f\" returns successfully" Aug 19 08:17:05.175706 systemd[1]: Started sshd@9-172.31.23.28:22-139.178.89.65:44068.service - OpenSSH per-connection server daemon (139.178.89.65:44068). Aug 19 08:17:05.210056 kubelet[3281]: I0819 08:17:05.209880 3281 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-58fd7646b9-s59gg" podStartSLOduration=30.502916479 podStartE2EDuration="41.209846365s" podCreationTimestamp="2025-08-19 08:16:24 +0000 UTC" firstStartedPulling="2025-08-19 08:16:50.281425658 +0000 UTC m=+49.832261978" lastFinishedPulling="2025-08-19 08:17:00.988355533 +0000 UTC m=+60.539191864" observedRunningTime="2025-08-19 08:17:02.366892073 +0000 UTC m=+61.917814876" watchObservedRunningTime="2025-08-19 08:17:05.209846365 +0000 UTC m=+64.760682706" Aug 19 08:17:05.442115 sshd[5874]: Accepted publickey for core from 139.178.89.65 port 44068 ssh2: RSA SHA256:SANMFU2cnK7/l5OXguT20pR9hK/sN9KMMvxRhI/63O4 Aug 19 08:17:05.446160 sshd-session[5874]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:17:05.477789 systemd-logind[1976]: New session 10 of user core. Aug 19 08:17:05.484616 systemd[1]: Started session-10.scope - Session 10 of User core. 
Aug 19 08:17:06.339936 containerd[2014]: time="2025-08-19T08:17:06.339206167Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:06.343356 containerd[2014]: time="2025-08-19T08:17:06.343308038Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.2: active requests=0, bytes read=8759190" Aug 19 08:17:06.345491 containerd[2014]: time="2025-08-19T08:17:06.345423116Z" level=info msg="ImageCreate event name:\"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:06.353544 containerd[2014]: time="2025-08-19T08:17:06.352799204Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:06.355705 containerd[2014]: time="2025-08-19T08:17:06.355096928Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.2\" with image id \"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\", size \"10251893\" in 1.96694258s" Aug 19 08:17:06.355705 containerd[2014]: time="2025-08-19T08:17:06.355164853Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\" returns image reference \"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\"" Aug 19 08:17:06.359722 containerd[2014]: time="2025-08-19T08:17:06.359549100Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Aug 19 08:17:06.391765 containerd[2014]: time="2025-08-19T08:17:06.391689927Z" level=info msg="CreateContainer within sandbox \"5cf9498aed6019b6e2e7525a3847c759df54ac95a855c1bf091db30e537d580c\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Aug 19 08:17:06.514049 containerd[2014]: time="2025-08-19T08:17:06.513237063Z" level=info msg="Container 7ea6344b0e1a8f092e4ac4c1722e5ea58cee7a15830c16f5a02e50db048b4c0b: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:17:06.535464 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount513871018.mount: Deactivated successfully. Aug 19 08:17:06.575897 containerd[2014]: time="2025-08-19T08:17:06.574855794Z" level=info msg="CreateContainer within sandbox \"5cf9498aed6019b6e2e7525a3847c759df54ac95a855c1bf091db30e537d580c\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"7ea6344b0e1a8f092e4ac4c1722e5ea58cee7a15830c16f5a02e50db048b4c0b\"" Aug 19 08:17:06.586626 containerd[2014]: time="2025-08-19T08:17:06.586569681Z" level=info msg="StartContainer for \"7ea6344b0e1a8f092e4ac4c1722e5ea58cee7a15830c16f5a02e50db048b4c0b\"" Aug 19 08:17:06.593588 containerd[2014]: time="2025-08-19T08:17:06.593469300Z" level=info msg="connecting to shim 7ea6344b0e1a8f092e4ac4c1722e5ea58cee7a15830c16f5a02e50db048b4c0b" address="unix:///run/containerd/s/80e67bd57fb5ea1d749e06d802fd36c9b4bb6e4381b36c8000a2c83b89733303" protocol=ttrpc version=3 Aug 19 08:17:06.694532 systemd[1]: Started cri-containerd-7ea6344b0e1a8f092e4ac4c1722e5ea58cee7a15830c16f5a02e50db048b4c0b.scope - libcontainer container 7ea6344b0e1a8f092e4ac4c1722e5ea58cee7a15830c16f5a02e50db048b4c0b. 
Aug 19 08:17:06.795869 containerd[2014]: time="2025-08-19T08:17:06.795820888Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:06.800232 containerd[2014]: time="2025-08-19T08:17:06.800184374Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=77" Aug 19 08:17:06.808653 containerd[2014]: time="2025-08-19T08:17:06.808378675Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"48810696\" in 448.765502ms" Aug 19 08:17:06.808653 containerd[2014]: time="2025-08-19T08:17:06.808430582Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\"" Aug 19 08:17:06.812648 containerd[2014]: time="2025-08-19T08:17:06.809942815Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\"" Aug 19 08:17:06.812648 containerd[2014]: time="2025-08-19T08:17:06.812573458Z" level=info msg="CreateContainer within sandbox \"1dfd005341da2f973863c5ac454a26c26ea6da433fb4de642c3ca6dabb416d8f\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Aug 19 08:17:06.839952 containerd[2014]: time="2025-08-19T08:17:06.837732248Z" level=info msg="Container 75173f9a6f06eded0ced43b20ea8f945f13a33620baa14ae4cda1bae6688415a: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:17:06.848504 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4238404013.mount: Deactivated successfully. Aug 19 08:17:06.863402 containerd[2014]: time="2025-08-19T08:17:06.863356062Z" level=info msg="StartContainer for \"7ea6344b0e1a8f092e4ac4c1722e5ea58cee7a15830c16f5a02e50db048b4c0b\" returns successfully" Aug 19 08:17:06.882962 containerd[2014]: time="2025-08-19T08:17:06.882569060Z" level=info msg="CreateContainer within sandbox \"1dfd005341da2f973863c5ac454a26c26ea6da433fb4de642c3ca6dabb416d8f\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"75173f9a6f06eded0ced43b20ea8f945f13a33620baa14ae4cda1bae6688415a\"" Aug 19 08:17:06.884884 containerd[2014]: time="2025-08-19T08:17:06.883367409Z" level=info msg="StartContainer for \"75173f9a6f06eded0ced43b20ea8f945f13a33620baa14ae4cda1bae6688415a\"" Aug 19 08:17:06.887514 containerd[2014]: time="2025-08-19T08:17:06.885456263Z" level=info msg="connecting to shim 75173f9a6f06eded0ced43b20ea8f945f13a33620baa14ae4cda1bae6688415a" address="unix:///run/containerd/s/cbf12d083cdb209ec02990741279f7bbf54b568243f5fef22be8b66baec98aeb" protocol=ttrpc version=3 Aug 19 08:17:06.933593 systemd[1]: Started cri-containerd-75173f9a6f06eded0ced43b20ea8f945f13a33620baa14ae4cda1bae6688415a.scope - libcontainer container 75173f9a6f06eded0ced43b20ea8f945f13a33620baa14ae4cda1bae6688415a. 
Aug 19 08:17:07.013634 sshd[5884]: Connection closed by 139.178.89.65 port 44068 Aug 19 08:17:07.014259 sshd-session[5874]: pam_unix(sshd:session): session closed for user core Aug 19 08:17:07.022995 containerd[2014]: time="2025-08-19T08:17:07.022881229Z" level=info msg="StartContainer for \"75173f9a6f06eded0ced43b20ea8f945f13a33620baa14ae4cda1bae6688415a\" returns successfully" Aug 19 08:17:07.023718 systemd[1]: sshd@9-172.31.23.28:22-139.178.89.65:44068.service: Deactivated successfully. Aug 19 08:17:07.024168 systemd-logind[1976]: Session 10 logged out. Waiting for processes to exit. Aug 19 08:17:07.028954 systemd[1]: session-10.scope: Deactivated successfully. Aug 19 08:17:07.032065 systemd-logind[1976]: Removed session 10. Aug 19 08:17:07.046210 systemd[1]: Started sshd@10-172.31.23.28:22-139.178.89.65:44070.service - OpenSSH per-connection server daemon (139.178.89.65:44070). Aug 19 08:17:07.252557 kubelet[3281]: I0819 08:17:07.252487 3281 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-5c878cc6c5-8rfs9" podStartSLOduration=5.421244055 podStartE2EDuration="21.252463716s" podCreationTimestamp="2025-08-19 08:16:46 +0000 UTC" firstStartedPulling="2025-08-19 08:16:48.545864319 +0000 UTC m=+48.096700651" lastFinishedPulling="2025-08-19 08:17:04.377083975 +0000 UTC m=+63.927920312" observedRunningTime="2025-08-19 08:17:05.210749318 +0000 UTC m=+64.761585657" watchObservedRunningTime="2025-08-19 08:17:07.252463716 +0000 UTC m=+66.803300066" Aug 19 08:17:07.254156 kubelet[3281]: I0819 08:17:07.252732 3281 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-f4577c465-x44gs" podStartSLOduration=33.267525584 podStartE2EDuration="46.252718689s" podCreationTimestamp="2025-08-19 08:16:21 +0000 UTC" firstStartedPulling="2025-08-19 08:16:53.824253863 +0000 UTC m=+53.375090190" lastFinishedPulling="2025-08-19 08:17:06.809446967 +0000 UTC m=+66.360283295" observedRunningTime="2025-08-19 08:17:07.244385312 +0000 UTC m=+66.795221652" watchObservedRunningTime="2025-08-19 08:17:07.252718689 +0000 UTC m=+66.803555029" Aug 19 08:17:07.260396 sshd[5959]: Accepted publickey for core from 139.178.89.65 port 44070 ssh2: RSA SHA256:SANMFU2cnK7/l5OXguT20pR9hK/sN9KMMvxRhI/63O4 Aug 19 08:17:07.264971 sshd-session[5959]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:17:07.285733 systemd-logind[1976]: New session 11 of user core. Aug 19 08:17:07.293892 systemd[1]: Started session-11.scope - Session 11 of User core. Aug 19 08:17:07.460112 containerd[2014]: time="2025-08-19T08:17:07.460026550Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b61a60649224c6ced9e1f442acf1488fa604668911f0a73a1d1ac222fa2e5723\" id:\"8b39220c51d265e9a709d1135506ebde25f2284fa7155d19b51427d9add15d4a\" pid:5981 exit_status:1 exited_at:{seconds:1755591427 nanos:459672022}" Aug 19 08:17:08.075664 sshd[5987]: Connection closed by 139.178.89.65 port 44070 Aug 19 08:17:08.076668 sshd-session[5959]: pam_unix(sshd:session): session closed for user core Aug 19 08:17:08.087099 systemd[1]: sshd@10-172.31.23.28:22-139.178.89.65:44070.service: Deactivated successfully. Aug 19 08:17:08.088275 systemd-logind[1976]: Session 11 logged out. Waiting for processes to exit. Aug 19 08:17:08.094898 systemd[1]: session-11.scope: Deactivated successfully. 
Aug 19 08:17:08.135618 systemd[1]: Started sshd@11-172.31.23.28:22-139.178.89.65:44084.service - OpenSSH per-connection server daemon (139.178.89.65:44084). Aug 19 08:17:08.147198 systemd-logind[1976]: Removed session 11. Aug 19 08:17:08.380734 sshd[6004]: Accepted publickey for core from 139.178.89.65 port 44084 ssh2: RSA SHA256:SANMFU2cnK7/l5OXguT20pR9hK/sN9KMMvxRhI/63O4 Aug 19 08:17:08.380365 sshd-session[6004]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:17:08.388332 systemd-logind[1976]: New session 12 of user core. Aug 19 08:17:08.397526 systemd[1]: Started session-12.scope - Session 12 of User core. Aug 19 08:17:08.785302 sshd[6007]: Connection closed by 139.178.89.65 port 44084 Aug 19 08:17:08.790260 sshd-session[6004]: pam_unix(sshd:session): session closed for user core Aug 19 08:17:08.810963 systemd[1]: sshd@11-172.31.23.28:22-139.178.89.65:44084.service: Deactivated successfully. Aug 19 08:17:08.815586 systemd[1]: session-12.scope: Deactivated successfully. Aug 19 08:17:08.818439 systemd-logind[1976]: Session 12 logged out. Waiting for processes to exit. Aug 19 08:17:08.822101 systemd-logind[1976]: Removed session 12. Aug 19 08:17:13.209326 containerd[2014]: time="2025-08-19T08:17:13.209260614Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:13.213352 containerd[2014]: time="2025-08-19T08:17:13.213307940Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.2: active requests=0, bytes read=51276688" Aug 19 08:17:13.305172 containerd[2014]: time="2025-08-19T08:17:13.305096835Z" level=info msg="ImageCreate event name:\"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:13.310063 containerd[2014]: time="2025-08-19T08:17:13.309161892Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:13.310063 containerd[2014]: time="2025-08-19T08:17:13.309957671Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" with image id \"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\", size \"52769359\" in 6.499727904s" Aug 19 08:17:13.310063 containerd[2014]: time="2025-08-19T08:17:13.309999505Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" returns image reference \"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\"" Aug 19 08:17:13.318305 containerd[2014]: time="2025-08-19T08:17:13.318061638Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\"" Aug 19 08:17:13.447855 containerd[2014]: time="2025-08-19T08:17:13.447304624Z" level=info msg="CreateContainer within sandbox \"38422b7260b551aaaf5fac52ef714b397effa4c49dfb1cc24c6179f9d98c65bf\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Aug 19 08:17:13.522499 containerd[2014]: time="2025-08-19T08:17:13.522405418Z" level=info msg="Container b1a7260f2cc4b5b7a07062135d3cd6c11f4d02640c7eb9582957642e2b1996fc: CDI devices from CRI Config.CDIDevices: []" 
Aug 19 08:17:13.543380 containerd[2014]: time="2025-08-19T08:17:13.542653696Z" level=info msg="CreateContainer within sandbox \"38422b7260b551aaaf5fac52ef714b397effa4c49dfb1cc24c6179f9d98c65bf\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"b1a7260f2cc4b5b7a07062135d3cd6c11f4d02640c7eb9582957642e2b1996fc\"" Aug 19 08:17:13.566569 containerd[2014]: time="2025-08-19T08:17:13.566529923Z" level=info msg="StartContainer for \"b1a7260f2cc4b5b7a07062135d3cd6c11f4d02640c7eb9582957642e2b1996fc\"" Aug 19 08:17:13.595754 containerd[2014]: time="2025-08-19T08:17:13.595697680Z" level=info msg="connecting to shim b1a7260f2cc4b5b7a07062135d3cd6c11f4d02640c7eb9582957642e2b1996fc" address="unix:///run/containerd/s/7faad9f90e1880c177abf22fd13e00a22b3519319785d5020ab99e0466d955ee" protocol=ttrpc version=3 Aug 19 08:17:13.801437 systemd[1]: Started cri-containerd-b1a7260f2cc4b5b7a07062135d3cd6c11f4d02640c7eb9582957642e2b1996fc.scope - libcontainer container b1a7260f2cc4b5b7a07062135d3cd6c11f4d02640c7eb9582957642e2b1996fc. Aug 19 08:17:13.835498 systemd[1]: Started sshd@12-172.31.23.28:22-139.178.89.65:54780.service - OpenSSH per-connection server daemon (139.178.89.65:54780). Aug 19 08:17:14.256516 containerd[2014]: time="2025-08-19T08:17:14.256347368Z" level=info msg="StartContainer for \"b1a7260f2cc4b5b7a07062135d3cd6c11f4d02640c7eb9582957642e2b1996fc\" returns successfully" Aug 19 08:17:14.274923 sshd[6055]: Accepted publickey for core from 139.178.89.65 port 54780 ssh2: RSA SHA256:SANMFU2cnK7/l5OXguT20pR9hK/sN9KMMvxRhI/63O4 Aug 19 08:17:14.279936 sshd-session[6055]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:17:14.294903 systemd-logind[1976]: New session 13 of user core. Aug 19 08:17:14.300391 systemd[1]: Started session-13.scope - Session 13 of User core. Aug 19 08:17:14.818958 containerd[2014]: time="2025-08-19T08:17:14.818877188Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b1a7260f2cc4b5b7a07062135d3cd6c11f4d02640c7eb9582957642e2b1996fc\" id:\"439995b8cc422bb1b1296954e8d15ef0cea31dd863ecddc193eebc1d2a235b1d\" pid:6101 exited_at:{seconds:1755591434 nanos:753250013}" Aug 19 08:17:14.906234 kubelet[3281]: I0819 08:17:14.901948 3281 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-85b4b76999-hwqxm" podStartSLOduration=30.537078909 podStartE2EDuration="49.879378257s" podCreationTimestamp="2025-08-19 08:16:25 +0000 UTC" firstStartedPulling="2025-08-19 08:16:53.975521524 +0000 UTC m=+53.526357855" lastFinishedPulling="2025-08-19 08:17:13.317820868 +0000 UTC m=+72.868657203" observedRunningTime="2025-08-19 08:17:14.646978244 +0000 UTC m=+74.197814583" watchObservedRunningTime="2025-08-19 08:17:14.879378257 +0000 UTC m=+74.430214589" Aug 19 08:17:15.489337 sshd[6079]: Connection closed by 139.178.89.65 port 54780 Aug 19 08:17:15.491513 sshd-session[6055]: pam_unix(sshd:session): session closed for user core Aug 19 08:17:15.496804 systemd[1]: sshd@12-172.31.23.28:22-139.178.89.65:54780.service: Deactivated successfully. Aug 19 08:17:15.500306 systemd[1]: session-13.scope: Deactivated successfully. Aug 19 08:17:15.501385 systemd-logind[1976]: Session 13 logged out. Waiting for processes to exit. Aug 19 08:17:15.504468 systemd-logind[1976]: Removed session 13. 
Aug 19 08:17:16.895549 containerd[2014]: time="2025-08-19T08:17:16.895492830Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:16.897689 containerd[2014]: time="2025-08-19T08:17:16.897471473Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2: active requests=0, bytes read=14703784" Aug 19 08:17:16.900055 containerd[2014]: time="2025-08-19T08:17:16.900001578Z" level=info msg="ImageCreate event name:\"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:16.904165 containerd[2014]: time="2025-08-19T08:17:16.904121660Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:16.905069 containerd[2014]: time="2025-08-19T08:17:16.904644229Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" with image id \"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\", size \"16196439\" in 3.58654018s" Aug 19 08:17:16.905069 containerd[2014]: time="2025-08-19T08:17:16.904681392Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" returns image reference \"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\"" Aug 19 08:17:16.908110 containerd[2014]: time="2025-08-19T08:17:16.907831354Z" level=info msg="CreateContainer within sandbox \"5cf9498aed6019b6e2e7525a3847c759df54ac95a855c1bf091db30e537d580c\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Aug 19 08:17:16.924393 containerd[2014]: time="2025-08-19T08:17:16.923842926Z" level=info msg="Container 6b6afbc854f9c87ba1faad0c45d7fdf2f83b5bfc91ff10a6f8e4a9d2308f48ad: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:17:16.957366 containerd[2014]: time="2025-08-19T08:17:16.957306361Z" level=info msg="CreateContainer within sandbox \"5cf9498aed6019b6e2e7525a3847c759df54ac95a855c1bf091db30e537d580c\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"6b6afbc854f9c87ba1faad0c45d7fdf2f83b5bfc91ff10a6f8e4a9d2308f48ad\"" Aug 19 08:17:16.958022 containerd[2014]: time="2025-08-19T08:17:16.957947402Z" level=info msg="StartContainer for \"6b6afbc854f9c87ba1faad0c45d7fdf2f83b5bfc91ff10a6f8e4a9d2308f48ad\"" Aug 19 08:17:16.960534 containerd[2014]: time="2025-08-19T08:17:16.960490668Z" level=info msg="connecting to shim 6b6afbc854f9c87ba1faad0c45d7fdf2f83b5bfc91ff10a6f8e4a9d2308f48ad" address="unix:///run/containerd/s/80e67bd57fb5ea1d749e06d802fd36c9b4bb6e4381b36c8000a2c83b89733303" protocol=ttrpc version=3 Aug 19 08:17:16.990637 systemd[1]: Started cri-containerd-6b6afbc854f9c87ba1faad0c45d7fdf2f83b5bfc91ff10a6f8e4a9d2308f48ad.scope - libcontainer container 6b6afbc854f9c87ba1faad0c45d7fdf2f83b5bfc91ff10a6f8e4a9d2308f48ad. 
Aug 19 08:17:17.085568 containerd[2014]: time="2025-08-19T08:17:17.085534982Z" level=info msg="StartContainer for \"6b6afbc854f9c87ba1faad0c45d7fdf2f83b5bfc91ff10a6f8e4a9d2308f48ad\" returns successfully" Aug 19 08:17:17.592861 kubelet[3281]: I0819 08:17:17.592677 3281 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-fhzgx" podStartSLOduration=28.179730826 podStartE2EDuration="53.592653792s" podCreationTimestamp="2025-08-19 08:16:24 +0000 UTC" firstStartedPulling="2025-08-19 08:16:51.492812179 +0000 UTC m=+51.043648508" lastFinishedPulling="2025-08-19 08:17:16.905735155 +0000 UTC m=+76.456571474" observedRunningTime="2025-08-19 08:17:17.585721703 +0000 UTC m=+77.136558043" watchObservedRunningTime="2025-08-19 08:17:17.592653792 +0000 UTC m=+77.143490127" Aug 19 08:17:17.962548 kubelet[3281]: I0819 08:17:17.957956 3281 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Aug 19 08:17:17.965440 kubelet[3281]: I0819 08:17:17.965400 3281 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Aug 19 08:17:19.950469 containerd[2014]: time="2025-08-19T08:17:19.950419555Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b61a60649224c6ced9e1f442acf1488fa604668911f0a73a1d1ac222fa2e5723\" id:\"b48a4cd4c9c493178b1cd60b65e85203d58d8eb5ee188e9a1b04d3f336b50aad\" pid:6172 exited_at:{seconds:1755591439 nanos:950115138}" Aug 19 08:17:20.525136 systemd[1]: Started sshd@13-172.31.23.28:22-139.178.89.65:35130.service - OpenSSH per-connection server daemon (139.178.89.65:35130). Aug 19 08:17:20.816633 sshd[6183]: Accepted publickey for core from 139.178.89.65 port 35130 ssh2: RSA SHA256:SANMFU2cnK7/l5OXguT20pR9hK/sN9KMMvxRhI/63O4 Aug 19 08:17:20.821377 sshd-session[6183]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:17:20.828719 systemd-logind[1976]: New session 14 of user core. Aug 19 08:17:20.833290 systemd[1]: Started session-14.scope - Session 14 of User core. Aug 19 08:17:21.481645 sshd[6186]: Connection closed by 139.178.89.65 port 35130 Aug 19 08:17:21.483287 sshd-session[6183]: pam_unix(sshd:session): session closed for user core Aug 19 08:17:21.486891 systemd[1]: sshd@13-172.31.23.28:22-139.178.89.65:35130.service: Deactivated successfully. Aug 19 08:17:21.489443 systemd[1]: session-14.scope: Deactivated successfully. Aug 19 08:17:21.492182 systemd-logind[1976]: Session 14 logged out. Waiting for processes to exit. Aug 19 08:17:21.493759 systemd-logind[1976]: Removed session 14. Aug 19 08:17:26.521407 systemd[1]: Started sshd@14-172.31.23.28:22-139.178.89.65:35140.service - OpenSSH per-connection server daemon (139.178.89.65:35140). Aug 19 08:17:26.845450 sshd[6229]: Accepted publickey for core from 139.178.89.65 port 35140 ssh2: RSA SHA256:SANMFU2cnK7/l5OXguT20pR9hK/sN9KMMvxRhI/63O4 Aug 19 08:17:26.848991 sshd-session[6229]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:17:26.860100 systemd-logind[1976]: New session 15 of user core. Aug 19 08:17:26.864592 systemd[1]: Started session-15.scope - Session 15 of User core. 
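The csi_plugin.go entries above show the kubelet validating and registering the csi.tigera.io driver at /var/lib/kubelet/plugins/csi.tigera.io/csi.sock. As a quick sanity check on such a node, the sketch below (run on the node itself; the path is taken from the log entry, everything else is generic Python) verifies that the registered endpoint exists and is a Unix socket.

    import os
    import stat

    # Endpoint reported by the kubelet csi_plugin.go entries above.
    CSI_SOCK = "/var/lib/kubelet/plugins/csi.tigera.io/csi.sock"

    try:
        st = os.stat(CSI_SOCK)
        print(CSI_SOCK, "is a unix socket:", stat.S_ISSOCK(st.st_mode))
    except FileNotFoundError:
        print(CSI_SOCK, "not found (driver not registered on this node?)")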
Aug 19 08:17:26.897848 containerd[2014]: time="2025-08-19T08:17:26.897740728Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bcef3581e4349e4b398004aa3d65afa5781cdbd287f2de77310d27e9a01fca28\" id:\"e790751be14b1b2516d6afcd3643ad067a267f0ef730a3007fd9d2a203530a57\" pid:6218 exited_at:{seconds:1755591446 nanos:892688815}" Aug 19 08:17:28.022524 sshd[6233]: Connection closed by 139.178.89.65 port 35140 Aug 19 08:17:28.030450 sshd-session[6229]: pam_unix(sshd:session): session closed for user core Aug 19 08:17:28.043403 systemd-logind[1976]: Session 15 logged out. Waiting for processes to exit. Aug 19 08:17:28.043657 systemd[1]: sshd@14-172.31.23.28:22-139.178.89.65:35140.service: Deactivated successfully. Aug 19 08:17:28.045867 systemd[1]: session-15.scope: Deactivated successfully. Aug 19 08:17:28.048022 systemd-logind[1976]: Removed session 15. Aug 19 08:17:33.062509 systemd[1]: Started sshd@15-172.31.23.28:22-139.178.89.65:40824.service - OpenSSH per-connection server daemon (139.178.89.65:40824). Aug 19 08:17:33.259111 sshd[6244]: Accepted publickey for core from 139.178.89.65 port 40824 ssh2: RSA SHA256:SANMFU2cnK7/l5OXguT20pR9hK/sN9KMMvxRhI/63O4 Aug 19 08:17:33.261435 sshd-session[6244]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:17:33.268985 systemd-logind[1976]: New session 16 of user core. Aug 19 08:17:33.278259 systemd[1]: Started session-16.scope - Session 16 of User core. Aug 19 08:17:33.814101 sshd[6247]: Connection closed by 139.178.89.65 port 40824 Aug 19 08:17:33.814280 sshd-session[6244]: pam_unix(sshd:session): session closed for user core Aug 19 08:17:33.820198 systemd[1]: sshd@15-172.31.23.28:22-139.178.89.65:40824.service: Deactivated successfully. Aug 19 08:17:33.822723 systemd[1]: session-16.scope: Deactivated successfully. Aug 19 08:17:33.823738 systemd-logind[1976]: Session 16 logged out. Waiting for processes to exit. Aug 19 08:17:33.825305 systemd-logind[1976]: Removed session 16. Aug 19 08:17:33.850541 systemd[1]: Started sshd@16-172.31.23.28:22-139.178.89.65:40832.service - OpenSSH per-connection server daemon (139.178.89.65:40832). Aug 19 08:17:34.083784 sshd[6259]: Accepted publickey for core from 139.178.89.65 port 40832 ssh2: RSA SHA256:SANMFU2cnK7/l5OXguT20pR9hK/sN9KMMvxRhI/63O4 Aug 19 08:17:34.085117 sshd-session[6259]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:17:34.091224 systemd-logind[1976]: New session 17 of user core. Aug 19 08:17:34.095265 systemd[1]: Started session-17.scope - Session 17 of User core. Aug 19 08:17:34.761628 sshd[6262]: Connection closed by 139.178.89.65 port 40832 Aug 19 08:17:34.767829 sshd-session[6259]: pam_unix(sshd:session): session closed for user core Aug 19 08:17:34.778217 systemd[1]: sshd@16-172.31.23.28:22-139.178.89.65:40832.service: Deactivated successfully. Aug 19 08:17:34.780798 systemd[1]: session-17.scope: Deactivated successfully. Aug 19 08:17:34.783266 systemd-logind[1976]: Session 17 logged out. Waiting for processes to exit. Aug 19 08:17:34.796262 systemd[1]: Started sshd@17-172.31.23.28:22-139.178.89.65:40848.service - OpenSSH per-connection server daemon (139.178.89.65:40848). Aug 19 08:17:34.797569 systemd-logind[1976]: Removed session 17. 
Aug 19 08:17:34.977715 sshd[6273]: Accepted publickey for core from 139.178.89.65 port 40848 ssh2: RSA SHA256:SANMFU2cnK7/l5OXguT20pR9hK/sN9KMMvxRhI/63O4 Aug 19 08:17:34.979607 sshd-session[6273]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:17:34.985150 systemd-logind[1976]: New session 18 of user core. Aug 19 08:17:34.990229 systemd[1]: Started session-18.scope - Session 18 of User core. Aug 19 08:17:37.867629 sshd[6276]: Connection closed by 139.178.89.65 port 40848 Aug 19 08:17:37.881254 sshd-session[6273]: pam_unix(sshd:session): session closed for user core Aug 19 08:17:37.954005 systemd[1]: sshd@17-172.31.23.28:22-139.178.89.65:40848.service: Deactivated successfully. Aug 19 08:17:37.960563 systemd[1]: session-18.scope: Deactivated successfully. Aug 19 08:17:37.960847 systemd[1]: session-18.scope: Consumed 739ms CPU time, 82.1M memory peak. Aug 19 08:17:37.971260 systemd-logind[1976]: Session 18 logged out. Waiting for processes to exit. Aug 19 08:17:37.986020 systemd[1]: Started sshd@18-172.31.23.28:22-139.178.89.65:40862.service - OpenSSH per-connection server daemon (139.178.89.65:40862). Aug 19 08:17:38.002414 systemd-logind[1976]: Removed session 18. Aug 19 08:17:38.361643 sshd[6301]: Accepted publickey for core from 139.178.89.65 port 40862 ssh2: RSA SHA256:SANMFU2cnK7/l5OXguT20pR9hK/sN9KMMvxRhI/63O4 Aug 19 08:17:38.369330 sshd-session[6301]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:17:38.392116 systemd-logind[1976]: New session 19 of user core. Aug 19 08:17:38.396433 systemd[1]: Started session-19.scope - Session 19 of User core. Aug 19 08:17:39.530350 containerd[2014]: time="2025-08-19T08:17:39.530012703Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b1a7260f2cc4b5b7a07062135d3cd6c11f4d02640c7eb9582957642e2b1996fc\" id:\"7db77fd07e541bbfd1fc854f24ea5a619d5a68f976b56f608e253f6aa4c1adf6\" pid:6339 exited_at:{seconds:1755591459 nanos:500592873}" Aug 19 08:17:40.577581 containerd[2014]: time="2025-08-19T08:17:40.576709114Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b61a60649224c6ced9e1f442acf1488fa604668911f0a73a1d1ac222fa2e5723\" id:\"8a2a9afdedb772ff237b0761320f5a4055847f00e29c1748bb85a5cd1636d566\" pid:6346 exited_at:{seconds:1755591460 nanos:575876626}" Aug 19 08:17:40.672644 sshd[6306]: Connection closed by 139.178.89.65 port 40862 Aug 19 08:17:40.676809 sshd-session[6301]: pam_unix(sshd:session): session closed for user core Aug 19 08:17:40.688462 systemd[1]: sshd@18-172.31.23.28:22-139.178.89.65:40862.service: Deactivated successfully. Aug 19 08:17:40.690825 systemd-logind[1976]: Session 19 logged out. Waiting for processes to exit. Aug 19 08:17:40.696683 systemd[1]: session-19.scope: Deactivated successfully. Aug 19 08:17:40.697751 systemd[1]: session-19.scope: Consumed 878ms CPU time, 65.5M memory peak. Aug 19 08:17:40.727128 systemd[1]: Started sshd@19-172.31.23.28:22-139.178.89.65:45448.service - OpenSSH per-connection server daemon (139.178.89.65:45448). Aug 19 08:17:40.730944 systemd-logind[1976]: Removed session 19. Aug 19 08:17:41.017499 sshd[6364]: Accepted publickey for core from 139.178.89.65 port 45448 ssh2: RSA SHA256:SANMFU2cnK7/l5OXguT20pR9hK/sN9KMMvxRhI/63O4 Aug 19 08:17:41.022616 sshd-session[6364]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:17:41.058109 systemd-logind[1976]: New session 20 of user core. 
Aug 19 08:17:41.063251 systemd[1]: Started session-20.scope - Session 20 of User core. Aug 19 08:17:41.855214 sshd[6367]: Connection closed by 139.178.89.65 port 45448 Aug 19 08:17:41.856435 sshd-session[6364]: pam_unix(sshd:session): session closed for user core Aug 19 08:17:41.863719 systemd[1]: sshd@19-172.31.23.28:22-139.178.89.65:45448.service: Deactivated successfully. Aug 19 08:17:41.865545 systemd-logind[1976]: Session 20 logged out. Waiting for processes to exit. Aug 19 08:17:41.867904 systemd[1]: session-20.scope: Deactivated successfully. Aug 19 08:17:41.876542 systemd-logind[1976]: Removed session 20. Aug 19 08:17:46.889300 systemd[1]: Started sshd@20-172.31.23.28:22-139.178.89.65:45462.service - OpenSSH per-connection server daemon (139.178.89.65:45462). Aug 19 08:17:47.190517 sshd[6384]: Accepted publickey for core from 139.178.89.65 port 45462 ssh2: RSA SHA256:SANMFU2cnK7/l5OXguT20pR9hK/sN9KMMvxRhI/63O4 Aug 19 08:17:47.195269 sshd-session[6384]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:17:47.205268 systemd-logind[1976]: New session 21 of user core. Aug 19 08:17:47.209457 systemd[1]: Started session-21.scope - Session 21 of User core. Aug 19 08:17:48.515202 sshd[6387]: Connection closed by 139.178.89.65 port 45462 Aug 19 08:17:48.513608 sshd-session[6384]: pam_unix(sshd:session): session closed for user core Aug 19 08:17:48.522114 systemd[1]: sshd@20-172.31.23.28:22-139.178.89.65:45462.service: Deactivated successfully. Aug 19 08:17:48.525625 systemd[1]: session-21.scope: Deactivated successfully. Aug 19 08:17:48.528418 systemd-logind[1976]: Session 21 logged out. Waiting for processes to exit. Aug 19 08:17:48.533345 systemd-logind[1976]: Removed session 21. Aug 19 08:17:53.548426 systemd[1]: Started sshd@21-172.31.23.28:22-139.178.89.65:55964.service - OpenSSH per-connection server daemon (139.178.89.65:55964). Aug 19 08:17:53.778168 sshd[6403]: Accepted publickey for core from 139.178.89.65 port 55964 ssh2: RSA SHA256:SANMFU2cnK7/l5OXguT20pR9hK/sN9KMMvxRhI/63O4 Aug 19 08:17:53.779727 sshd-session[6403]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:17:53.789456 systemd-logind[1976]: New session 22 of user core. Aug 19 08:17:53.794252 systemd[1]: Started session-22.scope - Session 22 of User core. Aug 19 08:17:54.021920 sshd[6406]: Connection closed by 139.178.89.65 port 55964 Aug 19 08:17:54.023259 sshd-session[6403]: pam_unix(sshd:session): session closed for user core Aug 19 08:17:54.027739 systemd[1]: sshd@21-172.31.23.28:22-139.178.89.65:55964.service: Deactivated successfully. Aug 19 08:17:54.029814 systemd[1]: session-22.scope: Deactivated successfully. Aug 19 08:17:54.030750 systemd-logind[1976]: Session 22 logged out. Waiting for processes to exit. Aug 19 08:17:54.032773 systemd-logind[1976]: Removed session 22. Aug 19 08:17:56.812816 containerd[2014]: time="2025-08-19T08:17:56.812755539Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bcef3581e4349e4b398004aa3d65afa5781cdbd287f2de77310d27e9a01fca28\" id:\"62b565be8aa0b4b09328060ed232cb22685a99e2cf99ff5ca90bba8f8f8ac528\" pid:6431 exited_at:{seconds:1755591476 nanos:760145521}" Aug 19 08:17:59.063338 systemd[1]: Started sshd@22-172.31.23.28:22-139.178.89.65:56734.service - OpenSSH per-connection server daemon (139.178.89.65:56734). 
Aug 19 08:17:59.373406 sshd[6443]: Accepted publickey for core from 139.178.89.65 port 56734 ssh2: RSA SHA256:SANMFU2cnK7/l5OXguT20pR9hK/sN9KMMvxRhI/63O4 Aug 19 08:17:59.375287 sshd-session[6443]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:17:59.384207 systemd-logind[1976]: New session 23 of user core. Aug 19 08:17:59.389601 systemd[1]: Started session-23.scope - Session 23 of User core. Aug 19 08:17:59.928565 sshd[6447]: Connection closed by 139.178.89.65 port 56734 Aug 19 08:17:59.929762 sshd-session[6443]: pam_unix(sshd:session): session closed for user core Aug 19 08:17:59.936812 systemd-logind[1976]: Session 23 logged out. Waiting for processes to exit. Aug 19 08:17:59.937403 systemd[1]: sshd@22-172.31.23.28:22-139.178.89.65:56734.service: Deactivated successfully. Aug 19 08:17:59.941785 systemd[1]: session-23.scope: Deactivated successfully. Aug 19 08:17:59.948792 systemd-logind[1976]: Removed session 23. Aug 19 08:18:04.971567 systemd[1]: Started sshd@23-172.31.23.28:22-139.178.89.65:56748.service - OpenSSH per-connection server daemon (139.178.89.65:56748). Aug 19 08:18:05.323145 sshd[6463]: Accepted publickey for core from 139.178.89.65 port 56748 ssh2: RSA SHA256:SANMFU2cnK7/l5OXguT20pR9hK/sN9KMMvxRhI/63O4 Aug 19 08:18:05.327853 sshd-session[6463]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:18:05.341348 systemd-logind[1976]: New session 24 of user core. Aug 19 08:18:05.347272 systemd[1]: Started session-24.scope - Session 24 of User core. Aug 19 08:18:06.468582 sshd[6466]: Connection closed by 139.178.89.65 port 56748 Aug 19 08:18:06.473271 sshd-session[6463]: pam_unix(sshd:session): session closed for user core Aug 19 08:18:06.483012 systemd[1]: sshd@23-172.31.23.28:22-139.178.89.65:56748.service: Deactivated successfully. Aug 19 08:18:06.488022 systemd[1]: session-24.scope: Deactivated successfully. Aug 19 08:18:06.490926 systemd-logind[1976]: Session 24 logged out. Waiting for processes to exit. Aug 19 08:18:06.495700 systemd-logind[1976]: Removed session 24. Aug 19 08:18:07.575959 containerd[2014]: time="2025-08-19T08:18:07.575896526Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b1a7260f2cc4b5b7a07062135d3cd6c11f4d02640c7eb9582957642e2b1996fc\" id:\"fd321afd6fbafeb064792d1ebdbd6fbab10af70ac83b7beceb00d45cf570d89e\" pid:6507 exited_at:{seconds:1755591487 nanos:565815789}" Aug 19 08:18:08.199171 containerd[2014]: time="2025-08-19T08:18:08.164612240Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b61a60649224c6ced9e1f442acf1488fa604668911f0a73a1d1ac222fa2e5723\" id:\"131c77de628bf3353e2e3c4b58300bb71d4db72f364fac5a7350186f008e06d4\" pid:6499 exited_at:{seconds:1755591488 nanos:162653358}" Aug 19 08:18:11.527353 systemd[1]: Started sshd@24-172.31.23.28:22-139.178.89.65:47766.service - OpenSSH per-connection server daemon (139.178.89.65:47766). Aug 19 08:18:11.762552 sshd[6524]: Accepted publickey for core from 139.178.89.65 port 47766 ssh2: RSA SHA256:SANMFU2cnK7/l5OXguT20pR9hK/sN9KMMvxRhI/63O4 Aug 19 08:18:11.764410 sshd-session[6524]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:18:11.770986 systemd-logind[1976]: New session 25 of user core. Aug 19 08:18:11.778583 systemd[1]: Started session-25.scope - Session 25 of User core. 
Aug 19 08:18:12.485001 sshd[6527]: Connection closed by 139.178.89.65 port 47766 Aug 19 08:18:12.488291 sshd-session[6524]: pam_unix(sshd:session): session closed for user core Aug 19 08:18:12.494505 systemd[1]: sshd@24-172.31.23.28:22-139.178.89.65:47766.service: Deactivated successfully. Aug 19 08:18:12.498429 systemd[1]: session-25.scope: Deactivated successfully. Aug 19 08:18:12.501369 systemd-logind[1976]: Session 25 logged out. Waiting for processes to exit. Aug 19 08:18:12.504901 systemd-logind[1976]: Removed session 25. Aug 19 08:18:19.047611 containerd[2014]: time="2025-08-19T08:18:19.047530706Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b1a7260f2cc4b5b7a07062135d3cd6c11f4d02640c7eb9582957642e2b1996fc\" id:\"ddf8397c16417c5ac29771a5f106652a3773192a62b6859636c32f8927cd80e3\" pid:6557 exited_at:{seconds:1755591499 nanos:46823070}" Aug 19 08:18:20.114763 containerd[2014]: time="2025-08-19T08:18:20.114719534Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b61a60649224c6ced9e1f442acf1488fa604668911f0a73a1d1ac222fa2e5723\" id:\"f4f182b8892b74eb0b0b7efccd52b4d01058d24a1dd5cd95423151fa4a9e685e\" pid:6579 exited_at:{seconds:1755591500 nanos:114330010}" Aug 19 08:18:26.274845 containerd[2014]: time="2025-08-19T08:18:26.274799416Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bcef3581e4349e4b398004aa3d65afa5781cdbd287f2de77310d27e9a01fca28\" id:\"62800b7b94c9ba0f0a2fcebcb1c3a6a45c3bfc6f2fce1f16813c7f2a25c60378\" pid:6604 exit_status:1 exited_at:{seconds:1755591506 nanos:274325841}" Aug 19 08:18:37.409565 containerd[2014]: time="2025-08-19T08:18:37.409515525Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b1a7260f2cc4b5b7a07062135d3cd6c11f4d02640c7eb9582957642e2b1996fc\" id:\"0ed79eda81b5253af7d75b30892984f993772a341e1b2510a81097b20fa8b18d\" pid:6671 exited_at:{seconds:1755591517 nanos:391501310}" Aug 19 08:18:37.410012 containerd[2014]: time="2025-08-19T08:18:37.409615227Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b61a60649224c6ced9e1f442acf1488fa604668911f0a73a1d1ac222fa2e5723\" id:\"2d6ab9e47d7f0b6fe0cc30e6c2f64fbbd8d7466afc7c53d0ba331fc0266c4d80\" pid:6649 exited_at:{seconds:1755591517 nanos:402896298}" Aug 19 08:18:39.015829 systemd[1]: cri-containerd-814f579cfb62e75e25f9a8879fa127503ec3b092ee88a0efdc912b0c9a4b0923.scope: Deactivated successfully. Aug 19 08:18:39.028931 systemd[1]: cri-containerd-814f579cfb62e75e25f9a8879fa127503ec3b092ee88a0efdc912b0c9a4b0923.scope: Consumed 14.040s CPU time, 105.5M memory peak, 100.2M read from disk. Aug 19 08:18:39.164881 containerd[2014]: time="2025-08-19T08:18:39.164818694Z" level=info msg="TaskExit event in podsandbox handler container_id:\"814f579cfb62e75e25f9a8879fa127503ec3b092ee88a0efdc912b0c9a4b0923\" id:\"814f579cfb62e75e25f9a8879fa127503ec3b092ee88a0efdc912b0c9a4b0923\" pid:3599 exit_status:1 exited_at:{seconds:1755591519 nanos:131426568}" Aug 19 08:18:39.194253 containerd[2014]: time="2025-08-19T08:18:39.182084335Z" level=info msg="received exit event container_id:\"814f579cfb62e75e25f9a8879fa127503ec3b092ee88a0efdc912b0c9a4b0923\" id:\"814f579cfb62e75e25f9a8879fa127503ec3b092ee88a0efdc912b0c9a4b0923\" pid:3599 exit_status:1 exited_at:{seconds:1755591519 nanos:131426568}" Aug 19 08:18:39.324115 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-814f579cfb62e75e25f9a8879fa127503ec3b092ee88a0efdc912b0c9a4b0923-rootfs.mount: Deactivated successfully. 
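The TaskExit events above carry exited_at as separate seconds and nanos fields (Unix time). Converting them makes it easier to line the exits up against the journal timestamps; for example, seconds:1755591519 nanos:131426568 from the 814f579cfb62 TaskExit event above (container id shortened here) works out to 08:18:39.131 UTC, which lines up with the surrounding Aug 19 08:18:39 entries. A minimal conversion sketch:

    from datetime import datetime, timezone

    def exited_at_to_utc(seconds: int, nanos: int) -> datetime:
        # exited_at is a Unix timestamp split into whole seconds and nanoseconds.
        return datetime.fromtimestamp(seconds + nanos / 1e9, tz=timezone.utc)

    # Values copied from the 814f579cfb62... TaskExit event above.
    print(exited_at_to_utc(1755591519, 131426568))
    # -> 2025-08-19 08:18:39.131... UTC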
Aug 19 08:18:39.601978 systemd[1]: cri-containerd-03324a31ab015fc9a2c5b32c6e9d6535ae3df0b6cab050c56cd48a54466e3666.scope: Deactivated successfully. Aug 19 08:18:39.602441 systemd[1]: cri-containerd-03324a31ab015fc9a2c5b32c6e9d6535ae3df0b6cab050c56cd48a54466e3666.scope: Consumed 3.482s CPU time, 81.7M memory peak, 132.6M read from disk. Aug 19 08:18:39.616575 containerd[2014]: time="2025-08-19T08:18:39.616195648Z" level=info msg="received exit event container_id:\"03324a31ab015fc9a2c5b32c6e9d6535ae3df0b6cab050c56cd48a54466e3666\" id:\"03324a31ab015fc9a2c5b32c6e9d6535ae3df0b6cab050c56cd48a54466e3666\" pid:3092 exit_status:1 exited_at:{seconds:1755591519 nanos:615165893}" Aug 19 08:18:39.617051 containerd[2014]: time="2025-08-19T08:18:39.616458768Z" level=info msg="TaskExit event in podsandbox handler container_id:\"03324a31ab015fc9a2c5b32c6e9d6535ae3df0b6cab050c56cd48a54466e3666\" id:\"03324a31ab015fc9a2c5b32c6e9d6535ae3df0b6cab050c56cd48a54466e3666\" pid:3092 exit_status:1 exited_at:{seconds:1755591519 nanos:615165893}" Aug 19 08:18:39.651715 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-03324a31ab015fc9a2c5b32c6e9d6535ae3df0b6cab050c56cd48a54466e3666-rootfs.mount: Deactivated successfully. Aug 19 08:18:39.907530 kubelet[3281]: I0819 08:18:39.901358 3281 scope.go:117] "RemoveContainer" containerID="814f579cfb62e75e25f9a8879fa127503ec3b092ee88a0efdc912b0c9a4b0923" Aug 19 08:18:39.919229 kubelet[3281]: I0819 08:18:39.919084 3281 scope.go:117] "RemoveContainer" containerID="03324a31ab015fc9a2c5b32c6e9d6535ae3df0b6cab050c56cd48a54466e3666" Aug 19 08:18:39.986978 containerd[2014]: time="2025-08-19T08:18:39.986931467Z" level=info msg="CreateContainer within sandbox \"26ead25ce943da983885d347d72dee684cc450e21027101dbe06b1e2d327d892\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Aug 19 08:18:39.987414 containerd[2014]: time="2025-08-19T08:18:39.986940781Z" level=info msg="CreateContainer within sandbox \"bc44d980f336defe0e9e95977a416f17c25fc404a85acfa924a1b8dd40c656bc\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Aug 19 08:18:40.123474 containerd[2014]: time="2025-08-19T08:18:40.123135740Z" level=info msg="Container 20d9b60dde9fd8b9c1cdf2d7cfc264e17f46a0595aea21ab0a38ddc65e34efca: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:18:40.133686 containerd[2014]: time="2025-08-19T08:18:40.133649213Z" level=info msg="Container 43a98b422bf2634742c6a6128d61cd59effb56ed520e1cd0b36675dff5acaa9a: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:18:40.135373 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1651088282.mount: Deactivated successfully. 
Aug 19 08:18:40.165138 containerd[2014]: time="2025-08-19T08:18:40.164535272Z" level=info msg="CreateContainer within sandbox \"bc44d980f336defe0e9e95977a416f17c25fc404a85acfa924a1b8dd40c656bc\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"43a98b422bf2634742c6a6128d61cd59effb56ed520e1cd0b36675dff5acaa9a\"" Aug 19 08:18:40.167627 containerd[2014]: time="2025-08-19T08:18:40.167527774Z" level=info msg="CreateContainer within sandbox \"26ead25ce943da983885d347d72dee684cc450e21027101dbe06b1e2d327d892\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"20d9b60dde9fd8b9c1cdf2d7cfc264e17f46a0595aea21ab0a38ddc65e34efca\"" Aug 19 08:18:40.170787 containerd[2014]: time="2025-08-19T08:18:40.170748175Z" level=info msg="StartContainer for \"20d9b60dde9fd8b9c1cdf2d7cfc264e17f46a0595aea21ab0a38ddc65e34efca\"" Aug 19 08:18:40.173120 containerd[2014]: time="2025-08-19T08:18:40.173023788Z" level=info msg="StartContainer for \"43a98b422bf2634742c6a6128d61cd59effb56ed520e1cd0b36675dff5acaa9a\"" Aug 19 08:18:40.177090 containerd[2014]: time="2025-08-19T08:18:40.176082190Z" level=info msg="connecting to shim 20d9b60dde9fd8b9c1cdf2d7cfc264e17f46a0595aea21ab0a38ddc65e34efca" address="unix:///run/containerd/s/d823c6a8d3c42dd2f1e8118bdea90928666ca81138f939cb6bd0a15cf4e653e6" protocol=ttrpc version=3 Aug 19 08:18:40.182240 containerd[2014]: time="2025-08-19T08:18:40.182135004Z" level=info msg="connecting to shim 43a98b422bf2634742c6a6128d61cd59effb56ed520e1cd0b36675dff5acaa9a" address="unix:///run/containerd/s/84ec6d575c1a36d097e53ae3292fa6e67a5cef0c1c909a16f28469bbfa5fdbbc" protocol=ttrpc version=3 Aug 19 08:18:40.277300 systemd[1]: Started cri-containerd-20d9b60dde9fd8b9c1cdf2d7cfc264e17f46a0595aea21ab0a38ddc65e34efca.scope - libcontainer container 20d9b60dde9fd8b9c1cdf2d7cfc264e17f46a0595aea21ab0a38ddc65e34efca. Aug 19 08:18:40.280190 systemd[1]: Started cri-containerd-43a98b422bf2634742c6a6128d61cd59effb56ed520e1cd0b36675dff5acaa9a.scope - libcontainer container 43a98b422bf2634742c6a6128d61cd59effb56ed520e1cd0b36675dff5acaa9a. Aug 19 08:18:40.401136 containerd[2014]: time="2025-08-19T08:18:40.401054184Z" level=info msg="StartContainer for \"20d9b60dde9fd8b9c1cdf2d7cfc264e17f46a0595aea21ab0a38ddc65e34efca\" returns successfully" Aug 19 08:18:40.429467 containerd[2014]: time="2025-08-19T08:18:40.429332312Z" level=info msg="StartContainer for \"43a98b422bf2634742c6a6128d61cd59effb56ed520e1cd0b36675dff5acaa9a\" returns successfully" Aug 19 08:18:43.436141 systemd[1]: cri-containerd-8663b1b649ed995e58ed4b8da12ce3f5e981f98e3478945020d26318d1042901.scope: Deactivated successfully. Aug 19 08:18:43.436439 systemd[1]: cri-containerd-8663b1b649ed995e58ed4b8da12ce3f5e981f98e3478945020d26318d1042901.scope: Consumed 1.748s CPU time, 37.3M memory peak, 86.5M read from disk. 
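Several scope-deactivation entries in this section report resource accounting of the form "Consumed 1.748s CPU time, 37.3M memory peak, 86.5M read from disk". Below is a small sketch for pulling those figures out of such lines; the regex is written against the wording visible in this log (CPU in ms or s, memory and disk figures with a single size suffix, disk field optional), so adjust it if your systemd version phrases the summary differently.

    import re

    CONSUMED = re.compile(
        r'Consumed (?P<cpu>[\d.]+)(?P<cpu_unit>ms|s) CPU time'
        r'(?:, (?P<mem>[\d.]+)(?P<mem_unit>[KMGT]) memory peak)?'
        r'(?:, (?P<read>[\d.]+)(?P<read_unit>[KMGT]) read from disk)?'
    )

    def cpu_seconds(value: str, unit: str) -> float:
        # Normalize the CPU figure to seconds.
        return float(value) / 1000.0 if unit == "ms" else float(value)

    line = ("Aug 19 08:18:43.436439 systemd[1]: cri-containerd-"
            "8663b1b649ed995e58ed4b8da12ce3f5e981f98e3478945020d26318d1042901.scope: "
            "Consumed 1.748s CPU time, 37.3M memory peak, 86.5M read from disk.")
    m = CONSUMED.search(line)
    print(m.groupdict())
    print("cpu seconds:", cpu_seconds(m["cpu"], m["cpu_unit"]))  # -> 1.748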
Aug 19 08:18:43.439383 containerd[2014]: time="2025-08-19T08:18:43.439340765Z" level=info msg="received exit event container_id:\"8663b1b649ed995e58ed4b8da12ce3f5e981f98e3478945020d26318d1042901\" id:\"8663b1b649ed995e58ed4b8da12ce3f5e981f98e3478945020d26318d1042901\" pid:3128 exit_status:1 exited_at:{seconds:1755591523 nanos:438807617}" Aug 19 08:18:43.440576 containerd[2014]: time="2025-08-19T08:18:43.439360484Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8663b1b649ed995e58ed4b8da12ce3f5e981f98e3478945020d26318d1042901\" id:\"8663b1b649ed995e58ed4b8da12ce3f5e981f98e3478945020d26318d1042901\" pid:3128 exit_status:1 exited_at:{seconds:1755591523 nanos:438807617}" Aug 19 08:18:43.482639 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-8663b1b649ed995e58ed4b8da12ce3f5e981f98e3478945020d26318d1042901-rootfs.mount: Deactivated successfully. Aug 19 08:18:43.852544 kubelet[3281]: I0819 08:18:43.852437 3281 scope.go:117] "RemoveContainer" containerID="8663b1b649ed995e58ed4b8da12ce3f5e981f98e3478945020d26318d1042901" Aug 19 08:18:43.854976 containerd[2014]: time="2025-08-19T08:18:43.854930786Z" level=info msg="CreateContainer within sandbox \"f0fc598301dc36071eb59ed71363744d96564a865769f4cbe9a9242910e94818\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}" Aug 19 08:18:43.883435 containerd[2014]: time="2025-08-19T08:18:43.883383445Z" level=info msg="Container 8aca8b82872712416252a1c2a87600df40eeb636cd635986cd08dc65bd4c8514: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:18:43.898186 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount380804234.mount: Deactivated successfully. Aug 19 08:18:43.904357 containerd[2014]: time="2025-08-19T08:18:43.904293156Z" level=info msg="CreateContainer within sandbox \"f0fc598301dc36071eb59ed71363744d96564a865769f4cbe9a9242910e94818\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"8aca8b82872712416252a1c2a87600df40eeb636cd635986cd08dc65bd4c8514\"" Aug 19 08:18:43.904862 containerd[2014]: time="2025-08-19T08:18:43.904804431Z" level=info msg="StartContainer for \"8aca8b82872712416252a1c2a87600df40eeb636cd635986cd08dc65bd4c8514\"" Aug 19 08:18:43.906242 containerd[2014]: time="2025-08-19T08:18:43.906203092Z" level=info msg="connecting to shim 8aca8b82872712416252a1c2a87600df40eeb636cd635986cd08dc65bd4c8514" address="unix:///run/containerd/s/bab843e31cfaa8a765cefa333c93a27ee98a1dc51159e0847dff480a0d68f586" protocol=ttrpc version=3 Aug 19 08:18:43.935255 systemd[1]: Started cri-containerd-8aca8b82872712416252a1c2a87600df40eeb636cd635986cd08dc65bd4c8514.scope - libcontainer container 8aca8b82872712416252a1c2a87600df40eeb636cd635986cd08dc65bd4c8514. Aug 19 08:18:43.994314 kubelet[3281]: E0819 08:18:43.994074 3281 controller.go:195] "Failed to update lease" err="the server was unable to return a response in the time allotted, but may still be processing the request (put leases.coordination.k8s.io ip-172-31-23-28)" Aug 19 08:18:44.009546 containerd[2014]: time="2025-08-19T08:18:44.009507047Z" level=info msg="StartContainer for \"8aca8b82872712416252a1c2a87600df40eeb636cd635986cd08dc65bd4c8514\" returns successfully"