Sep 9 05:33:51.907111 kernel: Linux version 6.12.45-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Tue Sep 9 03:39:34 -00 2025
Sep 9 05:33:51.907149 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=ec2 modprobe.blacklist=xen_fbfront net.ifnames=0 nvme_core.io_timeout=4294967295 verity.usrhash=107bc9be805328e5e30844239fa87d36579f371e3de2c34fec43f6ff6d17b104
Sep 9 05:33:51.907163 kernel: BIOS-provided physical RAM map:
Sep 9 05:33:51.907174 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable
Sep 9 05:33:51.907184 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000786cdfff] usable
Sep 9 05:33:51.907194 kernel: BIOS-e820: [mem 0x00000000786ce000-0x000000007894dfff] reserved
Sep 9 05:33:51.907207 kernel: BIOS-e820: [mem 0x000000007894e000-0x000000007895dfff] ACPI data
Sep 9 05:33:51.907218 kernel: BIOS-e820: [mem 0x000000007895e000-0x00000000789ddfff] ACPI NVS
Sep 9 05:33:51.907232 kernel: BIOS-e820: [mem 0x00000000789de000-0x000000007c97bfff] usable
Sep 9 05:33:51.907243 kernel: BIOS-e820: [mem 0x000000007c97c000-0x000000007c9fffff] reserved
Sep 9 05:33:51.907253 kernel: NX (Execute Disable) protection: active
Sep 9 05:33:51.907264 kernel: APIC: Static calls initialized
Sep 9 05:33:51.907275 kernel: e820: update [mem 0x768c0018-0x768c8e57] usable ==> usable
Sep 9 05:33:51.907287 kernel: extended physical RAM map:
Sep 9 05:33:51.907305 kernel: reserve setup_data: [mem 0x0000000000000000-0x000000000009ffff] usable
Sep 9 05:33:51.907318 kernel: reserve setup_data: [mem 0x0000000000100000-0x00000000768c0017] usable
Sep 9 05:33:51.907331 kernel: reserve setup_data: [mem 0x00000000768c0018-0x00000000768c8e57] usable
Sep 9 05:33:51.907343 kernel: reserve setup_data: [mem 0x00000000768c8e58-0x00000000786cdfff] usable
Sep 9 05:33:51.907356 kernel: reserve setup_data: [mem 0x00000000786ce000-0x000000007894dfff] reserved
Sep 9 05:33:51.907369 kernel: reserve setup_data: [mem 0x000000007894e000-0x000000007895dfff] ACPI data
Sep 9 05:33:51.907382 kernel: reserve setup_data: [mem 0x000000007895e000-0x00000000789ddfff] ACPI NVS
Sep 9 05:33:51.907395 kernel: reserve setup_data: [mem 0x00000000789de000-0x000000007c97bfff] usable
Sep 9 05:33:51.907407 kernel: reserve setup_data: [mem 0x000000007c97c000-0x000000007c9fffff] reserved
Sep 9 05:33:51.907420 kernel: efi: EFI v2.7 by EDK II
Sep 9 05:33:51.907436 kernel: efi: SMBIOS=0x7886a000 ACPI=0x7895d000 ACPI 2.0=0x7895d014 MEMATTR=0x77003518
Sep 9 05:33:51.907449 kernel: secureboot: Secure boot disabled
Sep 9 05:33:51.907461 kernel: SMBIOS 2.7 present.
Sep 9 05:33:51.907474 kernel: DMI: Amazon EC2 t3.small/, BIOS 1.0 10/16/2017
Sep 9 05:33:51.907487 kernel: DMI: Memory slots populated: 1/1
Sep 9 05:33:51.907499 kernel: Hypervisor detected: KVM
Sep 9 05:33:51.907512 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Sep 9 05:33:51.907526 kernel: kvm-clock: using sched offset of 5339837080 cycles
Sep 9 05:33:51.907878 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Sep 9 05:33:51.907893 kernel: tsc: Detected 2499.998 MHz processor
Sep 9 05:33:51.907906 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Sep 9 05:33:51.907923 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Sep 9 05:33:51.907936 kernel: last_pfn = 0x7c97c max_arch_pfn = 0x400000000
Sep 9 05:33:51.907949 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs
Sep 9 05:33:51.907961 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Sep 9 05:33:51.907974 kernel: Using GB pages for direct mapping
Sep 9 05:33:51.907992 kernel: ACPI: Early table checksum verification disabled
Sep 9 05:33:51.908009 kernel: ACPI: RSDP 0x000000007895D014 000024 (v02 AMAZON)
Sep 9 05:33:51.908023 kernel: ACPI: XSDT 0x000000007895C0E8 00006C (v01 AMAZON AMZNFACP 00000001 01000013)
Sep 9 05:33:51.908036 kernel: ACPI: FACP 0x0000000078955000 000114 (v01 AMAZON AMZNFACP 00000001 AMZN 00000001)
Sep 9 05:33:51.908050 kernel: ACPI: DSDT 0x0000000078956000 00115A (v01 AMAZON AMZNDSDT 00000001 AMZN 00000001)
Sep 9 05:33:51.908064 kernel: ACPI: FACS 0x00000000789D0000 000040
Sep 9 05:33:51.908077 kernel: ACPI: WAET 0x000000007895B000 000028 (v01 AMAZON AMZNWAET 00000001 AMZN 00000001)
Sep 9 05:33:51.908091 kernel: ACPI: SLIT 0x000000007895A000 00006C (v01 AMAZON AMZNSLIT 00000001 AMZN 00000001)
Sep 9 05:33:51.908104 kernel: ACPI: APIC 0x0000000078959000 000076 (v01 AMAZON AMZNAPIC 00000001 AMZN 00000001)
Sep 9 05:33:51.908120 kernel: ACPI: SRAT 0x0000000078958000 0000A0 (v01 AMAZON AMZNSRAT 00000001 AMZN 00000001)
Sep 9 05:33:51.908134 kernel: ACPI: HPET 0x0000000078954000 000038 (v01 AMAZON AMZNHPET 00000001 AMZN 00000001)
Sep 9 05:33:51.908147 kernel: ACPI: SSDT 0x0000000078953000 000759 (v01 AMAZON AMZNSSDT 00000001 AMZN 00000001)
Sep 9 05:33:51.908160 kernel: ACPI: SSDT 0x0000000078952000 00007F (v01 AMAZON AMZNSSDT 00000001 AMZN 00000001)
Sep 9 05:33:51.908174 kernel: ACPI: BGRT 0x0000000078951000 000038 (v01 AMAZON AMAZON 00000002 01000013)
Sep 9 05:33:51.908187 kernel: ACPI: Reserving FACP table memory at [mem 0x78955000-0x78955113]
Sep 9 05:33:51.908201 kernel: ACPI: Reserving DSDT table memory at [mem 0x78956000-0x78957159]
Sep 9 05:33:51.908214 kernel: ACPI: Reserving FACS table memory at [mem 0x789d0000-0x789d003f]
Sep 9 05:33:51.908230 kernel: ACPI: Reserving WAET table memory at [mem 0x7895b000-0x7895b027]
Sep 9 05:33:51.908243 kernel: ACPI: Reserving SLIT table memory at [mem 0x7895a000-0x7895a06b]
Sep 9 05:33:51.908256 kernel: ACPI: Reserving APIC table memory at [mem 0x78959000-0x78959075]
Sep 9 05:33:51.908270 kernel: ACPI: Reserving SRAT table memory at [mem 0x78958000-0x7895809f]
Sep 9 05:33:51.908283 kernel: ACPI: Reserving HPET table memory at [mem 0x78954000-0x78954037]
Sep 9 05:33:51.908297 kernel: ACPI: Reserving SSDT table memory at [mem 0x78953000-0x78953758]
Sep 9 05:33:51.908310 kernel: ACPI: Reserving SSDT table memory at [mem 0x78952000-0x7895207e]
Sep 9 05:33:51.908323 kernel: ACPI: Reserving BGRT table memory at [mem 0x78951000-0x78951037]
Sep 9 05:33:51.908337 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x7fffffff]
Sep 9 05:33:51.908350 kernel: NUMA: Initialized distance table, cnt=1
Sep 9 05:33:51.908366 kernel: NODE_DATA(0) allocated [mem 0x7a8eddc0-0x7a8f4fff]
Sep 9 05:33:51.908380 kernel: Zone ranges:
Sep 9 05:33:51.908393 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Sep 9 05:33:51.908406 kernel: DMA32 [mem 0x0000000001000000-0x000000007c97bfff]
Sep 9 05:33:51.908420 kernel: Normal empty
Sep 9 05:33:51.908433 kernel: Device empty
Sep 9 05:33:51.908447 kernel: Movable zone start for each node
Sep 9 05:33:51.908460 kernel: Early memory node ranges
Sep 9 05:33:51.908473 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff]
Sep 9 05:33:51.908489 kernel: node 0: [mem 0x0000000000100000-0x00000000786cdfff]
Sep 9 05:33:51.908503 kernel: node 0: [mem 0x00000000789de000-0x000000007c97bfff]
Sep 9 05:33:51.908516 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007c97bfff]
Sep 9 05:33:51.908542 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Sep 9 05:33:51.909360 kernel: On node 0, zone DMA: 96 pages in unavailable ranges
Sep 9 05:33:51.909375 kernel: On node 0, zone DMA32: 784 pages in unavailable ranges
Sep 9 05:33:51.909390 kernel: On node 0, zone DMA32: 13956 pages in unavailable ranges
Sep 9 05:33:51.909405 kernel: ACPI: PM-Timer IO Port: 0xb008
Sep 9 05:33:51.909419 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Sep 9 05:33:51.909438 kernel: IOAPIC[0]: apic_id 0, version 32, address 0xfec00000, GSI 0-23
Sep 9 05:33:51.909452 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Sep 9 05:33:51.909466 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Sep 9 05:33:51.909480 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Sep 9 05:33:51.909495 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Sep 9 05:33:51.909509 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Sep 9 05:33:51.909524 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Sep 9 05:33:51.909558 kernel: TSC deadline timer available
Sep 9 05:33:51.909572 kernel: CPU topo: Max. logical packages: 1
Sep 9 05:33:51.909585 kernel: CPU topo: Max. logical dies: 1
Sep 9 05:33:51.909603 kernel: CPU topo: Max. dies per package: 1
Sep 9 05:33:51.909627 kernel: CPU topo: Max. threads per core: 2
Sep 9 05:33:51.909641 kernel: CPU topo: Num. cores per package: 1
Sep 9 05:33:51.909655 kernel: CPU topo: Num. threads per package: 2
Sep 9 05:33:51.909669 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs
Sep 9 05:33:51.909683 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Sep 9 05:33:51.909697 kernel: [mem 0x7ca00000-0xffffffff] available for PCI devices
Sep 9 05:33:51.909712 kernel: Booting paravirtualized kernel on KVM
Sep 9 05:33:51.909727 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Sep 9 05:33:51.909744 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Sep 9 05:33:51.909758 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576
Sep 9 05:33:51.909772 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152
Sep 9 05:33:51.909786 kernel: pcpu-alloc: [0] 0 1
Sep 9 05:33:51.909801 kernel: kvm-guest: PV spinlocks enabled
Sep 9 05:33:51.909814 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Sep 9 05:33:51.909832 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=ec2 modprobe.blacklist=xen_fbfront net.ifnames=0 nvme_core.io_timeout=4294967295 verity.usrhash=107bc9be805328e5e30844239fa87d36579f371e3de2c34fec43f6ff6d17b104
Sep 9 05:33:51.909847 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 9 05:33:51.909864 kernel: random: crng init done
Sep 9 05:33:51.909878 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 9 05:33:51.909891 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Sep 9 05:33:51.909929 kernel: Fallback order for Node 0: 0
Sep 9 05:33:51.909941 kernel: Built 1 zonelists, mobility grouping on. Total pages: 509451
Sep 9 05:33:51.909954 kernel: Policy zone: DMA32
Sep 9 05:33:51.909980 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 9 05:33:51.909997 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Sep 9 05:33:51.910011 kernel: Kernel/User page tables isolation: enabled
Sep 9 05:33:51.910025 kernel: ftrace: allocating 40102 entries in 157 pages
Sep 9 05:33:51.910038 kernel: ftrace: allocated 157 pages with 5 groups
Sep 9 05:33:51.910052 kernel: Dynamic Preempt: voluntary
Sep 9 05:33:51.910069 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 9 05:33:51.910084 kernel: rcu: RCU event tracing is enabled.
Sep 9 05:33:51.910098 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Sep 9 05:33:51.910112 kernel: Trampoline variant of Tasks RCU enabled.
Sep 9 05:33:51.910126 kernel: Rude variant of Tasks RCU enabled.
Sep 9 05:33:51.910142 kernel: Tracing variant of Tasks RCU enabled.
Sep 9 05:33:51.910156 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 9 05:33:51.910170 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Sep 9 05:33:51.910185 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 9 05:33:51.910198 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 9 05:33:51.910213 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 9 05:33:51.910227 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Sep 9 05:33:51.910241 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 9 05:33:51.910255 kernel: Console: colour dummy device 80x25
Sep 9 05:33:51.910272 kernel: printk: legacy console [tty0] enabled
Sep 9 05:33:51.910287 kernel: printk: legacy console [ttyS0] enabled
Sep 9 05:33:51.910301 kernel: ACPI: Core revision 20240827
Sep 9 05:33:51.910315 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 30580167144 ns
Sep 9 05:33:51.910329 kernel: APIC: Switch to symmetric I/O mode setup
Sep 9 05:33:51.910343 kernel: x2apic enabled
Sep 9 05:33:51.910357 kernel: APIC: Switched APIC routing to: physical x2apic
Sep 9 05:33:51.910380 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x240937b9988, max_idle_ns: 440795218083 ns
Sep 9 05:33:51.910394 kernel: Calibrating delay loop (skipped) preset value.. 4999.99 BogoMIPS (lpj=2499998)
Sep 9 05:33:51.910411 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8
Sep 9 05:33:51.910425 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4
Sep 9 05:33:51.910438 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Sep 9 05:33:51.910452 kernel: Spectre V2 : Mitigation: Retpolines
Sep 9 05:33:51.910465 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Sep 9 05:33:51.910479 kernel: RETBleed: WARNING: Spectre v2 mitigation leaves CPU vulnerable to RETBleed attacks, data leaks possible!
Sep 9 05:33:51.910494 kernel: RETBleed: Vulnerable
Sep 9 05:33:51.910507 kernel: Speculative Store Bypass: Vulnerable
Sep 9 05:33:51.910521 kernel: MDS: Vulnerable: Clear CPU buffers attempted, no microcode
Sep 9 05:33:51.910546 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode
Sep 9 05:33:51.910561 kernel: GDS: Unknown: Dependent on hypervisor status
Sep 9 05:33:51.910576 kernel: active return thunk: its_return_thunk
Sep 9 05:33:51.910589 kernel: ITS: Mitigation: Aligned branch/return thunks
Sep 9 05:33:51.910603 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Sep 9 05:33:51.910618 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Sep 9 05:33:51.910631 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Sep 9 05:33:51.910644 kernel: x86/fpu: Supporting XSAVE feature 0x008: 'MPX bounds registers'
Sep 9 05:33:51.910659 kernel: x86/fpu: Supporting XSAVE feature 0x010: 'MPX CSR'
Sep 9 05:33:51.910673 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask'
Sep 9 05:33:51.910687 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256'
Sep 9 05:33:51.910701 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256'
Sep 9 05:33:51.910719 kernel: x86/fpu: Supporting XSAVE feature 0x200: 'Protection Keys User registers'
Sep 9 05:33:51.910733 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Sep 9 05:33:51.910746 kernel: x86/fpu: xstate_offset[3]: 832, xstate_sizes[3]: 64
Sep 9 05:33:51.910760 kernel: x86/fpu: xstate_offset[4]: 896, xstate_sizes[4]: 64
Sep 9 05:33:51.910773 kernel: x86/fpu: xstate_offset[5]: 960, xstate_sizes[5]: 64
Sep 9 05:33:51.910787 kernel: x86/fpu: xstate_offset[6]: 1024, xstate_sizes[6]: 512
Sep 9 05:33:51.910801 kernel: x86/fpu: xstate_offset[7]: 1536, xstate_sizes[7]: 1024
Sep 9 05:33:51.910814 kernel: x86/fpu: xstate_offset[9]: 2560, xstate_sizes[9]: 8
Sep 9 05:33:51.910829 kernel: x86/fpu: Enabled xstate features 0x2ff, context size is 2568 bytes, using 'compacted' format.
Sep 9 05:33:51.910843 kernel: Freeing SMP alternatives memory: 32K
Sep 9 05:33:51.910857 kernel: pid_max: default: 32768 minimum: 301
Sep 9 05:33:51.910873 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Sep 9 05:33:51.910886 kernel: landlock: Up and running.
Sep 9 05:33:51.910900 kernel: SELinux: Initializing.
Sep 9 05:33:51.910914 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Sep 9 05:33:51.910928 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Sep 9 05:33:51.910942 kernel: smpboot: CPU0: Intel(R) Xeon(R) Platinum 8259CL CPU @ 2.50GHz (family: 0x6, model: 0x55, stepping: 0x7)
Sep 9 05:33:51.910955 kernel: Performance Events: unsupported p6 CPU model 85 no PMU driver, software events only.
Sep 9 05:33:51.910969 kernel: signal: max sigframe size: 3632
Sep 9 05:33:51.910983 kernel: rcu: Hierarchical SRCU implementation.
Sep 9 05:33:51.910997 kernel: rcu: Max phase no-delay instances is 400.
Sep 9 05:33:51.911011 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Sep 9 05:33:51.911029 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Sep 9 05:33:51.911043 kernel: smp: Bringing up secondary CPUs ...
Sep 9 05:33:51.911056 kernel: smpboot: x86: Booting SMP configuration:
Sep 9 05:33:51.911070 kernel: .... node #0, CPUs: #1
Sep 9 05:33:51.911085 kernel: Transient Scheduler Attacks: MDS CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/mds.html for more details.
Sep 9 05:33:51.911100 kernel: Transient Scheduler Attacks: MMIO Stale Data CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/processor_mmio_stale_data.html for more details.
Sep 9 05:33:51.911114 kernel: smp: Brought up 1 node, 2 CPUs
Sep 9 05:33:51.911128 kernel: smpboot: Total of 2 processors activated (9999.99 BogoMIPS)
Sep 9 05:33:51.911146 kernel: Memory: 1908056K/2037804K available (14336K kernel code, 2428K rwdata, 9988K rodata, 54076K init, 2892K bss, 125192K reserved, 0K cma-reserved)
Sep 9 05:33:51.911160 kernel: devtmpfs: initialized
Sep 9 05:33:51.911174 kernel: x86/mm: Memory block size: 128MB
Sep 9 05:33:51.911188 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7895e000-0x789ddfff] (524288 bytes)
Sep 9 05:33:51.911202 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 9 05:33:51.911216 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Sep 9 05:33:51.911230 kernel: pinctrl core: initialized pinctrl subsystem
Sep 9 05:33:51.911244 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 9 05:33:51.911257 kernel: audit: initializing netlink subsys (disabled)
Sep 9 05:33:51.911275 kernel: audit: type=2000 audit(1757396029.874:1): state=initialized audit_enabled=0 res=1
Sep 9 05:33:51.911289 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 9 05:33:51.911302 kernel: thermal_sys: Registered thermal governor 'user_space'
Sep 9 05:33:51.911316 kernel: cpuidle: using governor menu
Sep 9 05:33:51.911330 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 9 05:33:51.911344 kernel: dca service started, version 1.12.1
Sep 9 05:33:51.911357 kernel: PCI: Using configuration type 1 for base access
Sep 9 05:33:51.911372 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Sep 9 05:33:51.911386 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 9 05:33:51.911403 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Sep 9 05:33:51.911416 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 9 05:33:51.911431 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Sep 9 05:33:51.911444 kernel: ACPI: Added _OSI(Module Device)
Sep 9 05:33:51.911458 kernel: ACPI: Added _OSI(Processor Device)
Sep 9 05:33:51.911472 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 9 05:33:51.911486 kernel: ACPI: 3 ACPI AML tables successfully acquired and loaded
Sep 9 05:33:51.911500 kernel: ACPI: Interpreter enabled
Sep 9 05:33:51.911514 kernel: ACPI: PM: (supports S0 S5)
Sep 9 05:33:51.911540 kernel: ACPI: Using IOAPIC for interrupt routing
Sep 9 05:33:51.911554 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Sep 9 05:33:51.911568 kernel: PCI: Using E820 reservations for host bridge windows
Sep 9 05:33:51.911583 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Sep 9 05:33:51.911596 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Sep 9 05:33:51.913710 kernel: acpi PNP0A03:00: _OSC: OS supports [ASPM ClockPM Segments MSI HPX-Type3]
Sep 9 05:33:51.913870 kernel: acpi PNP0A03:00: _OSC: not requesting OS control; OS requires [ExtendedConfig ASPM ClockPM MSI]
Sep 9 05:33:51.914016 kernel: acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended configuration space under this bridge
Sep 9 05:33:51.914036 kernel: acpiphp: Slot [3] registered
Sep 9 05:33:51.914052 kernel: acpiphp: Slot [4] registered
Sep 9 05:33:51.914068 kernel: acpiphp: Slot [5] registered
Sep 9 05:33:51.914083 kernel: acpiphp: Slot [6] registered
Sep 9 05:33:51.914098 kernel: acpiphp: Slot [7] registered
Sep 9 05:33:51.914114 kernel: acpiphp: Slot [8] registered
Sep 9 05:33:51.914129 kernel: acpiphp: Slot [9] registered
Sep 9 05:33:51.914144 kernel: acpiphp: Slot [10] registered
Sep 9 05:33:51.914163 kernel: acpiphp: Slot [11] registered
Sep 9 05:33:51.914178 kernel: acpiphp: Slot [12] registered
Sep 9 05:33:51.914194 kernel: acpiphp: Slot [13] registered
Sep 9 05:33:51.914208 kernel: acpiphp: Slot [14] registered
Sep 9 05:33:51.914224 kernel: acpiphp: Slot [15] registered
Sep 9 05:33:51.914240 kernel: acpiphp: Slot [16] registered
Sep 9 05:33:51.914255 kernel: acpiphp: Slot [17] registered
Sep 9 05:33:51.914270 kernel: acpiphp: Slot [18] registered
Sep 9 05:33:51.914285 kernel: acpiphp: Slot [19] registered
Sep 9 05:33:51.914300 kernel: acpiphp: Slot [20] registered
Sep 9 05:33:51.914318 kernel: acpiphp: Slot [21] registered
Sep 9 05:33:51.914333 kernel: acpiphp: Slot [22] registered
Sep 9 05:33:51.914348 kernel: acpiphp: Slot [23] registered
Sep 9 05:33:51.914374 kernel: acpiphp: Slot [24] registered
Sep 9 05:33:51.914389 kernel: acpiphp: Slot [25] registered
Sep 9 05:33:51.914404 kernel: acpiphp: Slot [26] registered
Sep 9 05:33:51.914419 kernel: acpiphp: Slot [27] registered
Sep 9 05:33:51.914434 kernel: acpiphp: Slot [28] registered
Sep 9 05:33:51.914448 kernel: acpiphp: Slot [29] registered
Sep 9 05:33:51.914464 kernel: acpiphp: Slot [30] registered
Sep 9 05:33:51.914476 kernel: acpiphp: Slot [31] registered
Sep 9 05:33:51.914488 kernel: PCI host bridge to bus 0000:00
Sep 9 05:33:51.914717 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Sep 9 05:33:51.914869 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Sep 9 05:33:51.915002 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Sep 9 05:33:51.915124 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xfebfffff window]
Sep 9 05:33:51.915241 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x2000ffffffff window]
Sep 9 05:33:51.915374 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Sep 9 05:33:51.917569 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Sep 9 05:33:51.917775 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Sep 9 05:33:51.917935 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x000000 conventional PCI endpoint
Sep 9 05:33:51.918076 kernel: pci 0000:00:01.3: quirk: [io 0xb000-0xb03f] claimed by PIIX4 ACPI
Sep 9 05:33:51.918207 kernel: pci 0000:00:01.3: PIIX4 devres E PIO at fff0-ffff
Sep 9 05:33:51.918339 kernel: pci 0000:00:01.3: PIIX4 devres F MMIO at ffc00000-ffffffff
Sep 9 05:33:51.918479 kernel: pci 0000:00:01.3: PIIX4 devres G PIO at fff0-ffff
Sep 9 05:33:51.918657 kernel: pci 0000:00:01.3: PIIX4 devres H MMIO at ffc00000-ffffffff
Sep 9 05:33:51.918786 kernel: pci 0000:00:01.3: PIIX4 devres I PIO at fff0-ffff
Sep 9 05:33:51.918912 kernel: pci 0000:00:01.3: PIIX4 devres J PIO at fff0-ffff
Sep 9 05:33:51.919047 kernel: pci 0000:00:03.0: [1d0f:1111] type 00 class 0x030000 conventional PCI endpoint
Sep 9 05:33:51.919653 kernel: pci 0000:00:03.0: BAR 0 [mem 0x80000000-0x803fffff pref]
Sep 9 05:33:51.919820 kernel: pci 0000:00:03.0: ROM [mem 0xffff0000-0xffffffff pref]
Sep 9 05:33:51.919957 kernel: pci 0000:00:03.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Sep 9 05:33:51.920110 kernel: pci 0000:00:04.0: [1d0f:8061] type 00 class 0x010802 PCIe Endpoint
Sep 9 05:33:51.920247 kernel: pci 0000:00:04.0: BAR 0 [mem 0x80404000-0x80407fff]
Sep 9 05:33:51.920395 kernel: pci 0000:00:05.0: [1d0f:ec20] type 00 class 0x020000 PCIe Endpoint
Sep 9 05:33:51.922650 kernel: pci 0000:00:05.0: BAR 0 [mem 0x80400000-0x80403fff]
Sep 9 05:33:51.922694 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Sep 9 05:33:51.922710 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Sep 9 05:33:51.922723 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Sep 9 05:33:51.922737 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Sep 9 05:33:51.922754 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Sep 9 05:33:51.922770 kernel: iommu: Default domain type: Translated
Sep 9 05:33:51.922787 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Sep 9 05:33:51.922804 kernel: efivars: Registered efivars operations
Sep 9 05:33:51.922820 kernel: PCI: Using ACPI for IRQ routing
Sep 9 05:33:51.922841 kernel: PCI: pci_cache_line_size set to 64 bytes
Sep 9 05:33:51.922858 kernel: e820: reserve RAM buffer [mem 0x768c0018-0x77ffffff]
Sep 9 05:33:51.922875 kernel: e820: reserve RAM buffer [mem 0x786ce000-0x7bffffff]
Sep 9 05:33:51.922892 kernel: e820: reserve RAM buffer [mem 0x7c97c000-0x7fffffff]
Sep 9 05:33:51.923073 kernel: pci 0000:00:03.0: vgaarb: setting as boot VGA device
Sep 9 05:33:51.923218 kernel: pci 0000:00:03.0: vgaarb: bridge control possible
Sep 9 05:33:51.923360 kernel: pci 0000:00:03.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Sep 9 05:33:51.923382 kernel: vgaarb: loaded
Sep 9 05:33:51.923404 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0
Sep 9 05:33:51.923421 kernel: hpet0: 8 comparators, 32-bit 62.500000 MHz counter
Sep 9 05:33:51.923437 kernel: clocksource: Switched to clocksource kvm-clock
Sep 9 05:33:51.923454 kernel: VFS: Disk quotas dquot_6.6.0
Sep 9 05:33:51.923471 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 9 05:33:51.923488 kernel: pnp: PnP ACPI init
Sep 9 05:33:51.923505 kernel: pnp: PnP ACPI: found 5 devices
Sep 9 05:33:51.923523 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Sep 9 05:33:51.927327 kernel: NET: Registered PF_INET protocol family
Sep 9 05:33:51.927361 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Sep 9 05:33:51.927376 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Sep 9 05:33:51.927392 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 9 05:33:51.927408 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Sep 9 05:33:51.927424 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Sep 9 05:33:51.927440 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Sep 9 05:33:51.927455 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Sep 9 05:33:51.927471 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Sep 9 05:33:51.927485 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 9 05:33:51.927505 kernel: NET: Registered PF_XDP protocol family
Sep 9 05:33:51.927724 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Sep 9 05:33:51.927854 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Sep 9 05:33:51.927976 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Sep 9 05:33:51.928121 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xfebfffff window]
Sep 9 05:33:51.928246 kernel: pci_bus 0000:00: resource 8 [mem 0x100000000-0x2000ffffffff window]
Sep 9 05:33:51.928401 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Sep 9 05:33:51.928424 kernel: PCI: CLS 0 bytes, default 64
Sep 9 05:33:51.928447 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer
Sep 9 05:33:51.928462 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x240937b9988, max_idle_ns: 440795218083 ns
Sep 9 05:33:51.928476 kernel: clocksource: Switched to clocksource tsc
Sep 9 05:33:51.928491 kernel: Initialise system trusted keyrings
Sep 9 05:33:51.928507 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0
Sep 9 05:33:51.928521 kernel: Key type asymmetric registered
Sep 9 05:33:51.930590 kernel: Asymmetric key parser 'x509' registered
Sep 9 05:33:51.930618 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
Sep 9 05:33:51.930635 kernel: io scheduler mq-deadline registered
Sep 9 05:33:51.930657 kernel: io scheduler kyber registered
Sep 9 05:33:51.930673 kernel: io scheduler bfq registered
Sep 9 05:33:51.930689 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Sep 9 05:33:51.930705 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Sep 9 05:33:51.930721 kernel: 00:04: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Sep 9 05:33:51.930736 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Sep 9 05:33:51.930750 kernel: i8042: Warning: Keylock active
Sep 9 05:33:51.930766 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Sep 9 05:33:51.930782 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Sep 9 05:33:51.930997 kernel: rtc_cmos 00:00: RTC can wake from S4
Sep 9 05:33:51.931142 kernel: rtc_cmos 00:00: registered as rtc0
Sep 9 05:33:51.931276 kernel: rtc_cmos 00:00: setting system clock to 2025-09-09T05:33:51 UTC (1757396031)
Sep 9 05:33:51.931408 kernel: rtc_cmos 00:00: alarms up to one day, 114 bytes nvram
Sep 9 05:33:51.931454 kernel: intel_pstate: CPU model not supported
Sep 9 05:33:51.931475 kernel: efifb: probing for efifb
Sep 9 05:33:51.931493 kernel: efifb: framebuffer at 0x80000000, using 1876k, total 1875k
Sep 9 05:33:51.931512 kernel: efifb: mode is 800x600x32, linelength=3200, pages=1
Sep 9 05:33:51.931550 kernel: efifb: scrolling: redraw
Sep 9 05:33:51.931565 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0
Sep 9 05:33:51.931588 kernel: Console: switching to colour frame buffer device 100x37
Sep 9 05:33:51.931602 kernel: fb0: EFI VGA frame buffer device
Sep 9 05:33:51.931615 kernel: pstore: Using crash dump compression: deflate
Sep 9 05:33:51.931630 kernel: pstore: Registered efi_pstore as persistent store backend
Sep 9 05:33:51.931643 kernel: NET: Registered PF_INET6 protocol family
Sep 9 05:33:51.931657 kernel: Segment Routing with IPv6
Sep 9 05:33:51.931672 kernel: In-situ OAM (IOAM) with IPv6
Sep 9 05:33:51.931692 kernel: NET: Registered PF_PACKET protocol family
Sep 9 05:33:51.931708 kernel: Key type dns_resolver registered
Sep 9 05:33:51.931724 kernel: IPI shorthand broadcast: enabled
Sep 9 05:33:51.931741 kernel: sched_clock: Marking stable (2620002079, 145571261)->(2859525301, -93951961)
Sep 9 05:33:51.931755 kernel: registered taskstats version 1
Sep 9 05:33:51.931771 kernel: Loading compiled-in X.509 certificates
Sep 9 05:33:51.931788 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.45-flatcar: 884b9ad6a330f59ae6e6488b20a5491e41ff24a3'
Sep 9 05:33:51.931806 kernel: Demotion targets for Node 0: null
Sep 9 05:33:51.931825 kernel: Key type .fscrypt registered
Sep 9 05:33:51.931847 kernel: Key type fscrypt-provisioning registered
Sep 9 05:33:51.931864 kernel: ima: No TPM chip found, activating TPM-bypass!
Sep 9 05:33:51.931882 kernel: ima: Allocated hash algorithm: sha1
Sep 9 05:33:51.931900 kernel: ima: No architecture policies found
Sep 9 05:33:51.931918 kernel: clk: Disabling unused clocks
Sep 9 05:33:51.931936 kernel: Warning: unable to open an initial console.
Sep 9 05:33:51.931954 kernel: Freeing unused kernel image (initmem) memory: 54076K
Sep 9 05:33:51.931972 kernel: Write protecting the kernel read-only data: 24576k
Sep 9 05:33:51.931994 kernel: Freeing unused kernel image (rodata/data gap) memory: 252K
Sep 9 05:33:51.932015 kernel: Run /init as init process
Sep 9 05:33:51.932033 kernel: with arguments:
Sep 9 05:33:51.932051 kernel: /init
Sep 9 05:33:51.932068 kernel: with environment:
Sep 9 05:33:51.932086 kernel: HOME=/
Sep 9 05:33:51.932107 kernel: TERM=linux
Sep 9 05:33:51.932125 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Sep 9 05:33:51.932145 systemd[1]: Successfully made /usr/ read-only.
Sep 9 05:33:51.932171 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 9 05:33:51.932191 systemd[1]: Detected virtualization amazon.
Sep 9 05:33:51.932209 systemd[1]: Detected architecture x86-64.
Sep 9 05:33:51.932225 systemd[1]: Running in initrd.
Sep 9 05:33:51.932245 systemd[1]: No hostname configured, using default hostname.
Sep 9 05:33:51.932263 systemd[1]: Hostname set to .
Sep 9 05:33:51.932283 systemd[1]: Initializing machine ID from VM UUID.
Sep 9 05:33:51.932301 systemd[1]: Queued start job for default target initrd.target.
Sep 9 05:33:51.932318 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 9 05:33:51.932335 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 05:33:51.932355 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Sep 9 05:33:51.932373 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 9 05:33:51.932393 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Sep 9 05:33:51.932412 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Sep 9 05:33:51.932431 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Sep 9 05:33:51.932449 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Sep 9 05:33:51.932467 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 9 05:33:51.932485 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 9 05:33:51.932503 systemd[1]: Reached target paths.target - Path Units.
Sep 9 05:33:51.932522 systemd[1]: Reached target slices.target - Slice Units.
Sep 9 05:33:51.934589 systemd[1]: Reached target swap.target - Swaps.
Sep 9 05:33:51.934615 systemd[1]: Reached target timers.target - Timer Units.
Sep 9 05:33:51.934634 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Sep 9 05:33:51.934652 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 9 05:33:51.934670 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Sep 9 05:33:51.934689 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Sep 9 05:33:51.934707 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 05:33:51.934725 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 9 05:33:51.934748 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 05:33:51.934766 systemd[1]: Reached target sockets.target - Socket Units.
Sep 9 05:33:51.934784 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Sep 9 05:33:51.934801 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 9 05:33:51.934819 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Sep 9 05:33:51.934837 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Sep 9 05:33:51.934855 systemd[1]: Starting systemd-fsck-usr.service...
Sep 9 05:33:51.934873 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 9 05:33:51.934894 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 9 05:33:51.934911 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 05:33:51.934929 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Sep 9 05:33:51.934985 systemd-journald[207]: Collecting audit messages is disabled. Sep 9 05:33:51.935028 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 9 05:33:51.935046 systemd[1]: Finished systemd-fsck-usr.service. Sep 9 05:33:51.935065 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 9 05:33:51.935084 systemd-journald[207]: Journal started Sep 9 05:33:51.935123 systemd-journald[207]: Runtime Journal (/run/log/journal/ec22e84552f85cfad160da722288f4a3) is 4.8M, max 38.4M, 33.6M free. Sep 9 05:33:51.917263 systemd-modules-load[208]: Inserted module 'overlay' Sep 9 05:33:51.946557 systemd[1]: Started systemd-journald.service - Journal Service. Sep 9 05:33:51.949920 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 05:33:51.951912 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 9 05:33:51.957665 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 9 05:33:51.962699 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 9 05:33:51.971769 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 9 05:33:51.981940 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Sep 9 05:33:51.981977 kernel: Bridge firewalling registered Sep 9 05:33:51.980176 systemd-modules-load[208]: Inserted module 'br_netfilter' Sep 9 05:33:51.987853 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. 
Sep 9 05:33:51.995640 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 9 05:33:51.999053 systemd-tmpfiles[227]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Sep 9 05:33:52.006708 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Sep 9 05:33:52.012399 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 9 05:33:52.022768 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 9 05:33:52.025026 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 9 05:33:52.029743 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Sep 9 05:33:52.032859 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 9 05:33:52.040744 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 9 05:33:52.057714 dracut-cmdline[245]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=ec2 modprobe.blacklist=xen_fbfront net.ifnames=0 nvme_core.io_timeout=4294967295 verity.usrhash=107bc9be805328e5e30844239fa87d36579f371e3de2c34fec43f6ff6d17b104 Sep 9 05:33:52.100072 systemd-resolved[248]: Positive Trust Anchors:
Sep 9 05:33:52.100088 systemd-resolved[248]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 9 05:33:52.100152 systemd-resolved[248]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 9 05:33:52.108207 systemd-resolved[248]: Defaulting to hostname 'linux'. Sep 9 05:33:52.111518 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 9 05:33:52.112254 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 9 05:33:52.158576 kernel: SCSI subsystem initialized Sep 9 05:33:52.169571 kernel: Loading iSCSI transport class v2.0-870. Sep 9 05:33:52.180603 kernel: iscsi: registered transport (tcp) Sep 9 05:33:52.203058 kernel: iscsi: registered transport (qla4xxx) Sep 9 05:33:52.203141 kernel: QLogic iSCSI HBA Driver Sep 9 05:33:52.222449 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 9 05:33:52.247027 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 9 05:33:52.250688 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 9 05:33:52.294970 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Sep 9 05:33:52.297277 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... 
Sep 9 05:33:52.349564 kernel: raid6: avx512x4 gen() 17768 MB/s Sep 9 05:33:52.367561 kernel: raid6: avx512x2 gen() 17978 MB/s Sep 9 05:33:52.385560 kernel: raid6: avx512x1 gen() 17941 MB/s Sep 9 05:33:52.403557 kernel: raid6: avx2x4 gen() 17805 MB/s Sep 9 05:33:52.421558 kernel: raid6: avx2x2 gen() 17966 MB/s Sep 9 05:33:52.439825 kernel: raid6: avx2x1 gen() 13864 MB/s Sep 9 05:33:52.439887 kernel: raid6: using algorithm avx512x2 gen() 17978 MB/s Sep 9 05:33:52.458827 kernel: raid6: .... xor() 24229 MB/s, rmw enabled Sep 9 05:33:52.458888 kernel: raid6: using avx512x2 recovery algorithm Sep 9 05:33:52.479564 kernel: xor: automatically using best checksumming function avx Sep 9 05:33:52.648566 kernel: Btrfs loaded, zoned=no, fsverity=no Sep 9 05:33:52.655754 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Sep 9 05:33:52.657936 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 9 05:33:52.688190 systemd-udevd[457]: Using default interface naming scheme 'v255'. Sep 9 05:33:52.694877 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 9 05:33:52.699286 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Sep 9 05:33:52.723342 dracut-pre-trigger[464]: rd.md=0: removing MD RAID activation Sep 9 05:33:52.750236 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Sep 9 05:33:52.752410 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 9 05:33:52.815709 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 9 05:33:52.820757 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... 
Sep 9 05:33:52.913564 kernel: ena 0000:00:05.0: ENA device version: 0.10 Sep 9 05:33:52.913867 kernel: ena 0000:00:05.0: ENA controller version: 0.0.1 implementation version 1 Sep 9 05:33:52.918562 kernel: ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy. Sep 9 05:33:52.933425 kernel: nvme nvme0: pci function 0000:00:04.0 Sep 9 05:33:52.933699 kernel: cryptd: max_cpu_qlen set to 1000 Sep 9 05:33:52.933722 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11 Sep 9 05:33:52.941560 kernel: ena 0000:00:05.0: Elastic Network Adapter (ENA) found at mem 80400000, mac addr 06:b2:0d:cf:81:ad Sep 9 05:33:52.949583 kernel: nvme nvme0: 2/0/0 default/read/poll queues Sep 9 05:33:52.958144 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Sep 9 05:33:52.958213 kernel: GPT:9289727 != 16777215 Sep 9 05:33:52.958234 kernel: GPT:Alternate GPT header not at the end of the disk. Sep 9 05:33:52.961166 kernel: GPT:9289727 != 16777215 Sep 9 05:33:52.961224 kernel: GPT: Use GNU Parted to correct GPT errors. Sep 9 05:33:52.964055 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 Sep 9 05:33:52.966709 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 9 05:33:52.967009 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 05:33:52.969123 kernel: input: ImPS/2 Generic Wheel Mouse as /devices/platform/i8042/serio1/input/input2 Sep 9 05:33:52.970681 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 05:33:52.974023 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 05:33:52.975750 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 9 05:33:52.983768 (udev-worker)[503]: Network interface NamePolicy= disabled on kernel command line. 
Sep 9 05:33:52.990577 kernel: AES CTR mode by8 optimization enabled Sep 9 05:33:53.031566 kernel: nvme nvme0: using unchecked data buffer Sep 9 05:33:53.033159 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 05:33:53.140335 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Amazon Elastic Block Store OEM. Sep 9 05:33:53.167380 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Amazon Elastic Block Store ROOT. Sep 9 05:33:53.168294 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 9 05:33:53.180160 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Amazon Elastic Block Store EFI-SYSTEM. Sep 9 05:33:53.197856 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Amazon Elastic Block Store USR-A. Sep 9 05:33:53.198521 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Amazon Elastic Block Store USR-A. Sep 9 05:33:53.199855 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 9 05:33:53.200980 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 9 05:33:53.202090 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 9 05:33:53.203892 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 9 05:33:53.206768 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 9 05:33:53.229131 disk-uuid[692]: Primary Header is updated. Sep 9 05:33:53.229131 disk-uuid[692]: Secondary Entries is updated. Sep 9 05:33:53.229131 disk-uuid[692]: Secondary Header is updated. Sep 9 05:33:53.236480 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 9 05:33:53.237276 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 Sep 9 05:33:54.247493 disk-uuid[695]: The operation has completed successfully. 
Sep 9 05:33:54.248789 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 Sep 9 05:33:54.381987 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 9 05:33:54.382129 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 9 05:33:54.427746 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 9 05:33:54.448665 sh[960]: Success Sep 9 05:33:54.475872 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Sep 9 05:33:54.476453 kernel: device-mapper: uevent: version 1.0.3 Sep 9 05:33:54.476485 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 9 05:33:54.487563 kernel: device-mapper: verity: sha256 using shash "sha256-avx2" Sep 9 05:33:54.578098 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 9 05:33:54.581620 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 9 05:33:54.590280 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Sep 9 05:33:54.608556 kernel: BTRFS: device fsid 9ca60a92-6b53-4529-adc0-1f4392d2ad56 devid 1 transid 37 /dev/mapper/usr (254:0) scanned by mount (983) Sep 9 05:33:54.612582 kernel: BTRFS info (device dm-0): first mount of filesystem 9ca60a92-6b53-4529-adc0-1f4392d2ad56 Sep 9 05:33:54.612647 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Sep 9 05:33:54.722847 kernel: BTRFS info (device dm-0): enabling ssd optimizations Sep 9 05:33:54.722913 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 9 05:33:54.725034 kernel: BTRFS info (device dm-0): enabling free space tree Sep 9 05:33:54.737591 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 9 05:33:54.738606 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. 
Sep 9 05:33:54.739140 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 9 05:33:54.739883 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 9 05:33:54.742649 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Sep 9 05:33:54.772612 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/nvme0n1p6 (259:5) scanned by mount (1014) Sep 9 05:33:54.776978 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem d4e5a7a8-c50a-463e-827d-ca249a0b8b8b Sep 9 05:33:54.777044 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-intel) checksum algorithm Sep 9 05:33:54.786266 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations Sep 9 05:33:54.786386 kernel: BTRFS info (device nvme0n1p6): enabling free space tree Sep 9 05:33:54.794718 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem d4e5a7a8-c50a-463e-827d-ca249a0b8b8b Sep 9 05:33:54.796086 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 9 05:33:54.799845 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 9 05:33:54.848101 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 9 05:33:54.851253 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 9 05:33:54.901424 systemd-networkd[1152]: lo: Link UP Sep 9 05:33:54.901437 systemd-networkd[1152]: lo: Gained carrier Sep 9 05:33:54.904622 systemd-networkd[1152]: Enumeration completed Sep 9 05:33:54.905003 systemd-networkd[1152]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 9 05:33:54.905008 systemd-networkd[1152]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. 
Sep 9 05:33:54.908655 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 9 05:33:54.913429 systemd[1]: Reached target network.target - Network. Sep 9 05:33:54.914681 systemd-networkd[1152]: eth0: Link UP Sep 9 05:33:54.914691 systemd-networkd[1152]: eth0: Gained carrier Sep 9 05:33:54.914712 systemd-networkd[1152]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 9 05:33:54.932641 systemd-networkd[1152]: eth0: DHCPv4 address 172.31.26.176/20, gateway 172.31.16.1 acquired from 172.31.16.1 Sep 9 05:33:55.260247 ignition[1083]: Ignition 2.22.0 Sep 9 05:33:55.260264 ignition[1083]: Stage: fetch-offline Sep 9 05:33:55.260500 ignition[1083]: no configs at "/usr/lib/ignition/base.d" Sep 9 05:33:55.260512 ignition[1083]: no config dir at "/usr/lib/ignition/base.platform.d/aws" Sep 9 05:33:55.260915 ignition[1083]: Ignition finished successfully Sep 9 05:33:55.262876 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Sep 9 05:33:55.265149 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
Sep 9 05:33:55.296367 ignition[1161]: Ignition 2.22.0 Sep 9 05:33:55.296384 ignition[1161]: Stage: fetch Sep 9 05:33:55.296759 ignition[1161]: no configs at "/usr/lib/ignition/base.d" Sep 9 05:33:55.296774 ignition[1161]: no config dir at "/usr/lib/ignition/base.platform.d/aws" Sep 9 05:33:55.296880 ignition[1161]: PUT http://169.254.169.254/latest/api/token: attempt #1 Sep 9 05:33:55.313525 ignition[1161]: PUT result: OK Sep 9 05:33:55.316333 ignition[1161]: parsed url from cmdline: "" Sep 9 05:33:55.316343 ignition[1161]: no config URL provided Sep 9 05:33:55.316350 ignition[1161]: reading system config file "/usr/lib/ignition/user.ign" Sep 9 05:33:55.316362 ignition[1161]: no config at "/usr/lib/ignition/user.ign" Sep 9 05:33:55.316382 ignition[1161]: PUT http://169.254.169.254/latest/api/token: attempt #1 Sep 9 05:33:55.317159 ignition[1161]: PUT result: OK Sep 9 05:33:55.317215 ignition[1161]: GET http://169.254.169.254/2019-10-01/user-data: attempt #1 Sep 9 05:33:55.318026 ignition[1161]: GET result: OK Sep 9 05:33:55.318155 ignition[1161]: parsing config with SHA512: f7255a461b568c4a2798ced85e97e4672bd7aa72c320e468e9f5a00437bcd5118cfc5f1a871088aaa85920ace48d5734b00ae2819eed8fd323949c46f6eec38c Sep 9 05:33:55.324148 unknown[1161]: fetched base config from "system" Sep 9 05:33:55.324458 ignition[1161]: fetch: fetch complete Sep 9 05:33:55.324158 unknown[1161]: fetched base config from "system" Sep 9 05:33:55.324462 ignition[1161]: fetch: fetch passed Sep 9 05:33:55.324163 unknown[1161]: fetched user config from "aws" Sep 9 05:33:55.324499 ignition[1161]: Ignition finished successfully Sep 9 05:33:55.328850 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Sep 9 05:33:55.330765 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... 
Sep 9 05:33:55.366512 ignition[1168]: Ignition 2.22.0 Sep 9 05:33:55.366528 ignition[1168]: Stage: kargs Sep 9 05:33:55.366950 ignition[1168]: no configs at "/usr/lib/ignition/base.d" Sep 9 05:33:55.366963 ignition[1168]: no config dir at "/usr/lib/ignition/base.platform.d/aws" Sep 9 05:33:55.367092 ignition[1168]: PUT http://169.254.169.254/latest/api/token: attempt #1 Sep 9 05:33:55.368162 ignition[1168]: PUT result: OK Sep 9 05:33:55.370828 ignition[1168]: kargs: kargs passed Sep 9 05:33:55.370900 ignition[1168]: Ignition finished successfully Sep 9 05:33:55.372742 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 9 05:33:55.374041 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Sep 9 05:33:55.426456 ignition[1175]: Ignition 2.22.0 Sep 9 05:33:55.426474 ignition[1175]: Stage: disks Sep 9 05:33:55.426904 ignition[1175]: no configs at "/usr/lib/ignition/base.d" Sep 9 05:33:55.426917 ignition[1175]: no config dir at "/usr/lib/ignition/base.platform.d/aws" Sep 9 05:33:55.427028 ignition[1175]: PUT http://169.254.169.254/latest/api/token: attempt #1 Sep 9 05:33:55.428503 ignition[1175]: PUT result: OK Sep 9 05:33:55.431232 ignition[1175]: disks: disks passed Sep 9 05:33:55.431287 ignition[1175]: Ignition finished successfully Sep 9 05:33:55.433457 systemd[1]: Finished ignition-disks.service - Ignition (disks). Sep 9 05:33:55.434185 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Sep 9 05:33:55.434769 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 9 05:33:55.435302 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 9 05:33:55.435869 systemd[1]: Reached target sysinit.target - System Initialization. Sep 9 05:33:55.436429 systemd[1]: Reached target basic.target - Basic System. Sep 9 05:33:55.438046 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... 
Sep 9 05:33:55.477719 systemd-fsck[1184]: ROOT: clean, 15/553520 files, 52789/553472 blocks Sep 9 05:33:55.480868 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 9 05:33:55.483289 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 9 05:33:55.645560 kernel: EXT4-fs (nvme0n1p9): mounted filesystem d2d7815e-fa16-4396-ab9d-ac540c1d8856 r/w with ordered data mode. Quota mode: none. Sep 9 05:33:55.646863 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 9 05:33:55.647755 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 9 05:33:55.649948 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 9 05:33:55.652621 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 9 05:33:55.653792 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Sep 9 05:33:55.654181 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 9 05:33:55.654207 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 9 05:33:55.669931 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Sep 9 05:33:55.671841 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Sep 9 05:33:55.686551 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/nvme0n1p6 (259:5) scanned by mount (1203) Sep 9 05:33:55.691651 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem d4e5a7a8-c50a-463e-827d-ca249a0b8b8b Sep 9 05:33:55.691728 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-intel) checksum algorithm Sep 9 05:33:55.699386 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations Sep 9 05:33:55.700601 kernel: BTRFS info (device nvme0n1p6): enabling free space tree Sep 9 05:33:55.700812 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Sep 9 05:33:56.087237 initrd-setup-root[1227]: cut: /sysroot/etc/passwd: No such file or directory Sep 9 05:33:56.118388 initrd-setup-root[1234]: cut: /sysroot/etc/group: No such file or directory Sep 9 05:33:56.136222 initrd-setup-root[1241]: cut: /sysroot/etc/shadow: No such file or directory Sep 9 05:33:56.140195 initrd-setup-root[1248]: cut: /sysroot/etc/gshadow: No such file or directory Sep 9 05:33:56.382747 systemd-networkd[1152]: eth0: Gained IPv6LL Sep 9 05:33:56.401111 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 9 05:33:56.403417 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Sep 9 05:33:56.406667 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 9 05:33:56.420448 systemd[1]: sysroot-oem.mount: Deactivated successfully. Sep 9 05:33:56.423302 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem d4e5a7a8-c50a-463e-827d-ca249a0b8b8b Sep 9 05:33:56.449416 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Sep 9 05:33:56.459161 ignition[1315]: INFO : Ignition 2.22.0 Sep 9 05:33:56.459161 ignition[1315]: INFO : Stage: mount Sep 9 05:33:56.460782 ignition[1315]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 9 05:33:56.460782 ignition[1315]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws" Sep 9 05:33:56.460782 ignition[1315]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1 Sep 9 05:33:56.461952 ignition[1315]: INFO : PUT result: OK Sep 9 05:33:56.463872 ignition[1315]: INFO : mount: mount passed Sep 9 05:33:56.464364 ignition[1315]: INFO : Ignition finished successfully Sep 9 05:33:56.466158 systemd[1]: Finished ignition-mount.service - Ignition (mount). Sep 9 05:33:56.467733 systemd[1]: Starting ignition-files.service - Ignition (files)... Sep 9 05:33:56.648477 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... 
Sep 9 05:33:56.679561 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/nvme0n1p6 (259:5) scanned by mount (1327) Sep 9 05:33:56.682576 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem d4e5a7a8-c50a-463e-827d-ca249a0b8b8b Sep 9 05:33:56.682642 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-intel) checksum algorithm Sep 9 05:33:56.689836 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations Sep 9 05:33:56.689912 kernel: BTRFS info (device nvme0n1p6): enabling free space tree Sep 9 05:33:56.692961 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 9 05:33:56.727600 ignition[1344]: INFO : Ignition 2.22.0 Sep 9 05:33:56.727600 ignition[1344]: INFO : Stage: files Sep 9 05:33:56.729240 ignition[1344]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 9 05:33:56.729240 ignition[1344]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws" Sep 9 05:33:56.729240 ignition[1344]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1 Sep 9 05:33:56.729240 ignition[1344]: INFO : PUT result: OK Sep 9 05:33:56.732282 ignition[1344]: DEBUG : files: compiled without relabeling support, skipping Sep 9 05:33:56.733814 ignition[1344]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Sep 9 05:33:56.733814 ignition[1344]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Sep 9 05:33:56.755075 ignition[1344]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Sep 9 05:33:56.755912 ignition[1344]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Sep 9 05:33:56.755912 ignition[1344]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Sep 9 05:33:56.755466 unknown[1344]: wrote ssh authorized keys file for user: core Sep 9 05:33:56.758638 ignition[1344]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" 
Sep 9 05:33:56.759395 ignition[1344]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 Sep 9 05:33:56.808268 ignition[1344]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Sep 9 05:33:57.096621 ignition[1344]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Sep 9 05:33:57.096621 ignition[1344]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Sep 9 05:33:57.098093 ignition[1344]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Sep 9 05:33:57.098093 ignition[1344]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Sep 9 05:33:57.098093 ignition[1344]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Sep 9 05:33:57.098093 ignition[1344]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 9 05:33:57.098093 ignition[1344]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 9 05:33:57.098093 ignition[1344]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 9 05:33:57.098093 ignition[1344]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 9 05:33:57.102615 ignition[1344]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Sep 9 05:33:57.102615 ignition[1344]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Sep 9 05:33:57.102615 ignition[1344]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Sep 9 05:33:57.105029 ignition[1344]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Sep 9 05:33:57.105029 ignition[1344]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Sep 9 05:33:57.105029 ignition[1344]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-x86-64.raw: attempt #1 Sep 9 05:33:57.539085 ignition[1344]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Sep 9 05:33:58.595745 ignition[1344]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Sep 9 05:33:58.595745 ignition[1344]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Sep 9 05:33:58.619222 ignition[1344]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 9 05:33:58.624384 ignition[1344]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 9 05:33:58.624384 ignition[1344]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Sep 9 05:33:58.624384 ignition[1344]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Sep 9 05:33:58.624384 ignition[1344]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Sep 9 05:33:58.624384 ignition[1344]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
Sep 9 05:33:58.624384 ignition[1344]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Sep 9 05:33:58.624384 ignition[1344]: INFO : files: files passed Sep 9 05:33:58.624384 ignition[1344]: INFO : Ignition finished successfully Sep 9 05:33:58.627624 systemd[1]: Finished ignition-files.service - Ignition (files). Sep 9 05:33:58.632776 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Sep 9 05:33:58.636659 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Sep 9 05:33:58.648124 systemd[1]: ignition-quench.service: Deactivated successfully. Sep 9 05:33:58.648240 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Sep 9 05:33:58.655508 initrd-setup-root-after-ignition[1377]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 9 05:33:58.657507 initrd-setup-root-after-ignition[1373]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 9 05:33:58.657507 initrd-setup-root-after-ignition[1373]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 9 05:33:58.660459 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 9 05:33:58.661148 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Sep 9 05:33:58.663227 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Sep 9 05:33:58.708768 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Sep 9 05:33:58.708924 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Sep 9 05:33:58.711077 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Sep 9 05:33:58.711635 systemd[1]: Reached target initrd.target - Initrd Default Target. 
Sep 9 05:33:58.712416 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Sep 9 05:33:58.713580 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Sep 9 05:33:58.753176 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 9 05:33:58.755211 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Sep 9 05:33:58.781956 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Sep 9 05:33:58.782651 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 9 05:33:58.783291 systemd[1]: Stopped target timers.target - Timer Units.
Sep 9 05:33:58.784039 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Sep 9 05:33:58.784188 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 9 05:33:58.785156 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Sep 9 05:33:58.785985 systemd[1]: Stopped target basic.target - Basic System.
Sep 9 05:33:58.786791 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Sep 9 05:33:58.787422 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 9 05:33:58.788223 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Sep 9 05:33:58.788837 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Sep 9 05:33:58.789614 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Sep 9 05:33:58.790243 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 9 05:33:58.791123 systemd[1]: Stopped target sysinit.target - System Initialization.
Sep 9 05:33:58.791885 systemd[1]: Stopped target local-fs.target - Local File Systems.
Sep 9 05:33:58.793686 systemd[1]: Stopped target swap.target - Swaps.
Sep 9 05:33:58.795044 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Sep 9 05:33:58.795324 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Sep 9 05:33:58.796377 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Sep 9 05:33:58.797228 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 9 05:33:58.798051 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Sep 9 05:33:58.798497 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 9 05:33:58.799040 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Sep 9 05:33:58.799217 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Sep 9 05:33:58.800318 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Sep 9 05:33:58.800586 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 9 05:33:58.801696 systemd[1]: ignition-files.service: Deactivated successfully.
Sep 9 05:33:58.801902 systemd[1]: Stopped ignition-files.service - Ignition (files).
Sep 9 05:33:58.804659 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Sep 9 05:33:58.807841 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Sep 9 05:33:58.809727 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Sep 9 05:33:58.810025 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 9 05:33:58.811016 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Sep 9 05:33:58.811681 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 9 05:33:58.818852 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Sep 9 05:33:58.820649 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Sep 9 05:33:58.845883 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Sep 9 05:33:58.848068 ignition[1397]: INFO : Ignition 2.22.0
Sep 9 05:33:58.848068 ignition[1397]: INFO : Stage: umount
Sep 9 05:33:58.849814 ignition[1397]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 9 05:33:58.849814 ignition[1397]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws"
Sep 9 05:33:58.849814 ignition[1397]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1
Sep 9 05:33:58.852065 ignition[1397]: INFO : PUT result: OK
Sep 9 05:33:58.855491 ignition[1397]: INFO : umount: umount passed
Sep 9 05:33:58.856063 ignition[1397]: INFO : Ignition finished successfully
Sep 9 05:33:58.858988 systemd[1]: ignition-mount.service: Deactivated successfully.
Sep 9 05:33:58.859132 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Sep 9 05:33:58.860737 systemd[1]: sysroot-boot.service: Deactivated successfully.
Sep 9 05:33:58.860872 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Sep 9 05:33:58.863375 systemd[1]: ignition-disks.service: Deactivated successfully.
Sep 9 05:33:58.863510 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Sep 9 05:33:58.864845 systemd[1]: ignition-kargs.service: Deactivated successfully.
Sep 9 05:33:58.864935 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Sep 9 05:33:58.865683 systemd[1]: ignition-fetch.service: Deactivated successfully.
Sep 9 05:33:58.865752 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Sep 9 05:33:58.866836 systemd[1]: Stopped target network.target - Network.
Sep 9 05:33:58.867620 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Sep 9 05:33:58.867796 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 9 05:33:58.868944 systemd[1]: Stopped target paths.target - Path Units.
Sep 9 05:33:58.869991 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Sep 9 05:33:58.870068 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 05:33:58.871078 systemd[1]: Stopped target slices.target - Slice Units.
Sep 9 05:33:58.871728 systemd[1]: Stopped target sockets.target - Socket Units.
Sep 9 05:33:58.872468 systemd[1]: iscsid.socket: Deactivated successfully.
Sep 9 05:33:58.872529 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Sep 9 05:33:58.873158 systemd[1]: iscsiuio.socket: Deactivated successfully.
Sep 9 05:33:58.873216 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 9 05:33:58.873823 systemd[1]: ignition-setup.service: Deactivated successfully.
Sep 9 05:33:58.873906 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Sep 9 05:33:58.875056 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Sep 9 05:33:58.875126 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Sep 9 05:33:58.875721 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Sep 9 05:33:58.875792 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Sep 9 05:33:58.876576 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Sep 9 05:33:58.877485 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Sep 9 05:33:58.882875 systemd[1]: systemd-resolved.service: Deactivated successfully.
Sep 9 05:33:58.883031 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Sep 9 05:33:58.895122 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Sep 9 05:33:58.895859 systemd[1]: systemd-networkd.service: Deactivated successfully.
Sep 9 05:33:58.896143 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Sep 9 05:33:58.898744 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Sep 9 05:33:58.899788 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Sep 9 05:33:58.901446 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Sep 9 05:33:58.901633 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 05:33:58.904369 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Sep 9 05:33:58.904989 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Sep 9 05:33:58.905074 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 9 05:33:58.905769 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 9 05:33:58.905832 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Sep 9 05:33:58.906828 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 9 05:33:58.906889 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Sep 9 05:33:58.907796 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Sep 9 05:33:58.907924 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 9 05:33:58.909461 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 9 05:33:58.918228 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Sep 9 05:33:58.918464 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Sep 9 05:33:58.929063 systemd[1]: systemd-udevd.service: Deactivated successfully.
Sep 9 05:33:58.929809 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 9 05:33:58.933227 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Sep 9 05:33:58.933337 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Sep 9 05:33:58.935738 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Sep 9 05:33:58.935789 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 05:33:58.936481 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Sep 9 05:33:58.936607 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Sep 9 05:33:58.937486 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Sep 9 05:33:58.940711 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Sep 9 05:33:58.941836 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Sep 9 05:33:58.941927 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 9 05:33:58.948090 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Sep 9 05:33:58.949047 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Sep 9 05:33:58.949165 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Sep 9 05:33:58.955020 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Sep 9 05:33:58.955182 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 9 05:33:58.958853 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Sep 9 05:33:58.959049 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 9 05:33:58.961511 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Sep 9 05:33:58.961682 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 9 05:33:58.962593 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 9 05:33:58.962797 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 05:33:58.970976 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
Sep 9 05:33:58.971076 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully.
Sep 9 05:33:58.971125 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Sep 9 05:33:58.971181 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Sep 9 05:33:58.972123 systemd[1]: network-cleanup.service: Deactivated successfully.
Sep 9 05:33:58.973619 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Sep 9 05:33:58.977983 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Sep 9 05:33:58.978125 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Sep 9 05:33:58.980200 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Sep 9 05:33:58.983678 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Sep 9 05:33:59.007214 systemd[1]: Switching root.
Sep 9 05:33:59.049554 systemd-journald[207]: Received SIGTERM from PID 1 (systemd).
Sep 9 05:33:59.049651 systemd-journald[207]: Journal stopped
Sep 9 05:34:02.435400 kernel: SELinux: policy capability network_peer_controls=1
Sep 9 05:34:02.435512 kernel: SELinux: policy capability open_perms=1
Sep 9 05:34:02.436662 kernel: SELinux: policy capability extended_socket_class=1
Sep 9 05:34:02.436700 kernel: SELinux: policy capability always_check_network=0
Sep 9 05:34:02.436719 kernel: SELinux: policy capability cgroup_seclabel=1
Sep 9 05:34:02.436737 kernel: SELinux: policy capability nnp_nosuid_transition=1
Sep 9 05:34:02.436756 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Sep 9 05:34:02.436780 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Sep 9 05:34:02.436802 kernel: SELinux: policy capability userspace_initial_context=0
Sep 9 05:34:02.436819 kernel: audit: type=1403 audit(1757396039.523:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Sep 9 05:34:02.436841 systemd[1]: Successfully loaded SELinux policy in 87.623ms.
Sep 9 05:34:02.436879 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 6.576ms.
Sep 9 05:34:02.436899 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 9 05:34:02.436918 systemd[1]: Detected virtualization amazon.
Sep 9 05:34:02.436937 systemd[1]: Detected architecture x86-64.
Sep 9 05:34:02.436958 systemd[1]: Detected first boot.
Sep 9 05:34:02.436976 systemd[1]: Initializing machine ID from VM UUID.
Sep 9 05:34:02.436996 zram_generator::config[1441]: No configuration found.
Sep 9 05:34:02.437016 kernel: Guest personality initialized and is inactive
Sep 9 05:34:02.437035 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
Sep 9 05:34:02.437054 kernel: Initialized host personality
Sep 9 05:34:02.437071 kernel: NET: Registered PF_VSOCK protocol family
Sep 9 05:34:02.437089 systemd[1]: Populated /etc with preset unit settings.
Sep 9 05:34:02.437108 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Sep 9 05:34:02.437130 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Sep 9 05:34:02.437148 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Sep 9 05:34:02.437167 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Sep 9 05:34:02.437187 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Sep 9 05:34:02.437205 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Sep 9 05:34:02.437223 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Sep 9 05:34:02.437242 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Sep 9 05:34:02.437260 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Sep 9 05:34:02.437283 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Sep 9 05:34:02.437302 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Sep 9 05:34:02.437320 systemd[1]: Created slice user.slice - User and Session Slice.
Sep 9 05:34:02.437338 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 9 05:34:02.437358 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 05:34:02.437377 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Sep 9 05:34:02.437395 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Sep 9 05:34:02.437414 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Sep 9 05:34:02.437433 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 9 05:34:02.437454 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Sep 9 05:34:02.437473 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 9 05:34:02.437491 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 9 05:34:02.437510 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Sep 9 05:34:02.437528 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Sep 9 05:34:02.438606 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Sep 9 05:34:02.438634 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Sep 9 05:34:02.438658 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 9 05:34:02.438699 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 9 05:34:02.438724 systemd[1]: Reached target slices.target - Slice Units.
Sep 9 05:34:02.438748 systemd[1]: Reached target swap.target - Swaps.
Sep 9 05:34:02.438770 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Sep 9 05:34:02.438796 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Sep 9 05:34:02.438821 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Sep 9 05:34:02.438845 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 05:34:02.438868 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 9 05:34:02.438892 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 05:34:02.438917 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Sep 9 05:34:02.438943 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Sep 9 05:34:02.438966 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Sep 9 05:34:02.438990 systemd[1]: Mounting media.mount - External Media Directory...
Sep 9 05:34:02.439013 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 9 05:34:02.439038 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Sep 9 05:34:02.439062 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Sep 9 05:34:02.439087 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Sep 9 05:34:02.439111 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Sep 9 05:34:02.439137 systemd[1]: Reached target machines.target - Containers.
Sep 9 05:34:02.439162 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Sep 9 05:34:02.439187 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 05:34:02.439209 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 9 05:34:02.439233 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Sep 9 05:34:02.439257 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 9 05:34:02.439280 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 9 05:34:02.439305 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 9 05:34:02.439328 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Sep 9 05:34:02.439354 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 9 05:34:02.439378 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Sep 9 05:34:02.439403 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Sep 9 05:34:02.439427 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Sep 9 05:34:02.439453 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Sep 9 05:34:02.439475 systemd[1]: Stopped systemd-fsck-usr.service.
Sep 9 05:34:02.439502 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 9 05:34:02.440561 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 9 05:34:02.440609 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 9 05:34:02.440632 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 9 05:34:02.440655 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Sep 9 05:34:02.440677 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Sep 9 05:34:02.440696 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 9 05:34:02.440722 systemd[1]: verity-setup.service: Deactivated successfully.
Sep 9 05:34:02.440741 systemd[1]: Stopped verity-setup.service.
Sep 9 05:34:02.440762 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 9 05:34:02.440781 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Sep 9 05:34:02.440802 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Sep 9 05:34:02.440824 systemd[1]: Mounted media.mount - External Media Directory.
Sep 9 05:34:02.440842 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Sep 9 05:34:02.440861 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Sep 9 05:34:02.440880 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Sep 9 05:34:02.440900 kernel: loop: module loaded
Sep 9 05:34:02.440922 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 9 05:34:02.440941 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 9 05:34:02.440959 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Sep 9 05:34:02.440980 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 9 05:34:02.440998 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 9 05:34:02.441017 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 9 05:34:02.441039 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 9 05:34:02.441062 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 9 05:34:02.441086 kernel: fuse: init (API version 7.41)
Sep 9 05:34:02.441106 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 9 05:34:02.441126 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 9 05:34:02.441151 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Sep 9 05:34:02.441177 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Sep 9 05:34:02.441203 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 9 05:34:02.441227 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Sep 9 05:34:02.441252 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Sep 9 05:34:02.441275 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 9 05:34:02.441300 kernel: ACPI: bus type drm_connector registered
Sep 9 05:34:02.441323 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Sep 9 05:34:02.441348 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Sep 9 05:34:02.441377 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Sep 9 05:34:02.441402 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 9 05:34:02.441427 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Sep 9 05:34:02.441499 systemd-journald[1527]: Collecting audit messages is disabled.
Sep 9 05:34:02.441581 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Sep 9 05:34:02.441607 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 05:34:02.441634 systemd-journald[1527]: Journal started
Sep 9 05:34:02.441679 systemd-journald[1527]: Runtime Journal (/run/log/journal/ec22e84552f85cfad160da722288f4a3) is 4.8M, max 38.4M, 33.6M free.
Sep 9 05:34:01.743434 systemd[1]: Queued start job for default target multi-user.target.
Sep 9 05:34:01.764304 systemd[1]: Unnecessary job was removed for dev-nvme0n1p6.device - /dev/nvme0n1p6.
Sep 9 05:34:01.764878 systemd[1]: systemd-journald.service: Deactivated successfully.
Sep 9 05:34:02.446560 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Sep 9 05:34:02.452933 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 9 05:34:02.459561 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Sep 9 05:34:02.463649 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 9 05:34:02.466565 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 9 05:34:02.475565 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Sep 9 05:34:02.488577 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 9 05:34:02.500207 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 9 05:34:02.498753 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Sep 9 05:34:02.501347 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 9 05:34:02.501737 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 9 05:34:02.503986 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Sep 9 05:34:02.505815 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Sep 9 05:34:02.551105 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Sep 9 05:34:02.553174 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 9 05:34:02.554139 kernel: loop0: detected capacity change from 0 to 72368
Sep 9 05:34:02.555411 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Sep 9 05:34:02.558006 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Sep 9 05:34:02.563376 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Sep 9 05:34:02.601260 systemd-journald[1527]: Time spent on flushing to /var/log/journal/ec22e84552f85cfad160da722288f4a3 is 31.239ms for 1024 entries.
Sep 9 05:34:02.601260 systemd-journald[1527]: System Journal (/var/log/journal/ec22e84552f85cfad160da722288f4a3) is 8M, max 195.6M, 187.6M free.
Sep 9 05:34:02.670897 systemd-journald[1527]: Received client request to flush runtime journal.
Sep 9 05:34:02.670986 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Sep 9 05:34:02.671027 kernel: loop1: detected capacity change from 0 to 110984
Sep 9 05:34:02.608204 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 9 05:34:02.628898 systemd-tmpfiles[1557]: ACLs are not supported, ignoring.
Sep 9 05:34:02.628918 systemd-tmpfiles[1557]: ACLs are not supported, ignoring.
Sep 9 05:34:02.635360 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 9 05:34:02.639615 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Sep 9 05:34:02.646833 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Sep 9 05:34:02.674510 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Sep 9 05:34:02.768526 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Sep 9 05:34:02.775150 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Sep 9 05:34:02.777565 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 9 05:34:02.808558 kernel: loop2: detected capacity change from 0 to 221472
Sep 9 05:34:02.806317 systemd-tmpfiles[1595]: ACLs are not supported, ignoring.
Sep 9 05:34:02.806341 systemd-tmpfiles[1595]: ACLs are not supported, ignoring.
Sep 9 05:34:02.812218 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 9 05:34:02.941570 kernel: loop3: detected capacity change from 0 to 128016
Sep 9 05:34:03.070597 kernel: loop4: detected capacity change from 0 to 72368
Sep 9 05:34:03.095615 kernel: loop5: detected capacity change from 0 to 110984
Sep 9 05:34:03.130606 kernel: loop6: detected capacity change from 0 to 221472
Sep 9 05:34:03.178674 kernel: loop7: detected capacity change from 0 to 128016
Sep 9 05:34:03.201146 (sd-merge)[1601]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-ami'.
Sep 9 05:34:03.203707 (sd-merge)[1601]: Merged extensions into '/usr'.
Sep 9 05:34:03.213378 systemd[1]: Reload requested from client PID 1556 ('systemd-sysext') (unit systemd-sysext.service)...
Sep 9 05:34:03.213576 systemd[1]: Reloading...
Sep 9 05:34:03.344583 zram_generator::config[1627]: No configuration found.
Sep 9 05:34:03.691615 systemd[1]: Reloading finished in 477 ms.
Sep 9 05:34:03.712034 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Sep 9 05:34:03.722038 systemd[1]: Starting ensure-sysext.service...
Sep 9 05:34:03.731421 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 9 05:34:03.768669 systemd[1]: Reload requested from client PID 1678 ('systemctl') (unit ensure-sysext.service)...
Sep 9 05:34:03.768686 systemd[1]: Reloading...
Sep 9 05:34:03.784035 systemd-tmpfiles[1679]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Sep 9 05:34:03.784103 systemd-tmpfiles[1679]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Sep 9 05:34:03.785459 systemd-tmpfiles[1679]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Sep 9 05:34:03.785933 systemd-tmpfiles[1679]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Sep 9 05:34:03.789820 systemd-tmpfiles[1679]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Sep 9 05:34:03.790989 systemd-tmpfiles[1679]: ACLs are not supported, ignoring.
Sep 9 05:34:03.792738 systemd-tmpfiles[1679]: ACLs are not supported, ignoring.
Sep 9 05:34:03.817128 systemd-tmpfiles[1679]: Detected autofs mount point /boot during canonicalization of boot.
Sep 9 05:34:03.817299 systemd-tmpfiles[1679]: Skipping /boot
Sep 9 05:34:03.839527 systemd-tmpfiles[1679]: Detected autofs mount point /boot during canonicalization of boot.
Sep 9 05:34:03.845601 systemd-tmpfiles[1679]: Skipping /boot
Sep 9 05:34:03.920567 zram_generator::config[1707]: No configuration found.
Sep 9 05:34:04.185314 systemd[1]: Reloading finished in 415 ms.
Sep 9 05:34:04.218874 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Sep 9 05:34:04.250217 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 9 05:34:04.269661 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 9 05:34:04.285893 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Sep 9 05:34:04.290638 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Sep 9 05:34:04.300067 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 9 05:34:04.305635 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 9 05:34:04.314083 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Sep 9 05:34:04.322713 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 9 05:34:04.323030 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 05:34:04.326408 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 9 05:34:04.331633 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 9 05:34:04.336916 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 9 05:34:04.337676 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 05:34:04.337847 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 9 05:34:04.338003 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 9 05:34:04.350066 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 9 05:34:04.350476 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 05:34:04.350825 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 05:34:04.351041 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 9 05:34:04.357777 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Sep 9 05:34:04.359658 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 9 05:34:04.361072 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Sep 9 05:34:04.373396 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 9 05:34:04.373989 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 05:34:04.376844 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 9 05:34:04.377968 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 05:34:04.378354 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 9 05:34:04.378836 systemd[1]: Reached target time-set.target - System Time Set.
Sep 9 05:34:04.380143 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 9 05:34:04.386141 systemd[1]: Finished ensure-sysext.service.
Sep 9 05:34:04.392868 ldconfig[1552]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Sep 9 05:34:04.407486 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Sep 9 05:34:04.423921 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 9 05:34:04.424197 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 9 05:34:04.431399 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 9 05:34:04.431698 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 9 05:34:04.432943 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 9 05:34:04.435969 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 9 05:34:04.436279 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 9 05:34:04.438498 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Sep 9 05:34:04.439451 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 9 05:34:04.439782 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 9 05:34:04.446046 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 9 05:34:04.450717 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Sep 9 05:34:04.463558 systemd-udevd[1771]: Using default interface naming scheme 'v255'.
Sep 9 05:34:04.483006 augenrules[1800]: No rules
Sep 9 05:34:04.486058 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 9 05:34:04.486368 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 9 05:34:04.490072 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Sep 9 05:34:04.506230 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Sep 9 05:34:04.536325 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 9 05:34:04.544503 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 9 05:34:04.559597 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Sep 9 05:34:04.565094 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Sep 9 05:34:04.642019 (udev-worker)[1816]: Network interface NamePolicy= disabled on kernel command line.
Sep 9 05:34:04.647409 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Sep 9 05:34:04.781956 systemd-resolved[1765]: Positive Trust Anchors:
Sep 9 05:34:04.781972 systemd-resolved[1765]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 9 05:34:04.782028 systemd-resolved[1765]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 9 05:34:04.805687 systemd-resolved[1765]: Defaulting to hostname 'linux'.
Sep 9 05:34:04.810122 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 9 05:34:04.811238 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 9 05:34:04.812644 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 9 05:34:04.813339 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Sep 9 05:34:04.814126 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Sep 9 05:34:04.815103 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer.
Sep 9 05:34:04.816154 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Sep 9 05:34:04.817303 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Sep 9 05:34:04.818633 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Sep 9 05:34:04.819155 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Sep 9 05:34:04.819202 systemd[1]: Reached target paths.target - Path Units.
Sep 9 05:34:04.819739 systemd[1]: Reached target timers.target - Timer Units.
Sep 9 05:34:04.822232 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Sep 9 05:34:04.826721 systemd[1]: Starting docker.socket - Docker Socket for the API...
Sep 9 05:34:04.831908 systemd-networkd[1819]: lo: Link UP
Sep 9 05:34:04.832273 systemd-networkd[1819]: lo: Gained carrier
Sep 9 05:34:04.834128 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Sep 9 05:34:04.834362 systemd-networkd[1819]: Enumeration completed
Sep 9 05:34:04.835348 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Sep 9 05:34:04.836779 systemd-networkd[1819]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 05:34:04.836915 systemd-networkd[1819]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 9 05:34:04.837119 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Sep 9 05:34:04.840502 systemd-networkd[1819]: eth0: Link UP
Sep 9 05:34:04.840888 systemd-networkd[1819]: eth0: Gained carrier
Sep 9 05:34:04.841020 systemd-networkd[1819]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 05:34:04.847816 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Sep 9 05:34:04.849674 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Sep 9 05:34:04.851622 systemd-networkd[1819]: eth0: DHCPv4 address 172.31.26.176/20, gateway 172.31.16.1 acquired from 172.31.16.1
Sep 9 05:34:04.852101 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 9 05:34:04.853756 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Sep 9 05:34:04.856351 systemd[1]: Reached target network.target - Network.
Sep 9 05:34:04.858687 systemd[1]: Reached target sockets.target - Socket Units.
Sep 9 05:34:04.859281 systemd[1]: Reached target basic.target - Basic System.
Sep 9 05:34:04.859945 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Sep 9 05:34:04.859986 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Sep 9 05:34:04.862805 systemd[1]: Starting containerd.service - containerd container runtime...
Sep 9 05:34:04.867235 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Sep 9 05:34:04.867670 kernel: mousedev: PS/2 mouse device common for all mice
Sep 9 05:34:04.869763 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Sep 9 05:34:04.876800 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Sep 9 05:34:04.879314 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Sep 9 05:34:04.887745 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Sep 9 05:34:04.888362 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Sep 9 05:34:04.892825 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh...
Sep 9 05:34:04.897453 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Sep 9 05:34:04.904421 systemd[1]: Started ntpd.service - Network Time Service.
Sep 9 05:34:04.910771 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Sep 9 05:34:04.917328 systemd[1]: Starting setup-oem.service - Setup OEM...
Sep 9 05:34:04.923859 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Sep 9 05:34:04.932846 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Sep 9 05:34:04.948301 systemd[1]: Starting systemd-logind.service - User Login Management...
Sep 9 05:34:04.962734 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Sep 9 05:34:04.967800 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Sep 9 05:34:04.969560 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Sep 9 05:34:04.970272 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Sep 9 05:34:04.972766 systemd[1]: Starting update-engine.service - Update Engine...
Sep 9 05:34:04.985437 jq[1865]: false
Sep 9 05:34:04.984869 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Sep 9 05:34:05.000612 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Sep 9 05:34:05.001729 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Sep 9 05:34:05.002787 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Sep 9 05:34:05.044111 google_oslogin_nss_cache[1869]: oslogin_cache_refresh[1869]: Refreshing passwd entry cache
Sep 9 05:34:05.044134 oslogin_cache_refresh[1869]: Refreshing passwd entry cache
Sep 9 05:34:05.057356 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Sep 9 05:34:05.059287 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Sep 9 05:34:05.086037 google_oslogin_nss_cache[1869]: oslogin_cache_refresh[1869]: Failure getting users, quitting
Sep 9 05:34:05.086160 oslogin_cache_refresh[1869]: Failure getting users, quitting
Sep 9 05:34:05.086619 google_oslogin_nss_cache[1869]: oslogin_cache_refresh[1869]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Sep 9 05:34:05.086706 oslogin_cache_refresh[1869]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Sep 9 05:34:05.086824 google_oslogin_nss_cache[1869]: oslogin_cache_refresh[1869]: Refreshing group entry cache
Sep 9 05:34:05.086894 oslogin_cache_refresh[1869]: Refreshing group entry cache
Sep 9 05:34:05.090130 google_oslogin_nss_cache[1869]: oslogin_cache_refresh[1869]: Failure getting groups, quitting
Sep 9 05:34:05.092956 google_oslogin_nss_cache[1869]: oslogin_cache_refresh[1869]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Sep 9 05:34:05.091846 oslogin_cache_refresh[1869]: Failure getting groups, quitting
Sep 9 05:34:05.091871 oslogin_cache_refresh[1869]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Sep 9 05:34:05.097040 jq[1885]: true
Sep 9 05:34:05.106790 systemd[1]: google-oslogin-cache.service: Deactivated successfully.
Sep 9 05:34:05.107484 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh.
Sep 9 05:34:05.111247 extend-filesystems[1868]: Found /dev/nvme0n1p6
Sep 9 05:34:05.133565 extend-filesystems[1868]: Found /dev/nvme0n1p9
Sep 9 05:34:05.136422 tar[1920]: linux-amd64/helm
Sep 9 05:34:05.143159 update_engine[1884]: I20250909 05:34:05.143069 1884 main.cc:92] Flatcar Update Engine starting
Sep 9 05:34:05.156715 extend-filesystems[1868]: Checking size of /dev/nvme0n1p9
Sep 9 05:34:05.164057 systemd[1]: motdgen.service: Deactivated successfully.
Sep 9 05:34:05.165251 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Sep 9 05:34:05.172005 (ntainerd)[1938]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Sep 9 05:34:05.196920 dbus-daemon[1862]: [system] SELinux support is enabled
Sep 9 05:34:05.198398 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Sep 9 05:34:05.205520 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Sep 9 05:34:05.205578 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Sep 9 05:34:05.207896 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Sep 9 05:34:05.207922 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Sep 9 05:34:05.213572 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3
Sep 9 05:34:05.222140 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Sep 9 05:34:05.226044 kernel: ACPI: button: Power Button [PWRF]
Sep 9 05:34:05.226127 kernel: input: Sleep Button as /devices/LNXSYSTM:00/LNXSLPBN:00/input/input4
Sep 9 05:34:05.231215 jq[1936]: true
Sep 9 05:34:05.241585 extend-filesystems[1868]: Resized partition /dev/nvme0n1p9
Sep 9 05:34:05.254017 systemd[1]: Finished setup-oem.service - Setup OEM.
Sep 9 05:34:05.259722 kernel: ACPI: button: Sleep Button [SLPF]
Sep 9 05:34:05.261648 extend-filesystems[1972]: resize2fs 1.47.3 (8-Jul-2025)
Sep 9 05:34:05.273085 dbus-daemon[1862]: [system] Activating via systemd: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.0' (uid=244 pid=1819 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0")
Sep 9 05:34:05.276803 systemd[1]: Started update-engine.service - Update Engine.
Sep 9 05:34:05.279581 update_engine[1884]: I20250909 05:34:05.277287 1884 update_check_scheduler.cc:74] Next update check in 9m33s
Sep 9 05:34:05.283201 systemd[1]: Starting systemd-hostnamed.service - Hostname Service...
Sep 9 05:34:05.286090 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Sep 9 05:34:05.290628 kernel: EXT4-fs (nvme0n1p9): resizing filesystem from 553472 to 1489915 blocks
Sep 9 05:34:05.429953 coreos-metadata[1861]: Sep 09 05:34:05.429 INFO Putting http://169.254.169.254/latest/api/token: Attempt #1
Sep 9 05:34:05.440661 kernel: EXT4-fs (nvme0n1p9): resized filesystem to 1489915
Sep 9 05:34:05.440715 coreos-metadata[1861]: Sep 09 05:34:05.434 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/instance-id: Attempt #1
Sep 9 05:34:05.440715 coreos-metadata[1861]: Sep 09 05:34:05.436 INFO Fetch successful
Sep 9 05:34:05.440715 coreos-metadata[1861]: Sep 09 05:34:05.436 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/instance-type: Attempt #1
Sep 9 05:34:05.440715 coreos-metadata[1861]: Sep 09 05:34:05.439 INFO Fetch successful
Sep 9 05:34:05.440715 coreos-metadata[1861]: Sep 09 05:34:05.440 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/local-ipv4: Attempt #1
Sep 9 05:34:05.443230 coreos-metadata[1861]: Sep 09 05:34:05.442 INFO Fetch successful
Sep 9 05:34:05.443230 coreos-metadata[1861]: Sep 09 05:34:05.442 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-ipv4: Attempt #1
Sep 9 05:34:05.448194 coreos-metadata[1861]: Sep 09 05:34:05.445 INFO Fetch successful
Sep 9 05:34:05.448194 coreos-metadata[1861]: Sep 09 05:34:05.445 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/ipv6: Attempt #1
Sep 9 05:34:05.447594 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Sep 9 05:34:05.448435 extend-filesystems[1972]: Filesystem at /dev/nvme0n1p9 is mounted on /; on-line resizing required
Sep 9 05:34:05.448435 extend-filesystems[1972]: old_desc_blocks = 1, new_desc_blocks = 1
Sep 9 05:34:05.448435 extend-filesystems[1972]: The filesystem on /dev/nvme0n1p9 is now 1489915 (4k) blocks long.
Sep 9 05:34:05.467668 bash[2000]: Updated "/home/core/.ssh/authorized_keys"
Sep 9 05:34:05.467797 coreos-metadata[1861]: Sep 09 05:34:05.449 INFO Fetch failed with 404: resource not found
Sep 9 05:34:05.467797 coreos-metadata[1861]: Sep 09 05:34:05.449 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/placement/availability-zone: Attempt #1
Sep 9 05:34:05.467797 coreos-metadata[1861]: Sep 09 05:34:05.455 INFO Fetch successful
Sep 9 05:34:05.467797 coreos-metadata[1861]: Sep 09 05:34:05.455 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/placement/availability-zone-id: Attempt #1
Sep 9 05:34:05.467797 coreos-metadata[1861]: Sep 09 05:34:05.458 INFO Fetch successful
Sep 9 05:34:05.467797 coreos-metadata[1861]: Sep 09 05:34:05.459 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/hostname: Attempt #1
Sep 9 05:34:05.467797 coreos-metadata[1861]: Sep 09 05:34:05.462 INFO Fetch successful
Sep 9 05:34:05.467797 coreos-metadata[1861]: Sep 09 05:34:05.462 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-hostname: Attempt #1
Sep 9 05:34:05.468116 extend-filesystems[1868]: Resized filesystem in /dev/nvme0n1p9
Sep 9 05:34:05.448612 systemd[1]: extend-filesystems.service: Deactivated successfully.
Sep 9 05:34:05.484626 coreos-metadata[1861]: Sep 09 05:34:05.470 INFO Fetch successful
Sep 9 05:34:05.484626 coreos-metadata[1861]: Sep 09 05:34:05.470 INFO Fetching http://169.254.169.254/2021-01-03/dynamic/instance-identity/document: Attempt #1
Sep 9 05:34:05.484626 coreos-metadata[1861]: Sep 09 05:34:05.475 INFO Fetch successful
Sep 9 05:34:05.507090 kernel: piix4_smbus 0000:00:01.3: SMBus base address uninitialized - upgrade BIOS or use force_addr=0xaddr
Sep 9 05:34:05.448886 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Sep 9 05:34:05.471171 systemd[1]: Starting sshkeys.service...
Sep 9 05:34:05.521266 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Amazon Elastic Block Store OEM.
Sep 9 05:34:05.536676 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Sep 9 05:34:05.545585 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
Sep 9 05:34:05.549308 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
Sep 9 05:34:05.630653 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Sep 9 05:34:05.631822 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Sep 9 05:34:05.701861 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Sep 9 05:34:05.763151 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Sep 9 05:34:05.837161 locksmithd[1984]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Sep 9 05:34:05.883130 coreos-metadata[2020]: Sep 09 05:34:05.881 INFO Putting http://169.254.169.254/latest/api/token: Attempt #1
Sep 9 05:34:05.883130 coreos-metadata[2020]: Sep 09 05:34:05.881 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-keys: Attempt #1
Sep 9 05:34:05.883130 coreos-metadata[2020]: Sep 09 05:34:05.881 INFO Fetch successful
Sep 9 05:34:05.883130 coreos-metadata[2020]: Sep 09 05:34:05.881 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-keys/0/openssh-key: Attempt #1
Sep 9 05:34:05.883130 coreos-metadata[2020]: Sep 09 05:34:05.881 INFO Fetch successful
Sep 9 05:34:05.884673 unknown[2020]: wrote ssh authorized keys file for user: core
Sep 9 05:34:05.916603 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 9 05:34:05.964051 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 9 05:34:05.964363 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 05:34:05.967343 ntpd[1871]: ntpd 4.2.8p17@1.4004-o Tue Sep 9 03:09:56 UTC 2025 (1): Starting
Sep 9 05:34:05.968241 ntpd[1871]: 9 Sep 05:34:05 ntpd[1871]: ntpd 4.2.8p17@1.4004-o Tue Sep 9 03:09:56 UTC 2025 (1): Starting
Sep 9 05:34:05.968241 ntpd[1871]: 9 Sep 05:34:05 ntpd[1871]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp
Sep 9 05:34:05.968241 ntpd[1871]: 9 Sep 05:34:05 ntpd[1871]: ----------------------------------------------------
Sep 9 05:34:05.968241 ntpd[1871]: 9 Sep 05:34:05 ntpd[1871]: ntp-4 is maintained by Network Time Foundation,
Sep 9 05:34:05.968241 ntpd[1871]: 9 Sep 05:34:05 ntpd[1871]: Inc. (NTF), a non-profit 501(c)(3) public-benefit
Sep 9 05:34:05.968241 ntpd[1871]: 9 Sep 05:34:05 ntpd[1871]: corporation. Support and training for ntp-4 are
Sep 9 05:34:05.968241 ntpd[1871]: 9 Sep 05:34:05 ntpd[1871]: available at https://www.nwtime.org/support
Sep 9 05:34:05.968241 ntpd[1871]: 9 Sep 05:34:05 ntpd[1871]: ----------------------------------------------------
Sep 9 05:34:05.967378 ntpd[1871]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp
Sep 9 05:34:05.969307 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 9 05:34:05.978026 ntpd[1871]: 9 Sep 05:34:05 ntpd[1871]: proto: precision = 0.064 usec (-24)
Sep 9 05:34:05.978026 ntpd[1871]: 9 Sep 05:34:05 ntpd[1871]: basedate set to 2025-08-28
Sep 9 05:34:05.978026 ntpd[1871]: 9 Sep 05:34:05 ntpd[1871]: gps base set to 2025-08-31 (week 2382)
Sep 9 05:34:05.967388 ntpd[1871]: ----------------------------------------------------
Sep 9 05:34:05.967398 ntpd[1871]: ntp-4 is maintained by Network Time Foundation,
Sep 9 05:34:05.967406 ntpd[1871]: Inc. (NTF), a non-profit 501(c)(3) public-benefit
Sep 9 05:34:05.967415 ntpd[1871]: corporation. Support and training for ntp-4 are
Sep 9 05:34:05.967423 ntpd[1871]: available at https://www.nwtime.org/support
Sep 9 05:34:05.967433 ntpd[1871]: ----------------------------------------------------
Sep 9 05:34:05.972906 ntpd[1871]: proto: precision = 0.064 usec (-24)
Sep 9 05:34:05.973718 ntpd[1871]: basedate set to 2025-08-28
Sep 9 05:34:05.973738 ntpd[1871]: gps base set to 2025-08-31 (week 2382)
Sep 9 05:34:05.989638 ntpd[1871]: Listen and drop on 0 v6wildcard [::]:123
Sep 9 05:34:05.992521 ntpd[1871]: 9 Sep 05:34:05 ntpd[1871]: Listen and drop on 0 v6wildcard [::]:123
Sep 9 05:34:05.992521 ntpd[1871]: 9 Sep 05:34:05 ntpd[1871]: Listen and drop on 1 v4wildcard 0.0.0.0:123
Sep 9 05:34:05.992521 ntpd[1871]: 9 Sep 05:34:05 ntpd[1871]: Listen normally on 2 lo 127.0.0.1:123
Sep 9 05:34:05.992521 ntpd[1871]: 9 Sep 05:34:05 ntpd[1871]: Listen normally on 3 eth0 172.31.26.176:123
Sep 9 05:34:05.992521 ntpd[1871]: 9 Sep 05:34:05 ntpd[1871]: Listen normally on 4 lo [::1]:123
Sep 9 05:34:05.992521 ntpd[1871]: 9 Sep 05:34:05 ntpd[1871]: bind(21) AF_INET6 fe80::4b2:dff:fecf:81ad%2#123 flags 0x11 failed: Cannot assign requested address
Sep 9 05:34:05.992521 ntpd[1871]: 9 Sep 05:34:05 ntpd[1871]: unable to create socket on eth0 (5) for fe80::4b2:dff:fecf:81ad%2#123
Sep 9 05:34:05.992521 ntpd[1871]: 9 Sep 05:34:05 ntpd[1871]: failed to init interface for address fe80::4b2:dff:fecf:81ad%2
Sep 9 05:34:05.992521 ntpd[1871]: 9 Sep 05:34:05 ntpd[1871]: Listening on routing socket on fd #21 for interface updates
Sep 9 05:34:05.989707 ntpd[1871]: Listen and drop on 1 v4wildcard 0.0.0.0:123
Sep 9 05:34:05.989900 ntpd[1871]: Listen normally on 2 lo 127.0.0.1:123
Sep 9 05:34:05.989939 ntpd[1871]: Listen normally on 3 eth0 172.31.26.176:123
Sep 9 05:34:05.989982 ntpd[1871]: Listen normally on 4 lo [::1]:123
Sep 9 05:34:05.990029 ntpd[1871]: bind(21) AF_INET6 fe80::4b2:dff:fecf:81ad%2#123 flags 0x11 failed: Cannot assign requested address
Sep 9 05:34:05.990049 ntpd[1871]: unable to create socket on eth0 (5) for fe80::4b2:dff:fecf:81ad%2#123
Sep 9 05:34:05.990063 ntpd[1871]: failed to init interface for address fe80::4b2:dff:fecf:81ad%2
Sep 9 05:34:05.990094 ntpd[1871]: Listening on routing socket on fd #21 for interface updates
Sep 9 05:34:05.999685 update-ssh-keys[2053]: Updated "/home/core/.ssh/authorized_keys"
Sep 9 05:34:05.998616 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
Sep 9 05:34:06.002008 systemd[1]: Finished sshkeys.service.
Sep 9 05:34:06.017861 ntpd[1871]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized
Sep 9 05:34:06.020048 ntpd[1871]: 9 Sep 05:34:06 ntpd[1871]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized
Sep 9 05:34:06.020048 ntpd[1871]: 9 Sep 05:34:06 ntpd[1871]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized
Sep 9 05:34:06.017901 ntpd[1871]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized
Sep 9 05:34:06.026764 containerd[1938]: time="2025-09-09T05:34:06Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Sep 9 05:34:06.043332 containerd[1938]: time="2025-09-09T05:34:06.043055478Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5
Sep 9 05:34:06.106792 containerd[1938]: time="2025-09-09T05:34:06.106461418Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="12.969µs"
Sep 9 05:34:06.106792 containerd[1938]: time="2025-09-09T05:34:06.106503783Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Sep 9 05:34:06.106973 containerd[1938]: time="2025-09-09T05:34:06.106527505Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Sep 9 05:34:06.107733 containerd[1938]: time="2025-09-09T05:34:06.107211048Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Sep 9 05:34:06.107733 containerd[1938]: time="2025-09-09T05:34:06.107239814Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Sep 9 05:34:06.107733 containerd[1938]: time="2025-09-09T05:34:06.107272048Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Sep 9 05:34:06.111369 containerd[1938]: time="2025-09-09T05:34:06.110920375Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Sep 9 05:34:06.111369 containerd[1938]: time="2025-09-09T05:34:06.110961312Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Sep 9 05:34:06.120208 containerd[1938]: time="2025-09-09T05:34:06.118167482Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Sep 9 05:34:06.120208 containerd[1938]: time="2025-09-09T05:34:06.118217375Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Sep 9 05:34:06.120208 containerd[1938]: time="2025-09-09T05:34:06.118253358Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Sep 9 05:34:06.120208 containerd[1938]: time="2025-09-09T05:34:06.118267836Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Sep 9 05:34:06.120208 containerd[1938]: time="2025-09-09T05:34:06.118411664Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Sep 9 05:34:06.120208 containerd[1938]: time="2025-09-09T05:34:06.118687593Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Sep 9 05:34:06.120208 containerd[1938]: time="2025-09-09T05:34:06.118730433Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Sep 9 05:34:06.120208 containerd[1938]: time="2025-09-09T05:34:06.118747647Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Sep 9 05:34:06.120755 containerd[1938]: time="2025-09-09T05:34:06.120170091Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Sep 9 05:34:06.123203 containerd[1938]: time="2025-09-09T05:34:06.122767447Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Sep 9 05:34:06.123203 containerd[1938]: time="2025-09-09T05:34:06.122888678Z" level=info msg="metadata content store policy set" policy=shared
Sep 9 05:34:06.128626 containerd[1938]: time="2025-09-09T05:34:06.128350358Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Sep 9 05:34:06.128626 containerd[1938]: time="2025-09-09T05:34:06.128443843Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Sep 9 05:34:06.128626 containerd[1938]: time="2025-09-09T05:34:06.128464371Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Sep 9 05:34:06.132366 containerd[1938]: time="2025-09-09T05:34:06.131249438Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Sep 9 05:34:06.132366 containerd[1938]: time="2025-09-09T05:34:06.131303002Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Sep 9 05:34:06.132366 containerd[1938]: time="2025-09-09T05:34:06.131320962Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Sep 9 05:34:06.132366 containerd[1938]: time="2025-09-09T05:34:06.131343232Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Sep 9 05:34:06.132366 containerd[1938]: time="2025-09-09T05:34:06.131360792Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Sep 9 05:34:06.132366 containerd[1938]: time="2025-09-09T05:34:06.131377217Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Sep 9 05:34:06.132366 containerd[1938]: time="2025-09-09T05:34:06.131392090Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Sep 9 05:34:06.132366 containerd[1938]: time="2025-09-09T05:34:06.131405918Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Sep 9 05:34:06.132366 containerd[1938]: time="2025-09-09T05:34:06.131424028Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Sep 9 05:34:06.132366 containerd[1938]: time="2025-09-09T05:34:06.131601619Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Sep 9 05:34:06.132366 containerd[1938]: time="2025-09-09T05:34:06.131626237Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Sep 9 05:34:06.132366 containerd[1938]: time="2025-09-09T05:34:06.131646359Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Sep 9 05:34:06.132366 containerd[1938]:
time="2025-09-09T05:34:06.131666963Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 9 05:34:06.132366 containerd[1938]: time="2025-09-09T05:34:06.131682470Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 9 05:34:06.132923 containerd[1938]: time="2025-09-09T05:34:06.131696828Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Sep 9 05:34:06.132923 containerd[1938]: time="2025-09-09T05:34:06.131712796Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 9 05:34:06.132923 containerd[1938]: time="2025-09-09T05:34:06.131727840Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 9 05:34:06.132923 containerd[1938]: time="2025-09-09T05:34:06.131744542Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 9 05:34:06.132923 containerd[1938]: time="2025-09-09T05:34:06.131761100Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 9 05:34:06.132923 containerd[1938]: time="2025-09-09T05:34:06.131775839Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 9 05:34:06.132923 containerd[1938]: time="2025-09-09T05:34:06.131861973Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 9 05:34:06.132923 containerd[1938]: time="2025-09-09T05:34:06.131881136Z" level=info msg="Start snapshots syncer" Sep 9 05:34:06.134747 containerd[1938]: time="2025-09-09T05:34:06.133696804Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 9 05:34:06.134747 containerd[1938]: time="2025-09-09T05:34:06.134185698Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 9 05:34:06.136782 containerd[1938]: time="2025-09-09T05:34:06.135963079Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 9 05:34:06.136782 containerd[1938]: time="2025-09-09T05:34:06.136244084Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Sep 9 05:34:06.136782 containerd[1938]: time="2025-09-09T05:34:06.136434202Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Sep 9 05:34:06.136782 containerd[1938]: time="2025-09-09T05:34:06.136475989Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 9 05:34:06.136782 containerd[1938]: time="2025-09-09T05:34:06.136498370Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 9 05:34:06.136782 containerd[1938]: time="2025-09-09T05:34:06.136519242Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 9 05:34:06.137608 containerd[1938]: time="2025-09-09T05:34:06.137385618Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 9 05:34:06.137608 containerd[1938]: time="2025-09-09T05:34:06.137417178Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 9 05:34:06.137608 containerd[1938]: time="2025-09-09T05:34:06.137435631Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 9 05:34:06.138350 containerd[1938]: time="2025-09-09T05:34:06.137750892Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 9 05:34:06.138350 containerd[1938]: time="2025-09-09T05:34:06.137776466Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 9 05:34:06.138350 containerd[1938]: time="2025-09-09T05:34:06.137796823Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 9 05:34:06.138350 containerd[1938]: time="2025-09-09T05:34:06.138303218Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 9 05:34:06.139336 containerd[1938]: time="2025-09-09T05:34:06.138335938Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 9 05:34:06.139336 containerd[1938]: time="2025-09-09T05:34:06.138939854Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 9 05:34:06.139336 containerd[1938]: time="2025-09-09T05:34:06.138959746Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 9 05:34:06.139336 containerd[1938]: time="2025-09-09T05:34:06.138972016Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 9 05:34:06.139336 containerd[1938]: time="2025-09-09T05:34:06.138986694Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 9 05:34:06.139336 containerd[1938]: time="2025-09-09T05:34:06.139181767Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 9 05:34:06.139336 containerd[1938]: time="2025-09-09T05:34:06.139209975Z" level=info msg="runtime interface created" Sep 9 05:34:06.139336 containerd[1938]: time="2025-09-09T05:34:06.139218096Z" level=info msg="created NRI interface" Sep 9 05:34:06.139336 containerd[1938]: time="2025-09-09T05:34:06.139231757Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 9 05:34:06.139336 containerd[1938]: time="2025-09-09T05:34:06.139254312Z" level=info msg="Connect containerd service" Sep 9 05:34:06.139336 containerd[1938]: time="2025-09-09T05:34:06.139296686Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 9 05:34:06.143432 containerd[1938]: 
time="2025-09-09T05:34:06.143351038Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 9 05:34:06.241106 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 05:34:06.419067 sshd_keygen[1934]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 9 05:34:06.430655 systemd-networkd[1819]: eth0: Gained IPv6LL Sep 9 05:34:06.438018 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 9 05:34:06.439351 systemd[1]: Reached target network-online.target - Network is Online. Sep 9 05:34:06.446894 systemd[1]: Started amazon-ssm-agent.service - amazon-ssm-agent. Sep 9 05:34:06.452489 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 05:34:06.458241 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 9 05:34:06.484897 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 9 05:34:06.490113 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 9 05:34:06.494869 systemd[1]: Started sshd@0-172.31.26.176:22-147.75.109.163:52750.service - OpenSSH per-connection server daemon (147.75.109.163:52750). Sep 9 05:34:06.570927 containerd[1938]: time="2025-09-09T05:34:06.570871265Z" level=info msg="Start subscribing containerd event" Sep 9 05:34:06.579322 containerd[1938]: time="2025-09-09T05:34:06.574909466Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 9 05:34:06.579322 containerd[1938]: time="2025-09-09T05:34:06.578654625Z" level=info msg=serving... 
address=/run/containerd/containerd.sock Sep 9 05:34:06.579322 containerd[1938]: time="2025-09-09T05:34:06.579064414Z" level=info msg="Start recovering state" Sep 9 05:34:06.594557 containerd[1938]: time="2025-09-09T05:34:06.589015198Z" level=info msg="Start event monitor" Sep 9 05:34:06.594557 containerd[1938]: time="2025-09-09T05:34:06.589158204Z" level=info msg="Start cni network conf syncer for default" Sep 9 05:34:06.594557 containerd[1938]: time="2025-09-09T05:34:06.589172599Z" level=info msg="Start streaming server" Sep 9 05:34:06.594557 containerd[1938]: time="2025-09-09T05:34:06.589366393Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 9 05:34:06.594557 containerd[1938]: time="2025-09-09T05:34:06.589381725Z" level=info msg="runtime interface starting up..." Sep 9 05:34:06.594557 containerd[1938]: time="2025-09-09T05:34:06.589392013Z" level=info msg="starting plugins..." Sep 9 05:34:06.594557 containerd[1938]: time="2025-09-09T05:34:06.589410748Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 9 05:34:06.597415 systemd[1]: Started containerd.service - containerd container runtime. Sep 9 05:34:06.599509 containerd[1938]: time="2025-09-09T05:34:06.597272498Z" level=info msg="containerd successfully booted in 0.572947s" Sep 9 05:34:06.632291 systemd[1]: issuegen.service: Deactivated successfully. Sep 9 05:34:06.633653 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 9 05:34:06.638955 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 9 05:34:06.660370 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 9 05:34:06.680344 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 9 05:34:06.693646 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 9 05:34:06.697901 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Sep 9 05:34:06.699721 systemd[1]: Reached target getty.target - Login Prompts. 
Sep 9 05:34:06.788908 systemd-logind[1878]: Watching system buttons on /dev/input/event2 (Power Button) Sep 9 05:34:06.788941 systemd-logind[1878]: Watching system buttons on /dev/input/event3 (Sleep Button) Sep 9 05:34:06.788967 systemd-logind[1878]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Sep 9 05:34:06.790172 systemd-logind[1878]: New seat seat0. Sep 9 05:34:06.795964 systemd[1]: Started systemd-logind.service - User Login Management. Sep 9 05:34:06.846428 amazon-ssm-agent[2159]: Initializing new seelog logger Sep 9 05:34:06.847052 amazon-ssm-agent[2159]: New Seelog Logger Creation Complete Sep 9 05:34:06.847178 amazon-ssm-agent[2159]: 2025/09/09 05:34:06 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Sep 9 05:34:06.847734 amazon-ssm-agent[2159]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Sep 9 05:34:06.848565 amazon-ssm-agent[2159]: 2025/09/09 05:34:06 processing appconfig overrides Sep 9 05:34:06.848994 amazon-ssm-agent[2159]: 2025/09/09 05:34:06 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Sep 9 05:34:06.849195 amazon-ssm-agent[2159]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Sep 9 05:34:06.849463 amazon-ssm-agent[2159]: 2025/09/09 05:34:06 processing appconfig overrides Sep 9 05:34:06.849789 amazon-ssm-agent[2159]: 2025-09-09 05:34:06.8489 INFO Proxy environment variables: Sep 9 05:34:06.849981 amazon-ssm-agent[2159]: 2025/09/09 05:34:06 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Sep 9 05:34:06.849981 amazon-ssm-agent[2159]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Sep 9 05:34:06.850057 amazon-ssm-agent[2159]: 2025/09/09 05:34:06 processing appconfig overrides Sep 9 05:34:06.853946 amazon-ssm-agent[2159]: 2025/09/09 05:34:06 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Sep 9 05:34:06.853946 amazon-ssm-agent[2159]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. 
Sep 9 05:34:06.854228 amazon-ssm-agent[2159]: 2025/09/09 05:34:06 processing appconfig overrides Sep 9 05:34:06.858896 tar[1920]: linux-amd64/LICENSE Sep 9 05:34:06.858896 tar[1920]: linux-amd64/README.md Sep 9 05:34:06.892756 systemd[1]: Started systemd-hostnamed.service - Hostname Service. Sep 9 05:34:06.895073 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 9 05:34:06.896422 dbus-daemon[1862]: [system] Successfully activated service 'org.freedesktop.hostname1' Sep 9 05:34:06.899136 dbus-daemon[1862]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.7' (uid=0 pid=1978 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0") Sep 9 05:34:06.908831 systemd[1]: Starting polkit.service - Authorization Manager... Sep 9 05:34:06.925746 sshd[2168]: Accepted publickey for core from 147.75.109.163 port 52750 ssh2: RSA SHA256:k1gUnX9WA3dyp6ylgbUnG2K6cUpm99lcEZsxDzZ5bM4 Sep 9 05:34:06.933986 sshd-session[2168]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:34:06.952851 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 9 05:34:06.954158 amazon-ssm-agent[2159]: 2025-09-09 05:34:06.8489 INFO https_proxy: Sep 9 05:34:06.955650 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 9 05:34:06.983555 systemd-logind[1878]: New session 1 of user core. Sep 9 05:34:06.993798 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 9 05:34:07.002773 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 9 05:34:07.022050 (systemd)[2223]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 9 05:34:07.029567 systemd-logind[1878]: New session c1 of user core. 
Sep 9 05:34:07.057613 amazon-ssm-agent[2159]: 2025-09-09 05:34:06.8489 INFO http_proxy: Sep 9 05:34:07.104938 polkitd[2220]: Started polkitd version 126 Sep 9 05:34:07.118702 polkitd[2220]: Loading rules from directory /etc/polkit-1/rules.d Sep 9 05:34:07.119212 polkitd[2220]: Loading rules from directory /run/polkit-1/rules.d Sep 9 05:34:07.119265 polkitd[2220]: Error opening rules directory: Error opening directory “/run/polkit-1/rules.d”: No such file or directory (g-file-error-quark, 4) Sep 9 05:34:07.119899 polkitd[2220]: Loading rules from directory /usr/local/share/polkit-1/rules.d Sep 9 05:34:07.120057 polkitd[2220]: Error opening rules directory: Error opening directory “/usr/local/share/polkit-1/rules.d”: No such file or directory (g-file-error-quark, 4) Sep 9 05:34:07.120168 polkitd[2220]: Loading rules from directory /usr/share/polkit-1/rules.d Sep 9 05:34:07.125914 polkitd[2220]: Finished loading, compiling and executing 2 rules Sep 9 05:34:07.126293 systemd[1]: Started polkit.service - Authorization Manager. Sep 9 05:34:07.132756 dbus-daemon[1862]: [system] Successfully activated service 'org.freedesktop.PolicyKit1' Sep 9 05:34:07.133658 polkitd[2220]: Acquired the name org.freedesktop.PolicyKit1 on the system bus Sep 9 05:34:07.150944 amazon-ssm-agent[2159]: 2025-09-09 05:34:06.8489 INFO no_proxy: Sep 9 05:34:07.166712 systemd-resolved[1765]: System hostname changed to 'ip-172-31-26-176'. Sep 9 05:34:07.166714 systemd-hostnamed[1978]: Hostname set to (transient) Sep 9 05:34:07.250154 amazon-ssm-agent[2159]: 2025-09-09 05:34:06.8495 INFO Checking if agent identity type OnPrem can be assumed Sep 9 05:34:07.321470 systemd[2223]: Queued start job for default target default.target. Sep 9 05:34:07.328113 systemd[2223]: Created slice app.slice - User Application Slice. Sep 9 05:34:07.328151 systemd[2223]: Reached target paths.target - Paths. Sep 9 05:34:07.328206 systemd[2223]: Reached target timers.target - Timers. 
Sep 9 05:34:07.330231 systemd[2223]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 9 05:34:07.349481 amazon-ssm-agent[2159]: 2025-09-09 05:34:06.8497 INFO Checking if agent identity type EC2 can be assumed Sep 9 05:34:07.358617 systemd[2223]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 9 05:34:07.358772 systemd[2223]: Reached target sockets.target - Sockets. Sep 9 05:34:07.358914 systemd[2223]: Reached target basic.target - Basic System. Sep 9 05:34:07.358996 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 9 05:34:07.359692 systemd[2223]: Reached target default.target - Main User Target. Sep 9 05:34:07.359737 systemd[2223]: Startup finished in 303ms. Sep 9 05:34:07.366758 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 9 05:34:07.448683 amazon-ssm-agent[2159]: 2025-09-09 05:34:06.9367 INFO Agent will take identity from EC2 Sep 9 05:34:07.516355 systemd[1]: Started sshd@1-172.31.26.176:22-147.75.109.163:52764.service - OpenSSH per-connection server daemon (147.75.109.163:52764). Sep 9 05:34:07.524769 amazon-ssm-agent[2159]: 2025/09/09 05:34:07 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Sep 9 05:34:07.524769 amazon-ssm-agent[2159]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Sep 9 05:34:07.525730 amazon-ssm-agent[2159]: 2025/09/09 05:34:07 processing appconfig overrides Sep 9 05:34:07.547498 amazon-ssm-agent[2159]: 2025-09-09 05:34:06.9396 INFO [amazon-ssm-agent] amazon-ssm-agent - v3.3.0.0 Sep 9 05:34:07.559510 amazon-ssm-agent[2159]: 2025-09-09 05:34:06.9397 INFO [amazon-ssm-agent] OS: linux, Arch: amd64 Sep 9 05:34:07.559510 amazon-ssm-agent[2159]: 2025-09-09 05:34:06.9397 INFO [amazon-ssm-agent] Starting Core Agent Sep 9 05:34:07.559510 amazon-ssm-agent[2159]: 2025-09-09 05:34:06.9397 INFO [amazon-ssm-agent] Registrar detected. 
Attempting registration Sep 9 05:34:07.559510 amazon-ssm-agent[2159]: 2025-09-09 05:34:06.9397 INFO [Registrar] Starting registrar module Sep 9 05:34:07.559510 amazon-ssm-agent[2159]: 2025-09-09 05:34:06.9455 INFO [EC2Identity] Checking disk for registration info Sep 9 05:34:07.559510 amazon-ssm-agent[2159]: 2025-09-09 05:34:06.9456 INFO [EC2Identity] No registration info found for ec2 instance, attempting registration Sep 9 05:34:07.559748 amazon-ssm-agent[2159]: 2025-09-09 05:34:06.9456 INFO [EC2Identity] Generating registration keypair Sep 9 05:34:07.559748 amazon-ssm-agent[2159]: 2025-09-09 05:34:07.4756 INFO [EC2Identity] Checking write access before registering Sep 9 05:34:07.559748 amazon-ssm-agent[2159]: 2025-09-09 05:34:07.4779 INFO [EC2Identity] Registering EC2 instance with Systems Manager Sep 9 05:34:07.559748 amazon-ssm-agent[2159]: 2025-09-09 05:34:07.5245 INFO [EC2Identity] EC2 registration was successful. Sep 9 05:34:07.559748 amazon-ssm-agent[2159]: 2025-09-09 05:34:07.5245 INFO [amazon-ssm-agent] Registration attempted. Resuming core agent startup. 
Sep 9 05:34:07.559748 amazon-ssm-agent[2159]: 2025-09-09 05:34:07.5246 INFO [CredentialRefresher] credentialRefresher has started Sep 9 05:34:07.559748 amazon-ssm-agent[2159]: 2025-09-09 05:34:07.5246 INFO [CredentialRefresher] Starting credentials refresher loop Sep 9 05:34:07.559748 amazon-ssm-agent[2159]: 2025-09-09 05:34:07.5588 INFO EC2RoleProvider Successfully connected with instance profile role credentials Sep 9 05:34:07.559748 amazon-ssm-agent[2159]: 2025-09-09 05:34:07.5594 INFO [CredentialRefresher] Credentials ready Sep 9 05:34:07.645998 amazon-ssm-agent[2159]: 2025-09-09 05:34:07.5596 INFO [CredentialRefresher] Next credential rotation will be in 29.999987291683333 minutes Sep 9 05:34:07.694409 sshd[2243]: Accepted publickey for core from 147.75.109.163 port 52764 ssh2: RSA SHA256:k1gUnX9WA3dyp6ylgbUnG2K6cUpm99lcEZsxDzZ5bM4 Sep 9 05:34:07.696111 sshd-session[2243]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:34:07.704442 systemd-logind[1878]: New session 2 of user core. Sep 9 05:34:07.716781 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 9 05:34:07.838661 sshd[2246]: Connection closed by 147.75.109.163 port 52764 Sep 9 05:34:07.838756 sshd-session[2243]: pam_unix(sshd:session): session closed for user core Sep 9 05:34:07.846376 systemd[1]: sshd@1-172.31.26.176:22-147.75.109.163:52764.service: Deactivated successfully. Sep 9 05:34:07.849363 systemd[1]: session-2.scope: Deactivated successfully. Sep 9 05:34:07.852470 systemd-logind[1878]: Session 2 logged out. Waiting for processes to exit. Sep 9 05:34:07.853735 systemd-logind[1878]: Removed session 2. Sep 9 05:34:07.870427 systemd[1]: Started sshd@2-172.31.26.176:22-147.75.109.163:52780.service - OpenSSH per-connection server daemon (147.75.109.163:52780). 
Sep 9 05:34:08.045686 sshd[2252]: Accepted publickey for core from 147.75.109.163 port 52780 ssh2: RSA SHA256:k1gUnX9WA3dyp6ylgbUnG2K6cUpm99lcEZsxDzZ5bM4 Sep 9 05:34:08.046775 sshd-session[2252]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:34:08.052604 systemd-logind[1878]: New session 3 of user core. Sep 9 05:34:08.059779 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 9 05:34:08.181199 sshd[2255]: Connection closed by 147.75.109.163 port 52780 Sep 9 05:34:08.181766 sshd-session[2252]: pam_unix(sshd:session): session closed for user core Sep 9 05:34:08.186995 systemd[1]: sshd@2-172.31.26.176:22-147.75.109.163:52780.service: Deactivated successfully. Sep 9 05:34:08.189310 systemd[1]: session-3.scope: Deactivated successfully. Sep 9 05:34:08.191268 systemd-logind[1878]: Session 3 logged out. Waiting for processes to exit. Sep 9 05:34:08.193471 systemd-logind[1878]: Removed session 3. Sep 9 05:34:08.581043 amazon-ssm-agent[2159]: 2025-09-09 05:34:08.5809 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] [WorkerProvider] Worker ssm-agent-worker is not running, starting worker process Sep 9 05:34:08.683012 amazon-ssm-agent[2159]: 2025-09-09 05:34:08.5829 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] [WorkerProvider] Worker ssm-agent-worker (pid:2262) started Sep 9 05:34:08.783714 amazon-ssm-agent[2159]: 2025-09-09 05:34:08.5829 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] Monitor long running worker health every 60 seconds Sep 9 05:34:08.967872 ntpd[1871]: Listen normally on 6 eth0 [fe80::4b2:dff:fecf:81ad%2]:123 Sep 9 05:34:08.968449 ntpd[1871]: 9 Sep 05:34:08 ntpd[1871]: Listen normally on 6 eth0 [fe80::4b2:dff:fecf:81ad%2]:123 Sep 9 05:34:09.389223 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 05:34:09.391456 systemd[1]: Reached target multi-user.target - Multi-User System. 
Sep 9 05:34:09.393022 systemd[1]: Startup finished in 2.722s (kernel) + 7.818s (initrd) + 9.955s (userspace) = 20.496s. Sep 9 05:34:09.398646 (kubelet)[2280]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 05:34:10.635831 kubelet[2280]: E0909 05:34:10.635748 2280 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 05:34:10.638274 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 05:34:10.638472 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 05:34:10.639083 systemd[1]: kubelet.service: Consumed 1.083s CPU time, 264.5M memory peak. Sep 9 05:34:14.134294 systemd-resolved[1765]: Clock change detected. Flushing caches. Sep 9 05:34:19.392730 systemd[1]: Started sshd@3-172.31.26.176:22-147.75.109.163:56174.service - OpenSSH per-connection server daemon (147.75.109.163:56174). Sep 9 05:34:19.566593 sshd[2292]: Accepted publickey for core from 147.75.109.163 port 56174 ssh2: RSA SHA256:k1gUnX9WA3dyp6ylgbUnG2K6cUpm99lcEZsxDzZ5bM4 Sep 9 05:34:19.570010 sshd-session[2292]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:34:19.583969 systemd-logind[1878]: New session 4 of user core. Sep 9 05:34:19.587736 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 9 05:34:19.705467 sshd[2295]: Connection closed by 147.75.109.163 port 56174 Sep 9 05:34:19.706027 sshd-session[2292]: pam_unix(sshd:session): session closed for user core Sep 9 05:34:19.710156 systemd[1]: sshd@3-172.31.26.176:22-147.75.109.163:56174.service: Deactivated successfully. Sep 9 05:34:19.712137 systemd[1]: session-4.scope: Deactivated successfully. 
Sep 9 05:34:19.713122 systemd-logind[1878]: Session 4 logged out. Waiting for processes to exit. Sep 9 05:34:19.714502 systemd-logind[1878]: Removed session 4. Sep 9 05:34:19.739659 systemd[1]: Started sshd@4-172.31.26.176:22-147.75.109.163:56186.service - OpenSSH per-connection server daemon (147.75.109.163:56186). Sep 9 05:34:19.928602 sshd[2301]: Accepted publickey for core from 147.75.109.163 port 56186 ssh2: RSA SHA256:k1gUnX9WA3dyp6ylgbUnG2K6cUpm99lcEZsxDzZ5bM4 Sep 9 05:34:19.929820 sshd-session[2301]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:34:19.935316 systemd-logind[1878]: New session 5 of user core. Sep 9 05:34:19.942719 systemd[1]: Started session-5.scope - Session 5 of User core. Sep 9 05:34:20.057385 sshd[2304]: Connection closed by 147.75.109.163 port 56186 Sep 9 05:34:20.058229 sshd-session[2301]: pam_unix(sshd:session): session closed for user core Sep 9 05:34:20.062759 systemd[1]: sshd@4-172.31.26.176:22-147.75.109.163:56186.service: Deactivated successfully. Sep 9 05:34:20.064886 systemd[1]: session-5.scope: Deactivated successfully. Sep 9 05:34:20.065726 systemd-logind[1878]: Session 5 logged out. Waiting for processes to exit. Sep 9 05:34:20.066914 systemd-logind[1878]: Removed session 5. Sep 9 05:34:20.103198 systemd[1]: Started sshd@5-172.31.26.176:22-147.75.109.163:38218.service - OpenSSH per-connection server daemon (147.75.109.163:38218). Sep 9 05:34:20.285962 sshd[2310]: Accepted publickey for core from 147.75.109.163 port 38218 ssh2: RSA SHA256:k1gUnX9WA3dyp6ylgbUnG2K6cUpm99lcEZsxDzZ5bM4 Sep 9 05:34:20.287278 sshd-session[2310]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:34:20.293924 systemd-logind[1878]: New session 6 of user core. Sep 9 05:34:20.299730 systemd[1]: Started session-6.scope - Session 6 of User core. 
Sep 9 05:34:20.419654 sshd[2313]: Connection closed by 147.75.109.163 port 38218 Sep 9 05:34:20.420584 sshd-session[2310]: pam_unix(sshd:session): session closed for user core Sep 9 05:34:20.424522 systemd[1]: sshd@5-172.31.26.176:22-147.75.109.163:38218.service: Deactivated successfully. Sep 9 05:34:20.426322 systemd[1]: session-6.scope: Deactivated successfully. Sep 9 05:34:20.427165 systemd-logind[1878]: Session 6 logged out. Waiting for processes to exit. Sep 9 05:34:20.428731 systemd-logind[1878]: Removed session 6. Sep 9 05:34:20.449014 systemd[1]: Started sshd@6-172.31.26.176:22-147.75.109.163:38226.service - OpenSSH per-connection server daemon (147.75.109.163:38226). Sep 9 05:34:20.617424 sshd[2319]: Accepted publickey for core from 147.75.109.163 port 38226 ssh2: RSA SHA256:k1gUnX9WA3dyp6ylgbUnG2K6cUpm99lcEZsxDzZ5bM4 Sep 9 05:34:20.618748 sshd-session[2319]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:34:20.624317 systemd-logind[1878]: New session 7 of user core. Sep 9 05:34:20.630764 systemd[1]: Started session-7.scope - Session 7 of User core. Sep 9 05:34:20.775805 sudo[2323]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 9 05:34:20.776409 sudo[2323]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 05:34:20.791803 sudo[2323]: pam_unix(sudo:session): session closed for user root Sep 9 05:34:20.814249 sshd[2322]: Connection closed by 147.75.109.163 port 38226 Sep 9 05:34:20.815437 sshd-session[2319]: pam_unix(sshd:session): session closed for user core Sep 9 05:34:20.821550 systemd[1]: sshd@6-172.31.26.176:22-147.75.109.163:38226.service: Deactivated successfully. Sep 9 05:34:20.823466 systemd[1]: session-7.scope: Deactivated successfully. Sep 9 05:34:20.824580 systemd-logind[1878]: Session 7 logged out. Waiting for processes to exit. Sep 9 05:34:20.826594 systemd-logind[1878]: Removed session 7. 
Sep 9 05:34:20.852649 systemd[1]: Started sshd@7-172.31.26.176:22-147.75.109.163:38234.service - OpenSSH per-connection server daemon (147.75.109.163:38234).
Sep 9 05:34:21.029688 sshd[2329]: Accepted publickey for core from 147.75.109.163 port 38234 ssh2: RSA SHA256:k1gUnX9WA3dyp6ylgbUnG2K6cUpm99lcEZsxDzZ5bM4
Sep 9 05:34:21.031467 sshd-session[2329]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:34:21.037549 systemd-logind[1878]: New session 8 of user core.
Sep 9 05:34:21.050735 systemd[1]: Started session-8.scope - Session 8 of User core.
Sep 9 05:34:21.147567 sudo[2334]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Sep 9 05:34:21.147837 sudo[2334]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 9 05:34:21.154639 sudo[2334]: pam_unix(sudo:session): session closed for user root
Sep 9 05:34:21.160222 sudo[2333]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Sep 9 05:34:21.160523 sudo[2333]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 9 05:34:21.171664 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 9 05:34:21.215192 augenrules[2356]: No rules
Sep 9 05:34:21.216777 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 9 05:34:21.216990 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 9 05:34:21.218255 sudo[2333]: pam_unix(sudo:session): session closed for user root
Sep 9 05:34:21.241087 sshd[2332]: Connection closed by 147.75.109.163 port 38234
Sep 9 05:34:21.241627 sshd-session[2329]: pam_unix(sshd:session): session closed for user core
Sep 9 05:34:21.245080 systemd[1]: sshd@7-172.31.26.176:22-147.75.109.163:38234.service: Deactivated successfully.
Sep 9 05:34:21.246662 systemd[1]: session-8.scope: Deactivated successfully.
Sep 9 05:34:21.248402 systemd-logind[1878]: Session 8 logged out. Waiting for processes to exit.
Sep 9 05:34:21.249386 systemd-logind[1878]: Removed session 8.
Sep 9 05:34:21.274207 systemd[1]: Started sshd@8-172.31.26.176:22-147.75.109.163:38238.service - OpenSSH per-connection server daemon (147.75.109.163:38238).
Sep 9 05:34:21.451977 sshd[2365]: Accepted publickey for core from 147.75.109.163 port 38238 ssh2: RSA SHA256:k1gUnX9WA3dyp6ylgbUnG2K6cUpm99lcEZsxDzZ5bM4
Sep 9 05:34:21.453256 sshd-session[2365]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:34:21.458884 systemd-logind[1878]: New session 9 of user core.
Sep 9 05:34:21.463696 systemd[1]: Started session-9.scope - Session 9 of User core.
Sep 9 05:34:21.633012 sudo[2369]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Sep 9 05:34:21.633390 sudo[2369]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 9 05:34:22.055126 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Sep 9 05:34:22.059717 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 05:34:22.341813 systemd[1]: Starting docker.service - Docker Application Container Engine...
Sep 9 05:34:22.353961 (dockerd)[2392]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Sep 9 05:34:22.416195 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 05:34:22.428992 (kubelet)[2398]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 9 05:34:22.496208 kubelet[2398]: E0909 05:34:22.496083 2398 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 9 05:34:22.500375 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 9 05:34:22.500818 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 9 05:34:22.501203 systemd[1]: kubelet.service: Consumed 193ms CPU time, 110.6M memory peak.
Sep 9 05:34:22.893119 dockerd[2392]: time="2025-09-09T05:34:22.892742448Z" level=info msg="Starting up"
Sep 9 05:34:22.895778 dockerd[2392]: time="2025-09-09T05:34:22.895370586Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Sep 9 05:34:22.908365 dockerd[2392]: time="2025-09-09T05:34:22.908322438Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s
Sep 9 05:34:22.989929 dockerd[2392]: time="2025-09-09T05:34:22.989878833Z" level=info msg="Loading containers: start."
Sep 9 05:34:23.014515 kernel: Initializing XFRM netlink socket
Sep 9 05:34:23.325231 (udev-worker)[2426]: Network interface NamePolicy= disabled on kernel command line.
Sep 9 05:34:23.373683 systemd-networkd[1819]: docker0: Link UP
Sep 9 05:34:23.384554 dockerd[2392]: time="2025-09-09T05:34:23.384501126Z" level=info msg="Loading containers: done."
Sep 9 05:34:23.399217 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck2007630945-merged.mount: Deactivated successfully.
Sep 9 05:34:23.412685 dockerd[2392]: time="2025-09-09T05:34:23.412618075Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Sep 9 05:34:23.412890 dockerd[2392]: time="2025-09-09T05:34:23.412774968Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4
Sep 9 05:34:23.412944 dockerd[2392]: time="2025-09-09T05:34:23.412894489Z" level=info msg="Initializing buildkit"
Sep 9 05:34:23.467642 dockerd[2392]: time="2025-09-09T05:34:23.467593128Z" level=info msg="Completed buildkit initialization"
Sep 9 05:34:23.476349 dockerd[2392]: time="2025-09-09T05:34:23.476287188Z" level=info msg="Daemon has completed initialization"
Sep 9 05:34:23.476573 systemd[1]: Started docker.service - Docker Application Container Engine.
Sep 9 05:34:23.476965 dockerd[2392]: time="2025-09-09T05:34:23.476914879Z" level=info msg="API listen on /run/docker.sock"
Sep 9 05:34:24.743134 containerd[1938]: time="2025-09-09T05:34:24.743090176Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.12\""
Sep 9 05:34:25.324050 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2252987029.mount: Deactivated successfully.
Sep 9 05:34:26.564164 containerd[1938]: time="2025-09-09T05:34:26.564107937Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:34:26.565302 containerd[1938]: time="2025-09-09T05:34:26.565119878Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.12: active requests=0, bytes read=28079631"
Sep 9 05:34:26.566362 containerd[1938]: time="2025-09-09T05:34:26.566325264Z" level=info msg="ImageCreate event name:\"sha256:b1963c5b49c1722b8f408deaf83aafca7f48f47fed0ed14e5c10e93cc55974a7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:34:26.569051 containerd[1938]: time="2025-09-09T05:34:26.569018187Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:e9011c3bee8c06ecabd7816e119dca4e448c92f7a78acd891de3d2db1dc6c234\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:34:26.570267 containerd[1938]: time="2025-09-09T05:34:26.569738970Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.12\" with image id \"sha256:b1963c5b49c1722b8f408deaf83aafca7f48f47fed0ed14e5c10e93cc55974a7\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.12\", repo digest \"registry.k8s.io/kube-apiserver@sha256:e9011c3bee8c06ecabd7816e119dca4e448c92f7a78acd891de3d2db1dc6c234\", size \"28076431\" in 1.826611158s"
Sep 9 05:34:26.570267 containerd[1938]: time="2025-09-09T05:34:26.569773034Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.12\" returns image reference \"sha256:b1963c5b49c1722b8f408deaf83aafca7f48f47fed0ed14e5c10e93cc55974a7\""
Sep 9 05:34:26.570580 containerd[1938]: time="2025-09-09T05:34:26.570546107Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.12\""
Sep 9 05:34:27.970830 containerd[1938]: time="2025-09-09T05:34:27.970763707Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:34:27.972583 containerd[1938]: time="2025-09-09T05:34:27.972371251Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.12: active requests=0, bytes read=24714681"
Sep 9 05:34:27.974441 containerd[1938]: time="2025-09-09T05:34:27.974406826Z" level=info msg="ImageCreate event name:\"sha256:200c1a99a6f2b9d3b0a6e9b7362663513589341e0e58bc3b953a373efa735dfd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:34:27.978383 containerd[1938]: time="2025-09-09T05:34:27.978343606Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:d2862f94d87320267fddbd55db26556a267aa802e51d6b60f25786b4c428afc8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:34:27.979010 containerd[1938]: time="2025-09-09T05:34:27.978981249Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.12\" with image id \"sha256:200c1a99a6f2b9d3b0a6e9b7362663513589341e0e58bc3b953a373efa735dfd\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.12\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:d2862f94d87320267fddbd55db26556a267aa802e51d6b60f25786b4c428afc8\", size \"26317875\" in 1.408400261s"
Sep 9 05:34:27.979105 containerd[1938]: time="2025-09-09T05:34:27.979092070Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.12\" returns image reference \"sha256:200c1a99a6f2b9d3b0a6e9b7362663513589341e0e58bc3b953a373efa735dfd\""
Sep 9 05:34:27.979961 containerd[1938]: time="2025-09-09T05:34:27.979924277Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.12\""
Sep 9 05:34:29.126651 containerd[1938]: time="2025-09-09T05:34:29.126595240Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:34:29.128063 containerd[1938]: time="2025-09-09T05:34:29.127959405Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.12: active requests=0, bytes read=18782427"
Sep 9 05:34:29.129112 containerd[1938]: time="2025-09-09T05:34:29.129069976Z" level=info msg="ImageCreate event name:\"sha256:bcdd9599681a9460a5539177a986dbdaf880ac56eeb117ab94adb8f37889efba\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:34:29.131711 containerd[1938]: time="2025-09-09T05:34:29.131653535Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:152943b7e30244f4415fd0a5860a2dccd91660fe983d30a28a10edb0cc8f6756\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:34:29.132687 containerd[1938]: time="2025-09-09T05:34:29.132534312Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.12\" with image id \"sha256:bcdd9599681a9460a5539177a986dbdaf880ac56eeb117ab94adb8f37889efba\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.12\", repo digest \"registry.k8s.io/kube-scheduler@sha256:152943b7e30244f4415fd0a5860a2dccd91660fe983d30a28a10edb0cc8f6756\", size \"20385639\" in 1.152579749s"
Sep 9 05:34:29.132687 containerd[1938]: time="2025-09-09T05:34:29.132566983Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.12\" returns image reference \"sha256:bcdd9599681a9460a5539177a986dbdaf880ac56eeb117ab94adb8f37889efba\""
Sep 9 05:34:29.133122 containerd[1938]: time="2025-09-09T05:34:29.133091695Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.12\""
Sep 9 05:34:30.135522 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount15415599.mount: Deactivated successfully.
Sep 9 05:34:30.702526 containerd[1938]: time="2025-09-09T05:34:30.702186904Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:34:30.705374 containerd[1938]: time="2025-09-09T05:34:30.705310568Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.12: active requests=0, bytes read=30384255"
Sep 9 05:34:30.707923 containerd[1938]: time="2025-09-09T05:34:30.706906282Z" level=info msg="ImageCreate event name:\"sha256:507cc52f5f78c0cff25e904c76c18e6bfc90982e9cc2aa4dcb19033f21c3f679\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:34:30.709947 containerd[1938]: time="2025-09-09T05:34:30.709878832Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:90aa6b5f4065937521ff8438bc705317485d0be3f8b00a07145e697d92cc2cc6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:34:30.710542 containerd[1938]: time="2025-09-09T05:34:30.710506700Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.12\" with image id \"sha256:507cc52f5f78c0cff25e904c76c18e6bfc90982e9cc2aa4dcb19033f21c3f679\", repo tag \"registry.k8s.io/kube-proxy:v1.31.12\", repo digest \"registry.k8s.io/kube-proxy@sha256:90aa6b5f4065937521ff8438bc705317485d0be3f8b00a07145e697d92cc2cc6\", size \"30383274\" in 1.577384318s"
Sep 9 05:34:30.711373 containerd[1938]: time="2025-09-09T05:34:30.710547548Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.12\" returns image reference \"sha256:507cc52f5f78c0cff25e904c76c18e6bfc90982e9cc2aa4dcb19033f21c3f679\""
Sep 9 05:34:30.711373 containerd[1938]: time="2025-09-09T05:34:30.711042002Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\""
Sep 9 05:34:31.195105 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2134568253.mount: Deactivated successfully.
Sep 9 05:34:32.393202 containerd[1938]: time="2025-09-09T05:34:32.393129835Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:34:32.398030 containerd[1938]: time="2025-09-09T05:34:32.397763903Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565241"
Sep 9 05:34:32.402717 containerd[1938]: time="2025-09-09T05:34:32.402670243Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:34:32.408941 containerd[1938]: time="2025-09-09T05:34:32.408522901Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:34:32.409554 containerd[1938]: time="2025-09-09T05:34:32.409519236Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.698446953s"
Sep 9 05:34:32.409554 containerd[1938]: time="2025-09-09T05:34:32.409555255Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\""
Sep 9 05:34:32.410548 containerd[1938]: time="2025-09-09T05:34:32.410526564Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Sep 9 05:34:32.668328 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Sep 9 05:34:32.670500 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 05:34:33.045784 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 05:34:33.056841 (kubelet)[2746]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 9 05:34:33.073772 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3932095693.mount: Deactivated successfully.
Sep 9 05:34:33.083562 containerd[1938]: time="2025-09-09T05:34:33.083505779Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 9 05:34:33.086639 containerd[1938]: time="2025-09-09T05:34:33.086593304Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138"
Sep 9 05:34:33.087915 containerd[1938]: time="2025-09-09T05:34:33.087867287Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 9 05:34:33.093223 containerd[1938]: time="2025-09-09T05:34:33.093064227Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 9 05:34:33.094303 containerd[1938]: time="2025-09-09T05:34:33.094259994Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 683.607591ms"
Sep 9 05:34:33.094431 containerd[1938]: time="2025-09-09T05:34:33.094309294Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\""
Sep 9 05:34:33.095707 containerd[1938]: time="2025-09-09T05:34:33.095679946Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\""
Sep 9 05:34:33.115740 kubelet[2746]: E0909 05:34:33.115687 2746 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 9 05:34:33.118501 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 9 05:34:33.118646 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 9 05:34:33.119290 systemd[1]: kubelet.service: Consumed 181ms CPU time, 109.7M memory peak.
Sep 9 05:34:33.529975 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3057583105.mount: Deactivated successfully.
Sep 9 05:34:35.476091 containerd[1938]: time="2025-09-09T05:34:35.475975433Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:34:35.478506 containerd[1938]: time="2025-09-09T05:34:35.478430124Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56910709"
Sep 9 05:34:35.483180 containerd[1938]: time="2025-09-09T05:34:35.483067915Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:34:35.486942 containerd[1938]: time="2025-09-09T05:34:35.486879455Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:34:35.488434 containerd[1938]: time="2025-09-09T05:34:35.487998567Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 2.392151293s"
Sep 9 05:34:35.488434 containerd[1938]: time="2025-09-09T05:34:35.488038352Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\""
Sep 9 05:34:38.028456 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 05:34:38.029214 systemd[1]: kubelet.service: Consumed 181ms CPU time, 109.7M memory peak.
Sep 9 05:34:38.031759 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 05:34:38.067988 systemd[1]: Reload requested from client PID 2840 ('systemctl') (unit session-9.scope)...
Sep 9 05:34:38.068012 systemd[1]: Reloading...
Sep 9 05:34:38.190562 zram_generator::config[2887]: No configuration found.
Sep 9 05:34:38.483809 systemd[1]: Reloading finished in 415 ms.
Sep 9 05:34:38.508282 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Sep 9 05:34:38.543328 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Sep 9 05:34:38.543406 systemd[1]: kubelet.service: Failed with result 'signal'.
Sep 9 05:34:38.543927 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 05:34:38.543980 systemd[1]: kubelet.service: Consumed 139ms CPU time, 98.2M memory peak.
Sep 9 05:34:38.546041 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 05:34:38.788280 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 05:34:38.798939 (kubelet)[2954]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 9 05:34:38.869618 kubelet[2954]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 9 05:34:38.869618 kubelet[2954]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Sep 9 05:34:38.869618 kubelet[2954]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 9 05:34:38.872959 kubelet[2954]: I0909 05:34:38.872892 2954 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 9 05:34:39.180718 kubelet[2954]: I0909 05:34:39.180667 2954 server.go:491] "Kubelet version" kubeletVersion="v1.31.8"
Sep 9 05:34:39.180718 kubelet[2954]: I0909 05:34:39.180703 2954 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 9 05:34:39.181085 kubelet[2954]: I0909 05:34:39.181058 2954 server.go:934] "Client rotation is on, will bootstrap in background"
Sep 9 05:34:39.238156 kubelet[2954]: I0909 05:34:39.238067 2954 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 9 05:34:39.242494 kubelet[2954]: E0909 05:34:39.242378 2954 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://172.31.26.176:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 172.31.26.176:6443: connect: connection refused" logger="UnhandledError"
Sep 9 05:34:39.256007 kubelet[2954]: I0909 05:34:39.255457 2954 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 9 05:34:39.263931 kubelet[2954]: I0909 05:34:39.259975 2954 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 9 05:34:39.269538 kubelet[2954]: I0909 05:34:39.269472 2954 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Sep 9 05:34:39.269754 kubelet[2954]: I0909 05:34:39.269709 2954 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 9 05:34:39.269935 kubelet[2954]: I0909 05:34:39.269751 2954 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-172-31-26-176","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 9 05:34:39.270062 kubelet[2954]: I0909 05:34:39.269938 2954 topology_manager.go:138] "Creating topology manager with none policy"
Sep 9 05:34:39.270062 kubelet[2954]: I0909 05:34:39.269948 2954 container_manager_linux.go:300] "Creating device plugin manager"
Sep 9 05:34:39.270919 kubelet[2954]: I0909 05:34:39.270880 2954 state_mem.go:36] "Initialized new in-memory state store"
Sep 9 05:34:39.277941 kubelet[2954]: I0909 05:34:39.277449 2954 kubelet.go:408] "Attempting to sync node with API server"
Sep 9 05:34:39.277941 kubelet[2954]: I0909 05:34:39.277507 2954 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 9 05:34:39.277941 kubelet[2954]: I0909 05:34:39.277548 2954 kubelet.go:314] "Adding apiserver pod source"
Sep 9 05:34:39.277941 kubelet[2954]: I0909 05:34:39.277567 2954 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 9 05:34:39.286636 kubelet[2954]: W0909 05:34:39.286200 2954 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://172.31.26.176:6443/api/v1/nodes?fieldSelector=metadata.name%3Dip-172-31-26-176&limit=500&resourceVersion=0": dial tcp 172.31.26.176:6443: connect: connection refused
Sep 9 05:34:39.286636 kubelet[2954]: E0909 05:34:39.286274 2954 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://172.31.26.176:6443/api/v1/nodes?fieldSelector=metadata.name%3Dip-172-31-26-176&limit=500&resourceVersion=0\": dial tcp 172.31.26.176:6443: connect: connection refused" logger="UnhandledError"
Sep 9 05:34:39.286777 kubelet[2954]: I0909 05:34:39.286764 2954 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Sep 9 05:34:39.289686 kubelet[2954]: W0909 05:34:39.289624 2954 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://172.31.26.176:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 172.31.26.176:6443: connect: connection refused
Sep 9 05:34:39.289686 kubelet[2954]: E0909 05:34:39.289687 2954 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://172.31.26.176:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 172.31.26.176:6443: connect: connection refused" logger="UnhandledError"
Sep 9 05:34:39.291291 kubelet[2954]: I0909 05:34:39.291085 2954 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 9 05:34:39.292035 kubelet[2954]: W0909 05:34:39.291994 2954 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Sep 9 05:34:39.292671 kubelet[2954]: I0909 05:34:39.292629 2954 server.go:1274] "Started kubelet"
Sep 9 05:34:39.295499 kubelet[2954]: I0909 05:34:39.294780 2954 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Sep 9 05:34:39.296051 kubelet[2954]: I0909 05:34:39.296035 2954 server.go:449] "Adding debug handlers to kubelet server"
Sep 9 05:34:39.301312 kubelet[2954]: I0909 05:34:39.301252 2954 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 9 05:34:39.302203 kubelet[2954]: I0909 05:34:39.301641 2954 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 9 05:34:39.303360 kubelet[2954]: E0909 05:34:39.301867 2954 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://172.31.26.176:6443/api/v1/namespaces/default/events\": dial tcp 172.31.26.176:6443: connect: connection refused" event="&Event{ObjectMeta:{ip-172-31-26-176.1863866c4be668f2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-172-31-26-176,UID:ip-172-31-26-176,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-172-31-26-176,},FirstTimestamp:2025-09-09 05:34:39.292606706 +0000 UTC m=+0.489074291,LastTimestamp:2025-09-09 05:34:39.292606706 +0000 UTC m=+0.489074291,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-172-31-26-176,}"
Sep 9 05:34:39.310510 kubelet[2954]: I0909 05:34:39.310433 2954 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 9 05:34:39.313442 kubelet[2954]: I0909 05:34:39.313400 2954 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 9 05:34:39.319379 kubelet[2954]: I0909 05:34:39.319343 2954 volume_manager.go:289] "Starting Kubelet Volume Manager"
Sep 9 05:34:39.319759 kubelet[2954]: E0909 05:34:39.319728 2954 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ip-172-31-26-176\" not found"
Sep 9 05:34:39.323271 kubelet[2954]: I0909 05:34:39.322561 2954 reconciler.go:26] "Reconciler: start to sync state"
Sep 9 05:34:39.323271 kubelet[2954]: E0909 05:34:39.322600 2954 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.26.176:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-26-176?timeout=10s\": dial tcp 172.31.26.176:6443: connect: connection refused" interval="200ms"
Sep 9 05:34:39.323271 kubelet[2954]: I0909 05:34:39.322917 2954 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Sep 9 05:34:39.324020 kubelet[2954]: W0909 05:34:39.323970 2954 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://172.31.26.176:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.31.26.176:6443: connect: connection refused
Sep 9 05:34:39.324118 kubelet[2954]: E0909 05:34:39.324036 2954 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://172.31.26.176:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 172.31.26.176:6443: connect: connection refused" logger="UnhandledError"
Sep 9 05:34:39.325879 kubelet[2954]: E0909 05:34:39.325857 2954 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 9 05:34:39.326619 kubelet[2954]: I0909 05:34:39.326599 2954 factory.go:221] Registration of the systemd container factory successfully
Sep 9 05:34:39.326729 kubelet[2954]: I0909 05:34:39.326686 2954 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 9 05:34:39.328923 kubelet[2954]: I0909 05:34:39.328856 2954 factory.go:221] Registration of the containerd container factory successfully
Sep 9 05:34:39.348650 kubelet[2954]: I0909 05:34:39.348613 2954 cpu_manager.go:214] "Starting CPU manager" policy="none"
Sep 9 05:34:39.351334 kubelet[2954]: I0909 05:34:39.349326 2954 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Sep 9 05:34:39.351334 kubelet[2954]: I0909 05:34:39.349356 2954 state_mem.go:36] "Initialized new in-memory state store"
Sep 9 05:34:39.354968 kubelet[2954]: I0909 05:34:39.354942 2954 policy_none.go:49] "None policy: Start"
Sep 9 05:34:39.357249 kubelet[2954]: I0909 05:34:39.357214 2954 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Sep 9 05:34:39.357473 kubelet[2954]: I0909 05:34:39.357428 2954 memory_manager.go:170] "Starting memorymanager" policy="None"
Sep 9 05:34:39.357473 kubelet[2954]: I0909 05:34:39.357453 2954 state_mem.go:35] "Initializing new in-memory state store"
Sep 9 05:34:39.362123 kubelet[2954]: I0909 05:34:39.362057 2954 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Sep 9 05:34:39.362123 kubelet[2954]: I0909 05:34:39.362095 2954 status_manager.go:217] "Starting to sync pod status with apiserver"
Sep 9 05:34:39.362123 kubelet[2954]: I0909 05:34:39.362119 2954 kubelet.go:2321] "Starting kubelet main sync loop"
Sep 9 05:34:39.362333 kubelet[2954]: E0909 05:34:39.362167 2954 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Sep 9 05:34:39.364730 kubelet[2954]: W0909 05:34:39.364422 2954 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://172.31.26.176:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.31.26.176:6443: connect: connection refused
Sep 9 05:34:39.364730 kubelet[2954]: E0909 05:34:39.364472 2954 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://172.31.26.176:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 172.31.26.176:6443: connect: connection refused" logger="UnhandledError"
Sep 9 05:34:39.372042 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Sep 9 05:34:39.385354 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Sep 9 05:34:39.389784 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Sep 9 05:34:39.400813 kubelet[2954]: I0909 05:34:39.400624 2954 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 9 05:34:39.400813 kubelet[2954]: I0909 05:34:39.400805 2954 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 9 05:34:39.400956 kubelet[2954]: I0909 05:34:39.400815 2954 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 9 05:34:39.401263 kubelet[2954]: I0909 05:34:39.401194 2954 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 9 05:34:39.403126 kubelet[2954]: E0909 05:34:39.402982 2954 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-172-31-26-176\" not found" Sep 9 05:34:39.476586 systemd[1]: Created slice kubepods-burstable-pod002c4c12c3be7213faf40e652fc9202a.slice - libcontainer container kubepods-burstable-pod002c4c12c3be7213faf40e652fc9202a.slice. Sep 9 05:34:39.495903 systemd[1]: Created slice kubepods-burstable-pod6ffb112b616f74d1396e66a2231fad32.slice - libcontainer container kubepods-burstable-pod6ffb112b616f74d1396e66a2231fad32.slice. Sep 9 05:34:39.502631 kubelet[2954]: I0909 05:34:39.502588 2954 kubelet_node_status.go:72] "Attempting to register node" node="ip-172-31-26-176" Sep 9 05:34:39.502942 kubelet[2954]: E0909 05:34:39.502898 2954 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://172.31.26.176:6443/api/v1/nodes\": dial tcp 172.31.26.176:6443: connect: connection refused" node="ip-172-31-26-176" Sep 9 05:34:39.509263 systemd[1]: Created slice kubepods-burstable-pod0a21a70dc4e47fda4af927d036fbc9c1.slice - libcontainer container kubepods-burstable-pod0a21a70dc4e47fda4af927d036fbc9c1.slice. 
Sep 9 05:34:39.523989 kubelet[2954]: E0909 05:34:39.523908 2954 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.26.176:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-26-176?timeout=10s\": dial tcp 172.31.26.176:6443: connect: connection refused" interval="400ms" Sep 9 05:34:39.624597 kubelet[2954]: I0909 05:34:39.624558 2954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/002c4c12c3be7213faf40e652fc9202a-ca-certs\") pod \"kube-apiserver-ip-172-31-26-176\" (UID: \"002c4c12c3be7213faf40e652fc9202a\") " pod="kube-system/kube-apiserver-ip-172-31-26-176" Sep 9 05:34:39.624597 kubelet[2954]: I0909 05:34:39.624600 2954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6ffb112b616f74d1396e66a2231fad32-ca-certs\") pod \"kube-controller-manager-ip-172-31-26-176\" (UID: \"6ffb112b616f74d1396e66a2231fad32\") " pod="kube-system/kube-controller-manager-ip-172-31-26-176" Sep 9 05:34:39.624820 kubelet[2954]: I0909 05:34:39.624626 2954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/6ffb112b616f74d1396e66a2231fad32-flexvolume-dir\") pod \"kube-controller-manager-ip-172-31-26-176\" (UID: \"6ffb112b616f74d1396e66a2231fad32\") " pod="kube-system/kube-controller-manager-ip-172-31-26-176" Sep 9 05:34:39.624820 kubelet[2954]: I0909 05:34:39.624648 2954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6ffb112b616f74d1396e66a2231fad32-usr-share-ca-certificates\") pod \"kube-controller-manager-ip-172-31-26-176\" (UID: \"6ffb112b616f74d1396e66a2231fad32\") " 
pod="kube-system/kube-controller-manager-ip-172-31-26-176" Sep 9 05:34:39.624820 kubelet[2954]: I0909 05:34:39.624666 2954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/0a21a70dc4e47fda4af927d036fbc9c1-kubeconfig\") pod \"kube-scheduler-ip-172-31-26-176\" (UID: \"0a21a70dc4e47fda4af927d036fbc9c1\") " pod="kube-system/kube-scheduler-ip-172-31-26-176" Sep 9 05:34:39.624820 kubelet[2954]: I0909 05:34:39.624683 2954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/002c4c12c3be7213faf40e652fc9202a-k8s-certs\") pod \"kube-apiserver-ip-172-31-26-176\" (UID: \"002c4c12c3be7213faf40e652fc9202a\") " pod="kube-system/kube-apiserver-ip-172-31-26-176" Sep 9 05:34:39.624820 kubelet[2954]: I0909 05:34:39.624707 2954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/002c4c12c3be7213faf40e652fc9202a-usr-share-ca-certificates\") pod \"kube-apiserver-ip-172-31-26-176\" (UID: \"002c4c12c3be7213faf40e652fc9202a\") " pod="kube-system/kube-apiserver-ip-172-31-26-176" Sep 9 05:34:39.624968 kubelet[2954]: I0909 05:34:39.624727 2954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6ffb112b616f74d1396e66a2231fad32-k8s-certs\") pod \"kube-controller-manager-ip-172-31-26-176\" (UID: \"6ffb112b616f74d1396e66a2231fad32\") " pod="kube-system/kube-controller-manager-ip-172-31-26-176" Sep 9 05:34:39.624968 kubelet[2954]: I0909 05:34:39.624742 2954 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6ffb112b616f74d1396e66a2231fad32-kubeconfig\") pod \"kube-controller-manager-ip-172-31-26-176\" 
(UID: \"6ffb112b616f74d1396e66a2231fad32\") " pod="kube-system/kube-controller-manager-ip-172-31-26-176" Sep 9 05:34:39.705201 kubelet[2954]: I0909 05:34:39.705172 2954 kubelet_node_status.go:72] "Attempting to register node" node="ip-172-31-26-176" Sep 9 05:34:39.705488 kubelet[2954]: E0909 05:34:39.705453 2954 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://172.31.26.176:6443/api/v1/nodes\": dial tcp 172.31.26.176:6443: connect: connection refused" node="ip-172-31-26-176" Sep 9 05:34:39.798526 containerd[1938]: time="2025-09-09T05:34:39.798360555Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ip-172-31-26-176,Uid:002c4c12c3be7213faf40e652fc9202a,Namespace:kube-system,Attempt:0,}" Sep 9 05:34:39.808174 containerd[1938]: time="2025-09-09T05:34:39.808128323Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ip-172-31-26-176,Uid:6ffb112b616f74d1396e66a2231fad32,Namespace:kube-system,Attempt:0,}" Sep 9 05:34:39.818134 containerd[1938]: time="2025-09-09T05:34:39.818090990Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ip-172-31-26-176,Uid:0a21a70dc4e47fda4af927d036fbc9c1,Namespace:kube-system,Attempt:0,}" Sep 9 05:34:39.925406 kubelet[2954]: E0909 05:34:39.925337 2954 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.26.176:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-26-176?timeout=10s\": dial tcp 172.31.26.176:6443: connect: connection refused" interval="800ms" Sep 9 05:34:39.957505 containerd[1938]: time="2025-09-09T05:34:39.956902651Z" level=info msg="connecting to shim 645ed9c4e2c18ff79297e40d12f752843e77d1a79cb8ab7913f2be36db45992f" address="unix:///run/containerd/s/ea8a0118e6402a07b7056d0dd061ce0e082a6d16344d5bd7f4b1c08d7d3d5ef9" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:34:39.957505 containerd[1938]: time="2025-09-09T05:34:39.957052854Z" level=info 
msg="connecting to shim beeabbca2d90758e2bd7c9421c7cf60fd3a5956a08c9421cc442cd12a922cb8b" address="unix:///run/containerd/s/af3968c66355fc38254673a6ae891076636620436c1cfebee3fe02f19eab4e5e" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:34:39.958950 containerd[1938]: time="2025-09-09T05:34:39.958917027Z" level=info msg="connecting to shim de8350f0a681682988b29f544cf2a05bc78728e20611d6d6f0583bb56d30784c" address="unix:///run/containerd/s/22ef77054589c9dca60092ef46de6bb3918d2315c34859fbac6e8644db76c05a" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:34:40.087785 systemd[1]: Started cri-containerd-de8350f0a681682988b29f544cf2a05bc78728e20611d6d6f0583bb56d30784c.scope - libcontainer container de8350f0a681682988b29f544cf2a05bc78728e20611d6d6f0583bb56d30784c. Sep 9 05:34:40.104351 systemd[1]: Started cri-containerd-645ed9c4e2c18ff79297e40d12f752843e77d1a79cb8ab7913f2be36db45992f.scope - libcontainer container 645ed9c4e2c18ff79297e40d12f752843e77d1a79cb8ab7913f2be36db45992f. Sep 9 05:34:40.106367 systemd[1]: Started cri-containerd-beeabbca2d90758e2bd7c9421c7cf60fd3a5956a08c9421cc442cd12a922cb8b.scope - libcontainer container beeabbca2d90758e2bd7c9421c7cf60fd3a5956a08c9421cc442cd12a922cb8b. 
Sep 9 05:34:40.112023 kubelet[2954]: I0909 05:34:40.111923 2954 kubelet_node_status.go:72] "Attempting to register node" node="ip-172-31-26-176" Sep 9 05:34:40.113745 kubelet[2954]: E0909 05:34:40.113699 2954 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://172.31.26.176:6443/api/v1/nodes\": dial tcp 172.31.26.176:6443: connect: connection refused" node="ip-172-31-26-176" Sep 9 05:34:40.234339 kubelet[2954]: W0909 05:34:40.234241 2954 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://172.31.26.176:6443/api/v1/nodes?fieldSelector=metadata.name%3Dip-172-31-26-176&limit=500&resourceVersion=0": dial tcp 172.31.26.176:6443: connect: connection refused Sep 9 05:34:40.234746 kubelet[2954]: E0909 05:34:40.234690 2954 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://172.31.26.176:6443/api/v1/nodes?fieldSelector=metadata.name%3Dip-172-31-26-176&limit=500&resourceVersion=0\": dial tcp 172.31.26.176:6443: connect: connection refused" logger="UnhandledError" Sep 9 05:34:40.236665 containerd[1938]: time="2025-09-09T05:34:40.236443728Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ip-172-31-26-176,Uid:0a21a70dc4e47fda4af927d036fbc9c1,Namespace:kube-system,Attempt:0,} returns sandbox id \"de8350f0a681682988b29f544cf2a05bc78728e20611d6d6f0583bb56d30784c\"" Sep 9 05:34:40.237371 containerd[1938]: time="2025-09-09T05:34:40.236766675Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ip-172-31-26-176,Uid:6ffb112b616f74d1396e66a2231fad32,Namespace:kube-system,Attempt:0,} returns sandbox id \"645ed9c4e2c18ff79297e40d12f752843e77d1a79cb8ab7913f2be36db45992f\"" Sep 9 05:34:40.247200 containerd[1938]: time="2025-09-09T05:34:40.247148684Z" level=info msg="CreateContainer within sandbox 
\"645ed9c4e2c18ff79297e40d12f752843e77d1a79cb8ab7913f2be36db45992f\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 9 05:34:40.250147 containerd[1938]: time="2025-09-09T05:34:40.249594566Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ip-172-31-26-176,Uid:002c4c12c3be7213faf40e652fc9202a,Namespace:kube-system,Attempt:0,} returns sandbox id \"beeabbca2d90758e2bd7c9421c7cf60fd3a5956a08c9421cc442cd12a922cb8b\"" Sep 9 05:34:40.250636 containerd[1938]: time="2025-09-09T05:34:40.250608129Z" level=info msg="CreateContainer within sandbox \"de8350f0a681682988b29f544cf2a05bc78728e20611d6d6f0583bb56d30784c\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 9 05:34:40.260044 containerd[1938]: time="2025-09-09T05:34:40.259974807Z" level=info msg="Container 37a2307e24a0e4b1fafcd24cb164bb2fc5821ce35696ccaead44d47fb5849d03: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:34:40.267816 containerd[1938]: time="2025-09-09T05:34:40.267777450Z" level=info msg="CreateContainer within sandbox \"beeabbca2d90758e2bd7c9421c7cf60fd3a5956a08c9421cc442cd12a922cb8b\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 9 05:34:40.276926 containerd[1938]: time="2025-09-09T05:34:40.276237557Z" level=info msg="Container 3b11f58122a420ea7cc3d28c3c3067d9fbe72756d1c913114800c2363cf07f73: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:34:40.282729 containerd[1938]: time="2025-09-09T05:34:40.282688348Z" level=info msg="CreateContainer within sandbox \"645ed9c4e2c18ff79297e40d12f752843e77d1a79cb8ab7913f2be36db45992f\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"37a2307e24a0e4b1fafcd24cb164bb2fc5821ce35696ccaead44d47fb5849d03\"" Sep 9 05:34:40.283324 containerd[1938]: time="2025-09-09T05:34:40.283299670Z" level=info msg="StartContainer for \"37a2307e24a0e4b1fafcd24cb164bb2fc5821ce35696ccaead44d47fb5849d03\"" Sep 9 05:34:40.284574 containerd[1938]: 
time="2025-09-09T05:34:40.284462060Z" level=info msg="connecting to shim 37a2307e24a0e4b1fafcd24cb164bb2fc5821ce35696ccaead44d47fb5849d03" address="unix:///run/containerd/s/ea8a0118e6402a07b7056d0dd061ce0e082a6d16344d5bd7f4b1c08d7d3d5ef9" protocol=ttrpc version=3 Sep 9 05:34:40.286549 containerd[1938]: time="2025-09-09T05:34:40.286382589Z" level=info msg="Container ee84f01533bb0b941c5b0bc29ade4ad3e0723b97dd9277d2a1a5dc08bd5709fb: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:34:40.294500 containerd[1938]: time="2025-09-09T05:34:40.294419839Z" level=info msg="CreateContainer within sandbox \"de8350f0a681682988b29f544cf2a05bc78728e20611d6d6f0583bb56d30784c\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"3b11f58122a420ea7cc3d28c3c3067d9fbe72756d1c913114800c2363cf07f73\"" Sep 9 05:34:40.295439 containerd[1938]: time="2025-09-09T05:34:40.295407949Z" level=info msg="StartContainer for \"3b11f58122a420ea7cc3d28c3c3067d9fbe72756d1c913114800c2363cf07f73\"" Sep 9 05:34:40.298279 containerd[1938]: time="2025-09-09T05:34:40.298115254Z" level=info msg="connecting to shim 3b11f58122a420ea7cc3d28c3c3067d9fbe72756d1c913114800c2363cf07f73" address="unix:///run/containerd/s/22ef77054589c9dca60092ef46de6bb3918d2315c34859fbac6e8644db76c05a" protocol=ttrpc version=3 Sep 9 05:34:40.307513 containerd[1938]: time="2025-09-09T05:34:40.307131466Z" level=info msg="CreateContainer within sandbox \"beeabbca2d90758e2bd7c9421c7cf60fd3a5956a08c9421cc442cd12a922cb8b\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"ee84f01533bb0b941c5b0bc29ade4ad3e0723b97dd9277d2a1a5dc08bd5709fb\"" Sep 9 05:34:40.318812 containerd[1938]: time="2025-09-09T05:34:40.318772786Z" level=info msg="StartContainer for \"ee84f01533bb0b941c5b0bc29ade4ad3e0723b97dd9277d2a1a5dc08bd5709fb\"" Sep 9 05:34:40.322402 containerd[1938]: time="2025-09-09T05:34:40.322334480Z" level=info msg="connecting to shim 
ee84f01533bb0b941c5b0bc29ade4ad3e0723b97dd9277d2a1a5dc08bd5709fb" address="unix:///run/containerd/s/af3968c66355fc38254673a6ae891076636620436c1cfebee3fe02f19eab4e5e" protocol=ttrpc version=3 Sep 9 05:34:40.326726 systemd[1]: Started cri-containerd-37a2307e24a0e4b1fafcd24cb164bb2fc5821ce35696ccaead44d47fb5849d03.scope - libcontainer container 37a2307e24a0e4b1fafcd24cb164bb2fc5821ce35696ccaead44d47fb5849d03. Sep 9 05:34:40.335834 systemd[1]: Started cri-containerd-3b11f58122a420ea7cc3d28c3c3067d9fbe72756d1c913114800c2363cf07f73.scope - libcontainer container 3b11f58122a420ea7cc3d28c3c3067d9fbe72756d1c913114800c2363cf07f73. Sep 9 05:34:40.364929 systemd[1]: Started cri-containerd-ee84f01533bb0b941c5b0bc29ade4ad3e0723b97dd9277d2a1a5dc08bd5709fb.scope - libcontainer container ee84f01533bb0b941c5b0bc29ade4ad3e0723b97dd9277d2a1a5dc08bd5709fb. Sep 9 05:34:40.469547 containerd[1938]: time="2025-09-09T05:34:40.469367150Z" level=info msg="StartContainer for \"3b11f58122a420ea7cc3d28c3c3067d9fbe72756d1c913114800c2363cf07f73\" returns successfully" Sep 9 05:34:40.507852 containerd[1938]: time="2025-09-09T05:34:40.507734155Z" level=info msg="StartContainer for \"37a2307e24a0e4b1fafcd24cb164bb2fc5821ce35696ccaead44d47fb5849d03\" returns successfully" Sep 9 05:34:40.512776 containerd[1938]: time="2025-09-09T05:34:40.512658272Z" level=info msg="StartContainer for \"ee84f01533bb0b941c5b0bc29ade4ad3e0723b97dd9277d2a1a5dc08bd5709fb\" returns successfully" Sep 9 05:34:40.650993 kubelet[2954]: W0909 05:34:40.650791 2954 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://172.31.26.176:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.31.26.176:6443: connect: connection refused Sep 9 05:34:40.650993 kubelet[2954]: E0909 05:34:40.650879 2954 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://172.31.26.176:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 172.31.26.176:6443: connect: connection refused" logger="UnhandledError" Sep 9 05:34:40.726032 kubelet[2954]: E0909 05:34:40.725978 2954 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.26.176:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-26-176?timeout=10s\": dial tcp 172.31.26.176:6443: connect: connection refused" interval="1.6s" Sep 9 05:34:40.744247 kubelet[2954]: W0909 05:34:40.744036 2954 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://172.31.26.176:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 172.31.26.176:6443: connect: connection refused Sep 9 05:34:40.744247 kubelet[2954]: E0909 05:34:40.744341 2954 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://172.31.26.176:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 172.31.26.176:6443: connect: connection refused" logger="UnhandledError" Sep 9 05:34:40.916984 kubelet[2954]: I0909 05:34:40.916452 2954 kubelet_node_status.go:72] "Attempting to register node" node="ip-172-31-26-176" Sep 9 05:34:40.916984 kubelet[2954]: E0909 05:34:40.916805 2954 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://172.31.26.176:6443/api/v1/nodes\": dial tcp 172.31.26.176:6443: connect: connection refused" node="ip-172-31-26-176" Sep 9 05:34:40.940501 kubelet[2954]: W0909 05:34:40.939305 2954 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://172.31.26.176:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.31.26.176:6443: connect: connection refused Sep 9 05:34:40.940501 
kubelet[2954]: E0909 05:34:40.939439 2954 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://172.31.26.176:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 172.31.26.176:6443: connect: connection refused" logger="UnhandledError" Sep 9 05:34:42.521459 kubelet[2954]: I0909 05:34:42.521426 2954 kubelet_node_status.go:72] "Attempting to register node" node="ip-172-31-26-176" Sep 9 05:34:43.167692 kubelet[2954]: E0909 05:34:43.167632 2954 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-172-31-26-176\" not found" node="ip-172-31-26-176" Sep 9 05:34:43.262448 kubelet[2954]: I0909 05:34:43.262408 2954 kubelet_node_status.go:75] "Successfully registered node" node="ip-172-31-26-176" Sep 9 05:34:43.262647 kubelet[2954]: E0909 05:34:43.262457 2954 kubelet_node_status.go:535] "Error updating node status, will retry" err="error getting node \"ip-172-31-26-176\": node \"ip-172-31-26-176\" not found" Sep 9 05:34:43.287828 kubelet[2954]: I0909 05:34:43.287797 2954 apiserver.go:52] "Watching apiserver" Sep 9 05:34:43.323452 kubelet[2954]: I0909 05:34:43.323406 2954 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Sep 9 05:34:43.421525 kubelet[2954]: E0909 05:34:43.421232 2954 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-ip-172-31-26-176\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ip-172-31-26-176" Sep 9 05:34:43.421525 kubelet[2954]: E0909 05:34:43.421243 2954 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ip-172-31-26-176\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ip-172-31-26-176" Sep 9 05:34:45.293403 systemd[1]: Reload requested 
from client PID 3227 ('systemctl') (unit session-9.scope)... Sep 9 05:34:45.293423 systemd[1]: Reloading... Sep 9 05:34:45.429507 zram_generator::config[3274]: No configuration found. Sep 9 05:34:45.694319 systemd[1]: Reloading finished in 400 ms. Sep 9 05:34:45.731236 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 05:34:45.745109 systemd[1]: kubelet.service: Deactivated successfully. Sep 9 05:34:45.745334 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 05:34:45.745390 systemd[1]: kubelet.service: Consumed 915ms CPU time, 126.7M memory peak. Sep 9 05:34:45.748001 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 05:34:46.021019 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 05:34:46.032104 (kubelet)[3331]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 9 05:34:46.091535 kubelet[3331]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 9 05:34:46.091535 kubelet[3331]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 9 05:34:46.091535 kubelet[3331]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 9 05:34:46.091535 kubelet[3331]: I0909 05:34:46.091295 3331 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 9 05:34:46.101967 kubelet[3331]: I0909 05:34:46.101833 3331 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Sep 9 05:34:46.101967 kubelet[3331]: I0909 05:34:46.101861 3331 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 9 05:34:46.102233 kubelet[3331]: I0909 05:34:46.102220 3331 server.go:934] "Client rotation is on, will bootstrap in background" Sep 9 05:34:46.105090 kubelet[3331]: I0909 05:34:46.105053 3331 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Sep 9 05:34:46.108366 kubelet[3331]: I0909 05:34:46.108323 3331 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 9 05:34:46.116768 kubelet[3331]: I0909 05:34:46.116736 3331 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 9 05:34:46.119638 kubelet[3331]: I0909 05:34:46.119603 3331 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 9 05:34:46.119781 kubelet[3331]: I0909 05:34:46.119755 3331 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Sep 9 05:34:46.119968 kubelet[3331]: I0909 05:34:46.119927 3331 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 9 05:34:46.120166 kubelet[3331]: I0909 05:34:46.119959 3331 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-172-31-26-176","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPol
icyOptions":null,"CgroupVersion":2}
Sep 9 05:34:46.120304 kubelet[3331]: I0909 05:34:46.120175 3331 topology_manager.go:138] "Creating topology manager with none policy"
Sep 9 05:34:46.120304 kubelet[3331]: I0909 05:34:46.120191 3331 container_manager_linux.go:300] "Creating device plugin manager"
Sep 9 05:34:46.120304 kubelet[3331]: I0909 05:34:46.120225 3331 state_mem.go:36] "Initialized new in-memory state store"
Sep 9 05:34:46.120428 kubelet[3331]: I0909 05:34:46.120362 3331 kubelet.go:408] "Attempting to sync node with API server"
Sep 9 05:34:46.120428 kubelet[3331]: I0909 05:34:46.120380 3331 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 9 05:34:46.121504 kubelet[3331]: I0909 05:34:46.121150 3331 kubelet.go:314] "Adding apiserver pod source"
Sep 9 05:34:46.122316 kubelet[3331]: I0909 05:34:46.122296 3331 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 9 05:34:46.131988 kubelet[3331]: I0909 05:34:46.130596 3331 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Sep 9 05:34:46.131988 kubelet[3331]: I0909 05:34:46.131196 3331 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 9 05:34:46.131988 kubelet[3331]: I0909 05:34:46.131795 3331 server.go:1274] "Started kubelet"
Sep 9 05:34:46.145373 kubelet[3331]: I0909 05:34:46.145346 3331 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 9 05:34:46.153811 kubelet[3331]: I0909 05:34:46.153764 3331 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Sep 9 05:34:46.155154 kubelet[3331]: I0909 05:34:46.155137 3331 server.go:449] "Adding debug handlers to kubelet server"
Sep 9 05:34:46.155961 kubelet[3331]: I0909 05:34:46.155928 3331 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 9 05:34:46.156459 kubelet[3331]: I0909 05:34:46.156424 3331 volume_manager.go:289] "Starting Kubelet Volume Manager"
Sep 9 05:34:46.156643 kubelet[3331]: I0909 05:34:46.156629 3331 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 9 05:34:46.157336 kubelet[3331]: I0909 05:34:46.157305 3331 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 9 05:34:46.160407 kubelet[3331]: I0909 05:34:46.160393 3331 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Sep 9 05:34:46.163691 kubelet[3331]: E0909 05:34:46.163663 3331 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 9 05:34:46.164007 kubelet[3331]: I0909 05:34:46.163989 3331 factory.go:221] Registration of the systemd container factory successfully
Sep 9 05:34:46.164199 kubelet[3331]: I0909 05:34:46.164179 3331 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 9 05:34:46.166901 kubelet[3331]: I0909 05:34:46.166798 3331 reconciler.go:26] "Reconciler: start to sync state"
Sep 9 05:34:46.171823 kubelet[3331]: I0909 05:34:46.171792 3331 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Sep 9 05:34:46.174739 kubelet[3331]: I0909 05:34:46.174714 3331 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Sep 9 05:34:46.174875 kubelet[3331]: I0909 05:34:46.174864 3331 status_manager.go:217] "Starting to sync pod status with apiserver"
Sep 9 05:34:46.174974 kubelet[3331]: I0909 05:34:46.174965 3331 kubelet.go:2321] "Starting kubelet main sync loop"
Sep 9 05:34:46.175119 kubelet[3331]: E0909 05:34:46.175089 3331 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Sep 9 05:34:46.178792 kubelet[3331]: I0909 05:34:46.178765 3331 factory.go:221] Registration of the containerd container factory successfully
Sep 9 05:34:46.221960 kubelet[3331]: I0909 05:34:46.221928 3331 cpu_manager.go:214] "Starting CPU manager" policy="none"
Sep 9 05:34:46.221960 kubelet[3331]: I0909 05:34:46.221956 3331 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Sep 9 05:34:46.222123 kubelet[3331]: I0909 05:34:46.221989 3331 state_mem.go:36] "Initialized new in-memory state store"
Sep 9 05:34:46.222177 kubelet[3331]: I0909 05:34:46.222163 3331 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Sep 9 05:34:46.222215 kubelet[3331]: I0909 05:34:46.222178 3331 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Sep 9 05:34:46.222215 kubelet[3331]: I0909 05:34:46.222209 3331 policy_none.go:49] "None policy: Start"
Sep 9 05:34:46.222869 kubelet[3331]: I0909 05:34:46.222851 3331 memory_manager.go:170] "Starting memorymanager" policy="None"
Sep 9 05:34:46.223036 kubelet[3331]: I0909 05:34:46.223027 3331 state_mem.go:35] "Initializing new in-memory state store"
Sep 9 05:34:46.223236 kubelet[3331]: I0909 05:34:46.223228 3331 state_mem.go:75] "Updated machine memory state"
Sep 9 05:34:46.228451 kubelet[3331]: I0909 05:34:46.228421 3331 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Sep 9 05:34:46.229635 kubelet[3331]: I0909 05:34:46.229441 3331 eviction_manager.go:189] "Eviction manager: starting control loop"
Sep 9 05:34:46.229635 kubelet[3331]: I0909 05:34:46.229456 3331 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Sep 9 05:34:46.230834 kubelet[3331]: I0909 05:34:46.230808 3331 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Sep 9 05:34:46.339263 kubelet[3331]: I0909 05:34:46.339166 3331 kubelet_node_status.go:72] "Attempting to register node" node="ip-172-31-26-176"
Sep 9 05:34:46.349863 kubelet[3331]: I0909 05:34:46.349232 3331 kubelet_node_status.go:111] "Node was previously registered" node="ip-172-31-26-176"
Sep 9 05:34:46.349863 kubelet[3331]: I0909 05:34:46.349307 3331 kubelet_node_status.go:75] "Successfully registered node" node="ip-172-31-26-176"
Sep 9 05:34:46.368193 kubelet[3331]: I0909 05:34:46.368135 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/002c4c12c3be7213faf40e652fc9202a-k8s-certs\") pod \"kube-apiserver-ip-172-31-26-176\" (UID: \"002c4c12c3be7213faf40e652fc9202a\") " pod="kube-system/kube-apiserver-ip-172-31-26-176"
Sep 9 05:34:46.368193 kubelet[3331]: I0909 05:34:46.368184 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/002c4c12c3be7213faf40e652fc9202a-usr-share-ca-certificates\") pod \"kube-apiserver-ip-172-31-26-176\" (UID: \"002c4c12c3be7213faf40e652fc9202a\") " pod="kube-system/kube-apiserver-ip-172-31-26-176"
Sep 9 05:34:46.368193 kubelet[3331]: I0909 05:34:46.368213 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6ffb112b616f74d1396e66a2231fad32-kubeconfig\") pod \"kube-controller-manager-ip-172-31-26-176\" (UID: \"6ffb112b616f74d1396e66a2231fad32\") " pod="kube-system/kube-controller-manager-ip-172-31-26-176"
Sep 9 05:34:46.368584 kubelet[3331]: I0909 05:34:46.368238 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/0a21a70dc4e47fda4af927d036fbc9c1-kubeconfig\") pod \"kube-scheduler-ip-172-31-26-176\" (UID: \"0a21a70dc4e47fda4af927d036fbc9c1\") " pod="kube-system/kube-scheduler-ip-172-31-26-176"
Sep 9 05:34:46.368584 kubelet[3331]: I0909 05:34:46.368271 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/002c4c12c3be7213faf40e652fc9202a-ca-certs\") pod \"kube-apiserver-ip-172-31-26-176\" (UID: \"002c4c12c3be7213faf40e652fc9202a\") " pod="kube-system/kube-apiserver-ip-172-31-26-176"
Sep 9 05:34:46.368584 kubelet[3331]: I0909 05:34:46.368297 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6ffb112b616f74d1396e66a2231fad32-ca-certs\") pod \"kube-controller-manager-ip-172-31-26-176\" (UID: \"6ffb112b616f74d1396e66a2231fad32\") " pod="kube-system/kube-controller-manager-ip-172-31-26-176"
Sep 9 05:34:46.368584 kubelet[3331]: I0909 05:34:46.368325 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/6ffb112b616f74d1396e66a2231fad32-flexvolume-dir\") pod \"kube-controller-manager-ip-172-31-26-176\" (UID: \"6ffb112b616f74d1396e66a2231fad32\") " pod="kube-system/kube-controller-manager-ip-172-31-26-176"
Sep 9 05:34:46.368584 kubelet[3331]: I0909 05:34:46.368349 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6ffb112b616f74d1396e66a2231fad32-k8s-certs\") pod \"kube-controller-manager-ip-172-31-26-176\" (UID: \"6ffb112b616f74d1396e66a2231fad32\") " pod="kube-system/kube-controller-manager-ip-172-31-26-176"
Sep 9 05:34:46.368718 kubelet[3331]: I0909 05:34:46.368374 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6ffb112b616f74d1396e66a2231fad32-usr-share-ca-certificates\") pod \"kube-controller-manager-ip-172-31-26-176\" (UID: \"6ffb112b616f74d1396e66a2231fad32\") " pod="kube-system/kube-controller-manager-ip-172-31-26-176"
Sep 9 05:34:47.125371 kubelet[3331]: I0909 05:34:47.124237 3331 apiserver.go:52] "Watching apiserver"
Sep 9 05:34:47.161258 kubelet[3331]: I0909 05:34:47.161223 3331 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world"
Sep 9 05:34:47.212143 kubelet[3331]: E0909 05:34:47.212083 3331 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-ip-172-31-26-176\" already exists" pod="kube-system/kube-controller-manager-ip-172-31-26-176"
Sep 9 05:34:47.251061 kubelet[3331]: I0909 05:34:47.250992 3331 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ip-172-31-26-176" podStartSLOduration=1.250966407 podStartE2EDuration="1.250966407s" podCreationTimestamp="2025-09-09 05:34:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 05:34:47.232732085 +0000 UTC m=+1.193028747" watchObservedRunningTime="2025-09-09 05:34:47.250966407 +0000 UTC m=+1.211263067"
Sep 9 05:34:47.266433 kubelet[3331]: I0909 05:34:47.266366 3331 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ip-172-31-26-176" podStartSLOduration=1.266343569 podStartE2EDuration="1.266343569s" podCreationTimestamp="2025-09-09 05:34:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 05:34:47.251855392 +0000 UTC m=+1.212152052" watchObservedRunningTime="2025-09-09 05:34:47.266343569 +0000 UTC m=+1.226640232"
Sep 9 05:34:47.266751 kubelet[3331]: I0909 05:34:47.266685 3331 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ip-172-31-26-176" podStartSLOduration=1.266466659 podStartE2EDuration="1.266466659s" podCreationTimestamp="2025-09-09 05:34:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 05:34:47.266048128 +0000 UTC m=+1.226344794" watchObservedRunningTime="2025-09-09 05:34:47.266466659 +0000 UTC m=+1.226763321"
Sep 9 05:34:51.124793 kubelet[3331]: I0909 05:34:51.124747 3331 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Sep 9 05:34:51.125819 kubelet[3331]: I0909 05:34:51.125713 3331 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Sep 9 05:34:51.126494 containerd[1938]: time="2025-09-09T05:34:51.125452435Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Sep 9 05:34:51.947383 systemd[1]: Created slice kubepods-besteffort-pod6e2a712a_93e7_40d4_a02f_24001c8b47b5.slice - libcontainer container kubepods-besteffort-pod6e2a712a_93e7_40d4_a02f_24001c8b47b5.slice.
Sep 9 05:34:52.007829 kubelet[3331]: I0909 05:34:52.007727 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/6e2a712a-93e7-40d4-a02f-24001c8b47b5-kube-proxy\") pod \"kube-proxy-lkfvz\" (UID: \"6e2a712a-93e7-40d4-a02f-24001c8b47b5\") " pod="kube-system/kube-proxy-lkfvz"
Sep 9 05:34:52.008331 kubelet[3331]: I0909 05:34:52.007932 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/6e2a712a-93e7-40d4-a02f-24001c8b47b5-xtables-lock\") pod \"kube-proxy-lkfvz\" (UID: \"6e2a712a-93e7-40d4-a02f-24001c8b47b5\") " pod="kube-system/kube-proxy-lkfvz"
Sep 9 05:34:52.008331 kubelet[3331]: I0909 05:34:52.008122 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6e2a712a-93e7-40d4-a02f-24001c8b47b5-lib-modules\") pod \"kube-proxy-lkfvz\" (UID: \"6e2a712a-93e7-40d4-a02f-24001c8b47b5\") " pod="kube-system/kube-proxy-lkfvz"
Sep 9 05:34:52.008331 kubelet[3331]: I0909 05:34:52.008272 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pz4j\" (UniqueName: \"kubernetes.io/projected/6e2a712a-93e7-40d4-a02f-24001c8b47b5-kube-api-access-9pz4j\") pod \"kube-proxy-lkfvz\" (UID: \"6e2a712a-93e7-40d4-a02f-24001c8b47b5\") " pod="kube-system/kube-proxy-lkfvz"
Sep 9 05:34:52.068287 update_engine[1884]: I20250909 05:34:52.068197 1884 update_attempter.cc:509] Updating boot flags...
Sep 9 05:34:52.261078 containerd[1938]: time="2025-09-09T05:34:52.260868852Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-lkfvz,Uid:6e2a712a-93e7-40d4-a02f-24001c8b47b5,Namespace:kube-system,Attempt:0,}"
Sep 9 05:34:52.372590 containerd[1938]: time="2025-09-09T05:34:52.371204292Z" level=info msg="connecting to shim ec2ce0ef9e0582b6e9b4e24024810166a0a1b4284ebcb5c6aeeb8bb79ddbc73c" address="unix:///run/containerd/s/5761f2217a8e0ec48a57e8b3bf02a1493149693a5a882a1ad9f19c5c24f57a35" namespace=k8s.io protocol=ttrpc version=3
Sep 9 05:34:52.417731 kubelet[3331]: I0909 05:34:52.416245 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7vz8\" (UniqueName: \"kubernetes.io/projected/0978dcef-6c7f-4a80-956d-5d8cd66b4805-kube-api-access-w7vz8\") pod \"tigera-operator-58fc44c59b-zpshc\" (UID: \"0978dcef-6c7f-4a80-956d-5d8cd66b4805\") " pod="tigera-operator/tigera-operator-58fc44c59b-zpshc"
Sep 9 05:34:52.417731 kubelet[3331]: I0909 05:34:52.416302 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/0978dcef-6c7f-4a80-956d-5d8cd66b4805-var-lib-calico\") pod \"tigera-operator-58fc44c59b-zpshc\" (UID: \"0978dcef-6c7f-4a80-956d-5d8cd66b4805\") " pod="tigera-operator/tigera-operator-58fc44c59b-zpshc"
Sep 9 05:34:52.455698 systemd[1]: Started cri-containerd-ec2ce0ef9e0582b6e9b4e24024810166a0a1b4284ebcb5c6aeeb8bb79ddbc73c.scope - libcontainer container ec2ce0ef9e0582b6e9b4e24024810166a0a1b4284ebcb5c6aeeb8bb79ddbc73c.
Sep 9 05:34:52.458071 systemd[1]: Created slice kubepods-besteffort-pod0978dcef_6c7f_4a80_956d_5d8cd66b4805.slice - libcontainer container kubepods-besteffort-pod0978dcef_6c7f_4a80_956d_5d8cd66b4805.slice.
Sep 9 05:34:52.573348 containerd[1938]: time="2025-09-09T05:34:52.573245669Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-zpshc,Uid:0978dcef-6c7f-4a80-956d-5d8cd66b4805,Namespace:tigera-operator,Attempt:0,}"
Sep 9 05:34:52.639604 containerd[1938]: time="2025-09-09T05:34:52.638944781Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-lkfvz,Uid:6e2a712a-93e7-40d4-a02f-24001c8b47b5,Namespace:kube-system,Attempt:0,} returns sandbox id \"ec2ce0ef9e0582b6e9b4e24024810166a0a1b4284ebcb5c6aeeb8bb79ddbc73c\""
Sep 9 05:34:52.650215 containerd[1938]: time="2025-09-09T05:34:52.650164789Z" level=info msg="CreateContainer within sandbox \"ec2ce0ef9e0582b6e9b4e24024810166a0a1b4284ebcb5c6aeeb8bb79ddbc73c\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Sep 9 05:34:52.665368 containerd[1938]: time="2025-09-09T05:34:52.664657450Z" level=info msg="connecting to shim fceb566f03e2b94aef1682456db4c9ac3299b4f280bd69c962232c50a7abef73" address="unix:///run/containerd/s/c480ea042eb7c08968efffcfea3c4048cb8658240b370d426708b3299823d7ce" namespace=k8s.io protocol=ttrpc version=3
Sep 9 05:34:52.688331 containerd[1938]: time="2025-09-09T05:34:52.687611604Z" level=info msg="Container 907b3e039f01fbd00603916048af9b90936f4c766fcda523f57e42f7790b75aa: CDI devices from CRI Config.CDIDevices: []"
Sep 9 05:34:52.768710 containerd[1938]: time="2025-09-09T05:34:52.767398905Z" level=info msg="CreateContainer within sandbox \"ec2ce0ef9e0582b6e9b4e24024810166a0a1b4284ebcb5c6aeeb8bb79ddbc73c\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"907b3e039f01fbd00603916048af9b90936f4c766fcda523f57e42f7790b75aa\""
Sep 9 05:34:52.771865 containerd[1938]: time="2025-09-09T05:34:52.771824393Z" level=info msg="StartContainer for \"907b3e039f01fbd00603916048af9b90936f4c766fcda523f57e42f7790b75aa\""
Sep 9 05:34:52.785421 containerd[1938]: time="2025-09-09T05:34:52.785367301Z" level=info msg="connecting to shim 907b3e039f01fbd00603916048af9b90936f4c766fcda523f57e42f7790b75aa" address="unix:///run/containerd/s/5761f2217a8e0ec48a57e8b3bf02a1493149693a5a882a1ad9f19c5c24f57a35" protocol=ttrpc version=3
Sep 9 05:34:52.833737 systemd[1]: Started cri-containerd-907b3e039f01fbd00603916048af9b90936f4c766fcda523f57e42f7790b75aa.scope - libcontainer container 907b3e039f01fbd00603916048af9b90936f4c766fcda523f57e42f7790b75aa.
Sep 9 05:34:52.836916 systemd[1]: Started cri-containerd-fceb566f03e2b94aef1682456db4c9ac3299b4f280bd69c962232c50a7abef73.scope - libcontainer container fceb566f03e2b94aef1682456db4c9ac3299b4f280bd69c962232c50a7abef73.
Sep 9 05:34:52.933202 containerd[1938]: time="2025-09-09T05:34:52.932615612Z" level=info msg="StartContainer for \"907b3e039f01fbd00603916048af9b90936f4c766fcda523f57e42f7790b75aa\" returns successfully"
Sep 9 05:34:52.951296 containerd[1938]: time="2025-09-09T05:34:52.951240648Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-zpshc,Uid:0978dcef-6c7f-4a80-956d-5d8cd66b4805,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"fceb566f03e2b94aef1682456db4c9ac3299b4f280bd69c962232c50a7abef73\""
Sep 9 05:34:52.954115 containerd[1938]: time="2025-09-09T05:34:52.954030004Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\""
Sep 9 05:34:54.443139 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount419730029.mount: Deactivated successfully.
Sep 9 05:34:54.556803 kubelet[3331]: I0909 05:34:54.556530 3331 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-lkfvz" podStartSLOduration=3.556512623 podStartE2EDuration="3.556512623s" podCreationTimestamp="2025-09-09 05:34:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 05:34:53.231792344 +0000 UTC m=+7.192088996" watchObservedRunningTime="2025-09-09 05:34:54.556512623 +0000 UTC m=+8.516809280"
Sep 9 05:34:56.002941 containerd[1938]: time="2025-09-09T05:34:56.002885934Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:34:56.009641 containerd[1938]: time="2025-09-09T05:34:56.009580216Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609"
Sep 9 05:34:56.012207 containerd[1938]: time="2025-09-09T05:34:56.012133838Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:34:56.015973 containerd[1938]: time="2025-09-09T05:34:56.015896528Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:34:56.022208 containerd[1938]: time="2025-09-09T05:34:56.022156473Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 3.068083035s"
Sep 9 05:34:56.022208 containerd[1938]: time="2025-09-09T05:34:56.022211617Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\""
Sep 9 05:34:56.025750 containerd[1938]: time="2025-09-09T05:34:56.025704985Z" level=info msg="CreateContainer within sandbox \"fceb566f03e2b94aef1682456db4c9ac3299b4f280bd69c962232c50a7abef73\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Sep 9 05:34:56.050512 containerd[1938]: time="2025-09-09T05:34:56.048269248Z" level=info msg="Container 9fc12a00cf163214dbac6f03e12c473308eb0b46a939cf45d9ce13566c2e8c5e: CDI devices from CRI Config.CDIDevices: []"
Sep 9 05:34:56.060926 containerd[1938]: time="2025-09-09T05:34:56.060806629Z" level=info msg="CreateContainer within sandbox \"fceb566f03e2b94aef1682456db4c9ac3299b4f280bd69c962232c50a7abef73\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"9fc12a00cf163214dbac6f03e12c473308eb0b46a939cf45d9ce13566c2e8c5e\""
Sep 9 05:34:56.062531 containerd[1938]: time="2025-09-09T05:34:56.061875833Z" level=info msg="StartContainer for \"9fc12a00cf163214dbac6f03e12c473308eb0b46a939cf45d9ce13566c2e8c5e\""
Sep 9 05:34:56.063010 containerd[1938]: time="2025-09-09T05:34:56.062986253Z" level=info msg="connecting to shim 9fc12a00cf163214dbac6f03e12c473308eb0b46a939cf45d9ce13566c2e8c5e" address="unix:///run/containerd/s/c480ea042eb7c08968efffcfea3c4048cb8658240b370d426708b3299823d7ce" protocol=ttrpc version=3
Sep 9 05:34:56.095833 systemd[1]: Started cri-containerd-9fc12a00cf163214dbac6f03e12c473308eb0b46a939cf45d9ce13566c2e8c5e.scope - libcontainer container 9fc12a00cf163214dbac6f03e12c473308eb0b46a939cf45d9ce13566c2e8c5e.
Sep 9 05:34:56.138144 containerd[1938]: time="2025-09-09T05:34:56.138099311Z" level=info msg="StartContainer for \"9fc12a00cf163214dbac6f03e12c473308eb0b46a939cf45d9ce13566c2e8c5e\" returns successfully"
Sep 9 05:34:56.692757 kubelet[3331]: I0909 05:34:56.692699 3331 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-58fc44c59b-zpshc" podStartSLOduration=1.622532678 podStartE2EDuration="4.692680266s" podCreationTimestamp="2025-09-09 05:34:52 +0000 UTC" firstStartedPulling="2025-09-09 05:34:52.953094526 +0000 UTC m=+6.913391169" lastFinishedPulling="2025-09-09 05:34:56.023242104 +0000 UTC m=+9.983538757" observedRunningTime="2025-09-09 05:34:56.239720036 +0000 UTC m=+10.200016698" watchObservedRunningTime="2025-09-09 05:34:56.692680266 +0000 UTC m=+10.652976919"
Sep 9 05:35:03.610352 sudo[2369]: pam_unix(sudo:session): session closed for user root
Sep 9 05:35:03.634802 sshd[2368]: Connection closed by 147.75.109.163 port 38238
Sep 9 05:35:03.634678 sshd-session[2365]: pam_unix(sshd:session): session closed for user core
Sep 9 05:35:03.645091 systemd[1]: sshd@8-172.31.26.176:22-147.75.109.163:38238.service: Deactivated successfully.
Sep 9 05:35:03.649251 systemd[1]: session-9.scope: Deactivated successfully.
Sep 9 05:35:03.649586 systemd[1]: session-9.scope: Consumed 4.741s CPU time, 154.4M memory peak.
Sep 9 05:35:03.652548 systemd-logind[1878]: Session 9 logged out. Waiting for processes to exit.
Sep 9 05:35:03.655914 systemd-logind[1878]: Removed session 9.
Sep 9 05:35:08.669233 systemd[1]: Created slice kubepods-besteffort-pod6f4ea0f8_1109_4e3e_8359_edb5eb13d301.slice - libcontainer container kubepods-besteffort-pod6f4ea0f8_1109_4e3e_8359_edb5eb13d301.slice.
Sep 9 05:35:08.774497 kubelet[3331]: I0909 05:35:08.774443 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4nbr\" (UniqueName: \"kubernetes.io/projected/6f4ea0f8-1109-4e3e-8359-edb5eb13d301-kube-api-access-k4nbr\") pod \"calico-typha-f9854dc6c-xbl7n\" (UID: \"6f4ea0f8-1109-4e3e-8359-edb5eb13d301\") " pod="calico-system/calico-typha-f9854dc6c-xbl7n"
Sep 9 05:35:08.774960 kubelet[3331]: I0909 05:35:08.774524 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/6f4ea0f8-1109-4e3e-8359-edb5eb13d301-typha-certs\") pod \"calico-typha-f9854dc6c-xbl7n\" (UID: \"6f4ea0f8-1109-4e3e-8359-edb5eb13d301\") " pod="calico-system/calico-typha-f9854dc6c-xbl7n"
Sep 9 05:35:08.774960 kubelet[3331]: I0909 05:35:08.774557 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f4ea0f8-1109-4e3e-8359-edb5eb13d301-tigera-ca-bundle\") pod \"calico-typha-f9854dc6c-xbl7n\" (UID: \"6f4ea0f8-1109-4e3e-8359-edb5eb13d301\") " pod="calico-system/calico-typha-f9854dc6c-xbl7n"
Sep 9 05:35:08.845523 systemd[1]: Created slice kubepods-besteffort-podc9d60776_1c60_4730_9afa_2bdd75050ecb.slice - libcontainer container kubepods-besteffort-podc9d60776_1c60_4730_9afa_2bdd75050ecb.slice.
Sep 9 05:35:08.977197 kubelet[3331]: I0909 05:35:08.976545 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/c9d60776-1c60-4730-9afa-2bdd75050ecb-xtables-lock\") pod \"calico-node-djvx2\" (UID: \"c9d60776-1c60-4730-9afa-2bdd75050ecb\") " pod="calico-system/calico-node-djvx2"
Sep 9 05:35:08.977571 kubelet[3331]: I0909 05:35:08.977545 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/c9d60776-1c60-4730-9afa-2bdd75050ecb-cni-bin-dir\") pod \"calico-node-djvx2\" (UID: \"c9d60776-1c60-4730-9afa-2bdd75050ecb\") " pod="calico-system/calico-node-djvx2"
Sep 9 05:35:08.978021 kubelet[3331]: I0909 05:35:08.977951 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/c9d60776-1c60-4730-9afa-2bdd75050ecb-var-lib-calico\") pod \"calico-node-djvx2\" (UID: \"c9d60776-1c60-4730-9afa-2bdd75050ecb\") " pod="calico-system/calico-node-djvx2"
Sep 9 05:35:08.978021 kubelet[3331]: I0909 05:35:08.977986 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/c9d60776-1c60-4730-9afa-2bdd75050ecb-policysync\") pod \"calico-node-djvx2\" (UID: \"c9d60776-1c60-4730-9afa-2bdd75050ecb\") " pod="calico-system/calico-node-djvx2"
Sep 9 05:35:08.979000 kubelet[3331]: I0909 05:35:08.978524 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/c9d60776-1c60-4730-9afa-2bdd75050ecb-var-run-calico\") pod \"calico-node-djvx2\" (UID: \"c9d60776-1c60-4730-9afa-2bdd75050ecb\") " pod="calico-system/calico-node-djvx2"
Sep 9 05:35:08.979000 kubelet[3331]: I0909 05:35:08.978564 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dffnf\" (UniqueName: \"kubernetes.io/projected/c9d60776-1c60-4730-9afa-2bdd75050ecb-kube-api-access-dffnf\") pod \"calico-node-djvx2\" (UID: \"c9d60776-1c60-4730-9afa-2bdd75050ecb\") " pod="calico-system/calico-node-djvx2"
Sep 9 05:35:08.979000 kubelet[3331]: I0909 05:35:08.978590 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c9d60776-1c60-4730-9afa-2bdd75050ecb-lib-modules\") pod \"calico-node-djvx2\" (UID: \"c9d60776-1c60-4730-9afa-2bdd75050ecb\") " pod="calico-system/calico-node-djvx2"
Sep 9 05:35:08.979000 kubelet[3331]: I0909 05:35:08.978614 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/c9d60776-1c60-4730-9afa-2bdd75050ecb-node-certs\") pod \"calico-node-djvx2\" (UID: \"c9d60776-1c60-4730-9afa-2bdd75050ecb\") " pod="calico-system/calico-node-djvx2"
Sep 9 05:35:08.979000 kubelet[3331]: I0909 05:35:08.978640 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9d60776-1c60-4730-9afa-2bdd75050ecb-tigera-ca-bundle\") pod \"calico-node-djvx2\" (UID: \"c9d60776-1c60-4730-9afa-2bdd75050ecb\") " pod="calico-system/calico-node-djvx2"
Sep 9 05:35:08.979415 kubelet[3331]: I0909 05:35:08.978664 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/c9d60776-1c60-4730-9afa-2bdd75050ecb-cni-net-dir\") pod \"calico-node-djvx2\" (UID: \"c9d60776-1c60-4730-9afa-2bdd75050ecb\") " pod="calico-system/calico-node-djvx2"
Sep 9 05:35:08.979415 kubelet[3331]: I0909 05:35:08.978689 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/c9d60776-1c60-4730-9afa-2bdd75050ecb-cni-log-dir\") pod \"calico-node-djvx2\" (UID: \"c9d60776-1c60-4730-9afa-2bdd75050ecb\") " pod="calico-system/calico-node-djvx2"
Sep 9 05:35:08.979415 kubelet[3331]: I0909 05:35:08.978721 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/c9d60776-1c60-4730-9afa-2bdd75050ecb-flexvol-driver-host\") pod \"calico-node-djvx2\" (UID: \"c9d60776-1c60-4730-9afa-2bdd75050ecb\") " pod="calico-system/calico-node-djvx2"
Sep 9 05:35:08.981159 containerd[1938]: time="2025-09-09T05:35:08.980845759Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-f9854dc6c-xbl7n,Uid:6f4ea0f8-1109-4e3e-8359-edb5eb13d301,Namespace:calico-system,Attempt:0,}"
Sep 9 05:35:09.038717 containerd[1938]: time="2025-09-09T05:35:09.038662896Z" level=info msg="connecting to shim 117b063c2b3d12c31a5ddbfa468c2d248d5ec84b73e5dfb8fb4625854cae0a86" address="unix:///run/containerd/s/631a7625ef864dd134c3a25f7599e2a745054852def608f01549a99cfecb0322" namespace=k8s.io protocol=ttrpc version=3
Sep 9 05:35:09.071993 kubelet[3331]: E0909 05:35:09.071271 3331 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zk5mx" podUID="72b2ef00-e3c5-4650-afc2-ad083fef1486"
Sep 9 05:35:09.102322 kubelet[3331]: E0909 05:35:09.101636 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 05:35:09.102322 kubelet[3331]: W0909 05:35:09.101665 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 05:35:09.102322 kubelet[3331]: E0909 05:35:09.101694 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 05:35:09.104197 kubelet[3331]: E0909 05:35:09.104156 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 05:35:09.105521 kubelet[3331]: W0909 05:35:09.104185 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 05:35:09.105521 kubelet[3331]: E0909 05:35:09.105078 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 05:35:09.109057 systemd[1]: Started cri-containerd-117b063c2b3d12c31a5ddbfa468c2d248d5ec84b73e5dfb8fb4625854cae0a86.scope - libcontainer container 117b063c2b3d12c31a5ddbfa468c2d248d5ec84b73e5dfb8fb4625854cae0a86.
Sep 9 05:35:09.114300 kubelet[3331]: E0909 05:35:09.114267 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 05:35:09.114892 kubelet[3331]: W0909 05:35:09.114313 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 05:35:09.114892 kubelet[3331]: E0909 05:35:09.114342 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 05:35:09.114892 kubelet[3331]: E0909 05:35:09.114775 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 05:35:09.114892 kubelet[3331]: W0909 05:35:09.114790 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 05:35:09.114892 kubelet[3331]: E0909 05:35:09.114810 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 05:35:09.115703 kubelet[3331]: E0909 05:35:09.115124 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 05:35:09.115703 kubelet[3331]: W0909 05:35:09.115138 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 05:35:09.115703 kubelet[3331]: E0909 05:35:09.115172 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 05:35:09.116764 kubelet[3331]: E0909 05:35:09.116743 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 05:35:09.116764 kubelet[3331]: W0909 05:35:09.116764 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 05:35:09.116908 kubelet[3331]: E0909 05:35:09.116782 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 05:35:09.117347 kubelet[3331]: E0909 05:35:09.117035 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 05:35:09.117347 kubelet[3331]: W0909 05:35:09.117049 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 05:35:09.117347 kubelet[3331]: E0909 05:35:09.117066 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 05:35:09.117347 kubelet[3331]: E0909 05:35:09.117294 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 05:35:09.117347 kubelet[3331]: W0909 05:35:09.117308 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 05:35:09.117347 kubelet[3331]: E0909 05:35:09.117322 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 05:35:09.118776 kubelet[3331]: E0909 05:35:09.117559 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 05:35:09.118776 kubelet[3331]: W0909 05:35:09.117570 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 05:35:09.118776 kubelet[3331]: E0909 05:35:09.117584 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Sep 9 05:35:09.118776 kubelet[3331]: E0909 05:35:09.117805 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.118776 kubelet[3331]: W0909 05:35:09.117816 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.118776 kubelet[3331]: E0909 05:35:09.117830 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:09.118776 kubelet[3331]: E0909 05:35:09.118255 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.118776 kubelet[3331]: W0909 05:35:09.118267 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.118776 kubelet[3331]: E0909 05:35:09.118282 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:35:09.118776 kubelet[3331]: E0909 05:35:09.118533 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.119229 kubelet[3331]: W0909 05:35:09.118543 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.119229 kubelet[3331]: E0909 05:35:09.118557 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:09.119229 kubelet[3331]: E0909 05:35:09.118752 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.119229 kubelet[3331]: W0909 05:35:09.118766 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.119229 kubelet[3331]: E0909 05:35:09.118779 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:35:09.119229 kubelet[3331]: E0909 05:35:09.118963 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.119229 kubelet[3331]: W0909 05:35:09.118973 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.119229 kubelet[3331]: E0909 05:35:09.118985 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:09.119229 kubelet[3331]: E0909 05:35:09.119194 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.119229 kubelet[3331]: W0909 05:35:09.119205 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.121772 kubelet[3331]: E0909 05:35:09.119216 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:35:09.121772 kubelet[3331]: E0909 05:35:09.119420 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.121772 kubelet[3331]: W0909 05:35:09.119431 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.121772 kubelet[3331]: E0909 05:35:09.119443 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:09.121772 kubelet[3331]: E0909 05:35:09.119652 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.121772 kubelet[3331]: W0909 05:35:09.119662 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.121772 kubelet[3331]: E0909 05:35:09.119675 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:35:09.121772 kubelet[3331]: E0909 05:35:09.119856 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.121772 kubelet[3331]: W0909 05:35:09.119865 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.121772 kubelet[3331]: E0909 05:35:09.119876 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:09.122155 kubelet[3331]: E0909 05:35:09.120080 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.122155 kubelet[3331]: W0909 05:35:09.120090 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.122155 kubelet[3331]: E0909 05:35:09.120103 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:35:09.122155 kubelet[3331]: E0909 05:35:09.120287 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.122155 kubelet[3331]: W0909 05:35:09.120296 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.122155 kubelet[3331]: E0909 05:35:09.120308 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:09.122155 kubelet[3331]: E0909 05:35:09.120516 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.122155 kubelet[3331]: W0909 05:35:09.120526 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.122155 kubelet[3331]: E0909 05:35:09.120537 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:35:09.122155 kubelet[3331]: E0909 05:35:09.120725 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.123800 kubelet[3331]: W0909 05:35:09.120736 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.123800 kubelet[3331]: E0909 05:35:09.120747 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:09.180555 kubelet[3331]: E0909 05:35:09.180163 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.180555 kubelet[3331]: W0909 05:35:09.180208 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.181439 kubelet[3331]: E0909 05:35:09.181388 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.181439 kubelet[3331]: W0909 05:35:09.181417 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.181439 kubelet[3331]: E0909 05:35:09.181440 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:35:09.181819 kubelet[3331]: I0909 05:35:09.181472 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/72b2ef00-e3c5-4650-afc2-ad083fef1486-varrun\") pod \"csi-node-driver-zk5mx\" (UID: \"72b2ef00-e3c5-4650-afc2-ad083fef1486\") " pod="calico-system/csi-node-driver-zk5mx" Sep 9 05:35:09.181819 kubelet[3331]: E0909 05:35:09.180239 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:09.181819 kubelet[3331]: E0909 05:35:09.181717 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.181819 kubelet[3331]: W0909 05:35:09.181730 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.181819 kubelet[3331]: E0909 05:35:09.181763 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:35:09.181819 kubelet[3331]: I0909 05:35:09.181786 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/72b2ef00-e3c5-4650-afc2-ad083fef1486-socket-dir\") pod \"csi-node-driver-zk5mx\" (UID: \"72b2ef00-e3c5-4650-afc2-ad083fef1486\") " pod="calico-system/csi-node-driver-zk5mx" Sep 9 05:35:09.183625 kubelet[3331]: E0909 05:35:09.181991 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.183625 kubelet[3331]: W0909 05:35:09.182003 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.183625 kubelet[3331]: E0909 05:35:09.182016 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:09.183625 kubelet[3331]: E0909 05:35:09.182280 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.183625 kubelet[3331]: W0909 05:35:09.182295 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.183625 kubelet[3331]: E0909 05:35:09.182317 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:35:09.184297 kubelet[3331]: E0909 05:35:09.184277 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.184297 kubelet[3331]: W0909 05:35:09.184297 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.184829 kubelet[3331]: E0909 05:35:09.184805 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:09.185415 kubelet[3331]: E0909 05:35:09.185388 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.185415 kubelet[3331]: W0909 05:35:09.185409 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.185957 kubelet[3331]: E0909 05:35:09.185441 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:35:09.186444 kubelet[3331]: E0909 05:35:09.186031 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.186444 kubelet[3331]: W0909 05:35:09.186046 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.186444 kubelet[3331]: E0909 05:35:09.186065 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:09.186444 kubelet[3331]: I0909 05:35:09.185473 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/72b2ef00-e3c5-4650-afc2-ad083fef1486-kubelet-dir\") pod \"csi-node-driver-zk5mx\" (UID: \"72b2ef00-e3c5-4650-afc2-ad083fef1486\") " pod="calico-system/csi-node-driver-zk5mx" Sep 9 05:35:09.186790 kubelet[3331]: E0909 05:35:09.186668 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.186790 kubelet[3331]: W0909 05:35:09.186685 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.186880 kubelet[3331]: E0909 05:35:09.186797 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:35:09.188666 kubelet[3331]: I0909 05:35:09.186828 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/72b2ef00-e3c5-4650-afc2-ad083fef1486-registration-dir\") pod \"csi-node-driver-zk5mx\" (UID: \"72b2ef00-e3c5-4650-afc2-ad083fef1486\") " pod="calico-system/csi-node-driver-zk5mx" Sep 9 05:35:09.188666 kubelet[3331]: E0909 05:35:09.187447 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.188666 kubelet[3331]: W0909 05:35:09.187460 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.188666 kubelet[3331]: E0909 05:35:09.187590 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:09.189499 kubelet[3331]: E0909 05:35:09.188956 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.189499 kubelet[3331]: W0909 05:35:09.188975 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.190608 kubelet[3331]: E0909 05:35:09.189007 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:35:09.191040 kubelet[3331]: E0909 05:35:09.191018 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.191205 kubelet[3331]: W0909 05:35:09.191041 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.191205 kubelet[3331]: E0909 05:35:09.191076 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:09.191205 kubelet[3331]: I0909 05:35:09.191108 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq7s7\" (UniqueName: \"kubernetes.io/projected/72b2ef00-e3c5-4650-afc2-ad083fef1486-kube-api-access-tq7s7\") pod \"csi-node-driver-zk5mx\" (UID: \"72b2ef00-e3c5-4650-afc2-ad083fef1486\") " pod="calico-system/csi-node-driver-zk5mx" Sep 9 05:35:09.192729 kubelet[3331]: E0909 05:35:09.192536 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.192729 kubelet[3331]: W0909 05:35:09.192558 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.192729 kubelet[3331]: E0909 05:35:09.192726 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:35:09.196789 kubelet[3331]: E0909 05:35:09.196601 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.196789 kubelet[3331]: W0909 05:35:09.196627 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.196789 kubelet[3331]: E0909 05:35:09.196652 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:09.198393 kubelet[3331]: E0909 05:35:09.198362 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.198641 kubelet[3331]: W0909 05:35:09.198580 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.198809 kubelet[3331]: E0909 05:35:09.198616 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:35:09.199365 kubelet[3331]: E0909 05:35:09.199348 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.199531 kubelet[3331]: W0909 05:35:09.199428 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.199531 kubelet[3331]: E0909 05:35:09.199460 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:09.236806 containerd[1938]: time="2025-09-09T05:35:09.236669479Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-f9854dc6c-xbl7n,Uid:6f4ea0f8-1109-4e3e-8359-edb5eb13d301,Namespace:calico-system,Attempt:0,} returns sandbox id \"117b063c2b3d12c31a5ddbfa468c2d248d5ec84b73e5dfb8fb4625854cae0a86\"" Sep 9 05:35:09.244841 containerd[1938]: time="2025-09-09T05:35:09.244800220Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 9 05:35:09.293508 kubelet[3331]: E0909 05:35:09.292261 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.293508 kubelet[3331]: W0909 05:35:09.292289 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.293508 kubelet[3331]: E0909 05:35:09.292313 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:35:09.293508 kubelet[3331]: E0909 05:35:09.292630 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.293508 kubelet[3331]: W0909 05:35:09.292643 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.293508 kubelet[3331]: E0909 05:35:09.292671 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:09.293508 kubelet[3331]: E0909 05:35:09.292897 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.293508 kubelet[3331]: W0909 05:35:09.292908 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.293508 kubelet[3331]: E0909 05:35:09.292932 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:35:09.293508 kubelet[3331]: E0909 05:35:09.293184 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.293968 kubelet[3331]: W0909 05:35:09.293194 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.293968 kubelet[3331]: E0909 05:35:09.293218 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:09.293968 kubelet[3331]: E0909 05:35:09.293449 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.293968 kubelet[3331]: W0909 05:35:09.293459 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.297509 kubelet[3331]: E0909 05:35:09.294716 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.297509 kubelet[3331]: W0909 05:35:09.294736 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.297509 kubelet[3331]: E0909 05:35:09.294754 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:35:09.297509 kubelet[3331]: E0909 05:35:09.294922 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:09.297509 kubelet[3331]: E0909 05:35:09.295195 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.297509 kubelet[3331]: W0909 05:35:09.295206 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.297509 kubelet[3331]: E0909 05:35:09.295230 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:09.297509 kubelet[3331]: E0909 05:35:09.295436 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.297509 kubelet[3331]: W0909 05:35:09.295447 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.297509 kubelet[3331]: E0909 05:35:09.295570 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:35:09.298071 kubelet[3331]: E0909 05:35:09.295760 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.298071 kubelet[3331]: W0909 05:35:09.295773 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.298071 kubelet[3331]: E0909 05:35:09.295798 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:09.298071 kubelet[3331]: E0909 05:35:09.295993 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.298071 kubelet[3331]: W0909 05:35:09.296002 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.298071 kubelet[3331]: E0909 05:35:09.296080 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:35:09.298071 kubelet[3331]: E0909 05:35:09.296216 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.298071 kubelet[3331]: W0909 05:35:09.296230 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.298071 kubelet[3331]: E0909 05:35:09.296400 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:09.298071 kubelet[3331]: E0909 05:35:09.296511 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.298441 kubelet[3331]: W0909 05:35:09.296522 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.298441 kubelet[3331]: E0909 05:35:09.296606 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:35:09.298441 kubelet[3331]: E0909 05:35:09.297290 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.298441 kubelet[3331]: W0909 05:35:09.297302 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.298441 kubelet[3331]: E0909 05:35:09.297512 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:09.298441 kubelet[3331]: E0909 05:35:09.297606 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.298441 kubelet[3331]: W0909 05:35:09.297616 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.298441 kubelet[3331]: E0909 05:35:09.297693 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:35:09.298441 kubelet[3331]: E0909 05:35:09.297870 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.298441 kubelet[3331]: W0909 05:35:09.297881 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.300218 kubelet[3331]: E0909 05:35:09.297961 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:09.300218 kubelet[3331]: E0909 05:35:09.299582 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.300218 kubelet[3331]: W0909 05:35:09.299596 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.300218 kubelet[3331]: E0909 05:35:09.299680 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:35:09.300218 kubelet[3331]: E0909 05:35:09.299849 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.300218 kubelet[3331]: W0909 05:35:09.299859 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.300218 kubelet[3331]: E0909 05:35:09.300027 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:09.300218 kubelet[3331]: E0909 05:35:09.300089 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.300218 kubelet[3331]: W0909 05:35:09.300099 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.300218 kubelet[3331]: E0909 05:35:09.300178 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:35:09.301508 kubelet[3331]: E0909 05:35:09.300369 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.301508 kubelet[3331]: W0909 05:35:09.300378 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.301508 kubelet[3331]: E0909 05:35:09.300457 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:09.301508 kubelet[3331]: E0909 05:35:09.300649 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.301508 kubelet[3331]: W0909 05:35:09.300658 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.301508 kubelet[3331]: E0909 05:35:09.300690 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:35:09.301508 kubelet[3331]: E0909 05:35:09.300987 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.301508 kubelet[3331]: W0909 05:35:09.300999 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.301508 kubelet[3331]: E0909 05:35:09.301025 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:09.301508 kubelet[3331]: E0909 05:35:09.301233 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.306658 kubelet[3331]: W0909 05:35:09.301549 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.306658 kubelet[3331]: E0909 05:35:09.301714 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:35:09.306658 kubelet[3331]: E0909 05:35:09.304657 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.306658 kubelet[3331]: W0909 05:35:09.304679 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.306658 kubelet[3331]: E0909 05:35:09.304716 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:09.306658 kubelet[3331]: E0909 05:35:09.304980 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.306658 kubelet[3331]: W0909 05:35:09.304991 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.306658 kubelet[3331]: E0909 05:35:09.305025 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:35:09.310158 kubelet[3331]: E0909 05:35:09.308625 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.310158 kubelet[3331]: W0909 05:35:09.308648 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.310158 kubelet[3331]: E0909 05:35:09.308699 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:09.351684 kubelet[3331]: E0909 05:35:09.350833 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:09.351684 kubelet[3331]: W0909 05:35:09.350861 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:09.351684 kubelet[3331]: E0909 05:35:09.350891 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:35:09.450303 containerd[1938]: time="2025-09-09T05:35:09.450261612Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-djvx2,Uid:c9d60776-1c60-4730-9afa-2bdd75050ecb,Namespace:calico-system,Attempt:0,}" Sep 9 05:35:09.552613 containerd[1938]: time="2025-09-09T05:35:09.550677053Z" level=info msg="connecting to shim f095d073850fc2dd251c675d5c3a950330ca6097e5d64401f7a08abfe60de9e7" address="unix:///run/containerd/s/370b1a3662d3e5291c3bd9cc58a348b758551ab16fd960c032c665fa147001db" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:35:09.601960 systemd[1]: Started cri-containerd-f095d073850fc2dd251c675d5c3a950330ca6097e5d64401f7a08abfe60de9e7.scope - libcontainer container f095d073850fc2dd251c675d5c3a950330ca6097e5d64401f7a08abfe60de9e7. Sep 9 05:35:09.652281 containerd[1938]: time="2025-09-09T05:35:09.652239844Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-djvx2,Uid:c9d60776-1c60-4730-9afa-2bdd75050ecb,Namespace:calico-system,Attempt:0,} returns sandbox id \"f095d073850fc2dd251c675d5c3a950330ca6097e5d64401f7a08abfe60de9e7\"" Sep 9 05:35:10.177068 kubelet[3331]: E0909 05:35:10.177007 3331 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zk5mx" podUID="72b2ef00-e3c5-4650-afc2-ad083fef1486" Sep 9 05:35:10.773466 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2714148766.mount: Deactivated successfully. 
Sep 9 05:35:11.970063 containerd[1938]: time="2025-09-09T05:35:11.969789053Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389" Sep 9 05:35:11.985504 containerd[1938]: time="2025-09-09T05:35:11.984867932Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:35:11.992601 containerd[1938]: time="2025-09-09T05:35:11.992551173Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:35:11.994698 containerd[1938]: time="2025-09-09T05:35:11.994639652Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:35:11.996002 containerd[1938]: time="2025-09-09T05:35:11.995849614Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 2.75098785s" Sep 9 05:35:11.996002 containerd[1938]: time="2025-09-09T05:35:11.995899912Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\"" Sep 9 05:35:12.053836 containerd[1938]: time="2025-09-09T05:35:12.053799607Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 9 05:35:12.084417 containerd[1938]: time="2025-09-09T05:35:12.084343909Z" level=info msg="CreateContainer within sandbox \"117b063c2b3d12c31a5ddbfa468c2d248d5ec84b73e5dfb8fb4625854cae0a86\" for container 
&ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 9 05:35:12.122141 containerd[1938]: time="2025-09-09T05:35:12.119660362Z" level=info msg="Container 818199e0c64c846231e2404d4b882eba8c9737652b3ceef28f96041c31dc3839: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:35:12.142017 containerd[1938]: time="2025-09-09T05:35:12.141951431Z" level=info msg="CreateContainer within sandbox \"117b063c2b3d12c31a5ddbfa468c2d248d5ec84b73e5dfb8fb4625854cae0a86\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"818199e0c64c846231e2404d4b882eba8c9737652b3ceef28f96041c31dc3839\"" Sep 9 05:35:12.144974 containerd[1938]: time="2025-09-09T05:35:12.144911682Z" level=info msg="StartContainer for \"818199e0c64c846231e2404d4b882eba8c9737652b3ceef28f96041c31dc3839\"" Sep 9 05:35:12.146352 containerd[1938]: time="2025-09-09T05:35:12.146313593Z" level=info msg="connecting to shim 818199e0c64c846231e2404d4b882eba8c9737652b3ceef28f96041c31dc3839" address="unix:///run/containerd/s/631a7625ef864dd134c3a25f7599e2a745054852def608f01549a99cfecb0322" protocol=ttrpc version=3 Sep 9 05:35:12.173750 systemd[1]: Started cri-containerd-818199e0c64c846231e2404d4b882eba8c9737652b3ceef28f96041c31dc3839.scope - libcontainer container 818199e0c64c846231e2404d4b882eba8c9737652b3ceef28f96041c31dc3839. 
Sep 9 05:35:12.182083 kubelet[3331]: E0909 05:35:12.182035 3331 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zk5mx" podUID="72b2ef00-e3c5-4650-afc2-ad083fef1486" Sep 9 05:35:12.251705 containerd[1938]: time="2025-09-09T05:35:12.251423069Z" level=info msg="StartContainer for \"818199e0c64c846231e2404d4b882eba8c9737652b3ceef28f96041c31dc3839\" returns successfully" Sep 9 05:35:12.416125 kubelet[3331]: I0909 05:35:12.413039 3331 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-f9854dc6c-xbl7n" podStartSLOduration=1.6050742279999999 podStartE2EDuration="4.413015355s" podCreationTimestamp="2025-09-09 05:35:08 +0000 UTC" firstStartedPulling="2025-09-09 05:35:09.244086758 +0000 UTC m=+23.204383406" lastFinishedPulling="2025-09-09 05:35:12.052027889 +0000 UTC m=+26.012324533" observedRunningTime="2025-09-09 05:35:12.41150635 +0000 UTC m=+26.371803014" watchObservedRunningTime="2025-09-09 05:35:12.413015355 +0000 UTC m=+26.373312036" Sep 9 05:35:12.455315 kubelet[3331]: E0909 05:35:12.454935 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:12.455669 kubelet[3331]: W0909 05:35:12.455523 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:12.455669 kubelet[3331]: E0909 05:35:12.455564 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:35:12.456815 kubelet[3331]: E0909 05:35:12.456664 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:12.456815 kubelet[3331]: W0909 05:35:12.456682 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:12.456815 kubelet[3331]: E0909 05:35:12.456703 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:12.457222 kubelet[3331]: E0909 05:35:12.457144 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:12.457222 kubelet[3331]: W0909 05:35:12.457157 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:12.457222 kubelet[3331]: E0909 05:35:12.457171 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:35:12.457779 kubelet[3331]: E0909 05:35:12.457609 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:12.457779 kubelet[3331]: W0909 05:35:12.457644 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:12.457779 kubelet[3331]: E0909 05:35:12.457669 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:12.458458 kubelet[3331]: E0909 05:35:12.458387 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:12.458458 kubelet[3331]: W0909 05:35:12.458401 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:12.458458 kubelet[3331]: E0909 05:35:12.458414 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:35:12.459641 kubelet[3331]: E0909 05:35:12.459560 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:12.459641 kubelet[3331]: W0909 05:35:12.459576 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:12.459641 kubelet[3331]: E0909 05:35:12.459589 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:12.460045 kubelet[3331]: E0909 05:35:12.459976 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:12.460045 kubelet[3331]: W0909 05:35:12.459989 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:12.460045 kubelet[3331]: E0909 05:35:12.460001 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:35:12.460432 kubelet[3331]: E0909 05:35:12.460352 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:12.460432 kubelet[3331]: W0909 05:35:12.460364 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:12.460432 kubelet[3331]: E0909 05:35:12.460375 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:12.460851 kubelet[3331]: E0909 05:35:12.460778 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:12.460851 kubelet[3331]: W0909 05:35:12.460794 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:12.460851 kubelet[3331]: E0909 05:35:12.460808 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:35:12.461274 kubelet[3331]: E0909 05:35:12.461201 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:12.461274 kubelet[3331]: W0909 05:35:12.461214 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:12.461274 kubelet[3331]: E0909 05:35:12.461228 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:12.463657 kubelet[3331]: E0909 05:35:12.463286 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:12.463657 kubelet[3331]: W0909 05:35:12.463303 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:12.463657 kubelet[3331]: E0909 05:35:12.463320 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:35:12.464809 kubelet[3331]: E0909 05:35:12.463924 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:12.464809 kubelet[3331]: W0909 05:35:12.463938 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:12.464809 kubelet[3331]: E0909 05:35:12.463953 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:12.465636 kubelet[3331]: E0909 05:35:12.465525 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:12.465636 kubelet[3331]: W0909 05:35:12.465541 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:12.465636 kubelet[3331]: E0909 05:35:12.465557 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:35:12.466133 kubelet[3331]: E0909 05:35:12.466047 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:12.466133 kubelet[3331]: W0909 05:35:12.466061 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:12.466133 kubelet[3331]: E0909 05:35:12.466075 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:12.466508 kubelet[3331]: E0909 05:35:12.466430 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:12.466508 kubelet[3331]: W0909 05:35:12.466442 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:12.466508 kubelet[3331]: E0909 05:35:12.466455 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:35:12.532748 kubelet[3331]: E0909 05:35:12.532562 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:12.533068 kubelet[3331]: W0909 05:35:12.532643 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:12.533068 kubelet[3331]: E0909 05:35:12.532965 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:12.534427 kubelet[3331]: E0909 05:35:12.534366 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:12.534994 kubelet[3331]: W0909 05:35:12.534968 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:12.535157 kubelet[3331]: E0909 05:35:12.535141 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:35:12.536062 kubelet[3331]: E0909 05:35:12.536042 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:12.536178 kubelet[3331]: W0909 05:35:12.536062 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:12.536178 kubelet[3331]: E0909 05:35:12.536089 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:12.536573 kubelet[3331]: E0909 05:35:12.536543 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:12.537595 kubelet[3331]: W0909 05:35:12.537552 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:12.537700 kubelet[3331]: E0909 05:35:12.537658 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:35:12.537921 kubelet[3331]: E0909 05:35:12.537905 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:12.538011 kubelet[3331]: W0909 05:35:12.537921 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:12.538011 kubelet[3331]: E0909 05:35:12.538006 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:12.538224 kubelet[3331]: E0909 05:35:12.538206 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:12.538280 kubelet[3331]: W0909 05:35:12.538225 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:12.538280 kubelet[3331]: E0909 05:35:12.538252 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:35:12.538541 kubelet[3331]: E0909 05:35:12.538526 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:12.538602 kubelet[3331]: W0909 05:35:12.538542 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:12.538602 kubelet[3331]: E0909 05:35:12.538562 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:12.538818 kubelet[3331]: E0909 05:35:12.538803 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:12.538887 kubelet[3331]: W0909 05:35:12.538854 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:12.538955 kubelet[3331]: E0909 05:35:12.538938 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:35:13.377528 containerd[1938]: time="2025-09-09T05:35:13.377397458Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:35:13.379346 containerd[1938]: time="2025-09-09T05:35:13.379244241Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660" Sep 9 05:35:13.381812 containerd[1938]: time="2025-09-09T05:35:13.381751178Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:35:13.384038 kubelet[3331]: I0909 05:35:13.384000 3331 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 05:35:13.384618 containerd[1938]: time="2025-09-09T05:35:13.384584751Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:35:13.385208 containerd[1938]: time="2025-09-09T05:35:13.385182553Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 1.331143589s" Sep 9 05:35:13.385306 containerd[1938]: time="2025-09-09T05:35:13.385293707Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\"" Sep 9 05:35:13.388367 containerd[1938]: time="2025-09-09T05:35:13.388252677Z" level=info 
msg="CreateContainer within sandbox \"f095d073850fc2dd251c675d5c3a950330ca6097e5d64401f7a08abfe60de9e7\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 9 05:35:13.409792 containerd[1938]: time="2025-09-09T05:35:13.409748622Z" level=info msg="Container fe4ecf5d3a9f603cd996b9acadfa36818814ede14b11a8f08019965a90425e4f: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:35:13.426294 containerd[1938]: time="2025-09-09T05:35:13.426249418Z" level=info msg="CreateContainer within sandbox \"f095d073850fc2dd251c675d5c3a950330ca6097e5d64401f7a08abfe60de9e7\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"fe4ecf5d3a9f603cd996b9acadfa36818814ede14b11a8f08019965a90425e4f\"" Sep 9 05:35:13.427317 containerd[1938]: time="2025-09-09T05:35:13.427273606Z" level=info msg="StartContainer for \"fe4ecf5d3a9f603cd996b9acadfa36818814ede14b11a8f08019965a90425e4f\"" Sep 9 05:35:13.429229 containerd[1938]: time="2025-09-09T05:35:13.429171207Z" level=info msg="connecting to shim fe4ecf5d3a9f603cd996b9acadfa36818814ede14b11a8f08019965a90425e4f" address="unix:///run/containerd/s/370b1a3662d3e5291c3bd9cc58a348b758551ab16fd960c032c665fa147001db" protocol=ttrpc version=3 Sep 9 05:35:13.455690 systemd[1]: Started cri-containerd-fe4ecf5d3a9f603cd996b9acadfa36818814ede14b11a8f08019965a90425e4f.scope - libcontainer container fe4ecf5d3a9f603cd996b9acadfa36818814ede14b11a8f08019965a90425e4f. 
Sep 9 05:35:13.475435 kubelet[3331]: E0909 05:35:13.475360 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:13.475435 kubelet[3331]: W0909 05:35:13.475381 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:13.475435 kubelet[3331]: E0909 05:35:13.475402 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:13.476002 kubelet[3331]: E0909 05:35:13.475965 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:13.476002 kubelet[3331]: W0909 05:35:13.475977 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:13.476123 kubelet[3331]: E0909 05:35:13.476083 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:13.476385 kubelet[3331]: E0909 05:35:13.476336 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:13.476385 kubelet[3331]: W0909 05:35:13.476345 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:13.476385 kubelet[3331]: E0909 05:35:13.476355 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:35:13.479309 kubelet[3331]: E0909 05:35:13.478977 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:13.479309 kubelet[3331]: W0909 05:35:13.478984 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:13.479309 kubelet[3331]: E0909 05:35:13.478999 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:13.479309 kubelet[3331]: E0909 05:35:13.479237 3331 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:35:13.479309 kubelet[3331]: W0909 05:35:13.479259 3331 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:35:13.479309 kubelet[3331]: E0909 05:35:13.479268 3331 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:35:13.509498 containerd[1938]: time="2025-09-09T05:35:13.509243996Z" level=info msg="StartContainer for \"fe4ecf5d3a9f603cd996b9acadfa36818814ede14b11a8f08019965a90425e4f\" returns successfully" Sep 9 05:35:13.518210 systemd[1]: cri-containerd-fe4ecf5d3a9f603cd996b9acadfa36818814ede14b11a8f08019965a90425e4f.scope: Deactivated successfully. 
Sep 9 05:35:13.543632 containerd[1938]: time="2025-09-09T05:35:13.543518821Z" level=info msg="received exit event container_id:\"fe4ecf5d3a9f603cd996b9acadfa36818814ede14b11a8f08019965a90425e4f\" id:\"fe4ecf5d3a9f603cd996b9acadfa36818814ede14b11a8f08019965a90425e4f\" pid:4188 exited_at:{seconds:1757396113 nanos:520925040}" Sep 9 05:35:13.561883 containerd[1938]: time="2025-09-09T05:35:13.561824736Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fe4ecf5d3a9f603cd996b9acadfa36818814ede14b11a8f08019965a90425e4f\" id:\"fe4ecf5d3a9f603cd996b9acadfa36818814ede14b11a8f08019965a90425e4f\" pid:4188 exited_at:{seconds:1757396113 nanos:520925040}" Sep 9 05:35:13.596470 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-fe4ecf5d3a9f603cd996b9acadfa36818814ede14b11a8f08019965a90425e4f-rootfs.mount: Deactivated successfully. Sep 9 05:35:14.177028 kubelet[3331]: E0909 05:35:14.175930 3331 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zk5mx" podUID="72b2ef00-e3c5-4650-afc2-ad083fef1486" Sep 9 05:35:14.390470 containerd[1938]: time="2025-09-09T05:35:14.390417409Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 9 05:35:16.178029 kubelet[3331]: E0909 05:35:16.177948 3331 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zk5mx" podUID="72b2ef00-e3c5-4650-afc2-ad083fef1486" Sep 9 05:35:17.725779 containerd[1938]: time="2025-09-09T05:35:17.725693704Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 
05:35:17.727136 containerd[1938]: time="2025-09-09T05:35:17.727095755Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613" Sep 9 05:35:17.727765 containerd[1938]: time="2025-09-09T05:35:17.727711553Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:35:17.729996 containerd[1938]: time="2025-09-09T05:35:17.729941675Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:35:17.730392 containerd[1938]: time="2025-09-09T05:35:17.730365206Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 3.3399102s" Sep 9 05:35:17.730454 containerd[1938]: time="2025-09-09T05:35:17.730399461Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\"" Sep 9 05:35:17.733157 containerd[1938]: time="2025-09-09T05:35:17.733130550Z" level=info msg="CreateContainer within sandbox \"f095d073850fc2dd251c675d5c3a950330ca6097e5d64401f7a08abfe60de9e7\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 9 05:35:17.750407 containerd[1938]: time="2025-09-09T05:35:17.746628309Z" level=info msg="Container 156bbe2c73635a806839d156b88314c0e88a2dc89e137d5b685d115abdcf1abd: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:35:17.764709 containerd[1938]: time="2025-09-09T05:35:17.764452186Z" level=info msg="CreateContainer within sandbox 
\"f095d073850fc2dd251c675d5c3a950330ca6097e5d64401f7a08abfe60de9e7\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"156bbe2c73635a806839d156b88314c0e88a2dc89e137d5b685d115abdcf1abd\"" Sep 9 05:35:17.766529 containerd[1938]: time="2025-09-09T05:35:17.766462058Z" level=info msg="StartContainer for \"156bbe2c73635a806839d156b88314c0e88a2dc89e137d5b685d115abdcf1abd\"" Sep 9 05:35:17.768198 containerd[1938]: time="2025-09-09T05:35:17.768136308Z" level=info msg="connecting to shim 156bbe2c73635a806839d156b88314c0e88a2dc89e137d5b685d115abdcf1abd" address="unix:///run/containerd/s/370b1a3662d3e5291c3bd9cc58a348b758551ab16fd960c032c665fa147001db" protocol=ttrpc version=3 Sep 9 05:35:17.792691 systemd[1]: Started cri-containerd-156bbe2c73635a806839d156b88314c0e88a2dc89e137d5b685d115abdcf1abd.scope - libcontainer container 156bbe2c73635a806839d156b88314c0e88a2dc89e137d5b685d115abdcf1abd. Sep 9 05:35:17.856922 containerd[1938]: time="2025-09-09T05:35:17.856806720Z" level=info msg="StartContainer for \"156bbe2c73635a806839d156b88314c0e88a2dc89e137d5b685d115abdcf1abd\" returns successfully" Sep 9 05:35:18.177270 kubelet[3331]: E0909 05:35:18.177210 3331 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zk5mx" podUID="72b2ef00-e3c5-4650-afc2-ad083fef1486" Sep 9 05:35:18.704379 systemd[1]: cri-containerd-156bbe2c73635a806839d156b88314c0e88a2dc89e137d5b685d115abdcf1abd.scope: Deactivated successfully. Sep 9 05:35:18.704763 systemd[1]: cri-containerd-156bbe2c73635a806839d156b88314c0e88a2dc89e137d5b685d115abdcf1abd.scope: Consumed 569ms CPU time, 164.5M memory peak, 7.8M read from disk, 171.3M written to disk. 
Sep 9 05:35:18.791260 containerd[1938]: time="2025-09-09T05:35:18.790730508Z" level=info msg="received exit event container_id:\"156bbe2c73635a806839d156b88314c0e88a2dc89e137d5b685d115abdcf1abd\" id:\"156bbe2c73635a806839d156b88314c0e88a2dc89e137d5b685d115abdcf1abd\" pid:4259 exited_at:{seconds:1757396118 nanos:790460145}" Sep 9 05:35:18.792377 containerd[1938]: time="2025-09-09T05:35:18.792337584Z" level=info msg="TaskExit event in podsandbox handler container_id:\"156bbe2c73635a806839d156b88314c0e88a2dc89e137d5b685d115abdcf1abd\" id:\"156bbe2c73635a806839d156b88314c0e88a2dc89e137d5b685d115abdcf1abd\" pid:4259 exited_at:{seconds:1757396118 nanos:790460145}" Sep 9 05:35:18.802806 kubelet[3331]: I0909 05:35:18.802783 3331 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Sep 9 05:35:18.881901 systemd[1]: Created slice kubepods-besteffort-pod35c3c610_8e0e_4648_acad_34df202fc795.slice - libcontainer container kubepods-besteffort-pod35c3c610_8e0e_4648_acad_34df202fc795.slice. Sep 9 05:35:18.925008 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-156bbe2c73635a806839d156b88314c0e88a2dc89e137d5b685d115abdcf1abd-rootfs.mount: Deactivated successfully. Sep 9 05:35:18.933270 systemd[1]: Created slice kubepods-besteffort-poddb0b27da_0f3a_412e_b1ae_6d3b86e6ab56.slice - libcontainer container kubepods-besteffort-poddb0b27da_0f3a_412e_b1ae_6d3b86e6ab56.slice. Sep 9 05:35:18.951385 systemd[1]: Created slice kubepods-besteffort-podef8121d7_77d9_4cab_818e_56bcdf112ac8.slice - libcontainer container kubepods-besteffort-podef8121d7_77d9_4cab_818e_56bcdf112ac8.slice. Sep 9 05:35:18.965278 systemd[1]: Created slice kubepods-besteffort-podbcd056d1_4179_4c9a_b7fd_1754e236f3c1.slice - libcontainer container kubepods-besteffort-podbcd056d1_4179_4c9a_b7fd_1754e236f3c1.slice. 
Sep 9 05:35:18.977692 systemd[1]: Created slice kubepods-burstable-podde52e7c0_304d_4417_9812_1ab2edc8f219.slice - libcontainer container kubepods-burstable-podde52e7c0_304d_4417_9812_1ab2edc8f219.slice. Sep 9 05:35:18.993149 systemd[1]: Created slice kubepods-besteffort-podff947e5b_fad0_4247_a8d1_9a94d6a84b62.slice - libcontainer container kubepods-besteffort-podff947e5b_fad0_4247_a8d1_9a94d6a84b62.slice. Sep 9 05:35:18.999742 kubelet[3331]: I0909 05:35:18.996724 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81f920a1-0383-495e-80c9-59ade95d3519-config-volume\") pod \"coredns-7c65d6cfc9-d7txq\" (UID: \"81f920a1-0383-495e-80c9-59ade95d3519\") " pod="kube-system/coredns-7c65d6cfc9-d7txq" Sep 9 05:35:18.999742 kubelet[3331]: I0909 05:35:18.996771 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/ff947e5b-fad0-4247-a8d1-9a94d6a84b62-calico-apiserver-certs\") pod \"calico-apiserver-77545c669c-rfntj\" (UID: \"ff947e5b-fad0-4247-a8d1-9a94d6a84b62\") " pod="calico-apiserver/calico-apiserver-77545c669c-rfntj" Sep 9 05:35:18.999742 kubelet[3331]: I0909 05:35:18.996801 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4xbv\" (UniqueName: \"kubernetes.io/projected/35c3c610-8e0e-4648-acad-34df202fc795-kube-api-access-g4xbv\") pod \"calico-kube-controllers-69b9ddd587-48l4w\" (UID: \"35c3c610-8e0e-4648-acad-34df202fc795\") " pod="calico-system/calico-kube-controllers-69b9ddd587-48l4w" Sep 9 05:35:18.999742 kubelet[3331]: I0909 05:35:18.996826 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcd056d1-4179-4c9a-b7fd-1754e236f3c1-config\") pod \"goldmane-7988f88666-pbtr9\" (UID: 
\"bcd056d1-4179-4c9a-b7fd-1754e236f3c1\") " pod="calico-system/goldmane-7988f88666-pbtr9" Sep 9 05:35:18.999742 kubelet[3331]: I0909 05:35:18.996858 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/de52e7c0-304d-4417-9812-1ab2edc8f219-config-volume\") pod \"coredns-7c65d6cfc9-8hqlv\" (UID: \"de52e7c0-304d-4417-9812-1ab2edc8f219\") " pod="kube-system/coredns-7c65d6cfc9-8hqlv" Sep 9 05:35:19.000181 kubelet[3331]: I0909 05:35:18.997300 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlqm4\" (UniqueName: \"kubernetes.io/projected/81f920a1-0383-495e-80c9-59ade95d3519-kube-api-access-nlqm4\") pod \"coredns-7c65d6cfc9-d7txq\" (UID: \"81f920a1-0383-495e-80c9-59ade95d3519\") " pod="kube-system/coredns-7c65d6cfc9-d7txq" Sep 9 05:35:19.000181 kubelet[3331]: I0909 05:35:18.997350 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42j8j\" (UniqueName: \"kubernetes.io/projected/db0b27da-0f3a-412e-b1ae-6d3b86e6ab56-kube-api-access-42j8j\") pod \"calico-apiserver-77545c669c-j4l25\" (UID: \"db0b27da-0f3a-412e-b1ae-6d3b86e6ab56\") " pod="calico-apiserver/calico-apiserver-77545c669c-j4l25" Sep 9 05:35:19.000181 kubelet[3331]: I0909 05:35:18.997388 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ef8121d7-77d9-4cab-818e-56bcdf112ac8-whisker-backend-key-pair\") pod \"whisker-6d5c8b4b9c-5pjgf\" (UID: \"ef8121d7-77d9-4cab-818e-56bcdf112ac8\") " pod="calico-system/whisker-6d5c8b4b9c-5pjgf" Sep 9 05:35:19.000181 kubelet[3331]: I0909 05:35:18.997414 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/35c3c610-8e0e-4648-acad-34df202fc795-tigera-ca-bundle\") pod \"calico-kube-controllers-69b9ddd587-48l4w\" (UID: \"35c3c610-8e0e-4648-acad-34df202fc795\") " pod="calico-system/calico-kube-controllers-69b9ddd587-48l4w" Sep 9 05:35:19.000181 kubelet[3331]: I0909 05:35:18.997449 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/bcd056d1-4179-4c9a-b7fd-1754e236f3c1-goldmane-key-pair\") pod \"goldmane-7988f88666-pbtr9\" (UID: \"bcd056d1-4179-4c9a-b7fd-1754e236f3c1\") " pod="calico-system/goldmane-7988f88666-pbtr9" Sep 9 05:35:19.000597 kubelet[3331]: I0909 05:35:18.997487 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef8121d7-77d9-4cab-818e-56bcdf112ac8-whisker-ca-bundle\") pod \"whisker-6d5c8b4b9c-5pjgf\" (UID: \"ef8121d7-77d9-4cab-818e-56bcdf112ac8\") " pod="calico-system/whisker-6d5c8b4b9c-5pjgf" Sep 9 05:35:19.000597 kubelet[3331]: I0909 05:35:18.997525 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzqm9\" (UniqueName: \"kubernetes.io/projected/ff947e5b-fad0-4247-a8d1-9a94d6a84b62-kube-api-access-qzqm9\") pod \"calico-apiserver-77545c669c-rfntj\" (UID: \"ff947e5b-fad0-4247-a8d1-9a94d6a84b62\") " pod="calico-apiserver/calico-apiserver-77545c669c-rfntj" Sep 9 05:35:19.000597 kubelet[3331]: I0909 05:35:18.997551 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxsmm\" (UniqueName: \"kubernetes.io/projected/bcd056d1-4179-4c9a-b7fd-1754e236f3c1-kube-api-access-cxsmm\") pod \"goldmane-7988f88666-pbtr9\" (UID: \"bcd056d1-4179-4c9a-b7fd-1754e236f3c1\") " pod="calico-system/goldmane-7988f88666-pbtr9" Sep 9 05:35:19.000597 kubelet[3331]: I0909 05:35:18.997577 3331 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/db0b27da-0f3a-412e-b1ae-6d3b86e6ab56-calico-apiserver-certs\") pod \"calico-apiserver-77545c669c-j4l25\" (UID: \"db0b27da-0f3a-412e-b1ae-6d3b86e6ab56\") " pod="calico-apiserver/calico-apiserver-77545c669c-j4l25" Sep 9 05:35:19.000597 kubelet[3331]: I0909 05:35:18.997610 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7fps\" (UniqueName: \"kubernetes.io/projected/ef8121d7-77d9-4cab-818e-56bcdf112ac8-kube-api-access-t7fps\") pod \"whisker-6d5c8b4b9c-5pjgf\" (UID: \"ef8121d7-77d9-4cab-818e-56bcdf112ac8\") " pod="calico-system/whisker-6d5c8b4b9c-5pjgf" Sep 9 05:35:19.000998 kubelet[3331]: I0909 05:35:18.997636 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mxgv\" (UniqueName: \"kubernetes.io/projected/de52e7c0-304d-4417-9812-1ab2edc8f219-kube-api-access-6mxgv\") pod \"coredns-7c65d6cfc9-8hqlv\" (UID: \"de52e7c0-304d-4417-9812-1ab2edc8f219\") " pod="kube-system/coredns-7c65d6cfc9-8hqlv" Sep 9 05:35:19.000998 kubelet[3331]: I0909 05:35:18.997819 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bcd056d1-4179-4c9a-b7fd-1754e236f3c1-goldmane-ca-bundle\") pod \"goldmane-7988f88666-pbtr9\" (UID: \"bcd056d1-4179-4c9a-b7fd-1754e236f3c1\") " pod="calico-system/goldmane-7988f88666-pbtr9" Sep 9 05:35:19.004821 systemd[1]: Created slice kubepods-burstable-pod81f920a1_0383_495e_80c9_59ade95d3519.slice - libcontainer container kubepods-burstable-pod81f920a1_0383_495e_80c9_59ade95d3519.slice. 
Sep 9 05:35:19.210736 containerd[1938]: time="2025-09-09T05:35:19.210694338Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-69b9ddd587-48l4w,Uid:35c3c610-8e0e-4648-acad-34df202fc795,Namespace:calico-system,Attempt:0,}" Sep 9 05:35:19.250720 containerd[1938]: time="2025-09-09T05:35:19.250187517Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-77545c669c-j4l25,Uid:db0b27da-0f3a-412e-b1ae-6d3b86e6ab56,Namespace:calico-apiserver,Attempt:0,}" Sep 9 05:35:19.259856 containerd[1938]: time="2025-09-09T05:35:19.258585191Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6d5c8b4b9c-5pjgf,Uid:ef8121d7-77d9-4cab-818e-56bcdf112ac8,Namespace:calico-system,Attempt:0,}" Sep 9 05:35:19.281871 containerd[1938]: time="2025-09-09T05:35:19.281823341Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-pbtr9,Uid:bcd056d1-4179-4c9a-b7fd-1754e236f3c1,Namespace:calico-system,Attempt:0,}" Sep 9 05:35:19.300769 containerd[1938]: time="2025-09-09T05:35:19.300711588Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-77545c669c-rfntj,Uid:ff947e5b-fad0-4247-a8d1-9a94d6a84b62,Namespace:calico-apiserver,Attempt:0,}" Sep 9 05:35:19.301537 containerd[1938]: time="2025-09-09T05:35:19.301472440Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-8hqlv,Uid:de52e7c0-304d-4417-9812-1ab2edc8f219,Namespace:kube-system,Attempt:0,}" Sep 9 05:35:19.314351 containerd[1938]: time="2025-09-09T05:35:19.314160176Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-d7txq,Uid:81f920a1-0383-495e-80c9-59ade95d3519,Namespace:kube-system,Attempt:0,}" Sep 9 05:35:19.458403 containerd[1938]: time="2025-09-09T05:35:19.458371014Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 9 05:35:19.664412 containerd[1938]: time="2025-09-09T05:35:19.663817356Z" level=error msg="Failed to destroy network for sandbox 
\"a8b99e38c6538d2bafbf61646fc9897a7b6f1253511b951b31d87f1cafbff002\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:35:19.683573 containerd[1938]: time="2025-09-09T05:35:19.673045449Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6d5c8b4b9c-5pjgf,Uid:ef8121d7-77d9-4cab-818e-56bcdf112ac8,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a8b99e38c6538d2bafbf61646fc9897a7b6f1253511b951b31d87f1cafbff002\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:35:19.684680 containerd[1938]: time="2025-09-09T05:35:19.684554693Z" level=error msg="Failed to destroy network for sandbox \"25c0fd1b95a422a87d7d4253065559dc4a11dcad0ce3409ad11ccdcaa0307496\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:35:19.690993 containerd[1938]: time="2025-09-09T05:35:19.690934214Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-69b9ddd587-48l4w,Uid:35c3c610-8e0e-4648-acad-34df202fc795,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"25c0fd1b95a422a87d7d4253065559dc4a11dcad0ce3409ad11ccdcaa0307496\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:35:19.718973 containerd[1938]: time="2025-09-09T05:35:19.718906777Z" level=error msg="Failed to destroy network for sandbox 
\"f9cc6cf23464b6ec0a2094cc3941d2d08996c15e912ac98e52d332386d8bd365\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:35:19.720129 kubelet[3331]: E0909 05:35:19.720047 3331 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a8b99e38c6538d2bafbf61646fc9897a7b6f1253511b951b31d87f1cafbff002\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:35:19.721043 kubelet[3331]: E0909 05:35:19.720185 3331 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a8b99e38c6538d2bafbf61646fc9897a7b6f1253511b951b31d87f1cafbff002\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6d5c8b4b9c-5pjgf" Sep 9 05:35:19.721043 kubelet[3331]: E0909 05:35:19.720217 3331 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a8b99e38c6538d2bafbf61646fc9897a7b6f1253511b951b31d87f1cafbff002\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6d5c8b4b9c-5pjgf" Sep 9 05:35:19.721043 kubelet[3331]: E0909 05:35:19.720315 3331 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-6d5c8b4b9c-5pjgf_calico-system(ef8121d7-77d9-4cab-818e-56bcdf112ac8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"whisker-6d5c8b4b9c-5pjgf_calico-system(ef8121d7-77d9-4cab-818e-56bcdf112ac8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a8b99e38c6538d2bafbf61646fc9897a7b6f1253511b951b31d87f1cafbff002\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-6d5c8b4b9c-5pjgf" podUID="ef8121d7-77d9-4cab-818e-56bcdf112ac8" Sep 9 05:35:19.721959 containerd[1938]: time="2025-09-09T05:35:19.721833640Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-77545c669c-j4l25,Uid:db0b27da-0f3a-412e-b1ae-6d3b86e6ab56,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f9cc6cf23464b6ec0a2094cc3941d2d08996c15e912ac98e52d332386d8bd365\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:35:19.722253 kubelet[3331]: E0909 05:35:19.722162 3331 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f9cc6cf23464b6ec0a2094cc3941d2d08996c15e912ac98e52d332386d8bd365\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:35:19.722334 kubelet[3331]: E0909 05:35:19.722300 3331 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"25c0fd1b95a422a87d7d4253065559dc4a11dcad0ce3409ad11ccdcaa0307496\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:35:19.722552 
kubelet[3331]: E0909 05:35:19.722514 3331 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f9cc6cf23464b6ec0a2094cc3941d2d08996c15e912ac98e52d332386d8bd365\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-77545c669c-j4l25" Sep 9 05:35:19.722685 kubelet[3331]: E0909 05:35:19.722659 3331 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"25c0fd1b95a422a87d7d4253065559dc4a11dcad0ce3409ad11ccdcaa0307496\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-69b9ddd587-48l4w" Sep 9 05:35:19.722755 kubelet[3331]: E0909 05:35:19.722696 3331 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"25c0fd1b95a422a87d7d4253065559dc4a11dcad0ce3409ad11ccdcaa0307496\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-69b9ddd587-48l4w" Sep 9 05:35:19.723007 kubelet[3331]: E0909 05:35:19.722553 3331 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f9cc6cf23464b6ec0a2094cc3941d2d08996c15e912ac98e52d332386d8bd365\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-77545c669c-j4l25" Sep 9 05:35:19.723130 kubelet[3331]: E0909 05:35:19.723094 3331 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-69b9ddd587-48l4w_calico-system(35c3c610-8e0e-4648-acad-34df202fc795)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-69b9ddd587-48l4w_calico-system(35c3c610-8e0e-4648-acad-34df202fc795)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"25c0fd1b95a422a87d7d4253065559dc4a11dcad0ce3409ad11ccdcaa0307496\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-69b9ddd587-48l4w" podUID="35c3c610-8e0e-4648-acad-34df202fc795" Sep 9 05:35:19.723410 kubelet[3331]: E0909 05:35:19.722866 3331 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-77545c669c-j4l25_calico-apiserver(db0b27da-0f3a-412e-b1ae-6d3b86e6ab56)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-77545c669c-j4l25_calico-apiserver(db0b27da-0f3a-412e-b1ae-6d3b86e6ab56)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f9cc6cf23464b6ec0a2094cc3941d2d08996c15e912ac98e52d332386d8bd365\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-77545c669c-j4l25" podUID="db0b27da-0f3a-412e-b1ae-6d3b86e6ab56" Sep 9 05:35:19.723991 containerd[1938]: time="2025-09-09T05:35:19.723655588Z" level=error msg="Failed to destroy network for sandbox \"121b08fcbd2aeb32ffa33ca29457a76d70e7f52b9bd03188d485553a405536fd\"" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:35:19.729986 containerd[1938]: time="2025-09-09T05:35:19.729928298Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-pbtr9,Uid:bcd056d1-4179-4c9a-b7fd-1754e236f3c1,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"121b08fcbd2aeb32ffa33ca29457a76d70e7f52b9bd03188d485553a405536fd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:35:19.731036 kubelet[3331]: E0909 05:35:19.730810 3331 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"121b08fcbd2aeb32ffa33ca29457a76d70e7f52b9bd03188d485553a405536fd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:35:19.731036 kubelet[3331]: E0909 05:35:19.730878 3331 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"121b08fcbd2aeb32ffa33ca29457a76d70e7f52b9bd03188d485553a405536fd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-pbtr9" Sep 9 05:35:19.731036 kubelet[3331]: E0909 05:35:19.730914 3331 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"121b08fcbd2aeb32ffa33ca29457a76d70e7f52b9bd03188d485553a405536fd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file 
or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-pbtr9" Sep 9 05:35:19.731265 kubelet[3331]: E0909 05:35:19.730964 3331 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7988f88666-pbtr9_calico-system(bcd056d1-4179-4c9a-b7fd-1754e236f3c1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7988f88666-pbtr9_calico-system(bcd056d1-4179-4c9a-b7fd-1754e236f3c1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"121b08fcbd2aeb32ffa33ca29457a76d70e7f52b9bd03188d485553a405536fd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7988f88666-pbtr9" podUID="bcd056d1-4179-4c9a-b7fd-1754e236f3c1" Sep 9 05:35:19.742011 containerd[1938]: time="2025-09-09T05:35:19.741956133Z" level=error msg="Failed to destroy network for sandbox \"70d7340b2140e8ccb699a27eefb1d06b6a5ecfc3cedd5557c493151ba78b79c7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:35:19.749075 containerd[1938]: time="2025-09-09T05:35:19.748981157Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-d7txq,Uid:81f920a1-0383-495e-80c9-59ade95d3519,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"70d7340b2140e8ccb699a27eefb1d06b6a5ecfc3cedd5557c493151ba78b79c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:35:19.753672 kubelet[3331]: E0909 05:35:19.753334 3331 log.go:32] 
"RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"70d7340b2140e8ccb699a27eefb1d06b6a5ecfc3cedd5557c493151ba78b79c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:35:19.753672 kubelet[3331]: E0909 05:35:19.753399 3331 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"70d7340b2140e8ccb699a27eefb1d06b6a5ecfc3cedd5557c493151ba78b79c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-d7txq" Sep 9 05:35:19.753672 kubelet[3331]: E0909 05:35:19.753423 3331 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"70d7340b2140e8ccb699a27eefb1d06b6a5ecfc3cedd5557c493151ba78b79c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-d7txq" Sep 9 05:35:19.754639 kubelet[3331]: E0909 05:35:19.753471 3331 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-d7txq_kube-system(81f920a1-0383-495e-80c9-59ade95d3519)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-d7txq_kube-system(81f920a1-0383-495e-80c9-59ade95d3519)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"70d7340b2140e8ccb699a27eefb1d06b6a5ecfc3cedd5557c493151ba78b79c7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-d7txq" podUID="81f920a1-0383-495e-80c9-59ade95d3519" Sep 9 05:35:19.762423 containerd[1938]: time="2025-09-09T05:35:19.762326627Z" level=error msg="Failed to destroy network for sandbox \"8115de5be35f82d4aa20a7f2e3374d48d1770d9913c0e8c13736926ac1d8b694\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:35:19.765257 containerd[1938]: time="2025-09-09T05:35:19.765129123Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-77545c669c-rfntj,Uid:ff947e5b-fad0-4247-a8d1-9a94d6a84b62,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8115de5be35f82d4aa20a7f2e3374d48d1770d9913c0e8c13736926ac1d8b694\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:35:19.765625 kubelet[3331]: E0909 05:35:19.765576 3331 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8115de5be35f82d4aa20a7f2e3374d48d1770d9913c0e8c13736926ac1d8b694\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:35:19.765725 kubelet[3331]: E0909 05:35:19.765662 3331 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8115de5be35f82d4aa20a7f2e3374d48d1770d9913c0e8c13736926ac1d8b694\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="calico-apiserver/calico-apiserver-77545c669c-rfntj" Sep 9 05:35:19.765725 kubelet[3331]: E0909 05:35:19.765690 3331 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8115de5be35f82d4aa20a7f2e3374d48d1770d9913c0e8c13736926ac1d8b694\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-77545c669c-rfntj" Sep 9 05:35:19.766451 kubelet[3331]: E0909 05:35:19.765761 3331 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-77545c669c-rfntj_calico-apiserver(ff947e5b-fad0-4247-a8d1-9a94d6a84b62)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-77545c669c-rfntj_calico-apiserver(ff947e5b-fad0-4247-a8d1-9a94d6a84b62)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8115de5be35f82d4aa20a7f2e3374d48d1770d9913c0e8c13736926ac1d8b694\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-77545c669c-rfntj" podUID="ff947e5b-fad0-4247-a8d1-9a94d6a84b62" Sep 9 05:35:19.766665 containerd[1938]: time="2025-09-09T05:35:19.766606179Z" level=error msg="Failed to destroy network for sandbox \"51890cd9407c90f38312d46cb4aebcea5d717dd1f53857d1f366220af6b753de\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:35:19.769160 containerd[1938]: time="2025-09-09T05:35:19.769120878Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-7c65d6cfc9-8hqlv,Uid:de52e7c0-304d-4417-9812-1ab2edc8f219,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"51890cd9407c90f38312d46cb4aebcea5d717dd1f53857d1f366220af6b753de\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:35:19.769360 kubelet[3331]: E0909 05:35:19.769324 3331 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"51890cd9407c90f38312d46cb4aebcea5d717dd1f53857d1f366220af6b753de\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:35:19.769442 kubelet[3331]: E0909 05:35:19.769380 3331 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"51890cd9407c90f38312d46cb4aebcea5d717dd1f53857d1f366220af6b753de\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-8hqlv" Sep 9 05:35:19.769442 kubelet[3331]: E0909 05:35:19.769413 3331 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"51890cd9407c90f38312d46cb4aebcea5d717dd1f53857d1f366220af6b753de\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-8hqlv" Sep 9 05:35:19.769584 kubelet[3331]: E0909 05:35:19.769458 3331 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-8hqlv_kube-system(de52e7c0-304d-4417-9812-1ab2edc8f219)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-8hqlv_kube-system(de52e7c0-304d-4417-9812-1ab2edc8f219)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"51890cd9407c90f38312d46cb4aebcea5d717dd1f53857d1f366220af6b753de\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-8hqlv" podUID="de52e7c0-304d-4417-9812-1ab2edc8f219" Sep 9 05:35:19.921849 systemd[1]: run-netns-cni\x2df05ba471\x2d1601\x2d31cb\x2d72c1\x2d167df07cefa3.mount: Deactivated successfully. Sep 9 05:35:20.182786 systemd[1]: Created slice kubepods-besteffort-pod72b2ef00_e3c5_4650_afc2_ad083fef1486.slice - libcontainer container kubepods-besteffort-pod72b2ef00_e3c5_4650_afc2_ad083fef1486.slice. 
Sep 9 05:35:20.186675 containerd[1938]: time="2025-09-09T05:35:20.186643108Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zk5mx,Uid:72b2ef00-e3c5-4650-afc2-ad083fef1486,Namespace:calico-system,Attempt:0,}" Sep 9 05:35:20.251500 kubelet[3331]: I0909 05:35:20.250879 3331 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 05:35:20.270565 containerd[1938]: time="2025-09-09T05:35:20.267654784Z" level=error msg="Failed to destroy network for sandbox \"67a21e8807cad30aa71fa22c023c658f736651fa9d86ca808f19fba6d5b0c5de\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:35:20.270565 containerd[1938]: time="2025-09-09T05:35:20.270329804Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zk5mx,Uid:72b2ef00-e3c5-4650-afc2-ad083fef1486,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"67a21e8807cad30aa71fa22c023c658f736651fa9d86ca808f19fba6d5b0c5de\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:35:20.270786 kubelet[3331]: E0909 05:35:20.270699 3331 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"67a21e8807cad30aa71fa22c023c658f736651fa9d86ca808f19fba6d5b0c5de\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:35:20.270860 kubelet[3331]: E0909 05:35:20.270779 3331 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"67a21e8807cad30aa71fa22c023c658f736651fa9d86ca808f19fba6d5b0c5de\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-zk5mx" Sep 9 05:35:20.272291 systemd[1]: run-netns-cni\x2dba50f101\x2d5b56\x2d2344\x2dc126\x2deb11ac7f2b60.mount: Deactivated successfully. Sep 9 05:35:20.272556 kubelet[3331]: E0909 05:35:20.272512 3331 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"67a21e8807cad30aa71fa22c023c658f736651fa9d86ca808f19fba6d5b0c5de\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-zk5mx" Sep 9 05:35:20.272836 kubelet[3331]: E0909 05:35:20.272791 3331 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-zk5mx_calico-system(72b2ef00-e3c5-4650-afc2-ad083fef1486)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-zk5mx_calico-system(72b2ef00-e3c5-4650-afc2-ad083fef1486)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"67a21e8807cad30aa71fa22c023c658f736651fa9d86ca808f19fba6d5b0c5de\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-zk5mx" podUID="72b2ef00-e3c5-4650-afc2-ad083fef1486" Sep 9 05:35:25.442852 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3994432655.mount: Deactivated successfully. 
Sep 9 05:35:25.533378 containerd[1938]: time="2025-09-09T05:35:25.533320669Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:35:25.545397 containerd[1938]: time="2025-09-09T05:35:25.545269492Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 9 05:35:25.550680 containerd[1938]: time="2025-09-09T05:35:25.550612309Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:35:25.554094 containerd[1938]: time="2025-09-09T05:35:25.554020550Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:35:25.554756 containerd[1938]: time="2025-09-09T05:35:25.554611469Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 6.096045569s" Sep 9 05:35:25.554756 containerd[1938]: time="2025-09-09T05:35:25.554655605Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 9 05:35:25.611540 containerd[1938]: time="2025-09-09T05:35:25.610558749Z" level=info msg="CreateContainer within sandbox \"f095d073850fc2dd251c675d5c3a950330ca6097e5d64401f7a08abfe60de9e7\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 9 05:35:25.654579 containerd[1938]: time="2025-09-09T05:35:25.652542972Z" level=info msg="Container 
a7a935f19da868b5a950c7c20d19b09767d48df7322f7f65c9cd8eb4082003a4: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:35:25.655570 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1509309359.mount: Deactivated successfully. Sep 9 05:35:25.683219 containerd[1938]: time="2025-09-09T05:35:25.683160283Z" level=info msg="CreateContainer within sandbox \"f095d073850fc2dd251c675d5c3a950330ca6097e5d64401f7a08abfe60de9e7\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"a7a935f19da868b5a950c7c20d19b09767d48df7322f7f65c9cd8eb4082003a4\"" Sep 9 05:35:25.684906 containerd[1938]: time="2025-09-09T05:35:25.684308692Z" level=info msg="StartContainer for \"a7a935f19da868b5a950c7c20d19b09767d48df7322f7f65c9cd8eb4082003a4\"" Sep 9 05:35:25.690604 containerd[1938]: time="2025-09-09T05:35:25.690558621Z" level=info msg="connecting to shim a7a935f19da868b5a950c7c20d19b09767d48df7322f7f65c9cd8eb4082003a4" address="unix:///run/containerd/s/370b1a3662d3e5291c3bd9cc58a348b758551ab16fd960c032c665fa147001db" protocol=ttrpc version=3 Sep 9 05:35:25.874722 systemd[1]: Started cri-containerd-a7a935f19da868b5a950c7c20d19b09767d48df7322f7f65c9cd8eb4082003a4.scope - libcontainer container a7a935f19da868b5a950c7c20d19b09767d48df7322f7f65c9cd8eb4082003a4. Sep 9 05:35:25.935039 containerd[1938]: time="2025-09-09T05:35:25.934998344Z" level=info msg="StartContainer for \"a7a935f19da868b5a950c7c20d19b09767d48df7322f7f65c9cd8eb4082003a4\" returns successfully" Sep 9 05:35:26.177771 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 9 05:35:26.180291 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Sep 9 05:35:26.527546 kubelet[3331]: I0909 05:35:26.527388 3331 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-djvx2" podStartSLOduration=2.611497127 podStartE2EDuration="18.527364964s" podCreationTimestamp="2025-09-09 05:35:08 +0000 UTC" firstStartedPulling="2025-09-09 05:35:09.655266772 +0000 UTC m=+23.615563424" lastFinishedPulling="2025-09-09 05:35:25.571134611 +0000 UTC m=+39.531431261" observedRunningTime="2025-09-09 05:35:26.523808345 +0000 UTC m=+40.484105030" watchObservedRunningTime="2025-09-09 05:35:26.527364964 +0000 UTC m=+40.487661629" Sep 9 05:35:26.667847 kubelet[3331]: I0909 05:35:26.667808 3331 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef8121d7-77d9-4cab-818e-56bcdf112ac8-whisker-ca-bundle\") pod \"ef8121d7-77d9-4cab-818e-56bcdf112ac8\" (UID: \"ef8121d7-77d9-4cab-818e-56bcdf112ac8\") " Sep 9 05:35:26.667847 kubelet[3331]: I0909 05:35:26.667852 3331 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7fps\" (UniqueName: \"kubernetes.io/projected/ef8121d7-77d9-4cab-818e-56bcdf112ac8-kube-api-access-t7fps\") pod \"ef8121d7-77d9-4cab-818e-56bcdf112ac8\" (UID: \"ef8121d7-77d9-4cab-818e-56bcdf112ac8\") " Sep 9 05:35:26.667999 kubelet[3331]: I0909 05:35:26.667874 3331 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ef8121d7-77d9-4cab-818e-56bcdf112ac8-whisker-backend-key-pair\") pod \"ef8121d7-77d9-4cab-818e-56bcdf112ac8\" (UID: \"ef8121d7-77d9-4cab-818e-56bcdf112ac8\") " Sep 9 05:35:26.676336 kubelet[3331]: I0909 05:35:26.676289 3331 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef8121d7-77d9-4cab-818e-56bcdf112ac8-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "ef8121d7-77d9-4cab-818e-56bcdf112ac8" 
(UID: "ef8121d7-77d9-4cab-818e-56bcdf112ac8"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 9 05:35:26.691628 kubelet[3331]: I0909 05:35:26.691586 3331 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef8121d7-77d9-4cab-818e-56bcdf112ac8-kube-api-access-t7fps" (OuterVolumeSpecName: "kube-api-access-t7fps") pod "ef8121d7-77d9-4cab-818e-56bcdf112ac8" (UID: "ef8121d7-77d9-4cab-818e-56bcdf112ac8"). InnerVolumeSpecName "kube-api-access-t7fps". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 9 05:35:26.691744 kubelet[3331]: I0909 05:35:26.691704 3331 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef8121d7-77d9-4cab-818e-56bcdf112ac8-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "ef8121d7-77d9-4cab-818e-56bcdf112ac8" (UID: "ef8121d7-77d9-4cab-818e-56bcdf112ac8"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 9 05:35:26.692254 systemd[1]: var-lib-kubelet-pods-ef8121d7\x2d77d9\x2d4cab\x2d818e\x2d56bcdf112ac8-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 9 05:35:26.697254 systemd[1]: var-lib-kubelet-pods-ef8121d7\x2d77d9\x2d4cab\x2d818e\x2d56bcdf112ac8-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dt7fps.mount: Deactivated successfully. 
Sep 9 05:35:26.768572 kubelet[3331]: I0909 05:35:26.768372 3331 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ef8121d7-77d9-4cab-818e-56bcdf112ac8-whisker-backend-key-pair\") on node \"ip-172-31-26-176\" DevicePath \"\"" Sep 9 05:35:26.768572 kubelet[3331]: I0909 05:35:26.768415 3331 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef8121d7-77d9-4cab-818e-56bcdf112ac8-whisker-ca-bundle\") on node \"ip-172-31-26-176\" DevicePath \"\"" Sep 9 05:35:26.768875 kubelet[3331]: I0909 05:35:26.768826 3331 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7fps\" (UniqueName: \"kubernetes.io/projected/ef8121d7-77d9-4cab-818e-56bcdf112ac8-kube-api-access-t7fps\") on node \"ip-172-31-26-176\" DevicePath \"\"" Sep 9 05:35:27.492354 kubelet[3331]: I0909 05:35:27.492008 3331 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 05:35:27.497873 systemd[1]: Removed slice kubepods-besteffort-podef8121d7_77d9_4cab_818e_56bcdf112ac8.slice - libcontainer container kubepods-besteffort-podef8121d7_77d9_4cab_818e_56bcdf112ac8.slice. Sep 9 05:35:27.623928 systemd[1]: Created slice kubepods-besteffort-pod5db875a7_1adb_4132_ad56_051d1aa7a6db.slice - libcontainer container kubepods-besteffort-pod5db875a7_1adb_4132_ad56_051d1aa7a6db.slice. 
Sep 9 05:35:27.775797 kubelet[3331]: I0909 05:35:27.775631 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drn4b\" (UniqueName: \"kubernetes.io/projected/5db875a7-1adb-4132-ad56-051d1aa7a6db-kube-api-access-drn4b\") pod \"whisker-958bb795b-bjqsh\" (UID: \"5db875a7-1adb-4132-ad56-051d1aa7a6db\") " pod="calico-system/whisker-958bb795b-bjqsh" Sep 9 05:35:27.775797 kubelet[3331]: I0909 05:35:27.775698 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5db875a7-1adb-4132-ad56-051d1aa7a6db-whisker-ca-bundle\") pod \"whisker-958bb795b-bjqsh\" (UID: \"5db875a7-1adb-4132-ad56-051d1aa7a6db\") " pod="calico-system/whisker-958bb795b-bjqsh" Sep 9 05:35:27.775797 kubelet[3331]: I0909 05:35:27.775725 3331 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/5db875a7-1adb-4132-ad56-051d1aa7a6db-whisker-backend-key-pair\") pod \"whisker-958bb795b-bjqsh\" (UID: \"5db875a7-1adb-4132-ad56-051d1aa7a6db\") " pod="calico-system/whisker-958bb795b-bjqsh" Sep 9 05:35:27.927152 containerd[1938]: time="2025-09-09T05:35:27.926989579Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-958bb795b-bjqsh,Uid:5db875a7-1adb-4132-ad56-051d1aa7a6db,Namespace:calico-system,Attempt:0,}" Sep 9 05:35:28.182023 kubelet[3331]: I0909 05:35:28.181884 3331 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef8121d7-77d9-4cab-818e-56bcdf112ac8" path="/var/lib/kubelet/pods/ef8121d7-77d9-4cab-818e-56bcdf112ac8/volumes" Sep 9 05:35:28.610155 systemd-networkd[1819]: vxlan.calico: Link UP Sep 9 05:35:28.610168 systemd-networkd[1819]: vxlan.calico: Gained carrier Sep 9 05:35:28.614201 (udev-worker)[4574]: Network interface NamePolicy= disabled on kernel command line. 
Sep 9 05:35:28.646176 (udev-worker)[4776]: Network interface NamePolicy= disabled on kernel command line. Sep 9 05:35:28.646593 systemd-networkd[1819]: calic2ab00ff208: Link UP Sep 9 05:35:28.646863 systemd-networkd[1819]: calic2ab00ff208: Gained carrier Sep 9 05:35:28.658292 (udev-worker)[4779]: Network interface NamePolicy= disabled on kernel command line. Sep 9 05:35:28.660149 (udev-worker)[4780]: Network interface NamePolicy= disabled on kernel command line. Sep 9 05:35:28.707201 containerd[1938]: 2025-09-09 05:35:28.000 [INFO][4687] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 05:35:28.707201 containerd[1938]: 2025-09-09 05:35:28.070 [INFO][4687] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--26--176-k8s-whisker--958bb795b--bjqsh-eth0 whisker-958bb795b- calico-system 5db875a7-1adb-4132-ad56-051d1aa7a6db 909 0 2025-09-09 05:35:27 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:958bb795b projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ip-172-31-26-176 whisker-958bb795b-bjqsh eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calic2ab00ff208 [] [] }} ContainerID="52f96fac661b018dbce8d684022863b70e45ae0a1f65809d859df744702ccf54" Namespace="calico-system" Pod="whisker-958bb795b-bjqsh" WorkloadEndpoint="ip--172--31--26--176-k8s-whisker--958bb795b--bjqsh-" Sep 9 05:35:28.707201 containerd[1938]: 2025-09-09 05:35:28.070 [INFO][4687] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="52f96fac661b018dbce8d684022863b70e45ae0a1f65809d859df744702ccf54" Namespace="calico-system" Pod="whisker-958bb795b-bjqsh" WorkloadEndpoint="ip--172--31--26--176-k8s-whisker--958bb795b--bjqsh-eth0" Sep 9 05:35:28.707201 containerd[1938]: 2025-09-09 05:35:28.502 [INFO][4702] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="52f96fac661b018dbce8d684022863b70e45ae0a1f65809d859df744702ccf54" HandleID="k8s-pod-network.52f96fac661b018dbce8d684022863b70e45ae0a1f65809d859df744702ccf54" Workload="ip--172--31--26--176-k8s-whisker--958bb795b--bjqsh-eth0" Sep 9 05:35:28.708277 containerd[1938]: 2025-09-09 05:35:28.506 [INFO][4702] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="52f96fac661b018dbce8d684022863b70e45ae0a1f65809d859df744702ccf54" HandleID="k8s-pod-network.52f96fac661b018dbce8d684022863b70e45ae0a1f65809d859df744702ccf54" Workload="ip--172--31--26--176-k8s-whisker--958bb795b--bjqsh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00035e620), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-26-176", "pod":"whisker-958bb795b-bjqsh", "timestamp":"2025-09-09 05:35:28.502382253 +0000 UTC"}, Hostname:"ip-172-31-26-176", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:35:28.708277 containerd[1938]: 2025-09-09 05:35:28.506 [INFO][4702] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:35:28.708277 containerd[1938]: 2025-09-09 05:35:28.508 [INFO][4702] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 05:35:28.708277 containerd[1938]: 2025-09-09 05:35:28.511 [INFO][4702] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-26-176' Sep 9 05:35:28.708277 containerd[1938]: 2025-09-09 05:35:28.532 [INFO][4702] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.52f96fac661b018dbce8d684022863b70e45ae0a1f65809d859df744702ccf54" host="ip-172-31-26-176" Sep 9 05:35:28.708277 containerd[1938]: 2025-09-09 05:35:28.551 [INFO][4702] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-26-176" Sep 9 05:35:28.708277 containerd[1938]: 2025-09-09 05:35:28.560 [INFO][4702] ipam/ipam.go 511: Trying affinity for 192.168.74.64/26 host="ip-172-31-26-176" Sep 9 05:35:28.708277 containerd[1938]: 2025-09-09 05:35:28.563 [INFO][4702] ipam/ipam.go 158: Attempting to load block cidr=192.168.74.64/26 host="ip-172-31-26-176" Sep 9 05:35:28.708277 containerd[1938]: 2025-09-09 05:35:28.567 [INFO][4702] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.74.64/26 host="ip-172-31-26-176" Sep 9 05:35:28.709299 containerd[1938]: 2025-09-09 05:35:28.567 [INFO][4702] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.74.64/26 handle="k8s-pod-network.52f96fac661b018dbce8d684022863b70e45ae0a1f65809d859df744702ccf54" host="ip-172-31-26-176" Sep 9 05:35:28.709299 containerd[1938]: 2025-09-09 05:35:28.570 [INFO][4702] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.52f96fac661b018dbce8d684022863b70e45ae0a1f65809d859df744702ccf54 Sep 9 05:35:28.709299 containerd[1938]: 2025-09-09 05:35:28.578 [INFO][4702] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.74.64/26 handle="k8s-pod-network.52f96fac661b018dbce8d684022863b70e45ae0a1f65809d859df744702ccf54" host="ip-172-31-26-176" Sep 9 05:35:28.709299 containerd[1938]: 2025-09-09 05:35:28.591 [INFO][4702] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.74.65/26] block=192.168.74.64/26 
handle="k8s-pod-network.52f96fac661b018dbce8d684022863b70e45ae0a1f65809d859df744702ccf54" host="ip-172-31-26-176" Sep 9 05:35:28.709299 containerd[1938]: 2025-09-09 05:35:28.591 [INFO][4702] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.74.65/26] handle="k8s-pod-network.52f96fac661b018dbce8d684022863b70e45ae0a1f65809d859df744702ccf54" host="ip-172-31-26-176" Sep 9 05:35:28.709299 containerd[1938]: 2025-09-09 05:35:28.591 [INFO][4702] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 05:35:28.709299 containerd[1938]: 2025-09-09 05:35:28.592 [INFO][4702] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.74.65/26] IPv6=[] ContainerID="52f96fac661b018dbce8d684022863b70e45ae0a1f65809d859df744702ccf54" HandleID="k8s-pod-network.52f96fac661b018dbce8d684022863b70e45ae0a1f65809d859df744702ccf54" Workload="ip--172--31--26--176-k8s-whisker--958bb795b--bjqsh-eth0" Sep 9 05:35:28.709710 containerd[1938]: 2025-09-09 05:35:28.621 [INFO][4687] cni-plugin/k8s.go 418: Populated endpoint ContainerID="52f96fac661b018dbce8d684022863b70e45ae0a1f65809d859df744702ccf54" Namespace="calico-system" Pod="whisker-958bb795b-bjqsh" WorkloadEndpoint="ip--172--31--26--176-k8s-whisker--958bb795b--bjqsh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--176-k8s-whisker--958bb795b--bjqsh-eth0", GenerateName:"whisker-958bb795b-", Namespace:"calico-system", SelfLink:"", UID:"5db875a7-1adb-4132-ad56-051d1aa7a6db", ResourceVersion:"909", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 35, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"958bb795b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, 
Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-176", ContainerID:"", Pod:"whisker-958bb795b-bjqsh", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.74.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calic2ab00ff208", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:35:28.709710 containerd[1938]: 2025-09-09 05:35:28.621 [INFO][4687] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.74.65/32] ContainerID="52f96fac661b018dbce8d684022863b70e45ae0a1f65809d859df744702ccf54" Namespace="calico-system" Pod="whisker-958bb795b-bjqsh" WorkloadEndpoint="ip--172--31--26--176-k8s-whisker--958bb795b--bjqsh-eth0" Sep 9 05:35:28.709912 containerd[1938]: 2025-09-09 05:35:28.622 [INFO][4687] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic2ab00ff208 ContainerID="52f96fac661b018dbce8d684022863b70e45ae0a1f65809d859df744702ccf54" Namespace="calico-system" Pod="whisker-958bb795b-bjqsh" WorkloadEndpoint="ip--172--31--26--176-k8s-whisker--958bb795b--bjqsh-eth0" Sep 9 05:35:28.709912 containerd[1938]: 2025-09-09 05:35:28.648 [INFO][4687] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="52f96fac661b018dbce8d684022863b70e45ae0a1f65809d859df744702ccf54" Namespace="calico-system" Pod="whisker-958bb795b-bjqsh" WorkloadEndpoint="ip--172--31--26--176-k8s-whisker--958bb795b--bjqsh-eth0" Sep 9 05:35:28.710127 containerd[1938]: 2025-09-09 05:35:28.656 [INFO][4687] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="52f96fac661b018dbce8d684022863b70e45ae0a1f65809d859df744702ccf54" Namespace="calico-system" 
Pod="whisker-958bb795b-bjqsh" WorkloadEndpoint="ip--172--31--26--176-k8s-whisker--958bb795b--bjqsh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--176-k8s-whisker--958bb795b--bjqsh-eth0", GenerateName:"whisker-958bb795b-", Namespace:"calico-system", SelfLink:"", UID:"5db875a7-1adb-4132-ad56-051d1aa7a6db", ResourceVersion:"909", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 35, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"958bb795b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-176", ContainerID:"52f96fac661b018dbce8d684022863b70e45ae0a1f65809d859df744702ccf54", Pod:"whisker-958bb795b-bjqsh", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.74.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calic2ab00ff208", MAC:"ce:9c:ad:d3:06:15", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:35:28.710255 containerd[1938]: 2025-09-09 05:35:28.695 [INFO][4687] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="52f96fac661b018dbce8d684022863b70e45ae0a1f65809d859df744702ccf54" Namespace="calico-system" Pod="whisker-958bb795b-bjqsh" WorkloadEndpoint="ip--172--31--26--176-k8s-whisker--958bb795b--bjqsh-eth0" Sep 9 05:35:29.104146 containerd[1938]: 
time="2025-09-09T05:35:29.102772218Z" level=info msg="connecting to shim 52f96fac661b018dbce8d684022863b70e45ae0a1f65809d859df744702ccf54" address="unix:///run/containerd/s/3d7170cf3ac2ff6808eb37a248b8afbbc701de69a57a6fbefd552a71dd673166" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:35:29.159065 systemd[1]: Started cri-containerd-52f96fac661b018dbce8d684022863b70e45ae0a1f65809d859df744702ccf54.scope - libcontainer container 52f96fac661b018dbce8d684022863b70e45ae0a1f65809d859df744702ccf54. Sep 9 05:35:29.295592 containerd[1938]: time="2025-09-09T05:35:29.295539175Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-958bb795b-bjqsh,Uid:5db875a7-1adb-4132-ad56-051d1aa7a6db,Namespace:calico-system,Attempt:0,} returns sandbox id \"52f96fac661b018dbce8d684022863b70e45ae0a1f65809d859df744702ccf54\"" Sep 9 05:35:29.322319 containerd[1938]: time="2025-09-09T05:35:29.322269670Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 9 05:35:30.412644 systemd-networkd[1819]: calic2ab00ff208: Gained IPv6LL Sep 9 05:35:30.555305 containerd[1938]: time="2025-09-09T05:35:30.555253890Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:35:30.558426 containerd[1938]: time="2025-09-09T05:35:30.558366751Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 9 05:35:30.560390 containerd[1938]: time="2025-09-09T05:35:30.559315690Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:35:30.563562 containerd[1938]: time="2025-09-09T05:35:30.562975950Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" 
value:\"managed\"}" Sep 9 05:35:30.564195 containerd[1938]: time="2025-09-09T05:35:30.564166473Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 1.241838339s" Sep 9 05:35:30.564297 containerd[1938]: time="2025-09-09T05:35:30.564278601Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 9 05:35:30.574567 containerd[1938]: time="2025-09-09T05:35:30.574515991Z" level=info msg="CreateContainer within sandbox \"52f96fac661b018dbce8d684022863b70e45ae0a1f65809d859df744702ccf54\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 9 05:35:30.588498 containerd[1938]: time="2025-09-09T05:35:30.587182684Z" level=info msg="Container 0a9701db240a1e12f375e31a29d7b9b12948906aa0e31b901d066f74469b8776: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:35:30.601554 containerd[1938]: time="2025-09-09T05:35:30.601504843Z" level=info msg="CreateContainer within sandbox \"52f96fac661b018dbce8d684022863b70e45ae0a1f65809d859df744702ccf54\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"0a9701db240a1e12f375e31a29d7b9b12948906aa0e31b901d066f74469b8776\"" Sep 9 05:35:30.602098 containerd[1938]: time="2025-09-09T05:35:30.602041449Z" level=info msg="StartContainer for \"0a9701db240a1e12f375e31a29d7b9b12948906aa0e31b901d066f74469b8776\"" Sep 9 05:35:30.603409 containerd[1938]: time="2025-09-09T05:35:30.603377622Z" level=info msg="connecting to shim 0a9701db240a1e12f375e31a29d7b9b12948906aa0e31b901d066f74469b8776" address="unix:///run/containerd/s/3d7170cf3ac2ff6808eb37a248b8afbbc701de69a57a6fbefd552a71dd673166" protocol=ttrpc 
version=3 Sep 9 05:35:30.605739 systemd-networkd[1819]: vxlan.calico: Gained IPv6LL Sep 9 05:35:30.633709 systemd[1]: Started cri-containerd-0a9701db240a1e12f375e31a29d7b9b12948906aa0e31b901d066f74469b8776.scope - libcontainer container 0a9701db240a1e12f375e31a29d7b9b12948906aa0e31b901d066f74469b8776. Sep 9 05:35:30.690227 containerd[1938]: time="2025-09-09T05:35:30.690188651Z" level=info msg="StartContainer for \"0a9701db240a1e12f375e31a29d7b9b12948906aa0e31b901d066f74469b8776\" returns successfully" Sep 9 05:35:30.691994 containerd[1938]: time="2025-09-09T05:35:30.691721665Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 9 05:35:31.176143 containerd[1938]: time="2025-09-09T05:35:31.176092120Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-8hqlv,Uid:de52e7c0-304d-4417-9812-1ab2edc8f219,Namespace:kube-system,Attempt:0,}" Sep 9 05:35:31.317640 systemd-networkd[1819]: calid53eb3d52fa: Link UP Sep 9 05:35:31.317895 systemd-networkd[1819]: calid53eb3d52fa: Gained carrier Sep 9 05:35:31.339556 containerd[1938]: 2025-09-09 05:35:31.223 [INFO][4915] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--26--176-k8s-coredns--7c65d6cfc9--8hqlv-eth0 coredns-7c65d6cfc9- kube-system de52e7c0-304d-4417-9812-1ab2edc8f219 832 0 2025-09-09 05:34:52 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ip-172-31-26-176 coredns-7c65d6cfc9-8hqlv eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calid53eb3d52fa [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="d8d44b20586faf70c35bbfebe4e1834164fb6c935f81bb807db14c964b023ff1" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8hqlv" WorkloadEndpoint="ip--172--31--26--176-k8s-coredns--7c65d6cfc9--8hqlv-" Sep 9 05:35:31.339556 
containerd[1938]: 2025-09-09 05:35:31.223 [INFO][4915] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d8d44b20586faf70c35bbfebe4e1834164fb6c935f81bb807db14c964b023ff1" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8hqlv" WorkloadEndpoint="ip--172--31--26--176-k8s-coredns--7c65d6cfc9--8hqlv-eth0" Sep 9 05:35:31.339556 containerd[1938]: 2025-09-09 05:35:31.261 [INFO][4926] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d8d44b20586faf70c35bbfebe4e1834164fb6c935f81bb807db14c964b023ff1" HandleID="k8s-pod-network.d8d44b20586faf70c35bbfebe4e1834164fb6c935f81bb807db14c964b023ff1" Workload="ip--172--31--26--176-k8s-coredns--7c65d6cfc9--8hqlv-eth0" Sep 9 05:35:31.341313 containerd[1938]: 2025-09-09 05:35:31.262 [INFO][4926] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d8d44b20586faf70c35bbfebe4e1834164fb6c935f81bb807db14c964b023ff1" HandleID="k8s-pod-network.d8d44b20586faf70c35bbfebe4e1834164fb6c935f81bb807db14c964b023ff1" Workload="ip--172--31--26--176-k8s-coredns--7c65d6cfc9--8hqlv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5020), Attrs:map[string]string{"namespace":"kube-system", "node":"ip-172-31-26-176", "pod":"coredns-7c65d6cfc9-8hqlv", "timestamp":"2025-09-09 05:35:31.261419356 +0000 UTC"}, Hostname:"ip-172-31-26-176", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:35:31.341313 containerd[1938]: 2025-09-09 05:35:31.262 [INFO][4926] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:35:31.341313 containerd[1938]: 2025-09-09 05:35:31.262 [INFO][4926] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 05:35:31.341313 containerd[1938]: 2025-09-09 05:35:31.262 [INFO][4926] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-26-176' Sep 9 05:35:31.341313 containerd[1938]: 2025-09-09 05:35:31.271 [INFO][4926] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d8d44b20586faf70c35bbfebe4e1834164fb6c935f81bb807db14c964b023ff1" host="ip-172-31-26-176" Sep 9 05:35:31.341313 containerd[1938]: 2025-09-09 05:35:31.276 [INFO][4926] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-26-176" Sep 9 05:35:31.341313 containerd[1938]: 2025-09-09 05:35:31.282 [INFO][4926] ipam/ipam.go 511: Trying affinity for 192.168.74.64/26 host="ip-172-31-26-176" Sep 9 05:35:31.341313 containerd[1938]: 2025-09-09 05:35:31.284 [INFO][4926] ipam/ipam.go 158: Attempting to load block cidr=192.168.74.64/26 host="ip-172-31-26-176" Sep 9 05:35:31.341313 containerd[1938]: 2025-09-09 05:35:31.287 [INFO][4926] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.74.64/26 host="ip-172-31-26-176" Sep 9 05:35:31.343172 containerd[1938]: 2025-09-09 05:35:31.287 [INFO][4926] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.74.64/26 handle="k8s-pod-network.d8d44b20586faf70c35bbfebe4e1834164fb6c935f81bb807db14c964b023ff1" host="ip-172-31-26-176" Sep 9 05:35:31.343172 containerd[1938]: 2025-09-09 05:35:31.290 [INFO][4926] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d8d44b20586faf70c35bbfebe4e1834164fb6c935f81bb807db14c964b023ff1 Sep 9 05:35:31.343172 containerd[1938]: 2025-09-09 05:35:31.302 [INFO][4926] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.74.64/26 handle="k8s-pod-network.d8d44b20586faf70c35bbfebe4e1834164fb6c935f81bb807db14c964b023ff1" host="ip-172-31-26-176" Sep 9 05:35:31.343172 containerd[1938]: 2025-09-09 05:35:31.308 [INFO][4926] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.74.66/26] block=192.168.74.64/26 
handle="k8s-pod-network.d8d44b20586faf70c35bbfebe4e1834164fb6c935f81bb807db14c964b023ff1" host="ip-172-31-26-176" Sep 9 05:35:31.343172 containerd[1938]: 2025-09-09 05:35:31.308 [INFO][4926] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.74.66/26] handle="k8s-pod-network.d8d44b20586faf70c35bbfebe4e1834164fb6c935f81bb807db14c964b023ff1" host="ip-172-31-26-176" Sep 9 05:35:31.343172 containerd[1938]: 2025-09-09 05:35:31.308 [INFO][4926] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 05:35:31.343172 containerd[1938]: 2025-09-09 05:35:31.309 [INFO][4926] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.74.66/26] IPv6=[] ContainerID="d8d44b20586faf70c35bbfebe4e1834164fb6c935f81bb807db14c964b023ff1" HandleID="k8s-pod-network.d8d44b20586faf70c35bbfebe4e1834164fb6c935f81bb807db14c964b023ff1" Workload="ip--172--31--26--176-k8s-coredns--7c65d6cfc9--8hqlv-eth0" Sep 9 05:35:31.343663 containerd[1938]: 2025-09-09 05:35:31.313 [INFO][4915] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d8d44b20586faf70c35bbfebe4e1834164fb6c935f81bb807db14c964b023ff1" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8hqlv" WorkloadEndpoint="ip--172--31--26--176-k8s-coredns--7c65d6cfc9--8hqlv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--176-k8s-coredns--7c65d6cfc9--8hqlv-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"de52e7c0-304d-4417-9812-1ab2edc8f219", ResourceVersion:"832", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 34, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-176", ContainerID:"", Pod:"coredns-7c65d6cfc9-8hqlv", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.74.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid53eb3d52fa", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:35:31.343663 containerd[1938]: 2025-09-09 05:35:31.313 [INFO][4915] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.74.66/32] ContainerID="d8d44b20586faf70c35bbfebe4e1834164fb6c935f81bb807db14c964b023ff1" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8hqlv" WorkloadEndpoint="ip--172--31--26--176-k8s-coredns--7c65d6cfc9--8hqlv-eth0" Sep 9 05:35:31.343663 containerd[1938]: 2025-09-09 05:35:31.313 [INFO][4915] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid53eb3d52fa ContainerID="d8d44b20586faf70c35bbfebe4e1834164fb6c935f81bb807db14c964b023ff1" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8hqlv" WorkloadEndpoint="ip--172--31--26--176-k8s-coredns--7c65d6cfc9--8hqlv-eth0" Sep 9 05:35:31.343663 containerd[1938]: 2025-09-09 05:35:31.316 [INFO][4915] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d8d44b20586faf70c35bbfebe4e1834164fb6c935f81bb807db14c964b023ff1" Namespace="kube-system" 
Pod="coredns-7c65d6cfc9-8hqlv" WorkloadEndpoint="ip--172--31--26--176-k8s-coredns--7c65d6cfc9--8hqlv-eth0" Sep 9 05:35:31.343663 containerd[1938]: 2025-09-09 05:35:31.316 [INFO][4915] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d8d44b20586faf70c35bbfebe4e1834164fb6c935f81bb807db14c964b023ff1" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8hqlv" WorkloadEndpoint="ip--172--31--26--176-k8s-coredns--7c65d6cfc9--8hqlv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--176-k8s-coredns--7c65d6cfc9--8hqlv-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"de52e7c0-304d-4417-9812-1ab2edc8f219", ResourceVersion:"832", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 34, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-176", ContainerID:"d8d44b20586faf70c35bbfebe4e1834164fb6c935f81bb807db14c964b023ff1", Pod:"coredns-7c65d6cfc9-8hqlv", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.74.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid53eb3d52fa", MAC:"b2:53:9c:92:4b:a7", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, 
v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:35:31.343663 containerd[1938]: 2025-09-09 05:35:31.335 [INFO][4915] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d8d44b20586faf70c35bbfebe4e1834164fb6c935f81bb807db14c964b023ff1" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8hqlv" WorkloadEndpoint="ip--172--31--26--176-k8s-coredns--7c65d6cfc9--8hqlv-eth0" Sep 9 05:35:31.387338 containerd[1938]: time="2025-09-09T05:35:31.387262601Z" level=info msg="connecting to shim d8d44b20586faf70c35bbfebe4e1834164fb6c935f81bb807db14c964b023ff1" address="unix:///run/containerd/s/02ac6794f05992bb40cc51867394dae31dff015731fc73c6347e9d0440ae3d34" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:35:31.425277 systemd[1]: Started cri-containerd-d8d44b20586faf70c35bbfebe4e1834164fb6c935f81bb807db14c964b023ff1.scope - libcontainer container d8d44b20586faf70c35bbfebe4e1834164fb6c935f81bb807db14c964b023ff1. 
Sep 9 05:35:31.480144 containerd[1938]: time="2025-09-09T05:35:31.480107341Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-8hqlv,Uid:de52e7c0-304d-4417-9812-1ab2edc8f219,Namespace:kube-system,Attempt:0,} returns sandbox id \"d8d44b20586faf70c35bbfebe4e1834164fb6c935f81bb807db14c964b023ff1\"" Sep 9 05:35:31.484539 containerd[1938]: time="2025-09-09T05:35:31.484459682Z" level=info msg="CreateContainer within sandbox \"d8d44b20586faf70c35bbfebe4e1834164fb6c935f81bb807db14c964b023ff1\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 9 05:35:31.511684 containerd[1938]: time="2025-09-09T05:35:31.511634051Z" level=info msg="Container 9d14a626302e0c1c929c2d285bd306dc71fbefd33310a12de75f8445cd70b74d: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:35:31.523041 containerd[1938]: time="2025-09-09T05:35:31.522987931Z" level=info msg="CreateContainer within sandbox \"d8d44b20586faf70c35bbfebe4e1834164fb6c935f81bb807db14c964b023ff1\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"9d14a626302e0c1c929c2d285bd306dc71fbefd33310a12de75f8445cd70b74d\"" Sep 9 05:35:31.523711 containerd[1938]: time="2025-09-09T05:35:31.523665345Z" level=info msg="StartContainer for \"9d14a626302e0c1c929c2d285bd306dc71fbefd33310a12de75f8445cd70b74d\"" Sep 9 05:35:31.525460 containerd[1938]: time="2025-09-09T05:35:31.525412338Z" level=info msg="connecting to shim 9d14a626302e0c1c929c2d285bd306dc71fbefd33310a12de75f8445cd70b74d" address="unix:///run/containerd/s/02ac6794f05992bb40cc51867394dae31dff015731fc73c6347e9d0440ae3d34" protocol=ttrpc version=3 Sep 9 05:35:31.549965 systemd[1]: Started cri-containerd-9d14a626302e0c1c929c2d285bd306dc71fbefd33310a12de75f8445cd70b74d.scope - libcontainer container 9d14a626302e0c1c929c2d285bd306dc71fbefd33310a12de75f8445cd70b74d. 
Sep 9 05:35:31.605002 containerd[1938]: time="2025-09-09T05:35:31.604900569Z" level=info msg="StartContainer for \"9d14a626302e0c1c929c2d285bd306dc71fbefd33310a12de75f8445cd70b74d\" returns successfully" Sep 9 05:35:32.179198 containerd[1938]: time="2025-09-09T05:35:32.178526242Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-77545c669c-rfntj,Uid:ff947e5b-fad0-4247-a8d1-9a94d6a84b62,Namespace:calico-apiserver,Attempt:0,}" Sep 9 05:35:32.205981 containerd[1938]: time="2025-09-09T05:35:32.205940784Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-77545c669c-j4l25,Uid:db0b27da-0f3a-412e-b1ae-6d3b86e6ab56,Namespace:calico-apiserver,Attempt:0,}" Sep 9 05:35:32.206969 containerd[1938]: time="2025-09-09T05:35:32.206935550Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zk5mx,Uid:72b2ef00-e3c5-4650-afc2-ad083fef1486,Namespace:calico-system,Attempt:0,}" Sep 9 05:35:32.211847 containerd[1938]: time="2025-09-09T05:35:32.211575704Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-pbtr9,Uid:bcd056d1-4179-4c9a-b7fd-1754e236f3c1,Namespace:calico-system,Attempt:0,}" Sep 9 05:35:32.606893 systemd-networkd[1819]: cali749bade64bd: Link UP Sep 9 05:35:32.608470 systemd-networkd[1819]: cali749bade64bd: Gained carrier Sep 9 05:35:32.685879 containerd[1938]: 2025-09-09 05:35:32.320 [INFO][5019] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--26--176-k8s-calico--apiserver--77545c669c--rfntj-eth0 calico-apiserver-77545c669c- calico-apiserver ff947e5b-fad0-4247-a8d1-9a94d6a84b62 835 0 2025-09-09 05:35:04 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:77545c669c projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ip-172-31-26-176 
calico-apiserver-77545c669c-rfntj eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali749bade64bd [] [] }} ContainerID="3084307e3950570fa36f1dc74cc8e13e1dc5927da79745289bb18baf0ca2c204" Namespace="calico-apiserver" Pod="calico-apiserver-77545c669c-rfntj" WorkloadEndpoint="ip--172--31--26--176-k8s-calico--apiserver--77545c669c--rfntj-" Sep 9 05:35:32.685879 containerd[1938]: 2025-09-09 05:35:32.321 [INFO][5019] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3084307e3950570fa36f1dc74cc8e13e1dc5927da79745289bb18baf0ca2c204" Namespace="calico-apiserver" Pod="calico-apiserver-77545c669c-rfntj" WorkloadEndpoint="ip--172--31--26--176-k8s-calico--apiserver--77545c669c--rfntj-eth0" Sep 9 05:35:32.685879 containerd[1938]: 2025-09-09 05:35:32.467 [INFO][5055] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3084307e3950570fa36f1dc74cc8e13e1dc5927da79745289bb18baf0ca2c204" HandleID="k8s-pod-network.3084307e3950570fa36f1dc74cc8e13e1dc5927da79745289bb18baf0ca2c204" Workload="ip--172--31--26--176-k8s-calico--apiserver--77545c669c--rfntj-eth0" Sep 9 05:35:32.685879 containerd[1938]: 2025-09-09 05:35:32.467 [INFO][5055] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="3084307e3950570fa36f1dc74cc8e13e1dc5927da79745289bb18baf0ca2c204" HandleID="k8s-pod-network.3084307e3950570fa36f1dc74cc8e13e1dc5927da79745289bb18baf0ca2c204" Workload="ip--172--31--26--176-k8s-calico--apiserver--77545c669c--rfntj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003159c0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ip-172-31-26-176", "pod":"calico-apiserver-77545c669c-rfntj", "timestamp":"2025-09-09 05:35:32.467597168 +0000 UTC"}, Hostname:"ip-172-31-26-176", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), 
IntendedUse:"Workload"} Sep 9 05:35:32.685879 containerd[1938]: 2025-09-09 05:35:32.468 [INFO][5055] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:35:32.685879 containerd[1938]: 2025-09-09 05:35:32.468 [INFO][5055] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 05:35:32.685879 containerd[1938]: 2025-09-09 05:35:32.468 [INFO][5055] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-26-176' Sep 9 05:35:32.685879 containerd[1938]: 2025-09-09 05:35:32.485 [INFO][5055] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.3084307e3950570fa36f1dc74cc8e13e1dc5927da79745289bb18baf0ca2c204" host="ip-172-31-26-176" Sep 9 05:35:32.685879 containerd[1938]: 2025-09-09 05:35:32.511 [INFO][5055] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-26-176" Sep 9 05:35:32.685879 containerd[1938]: 2025-09-09 05:35:32.524 [INFO][5055] ipam/ipam.go 511: Trying affinity for 192.168.74.64/26 host="ip-172-31-26-176" Sep 9 05:35:32.685879 containerd[1938]: 2025-09-09 05:35:32.528 [INFO][5055] ipam/ipam.go 158: Attempting to load block cidr=192.168.74.64/26 host="ip-172-31-26-176" Sep 9 05:35:32.685879 containerd[1938]: 2025-09-09 05:35:32.532 [INFO][5055] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.74.64/26 host="ip-172-31-26-176" Sep 9 05:35:32.685879 containerd[1938]: 2025-09-09 05:35:32.532 [INFO][5055] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.74.64/26 handle="k8s-pod-network.3084307e3950570fa36f1dc74cc8e13e1dc5927da79745289bb18baf0ca2c204" host="ip-172-31-26-176" Sep 9 05:35:32.685879 containerd[1938]: 2025-09-09 05:35:32.535 [INFO][5055] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.3084307e3950570fa36f1dc74cc8e13e1dc5927da79745289bb18baf0ca2c204 Sep 9 05:35:32.685879 containerd[1938]: 2025-09-09 05:35:32.552 [INFO][5055] ipam/ipam.go 1243: Writing block in order to claim IPs 
block=192.168.74.64/26 handle="k8s-pod-network.3084307e3950570fa36f1dc74cc8e13e1dc5927da79745289bb18baf0ca2c204" host="ip-172-31-26-176" Sep 9 05:35:32.685879 containerd[1938]: 2025-09-09 05:35:32.567 [INFO][5055] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.74.67/26] block=192.168.74.64/26 handle="k8s-pod-network.3084307e3950570fa36f1dc74cc8e13e1dc5927da79745289bb18baf0ca2c204" host="ip-172-31-26-176" Sep 9 05:35:32.685879 containerd[1938]: 2025-09-09 05:35:32.567 [INFO][5055] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.74.67/26] handle="k8s-pod-network.3084307e3950570fa36f1dc74cc8e13e1dc5927da79745289bb18baf0ca2c204" host="ip-172-31-26-176" Sep 9 05:35:32.685879 containerd[1938]: 2025-09-09 05:35:32.567 [INFO][5055] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 05:35:32.685879 containerd[1938]: 2025-09-09 05:35:32.567 [INFO][5055] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.74.67/26] IPv6=[] ContainerID="3084307e3950570fa36f1dc74cc8e13e1dc5927da79745289bb18baf0ca2c204" HandleID="k8s-pod-network.3084307e3950570fa36f1dc74cc8e13e1dc5927da79745289bb18baf0ca2c204" Workload="ip--172--31--26--176-k8s-calico--apiserver--77545c669c--rfntj-eth0" Sep 9 05:35:32.687792 containerd[1938]: 2025-09-09 05:35:32.599 [INFO][5019] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3084307e3950570fa36f1dc74cc8e13e1dc5927da79745289bb18baf0ca2c204" Namespace="calico-apiserver" Pod="calico-apiserver-77545c669c-rfntj" WorkloadEndpoint="ip--172--31--26--176-k8s-calico--apiserver--77545c669c--rfntj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--176-k8s-calico--apiserver--77545c669c--rfntj-eth0", GenerateName:"calico-apiserver-77545c669c-", Namespace:"calico-apiserver", SelfLink:"", UID:"ff947e5b-fad0-4247-a8d1-9a94d6a84b62", ResourceVersion:"835", Generation:0, CreationTimestamp:time.Date(2025, 
time.September, 9, 5, 35, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"77545c669c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-176", ContainerID:"", Pod:"calico-apiserver-77545c669c-rfntj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.74.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali749bade64bd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:35:32.687792 containerd[1938]: 2025-09-09 05:35:32.600 [INFO][5019] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.74.67/32] ContainerID="3084307e3950570fa36f1dc74cc8e13e1dc5927da79745289bb18baf0ca2c204" Namespace="calico-apiserver" Pod="calico-apiserver-77545c669c-rfntj" WorkloadEndpoint="ip--172--31--26--176-k8s-calico--apiserver--77545c669c--rfntj-eth0" Sep 9 05:35:32.687792 containerd[1938]: 2025-09-09 05:35:32.600 [INFO][5019] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali749bade64bd ContainerID="3084307e3950570fa36f1dc74cc8e13e1dc5927da79745289bb18baf0ca2c204" Namespace="calico-apiserver" Pod="calico-apiserver-77545c669c-rfntj" WorkloadEndpoint="ip--172--31--26--176-k8s-calico--apiserver--77545c669c--rfntj-eth0" Sep 9 05:35:32.687792 containerd[1938]: 2025-09-09 05:35:32.612 [INFO][5019] cni-plugin/dataplane_linux.go 508: Disabling IPv4 
forwarding ContainerID="3084307e3950570fa36f1dc74cc8e13e1dc5927da79745289bb18baf0ca2c204" Namespace="calico-apiserver" Pod="calico-apiserver-77545c669c-rfntj" WorkloadEndpoint="ip--172--31--26--176-k8s-calico--apiserver--77545c669c--rfntj-eth0" Sep 9 05:35:32.687792 containerd[1938]: 2025-09-09 05:35:32.615 [INFO][5019] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="3084307e3950570fa36f1dc74cc8e13e1dc5927da79745289bb18baf0ca2c204" Namespace="calico-apiserver" Pod="calico-apiserver-77545c669c-rfntj" WorkloadEndpoint="ip--172--31--26--176-k8s-calico--apiserver--77545c669c--rfntj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--176-k8s-calico--apiserver--77545c669c--rfntj-eth0", GenerateName:"calico-apiserver-77545c669c-", Namespace:"calico-apiserver", SelfLink:"", UID:"ff947e5b-fad0-4247-a8d1-9a94d6a84b62", ResourceVersion:"835", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 35, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"77545c669c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-176", ContainerID:"3084307e3950570fa36f1dc74cc8e13e1dc5927da79745289bb18baf0ca2c204", Pod:"calico-apiserver-77545c669c-rfntj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.74.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali749bade64bd", MAC:"aa:ab:0f:5a:b7:2f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:35:32.687792 containerd[1938]: 2025-09-09 05:35:32.652 [INFO][5019] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3084307e3950570fa36f1dc74cc8e13e1dc5927da79745289bb18baf0ca2c204" Namespace="calico-apiserver" Pod="calico-apiserver-77545c669c-rfntj" WorkloadEndpoint="ip--172--31--26--176-k8s-calico--apiserver--77545c669c--rfntj-eth0" Sep 9 05:35:32.793456 systemd-networkd[1819]: cali83d9c0e70a4: Link UP Sep 9 05:35:32.795017 systemd-networkd[1819]: cali83d9c0e70a4: Gained carrier Sep 9 05:35:32.885890 containerd[1938]: 2025-09-09 05:35:32.351 [INFO][5027] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--26--176-k8s-calico--apiserver--77545c669c--j4l25-eth0 calico-apiserver-77545c669c- calico-apiserver db0b27da-0f3a-412e-b1ae-6d3b86e6ab56 837 0 2025-09-09 05:35:04 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:77545c669c projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ip-172-31-26-176 calico-apiserver-77545c669c-j4l25 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali83d9c0e70a4 [] [] }} ContainerID="5d05df6e112bc165cae2d0637725e0ebf2d7f855685d51bdeccb5379cb0581cf" Namespace="calico-apiserver" Pod="calico-apiserver-77545c669c-j4l25" WorkloadEndpoint="ip--172--31--26--176-k8s-calico--apiserver--77545c669c--j4l25-" Sep 9 05:35:32.885890 containerd[1938]: 2025-09-09 05:35:32.351 [INFO][5027] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="5d05df6e112bc165cae2d0637725e0ebf2d7f855685d51bdeccb5379cb0581cf" Namespace="calico-apiserver" Pod="calico-apiserver-77545c669c-j4l25" WorkloadEndpoint="ip--172--31--26--176-k8s-calico--apiserver--77545c669c--j4l25-eth0" Sep 9 05:35:32.885890 containerd[1938]: 2025-09-09 05:35:32.497 [INFO][5071] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5d05df6e112bc165cae2d0637725e0ebf2d7f855685d51bdeccb5379cb0581cf" HandleID="k8s-pod-network.5d05df6e112bc165cae2d0637725e0ebf2d7f855685d51bdeccb5379cb0581cf" Workload="ip--172--31--26--176-k8s-calico--apiserver--77545c669c--j4l25-eth0" Sep 9 05:35:32.885890 containerd[1938]: 2025-09-09 05:35:32.497 [INFO][5071] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5d05df6e112bc165cae2d0637725e0ebf2d7f855685d51bdeccb5379cb0581cf" HandleID="k8s-pod-network.5d05df6e112bc165cae2d0637725e0ebf2d7f855685d51bdeccb5379cb0581cf" Workload="ip--172--31--26--176-k8s-calico--apiserver--77545c669c--j4l25-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003d2030), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ip-172-31-26-176", "pod":"calico-apiserver-77545c669c-j4l25", "timestamp":"2025-09-09 05:35:32.497170335 +0000 UTC"}, Hostname:"ip-172-31-26-176", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:35:32.885890 containerd[1938]: 2025-09-09 05:35:32.497 [INFO][5071] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:35:32.885890 containerd[1938]: 2025-09-09 05:35:32.568 [INFO][5071] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 05:35:32.885890 containerd[1938]: 2025-09-09 05:35:32.568 [INFO][5071] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-26-176' Sep 9 05:35:32.885890 containerd[1938]: 2025-09-09 05:35:32.601 [INFO][5071] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5d05df6e112bc165cae2d0637725e0ebf2d7f855685d51bdeccb5379cb0581cf" host="ip-172-31-26-176" Sep 9 05:35:32.885890 containerd[1938]: 2025-09-09 05:35:32.648 [INFO][5071] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-26-176" Sep 9 05:35:32.885890 containerd[1938]: 2025-09-09 05:35:32.688 [INFO][5071] ipam/ipam.go 511: Trying affinity for 192.168.74.64/26 host="ip-172-31-26-176" Sep 9 05:35:32.885890 containerd[1938]: 2025-09-09 05:35:32.697 [INFO][5071] ipam/ipam.go 158: Attempting to load block cidr=192.168.74.64/26 host="ip-172-31-26-176" Sep 9 05:35:32.885890 containerd[1938]: 2025-09-09 05:35:32.704 [INFO][5071] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.74.64/26 host="ip-172-31-26-176" Sep 9 05:35:32.885890 containerd[1938]: 2025-09-09 05:35:32.704 [INFO][5071] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.74.64/26 handle="k8s-pod-network.5d05df6e112bc165cae2d0637725e0ebf2d7f855685d51bdeccb5379cb0581cf" host="ip-172-31-26-176" Sep 9 05:35:32.885890 containerd[1938]: 2025-09-09 05:35:32.718 [INFO][5071] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.5d05df6e112bc165cae2d0637725e0ebf2d7f855685d51bdeccb5379cb0581cf Sep 9 05:35:32.885890 containerd[1938]: 2025-09-09 05:35:32.730 [INFO][5071] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.74.64/26 handle="k8s-pod-network.5d05df6e112bc165cae2d0637725e0ebf2d7f855685d51bdeccb5379cb0581cf" host="ip-172-31-26-176" Sep 9 05:35:32.885890 containerd[1938]: 2025-09-09 05:35:32.752 [INFO][5071] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.74.68/26] block=192.168.74.64/26 
handle="k8s-pod-network.5d05df6e112bc165cae2d0637725e0ebf2d7f855685d51bdeccb5379cb0581cf" host="ip-172-31-26-176" Sep 9 05:35:32.885890 containerd[1938]: 2025-09-09 05:35:32.753 [INFO][5071] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.74.68/26] handle="k8s-pod-network.5d05df6e112bc165cae2d0637725e0ebf2d7f855685d51bdeccb5379cb0581cf" host="ip-172-31-26-176" Sep 9 05:35:32.885890 containerd[1938]: 2025-09-09 05:35:32.753 [INFO][5071] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 05:35:32.885890 containerd[1938]: 2025-09-09 05:35:32.753 [INFO][5071] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.74.68/26] IPv6=[] ContainerID="5d05df6e112bc165cae2d0637725e0ebf2d7f855685d51bdeccb5379cb0581cf" HandleID="k8s-pod-network.5d05df6e112bc165cae2d0637725e0ebf2d7f855685d51bdeccb5379cb0581cf" Workload="ip--172--31--26--176-k8s-calico--apiserver--77545c669c--j4l25-eth0" Sep 9 05:35:32.889918 containerd[1938]: 2025-09-09 05:35:32.758 [INFO][5027] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5d05df6e112bc165cae2d0637725e0ebf2d7f855685d51bdeccb5379cb0581cf" Namespace="calico-apiserver" Pod="calico-apiserver-77545c669c-j4l25" WorkloadEndpoint="ip--172--31--26--176-k8s-calico--apiserver--77545c669c--j4l25-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--176-k8s-calico--apiserver--77545c669c--j4l25-eth0", GenerateName:"calico-apiserver-77545c669c-", Namespace:"calico-apiserver", SelfLink:"", UID:"db0b27da-0f3a-412e-b1ae-6d3b86e6ab56", ResourceVersion:"837", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 35, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"77545c669c", 
"projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-176", ContainerID:"", Pod:"calico-apiserver-77545c669c-j4l25", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.74.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali83d9c0e70a4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:35:32.889918 containerd[1938]: 2025-09-09 05:35:32.759 [INFO][5027] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.74.68/32] ContainerID="5d05df6e112bc165cae2d0637725e0ebf2d7f855685d51bdeccb5379cb0581cf" Namespace="calico-apiserver" Pod="calico-apiserver-77545c669c-j4l25" WorkloadEndpoint="ip--172--31--26--176-k8s-calico--apiserver--77545c669c--j4l25-eth0" Sep 9 05:35:32.889918 containerd[1938]: 2025-09-09 05:35:32.759 [INFO][5027] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali83d9c0e70a4 ContainerID="5d05df6e112bc165cae2d0637725e0ebf2d7f855685d51bdeccb5379cb0581cf" Namespace="calico-apiserver" Pod="calico-apiserver-77545c669c-j4l25" WorkloadEndpoint="ip--172--31--26--176-k8s-calico--apiserver--77545c669c--j4l25-eth0" Sep 9 05:35:32.889918 containerd[1938]: 2025-09-09 05:35:32.808 [INFO][5027] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5d05df6e112bc165cae2d0637725e0ebf2d7f855685d51bdeccb5379cb0581cf" Namespace="calico-apiserver" Pod="calico-apiserver-77545c669c-j4l25" WorkloadEndpoint="ip--172--31--26--176-k8s-calico--apiserver--77545c669c--j4l25-eth0" Sep 9 05:35:32.889918 
containerd[1938]: 2025-09-09 05:35:32.823 [INFO][5027] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5d05df6e112bc165cae2d0637725e0ebf2d7f855685d51bdeccb5379cb0581cf" Namespace="calico-apiserver" Pod="calico-apiserver-77545c669c-j4l25" WorkloadEndpoint="ip--172--31--26--176-k8s-calico--apiserver--77545c669c--j4l25-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--176-k8s-calico--apiserver--77545c669c--j4l25-eth0", GenerateName:"calico-apiserver-77545c669c-", Namespace:"calico-apiserver", SelfLink:"", UID:"db0b27da-0f3a-412e-b1ae-6d3b86e6ab56", ResourceVersion:"837", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 35, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"77545c669c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-176", ContainerID:"5d05df6e112bc165cae2d0637725e0ebf2d7f855685d51bdeccb5379cb0581cf", Pod:"calico-apiserver-77545c669c-j4l25", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.74.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali83d9c0e70a4", MAC:"6e:0c:05:e3:0d:6a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:35:32.889918 
containerd[1938]: 2025-09-09 05:35:32.877 [INFO][5027] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5d05df6e112bc165cae2d0637725e0ebf2d7f855685d51bdeccb5379cb0581cf" Namespace="calico-apiserver" Pod="calico-apiserver-77545c669c-j4l25" WorkloadEndpoint="ip--172--31--26--176-k8s-calico--apiserver--77545c669c--j4l25-eth0" Sep 9 05:35:32.921510 containerd[1938]: time="2025-09-09T05:35:32.919512410Z" level=info msg="connecting to shim 3084307e3950570fa36f1dc74cc8e13e1dc5927da79745289bb18baf0ca2c204" address="unix:///run/containerd/s/0299492c4431792736511bb38303df498c07e11f58fc443fac051282ebecc9b2" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:35:32.964347 kubelet[3331]: I0909 05:35:32.929293 3331 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-8hqlv" podStartSLOduration=40.927079879 podStartE2EDuration="40.927079879s" podCreationTimestamp="2025-09-09 05:34:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 05:35:32.901964636 +0000 UTC m=+46.862261300" watchObservedRunningTime="2025-09-09 05:35:32.927079879 +0000 UTC m=+46.887376542" Sep 9 05:35:33.059800 containerd[1938]: time="2025-09-09T05:35:33.059747337Z" level=info msg="connecting to shim 5d05df6e112bc165cae2d0637725e0ebf2d7f855685d51bdeccb5379cb0581cf" address="unix:///run/containerd/s/ecf3af09ec8c357967b639643080ecc45164dd0b368476c910a6a8325350c4a6" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:35:33.100761 systemd-networkd[1819]: calid53eb3d52fa: Gained IPv6LL Sep 9 05:35:33.134725 systemd-networkd[1819]: cali39b07abc9fc: Link UP Sep 9 05:35:33.139721 systemd[1]: Started cri-containerd-3084307e3950570fa36f1dc74cc8e13e1dc5927da79745289bb18baf0ca2c204.scope - libcontainer container 3084307e3950570fa36f1dc74cc8e13e1dc5927da79745289bb18baf0ca2c204. 
Sep 9 05:35:33.143570 systemd-networkd[1819]: cali39b07abc9fc: Gained carrier Sep 9 05:35:33.223491 containerd[1938]: 2025-09-09 05:35:32.440 [INFO][5045] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--26--176-k8s-csi--node--driver--zk5mx-eth0 csi-node-driver- calico-system 72b2ef00-e3c5-4650-afc2-ad083fef1486 721 0 2025-09-09 05:35:09 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:856c6b598f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ip-172-31-26-176 csi-node-driver-zk5mx eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali39b07abc9fc [] [] }} ContainerID="32b658285114bb88e7e301d5944c9390ea91d450a84b25e59d7b35e63dba0757" Namespace="calico-system" Pod="csi-node-driver-zk5mx" WorkloadEndpoint="ip--172--31--26--176-k8s-csi--node--driver--zk5mx-" Sep 9 05:35:33.223491 containerd[1938]: 2025-09-09 05:35:32.440 [INFO][5045] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="32b658285114bb88e7e301d5944c9390ea91d450a84b25e59d7b35e63dba0757" Namespace="calico-system" Pod="csi-node-driver-zk5mx" WorkloadEndpoint="ip--172--31--26--176-k8s-csi--node--driver--zk5mx-eth0" Sep 9 05:35:33.223491 containerd[1938]: 2025-09-09 05:35:32.709 [INFO][5080] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="32b658285114bb88e7e301d5944c9390ea91d450a84b25e59d7b35e63dba0757" HandleID="k8s-pod-network.32b658285114bb88e7e301d5944c9390ea91d450a84b25e59d7b35e63dba0757" Workload="ip--172--31--26--176-k8s-csi--node--driver--zk5mx-eth0" Sep 9 05:35:33.223491 containerd[1938]: 2025-09-09 05:35:32.715 [INFO][5080] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="32b658285114bb88e7e301d5944c9390ea91d450a84b25e59d7b35e63dba0757" 
HandleID="k8s-pod-network.32b658285114bb88e7e301d5944c9390ea91d450a84b25e59d7b35e63dba0757" Workload="ip--172--31--26--176-k8s-csi--node--driver--zk5mx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00031d660), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-26-176", "pod":"csi-node-driver-zk5mx", "timestamp":"2025-09-09 05:35:32.709789811 +0000 UTC"}, Hostname:"ip-172-31-26-176", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:35:33.223491 containerd[1938]: 2025-09-09 05:35:32.715 [INFO][5080] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:35:33.223491 containerd[1938]: 2025-09-09 05:35:32.754 [INFO][5080] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 05:35:33.223491 containerd[1938]: 2025-09-09 05:35:32.754 [INFO][5080] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-26-176' Sep 9 05:35:33.223491 containerd[1938]: 2025-09-09 05:35:32.788 [INFO][5080] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.32b658285114bb88e7e301d5944c9390ea91d450a84b25e59d7b35e63dba0757" host="ip-172-31-26-176" Sep 9 05:35:33.223491 containerd[1938]: 2025-09-09 05:35:32.821 [INFO][5080] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-26-176" Sep 9 05:35:33.223491 containerd[1938]: 2025-09-09 05:35:32.931 [INFO][5080] ipam/ipam.go 511: Trying affinity for 192.168.74.64/26 host="ip-172-31-26-176" Sep 9 05:35:33.223491 containerd[1938]: 2025-09-09 05:35:32.950 [INFO][5080] ipam/ipam.go 158: Attempting to load block cidr=192.168.74.64/26 host="ip-172-31-26-176" Sep 9 05:35:33.223491 containerd[1938]: 2025-09-09 05:35:32.970 [INFO][5080] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.74.64/26 host="ip-172-31-26-176" Sep 9 
05:35:33.223491 containerd[1938]: 2025-09-09 05:35:32.976 [INFO][5080] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.74.64/26 handle="k8s-pod-network.32b658285114bb88e7e301d5944c9390ea91d450a84b25e59d7b35e63dba0757" host="ip-172-31-26-176" Sep 9 05:35:33.223491 containerd[1938]: 2025-09-09 05:35:32.983 [INFO][5080] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.32b658285114bb88e7e301d5944c9390ea91d450a84b25e59d7b35e63dba0757 Sep 9 05:35:33.223491 containerd[1938]: 2025-09-09 05:35:33.022 [INFO][5080] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.74.64/26 handle="k8s-pod-network.32b658285114bb88e7e301d5944c9390ea91d450a84b25e59d7b35e63dba0757" host="ip-172-31-26-176" Sep 9 05:35:33.223491 containerd[1938]: 2025-09-09 05:35:33.074 [INFO][5080] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.74.69/26] block=192.168.74.64/26 handle="k8s-pod-network.32b658285114bb88e7e301d5944c9390ea91d450a84b25e59d7b35e63dba0757" host="ip-172-31-26-176" Sep 9 05:35:33.223491 containerd[1938]: 2025-09-09 05:35:33.074 [INFO][5080] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.74.69/26] handle="k8s-pod-network.32b658285114bb88e7e301d5944c9390ea91d450a84b25e59d7b35e63dba0757" host="ip-172-31-26-176" Sep 9 05:35:33.223491 containerd[1938]: 2025-09-09 05:35:33.075 [INFO][5080] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 05:35:33.223491 containerd[1938]: 2025-09-09 05:35:33.075 [INFO][5080] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.74.69/26] IPv6=[] ContainerID="32b658285114bb88e7e301d5944c9390ea91d450a84b25e59d7b35e63dba0757" HandleID="k8s-pod-network.32b658285114bb88e7e301d5944c9390ea91d450a84b25e59d7b35e63dba0757" Workload="ip--172--31--26--176-k8s-csi--node--driver--zk5mx-eth0" Sep 9 05:35:33.224401 containerd[1938]: 2025-09-09 05:35:33.086 [INFO][5045] cni-plugin/k8s.go 418: Populated endpoint ContainerID="32b658285114bb88e7e301d5944c9390ea91d450a84b25e59d7b35e63dba0757" Namespace="calico-system" Pod="csi-node-driver-zk5mx" WorkloadEndpoint="ip--172--31--26--176-k8s-csi--node--driver--zk5mx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--176-k8s-csi--node--driver--zk5mx-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"72b2ef00-e3c5-4650-afc2-ad083fef1486", ResourceVersion:"721", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 35, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-176", ContainerID:"", Pod:"csi-node-driver-zk5mx", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.74.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali39b07abc9fc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:35:33.224401 containerd[1938]: 2025-09-09 05:35:33.087 [INFO][5045] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.74.69/32] ContainerID="32b658285114bb88e7e301d5944c9390ea91d450a84b25e59d7b35e63dba0757" Namespace="calico-system" Pod="csi-node-driver-zk5mx" WorkloadEndpoint="ip--172--31--26--176-k8s-csi--node--driver--zk5mx-eth0" Sep 9 05:35:33.224401 containerd[1938]: 2025-09-09 05:35:33.087 [INFO][5045] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali39b07abc9fc ContainerID="32b658285114bb88e7e301d5944c9390ea91d450a84b25e59d7b35e63dba0757" Namespace="calico-system" Pod="csi-node-driver-zk5mx" WorkloadEndpoint="ip--172--31--26--176-k8s-csi--node--driver--zk5mx-eth0" Sep 9 05:35:33.224401 containerd[1938]: 2025-09-09 05:35:33.148 [INFO][5045] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="32b658285114bb88e7e301d5944c9390ea91d450a84b25e59d7b35e63dba0757" Namespace="calico-system" Pod="csi-node-driver-zk5mx" WorkloadEndpoint="ip--172--31--26--176-k8s-csi--node--driver--zk5mx-eth0" Sep 9 05:35:33.224401 containerd[1938]: 2025-09-09 05:35:33.152 [INFO][5045] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="32b658285114bb88e7e301d5944c9390ea91d450a84b25e59d7b35e63dba0757" Namespace="calico-system" Pod="csi-node-driver-zk5mx" WorkloadEndpoint="ip--172--31--26--176-k8s-csi--node--driver--zk5mx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--176-k8s-csi--node--driver--zk5mx-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"72b2ef00-e3c5-4650-afc2-ad083fef1486", 
ResourceVersion:"721", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 35, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-176", ContainerID:"32b658285114bb88e7e301d5944c9390ea91d450a84b25e59d7b35e63dba0757", Pod:"csi-node-driver-zk5mx", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.74.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali39b07abc9fc", MAC:"d6:13:de:5c:cc:12", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:35:33.224401 containerd[1938]: 2025-09-09 05:35:33.201 [INFO][5045] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="32b658285114bb88e7e301d5944c9390ea91d450a84b25e59d7b35e63dba0757" Namespace="calico-system" Pod="csi-node-driver-zk5mx" WorkloadEndpoint="ip--172--31--26--176-k8s-csi--node--driver--zk5mx-eth0" Sep 9 05:35:33.249886 systemd[1]: Started cri-containerd-5d05df6e112bc165cae2d0637725e0ebf2d7f855685d51bdeccb5379cb0581cf.scope - libcontainer container 5d05df6e112bc165cae2d0637725e0ebf2d7f855685d51bdeccb5379cb0581cf. 
Sep 9 05:35:33.329036 containerd[1938]: time="2025-09-09T05:35:33.326935918Z" level=info msg="connecting to shim 32b658285114bb88e7e301d5944c9390ea91d450a84b25e59d7b35e63dba0757" address="unix:///run/containerd/s/216ae2f72e70907cefd1c608da605d9c7e2ab03465e2492b8b4643659daf87f7" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:35:33.337145 systemd-networkd[1819]: cali745a6aca065: Link UP Sep 9 05:35:33.339033 systemd-networkd[1819]: cali745a6aca065: Gained carrier Sep 9 05:35:33.406231 containerd[1938]: 2025-09-09 05:35:32.500 [INFO][5056] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--26--176-k8s-goldmane--7988f88666--pbtr9-eth0 goldmane-7988f88666- calico-system bcd056d1-4179-4c9a-b7fd-1754e236f3c1 834 0 2025-09-09 05:35:08 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7988f88666 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ip-172-31-26-176 goldmane-7988f88666-pbtr9 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali745a6aca065 [] [] }} ContainerID="61179e69f0ece7dc4447dcc140aa0f35709af55ee655d1451aa2e0df00f90773" Namespace="calico-system" Pod="goldmane-7988f88666-pbtr9" WorkloadEndpoint="ip--172--31--26--176-k8s-goldmane--7988f88666--pbtr9-" Sep 9 05:35:33.406231 containerd[1938]: 2025-09-09 05:35:32.501 [INFO][5056] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="61179e69f0ece7dc4447dcc140aa0f35709af55ee655d1451aa2e0df00f90773" Namespace="calico-system" Pod="goldmane-7988f88666-pbtr9" WorkloadEndpoint="ip--172--31--26--176-k8s-goldmane--7988f88666--pbtr9-eth0" Sep 9 05:35:33.406231 containerd[1938]: 2025-09-09 05:35:32.822 [INFO][5092] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="61179e69f0ece7dc4447dcc140aa0f35709af55ee655d1451aa2e0df00f90773" 
HandleID="k8s-pod-network.61179e69f0ece7dc4447dcc140aa0f35709af55ee655d1451aa2e0df00f90773" Workload="ip--172--31--26--176-k8s-goldmane--7988f88666--pbtr9-eth0" Sep 9 05:35:33.406231 containerd[1938]: 2025-09-09 05:35:32.822 [INFO][5092] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="61179e69f0ece7dc4447dcc140aa0f35709af55ee655d1451aa2e0df00f90773" HandleID="k8s-pod-network.61179e69f0ece7dc4447dcc140aa0f35709af55ee655d1451aa2e0df00f90773" Workload="ip--172--31--26--176-k8s-goldmane--7988f88666--pbtr9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5a60), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-26-176", "pod":"goldmane-7988f88666-pbtr9", "timestamp":"2025-09-09 05:35:32.822662554 +0000 UTC"}, Hostname:"ip-172-31-26-176", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:35:33.406231 containerd[1938]: 2025-09-09 05:35:32.823 [INFO][5092] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:35:33.406231 containerd[1938]: 2025-09-09 05:35:33.075 [INFO][5092] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 05:35:33.406231 containerd[1938]: 2025-09-09 05:35:33.079 [INFO][5092] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-26-176' Sep 9 05:35:33.406231 containerd[1938]: 2025-09-09 05:35:33.150 [INFO][5092] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.61179e69f0ece7dc4447dcc140aa0f35709af55ee655d1451aa2e0df00f90773" host="ip-172-31-26-176" Sep 9 05:35:33.406231 containerd[1938]: 2025-09-09 05:35:33.183 [INFO][5092] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-26-176" Sep 9 05:35:33.406231 containerd[1938]: 2025-09-09 05:35:33.199 [INFO][5092] ipam/ipam.go 511: Trying affinity for 192.168.74.64/26 host="ip-172-31-26-176" Sep 9 05:35:33.406231 containerd[1938]: 2025-09-09 05:35:33.212 [INFO][5092] ipam/ipam.go 158: Attempting to load block cidr=192.168.74.64/26 host="ip-172-31-26-176" Sep 9 05:35:33.406231 containerd[1938]: 2025-09-09 05:35:33.230 [INFO][5092] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.74.64/26 host="ip-172-31-26-176" Sep 9 05:35:33.406231 containerd[1938]: 2025-09-09 05:35:33.233 [INFO][5092] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.74.64/26 handle="k8s-pod-network.61179e69f0ece7dc4447dcc140aa0f35709af55ee655d1451aa2e0df00f90773" host="ip-172-31-26-176" Sep 9 05:35:33.406231 containerd[1938]: 2025-09-09 05:35:33.262 [INFO][5092] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.61179e69f0ece7dc4447dcc140aa0f35709af55ee655d1451aa2e0df00f90773 Sep 9 05:35:33.406231 containerd[1938]: 2025-09-09 05:35:33.277 [INFO][5092] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.74.64/26 handle="k8s-pod-network.61179e69f0ece7dc4447dcc140aa0f35709af55ee655d1451aa2e0df00f90773" host="ip-172-31-26-176" Sep 9 05:35:33.406231 containerd[1938]: 2025-09-09 05:35:33.311 [INFO][5092] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.74.70/26] block=192.168.74.64/26 
handle="k8s-pod-network.61179e69f0ece7dc4447dcc140aa0f35709af55ee655d1451aa2e0df00f90773" host="ip-172-31-26-176" Sep 9 05:35:33.406231 containerd[1938]: 2025-09-09 05:35:33.311 [INFO][5092] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.74.70/26] handle="k8s-pod-network.61179e69f0ece7dc4447dcc140aa0f35709af55ee655d1451aa2e0df00f90773" host="ip-172-31-26-176" Sep 9 05:35:33.406231 containerd[1938]: 2025-09-09 05:35:33.311 [INFO][5092] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 05:35:33.406231 containerd[1938]: 2025-09-09 05:35:33.311 [INFO][5092] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.74.70/26] IPv6=[] ContainerID="61179e69f0ece7dc4447dcc140aa0f35709af55ee655d1451aa2e0df00f90773" HandleID="k8s-pod-network.61179e69f0ece7dc4447dcc140aa0f35709af55ee655d1451aa2e0df00f90773" Workload="ip--172--31--26--176-k8s-goldmane--7988f88666--pbtr9-eth0" Sep 9 05:35:33.409363 containerd[1938]: 2025-09-09 05:35:33.328 [INFO][5056] cni-plugin/k8s.go 418: Populated endpoint ContainerID="61179e69f0ece7dc4447dcc140aa0f35709af55ee655d1451aa2e0df00f90773" Namespace="calico-system" Pod="goldmane-7988f88666-pbtr9" WorkloadEndpoint="ip--172--31--26--176-k8s-goldmane--7988f88666--pbtr9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--176-k8s-goldmane--7988f88666--pbtr9-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"bcd056d1-4179-4c9a-b7fd-1754e236f3c1", ResourceVersion:"834", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 35, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-176", ContainerID:"", Pod:"goldmane-7988f88666-pbtr9", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.74.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali745a6aca065", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:35:33.409363 containerd[1938]: 2025-09-09 05:35:33.328 [INFO][5056] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.74.70/32] ContainerID="61179e69f0ece7dc4447dcc140aa0f35709af55ee655d1451aa2e0df00f90773" Namespace="calico-system" Pod="goldmane-7988f88666-pbtr9" WorkloadEndpoint="ip--172--31--26--176-k8s-goldmane--7988f88666--pbtr9-eth0" Sep 9 05:35:33.409363 containerd[1938]: 2025-09-09 05:35:33.328 [INFO][5056] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali745a6aca065 ContainerID="61179e69f0ece7dc4447dcc140aa0f35709af55ee655d1451aa2e0df00f90773" Namespace="calico-system" Pod="goldmane-7988f88666-pbtr9" WorkloadEndpoint="ip--172--31--26--176-k8s-goldmane--7988f88666--pbtr9-eth0" Sep 9 05:35:33.409363 containerd[1938]: 2025-09-09 05:35:33.340 [INFO][5056] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="61179e69f0ece7dc4447dcc140aa0f35709af55ee655d1451aa2e0df00f90773" Namespace="calico-system" Pod="goldmane-7988f88666-pbtr9" WorkloadEndpoint="ip--172--31--26--176-k8s-goldmane--7988f88666--pbtr9-eth0" Sep 9 05:35:33.409363 containerd[1938]: 2025-09-09 05:35:33.340 [INFO][5056] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="61179e69f0ece7dc4447dcc140aa0f35709af55ee655d1451aa2e0df00f90773" Namespace="calico-system" Pod="goldmane-7988f88666-pbtr9" WorkloadEndpoint="ip--172--31--26--176-k8s-goldmane--7988f88666--pbtr9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--176-k8s-goldmane--7988f88666--pbtr9-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"bcd056d1-4179-4c9a-b7fd-1754e236f3c1", ResourceVersion:"834", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 35, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-176", ContainerID:"61179e69f0ece7dc4447dcc140aa0f35709af55ee655d1451aa2e0df00f90773", Pod:"goldmane-7988f88666-pbtr9", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.74.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali745a6aca065", MAC:"2e:03:84:10:a8:f5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:35:33.409363 containerd[1938]: 2025-09-09 05:35:33.377 [INFO][5056] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="61179e69f0ece7dc4447dcc140aa0f35709af55ee655d1451aa2e0df00f90773" Namespace="calico-system" Pod="goldmane-7988f88666-pbtr9" 
WorkloadEndpoint="ip--172--31--26--176-k8s-goldmane--7988f88666--pbtr9-eth0" Sep 9 05:35:33.443745 systemd[1]: Started cri-containerd-32b658285114bb88e7e301d5944c9390ea91d450a84b25e59d7b35e63dba0757.scope - libcontainer container 32b658285114bb88e7e301d5944c9390ea91d450a84b25e59d7b35e63dba0757. Sep 9 05:35:33.493697 containerd[1938]: time="2025-09-09T05:35:33.493619440Z" level=info msg="connecting to shim 61179e69f0ece7dc4447dcc140aa0f35709af55ee655d1451aa2e0df00f90773" address="unix:///run/containerd/s/49128dd45bb1bd994aba57fb9cff2ff9f81d9627db8b6a831f11dad2d61cc182" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:35:33.553209 systemd[1]: Started cri-containerd-61179e69f0ece7dc4447dcc140aa0f35709af55ee655d1451aa2e0df00f90773.scope - libcontainer container 61179e69f0ece7dc4447dcc140aa0f35709af55ee655d1451aa2e0df00f90773. Sep 9 05:35:33.675981 containerd[1938]: time="2025-09-09T05:35:33.675916234Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zk5mx,Uid:72b2ef00-e3c5-4650-afc2-ad083fef1486,Namespace:calico-system,Attempt:0,} returns sandbox id \"32b658285114bb88e7e301d5944c9390ea91d450a84b25e59d7b35e63dba0757\"" Sep 9 05:35:33.763006 containerd[1938]: time="2025-09-09T05:35:33.762963823Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-pbtr9,Uid:bcd056d1-4179-4c9a-b7fd-1754e236f3c1,Namespace:calico-system,Attempt:0,} returns sandbox id \"61179e69f0ece7dc4447dcc140aa0f35709af55ee655d1451aa2e0df00f90773\"" Sep 9 05:35:33.839505 containerd[1938]: time="2025-09-09T05:35:33.838375384Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-77545c669c-j4l25,Uid:db0b27da-0f3a-412e-b1ae-6d3b86e6ab56,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"5d05df6e112bc165cae2d0637725e0ebf2d7f855685d51bdeccb5379cb0581cf\"" Sep 9 05:35:33.885802 containerd[1938]: time="2025-09-09T05:35:33.883333108Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-77545c669c-rfntj,Uid:ff947e5b-fad0-4247-a8d1-9a94d6a84b62,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"3084307e3950570fa36f1dc74cc8e13e1dc5927da79745289bb18baf0ca2c204\"" Sep 9 05:35:34.062589 systemd-networkd[1819]: cali83d9c0e70a4: Gained IPv6LL Sep 9 05:35:34.124686 systemd-networkd[1819]: cali749bade64bd: Gained IPv6LL Sep 9 05:35:34.252701 systemd-networkd[1819]: cali39b07abc9fc: Gained IPv6LL Sep 9 05:35:34.376714 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount984101377.mount: Deactivated successfully. Sep 9 05:35:34.417660 containerd[1938]: time="2025-09-09T05:35:34.417472434Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:35:34.419537 containerd[1938]: time="2025-09-09T05:35:34.419351632Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Sep 9 05:35:34.421980 containerd[1938]: time="2025-09-09T05:35:34.421942074Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:35:34.427538 containerd[1938]: time="2025-09-09T05:35:34.427350613Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:35:34.428582 containerd[1938]: time="2025-09-09T05:35:34.428394786Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", 
size \"33085375\" in 3.736637845s" Sep 9 05:35:34.428582 containerd[1938]: time="2025-09-09T05:35:34.428441220Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Sep 9 05:35:34.430877 containerd[1938]: time="2025-09-09T05:35:34.430748680Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 9 05:35:34.432404 containerd[1938]: time="2025-09-09T05:35:34.432370168Z" level=info msg="CreateContainer within sandbox \"52f96fac661b018dbce8d684022863b70e45ae0a1f65809d859df744702ccf54\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 9 05:35:34.448772 containerd[1938]: time="2025-09-09T05:35:34.446006889Z" level=info msg="Container 4e97c102fb5f9c57c12875477f6f643c71f9bdb91a9c3c1aedb46a8fe1522b10: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:35:34.461496 containerd[1938]: time="2025-09-09T05:35:34.461200079Z" level=info msg="CreateContainer within sandbox \"52f96fac661b018dbce8d684022863b70e45ae0a1f65809d859df744702ccf54\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"4e97c102fb5f9c57c12875477f6f643c71f9bdb91a9c3c1aedb46a8fe1522b10\"" Sep 9 05:35:34.462406 containerd[1938]: time="2025-09-09T05:35:34.462376775Z" level=info msg="StartContainer for \"4e97c102fb5f9c57c12875477f6f643c71f9bdb91a9c3c1aedb46a8fe1522b10\"" Sep 9 05:35:34.463884 containerd[1938]: time="2025-09-09T05:35:34.463850564Z" level=info msg="connecting to shim 4e97c102fb5f9c57c12875477f6f643c71f9bdb91a9c3c1aedb46a8fe1522b10" address="unix:///run/containerd/s/3d7170cf3ac2ff6808eb37a248b8afbbc701de69a57a6fbefd552a71dd673166" protocol=ttrpc version=3 Sep 9 05:35:34.495780 systemd[1]: Started cri-containerd-4e97c102fb5f9c57c12875477f6f643c71f9bdb91a9c3c1aedb46a8fe1522b10.scope - libcontainer container 4e97c102fb5f9c57c12875477f6f643c71f9bdb91a9c3c1aedb46a8fe1522b10. 
Sep 9 05:35:34.569867 containerd[1938]: time="2025-09-09T05:35:34.569822460Z" level=info msg="StartContainer for \"4e97c102fb5f9c57c12875477f6f643c71f9bdb91a9c3c1aedb46a8fe1522b10\" returns successfully" Sep 9 05:35:35.176444 containerd[1938]: time="2025-09-09T05:35:35.176397755Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-69b9ddd587-48l4w,Uid:35c3c610-8e0e-4648-acad-34df202fc795,Namespace:calico-system,Attempt:0,}" Sep 9 05:35:35.176931 containerd[1938]: time="2025-09-09T05:35:35.176707510Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-d7txq,Uid:81f920a1-0383-495e-80c9-59ade95d3519,Namespace:kube-system,Attempt:0,}" Sep 9 05:35:35.213263 systemd-networkd[1819]: cali745a6aca065: Gained IPv6LL Sep 9 05:35:35.332463 systemd[1]: Started sshd@9-172.31.26.176:22-147.75.109.163:56808.service - OpenSSH per-connection server daemon (147.75.109.163:56808). Sep 9 05:35:35.534120 systemd-networkd[1819]: cali3725bb81be7: Link UP Sep 9 05:35:35.536932 systemd-networkd[1819]: cali3725bb81be7: Gained carrier Sep 9 05:35:35.599026 containerd[1938]: 2025-09-09 05:35:35.287 [INFO][5374] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--26--176-k8s-calico--kube--controllers--69b9ddd587--48l4w-eth0 calico-kube-controllers-69b9ddd587- calico-system 35c3c610-8e0e-4648-acad-34df202fc795 826 0 2025-09-09 05:35:09 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:69b9ddd587 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ip-172-31-26-176 calico-kube-controllers-69b9ddd587-48l4w eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali3725bb81be7 [] [] }} ContainerID="b49ddc25482b704a2df296704e3e7ec58164a32b7a5e70bb300d69cf36a0660b" 
Namespace="calico-system" Pod="calico-kube-controllers-69b9ddd587-48l4w" WorkloadEndpoint="ip--172--31--26--176-k8s-calico--kube--controllers--69b9ddd587--48l4w-" Sep 9 05:35:35.599026 containerd[1938]: 2025-09-09 05:35:35.288 [INFO][5374] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b49ddc25482b704a2df296704e3e7ec58164a32b7a5e70bb300d69cf36a0660b" Namespace="calico-system" Pod="calico-kube-controllers-69b9ddd587-48l4w" WorkloadEndpoint="ip--172--31--26--176-k8s-calico--kube--controllers--69b9ddd587--48l4w-eth0" Sep 9 05:35:35.599026 containerd[1938]: 2025-09-09 05:35:35.428 [INFO][5397] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b49ddc25482b704a2df296704e3e7ec58164a32b7a5e70bb300d69cf36a0660b" HandleID="k8s-pod-network.b49ddc25482b704a2df296704e3e7ec58164a32b7a5e70bb300d69cf36a0660b" Workload="ip--172--31--26--176-k8s-calico--kube--controllers--69b9ddd587--48l4w-eth0" Sep 9 05:35:35.599026 containerd[1938]: 2025-09-09 05:35:35.428 [INFO][5397] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b49ddc25482b704a2df296704e3e7ec58164a32b7a5e70bb300d69cf36a0660b" HandleID="k8s-pod-network.b49ddc25482b704a2df296704e3e7ec58164a32b7a5e70bb300d69cf36a0660b" Workload="ip--172--31--26--176-k8s-calico--kube--controllers--69b9ddd587--48l4w-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00038fb10), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-26-176", "pod":"calico-kube-controllers-69b9ddd587-48l4w", "timestamp":"2025-09-09 05:35:35.428562969 +0000 UTC"}, Hostname:"ip-172-31-26-176", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:35:35.599026 containerd[1938]: 2025-09-09 05:35:35.428 [INFO][5397] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 9 05:35:35.599026 containerd[1938]: 2025-09-09 05:35:35.428 [INFO][5397] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 05:35:35.599026 containerd[1938]: 2025-09-09 05:35:35.428 [INFO][5397] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-26-176' Sep 9 05:35:35.599026 containerd[1938]: 2025-09-09 05:35:35.453 [INFO][5397] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b49ddc25482b704a2df296704e3e7ec58164a32b7a5e70bb300d69cf36a0660b" host="ip-172-31-26-176" Sep 9 05:35:35.599026 containerd[1938]: 2025-09-09 05:35:35.468 [INFO][5397] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-26-176" Sep 9 05:35:35.599026 containerd[1938]: 2025-09-09 05:35:35.475 [INFO][5397] ipam/ipam.go 511: Trying affinity for 192.168.74.64/26 host="ip-172-31-26-176" Sep 9 05:35:35.599026 containerd[1938]: 2025-09-09 05:35:35.478 [INFO][5397] ipam/ipam.go 158: Attempting to load block cidr=192.168.74.64/26 host="ip-172-31-26-176" Sep 9 05:35:35.599026 containerd[1938]: 2025-09-09 05:35:35.483 [INFO][5397] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.74.64/26 host="ip-172-31-26-176" Sep 9 05:35:35.599026 containerd[1938]: 2025-09-09 05:35:35.483 [INFO][5397] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.74.64/26 handle="k8s-pod-network.b49ddc25482b704a2df296704e3e7ec58164a32b7a5e70bb300d69cf36a0660b" host="ip-172-31-26-176" Sep 9 05:35:35.599026 containerd[1938]: 2025-09-09 05:35:35.485 [INFO][5397] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.b49ddc25482b704a2df296704e3e7ec58164a32b7a5e70bb300d69cf36a0660b Sep 9 05:35:35.599026 containerd[1938]: 2025-09-09 05:35:35.493 [INFO][5397] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.74.64/26 handle="k8s-pod-network.b49ddc25482b704a2df296704e3e7ec58164a32b7a5e70bb300d69cf36a0660b" host="ip-172-31-26-176" Sep 9 05:35:35.599026 containerd[1938]: 
2025-09-09 05:35:35.509 [INFO][5397] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.74.71/26] block=192.168.74.64/26 handle="k8s-pod-network.b49ddc25482b704a2df296704e3e7ec58164a32b7a5e70bb300d69cf36a0660b" host="ip-172-31-26-176" Sep 9 05:35:35.599026 containerd[1938]: 2025-09-09 05:35:35.509 [INFO][5397] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.74.71/26] handle="k8s-pod-network.b49ddc25482b704a2df296704e3e7ec58164a32b7a5e70bb300d69cf36a0660b" host="ip-172-31-26-176" Sep 9 05:35:35.599026 containerd[1938]: 2025-09-09 05:35:35.510 [INFO][5397] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 05:35:35.599026 containerd[1938]: 2025-09-09 05:35:35.510 [INFO][5397] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.74.71/26] IPv6=[] ContainerID="b49ddc25482b704a2df296704e3e7ec58164a32b7a5e70bb300d69cf36a0660b" HandleID="k8s-pod-network.b49ddc25482b704a2df296704e3e7ec58164a32b7a5e70bb300d69cf36a0660b" Workload="ip--172--31--26--176-k8s-calico--kube--controllers--69b9ddd587--48l4w-eth0" Sep 9 05:35:35.600025 containerd[1938]: 2025-09-09 05:35:35.520 [INFO][5374] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b49ddc25482b704a2df296704e3e7ec58164a32b7a5e70bb300d69cf36a0660b" Namespace="calico-system" Pod="calico-kube-controllers-69b9ddd587-48l4w" WorkloadEndpoint="ip--172--31--26--176-k8s-calico--kube--controllers--69b9ddd587--48l4w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--176-k8s-calico--kube--controllers--69b9ddd587--48l4w-eth0", GenerateName:"calico-kube-controllers-69b9ddd587-", Namespace:"calico-system", SelfLink:"", UID:"35c3c610-8e0e-4648-acad-34df202fc795", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 35, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"69b9ddd587", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-176", ContainerID:"", Pod:"calico-kube-controllers-69b9ddd587-48l4w", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.74.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali3725bb81be7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:35:35.600025 containerd[1938]: 2025-09-09 05:35:35.520 [INFO][5374] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.74.71/32] ContainerID="b49ddc25482b704a2df296704e3e7ec58164a32b7a5e70bb300d69cf36a0660b" Namespace="calico-system" Pod="calico-kube-controllers-69b9ddd587-48l4w" WorkloadEndpoint="ip--172--31--26--176-k8s-calico--kube--controllers--69b9ddd587--48l4w-eth0" Sep 9 05:35:35.600025 containerd[1938]: 2025-09-09 05:35:35.520 [INFO][5374] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3725bb81be7 ContainerID="b49ddc25482b704a2df296704e3e7ec58164a32b7a5e70bb300d69cf36a0660b" Namespace="calico-system" Pod="calico-kube-controllers-69b9ddd587-48l4w" WorkloadEndpoint="ip--172--31--26--176-k8s-calico--kube--controllers--69b9ddd587--48l4w-eth0" Sep 9 05:35:35.600025 containerd[1938]: 2025-09-09 05:35:35.538 [INFO][5374] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="b49ddc25482b704a2df296704e3e7ec58164a32b7a5e70bb300d69cf36a0660b" Namespace="calico-system" Pod="calico-kube-controllers-69b9ddd587-48l4w" WorkloadEndpoint="ip--172--31--26--176-k8s-calico--kube--controllers--69b9ddd587--48l4w-eth0" Sep 9 05:35:35.600025 containerd[1938]: 2025-09-09 05:35:35.543 [INFO][5374] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b49ddc25482b704a2df296704e3e7ec58164a32b7a5e70bb300d69cf36a0660b" Namespace="calico-system" Pod="calico-kube-controllers-69b9ddd587-48l4w" WorkloadEndpoint="ip--172--31--26--176-k8s-calico--kube--controllers--69b9ddd587--48l4w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--176-k8s-calico--kube--controllers--69b9ddd587--48l4w-eth0", GenerateName:"calico-kube-controllers-69b9ddd587-", Namespace:"calico-system", SelfLink:"", UID:"35c3c610-8e0e-4648-acad-34df202fc795", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 35, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"69b9ddd587", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-176", ContainerID:"b49ddc25482b704a2df296704e3e7ec58164a32b7a5e70bb300d69cf36a0660b", Pod:"calico-kube-controllers-69b9ddd587-48l4w", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.74.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali3725bb81be7", MAC:"72:ec:e8:a7:0a:57", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:35:35.600025 containerd[1938]: 2025-09-09 05:35:35.585 [INFO][5374] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b49ddc25482b704a2df296704e3e7ec58164a32b7a5e70bb300d69cf36a0660b" Namespace="calico-system" Pod="calico-kube-controllers-69b9ddd587-48l4w" WorkloadEndpoint="ip--172--31--26--176-k8s-calico--kube--controllers--69b9ddd587--48l4w-eth0" Sep 9 05:35:35.613309 sshd[5406]: Accepted publickey for core from 147.75.109.163 port 56808 ssh2: RSA SHA256:k1gUnX9WA3dyp6ylgbUnG2K6cUpm99lcEZsxDzZ5bM4 Sep 9 05:35:35.616717 kubelet[3331]: I0909 05:35:35.616088 3331 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-958bb795b-bjqsh" podStartSLOduration=3.502832406 podStartE2EDuration="8.616060429s" podCreationTimestamp="2025-09-09 05:35:27 +0000 UTC" firstStartedPulling="2025-09-09 05:35:29.316547683 +0000 UTC m=+43.276844325" lastFinishedPulling="2025-09-09 05:35:34.429775703 +0000 UTC m=+48.390072348" observedRunningTime="2025-09-09 05:35:35.615466666 +0000 UTC m=+49.575763344" watchObservedRunningTime="2025-09-09 05:35:35.616060429 +0000 UTC m=+49.576357091" Sep 9 05:35:35.619938 sshd-session[5406]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:35:35.634777 systemd-logind[1878]: New session 10 of user core. Sep 9 05:35:35.639798 systemd[1]: Started session-10.scope - Session 10 of User core. 
Sep 9 05:35:35.704630 containerd[1938]: time="2025-09-09T05:35:35.704570367Z" level=info msg="connecting to shim b49ddc25482b704a2df296704e3e7ec58164a32b7a5e70bb300d69cf36a0660b" address="unix:///run/containerd/s/c88632394b53c65d409c69ca9efcddfc7438fad9e4e818dd23c61918ec3828c5" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:35:35.807126 systemd-networkd[1819]: cali178cf1b07b2: Link UP Sep 9 05:35:35.807884 systemd-networkd[1819]: cali178cf1b07b2: Gained carrier Sep 9 05:35:35.877577 systemd[1]: Started cri-containerd-b49ddc25482b704a2df296704e3e7ec58164a32b7a5e70bb300d69cf36a0660b.scope - libcontainer container b49ddc25482b704a2df296704e3e7ec58164a32b7a5e70bb300d69cf36a0660b. Sep 9 05:35:35.896796 containerd[1938]: 2025-09-09 05:35:35.292 [INFO][5372] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--26--176-k8s-coredns--7c65d6cfc9--d7txq-eth0 coredns-7c65d6cfc9- kube-system 81f920a1-0383-495e-80c9-59ade95d3519 836 0 2025-09-09 05:34:52 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ip-172-31-26-176 coredns-7c65d6cfc9-d7txq eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali178cf1b07b2 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="18705dd8327f06406a9880654e8e9fd99e328144ad0b78d30aaf088b9639f27f" Namespace="kube-system" Pod="coredns-7c65d6cfc9-d7txq" WorkloadEndpoint="ip--172--31--26--176-k8s-coredns--7c65d6cfc9--d7txq-" Sep 9 05:35:35.896796 containerd[1938]: 2025-09-09 05:35:35.293 [INFO][5372] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="18705dd8327f06406a9880654e8e9fd99e328144ad0b78d30aaf088b9639f27f" Namespace="kube-system" Pod="coredns-7c65d6cfc9-d7txq" WorkloadEndpoint="ip--172--31--26--176-k8s-coredns--7c65d6cfc9--d7txq-eth0" Sep 9 05:35:35.896796 containerd[1938]: 
2025-09-09 05:35:35.425 [INFO][5399] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="18705dd8327f06406a9880654e8e9fd99e328144ad0b78d30aaf088b9639f27f" HandleID="k8s-pod-network.18705dd8327f06406a9880654e8e9fd99e328144ad0b78d30aaf088b9639f27f" Workload="ip--172--31--26--176-k8s-coredns--7c65d6cfc9--d7txq-eth0" Sep 9 05:35:35.896796 containerd[1938]: 2025-09-09 05:35:35.429 [INFO][5399] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="18705dd8327f06406a9880654e8e9fd99e328144ad0b78d30aaf088b9639f27f" HandleID="k8s-pod-network.18705dd8327f06406a9880654e8e9fd99e328144ad0b78d30aaf088b9639f27f" Workload="ip--172--31--26--176-k8s-coredns--7c65d6cfc9--d7txq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000380110), Attrs:map[string]string{"namespace":"kube-system", "node":"ip-172-31-26-176", "pod":"coredns-7c65d6cfc9-d7txq", "timestamp":"2025-09-09 05:35:35.425733276 +0000 UTC"}, Hostname:"ip-172-31-26-176", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:35:35.896796 containerd[1938]: 2025-09-09 05:35:35.434 [INFO][5399] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:35:35.896796 containerd[1938]: 2025-09-09 05:35:35.509 [INFO][5399] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 05:35:35.896796 containerd[1938]: 2025-09-09 05:35:35.509 [INFO][5399] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-26-176' Sep 9 05:35:35.896796 containerd[1938]: 2025-09-09 05:35:35.554 [INFO][5399] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.18705dd8327f06406a9880654e8e9fd99e328144ad0b78d30aaf088b9639f27f" host="ip-172-31-26-176" Sep 9 05:35:35.896796 containerd[1938]: 2025-09-09 05:35:35.581 [INFO][5399] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-26-176" Sep 9 05:35:35.896796 containerd[1938]: 2025-09-09 05:35:35.612 [INFO][5399] ipam/ipam.go 511: Trying affinity for 192.168.74.64/26 host="ip-172-31-26-176" Sep 9 05:35:35.896796 containerd[1938]: 2025-09-09 05:35:35.622 [INFO][5399] ipam/ipam.go 158: Attempting to load block cidr=192.168.74.64/26 host="ip-172-31-26-176" Sep 9 05:35:35.896796 containerd[1938]: 2025-09-09 05:35:35.651 [INFO][5399] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.74.64/26 host="ip-172-31-26-176" Sep 9 05:35:35.896796 containerd[1938]: 2025-09-09 05:35:35.656 [INFO][5399] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.74.64/26 handle="k8s-pod-network.18705dd8327f06406a9880654e8e9fd99e328144ad0b78d30aaf088b9639f27f" host="ip-172-31-26-176" Sep 9 05:35:35.896796 containerd[1938]: 2025-09-09 05:35:35.682 [INFO][5399] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.18705dd8327f06406a9880654e8e9fd99e328144ad0b78d30aaf088b9639f27f Sep 9 05:35:35.896796 containerd[1938]: 2025-09-09 05:35:35.709 [INFO][5399] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.74.64/26 handle="k8s-pod-network.18705dd8327f06406a9880654e8e9fd99e328144ad0b78d30aaf088b9639f27f" host="ip-172-31-26-176" Sep 9 05:35:35.896796 containerd[1938]: 2025-09-09 05:35:35.750 [INFO][5399] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.74.72/26] block=192.168.74.64/26 
handle="k8s-pod-network.18705dd8327f06406a9880654e8e9fd99e328144ad0b78d30aaf088b9639f27f" host="ip-172-31-26-176" Sep 9 05:35:35.896796 containerd[1938]: 2025-09-09 05:35:35.751 [INFO][5399] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.74.72/26] handle="k8s-pod-network.18705dd8327f06406a9880654e8e9fd99e328144ad0b78d30aaf088b9639f27f" host="ip-172-31-26-176" Sep 9 05:35:35.896796 containerd[1938]: 2025-09-09 05:35:35.751 [INFO][5399] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 05:35:35.896796 containerd[1938]: 2025-09-09 05:35:35.751 [INFO][5399] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.74.72/26] IPv6=[] ContainerID="18705dd8327f06406a9880654e8e9fd99e328144ad0b78d30aaf088b9639f27f" HandleID="k8s-pod-network.18705dd8327f06406a9880654e8e9fd99e328144ad0b78d30aaf088b9639f27f" Workload="ip--172--31--26--176-k8s-coredns--7c65d6cfc9--d7txq-eth0" Sep 9 05:35:35.901763 containerd[1938]: 2025-09-09 05:35:35.785 [INFO][5372] cni-plugin/k8s.go 418: Populated endpoint ContainerID="18705dd8327f06406a9880654e8e9fd99e328144ad0b78d30aaf088b9639f27f" Namespace="kube-system" Pod="coredns-7c65d6cfc9-d7txq" WorkloadEndpoint="ip--172--31--26--176-k8s-coredns--7c65d6cfc9--d7txq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--176-k8s-coredns--7c65d6cfc9--d7txq-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"81f920a1-0383-495e-80c9-59ade95d3519", ResourceVersion:"836", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 34, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-176", ContainerID:"", Pod:"coredns-7c65d6cfc9-d7txq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.74.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali178cf1b07b2", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:35:35.901763 containerd[1938]: 2025-09-09 05:35:35.786 [INFO][5372] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.74.72/32] ContainerID="18705dd8327f06406a9880654e8e9fd99e328144ad0b78d30aaf088b9639f27f" Namespace="kube-system" Pod="coredns-7c65d6cfc9-d7txq" WorkloadEndpoint="ip--172--31--26--176-k8s-coredns--7c65d6cfc9--d7txq-eth0" Sep 9 05:35:35.901763 containerd[1938]: 2025-09-09 05:35:35.786 [INFO][5372] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali178cf1b07b2 ContainerID="18705dd8327f06406a9880654e8e9fd99e328144ad0b78d30aaf088b9639f27f" Namespace="kube-system" Pod="coredns-7c65d6cfc9-d7txq" WorkloadEndpoint="ip--172--31--26--176-k8s-coredns--7c65d6cfc9--d7txq-eth0" Sep 9 05:35:35.901763 containerd[1938]: 2025-09-09 05:35:35.813 [INFO][5372] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="18705dd8327f06406a9880654e8e9fd99e328144ad0b78d30aaf088b9639f27f" Namespace="kube-system" 
Pod="coredns-7c65d6cfc9-d7txq" WorkloadEndpoint="ip--172--31--26--176-k8s-coredns--7c65d6cfc9--d7txq-eth0" Sep 9 05:35:35.901763 containerd[1938]: 2025-09-09 05:35:35.821 [INFO][5372] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="18705dd8327f06406a9880654e8e9fd99e328144ad0b78d30aaf088b9639f27f" Namespace="kube-system" Pod="coredns-7c65d6cfc9-d7txq" WorkloadEndpoint="ip--172--31--26--176-k8s-coredns--7c65d6cfc9--d7txq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--26--176-k8s-coredns--7c65d6cfc9--d7txq-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"81f920a1-0383-495e-80c9-59ade95d3519", ResourceVersion:"836", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 34, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-26-176", ContainerID:"18705dd8327f06406a9880654e8e9fd99e328144ad0b78d30aaf088b9639f27f", Pod:"coredns-7c65d6cfc9-d7txq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.74.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali178cf1b07b2", MAC:"5e:15:4e:10:72:7b", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, 
v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:35:35.901763 containerd[1938]: 2025-09-09 05:35:35.885 [INFO][5372] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="18705dd8327f06406a9880654e8e9fd99e328144ad0b78d30aaf088b9639f27f" Namespace="kube-system" Pod="coredns-7c65d6cfc9-d7txq" WorkloadEndpoint="ip--172--31--26--176-k8s-coredns--7c65d6cfc9--d7txq-eth0" Sep 9 05:35:36.043255 containerd[1938]: time="2025-09-09T05:35:36.042522720Z" level=info msg="connecting to shim 18705dd8327f06406a9880654e8e9fd99e328144ad0b78d30aaf088b9639f27f" address="unix:///run/containerd/s/861c4f0e71522ad1f8344613346a5f19791b25c8688e53405829dfcffb14f008" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:35:36.161052 systemd[1]: Started cri-containerd-18705dd8327f06406a9880654e8e9fd99e328144ad0b78d30aaf088b9639f27f.scope - libcontainer container 18705dd8327f06406a9880654e8e9fd99e328144ad0b78d30aaf088b9639f27f. 
Sep 9 05:35:36.423103 containerd[1938]: time="2025-09-09T05:35:36.423053507Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-69b9ddd587-48l4w,Uid:35c3c610-8e0e-4648-acad-34df202fc795,Namespace:calico-system,Attempt:0,} returns sandbox id \"b49ddc25482b704a2df296704e3e7ec58164a32b7a5e70bb300d69cf36a0660b\"" Sep 9 05:35:36.443752 containerd[1938]: time="2025-09-09T05:35:36.443410940Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-d7txq,Uid:81f920a1-0383-495e-80c9-59ade95d3519,Namespace:kube-system,Attempt:0,} returns sandbox id \"18705dd8327f06406a9880654e8e9fd99e328144ad0b78d30aaf088b9639f27f\"" Sep 9 05:35:36.489560 containerd[1938]: time="2025-09-09T05:35:36.489197512Z" level=info msg="CreateContainer within sandbox \"18705dd8327f06406a9880654e8e9fd99e328144ad0b78d30aaf088b9639f27f\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 9 05:35:36.591960 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1270953082.mount: Deactivated successfully. Sep 9 05:35:36.622271 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1512084671.mount: Deactivated successfully. 
Sep 9 05:35:36.624921 containerd[1938]: time="2025-09-09T05:35:36.623896790Z" level=info msg="Container b0eec51e4ef9139b9b56a83ff57ecb884bb381b2754f54ee43d0b2e6b73ae284: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:35:36.684537 containerd[1938]: time="2025-09-09T05:35:36.684150443Z" level=info msg="CreateContainer within sandbox \"18705dd8327f06406a9880654e8e9fd99e328144ad0b78d30aaf088b9639f27f\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"b0eec51e4ef9139b9b56a83ff57ecb884bb381b2754f54ee43d0b2e6b73ae284\"" Sep 9 05:35:36.696785 containerd[1938]: time="2025-09-09T05:35:36.696445176Z" level=info msg="StartContainer for \"b0eec51e4ef9139b9b56a83ff57ecb884bb381b2754f54ee43d0b2e6b73ae284\"" Sep 9 05:35:36.704778 containerd[1938]: time="2025-09-09T05:35:36.704333458Z" level=info msg="connecting to shim b0eec51e4ef9139b9b56a83ff57ecb884bb381b2754f54ee43d0b2e6b73ae284" address="unix:///run/containerd/s/861c4f0e71522ad1f8344613346a5f19791b25c8688e53405829dfcffb14f008" protocol=ttrpc version=3 Sep 9 05:35:36.835046 systemd[1]: Started cri-containerd-b0eec51e4ef9139b9b56a83ff57ecb884bb381b2754f54ee43d0b2e6b73ae284.scope - libcontainer container b0eec51e4ef9139b9b56a83ff57ecb884bb381b2754f54ee43d0b2e6b73ae284. 
Sep 9 05:35:36.940044 containerd[1938]: time="2025-09-09T05:35:36.939052423Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:35:36.940693 systemd-networkd[1819]: cali3725bb81be7: Gained IPv6LL Sep 9 05:35:36.946508 containerd[1938]: time="2025-09-09T05:35:36.944069186Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527" Sep 9 05:35:36.946894 containerd[1938]: time="2025-09-09T05:35:36.946846855Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:35:36.957261 containerd[1938]: time="2025-09-09T05:35:36.956616275Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:35:36.961053 containerd[1938]: time="2025-09-09T05:35:36.961002259Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 2.530112811s" Sep 9 05:35:36.961053 containerd[1938]: time="2025-09-09T05:35:36.961056479Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\"" Sep 9 05:35:36.970327 containerd[1938]: time="2025-09-09T05:35:36.968168591Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 9 05:35:36.971579 containerd[1938]: time="2025-09-09T05:35:36.971539009Z" level=info msg="CreateContainer within sandbox 
\"32b658285114bb88e7e301d5944c9390ea91d450a84b25e59d7b35e63dba0757\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 9 05:35:37.020217 containerd[1938]: time="2025-09-09T05:35:37.018012142Z" level=info msg="Container db0c145e0becc90a10c965f44d1fa6acf8c946fddf5491b466bd3381f7b1cb87: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:35:37.044396 containerd[1938]: time="2025-09-09T05:35:37.044350397Z" level=info msg="CreateContainer within sandbox \"32b658285114bb88e7e301d5944c9390ea91d450a84b25e59d7b35e63dba0757\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"db0c145e0becc90a10c965f44d1fa6acf8c946fddf5491b466bd3381f7b1cb87\"" Sep 9 05:35:37.045120 sshd[5424]: Connection closed by 147.75.109.163 port 56808 Sep 9 05:35:37.046081 sshd-session[5406]: pam_unix(sshd:session): session closed for user core Sep 9 05:35:37.054496 containerd[1938]: time="2025-09-09T05:35:37.053988947Z" level=info msg="StartContainer for \"db0c145e0becc90a10c965f44d1fa6acf8c946fddf5491b466bd3381f7b1cb87\"" Sep 9 05:35:37.062258 systemd[1]: sshd@9-172.31.26.176:22-147.75.109.163:56808.service: Deactivated successfully. Sep 9 05:35:37.068095 containerd[1938]: time="2025-09-09T05:35:37.067960573Z" level=info msg="connecting to shim db0c145e0becc90a10c965f44d1fa6acf8c946fddf5491b466bd3381f7b1cb87" address="unix:///run/containerd/s/216ae2f72e70907cefd1c608da605d9c7e2ab03465e2492b8b4643659daf87f7" protocol=ttrpc version=3 Sep 9 05:35:37.068785 systemd-networkd[1819]: cali178cf1b07b2: Gained IPv6LL Sep 9 05:35:37.068992 systemd[1]: session-10.scope: Deactivated successfully. Sep 9 05:35:37.072901 systemd-logind[1878]: Session 10 logged out. Waiting for processes to exit. Sep 9 05:35:37.078568 systemd-logind[1878]: Removed session 10. 
Sep 9 05:35:37.096661 containerd[1938]: time="2025-09-09T05:35:37.096437074Z" level=info msg="StartContainer for \"b0eec51e4ef9139b9b56a83ff57ecb884bb381b2754f54ee43d0b2e6b73ae284\" returns successfully" Sep 9 05:35:37.135639 systemd[1]: Started cri-containerd-db0c145e0becc90a10c965f44d1fa6acf8c946fddf5491b466bd3381f7b1cb87.scope - libcontainer container db0c145e0becc90a10c965f44d1fa6acf8c946fddf5491b466bd3381f7b1cb87. Sep 9 05:35:37.266856 containerd[1938]: time="2025-09-09T05:35:37.266699264Z" level=info msg="StartContainer for \"db0c145e0becc90a10c965f44d1fa6acf8c946fddf5491b466bd3381f7b1cb87\" returns successfully" Sep 9 05:35:37.713157 kubelet[3331]: I0909 05:35:37.712805 3331 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-d7txq" podStartSLOduration=45.712780608 podStartE2EDuration="45.712780608s" podCreationTimestamp="2025-09-09 05:34:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 05:35:37.68600686 +0000 UTC m=+51.646303517" watchObservedRunningTime="2025-09-09 05:35:37.712780608 +0000 UTC m=+51.673077270" Sep 9 05:35:38.060416 kubelet[3331]: I0909 05:35:38.060154 3331 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 05:35:38.480172 containerd[1938]: time="2025-09-09T05:35:38.480088395Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a7a935f19da868b5a950c7c20d19b09767d48df7322f7f65c9cd8eb4082003a4\" id:\"ee1598be2055615f6b2fc5924c5f5e65ade1695e83f4fcd3235167b0b2683827\" pid:5631 exited_at:{seconds:1757396138 nanos:479727763}" Sep 9 05:35:38.633944 containerd[1938]: time="2025-09-09T05:35:38.633858300Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a7a935f19da868b5a950c7c20d19b09767d48df7322f7f65c9cd8eb4082003a4\" id:\"44c87a9768536deaad2e43471553d5fa673fc1046001927f49140501d7d128a8\" pid:5656 exited_at:{seconds:1757396138 
nanos:633449005}" Sep 9 05:35:39.133907 ntpd[1871]: Listen normally on 7 vxlan.calico 192.168.74.64:123 Sep 9 05:35:39.134020 ntpd[1871]: Listen normally on 8 vxlan.calico [fe80::648e:66ff:fe17:f28c%4]:123 Sep 9 05:35:39.134083 ntpd[1871]: Listen normally on 9 calic2ab00ff208 [fe80::ecee:eeff:feee:eeee%7]:123 Sep 9 05:35:39.134143 ntpd[1871]: Listen normally on 10 calid53eb3d52fa [fe80::ecee:eeff:feee:eeee%8]:123 Sep 9 05:35:39.134182 ntpd[1871]: Listen normally on 11 cali749bade64bd [fe80::ecee:eeff:feee:eeee%9]:123 Sep 9 05:35:39.134221 ntpd[1871]: Listen normally on 12 cali83d9c0e70a4 [fe80::ecee:eeff:feee:eeee%10]:123 Sep 9 05:35:39.134260 ntpd[1871]: Listen normally on 13
cali39b07abc9fc [fe80::ecee:eeff:feee:eeee%11]:123 Sep 9 05:35:39.134297 ntpd[1871]: Listen normally on 14 cali745a6aca065 [fe80::ecee:eeff:feee:eeee%12]:123 Sep 9 05:35:39.134336 ntpd[1871]: Listen normally on 15 cali3725bb81be7 [fe80::ecee:eeff:feee:eeee%13]:123 Sep 9 05:35:39.134371 ntpd[1871]: Listen normally on 16 cali178cf1b07b2 [fe80::ecee:eeff:feee:eeee%14]:123 Sep 9 05:35:40.714126 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1077095047.mount: Deactivated successfully. Sep 9 05:35:41.701579 containerd[1938]: time="2025-09-09T05:35:41.700993464Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:35:41.739471 containerd[1938]: time="2025-09-09T05:35:41.704386583Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526" Sep 9 05:35:41.741001 containerd[1938]: time="2025-09-09T05:35:41.710198148Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:35:41.741001 containerd[1938]: time="2025-09-09T05:35:41.718434462Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 4.749448235s" Sep 9 05:35:41.741001 containerd[1938]: time="2025-09-09T05:35:41.740616911Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\"" Sep 9 05:35:41.742149 containerd[1938]: time="2025-09-09T05:35:41.742113012Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:35:41.743764 containerd[1938]: time="2025-09-09T05:35:41.743326062Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 9 05:35:41.748016 containerd[1938]: time="2025-09-09T05:35:41.747980686Z" level=info msg="CreateContainer within sandbox \"61179e69f0ece7dc4447dcc140aa0f35709af55ee655d1451aa2e0df00f90773\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 9 05:35:41.791060 containerd[1938]: time="2025-09-09T05:35:41.791007580Z" level=info msg="Container fb55aaf7f275d8d4355639ccb50bdd2f3557611ede45fabd3998a1cec2e0471b: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:35:41.813935 containerd[1938]: time="2025-09-09T05:35:41.813870553Z" level=info msg="CreateContainer within sandbox \"61179e69f0ece7dc4447dcc140aa0f35709af55ee655d1451aa2e0df00f90773\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"fb55aaf7f275d8d4355639ccb50bdd2f3557611ede45fabd3998a1cec2e0471b\"" Sep 9 05:35:41.815249 containerd[1938]: time="2025-09-09T05:35:41.815216322Z" level=info msg="StartContainer for \"fb55aaf7f275d8d4355639ccb50bdd2f3557611ede45fabd3998a1cec2e0471b\"" Sep 9 05:35:41.818316 containerd[1938]: time="2025-09-09T05:35:41.818278585Z" level=info msg="connecting to shim fb55aaf7f275d8d4355639ccb50bdd2f3557611ede45fabd3998a1cec2e0471b" address="unix:///run/containerd/s/49128dd45bb1bd994aba57fb9cff2ff9f81d9627db8b6a831f11dad2d61cc182" protocol=ttrpc version=3 Sep 9 05:35:41.871186 systemd[1]: Started cri-containerd-fb55aaf7f275d8d4355639ccb50bdd2f3557611ede45fabd3998a1cec2e0471b.scope - libcontainer container fb55aaf7f275d8d4355639ccb50bdd2f3557611ede45fabd3998a1cec2e0471b. 
Sep 9 05:35:41.981129 containerd[1938]: time="2025-09-09T05:35:41.980987929Z" level=info msg="StartContainer for \"fb55aaf7f275d8d4355639ccb50bdd2f3557611ede45fabd3998a1cec2e0471b\" returns successfully" Sep 9 05:35:42.082912 systemd[1]: Started sshd@10-172.31.26.176:22-147.75.109.163:41976.service - OpenSSH per-connection server daemon (147.75.109.163:41976). Sep 9 05:35:42.340570 sshd[5719]: Accepted publickey for core from 147.75.109.163 port 41976 ssh2: RSA SHA256:k1gUnX9WA3dyp6ylgbUnG2K6cUpm99lcEZsxDzZ5bM4 Sep 9 05:35:42.343780 sshd-session[5719]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:35:42.349950 systemd-logind[1878]: New session 11 of user core. Sep 9 05:35:42.361807 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 9 05:35:42.888636 containerd[1938]: time="2025-09-09T05:35:42.888522301Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fb55aaf7f275d8d4355639ccb50bdd2f3557611ede45fabd3998a1cec2e0471b\" id:\"68519267a7b10ee131609f6698ae2ac36e57b21a4b654f8c287abd684a98edb9\" pid:5745 exit_status:1 exited_at:{seconds:1757396142 nanos:881935411}" Sep 9 05:35:43.162127 sshd[5722]: Connection closed by 147.75.109.163 port 41976 Sep 9 05:35:43.162679 sshd-session[5719]: pam_unix(sshd:session): session closed for user core Sep 9 05:35:43.171608 systemd[1]: sshd@10-172.31.26.176:22-147.75.109.163:41976.service: Deactivated successfully. Sep 9 05:35:43.176842 systemd[1]: session-11.scope: Deactivated successfully. Sep 9 05:35:43.179747 systemd-logind[1878]: Session 11 logged out. Waiting for processes to exit. Sep 9 05:35:43.182845 systemd-logind[1878]: Removed session 11. 
Sep 9 05:35:43.861055 containerd[1938]: time="2025-09-09T05:35:43.861012962Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fb55aaf7f275d8d4355639ccb50bdd2f3557611ede45fabd3998a1cec2e0471b\" id:\"9a4ccb6bf35d87b7eca61041d6108f74d8fb06cd8046e7eb47623c34732dfad0\" pid:5772 exit_status:1 exited_at:{seconds:1757396143 nanos:860688052}" Sep 9 05:35:44.912342 containerd[1938]: time="2025-09-09T05:35:44.912300600Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fb55aaf7f275d8d4355639ccb50bdd2f3557611ede45fabd3998a1cec2e0471b\" id:\"ad8817b978da770930c868a0a33e35918c9acfaa3dea0c434f4f80bf6d5fd6f0\" pid:5800 exit_status:1 exited_at:{seconds:1757396144 nanos:911984390}" Sep 9 05:35:45.601977 containerd[1938]: time="2025-09-09T05:35:45.601919041Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:35:45.605402 containerd[1938]: time="2025-09-09T05:35:45.605312139Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864" Sep 9 05:35:45.616017 containerd[1938]: time="2025-09-09T05:35:45.615820374Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:35:45.619099 containerd[1938]: time="2025-09-09T05:35:45.619056888Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:35:45.619503 containerd[1938]: time="2025-09-09T05:35:45.619454634Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest 
\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 3.875681003s" Sep 9 05:35:45.619579 containerd[1938]: time="2025-09-09T05:35:45.619517975Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 9 05:35:45.641183 containerd[1938]: time="2025-09-09T05:35:45.641140534Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 9 05:35:45.643996 containerd[1938]: time="2025-09-09T05:35:45.643176793Z" level=info msg="CreateContainer within sandbox \"5d05df6e112bc165cae2d0637725e0ebf2d7f855685d51bdeccb5379cb0581cf\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 9 05:35:45.662326 containerd[1938]: time="2025-09-09T05:35:45.658457610Z" level=info msg="Container 02ee7be790e63dbb6ee501d600a775439455c1618d5138bc16757932fea68667: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:35:45.708794 containerd[1938]: time="2025-09-09T05:35:45.708739640Z" level=info msg="CreateContainer within sandbox \"5d05df6e112bc165cae2d0637725e0ebf2d7f855685d51bdeccb5379cb0581cf\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"02ee7be790e63dbb6ee501d600a775439455c1618d5138bc16757932fea68667\"" Sep 9 05:35:45.710675 containerd[1938]: time="2025-09-09T05:35:45.709382894Z" level=info msg="StartContainer for \"02ee7be790e63dbb6ee501d600a775439455c1618d5138bc16757932fea68667\"" Sep 9 05:35:45.711685 containerd[1938]: time="2025-09-09T05:35:45.711638367Z" level=info msg="connecting to shim 02ee7be790e63dbb6ee501d600a775439455c1618d5138bc16757932fea68667" address="unix:///run/containerd/s/ecf3af09ec8c357967b639643080ecc45164dd0b368476c910a6a8325350c4a6" protocol=ttrpc version=3 Sep 9 05:35:45.739117 systemd[1]: Started cri-containerd-02ee7be790e63dbb6ee501d600a775439455c1618d5138bc16757932fea68667.scope - libcontainer 
container 02ee7be790e63dbb6ee501d600a775439455c1618d5138bc16757932fea68667. Sep 9 05:35:45.842979 containerd[1938]: time="2025-09-09T05:35:45.842936244Z" level=info msg="StartContainer for \"02ee7be790e63dbb6ee501d600a775439455c1618d5138bc16757932fea68667\" returns successfully" Sep 9 05:35:46.089576 containerd[1938]: time="2025-09-09T05:35:46.089471175Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:35:46.093505 containerd[1938]: time="2025-09-09T05:35:46.092646925Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 9 05:35:46.095688 containerd[1938]: time="2025-09-09T05:35:46.095629564Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 454.44799ms" Sep 9 05:35:46.095688 containerd[1938]: time="2025-09-09T05:35:46.095691563Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 9 05:35:46.109980 containerd[1938]: time="2025-09-09T05:35:46.109939503Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 9 05:35:46.116261 containerd[1938]: time="2025-09-09T05:35:46.116210930Z" level=info msg="CreateContainer within sandbox \"3084307e3950570fa36f1dc74cc8e13e1dc5927da79745289bb18baf0ca2c204\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 9 05:35:46.170954 containerd[1938]: time="2025-09-09T05:35:46.170900238Z" level=info msg="Container 53e48645d576b5e3b6b3f95a61bea699c55bdf29af9785c825fd1f43e39f4d35: CDI devices 
from CRI Config.CDIDevices: []" Sep 9 05:35:46.209963 containerd[1938]: time="2025-09-09T05:35:46.209919634Z" level=info msg="CreateContainer within sandbox \"3084307e3950570fa36f1dc74cc8e13e1dc5927da79745289bb18baf0ca2c204\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"53e48645d576b5e3b6b3f95a61bea699c55bdf29af9785c825fd1f43e39f4d35\"" Sep 9 05:35:46.508188 containerd[1938]: time="2025-09-09T05:35:46.506589655Z" level=info msg="StartContainer for \"53e48645d576b5e3b6b3f95a61bea699c55bdf29af9785c825fd1f43e39f4d35\"" Sep 9 05:35:46.511245 containerd[1938]: time="2025-09-09T05:35:46.511202466Z" level=info msg="connecting to shim 53e48645d576b5e3b6b3f95a61bea699c55bdf29af9785c825fd1f43e39f4d35" address="unix:///run/containerd/s/0299492c4431792736511bb38303df498c07e11f58fc443fac051282ebecc9b2" protocol=ttrpc version=3 Sep 9 05:35:46.563717 systemd[1]: Started cri-containerd-53e48645d576b5e3b6b3f95a61bea699c55bdf29af9785c825fd1f43e39f4d35.scope - libcontainer container 53e48645d576b5e3b6b3f95a61bea699c55bdf29af9785c825fd1f43e39f4d35. 
Sep 9 05:35:46.763915 containerd[1938]: time="2025-09-09T05:35:46.763345324Z" level=info msg="StartContainer for \"53e48645d576b5e3b6b3f95a61bea699c55bdf29af9785c825fd1f43e39f4d35\" returns successfully" Sep 9 05:35:47.027119 kubelet[3331]: I0909 05:35:47.026966 3331 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-7988f88666-pbtr9" podStartSLOduration=31.045741533 podStartE2EDuration="39.015777668s" podCreationTimestamp="2025-09-09 05:35:08 +0000 UTC" firstStartedPulling="2025-09-09 05:35:33.772859035 +0000 UTC m=+47.733155689" lastFinishedPulling="2025-09-09 05:35:41.742895168 +0000 UTC m=+55.703191824" observedRunningTime="2025-09-09 05:35:42.791388718 +0000 UTC m=+56.751685393" watchObservedRunningTime="2025-09-09 05:35:47.015777668 +0000 UTC m=+60.976074336" Sep 9 05:35:47.030898 kubelet[3331]: I0909 05:35:47.027343 3331 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-77545c669c-rfntj" podStartSLOduration=30.827566135 podStartE2EDuration="43.027320594s" podCreationTimestamp="2025-09-09 05:35:04 +0000 UTC" firstStartedPulling="2025-09-09 05:35:33.909575321 +0000 UTC m=+47.869871968" lastFinishedPulling="2025-09-09 05:35:46.109329775 +0000 UTC m=+60.069626427" observedRunningTime="2025-09-09 05:35:47.015016587 +0000 UTC m=+60.975313251" watchObservedRunningTime="2025-09-09 05:35:47.027320594 +0000 UTC m=+60.987617257" Sep 9 05:35:48.043568 kubelet[3331]: I0909 05:35:48.043527 3331 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 05:35:48.204202 systemd[1]: Started sshd@11-172.31.26.176:22-147.75.109.163:41986.service - OpenSSH per-connection server daemon (147.75.109.163:41986). 
Sep 9 05:35:48.505507 sshd[5899]: Accepted publickey for core from 147.75.109.163 port 41986 ssh2: RSA SHA256:k1gUnX9WA3dyp6ylgbUnG2K6cUpm99lcEZsxDzZ5bM4 Sep 9 05:35:48.508593 sshd-session[5899]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:35:48.515895 systemd-logind[1878]: New session 12 of user core. Sep 9 05:35:48.523923 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 9 05:35:49.014167 kubelet[3331]: I0909 05:35:49.014109 3331 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 05:35:49.934035 containerd[1938]: time="2025-09-09T05:35:49.933974692Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fb55aaf7f275d8d4355639ccb50bdd2f3557611ede45fabd3998a1cec2e0471b\" id:\"c627b6b52513fb7a727c08c3e8bc1bab8093e29b2f531e53e6d3f70c55c134cd\" pid:5932 exited_at:{seconds:1757396149 nanos:933151641}" Sep 9 05:35:49.950210 sshd[5902]: Connection closed by 147.75.109.163 port 41986 Sep 9 05:35:49.951687 sshd-session[5899]: pam_unix(sshd:session): session closed for user core Sep 9 05:35:49.963303 systemd[1]: sshd@11-172.31.26.176:22-147.75.109.163:41986.service: Deactivated successfully. Sep 9 05:35:49.977111 systemd[1]: session-12.scope: Deactivated successfully. Sep 9 05:35:49.980758 systemd-logind[1878]: Session 12 logged out. Waiting for processes to exit. Sep 9 05:35:50.007183 systemd[1]: Started sshd@12-172.31.26.176:22-147.75.109.163:50256.service - OpenSSH per-connection server daemon (147.75.109.163:50256). Sep 9 05:35:50.016813 systemd-logind[1878]: Removed session 12. 
Sep 9 05:35:50.103340 kubelet[3331]: I0909 05:35:50.103262 3331 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-77545c669c-j4l25" podStartSLOduration=34.32980383 podStartE2EDuration="46.092496713s" podCreationTimestamp="2025-09-09 05:35:04 +0000 UTC" firstStartedPulling="2025-09-09 05:35:33.878358843 +0000 UTC m=+47.838655493" lastFinishedPulling="2025-09-09 05:35:45.641051722 +0000 UTC m=+59.601348376" observedRunningTime="2025-09-09 05:35:47.058037705 +0000 UTC m=+61.018334372" watchObservedRunningTime="2025-09-09 05:35:50.092496713 +0000 UTC m=+64.052793381" Sep 9 05:35:50.316096 sshd[5948]: Accepted publickey for core from 147.75.109.163 port 50256 ssh2: RSA SHA256:k1gUnX9WA3dyp6ylgbUnG2K6cUpm99lcEZsxDzZ5bM4 Sep 9 05:35:50.322097 sshd-session[5948]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:35:50.337620 systemd-logind[1878]: New session 13 of user core. Sep 9 05:35:50.344716 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 9 05:35:51.684953 sshd[5954]: Connection closed by 147.75.109.163 port 50256 Sep 9 05:35:51.690154 sshd-session[5948]: pam_unix(sshd:session): session closed for user core Sep 9 05:35:51.726329 systemd[1]: sshd@12-172.31.26.176:22-147.75.109.163:50256.service: Deactivated successfully. Sep 9 05:35:51.731788 systemd[1]: session-13.scope: Deactivated successfully. Sep 9 05:35:51.735991 systemd-logind[1878]: Session 13 logged out. Waiting for processes to exit. Sep 9 05:35:51.746217 systemd[1]: Started sshd@13-172.31.26.176:22-147.75.109.163:50266.service - OpenSSH per-connection server daemon (147.75.109.163:50266). Sep 9 05:35:51.759077 systemd-logind[1878]: Removed session 13. 
Sep 9 05:35:52.066988 sshd[5970]: Accepted publickey for core from 147.75.109.163 port 50266 ssh2: RSA SHA256:k1gUnX9WA3dyp6ylgbUnG2K6cUpm99lcEZsxDzZ5bM4 Sep 9 05:35:52.071987 sshd-session[5970]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:35:52.093588 systemd-logind[1878]: New session 14 of user core. Sep 9 05:35:52.094749 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 9 05:35:52.327392 containerd[1938]: time="2025-09-09T05:35:52.326824554Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:35:52.335070 containerd[1938]: time="2025-09-09T05:35:52.334997375Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746" Sep 9 05:35:52.337503 containerd[1938]: time="2025-09-09T05:35:52.336445108Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:35:52.340587 containerd[1938]: time="2025-09-09T05:35:52.340504496Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:35:52.343577 containerd[1938]: time="2025-09-09T05:35:52.343531521Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 6.232971526s" Sep 9 05:35:52.344230 containerd[1938]: time="2025-09-09T05:35:52.343705881Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\"" Sep 9 05:35:52.403392 containerd[1938]: time="2025-09-09T05:35:52.399086608Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 9 05:35:52.631587 containerd[1938]: time="2025-09-09T05:35:52.631448539Z" level=info msg="CreateContainer within sandbox \"b49ddc25482b704a2df296704e3e7ec58164a32b7a5e70bb300d69cf36a0660b\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 9 05:35:52.671648 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3233455725.mount: Deactivated successfully. Sep 9 05:35:52.681673 containerd[1938]: time="2025-09-09T05:35:52.681627199Z" level=info msg="Container 44b0b721dd84e95ae2bf4e788f8a98584691c92271313dd037d8632750916136: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:35:52.712776 containerd[1938]: time="2025-09-09T05:35:52.712720890Z" level=info msg="CreateContainer within sandbox \"b49ddc25482b704a2df296704e3e7ec58164a32b7a5e70bb300d69cf36a0660b\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"44b0b721dd84e95ae2bf4e788f8a98584691c92271313dd037d8632750916136\"" Sep 9 05:35:52.724168 containerd[1938]: time="2025-09-09T05:35:52.724121392Z" level=info msg="StartContainer for \"44b0b721dd84e95ae2bf4e788f8a98584691c92271313dd037d8632750916136\"" Sep 9 05:35:52.731996 containerd[1938]: time="2025-09-09T05:35:52.731942732Z" level=info msg="connecting to shim 44b0b721dd84e95ae2bf4e788f8a98584691c92271313dd037d8632750916136" address="unix:///run/containerd/s/c88632394b53c65d409c69ca9efcddfc7438fad9e4e818dd23c61918ec3828c5" protocol=ttrpc version=3 Sep 9 05:35:52.856569 sshd[5973]: Connection closed by 147.75.109.163 port 50266 Sep 9 05:35:52.859342 sshd-session[5970]: pam_unix(sshd:session): session closed for user core Sep 9 05:35:52.867150 systemd[1]: 
sshd@13-172.31.26.176:22-147.75.109.163:50266.service: Deactivated successfully. Sep 9 05:35:52.871423 systemd[1]: session-14.scope: Deactivated successfully. Sep 9 05:35:52.876036 systemd-logind[1878]: Session 14 logged out. Waiting for processes to exit. Sep 9 05:35:52.879051 systemd-logind[1878]: Removed session 14. Sep 9 05:35:52.925822 systemd[1]: Started cri-containerd-44b0b721dd84e95ae2bf4e788f8a98584691c92271313dd037d8632750916136.scope - libcontainer container 44b0b721dd84e95ae2bf4e788f8a98584691c92271313dd037d8632750916136. Sep 9 05:35:53.018924 containerd[1938]: time="2025-09-09T05:35:53.018864048Z" level=info msg="StartContainer for \"44b0b721dd84e95ae2bf4e788f8a98584691c92271313dd037d8632750916136\" returns successfully" Sep 9 05:35:53.177675 kubelet[3331]: I0909 05:35:53.173813 3331 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-69b9ddd587-48l4w" podStartSLOduration=28.234885788 podStartE2EDuration="44.173787135s" podCreationTimestamp="2025-09-09 05:35:09 +0000 UTC" firstStartedPulling="2025-09-09 05:35:36.428389886 +0000 UTC m=+50.388686528" lastFinishedPulling="2025-09-09 05:35:52.367291219 +0000 UTC m=+66.327587875" observedRunningTime="2025-09-09 05:35:53.166322101 +0000 UTC m=+67.126618762" watchObservedRunningTime="2025-09-09 05:35:53.173787135 +0000 UTC m=+67.134083794" Sep 9 05:35:54.441113 containerd[1938]: time="2025-09-09T05:35:54.441054664Z" level=info msg="TaskExit event in podsandbox handler container_id:\"44b0b721dd84e95ae2bf4e788f8a98584691c92271313dd037d8632750916136\" id:\"04b06e6b92130d39a9eea1c936ca6f34254734b23272581ee57f0b2676c7d364\" pid:6042 exited_at:{seconds:1757396154 nanos:398685848}" Sep 9 05:35:54.840556 containerd[1938]: time="2025-09-09T05:35:54.840306103Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:35:54.844939 containerd[1938]: 
time="2025-09-09T05:35:54.844139882Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542" Sep 9 05:35:54.848279 containerd[1938]: time="2025-09-09T05:35:54.847739418Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:35:54.855086 containerd[1938]: time="2025-09-09T05:35:54.855036714Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:35:54.880023 containerd[1938]: time="2025-09-09T05:35:54.879859294Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 2.475215137s" Sep 9 05:35:54.880023 containerd[1938]: time="2025-09-09T05:35:54.879910103Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\"" Sep 9 05:35:54.901133 containerd[1938]: time="2025-09-09T05:35:54.900568216Z" level=info msg="CreateContainer within sandbox \"32b658285114bb88e7e301d5944c9390ea91d450a84b25e59d7b35e63dba0757\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 9 05:35:54.963861 containerd[1938]: time="2025-09-09T05:35:54.963813280Z" level=info msg="Container ad93c7c77aacbc1fd1ece90d5c3168f88c152d7b38ae55565349c459ea6b3b40: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:35:54.983392 containerd[1938]: 
time="2025-09-09T05:35:54.982682893Z" level=info msg="CreateContainer within sandbox \"32b658285114bb88e7e301d5944c9390ea91d450a84b25e59d7b35e63dba0757\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"ad93c7c77aacbc1fd1ece90d5c3168f88c152d7b38ae55565349c459ea6b3b40\"" Sep 9 05:35:54.985148 containerd[1938]: time="2025-09-09T05:35:54.983816725Z" level=info msg="StartContainer for \"ad93c7c77aacbc1fd1ece90d5c3168f88c152d7b38ae55565349c459ea6b3b40\"" Sep 9 05:35:55.034032 containerd[1938]: time="2025-09-09T05:35:55.033926304Z" level=info msg="connecting to shim ad93c7c77aacbc1fd1ece90d5c3168f88c152d7b38ae55565349c459ea6b3b40" address="unix:///run/containerd/s/216ae2f72e70907cefd1c608da605d9c7e2ab03465e2492b8b4643659daf87f7" protocol=ttrpc version=3 Sep 9 05:35:55.119913 systemd[1]: Started cri-containerd-ad93c7c77aacbc1fd1ece90d5c3168f88c152d7b38ae55565349c459ea6b3b40.scope - libcontainer container ad93c7c77aacbc1fd1ece90d5c3168f88c152d7b38ae55565349c459ea6b3b40. 
Sep 9 05:35:55.231690 containerd[1938]: time="2025-09-09T05:35:55.231641284Z" level=info msg="StartContainer for \"ad93c7c77aacbc1fd1ece90d5c3168f88c152d7b38ae55565349c459ea6b3b40\" returns successfully" Sep 9 05:35:55.835843 kubelet[3331]: I0909 05:35:55.835270 3331 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 05:35:56.273286 kubelet[3331]: I0909 05:35:56.273192 3331 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-zk5mx" podStartSLOduration=26.06777743 podStartE2EDuration="47.273170919s" podCreationTimestamp="2025-09-09 05:35:09 +0000 UTC" firstStartedPulling="2025-09-09 05:35:33.679128122 +0000 UTC m=+47.639424776" lastFinishedPulling="2025-09-09 05:35:54.884521623 +0000 UTC m=+68.844818265" observedRunningTime="2025-09-09 05:35:56.270283513 +0000 UTC m=+70.230580175" watchObservedRunningTime="2025-09-09 05:35:56.273170919 +0000 UTC m=+70.233467577" Sep 9 05:35:56.675090 kubelet[3331]: I0909 05:35:56.662673 3331 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 9 05:35:56.682739 kubelet[3331]: I0909 05:35:56.682709 3331 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 9 05:35:57.888878 systemd[1]: Started sshd@14-172.31.26.176:22-147.75.109.163:50282.service - OpenSSH per-connection server daemon (147.75.109.163:50282). Sep 9 05:35:58.142070 sshd[6092]: Accepted publickey for core from 147.75.109.163 port 50282 ssh2: RSA SHA256:k1gUnX9WA3dyp6ylgbUnG2K6cUpm99lcEZsxDzZ5bM4 Sep 9 05:35:58.144596 sshd-session[6092]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:35:58.152024 systemd-logind[1878]: New session 15 of user core. Sep 9 05:35:58.154780 systemd[1]: Started session-15.scope - Session 15 of User core. 
Sep 9 05:35:59.314600 sshd[6095]: Connection closed by 147.75.109.163 port 50282 Sep 9 05:35:59.316790 sshd-session[6092]: pam_unix(sshd:session): session closed for user core Sep 9 05:35:59.323639 systemd[1]: sshd@14-172.31.26.176:22-147.75.109.163:50282.service: Deactivated successfully. Sep 9 05:35:59.326884 systemd[1]: session-15.scope: Deactivated successfully. Sep 9 05:35:59.329114 systemd-logind[1878]: Session 15 logged out. Waiting for processes to exit. Sep 9 05:35:59.331402 systemd-logind[1878]: Removed session 15. Sep 9 05:35:59.361523 systemd[1]: Started sshd@15-172.31.26.176:22-147.75.109.163:50298.service - OpenSSH per-connection server daemon (147.75.109.163:50298). Sep 9 05:35:59.545940 sshd[6106]: Accepted publickey for core from 147.75.109.163 port 50298 ssh2: RSA SHA256:k1gUnX9WA3dyp6ylgbUnG2K6cUpm99lcEZsxDzZ5bM4 Sep 9 05:35:59.547378 sshd-session[6106]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:35:59.552534 systemd-logind[1878]: New session 16 of user core. Sep 9 05:35:59.559245 systemd[1]: Started session-16.scope - Session 16 of User core. Sep 9 05:36:00.515949 sshd[6109]: Connection closed by 147.75.109.163 port 50298 Sep 9 05:36:00.516446 sshd-session[6106]: pam_unix(sshd:session): session closed for user core Sep 9 05:36:00.529650 systemd[1]: sshd@15-172.31.26.176:22-147.75.109.163:50298.service: Deactivated successfully. Sep 9 05:36:00.541935 systemd[1]: session-16.scope: Deactivated successfully. Sep 9 05:36:00.546572 systemd-logind[1878]: Session 16 logged out. Waiting for processes to exit. Sep 9 05:36:00.565567 systemd[1]: Started sshd@16-172.31.26.176:22-147.75.109.163:38142.service - OpenSSH per-connection server daemon (147.75.109.163:38142). Sep 9 05:36:00.572251 systemd-logind[1878]: Removed session 16. 
Sep 9 05:36:00.795364 sshd[6119]: Accepted publickey for core from 147.75.109.163 port 38142 ssh2: RSA SHA256:k1gUnX9WA3dyp6ylgbUnG2K6cUpm99lcEZsxDzZ5bM4 Sep 9 05:36:00.797602 sshd-session[6119]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:36:00.807139 systemd-logind[1878]: New session 17 of user core. Sep 9 05:36:00.813904 systemd[1]: Started session-17.scope - Session 17 of User core. Sep 9 05:36:05.293806 sshd[6122]: Connection closed by 147.75.109.163 port 38142 Sep 9 05:36:05.304193 sshd-session[6119]: pam_unix(sshd:session): session closed for user core Sep 9 05:36:05.360990 systemd[1]: sshd@16-172.31.26.176:22-147.75.109.163:38142.service: Deactivated successfully. Sep 9 05:36:05.369858 systemd[1]: session-17.scope: Deactivated successfully. Sep 9 05:36:05.371258 systemd[1]: session-17.scope: Consumed 783ms CPU time, 73M memory peak. Sep 9 05:36:05.375556 systemd-logind[1878]: Session 17 logged out. Waiting for processes to exit. Sep 9 05:36:05.388568 systemd[1]: Started sshd@17-172.31.26.176:22-147.75.109.163:38146.service - OpenSSH per-connection server daemon (147.75.109.163:38146). Sep 9 05:36:05.397226 systemd-logind[1878]: Removed session 17. Sep 9 05:36:05.694395 sshd[6142]: Accepted publickey for core from 147.75.109.163 port 38146 ssh2: RSA SHA256:k1gUnX9WA3dyp6ylgbUnG2K6cUpm99lcEZsxDzZ5bM4 Sep 9 05:36:05.697307 sshd-session[6142]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:36:05.707198 systemd-logind[1878]: New session 18 of user core. Sep 9 05:36:05.715393 systemd[1]: Started session-18.scope - Session 18 of User core. Sep 9 05:36:06.933377 sshd[6146]: Connection closed by 147.75.109.163 port 38146 Sep 9 05:36:06.934113 sshd-session[6142]: pam_unix(sshd:session): session closed for user core Sep 9 05:36:06.945639 systemd[1]: sshd@17-172.31.26.176:22-147.75.109.163:38146.service: Deactivated successfully. 
Sep 9 05:36:06.948552 systemd-logind[1878]: Session 18 logged out. Waiting for processes to exit. Sep 9 05:36:06.951955 systemd[1]: session-18.scope: Deactivated successfully. Sep 9 05:36:06.973544 systemd-logind[1878]: Removed session 18. Sep 9 05:36:06.974535 systemd[1]: Started sshd@18-172.31.26.176:22-147.75.109.163:38152.service - OpenSSH per-connection server daemon (147.75.109.163:38152). Sep 9 05:36:07.224378 sshd[6156]: Accepted publickey for core from 147.75.109.163 port 38152 ssh2: RSA SHA256:k1gUnX9WA3dyp6ylgbUnG2K6cUpm99lcEZsxDzZ5bM4 Sep 9 05:36:07.226233 sshd-session[6156]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:36:07.232533 systemd-logind[1878]: New session 19 of user core. Sep 9 05:36:07.236653 systemd[1]: Started session-19.scope - Session 19 of User core. Sep 9 05:36:07.610579 sshd[6159]: Connection closed by 147.75.109.163 port 38152 Sep 9 05:36:07.612202 sshd-session[6156]: pam_unix(sshd:session): session closed for user core Sep 9 05:36:07.621311 systemd[1]: sshd@18-172.31.26.176:22-147.75.109.163:38152.service: Deactivated successfully. Sep 9 05:36:07.629948 systemd[1]: session-19.scope: Deactivated successfully. Sep 9 05:36:07.634791 systemd-logind[1878]: Session 19 logged out. Waiting for processes to exit. Sep 9 05:36:07.640154 systemd-logind[1878]: Removed session 19. Sep 9 05:36:09.251391 containerd[1938]: time="2025-09-09T05:36:09.251337116Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a7a935f19da868b5a950c7c20d19b09767d48df7322f7f65c9cd8eb4082003a4\" id:\"b2f4608509580c7b92e3be67d8e015948531de4607f24d024fb4090541098bdc\" pid:6184 exited_at:{seconds:1757396169 nanos:12200908}" Sep 9 05:36:12.651819 systemd[1]: Started sshd@19-172.31.26.176:22-147.75.109.163:48588.service - OpenSSH per-connection server daemon (147.75.109.163:48588). 
Sep 9 05:36:12.997090 sshd[6205]: Accepted publickey for core from 147.75.109.163 port 48588 ssh2: RSA SHA256:k1gUnX9WA3dyp6ylgbUnG2K6cUpm99lcEZsxDzZ5bM4
Sep 9 05:36:13.000020 sshd-session[6205]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:36:13.008562 systemd-logind[1878]: New session 20 of user core.
Sep 9 05:36:13.014179 systemd[1]: Started session-20.scope - Session 20 of User core.
Sep 9 05:36:13.680813 sshd[6208]: Connection closed by 147.75.109.163 port 48588
Sep 9 05:36:13.682731 sshd-session[6205]: pam_unix(sshd:session): session closed for user core
Sep 9 05:36:13.689960 systemd-logind[1878]: Session 20 logged out. Waiting for processes to exit.
Sep 9 05:36:13.691209 systemd[1]: sshd@19-172.31.26.176:22-147.75.109.163:48588.service: Deactivated successfully.
Sep 9 05:36:13.697044 systemd[1]: session-20.scope: Deactivated successfully.
Sep 9 05:36:13.701285 systemd-logind[1878]: Removed session 20.
Sep 9 05:36:18.722736 systemd[1]: Started sshd@20-172.31.26.176:22-147.75.109.163:48592.service - OpenSSH per-connection server daemon (147.75.109.163:48592).
Sep 9 05:36:18.968125 sshd[6222]: Accepted publickey for core from 147.75.109.163 port 48592 ssh2: RSA SHA256:k1gUnX9WA3dyp6ylgbUnG2K6cUpm99lcEZsxDzZ5bM4
Sep 9 05:36:18.970907 sshd-session[6222]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:36:18.979425 systemd-logind[1878]: New session 21 of user core.
Sep 9 05:36:18.985765 systemd[1]: Started session-21.scope - Session 21 of User core.
Sep 9 05:36:19.809443 containerd[1938]: time="2025-09-09T05:36:19.809393073Z" level=info msg="TaskExit event in podsandbox handler container_id:\"44b0b721dd84e95ae2bf4e788f8a98584691c92271313dd037d8632750916136\" id:\"8e18b9ad52681817ca2a0d030851ae6e3909f7d9d1c0bc0ac178067c3f12bed7\" pid:6260 exited_at:{seconds:1757396179 nanos:800238188}"
Sep 9 05:36:20.204767 sshd[6225]: Connection closed by 147.75.109.163 port 48592
Sep 9 05:36:20.208344 sshd-session[6222]: pam_unix(sshd:session): session closed for user core
Sep 9 05:36:20.220795 systemd[1]: sshd@20-172.31.26.176:22-147.75.109.163:48592.service: Deactivated successfully.
Sep 9 05:36:20.227117 systemd[1]: session-21.scope: Deactivated successfully.
Sep 9 05:36:20.230601 systemd-logind[1878]: Session 21 logged out. Waiting for processes to exit.
Sep 9 05:36:20.232950 systemd-logind[1878]: Removed session 21.
Sep 9 05:36:20.446172 containerd[1938]: time="2025-09-09T05:36:20.446114950Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fb55aaf7f275d8d4355639ccb50bdd2f3557611ede45fabd3998a1cec2e0471b\" id:\"3c1846ccc06d8577714a64624c2b5729609551d433d0e618816f93f12949264e\" pid:6248 exited_at:{seconds:1757396180 nanos:445302344}"
Sep 9 05:36:25.245346 systemd[1]: Started sshd@21-172.31.26.176:22-147.75.109.163:50038.service - OpenSSH per-connection server daemon (147.75.109.163:50038).
Sep 9 05:36:25.554746 sshd[6285]: Accepted publickey for core from 147.75.109.163 port 50038 ssh2: RSA SHA256:k1gUnX9WA3dyp6ylgbUnG2K6cUpm99lcEZsxDzZ5bM4
Sep 9 05:36:25.557906 sshd-session[6285]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:36:25.565164 systemd-logind[1878]: New session 22 of user core.
Sep 9 05:36:25.576705 systemd[1]: Started session-22.scope - Session 22 of User core.
Sep 9 05:36:26.383675 sshd[6288]: Connection closed by 147.75.109.163 port 50038
Sep 9 05:36:26.385717 sshd-session[6285]: pam_unix(sshd:session): session closed for user core
Sep 9 05:36:26.395418 systemd-logind[1878]: Session 22 logged out. Waiting for processes to exit.
Sep 9 05:36:26.395961 systemd[1]: sshd@21-172.31.26.176:22-147.75.109.163:50038.service: Deactivated successfully.
Sep 9 05:36:26.402079 systemd[1]: session-22.scope: Deactivated successfully.
Sep 9 05:36:26.406863 systemd-logind[1878]: Removed session 22.
Sep 9 05:36:26.979135 containerd[1938]: time="2025-09-09T05:36:26.979090929Z" level=info msg="TaskExit event in podsandbox handler container_id:\"44b0b721dd84e95ae2bf4e788f8a98584691c92271313dd037d8632750916136\" id:\"21ee587bb7c0f22d01d00b5c358414ef242dc6c97f67d1aade34d433cc7b06aa\" pid:6310 exited_at:{seconds:1757396186 nanos:978658150}"
Sep 9 05:36:31.348128 containerd[1938]: time="2025-09-09T05:36:31.348082934Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fb55aaf7f275d8d4355639ccb50bdd2f3557611ede45fabd3998a1cec2e0471b\" id:\"ef42abaa8a24b07852d1bd19b89821d373284c6c243cb668d860b85a7627fc8a\" pid:6331 exited_at:{seconds:1757396191 nanos:330997275}"
Sep 9 05:36:31.422527 systemd[1]: Started sshd@22-172.31.26.176:22-147.75.109.163:32922.service - OpenSSH per-connection server daemon (147.75.109.163:32922).
Sep 9 05:36:31.699663 sshd[6342]: Accepted publickey for core from 147.75.109.163 port 32922 ssh2: RSA SHA256:k1gUnX9WA3dyp6ylgbUnG2K6cUpm99lcEZsxDzZ5bM4
Sep 9 05:36:31.709429 sshd-session[6342]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:36:31.728570 systemd-logind[1878]: New session 23 of user core.
Sep 9 05:36:31.734985 systemd[1]: Started session-23.scope - Session 23 of User core.
Sep 9 05:36:33.138512 sshd[6345]: Connection closed by 147.75.109.163 port 32922
Sep 9 05:36:33.146212 sshd-session[6342]: pam_unix(sshd:session): session closed for user core
Sep 9 05:36:33.164046 systemd-logind[1878]: Session 23 logged out. Waiting for processes to exit.
Sep 9 05:36:33.165803 systemd[1]: sshd@22-172.31.26.176:22-147.75.109.163:32922.service: Deactivated successfully.
Sep 9 05:36:33.170002 systemd[1]: session-23.scope: Deactivated successfully.
Sep 9 05:36:33.174123 systemd-logind[1878]: Removed session 23.
Sep 9 05:36:38.179900 systemd[1]: Started sshd@23-172.31.26.176:22-147.75.109.163:32938.service - OpenSSH per-connection server daemon (147.75.109.163:32938).
Sep 9 05:36:38.547311 sshd[6374]: Accepted publickey for core from 147.75.109.163 port 32938 ssh2: RSA SHA256:k1gUnX9WA3dyp6ylgbUnG2K6cUpm99lcEZsxDzZ5bM4
Sep 9 05:36:38.549119 sshd-session[6374]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:36:38.563350 systemd-logind[1878]: New session 24 of user core.
Sep 9 05:36:38.573386 systemd[1]: Started session-24.scope - Session 24 of User core.
Sep 9 05:36:39.199254 containerd[1938]: time="2025-09-09T05:36:39.199178838Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a7a935f19da868b5a950c7c20d19b09767d48df7322f7f65c9cd8eb4082003a4\" id:\"983473089e68bde400124cc817bbc212c41cc46fb6697eb5cad8a307560ea473\" pid:6367 exited_at:{seconds:1757396199 nanos:194195570}"
Sep 9 05:36:39.750609 sshd[6381]: Connection closed by 147.75.109.163 port 32938
Sep 9 05:36:39.752707 sshd-session[6374]: pam_unix(sshd:session): session closed for user core
Sep 9 05:36:39.762735 systemd-logind[1878]: Session 24 logged out. Waiting for processes to exit.
Sep 9 05:36:39.766178 systemd[1]: sshd@23-172.31.26.176:22-147.75.109.163:32938.service: Deactivated successfully.
Sep 9 05:36:39.770355 systemd[1]: session-24.scope: Deactivated successfully.
Sep 9 05:36:39.773885 systemd-logind[1878]: Removed session 24.
Sep 9 05:36:49.529634 containerd[1938]: time="2025-09-09T05:36:49.529400296Z" level=info msg="TaskExit event in podsandbox handler container_id:\"44b0b721dd84e95ae2bf4e788f8a98584691c92271313dd037d8632750916136\" id:\"45a385d5f5ad7b6779c3145be8deea0fef300aff866cf548740aa0c0b984fe12\" pid:6426 exit_status:1 exited_at:{seconds:1757396209 nanos:515593558}"
Sep 9 05:36:49.837908 containerd[1938]: time="2025-09-09T05:36:49.837791177Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fb55aaf7f275d8d4355639ccb50bdd2f3557611ede45fabd3998a1cec2e0471b\" id:\"99b3f646ad28d5ffb7a9364d4fdf7bb20bfd900a333b97f3d9cad65adb35efe1\" pid:6425 exited_at:{seconds:1757396209 nanos:837310818}"
Sep 9 05:36:53.238771 systemd[1]: cri-containerd-37a2307e24a0e4b1fafcd24cb164bb2fc5821ce35696ccaead44d47fb5849d03.scope: Deactivated successfully.
Sep 9 05:36:53.239643 systemd[1]: cri-containerd-37a2307e24a0e4b1fafcd24cb164bb2fc5821ce35696ccaead44d47fb5849d03.scope: Consumed 3.300s CPU time, 81.8M memory peak, 117.1M read from disk.
Sep 9 05:36:53.372918 containerd[1938]: time="2025-09-09T05:36:53.372867367Z" level=info msg="TaskExit event in podsandbox handler container_id:\"37a2307e24a0e4b1fafcd24cb164bb2fc5821ce35696ccaead44d47fb5849d03\" id:\"37a2307e24a0e4b1fafcd24cb164bb2fc5821ce35696ccaead44d47fb5849d03\" pid:3165 exit_status:1 exited_at:{seconds:1757396213 nanos:334815712}"
Sep 9 05:36:53.383788 containerd[1938]: time="2025-09-09T05:36:53.383677417Z" level=info msg="received exit event container_id:\"37a2307e24a0e4b1fafcd24cb164bb2fc5821ce35696ccaead44d47fb5849d03\" id:\"37a2307e24a0e4b1fafcd24cb164bb2fc5821ce35696ccaead44d47fb5849d03\" pid:3165 exit_status:1 exited_at:{seconds:1757396213 nanos:334815712}"
Sep 9 05:36:53.517581 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-37a2307e24a0e4b1fafcd24cb164bb2fc5821ce35696ccaead44d47fb5849d03-rootfs.mount: Deactivated successfully.
Sep 9 05:36:53.569026 systemd[1]: cri-containerd-9fc12a00cf163214dbac6f03e12c473308eb0b46a939cf45d9ce13566c2e8c5e.scope: Deactivated successfully.
Sep 9 05:36:53.569402 systemd[1]: cri-containerd-9fc12a00cf163214dbac6f03e12c473308eb0b46a939cf45d9ce13566c2e8c5e.scope: Consumed 13.024s CPU time, 104.8M memory peak, 93.4M read from disk.
Sep 9 05:36:53.575186 containerd[1938]: time="2025-09-09T05:36:53.574252570Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9fc12a00cf163214dbac6f03e12c473308eb0b46a939cf45d9ce13566c2e8c5e\" id:\"9fc12a00cf163214dbac6f03e12c473308eb0b46a939cf45d9ce13566c2e8c5e\" pid:3836 exit_status:1 exited_at:{seconds:1757396213 nanos:573684277}"
Sep 9 05:36:53.575186 containerd[1938]: time="2025-09-09T05:36:53.574769420Z" level=info msg="received exit event container_id:\"9fc12a00cf163214dbac6f03e12c473308eb0b46a939cf45d9ce13566c2e8c5e\" id:\"9fc12a00cf163214dbac6f03e12c473308eb0b46a939cf45d9ce13566c2e8c5e\" pid:3836 exit_status:1 exited_at:{seconds:1757396213 nanos:573684277}"
Sep 9 05:36:53.644736 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9fc12a00cf163214dbac6f03e12c473308eb0b46a939cf45d9ce13566c2e8c5e-rootfs.mount: Deactivated successfully.
Sep 9 05:36:54.282946 kubelet[3331]: I0909 05:36:54.281221 3331 scope.go:117] "RemoveContainer" containerID="9fc12a00cf163214dbac6f03e12c473308eb0b46a939cf45d9ce13566c2e8c5e"
Sep 9 05:36:54.290161 kubelet[3331]: I0909 05:36:54.290040 3331 scope.go:117] "RemoveContainer" containerID="37a2307e24a0e4b1fafcd24cb164bb2fc5821ce35696ccaead44d47fb5849d03"
Sep 9 05:36:54.393258 containerd[1938]: time="2025-09-09T05:36:54.392938879Z" level=info msg="CreateContainer within sandbox \"645ed9c4e2c18ff79297e40d12f752843e77d1a79cb8ab7913f2be36db45992f\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Sep 9 05:36:54.396957 containerd[1938]: time="2025-09-09T05:36:54.396905079Z" level=info msg="CreateContainer within sandbox \"fceb566f03e2b94aef1682456db4c9ac3299b4f280bd69c962232c50a7abef73\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Sep 9 05:36:54.577562 containerd[1938]: time="2025-09-09T05:36:54.576358781Z" level=info msg="Container df20ada8456aebb6929dac7de17cc210c97d098a5489cbc08f5066ebb0cf5643: CDI devices from CRI Config.CDIDevices: []"
Sep 9 05:36:54.581219 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount755785479.mount: Deactivated successfully.
Sep 9 05:36:54.599078 containerd[1938]: time="2025-09-09T05:36:54.599036421Z" level=info msg="Container de3000e22cae9769850faf90d2a75ed843e9d67fd748541fce78abf03c0f3129: CDI devices from CRI Config.CDIDevices: []"
Sep 9 05:36:54.604014 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4229771123.mount: Deactivated successfully.
Sep 9 05:36:54.631580 containerd[1938]: time="2025-09-09T05:36:54.631525122Z" level=info msg="CreateContainer within sandbox \"fceb566f03e2b94aef1682456db4c9ac3299b4f280bd69c962232c50a7abef73\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"df20ada8456aebb6929dac7de17cc210c97d098a5489cbc08f5066ebb0cf5643\""
Sep 9 05:36:54.634883 containerd[1938]: time="2025-09-09T05:36:54.634837915Z" level=info msg="StartContainer for \"df20ada8456aebb6929dac7de17cc210c97d098a5489cbc08f5066ebb0cf5643\""
Sep 9 05:36:54.640028 containerd[1938]: time="2025-09-09T05:36:54.638809474Z" level=info msg="CreateContainer within sandbox \"645ed9c4e2c18ff79297e40d12f752843e77d1a79cb8ab7913f2be36db45992f\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"de3000e22cae9769850faf90d2a75ed843e9d67fd748541fce78abf03c0f3129\""
Sep 9 05:36:54.640980 containerd[1938]: time="2025-09-09T05:36:54.640954091Z" level=info msg="StartContainer for \"de3000e22cae9769850faf90d2a75ed843e9d67fd748541fce78abf03c0f3129\""
Sep 9 05:36:54.642321 containerd[1938]: time="2025-09-09T05:36:54.642294867Z" level=info msg="connecting to shim de3000e22cae9769850faf90d2a75ed843e9d67fd748541fce78abf03c0f3129" address="unix:///run/containerd/s/ea8a0118e6402a07b7056d0dd061ce0e082a6d16344d5bd7f4b1c08d7d3d5ef9" protocol=ttrpc version=3
Sep 9 05:36:54.643396 containerd[1938]: time="2025-09-09T05:36:54.643366535Z" level=info msg="connecting to shim df20ada8456aebb6929dac7de17cc210c97d098a5489cbc08f5066ebb0cf5643" address="unix:///run/containerd/s/c480ea042eb7c08968efffcfea3c4048cb8658240b370d426708b3299823d7ce" protocol=ttrpc version=3
Sep 9 05:36:54.732695 systemd[1]: Started cri-containerd-de3000e22cae9769850faf90d2a75ed843e9d67fd748541fce78abf03c0f3129.scope - libcontainer container de3000e22cae9769850faf90d2a75ed843e9d67fd748541fce78abf03c0f3129.
Sep 9 05:36:54.740682 systemd[1]: Started cri-containerd-df20ada8456aebb6929dac7de17cc210c97d098a5489cbc08f5066ebb0cf5643.scope - libcontainer container df20ada8456aebb6929dac7de17cc210c97d098a5489cbc08f5066ebb0cf5643.
Sep 9 05:36:54.846948 containerd[1938]: time="2025-09-09T05:36:54.846830239Z" level=info msg="StartContainer for \"df20ada8456aebb6929dac7de17cc210c97d098a5489cbc08f5066ebb0cf5643\" returns successfully"
Sep 9 05:36:54.849737 containerd[1938]: time="2025-09-09T05:36:54.849226755Z" level=info msg="StartContainer for \"de3000e22cae9769850faf90d2a75ed843e9d67fd748541fce78abf03c0f3129\" returns successfully"
Sep 9 05:36:58.902957 kubelet[3331]: E0909 05:36:58.899114 3331 controller.go:195] "Failed to update lease" err="Put \"https://172.31.26.176:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-26-176?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Sep 9 05:36:58.950758 systemd[1]: cri-containerd-3b11f58122a420ea7cc3d28c3c3067d9fbe72756d1c913114800c2363cf07f73.scope: Deactivated successfully.
Sep 9 05:36:58.951887 systemd[1]: cri-containerd-3b11f58122a420ea7cc3d28c3c3067d9fbe72756d1c913114800c2363cf07f73.scope: Consumed 2.151s CPU time, 36.6M memory peak, 85.9M read from disk.
Sep 9 05:36:58.959647 containerd[1938]: time="2025-09-09T05:36:58.956157035Z" level=info msg="received exit event container_id:\"3b11f58122a420ea7cc3d28c3c3067d9fbe72756d1c913114800c2363cf07f73\" id:\"3b11f58122a420ea7cc3d28c3c3067d9fbe72756d1c913114800c2363cf07f73\" pid:3167 exit_status:1 exited_at:{seconds:1757396218 nanos:955315809}"
Sep 9 05:36:58.959647 containerd[1938]: time="2025-09-09T05:36:58.956332378Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3b11f58122a420ea7cc3d28c3c3067d9fbe72756d1c913114800c2363cf07f73\" id:\"3b11f58122a420ea7cc3d28c3c3067d9fbe72756d1c913114800c2363cf07f73\" pid:3167 exit_status:1 exited_at:{seconds:1757396218 nanos:955315809}"
Sep 9 05:36:59.076194 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3b11f58122a420ea7cc3d28c3c3067d9fbe72756d1c913114800c2363cf07f73-rootfs.mount: Deactivated successfully.
Sep 9 05:36:59.251068 kubelet[3331]: I0909 05:36:59.251033 3331 scope.go:117] "RemoveContainer" containerID="3b11f58122a420ea7cc3d28c3c3067d9fbe72756d1c913114800c2363cf07f73"
Sep 9 05:36:59.253625 containerd[1938]: time="2025-09-09T05:36:59.253582178Z" level=info msg="CreateContainer within sandbox \"de8350f0a681682988b29f544cf2a05bc78728e20611d6d6f0583bb56d30784c\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
Sep 9 05:36:59.273921 containerd[1938]: time="2025-09-09T05:36:59.271636064Z" level=info msg="Container faeeaf6e8aa4bc1dc57a11ec66c5a9f131268aba8d7c5af301c7f86af899738a: CDI devices from CRI Config.CDIDevices: []"
Sep 9 05:36:59.289672 containerd[1938]: time="2025-09-09T05:36:59.289607306Z" level=info msg="CreateContainer within sandbox \"de8350f0a681682988b29f544cf2a05bc78728e20611d6d6f0583bb56d30784c\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"faeeaf6e8aa4bc1dc57a11ec66c5a9f131268aba8d7c5af301c7f86af899738a\""
Sep 9 05:36:59.290171 containerd[1938]: time="2025-09-09T05:36:59.290140755Z" level=info msg="StartContainer for \"faeeaf6e8aa4bc1dc57a11ec66c5a9f131268aba8d7c5af301c7f86af899738a\""
Sep 9 05:36:59.291269 containerd[1938]: time="2025-09-09T05:36:59.291212603Z" level=info msg="connecting to shim faeeaf6e8aa4bc1dc57a11ec66c5a9f131268aba8d7c5af301c7f86af899738a" address="unix:///run/containerd/s/22ef77054589c9dca60092ef46de6bb3918d2315c34859fbac6e8644db76c05a" protocol=ttrpc version=3
Sep 9 05:36:59.317798 systemd[1]: Started cri-containerd-faeeaf6e8aa4bc1dc57a11ec66c5a9f131268aba8d7c5af301c7f86af899738a.scope - libcontainer container faeeaf6e8aa4bc1dc57a11ec66c5a9f131268aba8d7c5af301c7f86af899738a.
Sep 9 05:36:59.378065 containerd[1938]: time="2025-09-09T05:36:59.378008047Z" level=info msg="StartContainer for \"faeeaf6e8aa4bc1dc57a11ec66c5a9f131268aba8d7c5af301c7f86af899738a\" returns successfully"
Sep 9 05:37:06.514362 systemd[1]: cri-containerd-df20ada8456aebb6929dac7de17cc210c97d098a5489cbc08f5066ebb0cf5643.scope: Deactivated successfully.
Sep 9 05:37:06.515469 systemd[1]: cri-containerd-df20ada8456aebb6929dac7de17cc210c97d098a5489cbc08f5066ebb0cf5643.scope: Consumed 304ms CPU time, 75.1M memory peak, 40.5M read from disk.
Sep 9 05:37:06.516825 containerd[1938]: time="2025-09-09T05:37:06.516108765Z" level=info msg="received exit event container_id:\"df20ada8456aebb6929dac7de17cc210c97d098a5489cbc08f5066ebb0cf5643\" id:\"df20ada8456aebb6929dac7de17cc210c97d098a5489cbc08f5066ebb0cf5643\" pid:6507 exit_status:1 exited_at:{seconds:1757396226 nanos:515812824}"
Sep 9 05:37:06.516825 containerd[1938]: time="2025-09-09T05:37:06.516708468Z" level=info msg="TaskExit event in podsandbox handler container_id:\"df20ada8456aebb6929dac7de17cc210c97d098a5489cbc08f5066ebb0cf5643\" id:\"df20ada8456aebb6929dac7de17cc210c97d098a5489cbc08f5066ebb0cf5643\" pid:6507 exit_status:1 exited_at:{seconds:1757396226 nanos:515812824}"
Sep 9 05:37:06.544576 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-df20ada8456aebb6929dac7de17cc210c97d098a5489cbc08f5066ebb0cf5643-rootfs.mount: Deactivated successfully.
Sep 9 05:37:07.362862 kubelet[3331]: I0909 05:37:07.362819 3331 scope.go:117] "RemoveContainer" containerID="9fc12a00cf163214dbac6f03e12c473308eb0b46a939cf45d9ce13566c2e8c5e"
Sep 9 05:37:07.363466 kubelet[3331]: I0909 05:37:07.363036 3331 scope.go:117] "RemoveContainer" containerID="df20ada8456aebb6929dac7de17cc210c97d098a5489cbc08f5066ebb0cf5643"
Sep 9 05:37:07.397697 kubelet[3331]: E0909 05:37:07.378885 3331 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-58fc44c59b-zpshc_tigera-operator(0978dcef-6c7f-4a80-956d-5d8cd66b4805)\"" pod="tigera-operator/tigera-operator-58fc44c59b-zpshc" podUID="0978dcef-6c7f-4a80-956d-5d8cd66b4805"
Sep 9 05:37:07.472636 containerd[1938]: time="2025-09-09T05:37:07.472578955Z" level=info msg="RemoveContainer for \"9fc12a00cf163214dbac6f03e12c473308eb0b46a939cf45d9ce13566c2e8c5e\""
Sep 9 05:37:07.488922 containerd[1938]: time="2025-09-09T05:37:07.488865915Z" level=info msg="RemoveContainer for \"9fc12a00cf163214dbac6f03e12c473308eb0b46a939cf45d9ce13566c2e8c5e\" returns successfully"
Sep 9 05:37:08.542917 containerd[1938]: time="2025-09-09T05:37:08.542864178Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a7a935f19da868b5a950c7c20d19b09767d48df7322f7f65c9cd8eb4082003a4\" id:\"07a602ad24c49ec2225fa099e2643f3ded1ee227685dc7d4432aa8305bd1cbf8\" pid:6629 exited_at:{seconds:1757396228 nanos:542437444}"
Sep 9 05:37:08.915502 kubelet[3331]: E0909 05:37:08.915369 3331 controller.go:195] "Failed to update lease" err="Put \"https://172.31.26.176:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-26-176?timeout=10s\": context deadline exceeded"